A growing number of local and federal governments around the world have begun to implement new age verification laws — and to update existing regulations. With more compliance considerations than ever before, it can be challenging to provide compliant age verification experiences while minimizing friction.
Keep reading for an overview of the most notable global age verification laws, a breakdown of which organizations those laws apply to, and strategies for optimizing your age verification experience.
Why digital age verification laws vary
Though most digital age verification laws have the same general goal — to protect children and young adults from age-restricted or age-inappropriate content, goods, and services — not all age verification laws are created alike.
Regulations vary based on location and jurisdiction, and organizations have different requirements depending on their industry, company size, annual gross revenue, and products and services.
For example, California’s Age-Appropriate Design Code (AADC) law — which protects the privacy of kids using online platforms through a number of measures, including setting all default settings to private — applies to companies with annual gross revenues over $25 million whose content is likely to be accessed by children.
Many laws also have different age thresholds. France’s recent age verification law requires social network service providers to deactivate accounts for children under 15 unless they have explicit parental consent, while the US’s COPPA law (more on that later) applies to kids under 13.
When you gather user data, you also have to comply with regulations on collecting and processing minors’ data, like the EU’s General Data Protection Regulation (GDPR) and Brazil’s General Personal Data Protection Act (LGPD). The huge variety of regulations across regions and industries is just one of the reasons it’s hard to create an effective age verification strategy.
Challenges with implementing age verification systems
Along with the patchwork of data privacy regulations and age verification legislation companies need to consider, there are a handful of other challenges when it comes to implementing a successful age verification experience:
- Varying assurance requirements: Some laws require government IDs, for example, while other legislation accepts age-gating.
- Cultural differences: Companies need to be mindful of the cultural norms around privacy and the availability of documents in different regions.
- Availability of data in different areas: Companies may not have access to the same types of data across jurisdictions, which means they need to design different age verification experiences for users in different places.
Want a deeper dive into age verification’s biggest challenges and compliance solutions? Check out our comprehensive ebook.
Key age verification laws (and data privacy regulations) around the world
Keep in mind: this list is just a sample of the global legislation on age verification and privacy that has passed or is currently being proposed.
KOSPA (US)
The Kids Online Safety and Privacy Act (KOSPA) combines proposed updates to COPPA and the standalone KOSA bill (both described below) into a single piece of legislation. It was passed by the US Senate in July 2024 and is in the Resolving Differences phase before being sent to the President. If it passes, KOSPA will likely give companies of all sizes a new, stronger set of requirements to follow.
Please continue to check this space, which we’ll be updating as news develops.
COPPA (US)
The Children’s Online Privacy Protection Act (COPPA) of 1998 is a US federal law that protects the privacy of children under the age of 13. Under COPPA, websites that collect information about children under the age of 13 must make this information available upon request of the child’s parents. This includes all user records, log-in data, profiles, and transaction data. If a website doesn’t comply, the FTC can fine it up to $51,744 per violation.
There are eight key requirements websites need to abide by for COPPA regulatory compliance, including posting a clear and comprehensive privacy policy online, getting parental consent before collecting minors’ information, and deleting children’s data after use.
COPPA applies to:
- Operators of commercial websites and online services directed at children under 13 that collect, use, or disclose children’s personal information
- Operators of general audience websites or online services with actual knowledge that they are collecting, using, or disclosing personal information from children under 13
- Websites or online services that have actual knowledge that they are collecting personal information directly from users of another website or online service directed to children
KOSA (US)
Recently, the US Senate passed the Kids Online Safety Act (KOSA), a bill that would require companies to take reasonable steps to protect minors from harm on their online platforms. Under the law, covered platforms (like social media networks) need to implement certain safeguards to protect minors from online threats like bullying and sexual exploitation.
In addition to providing settings that restrict access to minors’ personal data and give parents tools to supervise their children’s use of a platform, covered platforms also have to:
- Disclose information about personalized recommendation systems and individual-specific advertising to minors;
- Let parents and guardians report online harms;
- Stop advertising age-restricted products and services to minors; and
- Report on the foreseeable risks or harms to minors using the platform
KOSA applies to: Any applications or services that connect to the internet and are likely to be used by minors, except internet service providers, educational institutions, and email services
FDA Regulations (US)
The Food and Drug Administration (FDA) recently issued a final rule under the Further Consolidated Appropriations Act of 2020 to raise the minimum ages for certain restrictions on tobacco product sales.
Starting September 30, 2024, retailers must use photo IDs to verify the age of anyone under 30 — increased from 27 — trying to purchase tobacco products. Under the new rule, retailers also can’t sell tobacco products in vending machines in facilities where people under 21 — previously under 18 — are allowed to enter at any time.
The FDA’s new rule applies to: Retailers that sell tobacco products online or in-store
DSA (EU)
The Digital Services Act (DSA) requires online platforms, hosting providers, and other intermediaries that operate in the EU to create safer digital spaces that protect the fundamental rights of all users.
To comply with the DSA, these online platforms, hosting providers, and intermediaries have to implement transparent systems to detect, flag, and remove illegal content from their platforms. They’re also required to protect the information of minors. Plus, if your online platform hosts people who sell or trade goods and services, you’re required to collect certain information from them to verify their identity. Companies that don’t comply are subject to penalties of up to 6% of their global annual gross revenues.
DSA applies to:
- Intermediary services providers that operate in the EU, e.g. companies offering network infrastructure, like internet access providers and domain name registrars;
- Hosting services providers that operate in the EU, like cloud and web hosting services; and
- Online platforms that operate in the EU that connect sellers and consumers, like online marketplaces, app stores, collaborative economy platforms, and social media platforms
OSA (UK)
The Online Safety Act (OSA) is a UK law that requires social media companies, messaging apps, search engines, and other digital platforms to implement a number of measures to improve online safety for kids and adults alike.
Under the law, companies have to:
- Scan for illegal content and remove it from their platform. Illegal content includes content related to terrorism, hate speech, self-harm, child sexual abuse, and revenge pornography.
- Prevent children from accessing content that is legal but considered harmful, including content displaying extreme violence, encouraging suicidal behavior, or glorifying disordered eating.
- Prevent children from accessing age-restricted content, like pornography.
- Implement age-assurance measures and enforce age limits.
- Conduct and publish assessments about the risks these platforms pose to children.
- Provide parents and children with a mechanism to report problems they encounter.
Because the UK’s data privacy laws require that users be at least 13 to join a social media platform without parental permission, companies need to implement an age verification system to comply with OSA.
OSA applies to:
- Online companies in the UK offering user-to-user services, like social media companies, online dating services, messaging apps, image/message boards, gaming providers, and video-sharing services
- Online companies in the UK offering search services
- Companies not located in the UK but that have a “significant number of UK users, if the UK is a target market or it is capable of being accessed by UK users and there is a material risk of significant harm to such users”
How to optimize age verification for compliance, conversion, and user experience
Consider these three strategies for implementing an age verification experience that streamlines regulatory compliance without sacrificing user experience:
1. Dynamically change the experience
When low-risk users face too many obstacles before viewing a site, they tend to leave. On the other hand, you want to make sure your flow keeps the wrong users out. Enter: progressive risk segmentation, which lets you adjust the level of friction a user experiences during identity verification based on a series of risk signals.
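To make the idea concrete, here’s a minimal sketch of how progressive risk segmentation might work, assuming a simple additive risk score. The signal names, thresholds, and step names are hypothetical examples for illustration only, not any particular provider’s API:

```python
# Illustrative only: a simplified sketch of progressive risk segmentation.
# The signal names, weights, and thresholds below are hypothetical.

from dataclasses import dataclass

@dataclass
class RiskSignals:
    ip_country_matches_id: bool
    device_seen_before: bool
    prior_failed_attempts: int

def choose_verification_steps(signals: RiskSignals) -> list[str]:
    """Return an ordered list of verification steps, adding friction as risk rises."""
    steps = ["age_gate"]  # lowest-friction check shown to every user

    risk_score = 0
    if not signals.ip_country_matches_id:
        risk_score += 2
    if not signals.device_seen_before:
        risk_score += 1
    risk_score += min(signals.prior_failed_attempts, 3)

    if risk_score >= 2:
        steps.append("government_id")  # step up to a document check
    if risk_score >= 4:
        steps.append("selfie_check")   # highest-assurance step for the riskiest users
    return steps

print(choose_verification_steps(RiskSignals(True, True, 0)))    # ['age_gate']
print(choose_verification_steps(RiskSignals(False, False, 2)))  # adds ID and selfie checks
```

The key design choice is that trusted, low-risk users see only the lightest check, while additional steps are layered on only when the signals warrant it.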
With Persona’s Dynamic Flow product, you can remove or add friction depending on specific user signals, then create customizable verification flows for specific events and situations.
For helpful tips on choosing an identity verification product, check out our buyer’s guide to IDV solutions.
2. Offer a breadth of verification types
An effective age verification system needs to be adaptable, accounting for a variety of contextual factors, from differing age thresholds to the databases you have access to.
That’s why it’s critical to implement a customizable IDV solution, one that offers a variety of verification types. Think: government IDs, selfie checks, age estimation, and database verification.
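As a rough illustration, the sketch below routes users to a verification method based on their jurisdiction and the data sources available there. The age thresholds mirror the examples cited earlier in this article (13 in the US, 15 in France), and the jurisdiction rules and method names are hypothetical:

```python
# Illustrative only: choosing a verification method per jurisdiction.
# The rules and method names below are examples, not legal guidance.

JURISDICTION_RULES = {
    "US": {"min_age": 13, "methods": ["database", "government_id"]},
    "FR": {"min_age": 15, "methods": ["government_id", "age_estimation"]},
    "UK": {"min_age": 13, "methods": ["government_id", "selfie_check", "age_estimation"]},
}
DEFAULT_RULE = {"min_age": 18, "methods": ["government_id"]}

def pick_method(country_code: str, available_methods: set[str]) -> tuple[int, str]:
    """Return the applicable age threshold and the first supported method for a user."""
    rule = JURISDICTION_RULES.get(country_code, DEFAULT_RULE)
    for method in rule["methods"]:
        if method in available_methods:
            return rule["min_age"], method
    # Fall back to a government ID check if no preferred method is available.
    return rule["min_age"], "government_id"

print(pick_method("FR", {"age_estimation", "database"}))  # (15, 'age_estimation')
```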
3. Take advantage of automation tools
Using automation tools in your age verification technology saves you time, reduces error, and increases users’ privacy.
Persona’s Verifications tool lets you automate your flows to verify users quickly and decrease manual reviews. Plus, you can set the amount of friction needed for each individual verification. And with complex retry logic, you can rest assured you’re not losing users to common user errors like typos.
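Here’s a minimal, hypothetical sketch of what automated retry logic can look like: recoverable errors (like a blurry photo or a typo) trigger another attempt, while anything else is escalated to manual review. The submit_id_check function is a stand-in stub, not a real provider API:

```python
# Illustrative only: minimal automated retry logic for recoverable verification errors.
# submit_id_check() is a stand-in stub; replace it with your provider's actual call.

import random

RECOVERABLE_ERRORS = {"blurry_image", "typo_in_name", "expired_session"}
MAX_ATTEMPTS = 3

def submit_id_check(user_id: str) -> dict:
    """Stub that simulates a verification attempt and its outcome."""
    outcome = random.choice(["passed", "blurry_image", "document_fraud"])
    return {"status": "passed" if outcome == "passed" else "failed", "error": outcome}

def verify_with_retries(user_id: str) -> str:
    """Retry checks that fail for fixable reasons; escalate everything else."""
    for _ in range(MAX_ATTEMPTS):
        result = submit_id_check(user_id)
        if result["status"] == "passed":
            return "approved"
        if result["error"] not in RECOVERABLE_ERRORS:
            return "manual_review"   # unrecoverable failure: send to a human reviewer
        # Recoverable error: prompt the user to fix it (e.g. retake a blurry photo).
    return "manual_review"           # retries exhausted

print(verify_with_retries("user_123"))
```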
Simplifying global expansion with Persona
Expanding globally is a major milestone — and having the right support is crucial. Whether you’re just starting to develop a global expansion plan and align with cross-functional partners on compliance, fraud prevention, and business strategy, or you need to update and refine your existing expansion strategies, Persona is here to help.
Review our guide to verifying identities around the world, or contact us to find out how we can help you verify your global users seamlessly and compliantly.
Successful age verification implementations
See real examples of companies using Persona to scale their age verification systems: