Children and teens today have unprecedented (and growing) access to the internet. As usage among young people grows, so too do the risks associated with unfettered access to the web. Global survey data from DQ Institute shows that almost 70% of young people aged 8 to 18 are exposed to one or more forms of “cyber risk,” including risky content and contact, bullying, and overuse of technology.
To contend with this concerning trend, regulators are moving in a more nuanced direction, placing meaningful guardrails on websites, platforms, and online marketplaces to ensure underage users are protected from age-inappropriate content and age-restricted services and products.
Beyond protecting minors, these regulations fundamentally reshape online experiences. Let’s examine the intricacies of regulations worldwide and discover why compliance isn’t just a matter of following the rules — it’s also about improving digital experiences for users of all ages.
Key industry-specific age verification regulations
Social media regulations
In response to growing concerns about young people and social media use, regulators are working to place stronger safeguards on social media platforms. The laws discussed below can be grouped into two categories:
- Laws to set and enforce age thresholds for the use of social media
- Laws to protect young people from harmful content and age-inappropriate experiences on social media
Age threshold regulations
While social media platforms are not yet considered an age-restricted industry in the US, there is bipartisan support for social media legislation called the Protecting Kids on Social Media Act. The bill before the 118th Congress would require parental consent for minors to create social media accounts and would set the minimum age for using social media sites at 13. It would also require social media platforms to “take reasonable steps beyond merely requiring attestation” to verify users’ ages without violating privacy.
In the absence of national legislation setting age thresholds for social media use, several states are working to pass their own laws. For example, Ohio, Louisiana, and Florida have passed bills (yet to go into effect) that bar children and teens under a certain age from creating accounts without parental consent. Some of these laws also require age verification technology.
There is no EU law restricting teens and children under a certain age from creating social media profiles; however, certain member states within the EU have introduced their own laws that place age thresholds on social media sites. For example, France approved a law (not yet in effect) requiring social media platforms to verify users’ ages and obtain parental consent for those under 15.
The Online Safety Act (OSA) in the UK is not limited to social media companies; however, the law includes several provisions that level up age-related requirements for platforms, including social media networks. For example, the UK GDPR restricts the collection of personal data from children under 13 without parental consent. The OSA builds on these protections by requiring online businesses to implement systems for age estimation.
Regulations to protect minors from harmful or inappropriate content
In the US, a number of online safety bills are pending before Congress aimed at protecting young people from sensitive and harmful content such as suicide and self-harm videos, violent content, and adult content. These laws would apply to website operators, including social media platforms.
In the absence of comprehensive federal legislation, several states have passed or are working to pass state-specific laws to protect young people online from harmful content and age-inappropriate experiences. For example:
- Louisiana passed the Secure Online Child Interaction and Age Limitation Act, which restricts minors under the age of 16 from creating a social media account unless the minor has the express consent of a parent or guardian. The bill goes into effect on July 1, 2024, and requires social media companies to “make commercially reasonable efforts” to verify the age of each existing or new Louisiana account holder. If the account holder is a minor, the site must confirm that the minor has consent from a parent or guardian to open a new account or maintain an existing one. The bill also establishes new requirements for social media companies to create safer experiences for minor accounts, including prohibiting direct messages between a minor account and any adult user not linked to the account through “friending.”
- Texas passed the Securing Children Online Through Parental Empowerment Act (SCOPE) designed to protect young people from sensitive content and addictive elements that are prevalent on social media sites. For example, SCOPE requires that social media sites allow parents to alter their children’s settings and set screen time limits. The bill also prohibits social media platforms and other digital service providers from targeting known minors with advertising, sharing their personally identifiable information, and collecting their geolocation data. The law is scheduled to go into effect in September 2024.
The EU’s Digital Services Act (DSA) requires online platforms, hosting providers, and other intermediaries operating in the EU to create safer digital spaces. The DSA enables users to flag illegal content such as hate speech and illegal goods; provides protections for people targeted by online harassment and cyberbullying; and prohibits advertising targeting children based on their personal data, among other provisions.
Very large online platforms (VLOPs), many of which are social media networks, must meet more stringent accountability standards per the law. For example, each year, VLOPs and very large online search engines must perform assessments to identify potential online risks for children and young people using their services.
The aforementioned OSA in the UK also places greater responsibility on online platforms (including social media sites) to establish safer online environments for children and teens. Under OSA, companies must take measures to prevent children from accessing content that is considered harmful, such as self-harm videos, graphic materials, and content depicting extreme violence. The law also requires companies to implement age assurance methods to enforce age limits.
Adult entertainment and content
In the US, companies that sell or distribute adult entertainment content must ensure that the purchaser is at least 18. In recent years, several US states have introduced new laws that require adult content websites to perform “reasonable” age verification. Fines for companies failing to meet age verification requirements can total up to $10,000 per day. Specific requirements for age verification vary by state. Below are a few examples.
- In Utah, any website with adult content that “fails to perform reasonable age verification methods” could be held liable for “damages resulting from a minor’s accessing the material.”
- Texas requires adult content websites to use “reasonable age verification methods,” suggesting the submission of a government-issued ID, “verification through an independent, third-party age verification service,” or usage of a “commercially reasonable method that relies on public or private transactional data” to verify users' ages as potential pathways for verification.
- Mississippi requires “pornographic media” to have age verification systems in place and specifies “reasonable methods,” such as commercial age verification systems that verify a user’s age with a government-issued ID or “any commercially reasonable method that relies on public or private transactional data.”
In the EU, the earlier-mentioned DSA aims to put into practice the principle that what is illegal offline is illegal online. Since access to pornography by minors is prohibited in the EU, the DSA provides greater protection for children against exposure to this content. Additionally, several such sites fall into the DSA’s VLOP categorization, meaning these sites must comply with higher legal accountability standards, including obligations to conduct annual assessments to determine the risks associated with the services offered by the platform, with particular scrutiny placed on disseminating illegal content. The law specifies that VLOPs must put into place “reasonable, proportionate, and effective mitigation measures… which may include age verification and parental control tools.”
Alcohol retail
Technology has transformed the way people purchase beer, wine, and spirits. According to the International Alliance for Responsible Drinking, between 2020 and 2024, online alcohol sales are expected to grow by over 74% in 20 key markets worldwide. Selling alcohol online requires website operators to consider not just who initiates the purchase online but also who accepts the delivery in person.
In the US, under the National Minimum Drinking Age Act of 1984, alcohol sales are restricted to those 21 and over. Along with federal laws, many states have enacted laws to ensure individuals under the age of 21 do not intercept alcohol deliveries. For example, in New York, it is a crime to sell, deliver, or give away alcoholic beverages to any person under 21; businesses are strongly encouraged to confirm the age of alcohol recipients upon delivery. In California, online businesses selling age-restricted products or services must ensure the purchaser is of legal age at the time of purchase or delivery, “including, but not limited to, verifying the age of the purchaser.”
Most countries have legal age requirements for alcohol sales. The World Health Organization (WHO) cites legal minimum ages for 173 countries. Below are some examples of international laws that include age verification requirements.
- In Ontario, Canada, under the Liquor Licence Act, it is unlawful to knowingly supply, serve, or sell alcohol to anyone under the age of 19. The retailer must inspect the ID of a person who appears to be under 19 before alcohol is sold, served, or delivered.
- In Malaysia, the Food (Amendment) Regulation of 2016 bumped the minimum drinking age from 18 to 21. Alcohol retailers must confirm that purchasers meet the minimum age before sale.
Tobacco retail
Just like alcohol, tobacco retailers selling online must ensure purchasers meet legal age requirements.
In the US, the Tobacco 21 legislation, signed into law in 2019, raised the federal minimum age for the sale of tobacco products from 18 to 21. According to the law, it is illegal to sell any tobacco product, including cigarettes, cigars, and e-cigarettes, to anyone under 21. Sellers must inspect the photo ID of anyone under 27 who attempts to purchase a tobacco product.
Other countries have enacted similar laws setting age minimums for the sale of tobacco products. According to Global Tobacco Control, 56 countries and jurisdictions have minimum age requirements for purchasing, selling, and using e-cigarette products; 35 countries and jurisdictions have minimum age of purchase/sale/use provisions for heated tobacco products; and 11 countries have minimum age of purchase/sale/use provisions for nicotine pouch products. Below are a couple of examples.
- Singapore’s Tobacco (Control of Advertisements and Sale) Act prohibits the sale of tobacco to individuals under 21. First-time offenders face a $5,000 fine and a license suspension; subsequent violations carry $10,000 fines and license revocation.
- In England and Wales, the Children and Young Persons (Sale of Tobacco etc.) Order 2007 raised the minimum age for buying tobacco from 16 to 18.
Managing digital age verification globally: compliance and beyond
Global regulations make age verification essential for certain industries. However, deploying a dynamic age verification system doesn’t just enable your business to remain compliant as new regulations emerge and existing regulations shift — it also allows you to create the best experiences for your users by:
- Ensuring young people don’t inadvertently discover or access harmful or age-inappropriate materials
- Enabling a seamless age verification process so that of-age users can access age-restricted goods, services, and content online without experiencing friction and frustration
Below are a few areas of consideration when deploying an age verification system to ensure your business can comply with existing regulations, stay nimble in adapting to new and emerging laws, and deliver an excellent user experience.
Compliance considerations
- Storage of underage data. Data collection and processing laws in several jurisdictions include provisions related to storing and accessing minors’ data. To remain compliant, look for providers that maintain an audit trail of onboarded users without storing underage users' personally identifiable information (PII).
- Multiple verification methods. Verification method requirements vary depending on the regulated industry and jurisdiction. Work with vendors that offer a robust library of age verification methods, such as government ID checks, selfie verification, and credit card checks.
- Low- or no-code systems. The regulatory landscape will likely continue to evolve. Consider solutions that do not require engineering support to adjust user flows. This will enable product and compliance teams to make changes as regulations shift without compromising other critical functions.
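The first consideration above, an audit trail that does not retain a minor's PII, can be sketched as follows. This is a minimal illustration, not any vendor's actual schema: the HMAC key, field names, and record shape are all assumptions. The point is that the record proves a verification event occurred and is linkable via a keyed pseudonym, while the raw user ID, name, and date of birth are never written.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Illustrative secret; a real system would manage and rotate this in a KMS.
AUDIT_HMAC_KEY = b"rotate-me-in-a-real-deployment"

def audit_record(user_id: str, method: str, passed: bool) -> dict:
    """Build an audit entry keyed by an HMAC of the user ID.

    Only a keyed hash (for later lookup during an audit) and the outcome
    are stored -- no PII appears anywhere in the record.
    """
    pseudonym = hmac.new(AUDIT_HMAC_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return {
        "subject": pseudonym,
        "method": method,  # e.g. "gov_id", "credit_card" (hypothetical labels)
        "passed": passed,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```

Because the pseudonym is deterministic for a given key, an auditor can confirm that a specific user was verified without the business ever storing that user's identity document or birth date.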
Ease-of-use considerations
- Intuitive experience. Ensure the age verification process is easy to understand for the user. Look for a solution that allows your team to test different user flows and return data on pass rates to better understand user behaviors and pinpoint areas for improvement.
- Multiple verification methods. Some users may not have access to certain verification documents such as a government ID or credit card. Offering multiple verification options enables more users to quickly move through the verification process.
Case study: Playboy
Playboy, a global adult entertainment brand, launched a new premium platform to connect creators with their fans, expand their communities, and build their own content businesses. To launch the platform, Playboy needed a system to verify that users were over 18.
Playboy wanted a fast and reliable way to enforce age restrictions, so it leveraged Persona’s government ID verification with selfie verification for an added layer of assurance. With Persona’s configurable system, Playboy could customize and adjust the user flows to craft an optimal age verification workflow and user experience. As Playboy’s CPO explained, “One of the main things I was looking for was the ability to customize the back end. What was really appealing to me was Persona’s automation workflows.”
With Persona’s flexible system and robust set of verification methods, Playboy meets age compliance requirements in its global markets with an automated, adaptable verification process that enables the platform to scale as it grows.
Simplifying global expansion with Persona
Persona’s flexible age verification system empowers teams to build custom identity flows, dynamically present relevant experiences, and automate decisions. With Persona’s modular tools and robust library of verification options, compliance teams can efficiently stay on top of changing regulations, and global expansion teams can gain confidence in successfully entering new markets.
Need an age verification solution that can scale with your business? Contact us to learn more or get started for free.