It's hard to recall a time when people weren't attached to their smartphones, and that's hardly surprising given how deeply our digital lives are woven into our daily routines. We use the internet for everything, from checking the news to connecting with friends. A Pew Research Center survey from January 2024 found that nine in ten U.S. adults go online daily, including 62% of adults aged 18 to 29 who report using the internet “almost constantly.”
While digital conveniences offer abundant benefits, the expansiveness of the web also poses significant risks for underage users, who can easily access harmful or age-restricted content. With the digital divide narrowing and nearly everyone online, lawmakers are proposing and enacting guardrails to help maintain some sense of safety on the information superhighway.
Right-to-use regulations: An overview
Right-to-use regulations focus on creating safer digital environments and are typically centered on age restrictions and limitations around using a platform or site. These regulations vary in scope, but collectively they aim to:
- Prevent minors from accessing potentially harmful information such as overly violent, sexual, or graphic content; self-harm videos; or other sensitive materials.
- Protect children and teens from experiencing online harm such as harassment, bullying, and grooming.
- Offer parents greater oversight over their children’s online experiences.
Global right-to-use age verification regulations
Lawmakers worldwide have passed bills to protect individuals from online harm. Below is a sample of prominent legislation from several countries and regions.
Digital Services Act (DSA)
The Digital Services Act (DSA) requires online platforms, hosting service providers, and intermediary services operating in the EU to establish safer digital spaces for all users, regardless of age. The law, in effect since 2023, aims to achieve this by safeguarding consumers from illegal content, goods, and services. For instance, if hate speech is deemed illegal in a specific member state, an online platform must implement mechanisms to moderate and remove such content. Similarly, if a particular product or service, such as counterfeit goods, is prohibited in a member state, an online marketplace must prevent buyers and sellers from transacting in it.
The DSA also includes special provisions to protect minors from online harm. For instance, the law bans using minors’ data to create targeted advertisements and introduces stronger protections for people targeted by online harassment and bullying. Additionally, for services primarily directed at minors, the platform must explain terms of use in a way minors can understand.
Online Safety Act (OSA)
The Online Safety Act (OSA) is a UK law that requires social media companies, messaging apps, search engines, and other digital platforms to implement mechanisms that create safer online spaces for children and adults. The OSA applies to online businesses accessible to UK users and specifically focuses on companies offering two types of services: user-to-user services and search services.
The law, among many other requirements:
- Requires regulated companies to scan for illegal content and remove it from their platforms
- Prevents children from accessing harmful content such as extreme violence and age-restricted content such as pornography
- Implements age-assurance measures and enforces age limits
- Provides parents and children with mechanisms to report harmful content
Kids Online Safety Act (KOSA)
The Kids Online Safety Act (KOSA) is a piece of pending U.S. Senate legislation that has been awaiting floor consideration since December 2023. The bipartisan bill aims to protect minors from online harm by imposing requirements on certain platforms commonly used by minors, such as social media sites.
If passed as written, covered platforms must implement measures to:
- Prevent and address online harm, such as sexual exploitation and online bullying
- Allow users to report harmful content
- Provide safeguards for minors’ personal privacy and give parents and guardians tools to monitor their children’s platform use, for example by controlling privacy and account settings
- Restrict advertising of age-restricted products or services to minors
Audiovisual Media Services Directive (AVMSD)
Since 2020, the Audiovisual Media Services Directive (AVMSD) has governed EU-wide coordination of national legislation on all audiovisual content, including traditional TV broadcast and online media.
Under the AVMSD, member states must ensure that video-sharing platform providers in their respective jurisdictions take appropriate measures to protect minors from programs, user-generated videos, and audiovisual commercial communications that could “impair [their] physical, mental, or moral development.” The level of protection required depends on the degree of harm the content could pose to the minor.
For linear services, i.e., live television broadcasts, audiovisual content that “might seriously impair” the development of minors is not allowed to be published, “signifying a total ban.” For non-linear services, i.e., video on demand, programs that “might seriously impair” the development of minors are allowed, but they may only be made available in ways that ensure minors will not “normally hear or see them.” This might include using PIN codes or age verification systems to gate access.
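To make the idea of gating concrete, here is a minimal TypeScript sketch of a default-deny content gate for on-demand video. The rating tiers, thresholds, and field names are assumptions made for illustration, not categories taken from the AVMSD.

```typescript
// Illustrative only: the rating tiers, thresholds, and field names below are
// assumptions for this sketch, not definitions from the AVMSD.

type ContentRating = "general" | "restricted-16" | "restricted-18";

interface Viewer {
  ageVerified: boolean;        // has the viewer completed an age check?
  verifiedAge?: number;        // age established by that check, if any
  parentalPinEntered: boolean; // fallback gate, e.g., a household PIN
}

const MIN_AGE_FOR_RATING: Record<ContentRating, number> = {
  "general": 0,
  "restricted-16": 16,
  "restricted-18": 18,
};

// Returns true only if the viewer may be shown the program.
function mayViewProgram(viewer: Viewer, rating: ContentRating): boolean {
  const minAge = MIN_AGE_FOR_RATING[rating];
  if (minAge === 0) return true; // unrestricted content is always available

  // Restricted content requires a positive signal: a verified age at or above
  // the threshold, or (for the lower-risk tier) a parental PIN.
  if (viewer.ageVerified && (viewer.verifiedAge ?? 0) >= minAge) return true;
  if (rating === "restricted-16" && viewer.parentalPinEntered) return true;

  return false; // default-deny: no positive signal, no access
}

// Example: an unverified viewer is blocked from an 18-rated program.
console.log(mayViewProgram({ ageVerified: false, parentalPinEntered: false }, "restricted-18")); // false
```

The key design choice here is to deny access unless a positive age signal exists, rather than granting access by default and trying to block minors after the fact.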
Protection of Young Persons Act (Germany)
The Protection of Young Persons Act (JuSchG) is a German law that regulates minors' access to age-inappropriate content and activities to safeguard their physical, mental, and moral development. The original law placed restrictions on the sale, supply, and consumption of drugs, access to films and computer games, gambling, and the presence of children and young people in certain entertainment venues like bars and clubs. The law was amended in 2021 with requirements specific to digital media and online gaming.
The original law focused on limiting children’s exposure to media that could “impair [their] development,” including media that is excessively violent or graphic in nature. The 2021 amendment now considers not just the age-appropriateness of the content but also the age-appropriateness of the platform experience. For example, an online game without graphic or violent content may still be considered unsuitable for children if it allows players to communicate with one another without any settings that protect minors from risks like online grooming and sexual exploitation by other users.
The law lists several precautionary measures platforms must take to limit children’s exposure to age-inappropriate content and experiences. In some cases, age verification is one such requirement.
Complying with regulations: The role of age verification
Laws protecting children from harmful and age-restricted content often lack specific directives on how companies must verify user ages to ensure age-appropriate experiences. While some regulations require more than simple self-attestation, others leave it to companies to determine how they will assess users’ ages.
Given how much regulations vary across the globe, it’s wise to be prepared with an adaptable age verification system that enables your business to handle a wide range of scenarios. It’s also important to remember that while certain regulations may not specify technologies or methods today, they could include such directives in the future. The regulatory environment is shifting quickly, and companies with flexible age verification systems will be well prepared to adapt to new requirements.
That’s why Persona built its platform to be configurable, enabling teams to quickly deploy user flows that meet specific regulatory requirements without engaging additional resources such as engineering teams.
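As a generic illustration (not Persona’s API), the TypeScript sketch below shows what a configuration-driven approach to age assurance might look like. The jurisdiction codes, age thresholds, and verification method names are purely hypothetical; the point is that policy lives in data, so it can be updated as regulations change without rewriting application logic.

```typescript
// Illustrative only: the jurisdictions, age thresholds, and method names below
// are assumptions for this sketch, not a statement of what any law requires.

type VerificationMethod =
  | "self-attestation"
  | "id-document"
  | "age-estimation";

interface JurisdictionPolicy {
  minimumAge: number;                    // age gate for the regulated experience
  acceptedMethods: VerificationMethod[]; // methods considered sufficient there
}

// Keeping policy in data rather than code makes it easier to adjust as
// regulations change, without rewriting application logic.
const POLICIES: Record<string, JurisdictionPolicy> = {
  GB: { minimumAge: 18, acceptedMethods: ["id-document", "age-estimation"] },
  DE: { minimumAge: 16, acceptedMethods: ["id-document"] },
  US: { minimumAge: 13, acceptedMethods: ["self-attestation", "id-document"] },
};

// Conservative fallback when no country-specific rule is configured.
const DEFAULT_POLICY: JurisdictionPolicy = {
  minimumAge: 18,
  acceptedMethods: ["id-document"],
};

// Look up the policy that applies to a user's country.
function policyFor(countryCode: string): JurisdictionPolicy {
  return POLICIES[countryCode] ?? DEFAULT_POLICY;
}

// Decide whether a completed age check satisfies the local policy.
function satisfiesPolicy(
  countryCode: string,
  method: VerificationMethod,
  verifiedAge: number
): boolean {
  const policy = policyFor(countryCode);
  return policy.acceptedMethods.includes(method) && verifiedAge >= policy.minimumAge;
}

// Example: an age-estimation check returning 17 would not satisfy the GB policy.
console.log(satisfiesPolicy("GB", "age-estimation", 17)); // false
```

In practice, a policy table like this would typically be managed outside the codebase, for example in a rules engine or admin console, so compliance teams can adjust it without an engineering release.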
Beyond compliance: Creating safer digital environments with age verification
It’s challenging to anticipate changes in age verification regulations, but your business can proactively consider how it will maintain its commitment to fostering safe and inclusive environments for users of all ages. Implementing reliable age verification enables your business to confidently tailor its offerings and create age-appropriate experiences for your entire community of users.
Several social media platforms are already leading the way in creating age-appropriate experiences for younger account holders to protect them from harm and ensure safe interactions. For example, YouTube created a separate, kid-friendly version of its platform for younger users. Meanwhile, TikTok does not allow users between the ages of 13 and 15 to send or receive direct messages.
Accounting for user experience with age verification
Ensuring age-appropriate experiences is crucial for building user trust, but businesses must also consider how the age verification process will affect the user experience. Age verification should feel seamless and natural to the end user, whether they are a minor or an adult. If users must take additional steps, those requests should come with guidance on why the information is needed, which reduces the chance of drop-off when friction is introduced.
With the right age verification solution, however, the process can be seamless and take only a few seconds to complete. Below are some user experience questions to ask when adding an age verification system, along with process considerations.
Simplify global expansion of age verification with Persona
At Persona, we recognize that global companies need customizable age verification solutions to demonstrate compliance with a breadth of evolving regulatory requirements while also meeting user expectations for seamless online experiences. That’s why the Persona platform offers modular tools that empower teams to customize workflows and continuously fine-tune them to improve the user experience and remain compliant.
Need robust, user-friendly age verification? Contact us to learn more or get started for free.