Industry
Published April 03, 2025
Last updated April 07, 2025

UK’s Online Safety Act: Fast-approaching deadlines for protecting children online

The UK’s Ofcom has released a timeline for conducting a number of risk assessments related to the Online Safety Act. Learn more.
Kerwell Liao
8 min
Key takeaways
To promote children’s safety online, the UK’s Online Safety Act requires online businesses to conduct various types of assessments. 
The children’s access assessment is due by April 16, 2025, while the children’s risk assessment is due three months after final guidance is issued (likely July 2025). 
A highly effective age assurance program (including age verification and/or age estimation) is key to compliance for many businesses.

The UK’s Online Safety Act (OSA) officially became law of the land in October 2023, kicking off a three-phase process for how guidelines would be drafted and ultimately enforced. In the interim, businesses have been waiting for final directions to inform the design and implementation of their OSA compliance strategies. 

All of that is changing in 2025. On March 3, 2025, Ofcom officially launched an enforcement program to monitor compliance with the record-keeping duties and illegal content risk assessment duties established under the law. 

And by now, businesses should have already completed the illegal content risk assessment (it was due March 16, 2025). 

Over the coming months, there will be a series of important dates to be aware of (as of this writing, the next significant deadline is April 16, 2025 — the date by which businesses must complete a children’s access assessment). 


Below, we briefly recap what the Online Safety Act is and the legislation’s key goals. We also take a look at the assessments required under the OSA that focus on the protection of children, including relevant deadlines for completing them, and discuss what this means for online businesses with operations or users in the United Kingdom. 

A quick refresher: What is the UK’s Online Safety Act?

The Online Safety Act (originally introduced as the Online Safety Bill) is a UK law that requires certain types of online businesses to implement measures designed to “keep the internet safe” for children and adults. Its requirements apply to three main types of online businesses that are accessible to users in the UK:

  1. Search services, which include search engines as well as other businesses with search functionality

  2. User-to-user services, which allow for user-generated content that can be shared with or discovered by other users on the platform. These services are provided by social media companies, some messaging apps, forums, image boards, message boards, video-sharing platforms, online dating services, and mobile and online gaming providers.

  3. Providers of pornographic content, which generate, publish, or display pornographic video, images, or audio online.

As part of Online Safety Act compliance, businesses need to conduct and document a number of assessments about the risks posed to children by their platforms. 

These assessments should then inform how the company will prevent children from accessing legal but harmful content and age-restricted materials, such as:

  • Content tied to bullying and extreme violence

  • Content that glorifies or provides instructions for self-harm, suicide, and eating disorders

  • Pornographic content 

Learn more: How to maintain compliance with global age verification laws

Children’s online safety assessments required under the OSA

The UK’s Online Safety Act requires an assortment of assessments. Here, we’ll focus on the two that businesses need to complete concerning children’s online safety:

Children’s access assessments

Filing due date: April 16, 2025 (and ongoing)

The goal of this assessment is to establish whether a service is likely to be accessed by children. It has two stages:

  1. Stage 1 assesses whether it’s possible for children to normally access the service

  2. Stage 2 assesses whether there is a significant number of children who are users of the service

Businesses can only conclude it’s not possible for children to access their service if they have highly effective age assurance in place to prevent children’s access. 

Those without such measures in place must assess whether children are likely to be on the service (or a part of the service). This includes the following circumstances:

  • The service currently has a significant number of users that are children

  • The service is likely to attract a significant number of children

What constitutes a “highly effective” age assurance program? Ofcom defines it as one that is technically accurate, robust, reliable, and fair. It can include both age verification and age estimation technologies. 

While Ofcom does not mandate specific methods, it does state that photo-ID matching, facial age estimation, and reusable identity services have the potential to meet these criteria. 

Importantly, the following age assurance methods are not considered to be highly effective:

  • Payment methods that don’t require the user to be over 18

  • Your terms and conditions, even if they say the service is for over 18s only

Not sure if your platform is likely to attract a significant number of children? Ofcom has supplied the following factors that may indicate it is:

  • Your service provides benefits to children

  • Content on your platform is appealing to children 

  • The design of your platform or service is appealing to children

  • Your commercial strategy includes children (in part or fully)

If the results of your children’s access assessment indicate that children are likely to access your platform, you must complete a children’s risk assessment (below). 
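The two-stage flow described above can be sketched as a simple decision function. This is purely an illustration of the logic, not legal guidance: the function name and inputs are hypothetical, and Ofcom’s guidance (not code) determines what counts as a “significant number” of children or “highly effective” age assurance.

```python
# Illustrative sketch of the children's access assessment decision flow.
# All names and inputs are hypothetical placeholders.

def childrens_access_assessment(
    has_highly_effective_age_assurance: bool,
    significant_child_user_base: bool,
    likely_to_attract_children: bool,
) -> str:
    """Return the outcome of the two-stage children's access assessment."""
    # Stage 1: a service can only conclude children cannot access it
    # if highly effective age assurance is already in place.
    if has_highly_effective_age_assurance:
        return "not likely to be accessed by children"

    # Stage 2: without such measures, assess whether a significant number
    # of children use the service, or are likely to be attracted to it.
    if significant_child_user_base or likely_to_attract_children:
        return "children's risk assessment required"

    return "document conclusion and keep assessment under review"
```

Note that under this logic, a service without highly effective age assurance must still evaluate both Stage 2 conditions before concluding children are unlikely to access it.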

Learn more: Kids’ online privacy and safety: Navigating data collection and protection regulations for minors

Children’s risk assessments

Filing due date: anticipated July 2025

Once Ofcom publishes its final guidance for conducting a children’s risk assessment, businesses will have three months to conduct and document it. With these directions expected sometime in April 2025, the filing due date will likely be in July 2025.

While final guidance is not yet available, Ofcom has outlined a four-step risk assessment process that businesses should familiarize themselves with. Currently, this process includes the following steps:

  1. Understand the harms: Identify the kinds of content harmful to children that need to be assessed, taking into account Ofcom’s risk factors, including specific product features your service includes.

  2. Assess the risk of harm to children: Consider any other characteristics of your service that may increase or decrease risks of harm; assess the likelihood and impact of each kind of content harmful to children; consider how the design of your service may affect how much children use your service and the impact this could have on the level of risk of harm. Also, assess the risk of cumulative harm and assign a risk level for each kind of content.

  3. Decide measures, implement, and record: Determine which children’s online safety measures (e.g. highly effective age assurance) will reduce the risk of harm for children while they use your service. Then, implement the changes and record the outcomes of your risk assessment.

  4. Report, review, and update risk assessments: File your risk assessment with Ofcom and monitor its effectiveness, reviewing and updating your risk assessment periodically. 

Although these measures may change somewhat with final guidance, it is a good idea for businesses to begin thinking about each step and understanding how they will comply once final directions are available. 

What does the Ofcom Online Safety Act mean for your business?

With rapidly approaching deadlines for each type of Online Safety Act assessment, online businesses don’t have much time left to figure out how they will comply with the law and ensure child and youth safety online. 

In completing the children’s access assessment, you will need to have a plan by April 16 for how you’ll determine whether children are accessing your service. Age verification can be an instrumental tool in determining whether a user is a child. Any service likely to be accessed by children will then need to complete the children’s risk assessment by July 2025 (assuming Ofcom publishes the children’s risk assessment guidance in April 2025). 

What happens if you don’t comply? Ofcom has been empowered to deploy a number of enforcement powers, including:

  • Issuing fines of up to 10% of turnover or £18m (whichever is greater)

  • Applying to a court to block a site in the UK (in the most egregious cases)
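To make the fine structure concrete, the maximum penalty works out to whichever is greater: 10% of turnover or £18 million. A quick arithmetic sketch (with purely illustrative figures):

```python
# Arithmetic sketch of the maximum fine described above:
# up to 10% of turnover or 18 million GBP, whichever is greater.

def max_osa_fine_gbp(annual_turnover_gbp: float) -> float:
    """Upper bound of an Ofcom fine under the Online Safety Act."""
    return max(0.10 * annual_turnover_gbp, 18_000_000)

# A business with 50m GBP turnover: 10% is 5m, so the 18m floor applies.
print(max_osa_fine_gbp(50_000_000))     # 18000000.0
# A business with 1bn GBP turnover: 10% is 100m, exceeding the floor.
print(max_osa_fine_gbp(1_000_000_000))  # 100000000.0
```

In other words, the £18m figure acts as a floor on the maximum penalty: it only binds for businesses whose turnover is below £180m.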

How Persona can help you comply with online child safety requirements in the UK

For many businesses subject to the requirements of the Online Safety Act, age assurance will play a big role in compliance. What’s more, companies will have unique considerations for their age assurance approach, meaning a cookie-cutter solution won’t suit their needs. 

But that doesn’t mean organizations are left to figure it out alone. At Persona, we partner with customers to understand their compliance and business needs. Our flexible suite of identity tools empowers them to design a bespoke age assurance program that aligns with their goals. Take advantage of:

  • Multiple age assurance options evaluated by regulators and independent testing bodies, including government ID verification, database verification, selfie verification, and a combined approach

  • Robust privacy controls, including automatic censoring, redaction, retention policies, and audit log creation

  • Dynamic Flow, which allows you to tailor age assurance flows to individual users in real time based on a number of factors, including geography, age thresholds, risk levels, and more

  • Safe and secure data collection, built around industry best practices, including encryption via HTTPS for web traffic and AES-256 encryption for stored data (with decryption keys stored on separate hosts and rotated on a regular basis)

  • Written record safekeeping, enabling simplified, durable, and immutable records so consent logs are retained and accessible for any inspection or audit

Ready to learn more about how Persona can help you get and stay compliant with the UK’s Online Safety Act and other regulations around the world? Contact us to get a custom demo today.

The information provided is not intended to constitute legal advice; all information provided is for general informational purposes only and may not constitute the most up-to-date information. Any links to other third-party websites are only for the convenience of the reader.
FAQs

The UK’s Online Safety Act establishes 17 categories of priority offenses, also called illegal harms. Online businesses regulated by the law are required to take a number of steps to identify, mitigate, and remove this content if it appears on their platforms:

  • Terrorism

  • Harassment, stalking, threats, and abuse offenses

  • Coercive and controlling behavior

  • Hate offenses

  • Intimate image abuse

  • Extreme pornography

  • Child sexual exploitation and abuse

  • Sexual exploitation of adults

  • Unlawful immigration

  • Human trafficking

  • Fraud and financial offenses

  • Proceeds of crime

  • Assisting or encouraging suicide

  • Drugs and psychoactive substances

  • Weapons offenses (knives, firearms, and other weapons)

  • Foreign interference

  • Animal welfare

To comply with Ofcom’s requirements for the Online Safety Act, businesses are required to complete an illegal harms risk assessment, which determines how likely it is that users could encounter illegal content on their platforms. For user-to-user services, the assessment must also determine how the platform itself might be used to facilitate or commit certain criminal offenses, such as distributing or sharing illicit materials. 

To facilitate compliance, Ofcom has published a full report detailing what an illegal harms risk assessment is and the steps involved in it. Your process should include four main steps:

  1. Understanding the kinds of illegal content that need to be assessed

  2. Assessing the risk of harm

  3. Deciding measures, implementing, and recording

  4. Reporting, reviewing, and updating

The illegal harms risk assessment was due March 16, 2025. For businesses that become subject to the OSA in the future — for example, a new online service that begins operating in the UK — it is expected that such a risk assessment will need to be completed at that time. 

Kerwell Liao
Kerwell is a product marketing manager focused on Persona’s identity verification solutions. He enjoys watching basketball and exploring the world with his German Shepherd.