
Online Safety Act: Everything you need to know

The Online Safety Act recently became law in the United Kingdom. Learn more about what that means for online businesses operating in the UK.

⚡ Key takeaways
  • The Online Safety Act, formerly the Online Safety Bill, became law in the UK on October 26, 2023, and is expected to take effect within two months, subject to phased implementation of its regulations. 
  • The law requires regulated online businesses to remove illegal content, prevent children from accessing age-restricted or harmful content, and give users more control over the types of content and users they see and interact with.
  • In order to comply with these requirements, businesses subject to the law should begin thinking about age and identity verification measures.

As people of all ages increasingly live their lives online, governments around the world are focused on establishing regulations to protect children and teens using the internet. 

In some countries, like the U.S., a patchwork of state laws has emerged. In other countries, like the United Kingdom, national regulation has become the goal. Case in point: the UK’s recently passed Online Safety Act, which was written with the goal of protecting children and teens from harmful content online.

Below, we take a closer look at what the Online Safety Act is, when it goes into effect, and the types of businesses it regulates. We also discuss the law’s key requirements and steps that businesses can take to become and remain compliant. 

What is the Online Safety Act?

The Online Safety Act is a UK law that requires social media companies, messaging apps, search engines, and other digital platforms to implement a number of measures designed to “keep the internet safe for children” and adults. 

The law, which has been described as “sprawling,” consists of more than 200 clauses outlining various types of illegal or harmful content, what is expected of regulated companies, the consequences of noncompliance, and more. 

The law requires regulated companies to:

  • Scan for illegal content and remove it from their platform. This includes content related to terrorism, hate speech, self-harm, child sexual abuse, and revenge pornography. 
  • Prevent children from accessing content that is legal but considered harmful. This includes content that glorifies or encourages eating disorders, as well as content that provides instructions for self-harm and suicide. Content tied to bullying or extreme violence also falls under this umbrella.
  • Prevent children from accessing age-restricted content, such as pornography.
  • Implement age-assurance measures and enforce age limits.
  • Conduct and publish assessments about the risks posed to children by their platforms.
  • Provide parents and children with a mechanism to report problems when they are encountered. 

The law also makes it easier for adults to control the type of content and users that they see or interact with online. It does this by requiring regulated companies to:

  • Enforce the promises they make to users in their terms and conditions agreement 
  • Allow users to filter out potentially harmful content they don’t wish to see, such as content involving bullying, violence, or self-harm
  • Allow verified users to interact only with other verified users if they wish to do so

Which businesses does the Online Safety Act affect?

The Online Safety Act applies to online companies offering two types of services: user-to-user services and search services.

User-to-user services: If a platform allows for user-generated content that can be shared with or discovered by another user on the platform, it falls under the scope of the law. Examples of user-to-user services include social media companies, online dating services, forums, image/message boards, video-sharing services, online and mobile gaming providers, pornography sites, and some messaging apps. 

Search services: If an online business is a search engine, or includes search functionality, it is considered a search service under the law. However, the definitions for what counts as a search engine subject to the law are complex. According to the text of the Act, any search engine that “includes a service or functionality which enables a person to search some websites or databases (as well as a service or functionality which enables a person to search (in principle) all websites or databases)” is subject to the law. But search engines that “enable a person to search just one website or database” are not subject to the law.

Importantly, the Online Safety Act does not just apply to businesses based in the UK. Any online business that is accessible to UK users is subject to the law, regardless of where it is based. 

When does the Online Safety Act go into effect?

The Online Safety Act officially became law on October 26, 2023, after it received Royal Assent. According to a press release published by the UK government, the law’s requirements will be implemented with a phased approach. 

Ofcom, the regulator responsible for enforcing the law, has broken enforcement out into three phases:

  • Phase 1: Ofcom will publish draft guidelines on compliance with the law’s requirements around harmful content on November 9, 2023 and plans to publish a statement on their final decisions in fall 2024, subject to final approval by the government.
  • Phase 2: Ofcom will publish draft guidance for sites that host pornographic content, including guidance on age verification, in December 2023. Additional draft codes of practice related to the protection of children will be released in the spring of 2024.
  • Phase 3: Ofcom will publish guidelines around additional duties for specific categories of services, such as how regulated companies must deploy user empowerment measures and release transparency reports, in spring 2024.  

User verification under the Online Safety Act

In order to comply with the Online Safety Act’s various requirements, businesses must implement processes for verifying a user’s age and identity. 

Age verification

In the UK, data privacy laws require that users be at least 13 years old to join a social media platform without parental permission. The Online Safety Act builds on these protections: to ensure that children are not accessing inappropriate content on their platforms, as defined by the law, online businesses must implement a process for estimating or verifying each user’s age. The law does not currently specify which estimation or verification methods are acceptable or what they might look like.

The same is true for platforms that allow users to access harmful or age-restricted content, except the threshold is 18 rather than 13. This requirement may be tricky for social media platforms and forums that are not exclusively pornography sites but host pornographic content alongside other material (such as X or Reddit). 

Ofcom has not yet provided specific guidance on what age verification or estimation processes should look like, but the office is expected to publish recommendations in the coming months. 
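Since Ofcom has not yet said what age assurance must look like, the logic below is only an illustrative sketch of the two thresholds the article describes: under-13s need parental permission to join a social platform, and age-restricted or harmful content is 18+. The function names, the `User` structure, and the idea of a single "verified age" field are all assumptions for illustration, not part of the Act or any Ofcom guidance.

```python
from dataclasses import dataclass

# Thresholds described in the article; not an official implementation.
SOCIAL_MINIMUM_AGE = 13
ADULT_CONTENT_MINIMUM_AGE = 18

@dataclass
class User:
    verified_age: int               # age as estimated or verified by the platform
    has_parental_permission: bool = False

def may_join_platform(user: User) -> bool:
    """Under-13s may only join a social platform with parental permission."""
    return user.verified_age >= SOCIAL_MINIMUM_AGE or user.has_parental_permission

def may_view_adult_content(user: User) -> bool:
    """Age-restricted or legal-but-harmful content requires users to be 18+."""
    return user.verified_age >= ADULT_CONTENT_MINIMUM_AGE
```

In practice, the hard part is not the comparison but populating `verified_age` reliably, which is exactly what Ofcom's forthcoming guidance on estimation and verification methods is expected to address.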

Adult user verification

Any business considered to be a Category 1 service under the law (e.g., user-to-user services) must offer adult users the option of verifying their identity. Verified users must then be given the option to filter out any non-verified users if they wish to do so. 

While the law does not specify what this filtering process should look like, it does specify that it should have the effect of:

  • Preventing non-verified users from interacting with content shared, uploaded, or generated by verified users who have filtered out non-verified users
  • Reducing the likelihood that the verified user will encounter content shared, uploaded, or generated by non-verified users
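The two required effects above can be sketched as a simple visibility check. Everything here (the `Account` and `Post` structures, the field names, hard-hiding rather than down-ranking) is an illustrative assumption; the Act does not prescribe an implementation.

```python
from dataclasses import dataclass

@dataclass
class Account:
    verified: bool                       # identity-verified adult user
    filter_non_verified: bool = False    # opt-in filter, verified users only

@dataclass
class Post:
    author: Account

def may_interact(viewer: Account, post: Post) -> bool:
    """Effect 1: non-verified users cannot interact with content from
    verified users who have opted to filter out non-verified users."""
    if not viewer.verified and post.author.filter_non_verified:
        return False
    return True

def should_show(viewer: Account, post: Post) -> bool:
    """Effect 2: verified users who opt in are less likely to encounter
    content from non-verified users. A real system might down-rank such
    content; this sketch simply hides it."""
    if viewer.filter_non_verified and not post.author.verified:
        return False
    return True
```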

As with the age verification requirements discussed above, Ofcom has not yet provided guidance for acceptable or recommended forms of identity verification.

Preparing for the law

Though Ofcom has not yet released guidance as to which processes will be acceptable or recommended for age and adult user verification, online platforms should begin planning for compliance and evaluating potential solutions. Options may include government ID verification, selfie verification, database verification, and other methods. 

Want to learn more about how Persona can help you become and remain compliant with the Online Safety Act and other emerging laws? Get a custom demo today.

Frequently asked questions

Who is responsible for enforcing the Online Safety Act?

The Online Safety Act is enforced by the Office of Communications, or Ofcom, which is responsible for overseeing all regulated communication services, including television, radio, broadband, and home and mobile phone services. The law also grants the UK’s Secretary of State a number of powers to direct Ofcom in its enforcement duties. 

What are the penalties for not complying with the Online Safety Act?

If online platforms fail to comply with the Online Safety Act’s requirements, they may be subject to significant fines by Ofcom. These fines can total £18 million (roughly $22 million) or 10% of their global annual revenue, whichever is higher. Fines for the largest platforms could run into the billions of pounds. 
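The "whichever is higher" rule means the flat £18 million figure only matters for businesses with less than £180 million in global annual revenue. A minimal sketch of the calculation, using the figures from this article:

```python
FLAT_MAXIMUM_GBP = 18_000_000   # flat maximum fine under the Act
REVENUE_FRACTION = 0.10         # alternative cap: 10% of global annual revenue

def maximum_fine_gbp(global_annual_revenue_gbp: float) -> float:
    """Maximum fine: GBP 18 million or 10% of global annual revenue,
    whichever is higher (illustrative; actual fines are set by Ofcom)."""
    return max(FLAT_MAXIMUM_GBP, REVENUE_FRACTION * global_annual_revenue_gbp)
```

For example, a platform with £100 million in global annual revenue would face a maximum fine of £18 million (since 10% of revenue is only £10 million), while a platform with £10 billion in revenue would face a maximum fine of £1 billion.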

Additionally, executives and senior management may be held criminally liable if they are notified by Ofcom about instances of child exploitation and sexual abuse and fail to remove such content as required under the law. Failure to answer information requests from Ofcom may also result in prosecution, with the possibility of jail time. 

What is the difference between the Online Safety Act and the Digital Services Act?

The Online Safety Act is a UK law primarily affecting social media platforms, forums, message boards, search engines, and certain other online businesses. It has many requirements, particularly around shielding children and adult users from content that is illegal, age-restricted, potentially harmful, or simply unwanted by the user. Any business with services accessible by UK users is subject to the law, regardless of where the business is based. 

On the other hand, the Digital Services Act is a somewhat comparable law in the European Union. Like the Online Safety Act, its requirements apply to search engines, social media platforms, and other services that allow for content sharing. But it also applies to online marketplaces. Many of its requirements are similar to those of the Online Safety Act. This includes scanning for and removing illegal or harmful content, giving users a means of reporting such content, and more.

