Online Safety Act: Everything you need to know

The Online Safety Act recently became law in the United Kingdom. Learn more about what that means for online businesses operating in the UK.

Last updated: 6/6/2024

⚡ Key takeaways
  • The Online Safety Act, formerly the Online Safety Bill, became law in the UK on October 26, 2023, subject to phased implementation of regulations.
  • The law requires regulated online businesses to remove illegal content, prevent children from accessing age-restricted or harmful content, and give users more control over the types of content and users they see and interact with.
  • Businesses subject to the law, both user-to-user and search services, must consider age and identity verification measures.

As people of all ages increasingly live their lives online, governments around the world are focused on establishing regulations to protect children and teens using the internet. 

In some countries, like the U.S., a patchwork of state laws has emerged. In others, like the United Kingdom, national regulation has become the answer. Case in point: the UK’s recently passed Online Safety Act, which was written with the goal of protecting children and teens from harmful content online.

Below, we take a closer look at what the Online Safety Act is and the types of businesses it regulates. We also discuss the law’s key requirements and steps that businesses can take to become and remain compliant. 

What is the Online Safety Act?

The Online Safety Act is a UK law that requires social media companies, messaging apps, search engines, and other digital platforms to implement a number of measures designed to “keep the internet safe for children” and adults. 

The law, which has been described as “sprawling,” consists of more than 200 clauses outlining various types of illegal or harmful content, what is expected of regulated companies, the consequences of noncompliance, and more. It impacts more than 100,000 online companies and platforms used by individuals in the UK.

The law requires regulated companies to:

  • Scan for illegal content and remove it from their platform. This includes content related to terrorism, hate speech, self-harm, child sexual abuse, and revenge pornography. 
  • Prevent children from accessing content that is legal but considered harmful. This includes content that glorifies or encourages eating disorders, as well as content that provides instructions for self-harm and suicide. Content tied to bullying or extreme violence also falls under this umbrella.
  • Prevent children from accessing age-restricted content, such as pornography.
  • Implement age-assurance measures and enforce age limits.
  • Conduct and publish assessments about the risks posed to children by their platforms.
  • Provide parents and children with a mechanism to report problems when they are encountered. 

The law also makes it easier for adults to control the type of content and users that they see or interact with online. It does this by requiring regulated companies to:

  • Enforce the promises they make to users in their terms and conditions agreement 
  • Allow users to filter out potentially harmful content they don’t wish to see, such as content involving bullying, violence, or self-harm
  • Allow verified users to interact only with other verified users if they wish to do so

Which businesses does the Online Safety Act affect?

The Online Safety Act applies to online companies offering two types of services: user-to-user services and search services.

User-to-user services: If a platform allows for user-generated content that can be shared with or discovered by another user on the platform, it falls under the scope of the law. Examples of user-to-user services include social media companies, online dating services, forums, image/message boards, video-sharing services, online and mobile gaming providers, pornography sites, and some messaging apps. 

Search services: If an online business is a search engine, or includes search functionality, it is considered a search service under the law. However, the definitions for what counts as a search engine subject to the law are complex. According to the text of the Act, any search engine that “includes a service or functionality which enables a person to search some websites or databases (as well as a service or functionality which enables a person to search (in principle) all websites or databases)” is subject to the law. But search engines that “enable a person to search just one website or database” are not subject to the law.

Importantly, the Online Safety Act does not just apply to businesses based in the UK. Any online business accessible to UK users is subject to the law. 

When will guidelines for the Online Safety Act be available?

The Online Safety Act officially became law on October 26, 2023, after it received Royal Assent. 

Ofcom, the country’s communications regulator responsible for enforcing the law and providing clarity on its provisions, has split the guidelines and enforcement process into three phases:

  • Phase 1: Ofcom published draft guidelines, or “consultations,” on compliance with the law’s illegal harms duties on November 9, 2023. Public responses were accepted until February 2024. Ofcom is spending the remainder of 2024 reviewing comments and plans to publish final decisions by the end of 2024, subject to final approval of the legal codes by Parliament. Impacted companies will then be expected to complete illegal harms risk assessments in response.
  • Phase 2: Ofcom published two sets of draft guidelines on protecting children. The first, covering sites that host pornographic content and including guidance on age verification, was published in December 2023. Public responses were accepted until March 2024, and final guidelines are expected in early 2025. These will be followed by draft guidelines, with their own response and review period, on specific protections for women and girls. The second set, general guidelines and codes to protect children, was published on May 9, 2024, with public responses due by July 17, 2024. Final guidelines are expected in the third quarter of 2025, subject to approval of the codes by Parliament.
  • Phase 3: Ofcom is reviewing additional duties for specific categories of services, such as how regulated companies must deploy user empowerment measures and release transparency reports. The process kicked off in March 2024 with a call for evidence from industry and expert groups to help inform the process, a window that closed in May 2024. Draft guidelines and codes will be published in 2025. Additionally, research and advice on the categorization of services were published for the government in March 2024, subject to approval.

User verification under the Online Safety Act

In order to comply with the Online Safety Act’s various requirements, businesses must implement processes for verifying a user’s age and identity. 

Age verification

In the UK, data privacy laws require that users must be at least 13 years old in order to join a social media platform without parental permission. The Online Safety Act builds on these protections. 

In order to ensure that children are not accessing inappropriate content on their platforms, as defined by the law, online businesses must implement a process for estimating or verifying a user’s age.

Platforms that host age-restricted or harmful content have a slightly different mandate: they need only verify that users are at least 18 years old. This requirement may be tricky for social media platforms and forums that are not exclusively used for pornography but host such content alongside all-ages content (such as X or Reddit).

The Online Safety Act, as written, does not specify which age assurance methods are acceptable or what they might look like. Ofcom is responsible for developing these guidelines. The draft guidance published in May 2024 defines “highly effective” age assurance processes as technically accurate, robust, reliable, and fair. Acceptable methods include photo-ID matching, facial age estimation, and reusable digital identity services. Methods such as self-declaration of age and general contractual age restrictions are considered ineffective.
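
To make this concrete, below is a minimal TypeScript sketch of an age-assurance gate under the draft guidance. The method names mirror the guidance, but the AgeCheckResult shape and the confidence threshold are assumptions for illustration, not anything specified by the Act or Ofcom.

```typescript
// A minimal sketch of an age-assurance gate. The AgeCheckResult shape and
// the confidence threshold are assumptions, not part of the Act or guidance.

type AgeAssuranceMethod =
  | "photo_id_matching"
  | "facial_age_estimation"
  | "reusable_digital_id"
  | "self_declaration";

interface AgeCheckResult {
  method: AgeAssuranceMethod;
  estimatedAge: number;
  confidence: number; // 0..1, as reported by a (hypothetical) provider
}

// Methods the May 2024 draft guidance treats as capable of being
// "highly effective"; self-declaration is considered ineffective.
const HIGHLY_EFFECTIVE = new Set<AgeAssuranceMethod>([
  "photo_id_matching",
  "facial_age_estimation",
  "reusable_digital_id",
]);

function mayAccessAdultContent(
  check: AgeCheckResult,
  minConfidence = 0.9, // illustrative threshold, not specified by Ofcom
): boolean {
  // Ineffective methods (e.g., self-declaration) never unlock 18+ content.
  if (!HIGHLY_EFFECTIVE.has(check.method)) return false;
  return check.estimatedAge >= 18 && check.confidence >= minConfidence;
}
```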

Adult user verification

Any business considered to be a Category 1 service under the law (a designation covering the largest user-to-user services) must offer adult users the option of verifying their identity. Verified users must then be given the option to filter out any non-verified users if they wish to do so. 

While the law does not specify what this filtering process should look like, it does specify that it should have the effect of:

  • Preventing non-verified users from interacting with content shared, uploaded, or generated by verified users who have filtered out non-verified users
  • Reducing the likelihood that the verified user will encounter content shared, uploaded, or generated by non-verified users
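
As a rough illustration of those two effects, here is a minimal TypeScript sketch, assuming hypothetical User and Post shapes; the law leaves the actual mechanics to each platform.

```typescript
// A minimal sketch of the verified-user filter. User and Post are
// hypothetical shapes; only the two filtering effects come from the law.

interface User {
  id: string;
  isVerified: boolean;
  filterNonVerified: boolean; // opt-in setting available to verified users
}

interface Post {
  id: string;
  author: User;
}

// Effect 1: non-verified users cannot interact with content from verified
// users who have chosen to filter them out.
function canInteract(actor: User, post: Post): boolean {
  const authorOptedIn = post.author.isVerified && post.author.filterNonVerified;
  return actor.isVerified || !authorOptedIn;
}

// Effect 2: a verified user who opts in is less likely to encounter content
// from non-verified users; here it is simply excluded from their feed.
function buildFeed(viewer: User, candidates: Post[]): Post[] {
  if (viewer.isVerified && viewer.filterNonVerified) {
    return candidates.filter((post) => post.author.isVerified);
  }
  return candidates;
}
```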

As with the age verification requirements discussed above, Ofcom is preparing guidance on acceptable or recommended forms of identity verification.

Preparing for the law

Though Ofcom has not yet finalized guidance as to which processes will be acceptable or recommended for age and adult user verification, online platforms should begin planning for compliance and evaluating potential solutions. Options may include government ID verification, selfie verification, database verification, and other methods. 
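
While the guidance is pending, one way to hedge is to keep each verification method behind a common interface so methods can be swapped or combined once Ofcom’s codes are finalized. The sketch below is a minimal TypeScript example with hypothetical adapters; real ones would call a vendor’s SDK or API.

```typescript
// A minimal sketch of provider-agnostic verification. The adapter objects
// are hypothetical placeholders, not real vendor integrations.

interface VerificationOutcome {
  verified: boolean;
  method: string;
}

interface VerificationAdapter {
  method: string;
  verify(userId: string): Promise<VerificationOutcome>;
}

// Placeholder adapters; a real implementation would call out to a provider.
const governmentIdAdapter: VerificationAdapter = {
  method: "government_id",
  verify: async (_userId) => ({ verified: true, method: "government_id" }),
};

const selfieAdapter: VerificationAdapter = {
  method: "selfie",
  verify: async (_userId) => ({ verified: true, method: "selfie" }),
};

// Try each configured method in order until one succeeds, so methods can
// be reordered or replaced as regulatory guidance firms up.
async function verifyUser(
  userId: string,
  adapters: VerificationAdapter[] = [governmentIdAdapter, selfieAdapter],
): Promise<VerificationOutcome> {
  for (const adapter of adapters) {
    const outcome = await adapter.verify(userId);
    if (outcome.verified) return outcome;
  }
  return { verified: false, method: "none" };
}
```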

Want to learn more about how Persona can help you become and remain compliant with the Online Safety Act and other emerging laws? Get a custom demo today.

Published on: 11/13/2023

Frequently asked questions

Who is responsible for enforcing the Online Safety Act?

The Online Safety Act is enforced by the Office of Communications, or Ofcom, which is responsible for overseeing all regulated communication services, including television, radio, broadband, and home and mobile phone services. The law also grants the UK’s Secretary of State a number of powers to direct Ofcom in its enforcement duties. 

What are the penalties for not complying with the Online Safety Act?

If online platforms fail to comply with the Online Safety Act’s requirements, they may be subject to significant fines from Ofcom. These fines can total £18 million (roughly $22 million) or 10% of global annual revenue, whichever is higher. For the largest platforms, fines could run into the billions of pounds. 

Additionally, executives and senior management may be held criminally liable if they are notified by Ofcom about child sexual exploitation and abuse content on their platforms and fail to remove it as required under the law. Failure to answer information requests from Ofcom may also result in prosecution, with the possibility of jail time. 

What is the difference between the Online Safety Act and the Digital Services Act?

The Online Safety Act is a UK law primarily affecting social media platforms, forums, message boards, search engines, and certain other online businesses. It has many requirements, particularly around shielding users from content that is illegal, age-restricted, potentially harmful, or simply unwanted by the user. Any business with services accessible by UK users is subject to the law, regardless of where the business is based. 

The Digital Services Act is a somewhat comparable law in the European Union. Like the Online Safety Act, its requirements apply to search engines, social media platforms, and other services that allow for content sharing. But it also applies to online marketplaces. Many of its requirements are similar to those of the Online Safety Act. This includes scanning for and removing illegal or harmful content, giving users a means of reporting such content, and more.
