Internet Regulation Updater
16 March 2025 marked a significant milestone in the Online Safety regulatory regime in the UK. To fulfil their legal duties under the Online Safety Act, over 100,000 in-scope services must have completed a suitable and sufficient Illegal Harms Risk Assessment by this date.
With illegal content duties coming into force today alongside publication of Ofcom’s final illegal content codes of practice, companies that fail to comply with these duties risk enforcement action, which could result in fines of up to 10% of their qualifying worldwide revenue or £18 million – whichever is greater.
Our Deloitte multi-disciplinary digital regulation team of lawyers and risk and compliance specialists takes you through the implications of this milestone below.
What is the Illegal Content Risk Assessment?
In December 2024 Ofcom published its final guidance on Illegal Content Risk Assessments, setting out in detail the requirements it expects services to meet. The guidance recommends that services follow a four-step methodology in order to meet the requirements of the Act, assessing the risk to users of 17 kinds of priority illegal content (and other illegal content) separately. Once platforms have assessed the impact and likelihood of the harms from these types of illegal content, they will need to decide whether to implement applicable measures to reduce the risk of harm to users. They will also have to ensure that the outcome of the risk assessment is recorded and reported through appropriate governance channels, and that the recorded evidence on which their assessment is based is retained for at least three years.
The Illegal Content Risk Assessment must also take into account a range of factors including:
- The characteristics of the service including its users, functionalities, algorithmic systems and business model.
- Ofcom’s Risk Profiles.
- How the design and operation of the service, including any existing mitigations, impact the level of risk.
- How the service is used.
Going forward, all services must keep this risk assessment up to date – reviewing it at least every 12 months and carrying out further risk assessments before they make significant changes to existing services, or launch new ones in scope of the Act. A further risk assessment may also be required if Ofcom makes a significant change to a Risk Profile that relates to an in-scope service.
What happens next?
Ongoing Enforcement Programme: Ofcom launched an enforcement programme in early March which will require certain large platforms and high-risk services to submit their completed risk assessments to the regulator by 31 March. The enforcement programme will “monitor if services meet their illegal content risk assessment duties and record keeping duties under the Online Safety Act” and will last for at least 12 months. Ofcom has previously said that it “won’t hesitate to take early action against deliberate or flagrant breaches” by providers who do not comply with their duties.
Codes of Practice in force: The Codes of Practice on illegal harms were formally issued in February and came into force today, 17 March 2025, along with the illegal content safety duties set out in the Act. Providers will need to take the safety measures set out in the Codes (or other alternative measures that meet the safety duties) to protect their users from illegal content from this date. Ofcom has said it will expect service providers to start implementing these measures from today. Those that are ‘relatively quick or simple’ should be ‘completed rapidly’, and Ofcom expects almost all measures recommended in the Codes of Practice to be in place within six months, i.e. by September 2025.
Child Access Assessments: All user-to-user and search services must complete these by 16 April to determine whether their service is likely to be accessed by children. Also in April, Ofcom is expected to publish the final Children’s Risk Assessment guidance. Once published, platforms which are found likely to be accessed by children will have three months to complete a Children’s Risk Assessment, and Ofcom will expect specific platforms to provide these from 31 July 2025.
What should platforms be doing now?
Evidence collection
Platforms should collect and store evidence underpinning the risk assessment in case of a formal request for information (RFI) from Ofcom.
Process for regulator engagement
Failing to respond adequately to RFIs can result in potentially serious penalties. Platforms should ensure systems are in place to respond properly to the regulator and answer RFIs.
Implementing risk mitigants
Now that platforms have a holistic view of online safety risk, the question is how to implement and monitor appropriate mitigants, and how to assess their impact on overall risk scores as part of ongoing risk management. Where legal or compliance teams have performed the risk assessment, there needs to be a mechanism to hand over ownership of the risk to the relevant product teams, and appropriate governance structures should be put in place to review the effectiveness of ongoing risk management.
Preparing for the Children’s Risk Assessment
Although there may be some overlap between the illegal content and children’s risk assessments, these need to be distinct from each other and the requirements are different. Children’s risk assessments cover a wider range of potential harms and require additional consideration of factors such as the impact of that harm on children of different ages. Platforms should start preparing now to meet the July deadline.
Alignment with EU Digital Services Act
Ofcom has been clear that a risk assessment completed for the European Commission’s Digital Services Act is different and is unlikely to meet the OSA requirements unless it is adapted to them. Firms that have reused a DSA assessment for reasons of bandwidth, or that have undertaken the two risk assessments in isolation, should now carry out an alignment and gap assessment to ensure coverage and to simplify the overall risk assessment process for future years.
Your contacts
If you would like to speak to the Deloitte team supporting clients on complying with the Online Safety Act and other evolving global internet regulations, please contact:
Joey Conway, Internet Regulation Partner, Legal Lead
Nick Seeber, Global Internet Regulation Lead Partner
Laurie Gilchrist, Internet Regulation Director
Sian Bundred, Internet Regulation Director
Brij Sharma, Media Regulation Expert
Alana Warbrick, Internet Regulation Legal Associate