Online Safety Act: illegal harms codes published today
Ofcom has today published its illegal harms statement, which comprises the final illegal harms risk assessment guidance and the final codes of practice for user-to-user services and search, setting out the measures regulated services will need to take to comply with their illegal harms duties under the Online Safety Act 2023. Alongside this, it has published its register of risks, its illegal content judgements guidance, the regime’s final record-keeping and review guidance, and the final enforcement guidance. All of these documents were initially published in draft for consultation last November.
Also published today is a new consultation on Ofcom’s power to issue a Technology Notice under section 121 of the OSA, which requires a Part 3 provider to use or develop accredited technology to deal with child sexual exploitation and abuse (CSEA) and/or terrorism content.
This is a significant milestone in the implementation of the Online Safety Act, firing the starting gun for regulated services’ compliance with the illegal harms duties and commencing Ofcom’s powers to enforce those duties. We explain more about this below. The Secretary of State, Peter Kyle, has published a Written Ministerial Statement confirming that the Codes have been laid in Parliament, along with regulations setting the categorisation thresholds under the OSA.
As Ofcom’s Chief Executive, Melanie Dawes, says in the press release: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today. The safety spotlight is now firmly on tech firms and it’s time for them to act.”
What do these documents say?
Detailed analysis of the nearly 2,500 pages will take a while, though the regulator has provided a summary and a number of overview documents to help navigate them. However, Ofcom’s press release focuses on the following key issues:
- Senior accountability for safety.
- Better moderation, easier reporting and built-in safety tests.
- Protecting children from sexual abuse and exploitation online, including tackling pathways to online grooming.
Our focus, along with that of our civil society partners, will very much be on the areas where we had called for changes to Ofcom’s approach: our joint statement on the consultation proposals is here. We will publish more detail on this in the New Year.
Ofcom have already confirmed that some of the issues raised in responses to their consultation will be addressed in the next iteration of the codes, on which they are due to consult next Spring (Overview: pp 2-3). These proposals will include:
- Banning the accounts of those found to have shared CSAM;
- Crisis response protocols for emergency events (such as the riots in August 2024);
- Use of hash matching to prevent the sharing of non-consensual intimate imagery and terrorist content; and
- Tackling illegal harms including CSAM through the use of AI.
While this additional consultation is welcome, the updated codes will take considerable time to become enforceable. An iterative approach shouldn’t come at the expense of user safety, so we would hope that Ofcom will consider how to streamline and expedite its processes going forward.
What happens now?
The most significant consequence is that, as of today, services in scope of the Act need to start their risk assessments for illegal content on their services. (Ofcom has provided an online tool for services to check if they are in scope, and will provide a further tool early next year to help companies comply.)
“Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of 16 March 2025. Subject to the Codes completing the Parliamentary process, from 17 March 2025, providers will need to take the safety measures set out in the Codes or use other effective measures to protect users from illegal content and activity. We are ready to take enforcement action if providers do not act promptly to address the risks on their services.” (Overview: page 1)
The Codes of Practice are laid in Parliament for 40 days and are subject to the negative procedure, meaning that there is no debate or vote on them unless a motion to annul is tabled; once that period has elapsed, there is a further 21-day period before they come into force, hence Ofcom’s assessment that they will become enforceable in mid-March. As they said in their roadmap, “at this point we can begin investigations and impose sanctions if we find that services are not compliant with these duties.”
There is - as has become strikingly evident in the timing above - a disconnect between providers completing their risk assessments (before the codes come into force) and then only being required to take the measures that Ofcom has set out in the codes in order to comply with the Act. In our response to the illegal harms consultation, we called for the codes to include a requirement for providers to take steps to mitigate *all the risks* identified in their risk assessments.
It does not appear from our initial skim through the headlines that Ofcom has addressed this. They have, however, according to their summary document, “removed some measures from smaller low risk services, where the evidence we received suggested they were not proportionate”. This is a concern given that their advice on categorisation - which the Secretary of State has confirmed he is accepting, as per the WMS above - does not include small (albeit high-risk) platforms within category 1 either, a decision they justified on the basis that all small platforms would be caught by the illegal harms duties anyway. Which measures have been removed is not clear at first glance.
What else can we expect in 2025?
This is just the start of what Ofcom is calling “a year of action” in 2025. We can expect to see the following in the next few months:
- January: Part 5 duties on pornography providers expected to come into force
- January: publication of final guidance on child access assessment duties
- February: consultation on the draft guidance for protection of women and girls
- April: publication of final protection of children statement, children’s risk assessment guidance, code of practice and related guidance
- April: advice to the Secretary of State on Qualifying Worldwide Revenue for the fee regime
- Spring: further consultation on illegal harms codes
- Spring: consultation on automated content detection tools
- Spring: final transparency guidance published by Ofcom
More detail on the proposed publications, subsequent consultations and activities is available in Ofcom’s roadmap. They have also published a handy timetable setting out the key dates for compliance across all the duties.
The recent draft Statement of Strategic Priorities from the DSIT Secretary of State, Peter Kyle MP, suggests further activities and areas for prioritisation by the regulator, to which Ofcom must “have regard” in delivering its online safety duties. (See our initial commentary on that here.) We can expect a response from Ofcom (as a consultee) in January and, once the final version of the statement is published later in the Spring, a formal response from the regulator setting out how they intend to incorporate the Secretary of State’s ambitions into the work that is already underway.