Online Safety Act Network

Commentary

Categorisation of services: next steps

The publication, by Ofcom, of its illegal harms codes may have garnered all the media attention yesterday (16 December) but - as per the Written Ministerial Statement from the Secretary of State for Science, Innovation and Technology - DSIT has also published its response to Ofcom’s advice on categorisation of services and has laid the regulations along with an explanatory memorandum and an impact assessment. The Secretary of State’s decision to accept the advice, which excludes small but risky platforms from category one and therefore exempts them from the strongest possible duties under the Act, goes against strong exhortations from a variety of campaigners - including mental health and suicide prevention charities, as well as campaigners against racist and misogynistic hate and abuse.

Online Safety Act: illegal harms codes published today

Ofcom has today published its illegal harms statement which comprises the final illegal harms risk assessment guidance along with the final codes of practice for user-to-user services and search, setting out the measures regulated services will need to take to comply with their illegal harms duties under the Online Safety Act 2023. Alongside this, it has published its register of risks and its illegal content judgements guidance, along with the regime’s final record keeping and review guidance and the final enforcement guidance. All these products were initially published in draft for consultation last November.

The evidence on children's screentime: the CMOs' 2019 advice

A little bit of historical context for the announcement by the Department for Science, Innovation and Technology that it had launched a new research project “to boost the evidence base” on online harms. “The first stage of the project will examine what methods will best help the government understand the impact of smartphones and social media use on children after a review by the UK Chief Medical Officer in 2019 found the evidence base around the links to children’s mental health was insufficient to provide strong conclusions.”

The OSA and the draft Statement of Strategic Priorities

Peter Kyle, the Secretary of State for the Department for Science, Innovation and Technology, today published his draft Statement of Strategic Priorities for Ofcom, accompanied by a forthright interview in the Telegraph which refers to his statement telling the regulator they need to “look again at whether they are being assertive enough in certain areas”. The press release - which also announces a review into the evidence on smartphone and social media use by children - is here. But what is a Statement of Strategic Priorities (SSP) and why is the Secretary of State issuing it now?

The Online Safety Act: one year on

The Online Safety Act 2023 celebrates its first birthday this week - as does the OSA Network. So it would be remiss of us not to mark the occasion with a review of what’s happened in the past 12 months and to look forward to the year ahead. We could of course talk about the growth of our Network, which now counts over 70 organisations in its membership - from the largest national charities to grassroots organisations, expert academics and individual campaigners - and whose contributions to our regular discussions and support for our collective endeavours are hugely valuable. But we hope they know that already and we’re proud to work alongside them on this issue.

Safety by design: has its time finally come?

Peter Kyle, the Secretary of State for the Department for Science, Innovation and Technology, told the BBC’s Laura Kuenssberg recently that he was going to “close loopholes” in the Online Safety Act and went on to talk about the importance of building safety into the online world in order to ensure the opportunities of tech can be realised. He said that as far as he was aware, the tech sector was the “only sector ... that can release products into society without proving they’re safe before release”; he wanted to take steps “to try and make sure safety is there at the start, not picking up the pieces afterwards”, promising that increasingly the Government would be working with the US and others “to make sure safety is proven before the release of products”.

Ofcom's protection of children consultation: our summary response

Ofcom’s protection of children consultation closed on Wednesday 17 July. In this blog post, we summarise our response to their proposals and reiterate the recommendation we made in response to their previous illegal harms consultation to deliver a more outcome-focused approach to risk mitigation in their draft codes of practice. Our full response is available here.

Background

Ofcom’s protection of children consultation is the second major plank of its implementation of the regulatory regime that it will be enforcing under the Online Safety Act 2023. The first - the illegal harms consultation - closed in February 2024 and Ofcom’s response has not yet been published. The protection of children proposals relate to the Online Safety Act’s child safety duties (section 12) and risk assessment duties (section 11); Ofcom is consulting on two draft codes of practice (for user-to-user services and for search) along with draft guidance for risk assessment and for children’s access assessments.

Manifesto watch: the headlines

With the UK General Election just three weeks away, the three main parties have now published their manifestos and we’ve produced a comparison table on how their pledges on online safety and wider policy and regulatory initiatives stack up. Both Labour and the Conservatives are promising further measures to build on the Online Safety Act but neither has gone down the route of promising bans on smartphone use by children; the Conservatives have, however, promised to put their guidance on banning mobile phones in schools on a statutory footing, to provide funding to schools to implement it, and to consult on further measures to protect children online. Labour have committed to bringing back the provision to allow coroners to access information held by tech companies after a child’s death - a provision which had been included in the now-scrapped Data Protection and Digital Information Bill. The Liberal Democrats propose setting up a new independent advocacy body for children and a new Online Crime Agency to “tackle illegal content and activity online, such as personal fraud, revenge porn and threats and incitement to violence on social media”.

Categorisation of services in the Online Safety Act

A PDF version of this piece is available to download at the bottom of the page.

Issue

Ofcom has recently published its advice to the Secretary of State on thresholds for categorisation, along with a call for evidence to inform the implementation of the related duties. The categorisation of services under the Online Safety Act determines which of the regulated user-to-user or search services will be subject to additional duties. These include user empowerment duties and additional responsibilities relating to terms of service - duties which are the only remaining routes to providing additional safety protections for adults since the Government’s decision to remove the wider adult safety provisions from the Online Safety Bill in autumn 2022.

OSA Network statement on illegal harms consultation

The Online Safety Act Network has today (20 February) released a statement co-signed by 22 civil society organisations, campaigners and experts setting out a number of concerns about the proposals in Ofcom’s first consultation on the new Online Safety Act regime, which closes this Friday. The statement, which has the backing of prominent charities spanning children’s rights (Barnardo’s, 5Rights Foundation), suicide prevention (Samaritans, Molly Rose Foundation) and anti-racism and anti-abuse campaigning (Antisemitism Policy Trust, Glitch, Kick It Out, End Violence Against Women Coalition), details concerns about the approach Ofcom has chosen to take in its first draft codes of practice - on illegal harms - and the impact these choices will have on user safety.