Online Safety Act Network

Commentary

Manifesto watch: the headlines

With the UK General Election just three weeks away, the three main parties have now published their manifestos and we’ve produced a comparison table showing how their pledges on online safety and wider policy and regulatory initiatives stack up. Both Labour and the Conservatives are promising further measures to build on the Online Safety Act, but neither has gone down the route of promising bans on smartphone use by children. The Conservatives have, however, promised to put their guidance on banning mobile phones in schools on a statutory footing, to provide schools with funding to implement it, and to consult on further measures to protect children online.

Categorisation of services in the Online Safety Act

Issue: Ofcom has recently published its advice to the Secretary of State on thresholds for categorisation, along with a call for evidence to inform the implementation of the related duties. The categorisation of services under the Online Safety Act determines which of the regulated user-to-user or search services will be subject to additional duties. These include user empowerment duties and additional responsibilities relating to terms of service - duties which are the only remaining routes to providing additional safety protections for adults since the Government’s decision to remove the wider adult safety provisions from the Online Safety Bill in autumn 2022.

OSA Network statement on illegal harms consultation

The Online Safety Act Network has today (20 February) released a statement co-signed by 22 civil society organisations, campaigners and experts setting out a number of concerns about the proposals in Ofcom’s first consultation on the new Online Safety Act regime, which closes this Friday. The statement, which has the backing of prominent charities spanning children’s rights (Barnardo’s, 5Rights Foundation), suicide prevention (Samaritans, Molly Rose Foundation) and anti-racism and abuse campaigners (Antisemitism Policy Trust, Glitch, Kick It Out, End Violence Against Women Coalition), details concerns about the approach Ofcom has chosen to take in its first draft codes of practice - on illegal harms - and the impact these choices will have on user safety.

Ofcom's illegal content judgements guidance

Issue: There are a number of concerns in relation to the definition of illegal content and the Illegal Content Judgements Guidance (Annex 10) proposed by Ofcom in its consultation on the Online Safety Act illegal harms duties, which in effect define the scope of the regime relating to illegal content:

- how Ofcom’s approach fits with “safety by design” principles;
- the degree to which the Guidance focuses on identification of criminal conduct rather than content associated with a criminal offence;
- the standard of proof, which should be civil rather than criminal, given that the regime is regulatory; and
- the impact on the protection of human rights.

Ofcom's approach to human rights in the illegal harms consultation

Issue: The Online Safety Act directs Ofcom to consider freedom of expression (Article 10 ECHR) and privacy (Article 8 ECHR), but these are not the only relevant rights - as indeed Ofcom notes. All the rights protected by the European Convention on Human Rights should be taken into account when assessing the impact of the regime - or the lack of it.

Media literacy by design: a response to Ofcom's consultation

Alongside the hefty consultations launched since the Online Safety Act achieved Royal Assent, Ofcom have also recently been asking for views on proposed principles for “media literacy by design”. Under the Communications Act 2003, the regulator has an ongoing duty to promote media literacy. This consultation sets out some media literacy design principles to guide the interventions that online platforms might make to “help internet users engage with online services critically, safely and effectively”.

The OSA regime and the case for "governance by design"

Recent discussions with members of the OSA Network have focused on the approach to risk management being proposed by Ofcom in its consultation on the Online Safety Act illegal harms duties. Volume 3 of the suite of Ofcom documents covers this topic, including some initial proposals and evidence on governance and accountability. Governance structures, along with robust risk assessment processes, are fundamental to influencing product design choices with a view to reducing the risk of harm.

The Online Safety Act: the next chapter(s)

On 9 November, Ofcom published its illegal harms consultation - 1700 pages of it - the first of three main phases of consultations to get the Online Safety Act regime up and running. Inevitably, the length of the consultation has provoked much commentary and some angst amongst those with an interest in this agenda. It is - undoubtedly - long. It will require time-consuming and detailed analysis by those who wish to respond to it.

Welcome to the Online Safety Act Network

The Online Safety Act received Royal Assent last week (Thursday 26 October) and with that Ofcom picked up the reins from the Department for Science, Innovation and Technology and this new network was born. The aims of the Online Safety Act Network are simple: building on the work of Carnegie UK, we hope to help coordinate and support civil society engagement to secure the effective implementation of the Act. There is much to do on that score.

Bringing small high-harm platforms into the online safety regime: how one word changed the game

The online safety regime splits in-scope regulated services into categories: Category 1, Category 2A and Category 2B. Category 1 services, the largest user-to-user services with certain ‘functionality’ (defined in cl. 234), receive the toughest risk mitigation duties: having to provide ‘user empowerment tools’ (cls 15-16) and effective terms and conditions in relation to content formerly referred to as content ‘harmful to adults’. Category 1 services are also under obligations regarding fraudulent advertising (as are Category 2A search services), and more detailed obligations generally.