Online Safety Act Network

The OSA and the draft Statement of Strategic Priorities


Peter Kyle, the Secretary of State for the Department for Science, Innovation and Technology, today published his draft Statement of Strategic Priorities for Ofcom, accompanied by a forthright interview in the Telegraph in which he describes the statement as telling the regulator to “look again at whether they are being assertive enough in certain areas”. The press release - which also announces a review into the evidence on smartphone and social media use by children - is here.

But what is a Statement of Strategic Priorities (SSP) and why is the Secretary of State issuing it now?

Statements of Strategic Priorities

Statements of Strategic Priorities (or equivalent products) are well-established regulatory tools whereby the Secretary of State with responsibility for a particular regulator can - usually once in each Parliament - establish a broad, outcomes-based framework or set of expectations for that regulator’s work. The then Secretary of State for DCMS issued one for Ofcom in 2019 with regard to its telecoms, radio spectrum and postal obligations. Last week, the Chancellor issued a “letter of recommendation” for the Financial Conduct Authority, setting out the role the financial services regulators play in driving forward the Government’s growth mission: “we must have proportionate, effective regulation that allows firms of all sizes to compete, innovate and grow, creates a stable, attractive environment which encourages businesses to establish and expand in the UK, and adequately protects consumers”, wrote Reeves. The Government has also indicated that it will consult on its statement for the Competition and Markets Authority as part of its Industrial Strategy.

In his foreword to the draft SSP for Ofcom, Kyle writes:

“This is a critical document which outlines government’s areas of focus for online safety and which the independent regulator, Ofcom, must have regard to as it continues its work to implement the Act. … The statement reinforces the outcomes the government believes should be prioritised for creating a safer online environment.”

The OSA framework

Section 172 of the Online Safety Act sets out the role of the SSP within the regulatory regime: “The statement is a statement prepared by the Secretary of State that sets out strategic priorities of His Majesty’s Government in the United Kingdom relating to online safety matters … The statement may, among other things, set out particular outcomes identified with a view to achieving the strategic priorities.” The SSP is intended to last for five years without amendment, unless “a Parliamentary general election has taken place”, or “there has been a significant change in the policy of His Majesty’s Government affecting online safety matters”.

The OSA (section 173) requires that the Secretary of State allow for a 40-day consultation on the draft statement with Ofcom and “such other persons as the Secretary of State considers appropriate”. Once the statement is finalised after this consultation, the Secretary of State must lay the statement before Parliament and wait for a further 40-day period before it can be designated.

Once the statement is designated, section 92 sets out what Ofcom must do next: firstly, they must “have regard to” the SSP “when carrying out their online safety functions”; this is important terminology given the regulator’s independence from Government. A further 40 days after designation, Ofcom have to “explain in writing” what they propose to do as a result of the SSP and then, every 12 months after that for the period covered by the SSP, they have to “publish a review of what they have done during the period in question in consequence of the statement”.

What’s in the statement?

The introductory section to the SSP frames the priorities for Ofcom (and the regulated tech companies) within the context of safety: “Keeping people safe is the first duty of government and it should be the first duty online platforms have towards their users.” It sets out five “ambitious” areas for Ofcom:

  • implementing safety by design to stop more harm occurring in the first place
  • increasing transparency and accountability of online platforms
  • maintaining regulatory agility to keep pace with changing technology and behaviour
  • building an inclusive and resilient online society of well-informed users
  • supporting continued innovation in safety technologies

The priority on safety by design highlights the Government’s desire that “while protections should be strongest for our children, all users should have better protections and feel supported to make choices about what they see”. It emphasises concerns about “the amount of abuse women and girls receive online”, as well as the impact of fraud and the way in which “the proliferation of hateful content online fuelled violence and civil unrest across the UK”.

This, in particular, is welcome:

When we discuss safety by design, we mean that regulated providers should look at all areas of their services and business models, including algorithms and functionalities, when considering how to protect all users online. They should focus not only on managing risks but embedding safety outcomes throughout the design and development of new features and functionalities, and consider how to make existing features safer.

The government believes the goal should be to prevent harm from occurring in the first place, wherever possible.

The SSP then goes on to specifically mention a number of areas that it wants to see work on, reflecting many of the concerns that civil society organisations have raised about Ofcom’s approach to date:

  • developing the evidence base to support children to have safe, age-appropriate experiences;
  • ensuring companies are effectively deploying age assurance technology;
  • deploying effective and accessible additional protections for adult users, particularly vulnerable users (with a particular focus on “the vast amount of misinformation and disinformation that can be encountered by users online”, and an expectation that platforms should have “robust policies and tools in place” to minimise this content);
  • using risk and evidence-based approaches to work towards ensuring there are no safe havens online for illegal content and activity (more on this below).

Under the transparency and accountability priority, the SSP sets out four specifics:

  • increasing understanding of the harms occurring on platforms, why they are occurring and the best way to tackle them - this includes references to coordination with other regulators on “algorithmic transparency” and Ofcom’s use of the Advisory Committee on Disinformation and Misinformation to provide “forward-looking, impact focused advice”;
  • support for bereaved parents;
  • clear and consistent terms of service;
  • greater accountability from services to their users - with a recommendation that Ofcom “analyses user complaints to identify trends that may require further systemic-level action.”

The agile regulation section suggests that “Ofcom might find significant benefits in designing a forward-looking approach to regulation that quickly mitigates significant risks that emerge”, including “where individuals are carrying out illegal and harmful activity in new ways”. The specifics include:

  • monitoring, risk assessing and “where appropriate” mitigating changes in the use of technology that enable online harm - where Ofcom are encouraged to “utilise the full provisions of the Act” as well as being “proactive” in terms of making relevant changes to the guidance and codes;
  • mitigating threats from AI-generated content and activity - this requires Ofcom to consider, on an annual basis, whether they need to produce an updated Strategic Approach to AI and, again, suggests “working collaboratively” with other regulators to align their approaches across regulatory remits;
  • international cooperation to “enable new ideas to tackle online safety to be shared, building a global consensus on online safety”;
  • effective regulation of small but risky services - including Ofcom keeping its approach to small platforms “under continual review”.

The fourth inclusivity and resilience priority sets out the role of media literacy and how Ofcom’s powers enable them to request information from platforms on how they are “tackling misinformation and disinformation” within the scope of the Act. It also, crucially, sets an expectation that Ofcom’s guidance on protections for women and girls, due next February, will summarise “in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will ensure that it is easy for platforms to implement holistic and effective protections for women and girls across their various duties.” The underpinning priorities the Government wishes to see delivered include:

  • users who are aware of and resilient to mis- and disinformation;
  • widespread adoption of best practice principles for literacy by design;
  • parents, carers and children who understand risks and are supported to stay safe against online harm;
  • inclusion of young people in the policymaking process;
  • effective mitigation of risks to trust in online information due to AI-generated content.

The final technology and innovation priority sets out some specific requirements in relation to fostering an environment that encourages innovation of online safety technologies, including for Ofcom to build on its Online Safety Technology Lab to support innovation, driving up adoption of safety technologies and supporting the development of more effective age assurance technologies.

Our view

We welcome the fact that the Secretary of State is using this vehicle in such a clear and far-reaching way: the start of a new Government and a new Parliamentary term is the right time to establish a forward-looking relationship with Ofcom and - crucially - to give them a clear mandate to act. In that regard, it is also very welcome that these priorities pick up many of the concerns that civil society has raised about Ofcom’s approach over the past 12 months. While this comes too late for the first sets of codes, it is in time to set a framework for the next round of consultations and subsequent iterations of those codes.

In particular, the upfront emphasis given to safety by design in the first priority is welcome. We noted previously that the Secretary of State had been talking more regularly in these terms with regard to tackling online harm, and Professor Woods has provided some helpful thinking as to how a “safety by design” approach might apply to this regime. The juxtaposition of this with the reference to violence against women and girls is important too, given that the guidance that Ofcom will consult on next February needs to take a cross-cutting, holistic approach if it is to deliver Parliament’s intent that it tackle the disproportionate impact of online harm experienced by women and girls. This guidance is going to be pivotal to the success of the Government’s target to halve VAWG - and in this regard, DSIT will need to be held to account as much as Ofcom in delivering against it in this Parliament.

The reference to “no safe havens” seems to reflect concerns about how Ofcom has approached its illegal content judgements guidance - an approach that is all the more problematic given that the illegal content codes - as well as the children’s codes - give companies a “safe harbour”. (The regulatory impact of the latter cannot, however, be solved by the SSP; removing it will need a change to the Act.)

Significantly, the detail in the SSP indicates that, to tackle the problem of illegal content at scale, providers may need to go beyond the narrow interpretation that Ofcom has worked to in drawing up its guidance and the measures in the codes and:

“use risk and evidence-based approaches to ensure there is no room for illegal content and activity on their platforms. We expect providers to understand the level of risk of their service being used to facilitate illegal activity, including potentially during periods of crisis, and to embed proportionate safety by design principles to mitigate these risks.”

There is also a welcome reference to the impact of AI which, when combined with “established tactics to manipulate the information environment” raises the risk of “foreign interference seeking to undermine the UK’s core values and processes”.

The reference in the “agile regulation” priority - “ensuring the framework is robust in monitoring and tackling emerging harms - such as AI-generated content” - is also promising, as it appears to signal an expectation of more pace and less rigidity from Ofcom: they cannot wait for a perfect evidence base before recommending measures but need to be more responsive to harm as they identify it, through whatever routes are open to them.

The “inclusivity and resilience” priority is a (slightly belated) nod to the fact that DSIT itself has responsibilities here across the broader digital policy agenda, even if the machinery of government change last year, with its focus on shiny innovation and growth, distracted the department from them. That said, the SSP’s reference to work here “including disinformation” is still somewhat optimistic given that this is a large gap in the Act and one where Ofcom - except for where disinformation is illegal or harmful to children - has no duties or levers beyond its media literacy powers and the establishment of its Advisory Committee.

But that’s where this comprehensive and stretching SSP may have its most significant long-term impact: establishing how far Ofcom feels it can go within the parameters of the Act to deliver the Secretary of State’s priorities and providing DSIT with a platform, should it wish to use it, to amend the legislation accordingly.

As the SSP says:

“We will continue to monitor progress and be evidence-driven. Where it is clear some problems cannot be solved without new legislation, the government will consider this, but our goal is to be innovative and deliver maximum outcomes within the Act’s existing provisions.”