Online Safety Act Network

Ofcom's draft guidance on protecting women and girls

The Online Safety Act, at section 54, stipulates that:

“OFCOM must produce guidance for providers of Part 3 services which focuses on content and activity —

(a) in relation to which such providers have duties set out in this Part or Part 4, and

(b) which disproportionately affects women and girls.”

Amongst other things, the guidance may include examples of best practice as well as highlight aspects of Codes of Practice which are particularly relevant.

The draft guidance – entitled “A Safer Life Online for Women and Girls” – is now out for consultation; the deadline for responses is 23 May. This blog post summarises Ofcom’s approach and the wider policy context, then provides analysis on three aspects relevant to the framing of Ofcom’s consultation:

A. Freedom of Expression and other Human Rights

B. Status of Guidance

C. Safety by Design.

More detail on Prof Woods’s analysis of Ofcom’s approach to Freedom of Expression and Other Human Rights is available in her detailed paper, attached as a PDF at the bottom of this page.

Ofcom’s approach

Ofcom frames its draft guidance as promoting “a safety-by-design approach, demonstrating how providers can embed the concerns of women and girls throughout the operation and design of their services, as well as their features and functionalities. This approach envisages tech firms taking greater responsibility at all levels for women and girls’ online safety, taking steps to prevent harm through safer design and having effective mechanisms to support women and girls and respond when harm does occur.” (Consultation document, page 4)

It groups the actions for service providers into nine areas, underneath which it lists “foundational steps” – measures already set out in its published codes on illegal harms, its draft codes on protecting children, and its transparency reporting proposals – and “good practice” steps, to demonstrate where companies “can go further if they are serious about addressing the range of harms women and girls face online”.

The nine areas for action are:

Taking responsibility

1. Ensure that governance and accountability processes address online gender-based harm, for example by consulting subject matter experts and setting policies that prohibit these harms.

2. Conduct risk assessments that focus on harms to women and girls, for example by engaging with survivors and victims and conducting user surveys.

3. Be transparent about women and girls’ online safety, for example through sharing information about the prevalence of harms on a service and the effectiveness of safety measures.

Preventing harm

4. Conduct abusability evaluations and product testing, for example by using red teaming to identify ways malicious actors may try to use service features to perpetrate harm.

5. Set safer defaults, for example by ‘bundling’ default settings together to make it easier for women experiencing pile-ons to secure their accounts.

6. Reduce the circulation of online gender-based harm, for example by using hash matching to detect and remove intimate images shared without consent.

Supporting women and girls

7. Give users better control over their experiences, for example by providing the option to block multiple accounts at once.

8. Enable users who experience online gender-based harm to make reports, for example by building reporting systems designed in a way that is supportive and accessible for those experiencing domestic abuse.

9. Take appropriate action when online gender-based harm occurs.

Ofcom’s “Guidance at a Glance” document is a useful reference tool for seeing which specific actions are recommended under each of the nine areas.

In addition, Ofcom suggests service providers focus particularly on four areas of harm:

  • Online Misogyny
  • Pile-ons and harassment
  • Online domestic abuse
  • Image-based sexual abuse

Policy context

Ofcom’s draft guidance is an important contribution to the Government’s efforts to tackle online Violence Against Women and Girls (VAWG), which came under scrutiny recently in Parliament in the Public Accounts Committee’s grilling of senior officials from the Home Office, Department for Education, Ministry of Housing, Communities and Local Government and the Department for Science, Innovation and Technology. The Government has set itself a target to reduce VAWG by half in a decade and DSIT, in its draft Statement of Strategic Priorities on Online Safety, made clear that the Online Safety Act had a crucial role to play:

“As part of the government’s specific mission to take back our streets we have committed to halve violence against women and girls over the next decade. To achieve this goal, it is vital we tackle the abuse faced by women and girls online. The Act, and the approaches in this statement, will tackle illegal and misogynistic content to ensure increased safety online for women and girls.”

Later in the draft SSP, it states: “In order to strengthen protections for women and girls, Ofcom will also produce guidance that summarises in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will ensure that it is easy for platforms to implement holistic and effective protections for women and girls across their various duties.”

More recently, DSIT published a report of work carried out by Public on “Platform Design and the Risk of Online Violence Against Women and Girls”, aimed at a global audience, which had three objectives relevant to the approach Ofcom has taken:

  1. Understand how design features of online platforms and services can enable the perpetration of online VAWG
  2. Develop an understanding of how existing safety by design approaches can protect against the risk of online VAWG
  3. Understand potential new design approaches to ensure safety for women and girls on online platforms and services

Finally, an important touchstone in our analysis of Ofcom’s approach – and one which the regulator acknowledges at many points in its consultation document – is the work we contributed to, along with other civil society organisations and experts, to produce a VAWG code of practice during the passage of the Online Safety Bill. That work ultimately led to the previous Government’s concession to include the requirement for guidance on protecting women and girls in the legislation.

Next steps

We will be publishing a full consultation response in due course, informed by the insights and views of experts in the VAWG sector. In the meantime, we hope that the commentary below provides some initial food for thought for organisations and individuals considering their own responses.

 

A: Freedom of expression

A detailed paper from Professor Lorna Woods analysing the approach Ofcom has taken with regards to freedom of expression and other human rights is available at the bottom of this page. The main points are summarised here.

Ofcom summarises the position regarding human rights (specifically freedom of expression and privacy) as follows:

“Any interference with these ECHR rights must be prescribed by law; pursue a legitimate aim and be necessary in a democratic society. The interference must be proportionate to the legitimate aim pursued and corresponding to a pressing social need.” (A1 (Legal Annex), para 2.115)

The purpose of our detailed analysis, in the full paper, is to demonstrate that, while the three-stage test Ofcom refers to is a starting point for analysis of freedom of expression and has been re-stated by the Court in many cases (eg Delfi (App no 64569/09), para 131), it is not the totality of the Strasbourg Court’s approach. The Court’s jurisprudence consistently emphasises the importance of freedom of expression in a democratic society, which is based on pluralism, tolerance and broadmindedness.

While the Legal Annex and the Impact Assessment (found at Annex A2) recognise that there needs to be a balance between conflicting rights, and that one person’s exercise of a right may lead to an interference with another person’s rights, it is not clear how Ofcom has gone about the balancing process. Nor is it clear that Ofcom has recognised that the four types of content have an impact on the fundamental human rights of women and girls. The jurisprudence of the Strasbourg Court is far more nuanced and multi-factorial than the brief summary in the Annex to the Consultation on the Guidance suggests, and probably allows for greater freedom of action than an initial reading implies.

Three aspects of the case law are elaborated in the detailed paper:

  1. is all content protected expression?;
  2. even within Article 10, is all speech equal?; and
  3. how should conflicting human rights be balanced?

Prof Woods’ paper refers to the jurisprudence of the European Court of Human Rights in Strasbourg and looks at relevant judgments in recent cases. Based on her analysis, Prof Woods concludes that, while Ofcom’s summary of the approach to fundamental rights reflects the Court’s approach, it does not go far enough into the detail of the jurisprudence. There is a considerable amount of balancing to be done and – given that much of the speech falling into the four categories of harm covered in the guidance will attract little protection (if any), and the matters that Ofcom seeks to protect go to the core of Article 8 (respect for private and family life) – a stronger statement, not just about the space available to Ofcom to work in, but also about the positive obligations, would have been desirable. While freedom of expression should never be cavalierly dismissed, it is Prof Woods’ view that there was space here for Ofcom to be more courageous in its protection of the Article 8 rights of women and girls.

B. Status of Guidance

Ofcom has noted that the Guidance is not mandatory. Does this mean that the Guidance has no impact, or that it has a lesser status than other documents Ofcom has produced, to the point where it cannot be enforced against? This latter question requires us to understand the nature of the risk assessment guidance and compare it with that of the Guidance on Women and Girls. (Ofcom’s final illegal harms risk assessment guidance is here; the draft children’s risk assessment guidance is here.)

As we have noted previously, the law accords no particular status in general to the notion of “guidance”. The impact of guidance depends on context (to the point where some statutory guidance is effectively mandatory). Clearly, that is not the case in the Online Safety Act: the Act distinguishes codes of practice, which are described as having a particular legal effect (ss 49(1) and 50 OSA), from the various forms of guidance envisaged by the Act, which do not have this effect. Nonetheless, this does not mean that guidance can simply be ignored. The usual expectation is that the person receiving the guidance has due regard to it. This means that, although the addressee need not follow the Guidance, the addressee should engage with it and have cogent reasons for not following it, and this must be more than a disagreement as a matter of principle.

2. Enforceability and the Relationship with Risk Assessment Guidance

There is no provision in the Act allowing Ofcom to enforce directly against the contents of its guidance. Rather, the enforceable obligation relates to the requirement to carry out a “suitable and sufficient” risk assessment in relation to illegal content risks and, where applicable, risks from content harmful to children. In carrying out the risk assessment(s), the provider is required to take into account relevant risk profiles produced by Ofcom under s 98 OSA following Ofcom’s own risk assessment. The risk assessment guidance (which is not mentioned in the risk assessment duties) is described as a mechanism to assist providers in carrying out this duty (s 99 OSA). Implicitly, there is a distinction between documentation about what is determined to be risky (risk profiles), to which service providers must have regard, and documents detailing how to carry out a risk assessment, which are not mandatory. In this sense, we might describe the risk assessment guidance as good practice. It would not be true, however, to suggest that the Guidance is entirely optional. The enforceable obligation imposes certain qualitative criteria on the risk assessment, and the Guidance forms part of the picture of what a suitable and sufficient risk assessment looks like. So while service providers may do something different, they should have the Guidance in mind as an informal benchmark.

Similar points can be made regarding the Guidance on Women and Girls (indeed we queried Ofcom’s reference to the Guidance as “best practice” in our response to its Plan of Work consultation; see page 23 in Ofcom’s response) – the reference to advice and best practice is not so different from the function of the risk assessment guidance. Indeed, this guidance - insofar as it discusses risks which might disproportionately impact women and girls - forms part of the picture of what a suitable and sufficient risk assessment looks like in a manner similar to the risk assessment guidance.

C. Safety by Design

A “by design” approach is increasingly common, particularly in the regulation of communication technologies. But there is no common, cross-disciplinary understanding of the term, and this is even more the case when we consider safety by design. In particular, are we expecting an end-point (“safety”), the adoption of a process, or a hybrid position? Note that the OSA refers to services being “safe by design” (s 1(3)(a) OSA), a phrase which puts greater emphasis on the end-point than the means to get there. Safety by design is in any event clearly a key element of the OSA regime, as we note above in our review of the policy context and the recent publications from DSIT.

We have discussed possible meanings of safety by design (and Ofcom acknowledges the variety of approaches (para 2.2.4)) here. There are three distinct – albeit interconnected – aspects to safety by design:

  1. scope
  2. temporal scope
  3. priorities

We expand on what these aspects require in this section. This section also places the nine actions Ofcom has proposed in its consultation within these themes, identifying where the “safety-by-design” framing could be strengthened.

1. Scope

A safety-by-design approach should apply at all stages in the communication chain provided by an online service provider. For user-to-user services, we can suggest the following stages to provide a broad framework for analysis (a broadly similar set of stages could be relevant to search):

a. account and content creation;
b. content and user discovery/navigation;
c. user response; and
d. platform response.

a) Account and Content Creation

Anonymous accounts have been much discussed, but in this space we should also consider the possibility of “disposable” accounts as well as bot networks. The degree of friction during the on-boarding process is another relevant factor, as is the information provided about risks and expected behaviours – including terms of service. There is no standard approach across user-to-user services as to whether and how various forms of online gender-based violence are defined and treated, and where these issues are addressed there are questions around the granularity of the provisions. We might also consider augmented reality features, emojis and nudges to users when composing or uploading material. Incentives for creation – whether in the form of revenue-sharing arrangements, ease of access to mass audiences for third-party advertising (eg influencers), or metrics and features such as “streaks” – could also be considered here. Finally, account settings are relevant; again, the issue of defaults is one which Ofcom raised in its nine actions.

b) Content and User Discovery

Recommender tools have rightly been discussed, both as regards concerns about filter bubbles and the propensity of “outrageous” material to be prioritised, and as regards the role of recommendation technologies in radicalisation. Other features could be considered in content discovery – eg the recommendation of people to follow, hashtags and trending topics. Features to facilitate ongoing engagement (autoplay, infinite scroll) also fall into this category. In the context of domestic violence, the recommendation of people to follow may prove problematic for those seeking to avoid an abuser.

c) User Response

This could cover user complaints, tools given to users to rate content (either their own or others’), as well as functionalities such as upvotes, likes, quote reposts and ease of forwarding or reposting more generally. Direct messaging (though envisaging a two-way process) fits here too. The lack of friction in responding to user content is a theme underpinning a number of these features and allows the creation of links between users.

d) Platform Response

This covers, in brief, the service’s moderation functionalities and the steps it takes to enforce its terms of service, as well as any appeal systems (whether against content take-down or against a decision not to take content down). This set of measures tends to operate ex post, but visible enforcement of terms of service can reinforce community standards, potentially having a role in setting expectations of standards for future communications. Many survivors of online gender-based violence – particularly those who have experienced image-based sexual abuse through the non-consensual sharing of intimate images (NCII) – have commented on how inappropriate many services’ reporting systems are, an issue Ofcom picks up in its draft Guidance. Survivors may be retraumatised by the reporting process if it is inappropriately designed.

Commentary

Safety can be considered at each of the above stages – the draft Statement of Strategic Priorities emphasised that safety by design requires looking at all areas of services and business models. In principle, Actions 1-5 and 7 in Ofcom’s Guidance could apply across the board. Defaults may be particularly important at stage (a), for example regarding privacy and security settings (eg location data, metadata on photos, 2FA as noted at para 2.63). Demonetisation – which affects the sorts of content created – is mentioned, albeit only in passing (yet it is presumably an important factor for those who provide misogyny for clicks and revenue). Action 6, looking at virality, is particularly relevant to stage (b), although some of the suggestions (eg those relating to friction around material posted) relate to (a), whilst hash matching could relate to content curation as well as content creation. The consultation did not, however, specifically consider the frictions around account creation (beyond the issue of anonymous accounts) and the impact of easy access to “burner” accounts.

As the DSIT-commissioned report (Feb 2025) notes (p 21), the language used by different services to elaborate their policies in this area is not consistent, compounding challenges in reporting and making user tools, especially those related to blocking, more important – and the lack of platform policies was one of the key issues the DSIT report on platform design identified (p 25). This, however, effectively places the burden for rule enforcement on the victim. Moreover, as the DSIT report noted, “reliance on user responsibility can further exacerbate online VAWG in specific circumstances. Stakeholders across sectors noted that some safety features may be ineffective for women in public positions who experience large volumes of online violence” (p 26). Note, however, that the discussion of down-ranking/deprioritisation seems to suggest that Ofcom envisages this happening on an item-by-item (or possibly user-by-user) basis, rather than expecting the provider to look more fundamentally at the values that their tools encapsulate. Actions 8 and 9 seem more addressed to stage (d).

Moreover, functionalities should be considered together (as Ofcom in some contexts recognises, eg para 2.65), both as regards their impact in increasing risk and their potential to reduce it – and here abusability testing (Action 4) is important. Taking them together may provide a better outcome than seeking to rely on just one stage – and a multi-layered approach may allow services to compensate for features which have beneficial aspects but may also be capable of abuse (eg anonymous accounts, screen-shotting).

We note and agree with the approach set out in the consultation that it is not an attempt to provide “instructions or directions” (para 2.21) and that it is Ofcom’s view that “[u]ltimately, it is up to service providers to determine how they can achieve the action set out”. The restatement of the providers’ responsibilities as risk creators is in keeping with a duty of care approach which seeks to ensure services are safe by design.

One of the concerns – set out in the VAWG sector response to Ofcom’s illegal harms proposals and our Network response to the children’s proposals – was that while Ofcom’s risk assessment noted functionalities across this content distribution chain, the main focus in the Codes seemed to be on content moderation (with some emphasis on content curation). Given the status of the codes in giving safe harbour to services, the Guidance here, which covers a wider range of issues, cannot change that unfortunate fact.

2. Temporal Scope

The main theme of Ofcom’s approach to safety by design is what we have termed here “temporal”, in that it looks at design considerations across the lifecycle of the product – the initial design and development of the service or feature, its deployment and operation, maintenance, and eventual decommissioning or removal. This aspect can be seen in that both testing and design are considered, as well as operations and maintenance (para 2.25). As the draft Statement of Strategic Priorities noted, service providers should consider how to make existing features safer. Testing before the service or feature is released, as well as on an ongoing basis to understand how it is used, is a key aspect of a safety-by-design approach – and one which should be thought about (at least to some degree) by all providers. This links into risk assessments, which should be gender-sensitive (Action 2) and – as Ofcom recognises – trauma-informed, and into the need to think about how features could be abused in order to mitigate that risk (even if not through a full red-teaming exercise). The consultation also notes retirement – the removal of features. On the one hand, retirement of a feature might be seen as the only solution to an intractable problem (see eg TikTok’s response to dark design concerns in relation to TikTok Lite). Conversely, the removal of a feature may add to risk factors (eg the removal of particular policies relating to impermissible speech, as shown by Meta).

3. Priorities

The two aspects discussed above relate mainly to the what and how of safety by design, but do not consider to what end. The draft Statement of Strategic Priorities describes services that are safe by design as those “where features are chosen and designed to limit the risk of harm to users”. It also notes that the responsibility for keeping themselves safe cannot fall on users, a point also made by DCMS during the policy development of the Act. So while user empowerment tools can have a useful role to play in allowing users to curate their own environment (especially important given that people’s experiences and responses differ (para 2.75)) and can be designed in, they do not constitute safety by design (they may rather be seen as a form of safety tech); choices on defaults are, however, important (Action 5). This consideration could be said to be the reverse side of the principle that service providers should take responsibility (noted above). Earlier guidance from DCMS explained its view of safety by design as follows:

“the process of designing an online platform to reduce the risk of harm to those who use it. Safety by design is preventative. It considers user safety throughout the development of a service, rather than in response to harms that have occurred”.

Drawing on this, we propose that the model found in our earlier work (as well as in other approaches, notably the UN Guiding Principles) is relevant. Prof Woods’ paper on safety by design suggested:

“[t]hree types of response to hazards and risks can be seen: hazard reduction; then hazard management; and finally remediation. Hazard reduction rather than hazard management or the provision of remediation should be prioritised, with remediation being the mechanisms of last resort.” (Prof Lorna Woods: Safety by Design; October 2024)

The aim is to “design out” the problem at source; while that may be impossible in practice, this point is a useful factor in orientating design processes. Borrowing from the language of privacy by design, this means that safety becomes an essential component of the core functionality being delivered.

This implies that product testing is an essential aspect of the design cycle and that ex post responses and safety tech are not sufficient (though they may be necessary) (Actions 7-9). So while designing in effective content moderation (including the suggestion of an oversight mechanism for decision-making in this space) is valuable, such mechanisms operate ex post (at stage (d) above) and fall into the third stage in the hierarchy of safety-by-design responses. While there will always be a need for content moderation and ex post remediation, this should not be considered the primary mechanism for dealing with harms. When considering hazards, the position of all users should be taken into account. This includes intersecting characteristics (mentioned by Ofcom) and the position of girls.

The DSIT-commissioned report notes that while girls of all ages are affected, girls of different ages are likely to have different exposure to risk, and a one-size-fits-all approach to risks for children would not be appropriate. Steps taken to minimise privilege hazard are important – and we note the suggestions relating to governance and the consultation of subject matter experts (para 2.49), though we note the limitations on that given the resources of service providers. While not a substitute for meaningful consultation, some understanding of user experience might be gained through an analysis of user and trusted flagger complaints (discussed under transparency at para 2.55, Action 8). Moreover, design choices should be made to optimise functionality and security where possible, rather than relying on the claim that there is a trade-off (or on prioritising the interests of one group of users over another).

In sum, while the framing Ofcom has adopted includes some mechanisms that would be necessary to implement this approach, Ofcom has not expressly adopted a hazard minimisation approach. The suggestion that a hazard reduction approach be prioritised should not be taken as implying that the other two aspects of safety by design are unimportant. Rather, the suggestion is that all three elements contribute to safety by design.

 

 

Download assets