Online Safety Act Network

Response to Women and Equalities Committee inquiry into non-consensual intimate image abuse

Executive Summary

1. Ofcom’s powers were not designed to provide individuals with redress. They are inadequate to respond to the need for thousands of images, across many websites, to be removed. Any orders made would relate to the service overall, not individual items of content; they would come at the end of a lengthy enforcement process and are designed to be exceptional.

2. Adding intimate image abuse offences to the priority offences in the Online Safety Act will therefore not solve the problem of failing to get specific material removed from the internet.

3. An amendment could be made to the Online Safety Act to ensure that once a non-consensual intimate image has been removed, all further posts of that imagery are similarly removed without the need for a further moderation process: a ‘stay-down’ provision.

4. A comprehensive approach is required which provides victims with civil rights of action and the possibility of seeking civil orders to have material deleted and removed. This could also be enforced by a regulator with similar powers, together with a straightforward, accessible enforcement process.

5. Such an approach would follow best practice in other jurisdictions which have introduced (a) online and straightforward court processes for victims to get civil orders for image removal and (b) a regulator with powers to order platforms, individuals and ISPs to remove or block material.

6. Further reforms to the criminal law are still required, most particularly criminalising the creation of sexually explicit deepfakes. The current Government proposal only covers some forms, requiring proof of specific motives. It was precisely these motive requirements that were removed in the Online Safety Act because they restricted victims’ access to justice and hindered police investigations.

Focus of legislative action has been on reforming the criminal law

7. In seeking to tackle intimate image abuse, the legislative focus has been on reforming the criminal law. This has led to significant improvements in the law, particularly recognising the wide-ranging nature of this abuse, perpetrated in a variety of ways and for many different, overlapping purposes. The introduction, therefore, of a consent-based model for the criminal law, without having to prove specific motives, has been a significant step forward in providing redress options to victims.

Recommend comprehensive civil law regime to complement criminal law

8. However, this focus on the criminal law does not provide victims with rights to civil redress or to civil orders to get material removed or deleted. The Online Safety Act (OSA) does not fill this gap. What is required is a comprehensive legislative response which includes criminal law reforms, as well as a statutory civil right of action for intimate image abuse, together with civil orders.1 See further below for more details.

Why Ofcom’s powers are insufficient to tackle the removal of non-consensual intimate imagery

9. Following the introduction of the Online Safety Act 2023, there have been calls for Ofcom, as the regulator under the Act, to use its powers to enable non-consensual intimate imagery to be removed from the internet. This follows the evidence of survivors and the Revenge Porn Helpline that there are websites which will not remove material, even after there has been a criminal conviction for non-consensual intimate image abuse.2

10. In response, a Government statement to the Daily Express said that when the full measures of the Online Safety Act come into force, this will ‘require sites to block access to websites hosting illegal non-consensual intimate images if ordered to by a court via Ofcom’s powers’.

11. However, Ofcom’s powers were not designed to provide remedies to individuals.3 The OSA was designed to incentivise service providers into designing and running their services better, including by providing better complaints mechanisms. Indirectly, therefore, the OSA may have some impact on the availability of non-consensual intimate imagery. Significantly, an individual cannot complain to Ofcom about a specific item or series of items of content.

12. Instead, the remedies that are available to Ofcom envisage a gradual increase in pressure on providers. The starting point is an enforcement notice, according to which a provider can be directed to remedy a defect in its systems. Although Ofcom has no powers to make determinations in relation to specific items of content, Ofcom might identify that a service provider dealt with a category of content, such as non-consensual intimate imagery, in an ineffective way.

13. Again, the enforcement here would be in relation to categories of content, not specific items of content. The service provider may comply with Ofcom’s requirements. But this would be in relation to their systems and processes for dealing with categories of content, not specific items of content (albeit that it might have an indirect effect in the service provider removing the items of concern).

14. Failure to comply leads to fines and ultimately business disruption measures, which include ‘access restriction orders’. Access restriction orders can be made against an ISP requiring it to block access to certain sites (section 146). However, it should be noted that the process of obtaining business disruption orders is complicated – a court order is required and specified grounds must be met. They are envisaged as applying at the end of a long enforcement process once other mechanisms have been tried.

15. Further, they are only designed to be used in exceptional circumstances, evidenced by the fact that if a business disruption order is granted, the Secretary of State has to be informed.

16. Such powers are therefore wholly inadequate to deal with concerns about specific images, potentially many thousands of them across many websites, where urgency is key to ensuring images are removed and victims are provided with redress and support.

17. It should also be noted that while the Government statement to the Daily Express refers to ‘illegal images’, the whole problem is that the specific material, the content, is not illegal per se. It is the activity, the non-consensual sharing, that is unlawful. As such, the material itself is not ‘illegal’. This is why it is difficult to get it removed.

Proposals to add intimate image abuse offences to the Online Safety Act will not solve the problem of failing to get images removed

18. Accordingly, the proposal to add more of the intimate image abuse offences to the Online Safety Act, as priority offences, will not achieve the aim of enabling removal of non-consensual intimate imagery.

19. The Online Safety Act currently includes the offence of non-consensual sharing of intimate images as a priority offence. This means that services to which the Act applies are obliged to have systems in place to reduce the prevalence of such material on their sites and to operate a system such that they can remove such content swiftly when notified. The duties are specified to require proportionate steps; this suggests that the system need not be effective in every single case to satisfy the duty.4

20. It should be noted, however, that Ofcom’s draft guidance on illegal harms states that service providers need to consider each item or post specifically and determine whether there are reasonable grounds to infer, in relation to each post, that a criminal offence has been committed – which will include consideration of the mental elements and the availability of defences. Only then is the content of a sort that triggers the illegal content duty.

21. In relation to non-consensual intimate imagery, this means that the obligation relates to every specific post, as discussed further below, rather than to the image itself. Therefore, there is no obligation, according to Ofcom’s draft guidance, for a service provider to remove all copies of an intimate image, even after the first posting has been found to constitute a non-consensual sharing and therefore a breach of the criminal law. It might choose to do so, but there is no obligation to do so. To reiterate, the obligation is to consider each individual post, when notified. It is likely that any criminal offences added to the list of priority offences will be treated in the same way.

22. In a case where there are thousands of images in circulation, following a criminal conviction, the images shared by the individual convicted will clearly be images to be removed (as clearly there was the commission of a criminal offence, satisfying the service provider’s obligation to consider whether a criminal offence has been committed).

23. However, in relation to every subsequent posting, the obligation is to consider afresh whether there is evidence of a criminal offence likely having been committed. It will not always be obvious that an offence has been committed, particularly where an image has been repeatedly shared for years. Therefore, there is no automatic obligation on a service provider to remove it.

24. Adding more intimate image offences to the list of priority offences does not change this approach by Ofcom and the Online Safety Act. Listing an offence as a ‘priority offence’ within the Act does not in and of itself make content illegal (and therefore subject to removal). It simply determines the nature of the specific obligations under the Act, with stricter requirements applying where the conduct or content relates to a priority offence.

25. The powers of Ofcom remain limited (and not designed for this purpose). Where there is intimate image abuse content online that has not been removed, the enforcement options in the Act only relate to the service provider (as noted above) in relation to general categories of content and are, in any event, lengthy and exceptional.

26. Even if a provider removes the items of concern, the next week they may have been uploaded again or elsewhere, and the whole – lengthy – process may need to be started again.

27. What is required is a specific statutory regime that will enable individuals to bring actions against individuals, websites and possibly ISPs swiftly. A regulatory regime is also required that can take action on behalf of victims.

Online Safety Act amendment for material to ‘stay-down’

28. Non-consensual distribution of intimate images is listed as a priority offence in the Online Safety Act, meaning platforms have obligations to reduce the presence of this material online and remove it when possible. However, the current draft guidance from Ofcom states that each time the same material is posted, the issue of whether the posting constitutes a criminal offence should be examined afresh.5 This means that even if a platform determines that an image has been non-consensually shared, it is not under an obligation to remove all copies of that material. It can determine, in each individual instance, whether it has an obligation to remove the material. While some platforms will remove all material, this is not an obligation flowing from the Ofcom guidance. This places a significant burden on those whose images have been posted, who must monitor sites for their reappearance and then complain (again).

29. Therefore, an amendment could be made to ensure that once a platform has determined that material is non-consensual intimate imagery, it must remove all copies of that material and prevent further uploads. Such a ‘stay-down’ amendment would go some way to fulfilling the aims of the legislation and reducing the spread of intimate image abuse material.

Introduce statutory civil right of action

30. While it is possible to bring a civil case against a perpetrator for some forms of intimate image abuse, the law is difficult to access. Detailed knowledge of civil law is required, together with access to funds to pay for the claims, and the civil courts may not move particularly swiftly.

31. A more straightforward, and therefore easier to access, option is a statutory civil claim. This would set out clearly the grounds on which a claim can be brought, enabling swifter and more cost-effective claims. This follows the approach of the Protection from Harassment Act 1997, which includes both civil and criminal remedies.

32. There are also many examples of states across the US and Canada that have introduced civil rights of action alongside criminal laws.

Civil orders to take down and delete material against perpetrators and platforms

33. While courts do have powers to make some orders to remove material, these powers are neither comprehensive nor well known. They are also difficult to access, as specific legal knowledge and advice is required, which can be costly.

34. Instead, a statutory regime which sets out the orders that can be granted should be introduced.6 This follows best practice in many north American states. The most recent and comprehensive example is British Columbia which also introduced a straightforward, online court regime to process claims and provide the orders.

35. These civil orders would include:

a. prohibiting the offender from distributing the intimate image;

b. requiring the offender to delete any images;

c. requiring the offender to take down or disable access to an intimate image;

d. requiring the provider and/or end user of a social media service, relevant electronic service or designated internet service to remove an intimate image from the service;

e. requiring a hosting service provider who hosts an intimate image to cease hosting the image.

Why civil remedies are important

36. They recognise victim-survivors’ desire for avenues of support and redress beyond the criminal law.

37. They enable fast, effective and at times pre-emptive action to have images removed and to limit further distribution, with minimal additional stress to victims.

38. They have the potential to reduce the burden on the criminal justice system by providing a complementary avenue for victim-survivors to pursue.

39. They provide a comprehensive response to the problem of intimate image abuse.

40. They address the borderless nature of online distribution channels by targeting both content hosts and individuals who share images without consent.

Regulator with powers to support individuals is required

41. In addition to powers for individuals to seek civil orders against perpetrators and platforms, a regulator able to take action on behalf of victims would provide further options. Such a regulator would be a trusted flagger and could take civil actions (as above), as well as additional measures against internet service providers and others who may be able to reduce the spread of non-consensual intimate imagery. There are models of such regulators elsewhere, such as Australia’s eSafety Commissioner.7

Criminalise all forms of creation of sexually explicit deepfakes, regardless of motives

42. Further reforms to the criminal law are required, including criminalising the non-consensual creation of sexually explicit deepfakes. However, the Government proposal is limited and only covers cases where it can be proven that the perpetrator acted with specific motives.

43. This limits the scope of the law and will make prosecutions difficult. The law should be consent-based, as with the other intimate image abuse offences.

44. For more detail, see:

· Clare McGlynn, Policy Briefing on Creating Sexually Explicit Deepfakes Without Consent: Options for Law Reform, 30 April 2024.

· Clare McGlynn, ‘The new deepfake laws are already making the internet safer for women, but there’s still more to do’, Glamour, 24 April 2024: https://www.glamourmagazine.co.uk/article/new-deepfake-laws-whats-next-opinion

· Clare McGlynn, ‘Deepfake porn: why we need to make it an offence to create it, not just share it’, The Conversation, 9 April 2024: https://theconversation.com/deepfake-porn-why-we-need-to-make-it-a-crime-to-create-it-not-just-share-it-227177

 




1. This is not a new suggestion; it is just that the legislative focus has been on criminal sanctions. For a detailed justification of the need for both a criminal and civil law regime to tackle image-based sexual abuse, see Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37(3) Oxford Journal of Legal Studies 534.

2. See the campaign by the Revenge Porn Helpline: https://revengepornhelpline.org.uk/resources/not-yours-to-view/

3. For an analysis of the OSA illegal content duties and Ofcom’s draft guidance, see Lorna Woods, Analysis: Ofcom’s illegal judgments guidance, February 2024.

4. For more detail on the illegal content duties for “user-to-user” services, see Lorna Woods and Alexandros Antoniou, User-to-user Illegal Content Duties, 7 November 2023: https://www.onlinesafetyact.net/analysis/user-to-user-illegal-content-duties/

5. For more detail, see Lorna Woods (above) and Clare McGlynn, Ofcom Consultation: Protecting People from Illegal Harms Evidence Submission, February 2024.

6. For more information on these recommendations, see Clare McGlynn and Erika Rackley, ‘Policy Briefing on Law Commission Consultation on Intimate Image Abuse’, 5 May 2021. For a discussion of these and related recommendations as part of the legislative debates on the Online Safety Bill, see the evidence of Clare McGlynn to the Joint Committee on the Online Safety Bill, September 2021.

7. For a discussion of why such a regulator is required, see McGlynn et al, Shattering Lives and Myths: A Report on Image-Based Sexual Abuse (2019).