Ofcom's approach to human rights in the illegal harms consultation

Issue

The Online Safety Act directs Ofcom to consider freedom of expression (Article 10 ECHR) and privacy (Article 8 ECHR), but these are not the only relevant rights – as indeed Ofcom notes.

All the rights protected by the European Convention on Human Rights should be taken into account when assessing the impact of the regime – or of its absence. So, as well as the qualified rights of freedom of expression, the right to private life and the rights noted by Ofcom – e.g. the right to association – we should consider other rights, including the unqualified rights: the right to life, freedom from torture and inhuman and degrading treatment, and the prohibition on slavery and forced labour (e.g. people trafficking). Note also that rights can entail positive obligations as well as an obligation to refrain from action; a public body can infringe human rights by failing to protect an individual’s rights as well as by interfering in them itself.

Article 14 ECHR requires that people not be discriminated against in the enjoyment of their Convention rights; all people (and not just users of a particular service) should be considered. This reflects the general principle of human rights law that all people’s rights should be treated equally – and indeed that the starting point is that no right – for example, freedom of expression – has automatic priority over another. It also means that the European Court has adopted a specific methodology for balancing rights of equal weight, 1 rather than its typical approach under which a qualified right may suffer an interference in the public interest provided that interference is limited. This difference in methodology reaffirms the importance of seeing all the rights in issue when carrying out balancing exercises. A failure by national authorities to carry out a proper balancing exercise has itself led to findings of a violation of the procedural aspects of the relevant right.

Note also that Article 17 prohibits the abuse of rights, so that “any remark directed against the Convention’s underlying values would be removed from the protection of Article 10 by Article 17”. 2 While this applies only to a narrow sub-set of speech, it is nonetheless a factor that should form part of the balancing exercise where relevant. Areas where Article 17 might be relevant include threats to the democratic order; 3 racial hatred; 4 Holocaust denial; 5 religious 6 or ethnic 7 hate; hatred based on sexual orientation; and incitement to violence and support for terrorist activity. 8

What the Act says

Section 22 sets out duties with regard to freedom of expression and privacy applying to all user-to-user services:

(2) When deciding on, and implementing, safety measures and policies, a duty to have particular regard to the importance of protecting users’ right to freedom of expression within the law.
(3) When deciding on, and implementing, safety measures and policies, a duty to have particular regard to the importance of protecting users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a user-to-user service (including, but not limited to, any such provision or rule concerning the processing of personal data).

An analogous provision (section 33) applies in relation to search services.

As a public body, Ofcom falls within section 6 of the Human Rights Act, which specifies that “[i]t is unlawful for a public authority to act in a way which is incompatible with a Convention right”.

Ofcom’s proposals

The concern here is that Ofcom’s approach, as set out in its illegal harms consultation, considers only the rights of users (as speakers) and principally focuses on their freedom of expression. In doing so, it has not really considered the nature of the speech (which the European Court of Human Rights does take into account), nor provided evidence that speech in some instances would be chilled 9 – it has instead hypothesised a theoretical concern. It has not considered the rights of other users and non-users, which require steps to be taken against rights-infringing harms – harms whose rights-infringing character has been recognised in the judgments of the European Court and in the opinions of UN Special Rapporteurs. This means that any balancing exercise is skewed towards not taking action, for fear of inconveniencing users (who may well be infringing the rights of others) and companies.

We set out a number of examples below, taken from various sections of the illegal harms consultation, to demonstrate our concern. These intersect with our concerns about the proportionality judgements that underpin the overall approach in the consultation, on which we will write further.

Ofcom on prioritising rights of users over rights of intended victims with regard to content moderation

  • “Content moderation is an area in which the steps taken by services as a consequence of the Act may have a significant impact on the rights of individuals and entities - in particular, to freedom of expression under Article 10 ECHR and to privacy under Article 8 of the European Convention on Human Rights (‘ECHR’).” (Vol 4, 12.57)

Ofcom on applying a proportionality test to including measures in the codes

  • “to include a measure in the Codes, we need to assess that the measure is proportionate (with reference to both the risk presented by a service, and its size, kind and capacity) and does not unduly interfere with users’ rights to freedom of expression and privacy.” (Vol 4, 11.22) – no mention of the rights of others, whether their freedom of expression and privacy or other aspects of Article 8, let alone Articles 2, 3 or 4.

Ofcom on recommending cumulative risk scoring systems

  • “We consider that cumulative risk scoring systems could provide various benefits for tackling illegal harms such as fraud, drugs and weapons offences, child sexual exploitation and abuse, terrorism, and unlawful immigration. We recognise however that there is significant complexity involved in these systems, and that there could be adverse impacts on user privacy or freedom of expression if the operation of the system were to result in inappropriate action being taken against content or user accounts. We have limited evidence on this at present. As a result, we are not proposing to include a recommendation that services use cumulative risk scoring systems in our Codes of Practice at this time” (Vol 4, 14.322) – the potential adverse impact on users’ freedom of expression and privacy trumps the “various benefits” for tackling illegal harms, with no consideration of the need to protect the fundamental rights of victims (a schematic sketch of such a system follows).
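For readers unfamiliar with the mechanism under discussion, the following is a minimal sketch of a cumulative risk scoring system of the kind Ofcom describes. It is purely illustrative: the signal names, weights and threshold are our assumptions, not anything proposed in the consultation.

```python
# Hypothetical cumulative risk scoring: weak signals are aggregated per
# account, and only the combined score triggers escalation. All names,
# weights and thresholds below are invented for illustration.
from dataclasses import dataclass, field

SIGNAL_WEIGHTS = {               # assumed weights per risk signal
    "csam_hash_match": 10.0,
    "fraud_keyword_hit": 3.0,
    "user_report": 1.5,
}
REVIEW_THRESHOLD = 12.0          # assumed cut-off for human review

@dataclass
class AccountRisk:
    account_id: str
    signals: list[str] = field(default_factory=list)

    def score(self) -> float:
        # Cumulative scoring catches patterns that no single signal
        # would reveal - the source of the "various benefits" above.
        return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in self.signals)

def needs_review(account: AccountRisk) -> bool:
    # Escalation to human review rather than automatic sanction; a
    # false positive here is where the rights concerns in the quoted
    # passage arise.
    return account.score() >= REVIEW_THRESHOLD
```

The point of the sketch is that such a system flags accounts rather than acting on any single piece of content, which is why both the benefits (detecting patterns of offending) and the risks (inappropriate action against accounts) are cumulative in nature.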

Ofcom’s interpretation of the “chilling effect”

  • “In addition, there could be a risk of a more general ‘chilling effect’ if users were to avoid use of services which have implemented a more effective content moderation process as a result of this option.” (Vol 4, 13.52)
  • “Potential interference with users’ freedom of expression arises insofar as content is taken down on the basis of a false positive match for CSAM or of a match for content that is not CSAM and has been wrongly included in the hash database. In addition, there could be a risk of a more general ‘chilling effect’ if users were to avoid use of services which have implemented hash matching in accordance with our option.” (Vol 4, 14.87)
  • “Potential interference with users’ freedom of expression arises insofar as content detected by services deploying keyword detection technology in accordance with this option does not amount to a priority offence regarding articles for use in frauds, but is wrongly taken down on the basis that it does. There could also be a risk of a more general ‘chilling effect’ if users were to avoid use of services which have implemented keyword detection technology in accordance with this option.” (Vol 4, 14.281)
  • “We recognise that these user support measures may have a limited chilling effect on the rights to freedom of expression and freedom of association in that they would briefly delay children from disabling defaults and may result in children being less likely to do so (preserving the existing restrictions on their rights outlined in paragraph 18.65 above). The measures may also result in children being less likely to establish new connections or communicate with new users online.” (Vol 4, 18.135)
  • The “chilling effect” contemplated here is that users might be dissuaded from, or inconvenienced in, using services that act robustly on illegal harms, including CSAM and fraud. There is no mention of a “chilling effect” arising from the impact of individual users on others (a minimal sketch of hash matching, showing where the false positives mentioned above arise, follows this list).
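To make the false-positive discussion concrete, here is a minimal sketch of hash matching against a database of known material. It is a simplification under stated assumptions: real deployments use perceptual hashing (so that altered copies still match), and the function and variable names are ours, not Ofcom’s or any vendor’s.

```python
# Minimal sketch of hash matching. Exact cryptographic hashing is used
# here for simplicity; production systems use perceptual hashes, which
# match near-duplicates and are therefore a source of false positive
# matches. All names are illustrative.
import hashlib

# Assumed database of hex digests of known CSAM. If lawful content is
# wrongly added here, every copy of it will be flagged - the second
# error source the consultation mentions ("wrongly included in the
# hash database").
KNOWN_HASHES: set[str] = set()

def content_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    # With exact hashing a false positive is vanishingly unlikely, but
    # database pollution still produces systematic wrongful takedowns.
    return content_hash(data) in KNOWN_HASHES
```

The two error channels – inexact (perceptual) matching and wrongly included database entries – affect different rights-holders, which is why a balancing exercise needs to look beyond the speaker whose content is removed.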

Ofcom on balancing freedom of expression rights with recommending measures for strikes or blocking of accounts

  • “Although blocking and strikes may be a way of tackling illegal content, there are also concerns about the use of these systems on lawful speech. Preventing a user from accessing a service means removing their ability to impart and receive information and to associate with others on that service. It therefore represents, for the duration of the block and in respect of that service, a significant interference with that user’s freedom of expression and association. The impact also extends to other users, who will be unable to receive information shared by the blocked user on the service in question. Restricting access to certain functionalities as part of a strikes system may also interfere with user rights, for example if the user is prevented from posting content on the service.” (Vol 4, 21.39) – no consideration of the rights protected through such blocking, or of the value ascribed to the speech in the blocked account as regards both the speaker’s rights and those of users receiving the information.
  • “Our proposed recommendation around strikes and blocking in this consultation relates to proscribed groups. We are inviting further evidence from stakeholders to be able to explore broadening this in future work; in particular, we are aiming to explore a recommendation around user blocking relating to CSAM early next year. We are particularly interested in the human rights implications, how services manage the risk of false positives and evidence as to the effectiveness of such measures.” (Vol 4, 11.15) – as a result, measures for CSAM-related blocking are not recommended in the first codes, despite the impact on children and the likely interference with children’s Article 8 and Article 3 rights, and possibly also Articles 2 and 4 (a schematic sketch of a strikes system follows this list).
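As background to the interference analysis above, the following is a schematic sketch of a strikes-and-blocking system. The strike limit and the ladder of restrictions are assumptions made for illustration; Ofcom’s consultation does not specify such parameters.

```python
# Illustrative strikes-and-blocking ladder: each confirmed violation
# escalates the restriction on the account. The limit and the ladder
# steps are assumptions, not Ofcom's recommendation.
from collections import Counter

STRIKE_LIMIT = 3                  # assumed strikes before a full block

strikes: Counter[str] = Counter()

def record_strike(account_id: str) -> str:
    strikes[account_id] += 1
    count = strikes[account_id]
    if count >= STRIKE_LIMIT:
        # Full block: the "significant interference" with Article 10
        # and 11 rights quoted above, for the duration of the block.
        return "block_account"
    if count == STRIKE_LIMIT - 1:
        # Partial restriction, e.g. no posting: a lesser but still
        # real interference if the strike was a false positive.
        return "restrict_posting"
    return "warn"
```

Seen this way, each rung of the ladder interferes with the rights of the sanctioned user and their audience, but each rung also protects the rights of intended victims; our point is that only the first half of that ledger appears in the consultation’s analysis.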

Evidence

1. The Silencing Effect of Abuse – Article 10 ECHR

As Ofcom has recognised, women and other minoritised groups receive a disproportionate amount of abuse 10 – abuse which can take various forms, from direct threats to gendered or racist misinformation and the use of deepfakes to undermine and harass, to name but a few. Yet it has been established for at least five years that “online gender-based abuse and violence assaults basic principles of equality under international law and freedom of expression”. 11 Dubravka Šimonović, the then UN Special Rapporteur on violence against women, highlighted in 2018 the importance of applying a human rights-based approach to online violence against women, 12 and it has been recognised that women in particular are being targeted, especially those in public life. 13 As the UN Special Rapporteur on freedom of expression emphasises, there should be no trade-off between the right to be safe and the right to speak. 14

This point can be made in relation to other minoritised groups – and those with intersectional identities suffer particularly. In short, the failure to provide a safe environment in which to express themselves – which the European Court of Human Rights recognises is part of the positive obligations under Article 10 15 – constitutes an infringement of the victims’ expression rights, as well as those of others who share relevant characteristics with them. The point about freedom of expression is particularly important for those in public life, but the underlying facts in any given case may also implicate Article 8 and have an even wider impact.

This point about shared characteristics is also important. Of course, men receive abuse online too, but that abuse seems more often addressed expressly to ideas (and thus could be categorised as an extreme form of debate), whereas women seem to be targeted for their characteristics, 16 which is pure abuse and receives low protection under the Convention, if it receives any at all. 17 By contrast, as discussed below, negative stereotyping of a group, when it reaches a certain level, is capable of impacting on the group’s sense of identity and on the feelings of self-worth and self-confidence of its members. It is in this sense that it can be seen as affecting their “private life” within the meaning of Article 8(1). 18

Moreover, the prevailing approach to dealing with misogynistic trolling – in particular, imposing obligations on the victim – itself contributes to an environment in which victims are not taken seriously 19 and in which rape culture (through symbolic violence) continues. The impacts, while clearly affecting speech, are deep, wide-ranging and often misunderstood and undervalued 20 – especially when the violation of rights is not recognised and what is going on is characterised vaguely as harmful.

2. CSAEM/Grooming of Children – Articles 8 and 3

Article 8 is not just about the confidentiality of communications. The text of Article 8 covers four groups of interests: private life, family life, home and correspondence, each of which has been interpreted broadly. As with Article 10, in addition to protection against interference with these rights by public authorities, there are positive obligations to ensure that Article 8 rights are respected even as between private parties. Positive obligations are particularly significant when the interests at stake involve “fundamental values” or “essential aspects” of private life, and the Court looks for deterrence and effective protection. So, granting an amnesty to the perpetrator of a sexual assault constituted a breach of Article 8 (and also of Article 3) – a point that should be borne in mind when assessing severity of harm and the appropriate balance with the impact on service providers or on the user speaking. 21

As regards the relevance of Article 8 to CSAEM offences, KU v Finland specified that an advert of a sexual nature placed in the name of a 12-year-old boy on an Internet dating site, leaving him open to approach by paedophiles, fell indisputably within Article 8, which covers the physical and moral integrity of a person. Here the Court emphasised the potential threat to the boy’s physical, mental and moral welfare, as well as his vulnerability because of his age. This, then, was a grave threat: “sexual abuse is unquestionably an abhorrent type of wrongdoing, with debilitating effects on its victims. Children and other vulnerable individuals are entitled to State protection, in the form of effective deterrence, from such grave types of interference with essential aspects of their private lives”. 22 In this instance, although there were laws in place, they were ineffective, and the Finnish Government had failed to “put in place a system to protect child victims from being exposed as targets for paedophiliac approaches via the Internet”. 23

In Söderman, 30 the step-father of a 14-year-old covertly videoed her undressing before showering; the film was subsequently destroyed without anyone seeing it. He was ultimately acquitted because the act of filming was not in itself illegal. Again, this fell within Article 8 and the State’s obligation to protect the physical and psychological integrity of an individual from other persons – especially where that person is a child. Rape and sexual abuse of a child implicate fundamental values and essential aspects of private life, and aggravating factors include the offence taking place in the child’s home, where the child should feel safe. So, even “in respect of less serious acts between individuals, which may violate psychological integrity, the obligation of the State under Article 8 to maintain and apply in practice an adequate legal framework affording protection …” requires that the framework be “sufficient”. 24 It is thus not just the legislative framework that is considered, but also the implementation of that framework in practice.

KU was dealt with under Article 8; it did not involve physical assault of the child. There is some suggestion that verbal abuse without physical violence would fall within Article 8 rather than Article 3. 25 More serious assaults may trigger Article 3 or even Article 2 (both of which are unqualified rights). Here too there are positive obligations on the State to protect the personal integrity of a child, with breaches of the right being found where State procedures are ineffective. 26

3. VAWG-related offences – Articles 8 and 3

A similar analysis can be made in relation to a whole range of offences related to violence against women. Coercive control, domestic violence and similar behaviours likewise infringe the survivors’ Article 8 and, in some instances, Article 3 rights. 27 While many of the cases involve a State’s failure to protect the victim of domestic violence against physical injuries and psychological damage, there are cases involving digital tools. In Volodina v Russia (No 2), the applicant alleged breaches of her Convention rights arising from the State’s failure to take action in respect of cyber-harassment: her former partner had used her name, personal details and intimate photographs to create fake social media profiles, had planted a GPS tracker in her handbag, and had sent her death threats via social media. In this case the Court found a violation of Article 8. Image-based sexual abuse likewise engages Article 8: Ismayilova. 28 More generally, the Court has recognised ‘cyber-bullying’ as an aspect of violence against women and girls, and that it can take a variety of forms, including cyber breaches of privacy, intrusion into the victim’s computer and the capture, sharing and manipulation of data and images, including private data. In that case, Buturugă v Romania, the Court found a violation of Articles 3 and 8.

These offences also contravene the Istanbul Convention 29 and the UN Convention on the Elimination of All Forms of Discrimination against Women. 30 They should be seen as infringing fundamental rights, infringements in respect of which the State has positive obligations. 31 Ofcom’s balancing and proportionality assessments should take this into account; so far the analysis has not recognised this.

The President of the European Court of Human Rights recently noted that “the victims of domestic and gender-based violence are not born vulnerable. They are rendered vulnerable, on their journey from girl to womanhood, by the imbalanced social structures into which they are born, by the law and by law-makers, and by attitudes and patterns of behaviour in their regard which are ignored, permitted or endorsed by society, including the State.” She suggests that the focus of the Court “must remain the actions and omissions of State authorities” and that the key question is: “were the applicants accorded equal and sufficient protection before the law?” 30

4. Trafficking – Article 4

Article 4(1) prohibits “slavery or servitude”, while Article 4(2) prohibits forced or compulsory labour; the Court has distinguished between these terms. 31 In S.M. v Croatia, the Court clarified that “forced or compulsory labour” covers serious exploitation – for example, forced prostitution – irrespective of whether it is linked to trafficking. Trafficking in human beings, by its very nature, is based on the exercise of powers attaching to the right of ownership; it threatens the human dignity and fundamental freedoms of its victims. While the prohibitions in Article 4(1) arguably relate to more serious infractions of fundamental rights, both Article 4(1) and Article 4(2) constitute rights that should be weighed as such in any balancing of interests.

Recommendation

Ofcom should review its recommendations in the light of its obligations, weighing in particular the gravity of the rights violations at stake against companies’ costs.