Online Safety Act Network

Responses

Response to Ofcom's Proposed Plan of Work 2024/5

Ofcom’s proposed plan of work for 2024/5 does not mention any engagement with civil society organisations in its Online Safety deliverables (indeed, there is no mention of “civil society” anywhere in the document), nor does it appear to account for the work, and the related resourcing requirements, that will arise from the consequential, iterative development of its first codes of practice in light of the first consultation on illegal harm. We set out more detail on these points in the attached PDF version of the consultation response we have submitted to Ofcom.

Response to DSIT consultation on the OSA super-complaint mechanism

The response available as a PDF below is from the Online Safety Act Network, in conjunction with the Molly Rose Foundation, and reflects inputs from other civil society partners with an interest in this area. The response sets out our concerns that the eligibility criteria will significantly limit the number of expert organisations that might otherwise use the super-complaints mechanism, and that the procedural requirements, by restricting the number of super-complaints Ofcom can consider at any one time, will further weaken the regime’s effectiveness.

Response to the Digital Regulation Cooperation Forum call for input on its 2024/5 workplan

This is the response submitted by the OSA Network to the Digital Regulation Cooperation Forum (DRCF) call for input on its 2024/5 workplan; it can also be downloaded as a PDF below. We welcome the opportunity to provide input to the DRCF’s workplan for 2024/5. The Online Safety Act Network has been set up to continue the work that Carnegie UK took forward on the online harms agenda, in particular providing policy development and advice to civil society organisations and convening discussions on priority issues arising from the Ofcom consultation programme.

Response to Ofcom call for evidence on categorisation

This is the response submitted by Carnegie UK to Ofcom’s call for evidence on categorisation. In addition to the proforma submission, which can be downloaded as a PDF below, the following additional points were made.

Additional evidence

Some smaller platforms should be seen as part of a network of harm. This works in two directions: as a seed or catalyst, generating harmful content (such as output from a nudification app service, or racist or suicide memes) that is amplified and distributed on larger platforms; or, in the opposite direction, as an extreme destination to which people are drawn from larger services.