Debrief: Ofcom’s Online Safety series

As Ofcom in the UK becomes the latest regulator tasked with enforcing online safety laws, we joined its workshop series on implementing the Online Safety Act.

Through a series of workshops, Ofcom prepares industry for implementation

In December 2023 and January 2024, Ofcom hosted a workshop series to prepare stakeholders for the upcoming implementation of the UK’s Online Safety Act. Given the broad expansion of Ofcom’s duties under the Online Safety Act, the series began by introducing a four-part framework for implementation, aimed at stakeholders, consumers and the regulator:

  1. Governance: decision-making and accountability structures within firms

  2. Design and operations: technical controls including trust and safety mechanisms

  3. Choice: transparency and control for users alongside robust reporting channels 

  4. Trust: improved credibility for Ofcom in its new remit and for platforms under the new regulation 

Overall, Ofcom emphasised its interest in working quickly to begin implementing the law and noted the ongoing opportunities it will have to update or adjust its enforcement work through future public consultations. The regulator also shared that it expects its work to be defined by proportionality in enforcement and a long-term vision of the desired impacts of the Online Safety Act. 

Illegal harms will be the first portion of the law implemented

As part of its planning, Ofcom laid out a three-part timeline for rolling out its new remit, beginning with its ongoing consultation on the illegal harms portion of the law. The workshop series focused on the proposed codes of practice on illegal harms, dividing recommended measures between “harms-agnostic” safety tools and “harms-specific” tools, depending on the functionality of the service adopting them. Recommendations to platforms include creating a suite of default settings to prevent the grooming of young users, implementing a dedicated reporting channel for trusted flaggers of illegal content and devising a prioritisation system for reviewing reported content. These recommendations also depend, in part, on the size of the service applying them, with larger platforms (those with more than 7m UK users) facing more stringent obligations, similar to the tiered structure of the EU’s Digital Services Act (DSA). Interested stakeholders will also be able to comment on the forthcoming codes of practice on child safety and the additional responsibilities of categorised services.
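Ofcom’s draft codes recommend a prioritisation system for reviewing reported content but do not mandate a particular design. As a purely illustrative sketch, not anything prescribed by the codes, a platform might implement triage as a priority queue in which reports from trusted flaggers are bumped up the review order. The severity categories, weights and trusted-flagger boost below are all hypothetical assumptions:

```python
import heapq
import itertools
from dataclasses import dataclass, field

# Hypothetical severity tiers (lower = more urgent); Ofcom's codes do not
# prescribe specific categories or weights.
SEVERITY = {"csea": 0, "terrorism": 0, "fraud": 1, "harassment": 2, "other": 3}
TRUSTED_FLAGGER_BOOST = 1  # assumed: bump trusted-flagger reports up one tier

@dataclass(order=True)
class Report:
    priority: int                      # compared first
    seq: int                           # arrival order breaks ties
    content_id: str = field(compare=False)
    category: str = field(compare=False)
    trusted: bool = field(compare=False)

class ReviewQueue:
    """Min-heap triage queue: the lowest priority value is reviewed first."""

    def __init__(self) -> None:
        self._heap: list[Report] = []
        self._seq = itertools.count()

    def submit(self, content_id: str, category: str, trusted: bool = False) -> None:
        priority = SEVERITY.get(category, SEVERITY["other"])
        if trusted:
            priority = max(0, priority - TRUSTED_FLAGGER_BOOST)
        heapq.heappush(
            self._heap, Report(priority, next(self._seq), content_id, category, trusted)
        )

    def next_for_review(self) -> Report | None:
        return heapq.heappop(self._heap) if self._heap else None

if __name__ == "__main__":
    q = ReviewQueue()
    q.submit("post_123", "harassment")
    q.submit("post_456", "fraud", trusted=True)  # trusted flagger jumps the queue
    print(q.next_for_review().content_id)        # -> post_456
```

The design choice here is simply that trusted-flagger status adjusts ranking rather than creating a separate queue; a real system would also need deadlines, appeals and audit logging, none of which this sketch attempts.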

Online safety laws are moving toward full implementation around the world

Outside of the UK, several global peers have moved more quickly to implement online safety laws. In Australia, the eSafety Commissioner and the Minister for Communications recently launched consultations on industry codes of practice for internet and electronic services as well as updates to the Basic Online Safety Expectations. Both consultations include updates to account for the development of generative artificial intelligence (AI) and the unique safety risks posed by the emerging technology. As the EU prepares for the full implementation of the DSA for all regulated services on 17 February 2024, the European Commission (EC) has already sent requests for information to a number of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) regarding data access, mitigations for disinformation and child safety. The EC also opened its first investigation under the DSA regime against X, formerly Twitter, on 18 December 2023, citing concerns over its content moderation and transparency practices among other potential violations of the law. While Ofcom has been explicit in its decision to prioritise speed in implementing the Online Safety Act, these other regulators could prove useful benchmarks for applying existing online safety frameworks to newer technological developments such as generative AI.