The OSRA Bill would give victims of online abuse the right to sue platforms for monetary damages, creating risks of fragmentation and increased compliance costs
The Government has proposed the bill to strengthen its existing online safety framework
On 15 October 2025, Singapore’s Government, via the Ministry of Digital Development and Information (MDDI) and Ministry of Law (MinLaw), introduced the Online Safety (Relief and Accountability) (OSRA) Bill in Parliament. The legislation follows on from MDDI’s Perceptions of Digitalisation Survey, published on 10 October 2025, which found that 84% of consumers had seen harmful content online and 33% had been targeted by harmful content in the last year. The OSRA Bill would be a significant modification to the online safety framework set out by the Online Safety Act, passed in 2022, which centres on the enforcement of existing community guidelines and transparency standards through the Infocomm Media Development Authority (IMDA). In announcing the draft legislation, the Government highlighted a number of findings linking online harms to a chilling of speech among victims and emphasised that the OSRA Bill was drafted in consultation with industry, academics, civil society and other experts.
A new regulator to adjudicate harms
If passed, the OSRA Bill would mandate the creation of a new Online Safety Commission (OSC) responsible for adjudicating claims brought by victims of targeted online harms. The OSC would be led by a commissioner, appointed by MDDI, and empowered to order a number of remedies on behalf of victims, directed either at the poster of harmful content or at the platform where the content was posted, including:
To take down the harmful content;
To suspend or disable the account of the poster;
To mandate that a platform provide space for the victim to respond to the harmful content;
To block access to uncooperative platforms; and
To remove uncooperative firms from app stores.
The Government has laid out a series of especially harmful types of content that it would instruct the OSC to prioritise enforcement against by the end of H2 2026, including: online harassment, doxxing (disclosing someone’s personal information for the purposes of harassment), stalking, intimate image abuse and child sexual abuse material (CSAM). In a secondary list of types of content for “progressive implementation”, the Government also highlights online impersonation, incitement of violence, various forms of disinformation and the abuse of “inauthentic materials” as other harms within the OSC’s proposed remit. While victims would in most cases be required to report issues directly to platforms before seeking intervention from the OSC, victims of doxxing, intimate image abuse and CSAM proliferation would all be able to make direct claims to the regulator, given the severity of the harms involved.
Establishing a private right to sue platforms and posters over abusive content
To support enforcement of the OSC’s orders and to help victims pursue other remedies, the regulator would also be able to compel platforms to assist in identifying users who post harmful content under anonymous profiles. That identification would be particularly important for the legislation’s other means of redress: the ability to take civil action against both posters and platforms. Under the OSRA Bill, victims would have standing to sue for monetary damages and injunctions over most of the severe types of harmful content the Government has outlined as priorities for the OSC, with the exception of some forms of misinformation and disclosures of private information. Few other national governments have adopted this approach of granting victims a private right of action over online abuse, though it has become particularly popular as a proposed enforcement measure in state-level child online safety legislation in the US. While private litigation provides relief directly to victims of malicious or negligent conduct online, it risks significantly complicating compliance for platforms, given the unpredictability of individual courts and the potential for fragmentation between the judicial system and relevant regulators.
