
Policing our Platforms: Who Bears Responsibility for Online Harm?

In this Explainer, find out…

  1. Why was the Online Safety (Relief and Accountability) Bill introduced, and what does it entail?

  2. Who does the Bill apply to, and how will it work in practice?

  3. What challenges or unintended effects could arise, and how might they be addressed?


Introduction


The Singapore Parliament passed the Online Safety (Relief and Accountability) Bill on 5 November 2025, a major update to Singapore’s online safety framework. Introduced by the Ministry of Law (MinLaw) and the Ministry of Digital Development and Information (MDDI), the Bill reflects the Government’s commitment to strengthening online safety by empowering victims of online harm to seek timely relief and obtain redress. The core purpose of the Bill is to protect victims of online harm, moving beyond content regulation alone to provide enforceable remedies. It does so by combining regulatory enforcement, civil remedies for victims, and legal responsibilities for online platforms.



Rationale for the Online Safety (Relief and Accountability) Bill


Growth and Complexity of Online Harms


Online harms such as harassment, doxxing, impersonation, and image-based abuse have become more prevalent on online platforms and are increasingly cross-border in nature. Perpetrators often exploit anonymous identities on platforms and jurisdictional gaps to avoid accountability, making it difficult for victims to obtain timely and effective relief. 


Limits of Existing Legal Frameworks


Before the Bill, victims relied on fragmented legal remedies such as defamation law, the Personal Data Protection Act (PDPA), and criminal offences under various statutes. These regimes were often reactive rather than preventive. The processes were slow and costly for victims, which deterred them from seeking redress, and they were ill-suited to addressing platform-enabled harms and the rapid dissemination of harmful content online.


Shift in the Regulatory Framework 


The Bill reflects Singapore’s shift towards a hybrid regulatory model that does not require platforms to proactively monitor all content, but instead focuses on how platforms and administrators must respond once harm is reported.  



Features of the Online Safety (Relief and Accountability) Bill


The Online Safety (Relief and Accountability) Bill broadly encompasses three key tenets:

  1. Establishment of a New Online Safety Commission;

  2. New Statutory Torts in Practice; and

  3. Improved User Identity Information Disclosure.


Together, these three tenets seek to complement existing laws on online harm.


Who Does the Bill Apply to?


The Bill applies across the online ecosystem, regulating communicators (individuals responsible for posting harmful content); administrators (entities managing or exercising editorial control over online locations, e.g., moderators of an online forum or a company that controls a community page); and online service providers (OSPs), including social media platforms.


Its reach extends to online content that can be accessed by users in Singapore, even if platforms are based overseas.


New Online Safety Commission (OSC)


Overview


The Bill establishes the Online Safety Commission, to be set up in the first half of 2026 and headed by the Commissioner of Online Safety. The OSC is the dedicated regulator for online harms. It will be empowered to issue directions addressing various categories of online harm, beginning with five of the most severe and prevalent: online harassment (including sexual harassment), intimate image abuse, image-based child abuse, doxxing and online stalking.


The Bill aims to provide a fair and cost-efficient oversight process that can act swiftly and offer multiple channels of recourse.


How do Victims Report Harmful Content?


In most cases, victims are required to first report the harmful content to the online service provider (e.g., the social media platform). If the platform does not act within a set timeframe, the victim can approach the OSC for help. For more severe harms (e.g., intimate image abuse), however, victims may bypass the platform and report directly to the OSC.


Should either the victim or perpetrator be unsatisfied with the outcome, they may submit an appeal to the OSC, which may issue a new decision.


Powers of the OSC


After assessing the case, the OSC can issue directions to online service providers, administrators of online pages or communicators of online harm (individuals who perpetrate the harm) to take down the harmful content, to restrict the perpetrator’s online account, or to allow the victim to post a response.


The Commissioner of Online Safety is granted the power to issue binding directions to stop, restrict, label or reduce the reach of harmful online content. 


Online service providers that fail to comply may face access blocking or app removal orders from the OSC. Perpetrators of online harm may also be prosecuted for the offence of non-compliance.


New Statutory Torts in Practice


A statutory tort is a civil wrong created directly by statute. It gives individuals legal standing to commence proceedings against those responsible for causing, or failing to act against, online harms. The new statutory torts allow victims, in serious cases, to bring civil claims directly against perpetrators and even the online platforms themselves.


Ultimately, this framework aims to clarify the duties that individual users, administrators and platforms owe to one another, and to motivate all actors to behave responsibly in the online space.


Improved User Identity Information Disclosure


The OSC can require platforms to take reasonable steps to provide specified information that may identify users suspected of committing online harms, such as collecting names or verified contact details.


If the OSC has determined that an online harm has been committed, and the applicant would like to seek legal recourse from an anonymous perpetrator, they may apply to the OSC for disclosure of the perpetrator’s user information. The OSC will then obtain the user information from platforms and disclose it to the applicant under certain conditions.



Potential Policy Challenges and Mitigation Measures


The Bill represents a sophisticated effort to ensure digital accountability without compromising fundamental user rights; its primary challenge lies in applying that accountability across a borderless online environment. While the Bill is intended to pre-emptively strengthen safety in the digital space, the Government will remain agile in mitigating any unintended consequences.


Balancing Accountability with Privacy


The Bill empowers the OSC to require platforms to disclose the identities of end-users suspected of causing online harm. While this is well-intentioned — aimed at unmasking perpetrators who exploit anonymity — it has raised concerns over potential misuse for revealing personal data. Once disclosed, such information could be exploited for reverse-doxxing or weaponised as a tool of intimidation in personal or political disputes, including by victims themselves.


To address these risks, the OSC will impose strict conditions on how disclosed identity information may be used. Victims may only use the information for limited purposes, such as pursuing statutory tort claims or protecting themselves from further harm.


Any breach of these conditions constitutes a criminal offence. In addition, the Bill introduces a new tort action against individuals who submit frivolous or false online harm notices, providing a legal check against abuse of the system.


A Chilling Effect on Public Discourse and Legitimate Speech


Furthermore, the Bill targets broad categories of conduct, such as the “online instigation of disproportionate harm” and the “publication of statements harmful to reputation”. Because the OSC may issue directions based on a relatively low threshold of a “reason to suspect” harm, rather than the higher standard of “reasonable grounds to believe”, concerns have been raised about the breadth of its discretion.


This may incentivise defensive moderation by platforms, leading them to pre-emptively remove content that is even remotely controversial or critical, with a chilling effect on free expression.


To counter this, the Government will establish a right for affected parties to request the Commissioner to reconsider a platform’s decision to remove content, followed by a right of appeal to an independent Panel comprising experts from multiple sectors.


There will also be a phased rollout of thirteen harm categories, intended to give the OSC the bandwidth to build institutional capacity and refine its interpretive approach, before extending enforcement to more ambiguous forms of harm.


Fragmentation of Communal Digital Platforms


Notably, the Bill’s definition of an “Administrator” extends legal duties, liability, and potential criminal penalties to individuals who organise or manage online spaces, such as volunteer moderators of chat groups or niche forums.


However, this added burden may deter community members from taking on moderation roles, increasing the risk that such spaces become unsustainable or dissolve altogether.


Thus, the OSC has clarified that an Administrator’s obligations are triggered only upon receipt of a written notice of harm. The courts are also directed to consider factors such as the seriousness and persistence of the alleged damage. This avoids rigid formulas and allows for context-sensitive judgments that better reflect the diverse realities of online interactions.


Ineffectiveness of Geo-Blocking


The Bill imposes a short 24-hour window for platforms to act on reports of online harm, presenting a significant technical challenge. Given that most online services operate globally, compliance often relies on geo-blocking content within Singapore.

However, geo-blocking is easily circumvented through the use of virtual private networks (VPNs). If platforms implement geo-blocking but are unable to prevent widespread VPN-based access, the OSC may still deem their response unreasonable, potentially triggering escalated enforcement actions or penalties.

To mitigate this, Singapore will expand bilateral and multilateral cooperation through regional platforms such as the ASEAN Ministerial Conference on Cybersecurity. This aims to strengthen collective responses to transboundary online harms by promoting a more consistent and shared understanding of what constitutes harmful content across jurisdictions. Still, much is to be discussed to ensure platforms do not face unfair penalties.


Legal Complexity


The Bill’s statutory tort framework overlaps with existing legal regimes, including the Protection from Harassment Act (POHA) and defamation law. This may create uncertainty for victims over the most appropriate avenue for recourse.

As a result, relief may become fragmented across multiple legal pathways, disadvantaging individuals without the resources or legal knowledge to navigate civil litigation against well-resourced perpetrators or platforms.

To reduce legal complexity, a “no wrong door” approach will be adopted, ensuring that victims receive guidance and assistance regardless of which agency they approach first.

The Government has also committed to exploring support measures, such as pro bono schemes and legal clinics, to ensure that access to justice does not depend on a victim’s financial means, to maintain equity.



Conclusion


The Online Safety (Relief and Accountability) Bill recognises that online harm, far from being a marginal or episodic issue, is a structural feature of the online environment today. It rethinks how responsibility should be distributed across victims, platforms and the state, attempting to close the gap between the speed at which online harm occurs and the pace at which remedies have traditionally been available.


A careful attempt to balance competing imperatives also emerges from the Bill. The need for swift intervention and meaningful accountability is weighed against concerns over privacy and free expression. The inclusion of procedural safeguards signals an awareness that regulatory overreach carries real risks, and these must be recognised and ameliorated. In this sense, the Bill represents an important step in Singapore’s broader effort to govern online spaces in a manner that is both protective and principled.

This Policy Explainer was written by members of MAJU. MAJU is a ground-up, fully youth-led organisation dedicated to empowering Singaporean youths in policy discourse and co-creation.


By promoting constructive dialogue and serving as a bridge between youths and the Government, we hope to drive the keMAJUan (progress!) of Singapore.


The citations to our Policy Explainers can be found in the PDF appended to this webpage.

