Ctrl-Alt-Regulate: Establishing a Safe and Conducive Social Media Space
- Joyden Choo, Keegan Goh, and Siddhant Vaduvur
- Mar 27

In this Explainer, find out...
What are some issues arising from social media usage, especially among youth?
What are the current strategies that Singapore adopts to combat such issues?
What are some implications and considerations when implementing these strategies?
Introduction
On 7 January 2025, Minister of State for Digital Development and Information Rahayu Mahzam announced Singapore’s intention to legislate age limits that bar underage youths from accessing social media platforms. While it may seem surprising, this announcement aligns with Singapore’s history of strong stances towards harmful digital content, especially in recent years.
As Singapore continues to debate the feasibility of social media bans, it is worth revisiting the latest policies that have forged a conducive and safe social media space in Singapore. These include the Online Safety (Miscellaneous Amendments) Act, and the recently updated Code of Practice for Online Safety.
In this Policy Explainer, we will be exploring the issues arising from social media usage in Singapore, and the strategies to combat them. We will analyse the effectiveness of these strategies, including some limitations and unintended consequences.
Threats Of Social Media
Research increasingly links social media use to mental health issues among its largest demographic — youth. In 2022, the Institute of Mental Health (IMH) conducted its first National Youth Mental Health Study, surveying 2,600 Singaporean youths aged 15 to 35. The results were worrying — nearly one-third of Singaporean youths reported mental health issues, ranging from anxiety to severe depression. Experts pointed to excessive social media use as a key driver of the factors linked to youths’ mental health issues, such as cyberbullying and body image anxieties.
This resonates with a speech by Prime Minister Lawrence Wong in February 2024, which highlighted “the constant pressure to present a positive image online, the fear of missing out, algorithms that flood news feeds with stories that are designed to spark outrage, and cyberbullying” as factors associated with a surge in mental health issues amongst youth. Research has also found that youths are easily influenced by content on social media, and are thus more susceptible when such content perpetuates harmful narratives.
As social media is rapidly integrated into the daily practices of Singaporean youth, the mental toll of unfettered social media usage also becomes more deeply embedded. Therefore, more regulation is needed to safeguard youth against the harms of excessive social media usage.
In the next section, we will explain how various strategies work hand in hand to combat the dangers of social media, especially for children and youth.
Online Safety (Miscellaneous Amendments) Act (2022)
The Online Safety (Miscellaneous Amendments) Act 2022 took effect from 1 February 2023. It aims to create a safer online environment for Singaporean users and protect youths from harmful content. Under the Act, the Infocomm Media Development Authority (IMDA) is empowered to designate Social Media Services (SMSs) and App Distribution Services (ADSs), which must then comply with the Codes of Practice for Online Safety.
Code of Practice For Online Safety – Social Media Services (SMSs)
Effective 18 July 2023, the Code of Practice for Online Safety – Social Media Services applies to SMSs that IMDA designates as having significant reach or impact in Singapore. Examples of designated SMSs include Instagram, TikTok and YouTube.
The Code strives to achieve three main objectives: enhancing user safety, empowering users, and ensuring accountability.
Enhancing User Safety
To minimise users’ exposure to harmful content, designated SMSs must implement measures to comply with IMDA’s safety requirements.
Age-Appropriate Guidelines and Measures
Designated SMSs must establish community guidelines and content moderation measures to tackle harmful content. The guidelines must minimally cover these main categories of content:
Sexual content;
Violent content;
Suicide and self-harm content;
Cyberbullying content;
Content endangering public health; and
Content facilitating vice and organised crime.
Children’s accounts are subject to separate guidelines and measures, written so that children can easily understand them. SMSs must also ensure that these accounts do not receive advertisements or promoted content that is harmful to children’s well-being.
User Tools
Designated SMSs must implement customisable settings so that users can ensure a safe online experience for themselves. These include tools to hide harmful content, block unwanted interactions, restrict the visibility of their accounts, and limit location sharing. Parents or guardians can manage these settings on their children’s accounts.
For instance, Instagram, a designated SMS, has moved users under 18 into Teen Accounts since 21 January 2025. There are three main distinctions between a regular account and a Teen Account: automatic private account creation, sensitive content restriction, and parental controls. A private account requires users to manually approve follow and message requests, thus protecting underage users from unwanted attention.
Additionally, Teen Accounts restrict sensitive content, such as videos of people fighting, promotions of cosmetic procedures and violent movie scenes, from appearing on feeds. Lastly, parental controls allow parents to set usage limits, gain insights into who their child is talking to, and view the topics their child has been looking at.
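As an illustration, the teen-account defaults described above amount to a simple rule keyed on the user’s age. The sketch below is hypothetical, not Instagram’s actual implementation; the names `AccountSettings` and `settings_for` are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    private: bool                   # must approve follow/message requests
    sensitive_content_filter: str   # "standard" or "strict"
    parental_controls: bool

def settings_for(age: int) -> AccountSettings:
    """Return default settings; under-18s get the stricter teen defaults."""
    if age < 18:
        # Teen accounts: private by default, strict content filtering,
        # and parental controls enabled.
        return AccountSettings(private=True,
                               sensitive_content_filter="strict",
                               parental_controls=True)
    return AccountSettings(private=False,
                           sensitive_content_filter="standard",
                           parental_controls=False)
```

The key design point is that the safer configuration is the default for underage users, rather than something they must opt into.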
Empowering Users
Designated SMSs must establish reporting mechanisms that are readily accessible to users, allowing them to report harmful content or unwanted interactions promptly. They must also respond in a timely and diligent manner, while further informing users of the actions taken. This ensures transparency and clear communication to the affected users.
Ensuring Accountability
Lastly, designated SMSs are required to submit an annual online safety report to IMDA. These reports will be published on IMDA’s website, with information on SMSs’ efforts to combat harmful content.
Code of Practice For Online Safety – App Distribution Services (ADSs)
Effective 31 March 2025, the Code of Practice for Online Safety – App Distribution Services empowers IMDA to designate ADSs with significant reach or impact, such as the Apple App Store, Google Play Store, and Huawei App Gallery. This is important as ADSs are key gateways to accessing apps, which may contain inappropriate content. Similar to designated SMSs, designated ADSs must implement measures to comply with IMDA’s safety requirements.
Age Assurance Measures
Age assurance measures aim to prevent underage users from downloading apps that are not appropriate for their age group. These measures have been studied worldwide, including in Australia, the United Kingdom and the European Union.
Age assurance measures aid in establishing the likely age or age range of the user. They include methods such as:
Age estimation: Using systems such as artificial intelligence, machine learning technology or facial age analysis algorithms; and
Age verification: Using verified sources of identification such as digital ID or credit cards.
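The two methods above can be sketched as a simple fallback: use a verified age when one is available, otherwise fall back to the range an estimation model produces. This is a hypothetical illustration; `assure_age` and `may_be_child` are invented names, and real age assurance systems are considerably more involved.

```python
from typing import Optional, Tuple

def assure_age(verified_age: Optional[int],
               estimated_range: Optional[Tuple[int, int]]) -> Tuple[int, int]:
    """Return the (min, max) age range that can be asserted for a user.

    Verified identification (e.g. a digital ID) yields an exact age;
    otherwise fall back to the range from an age estimation model.
    """
    if verified_age is not None:
        return (verified_age, verified_age)
    if estimated_range is not None:
        return estimated_range
    # No signal at all: conservatively treat the user as possibly a child.
    return (0, 17)

def may_be_child(age_range: Tuple[int, int], threshold: int = 18) -> bool:
    # The user counts as possibly underage if any part of the
    # plausible range falls below the threshold.
    return age_range[0] < threshold
```

The conservative default matters: when neither method yields a result, the service should assume the user may be a child rather than grant access.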
Designated ADSs will have to submit an implementation plan for their age assurance measures to IMDA. The plan should include a proposal on how the ADS intends to determine whether a user is a child, while complying with current data protection rules. Initially, designated ADSs should restrict children from their highest age-rated apps, such as those limited to users aged 18 years and above.
How Do The Codes Complement Each Other?
The Codes act together as a robust, triple-layer safeguard.
Firstly, the Code of Practice for Online Safety for App Distribution Services prevents children and youth from downloading age-inappropriate apps at the outset.
Despite this measure, harmful content may still be accessible on downloadable social media platforms. This is where the Code of Practice for Online Safety – Social Media Services plays a crucial role. As a second layer, designated SMSs enforce mandated community guidelines which actively filter harmful content. Users and designated SMSs can, respectively, report and respond to harmful content, ensuring swift content removal.
As a final line of defence, IMDA can issue orders to SMSs and ADSs to take down egregious content promptly. Together, stakeholders such as designated SMSs and ADSs, users, parents and IMDA can collaborate to provide a safe online environment. That said, there are implications arising from such strategies. The next section will explain these implications while considering how other countries have responded to similar issues.
Potential Impacts of The New Codes of Practice
Potential Issues with Data Protection
Before the implementation of these policies, users were simply notified about the minimum age required to use certain social media platforms. With no proof of age required, younger users could access these platforms by misrepresenting their true age. To tackle this, the new Codes of Practice ensure that policies to safeguard children against harmful content on social media are enforceable and verifiable. However, this will require data collection.
Notably, current data protection provisions for users have yet to receive updates in line with the new Codes of Practice. The main policy in Singapore on data protection, the Personal Data Protection Act (PDPA) 2012, was last amended in 2022. While the Personal Data Protection Commission (PDPC) has released Advisory Guidelines on the PDPA for Children’s Personal Data in the Digital Environment, these guidelines are not currently mandated by law.
This may raise concerns over the safety of users’ collected data. Depending on each service’s implementation plan, SMSs and ADSs may conduct online profiling to ascertain the age or age range of their users, or collect and analyse data from national identity documents.
Nonetheless, data concerns have been considered by the Government. Designated SMSs and ADSs are required to practise data minimisation, collecting only the minimum amount of personal data necessary to establish the age or age range of a user.
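A minimal sketch of data minimisation, assuming the service only needs to know whether a user is under 18: derive an age band from the date of birth, then keep only the band. The function name and record shape are hypothetical.

```python
from datetime import date

def minimised_record(user_id: str, date_of_birth: date, today: date) -> dict:
    """Keep only what age assurance needs: an age band, not the birth date."""
    # Compute age in whole years, accounting for whether this year's
    # birthday has passed.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    band = "under_18" if age < 18 else "18_or_over"
    # The date of birth itself is never stored or returned.
    return {"user_id": user_id, "age_band": band}
```

The stored record answers the one regulatory question (is this user a child?) without retaining the identifying detail used to answer it.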
Moreover, Singapore will consider international practices as it tackles concerns of data protection. For instance, it has engaged Australia on its social media laws. Australian SMSs and ADSs must destroy personal information collected through age assurance measures, and are barred from using such information for any other purposes beyond age verification. It remains to be seen how Singapore’s data protection policies will evolve with its stance on social media.
Switching to Unregulated Platforms
It is highly challenging for the Government to completely regulate all social media platforms accessible to youth. The presence of unregulated platforms thus raises concerns about the effectiveness of our Codes of Practice in minimising social media usage among youth.
The new Codes of Practice apply only to designated SMSs and ADSs deemed to have a significant impact on youths and adults in Singapore. However, their effectiveness could be limited if youths choose to shift over to alternative, unregulated social media platforms.
Further, the policies exclude widely used online platforms that offer social media features but whose primary function is not social media, such as WhatsApp, Discord, and Telegram. This concern was raised by Member of Parliament Louis Ng in 2022, who cited examples of e-commerce platforms, online games, and semi-private communities that currently lack regulation. This is especially significant as these platforms have large youth user bases, yet fall outside current regulations.
Hampering Youth Representation and Awareness
Preventing youths from accessing social media platforms may also limit their ability to represent themselves. This concern has been raised in Australia, where those under the age of 16 face a social media ban. For example, the Australian Human Rights Commission asserts that “the law may infringe [on the] human rights of young people by interfering with their ability to participate in society”.
Further, restrictions on social media could hinder youths’ ability to use these platforms to engage their communities. This may undermine one of Singapore’s most prevalent trends amongst youth — youth volunteerism. Secondary schools throughout Singapore encourage their students to engage in self-directed volunteerism. Should new regulations restrict youths’ access to social media platforms, the outreach of youth volunteer projects may be hampered and their impact limited.
Moreover, the new Codes of Practice restrict the content that youths are exposed to. Information that is not intended to cause harm may be obscured from youths’ social media feeds because it covers sensitive topics, such as raising awareness of sexual abuse and discrimination. This is worrying, as social media is a key avenue for youths to receive updates on current affairs, as well as a platform for them to find groups and causes that they resonate with.
As such, the new restrictions on social media may not only result in a less represented youth demographic, but also a less informed one that is unable to discuss important yet sensitive topics that affect our society.
Conclusion
Currently, the effectiveness of Singapore’s framework for regulating social media access remains to be seen. The framework demands accountability from platforms that have traditionally left the responsibility of online safety to their users, reflecting Singapore’s persistent efforts to safeguard its youth.
Singapore must continue working with its youth, parents, and society as a whole to allow them to see the benefits of these regulations, and how they can play their part in identifying and responding to egregious content on social media. To mitigate the harms of social media, Singapore must forge connections within society, and progress towards building a safe and conducive digital space together.
This Policy Explainer was written by members of MAJU. MAJU is a ground-up, fully youth-led organisation dedicated to empowering Singaporean youths in policy discourse and co-creation.
By promoting constructive dialogue and serving as a bridge between youths and the Government, we hope to drive the keMAJUan (progress!) of Singapore.
The citations to our Policy Explainers can be found in the PDF appended to this webpage.