Singapore tightens online safety rules with new requirements for app distributors

Updated as of: 28 March 2025

The largest app stores will need to adopt age assurance and content moderation – among other measures – to fulfil their enhanced online safety obligations that take effect on 31 March 2025.


Singapore’s Online Safety Code of Practice for App Distribution Services (ADS), which requires designated ADSs to implement measures aimed at safeguarding users from harmful content, will take effect on 31 March 2025. 

The Code applies to ADSs with “significant reach or impact,” namely the Apple App Store, Google Play Store, Huawei App Gallery, Microsoft Store and Samsung Galaxy Store. These services will need to implement system-level measures, including age assurance, to protect users, especially children.

ADSs are the second category of “online communication service” to be designated under the Broadcasting Act 1994 (BA 1994). Safety requirements were first introduced for social media platforms in July 2023. This designation enables the Infocomm Media Development Authority (IMDA) of Singapore to impose specific online safety requirements and issue penalties for non-compliance.  

For violations of the IMDA’s Online Safety Codes, designated companies may be fined up to SGD 1 million (US$747,000); for continuing offences, a further fine of up to SGD 100,000 (US$75,000) may be imposed for each day the offence continues. The IMDA also has the power to require companies to block access to “egregious content” for users in Singapore.

This latest step comes as part of a broader drive in the Asia-Pacific region to improve online safety standards. For instance, social media companies will soon be required to prevent under-16s in Australia from owning accounts on their platforms as part of a nationwide ban, while Malaysia passed its own Online Safety Bill 2024 at the end of last year. 

Key provisions

The Code sets out six categories of harmful content that designated ADSs must act to protect users from:

  • sexual content;
  • violent content;
  • suicide and self-harm content;
  • cyberbullying content;
  • content endangering public health; and
  • content facilitating vice and organised crime. 

User protections 

Designated ADSs are required to implement “reasonable and proactive measures” to minimise users’ exposure to harmful content. This should include, but is not limited to, establishing a set of content guidelines and standards for app developers and reviewing the content of apps and app updates before they become available to users.

Action should be taken against apps that breach an ADS’s guidelines; this may include warnings, suspensions or bans.

Users must be provided with an effective, transparent and easily accessible channel for reporting harmful content, and ADSs must respond to such reports in a timely manner, proportionate to the potential risk of harm.

ADSs are also required to utilise technologies capable of detecting and removing the most serious forms of harmful content, including child sexual abuse material and terrorism content.

Child protection measures 

The Code includes numerous provisions to ensure ADSs take specific steps to safeguard children. For example, children must not be targeted with content that is likely to be detrimental to their physical or mental well-being.

To this end, ADSs are required to adopt age assurance measures to determine users’ age with “reasonable accuracy.” Unless the service prohibits child users, children must be offered restricted accounts by default with settings to minimise their access to harmful and inappropriate content.

User-generated content 

Unless an ADS explicitly prohibits apps that enable users to share their own content, it must require apps on its service to have in place:

  • content moderation measures to detect, assess, and remove harmful content; and
  • an in-app channel for users to report harmful content.

Accountability 

ADSs must offer users clear and easily comprehensible information to help them assess the safety measures the service provides. ADSs are also required to submit annual online safety reports to the IMDA, covering information such as the volume of user reports of harmful content and the time taken to respond to them. These reports will be published on the IMDA’s website.
