Safeguarding the Digital Playground: New Regulations in Australia, Indonesia, and Singapore for Protecting Minors on Social Media

In an era of ubiquitous social media usage, governments worldwide are grappling with the challenges of regulating digital platforms, and protecting minors from harmful content and mental health risks is a top priority. Recent developments in Australia, Indonesia, and Singapore show governments proactively introducing measures to protect minors, who are particularly susceptible to the harms associated with social media.

Australia

Australia passed the Online Safety Amendment (Social Media Minimum Age) Act 2024 (“Amendment Act”), amending the Online Safety Act 2021. Social media platforms will have approximately 12 months to implement the necessary systems to prevent children under 16 from creating accounts on their platforms.

A. Key Definitions

The new regime applies to “age-restricted social media platforms”, which are defined as electronic services specified in the legislative rules, or which satisfy all of the following conditions:

  • the sole purpose, or a significant purpose, of the service is to enable online social interaction between 2 or more end-users;
  • the service allows end-users to link to, or interact with, some or all of the other end-users; and
  • the service allows end-users to post material on the service.

B. Scope of Application

This definition is broad, and the Australian Parliament has noted that services such as TikTok, Facebook, Snapchat, Reddit, Instagram and X (formerly Twitter) are likely to be captured under the new rules, among others. However, a platform whose electronic services are not accessible by, or delivered to, any end-users in Australia will not be affected by the new regime.

While the provision of advertising material, and the revenue generated from it, will be disregarded when determining whether the sole or a significant purpose of a platform is to enable online social interaction between end-users, organisations that provide advertising services or generate revenue from such services are not automatically exempt from the new obligations. These organisations could therefore still be subject to the new regime if their services meet the criteria outlined above.

C. Key Obligations

Age-restricted social media platforms will be required to take reasonable steps to prevent children below the age of 16 from having accounts, which means that all Australian users of these platforms will need to verify their age. ‘Reasonable steps’ is not defined in the Amendment Act. While no technical specifications have been published thus far, Parliament has noted that some level of age assurance will be necessary and that guidelines will be issued in due course.

D. Penalties & Enforcement

If an age-restricted social media platform fails to take reasonable steps to prevent age-restricted users from having accounts, it may be liable to pay a fine of up to c. USD 31 million. The Commissioner may issue a written notice requiring an age-restricted social media platform to provide information about its compliance with the new obligations, or requiring an electronic service provider to provide information to determine whether its service constitutes an age-restricted social media platform. A failure to comply with such a notice carries a civil penalty of up to USD 100,000. Further, the Commissioner may publish a statement of non-compliance on its website, which could harm the provider's reputation.

Indonesia

Following Australia’s lead, Indonesia’s Communications and Digital Minister (“Minister”) announced on 13 January 2025 that the Indonesian government was considering similar regulations on children’s use of social media. There have been concerns that such measures could restrict children’s access to information and stunt their development of digital literacy, and the Indonesian government has committed to examining the options open to it. With an emphasis on child protection, the government is seeking to shield children from “physical, mental, or moral perils” and to prepare them to use social media in a positive manner. The Minister clarified that the government intends to pass a permanent law implementing a minimum age for access to social media but needs to study the situation before taking concrete steps. In the interim, she noted that the government will issue a regulation outlining certain preliminary measures.

Singapore

The Infocomm Media Development Authority (“IMDA”) has issued a Code of Practice for Online Safety for App Distribution Services (the “Code”). The Code will require designated App Distribution Services (“ADSs”) to implement measures limiting the risk that users, especially vulnerable children, are exposed to harmful content. The Code will take effect from 31 March 2025.

A. Key Definitions

The Code aims to protect:

  • ADS Users: A person who accesses an ADS, including a person who downloads apps and app updates from the ADS, and who is not an App Provider.
  • Children: Individuals who are below 18 years old.

The Code imposes obligations on ADSs and App Providers to protect users. ‘App Providers’ are defined as persons that cause an app to be distributed or made available for download by means of an ADS.

B. Key Obligations

The Code seeks to limit the spread of harmful content, including sexual content; violent content; suicide and self-harm content; cyberbullying content; content endangering public health; and content facilitating vice and organised crime.

The obligations under the Code fall into 3 categories: (1) user safety; (2) user reporting and resolution; and (3) accountability.

1. User Safety

The Code imposes three main obligations on ADSs:

  1. Limit Harmful Content: ADSs must implement reasonable measures to limit users' access to harmful content by establishing content guidelines, standards, and moderation measures. This includes reviewing apps and updates to ensure compliance and taking enforcement actions like warnings, suspensions, or bans.
  2. Online Safety Information: ADSs should provide clear and intelligible information about online safety, including content guidelines, moderation measures, actions for breaches, and reporting mechanisms for harmful content.
  3. Minimize Exposure to Exploitation and Terrorism Content: ADSs must take steps to detect and remove child sexual exploitation and terrorism content quickly. This involves working with App Providers to prevent the spread of prohibited content through established guidelines and standards.

Furthermore, the Code imposes obligations specifically to protect Children.

  1. ADSs must implement reasonable measures to minimize children's exposure to harmful and inappropriate content. This includes establishing content guidelines and moderation measures, with stricter standards for sexual, violent, suicide, self-harm, and cyberbullying content. Additionally, ADSs should ensure that content detrimental to children's well-being is not marketed to them.
  2. ADSs must implement specific measures that limit the level of access children have: 
    • Children must have differentiated accounts with robust, age-appropriate settings to limit harmful and inappropriate content by default. 
    • ADSs must establish systems and processes, like age verification, to accurately determine users' age. This must be in line with the provisions of the Personal Data Protection Act 2012 and guidelines issued by the Personal Data Protection Commission. 
    • If age verification systems are yet to be implemented, ADSs will need to present an implementation plan and timeline for approval by the IMDA. 
    • Children and their parents/guardians must be warned about the implications of opting out of default settings and should have access to information and tools to manage children's safety and content exposure. 
    • Information on online safety, including tools and steps for supervision, should be easily accessible and understandable by children and their parents/guardians.
  3. Unless apps with user-generated content functionalities are expressly prohibited on the ADS, the ADS must ensure that App Providers have implemented content moderation measures to identify and remove harmful content, as well as internal channels for ADS Users to report any harmful or inappropriate content. ADSs must take appropriate action against App Providers who fail to resolve such reports.

2. User Reporting and Resolution

ADSs must provide simple and effective reporting and resolution mechanisms for ADS Users to report harmful or inappropriate content, and ADSs must address such reports adequately and in a timely manner.

Where an ADS User submits a report that is not frivolous or vexatious, the ADS must promptly inform the ADS User of its decision and the action taken, unless the user has expressly opted out of being kept informed. The ADS User must also be allowed to request that the ADS review its decision and any action taken.

Further, where ADSs remove a reported app from the app store or disable an App Provider’s account due to their reported app, all ADS Users who downloaded the reported app within the preceding 6 months must be informed as soon as reasonably practicable.

3. Accountability

The Code requires that ADS Users be given access to clear and simple information enabling an informed assessment of the ADS’s level of safety and related safety measures. To that end, ADSs must submit annual online safety reports to IMDA for publication on its website. These reports should include suitable information and metrics reflecting Singapore ADS Users’ experience of the ADS, including the steps taken to limit ADS Users’ exposure to harmful or inappropriate content, the effectiveness of those steps, and the actions taken in response to ADS Users’ reports.

The metrics, which are subject to agreement by IMDA, should include the:

  • volume, breakdown by content category and outcomes of ADS User reports regarding the categories of harmful and inappropriate content;
  • prevalence and breakdown by content category of apps that provide harmful or inappropriate content, as well as the App Providers supplying the apps;
  • promptness of the ADS in reviewing and addressing ADS Users’ reports; and
  • effectiveness of proactive steps in identifying and addressing apps, accessible by ADS Users in Singapore, that violate policies against harmful or inappropriate content.

C. Penalties & Enforcement

An ADS that fails to comply with the Code may be ordered to pay a fine of up to S$1 million or directed to take any necessary steps to remedy the non-compliance. If an ADS does not comply with such a direction, it may be liable to a fine of up to S$1 million.

Conclusion

The introduction of stricter social media regulations in Australia, Indonesia, and Singapore underscores a significant evolution in legal approaches to online child safety. Australia's and Indonesia's focus on age restrictions addresses foundational concerns about minors' access to social platforms, while Singapore's Code introduces a comprehensive regulatory framework for app distribution services. These developments reflect a global trend towards greater accountability for social media platforms.

The information provided above does not, and is not intended to, constitute legal advice on the Australian, Indonesian and Singaporean regulatory regimes; the information, content, and materials set out above are based on our reading of the relevant instruments and are for general informational purposes only.