KUALA LUMPUR, Dec 4 (Bernama) -- The Malaysian Communications and Multimedia Commission (MCMC) is developing 10 pieces of subsidiary legislation under the Online Safety Act 2025 (Act 866), focusing on online child protection, including ensuring that content shown to child users is appropriate for their age.
According to the Communications Ministry in a written reply published on the official Parliament website, under the subsidiary regulations, service providers will be responsible for ensuring their platforms are not accessible to users under the age of 16.
Service providers will also be required to ensure that content shown to users under the age of 18 is suitable for their age group.
“In addition, service providers must offer parental control settings in line with their community guidelines or terms of use,” the ministry said in response to a question from Pang Hok Liong (PH-Labis).
Pang had asked whether the ministry was prepared to introduce legislation to restrict or prohibit youths under the age of 16 from using social media platforms such as Facebook, Instagram, TikTok and others.
To ensure transparency and accountability in meeting online safety obligations, service providers will be required to prepare an online safety plan demonstrating compliance with the requirements of Act 866, the ministry said.
The ministry added that several related laws have been formulated and enforced, including licensing requirements under which internet messaging and social media service providers that meet the licensing criteria must apply for and obtain the Class Application Service Provider Licence [ASP(C)] under the Communications and Multimedia Act 1998 (Act 588).
“This measure ensures that all licensed internet messaging and social media service providers remain accountable in terms of content regulation and algorithm management,” it said.
To support the licensing framework, MCMC has issued a Code of Conduct (Best Practices) for Internet Messaging Service Providers and Social Media Service Providers, it said.
The code outlines the responsibilities of service providers in safeguarding children and adolescents from harmful content, including requirements to implement age verification measures, provide effective parental control settings and empower child users to protect themselves from harmful material.
In addition, it said the government enacted Act 866 to regulate harmful content and establish obligations for licensed application service providers, content application service providers and network service providers.
-- BERNAMA