Online Safety Act 2025: What it means for platforms and users

With online scam losses running into billions and growing concern over child safety, the new law shifts responsibility onto digital platforms.

THE Online Safety Act 2025 (ONSA), which came into force on Jan 1, marks a significant change in how Malaysia regulates digital platforms and online harm.

Passed by Parliament in December 2024 and gazetted in May 2025, the Act operates alongside existing laws such as the Communications and Multimedia Act 1998 but introduces new obligations specifically targeting platforms that host or distribute user-generated content.

At its core, ONSA places greater responsibility on platforms to manage online risks, particularly those affecting children. The move comes amid concerns that harmful content often remains accessible because enforcement has been inconsistent, shaped largely by platforms' internal policies.

When flagged material is not removed promptly, a relatively small number of unresolved cases can still expose large numbers of users to online harm.

ONSA was introduced against the backdrop of a sharp rise in online harm in Malaysia.

Police reportedly recorded RM2.7 billion in losses from online scams between January and November 2025.

On Oct 24, police announced that they had dismantled a criminal network linked to child sexual abuse material (CSAM), arresting 31 individuals and seizing more than 880,000 digital files.

Since 2022, authorities have ordered the removal of 38,470 items linked to cyberbullying and online harassment.

Data from the Malaysian Communications and Multimedia Commission (MCMC) shows that while major platforms removed about 92 per cent of the 697,061 posts flagged as harmful between January 2024 and November 2025, more than 58,000 posts remained accessible online.

Even a shortfall of one per cent, regulators note, can leave thousands of harmful posts circulating.

Graphics by Nurain Sofia.

The growing use of automated systems, artificial intelligence and deepfake technology has further complicated detection and enforcement efforts.

ONSA applies to application service providers and content application service providers, including both local and foreign platforms that operate in or target users in Malaysia.

The Act identifies several categories of harmful content, including CSAM, online scams and financial fraud, obscene or pornographic material, harassment and abusive communications. It also covers content linked to violence or terrorism, material that encourages self-harm among children, content that promotes hostility or disrupts public order and content related to dangerous drugs.

Rather than policing individual posts, the law focuses on how platforms manage risks within their systems, including content distribution, recommendation algorithms and user interaction features.

Platforms are required to identify and reduce exposure to ‘priority harms’, ensure certain content is made inaccessible and provide reporting channels and user support mechanisms. They must also submit an online safety plan outlining how risks are addressed.

Enforcement measures are available for non-compliance, although the Act sets out procedural safeguards governing how regulatory directions are issued and reviewed.

A key feature of ONSA is its emphasis on protecting children online.

Platforms are required to implement safeguards such as age-appropriate controls and restrictions on access to harmful material. These measures are expected to influence how default settings, content discovery tools and interaction features are designed for younger users.

ONSA does not apply to private one-to-one messaging, nor does it authorise general monitoring of users. It also does not introduce new criminal offences related to lawful speech or political expression.

Safeguards built into the framework include notice requirements before enforcement action, opportunities for platforms to make representations, public records of regulatory directions and access to appeal and judicial review mechanisms.

While ONSA establishes a formal regulatory structure for online safety, it does not replace broader efforts such as digital literacy education, parental supervision or community awareness initiatives.

As implementation progresses, its real-world impact will depend on how platforms adjust their systems and how consistently regulatory oversight is applied.