Malaysia faces industrial-scale CSAM threat, under-16 protection becomes critical

Experts have warned that without robust technical and regulatory measures, minors remain highly vulnerable to exploitation, grooming and coercion in increasingly sophisticated online networks.

WAN AHMAD ATARMIZI
02 Dec 2025 04:19pm
Photo for illustration purposes only. - CANVA

SHAH ALAM - Malaysia faces an escalating threat from industrial-scale Child Sexual Abuse Material (CSAM), prompting urgent calls for stronger safeguards, effective age verification and enhanced platform accountability to protect children under 16.

Child activist Datuk Dr Hartini Zainudin highlighted the severity of the issue.

“CSAM in Malaysia has escalated from isolated incidents to an industrial-scale problem. Content is now produced and circulated across mainstream apps, encrypted groups and dark-web channels, with cashless payments such as e-wallets, QR codes and cryptocurrency.

“This facilitates anonymous transactions and turns CSAM into a profitable underground market. Files spread rapidly, being duplicated, resold and mirrored within minutes. Offenders now include opportunistic users and teenagers, reflecting both greater accessibility and the rise of online grooming.

“Global reporting indicates a sharp increase in Malaysia-linked cyber tips, signalling an expanding and increasingly hard-to-track ecosystem,” she told Sinar Daily.

Hartini said technical safeguards for detecting CSAM remained inconsistent across platforms, noting that hash-matching was not applied universally to all upload points, including cloud backups, private chats and group uploads.

She also pointed out weaknesses in video and livestream detection, with many platforms relying primarily on user reports.

Encrypted or closed groups continued to be blind spots with minimal monitoring, while artificial intelligence (AI) tools designed to identify new CSAM were underutilised or applied only to small portions of content.

“Real-time reporting between platforms and law enforcement is largely absent.

“Urgent technical measures are needed, including universal hash-matching for images and videos, AI-based classifiers to detect previously unseen CSAM at upload, behavioural risk scanning in encrypted spaces and rapid law-enforcement escalation systems,” she added.
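Hash-matching, the first of those measures, works by comparing a compact fingerprint of each uploaded file against a database of fingerprints of already-identified material. The sketch below is a minimal illustration in Python, not any platform's actual system: the database entry is an invented placeholder, and production deployments rely on perceptual hashes such as Microsoft's PhotoDNA or Meta's PDQ, which survive resizing and re-encoding, rather than the plain cryptographic hash used here for simplicity.

```python
import hashlib

# Placeholder database: in reality this would hold fingerprints of
# material already verified by bodies such as NCMEC or Interpol.
# The hash below is an invented example value.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying the file's exact contents."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> bool:
    """Return True if an upload matches a known-bad fingerprint.

    For the check to be "universal" in the sense Hartini describes,
    it would have to run at every upload surface: public posts,
    private chats, group uploads and cloud backups alike.
    """
    return fingerprint(data) in KNOWN_HASHES
```

The coverage gap she identifies lies less in the matching step itself than in where it runs: a check applied only to public posts leaves private chats, group uploads and cloud backups unscreened.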

Hartini emphasised the critical need to protect minors, pointing out that children under 16 are particularly vulnerable because their impulse control is still developing, which increases their susceptibility to grooming, coercion and online extortion.

She cited Malaysian data showing that 24 per cent of children have been exposed to unwanted sexual content, while four per cent have experienced clear online sexual exploitation.

Hartini added that once such images circulate, the resulting harm is often permanent.

“Current age-assurance measures are inadequate, as ‘age gates’ are easily bypassed. Effective age assurance requires ID- or mobile-based verification on high-risk platforms, privacy-protective biometric age estimation when needed and layered systems combining behavioural signals with device settings.

“Safe access pathways must also be provided for refugee, stateless or undocumented children,” she said.
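The "layered systems" Hartini describes can be pictured as a decision rule that combines independent signals, any one of which can deny access. The sketch below is purely illustrative: the signal names, thresholds and logic are assumptions made for this example, not any regulator's or platform's actual policy.

```python
from dataclasses import dataclass

# Illustrative only: the signal names and thresholds below are
# assumptions made for this sketch, not any platform's real policy.
@dataclass
class AgeSignals:
    id_verified_adult: bool       # outcome of ID- or mobile-based verification
    estimated_age: float | None   # privacy-protective biometric estimate, if any
    device_child_profile: bool    # device-level parental-control setting
    behaviour_risk: float         # 0.0-1.0 score from behavioural signals

def allow_high_risk_access(s: AgeSignals) -> bool:
    """Layered decision: any strong indication of a minor denies access."""
    if s.device_child_profile:                 # device says this is a child
        return False
    if s.estimated_age is not None and s.estimated_age < 16:
        return False                           # biometric estimate under 16
    if s.behaviour_risk > 0.7 and not s.id_verified_adult:
        return False                           # risky behaviour, no ID proof
    # Access requires at least one positive adult signal, not merely
    # the absence of child signals.
    return s.id_verified_adult or (
        s.estimated_age is not None and s.estimated_age >= 18
    )

# Example: a device-level child profile overrides every other signal.
signals = AgeSignals(id_verified_adult=False, estimated_age=None,
                     device_child_profile=True, behaviour_risk=0.2)
assert allow_high_risk_access(signals) is False
```

The design choice such a layering reflects is that no single check, such as an easily bypassed age gate, is trusted on its own.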

The child activist emphasised that children involved in CSAM should be treated as victims rather than offenders.

She said the principle was clear: children drawn into CSAM channels were never “young criminals” but young individuals failed by the adults, platforms and systems meant to protect them.

The urgency of these measures was underscored by a nationwide operation by the police and the Malaysian Communications and Multimedia Commission from Sept 23 to 30, which uncovered more than 880,000 files linked to CSAM and led to the detention of 31 individuals across 37 locations.

Authorities found that illegal content circulated through private messaging services, social platforms and dark-web channels, often involving anonymous accounts, closed groups and cashless payments, and that youths were sometimes exposed to or involved in these high-risk environments.

Officials stressed that platform accountability and proactive safeguards were essential to preventing minors from accessing harmful content.

This aligns with the government’s broader plan to introduce a minimum social media age of 16, supported by age-verification and identity-assurance mechanisms to keep underage users out of high-risk online spaces.
