CSAM distributors exploiting digital filter loopholes to circulate explicit material - MP

Government alone cannot monitor content; platforms must upgrade security now

NORAFIDAH ASSAN
10 Dec 2025 03:19pm
CSAM distributors are becoming more deceptive by using fake accounts and certain applications such as Telegram to form private groups and closed channels for spreading illicit material.

SHAH ALAM – Child Sexual Abuse Material (CSAM) is now circulating more covertly through private channels, closed groups and anonymous accounts across multiple digital platforms, raising fresh alarm among policymakers.

Bentong MP Young Syefura Othman said distributors were becoming increasingly deceptive by using fake accounts and applications such as Telegram to form private groups and closed channels for distributing explicit material.

According to her, these methods made harmful content difficult for authorities to detect and highlighted the need for tighter regulation over digital platform operations.

“That is why we need to make it compulsory for platforms to conduct filtering. Platforms must take responsibility to screen any content sent through their platform,” she said.

She stated that current digital filtering systems still relied heavily on databases of previously identified content, creating loopholes that modified material could exploit.

“The existing digital filtering systems mostly rely on databases of known content, which creates an opening for modified material to escape detection.

“These CSAM distributors are cunning; they use artificial intelligence (AI), make small alterations and paste someone else’s face onto another person’s body, so the content frequently slips through the filters,” she told Sinar.

Young Syefura urged platforms to upgrade their security and screening systems immediately to match the rapid evolution of technology, adding that it was unrealistic for the government alone to monitor the massive daily volume of uploaded content.

She stressed that platform providers bore the primary responsibility for early intervention, while the government could play its role by tightening licensing requirements.

“However, the role of users, especially parents, is far more important in ensuring children’s safety in the digital world.

“Children today are smart. We block websites, they already know how to use virtual private networks (VPNs). So parents must monitor, conduct surprise checks, and see who their children are communicating with online,” she said.

She added that efforts to control internet use could not rely solely on schools without active involvement from parents.

Children under 16 vulnerable to exploitation

Young Syefura warned that children below 16 faced greater risks of sexual exploitation due to their psychological and social immaturity.

She said they were more susceptible to manipulation and grooming through social media, often without parents noticing.

“We always remind children not to talk to strangers face to face, but we do not know who our children are in contact with online. This is more dangerous and requires monitoring.

“That is why limiting the use of social media for those under 16 is a preventive measure in line with global standards, but the mechanism must be implemented clearly and firmly by platforms,” she said.

She also reminded parents not to use their children as social media content, warning that early online exposure could harm their development.

Commenting on platform responsibility in curbing CSAM, she said the speed at which such content spreads requires proactive early filtering rather than waiting for complaints.

“When we detect it, the content has already spread far and been reproduced. Any offenders we catch must be banned,” she said.

Young Syefura further proposed that the Sexual Offender Registry be made accessible online for public checks, noting that perpetrators could come from any background, including individuals close to the victim.
