The Digital Paradox: Trained for AI, untrained for pressure
A cybersecurity expert warns that without emotional and algorithmic literacy, restrictions alone may leave young users just as vulnerable online.

CLASSROOMS can teach digital concepts, but they do not prepare students for the pressures of going viral, constant social comparison and emotional manipulation driven by algorithms.
Digital policy and cybersecurity researcher Dr Chua Kee Man said this gap underlies Malaysia’s so-called digital paradox: restricting social media access for children under 16 while simultaneously expanding artificial intelligence (AI) education in schools.
He said delaying social media access can be protective, particularly for children aged 12 and below, who are at a high-risk developmental stage, but it does not automatically build resilience.
“Without changes during the delay, risks such as cyberbullying, harmful comparison, manipulation and exploitation often resurface later, compounded by stronger peer pressure and social intensity.
“If nothing changes during the delay, risk returns in a more socially intense way. The delay works best when it is used as preparation time, not simply a pause,” Chua warned.
He stressed that preparation must extend beyond policy into homes and platforms. This includes consistent device rules at home, safer default settings for teens and parental alerts that keep adults involved rather than reactive.
While Chua partly agrees with the Education Ministry’s emphasis that AI education should focus on digital fluency and competency rather than recreational internet use, he cautioned that technical literacy alone does not equip students to navigate algorithm-driven social environments.
“Classrooms can teach concepts, but students need rehearsal for pressure,” he said.
While AI and digital literacy programmes explain how algorithms work, they rarely address how algorithms affect people — shaping emotions, behaviour and attention.

Based on his research, Chua identified gaps in current digital and AI literacy efforts, namely the lack of algorithmic coping strategies, such as reshaping feeds and resisting engagement traps; emotional regulation under public scrutiny or viral attention; and influence literacy, including recognising persuasion tactics, outrage bait and creator incentives.
“These skills are often absent or treated too lightly. Yet they are exactly what students need to survive real-world digital spaces,” he said.
Without such training, students may understand AI as a tool but remain vulnerable as users, especially in environments designed to maximise engagement rather than well-being.
From a cybersecurity standpoint, Chua warned that restrictions on mainstream platforms can sometimes displace risk rather than eliminate it.
“Restrictions can reduce certain mainstream harms but they can also push activity into less regulated, harder-to-monitor spaces,” he said.
These include private messaging apps, Discord-style communities and gaming-adjacent channels, where oversight and moderation may be weaker, exposing young users to scams, grooming or toxic communities.
“So a restriction reduces certain harms but can also reroute risk unless you strengthen device-level safeguards, reporting pathways and education covering private-channel threats,” he added.
On enforcement, Chua acknowledged that age-verification systems, especially ID-based ones, are robust but far from foolproof. Workarounds are common, including borrowing IDs or creating accounts with parental help.
“We are already seeing this on major platforms like TikTok. Kids are having accounts created by their parents or guardians,” he said, adding that such practices undermine policy if bans are framed as absolute rather than harm-reduction measures.
Chua stressed that parents remain the most critical factor in determining whether a social media ban protects or backfires.
“If parents lack the skills to manage devices and permissions, young people end up self-governing in secret — and that is usually worse,” he said.
He called for more public resources for parental digital literacy and device governance, rather than relying solely on prohibition.
At the same time, Chua urged greater accountability from platform operators, stressing that safety features should not be optional but the default, especially for younger users.