Overview
Police in Indonesia and security agencies across Southeast Asia report a spike in teenagers influenced by white supremacist propaganda and by celebratory content honoring mass attackers. The phenomenon gained particular attention after a November 7 bombing at a Jakarta high school that injured 96 people. When officers detained a teenage suspect in connection with the attack, they found a life-size toy rifle etched with phrases including "welcome to hell" and the names of white supremacist mass killers.
Indonesian authorities say that at least 97 youths - the youngest only 11 years old - are being monitored after exposure to material that glorifies mass violence and white supremacists, much of which spread on messaging platforms such as Telegram. Officials said at least two of those youths were planning attacks in the aftermath of the Jakarta bombing.
Regional spread and law enforcement response
Security officials in Indonesia, Singapore, Malaysia, Thailand and the Philippines describe a wider regional problem: teenagers plotting violence inspired by far-right actors such as Brenton Tarrant, the Christchurch mosque attacker. Singapore's Internal Security Department (ISD) has detained four youths since December 2020 on grounds that they subscribed to "violent far-right extremism ideologies" and were planning attacks. The ISD has named far-right extremism as a top national security threat.
Authorities note that none of the teenagers being monitored in Singapore and Indonesia are white. In some cases young people framed planned attacks as a means to "protect" the existing racial or religious makeup of their countries; in other instances, security officials say, the youths were inspired by the violent acts of far-right attackers despite lacking comparable personal or political grievances.
Officials in both countries warn that social media posts and online communities have been central to the radicalization pathways in the investigated cases. Telegram groups in particular were cited by Indonesian police as providing a sense of belonging to isolated or disaffected youths. Police commissioner Mayndra Eka Wardhana, a spokesperson for the Indonesian counter-terrorism squad, said the platform often does not take action on content authorities have reported as extremist.
Telegram responded to questions by saying the company maintains an open channel of communication with Indonesian authorities and removes content that breaches its terms of service when reported. A spokesperson for Telegram added that the platform supports peaceful free speech but explicitly forbids calls to violence.
How extremist material circulates online
Investigations found that the Indonesian teenagers identified as radicalized were often affiliated with the "true crime community," a popular internet subculture where users exchange memes and content that romanticizes killers. Screenshots provided by police and a review of several groups revealed content glorifying attackers like Tarrant and discussions that included bomb-making instructions and encouragement toward violent acts.
White supremacist motifs have migrated across platforms and been localized along the way. Videos and posts mix Southeast Asian symbols with Nazi imagery. In some cases racist caricatures of Chinese people and other minorities, such as Rohingya Muslims, were published alongside acronyms or coded phrases that security analysts interpret as calls for mass violence. One set of acronyms - described by researchers as coded slogans - has been used in videos that drew hundreds of thousands of views.
Platforms have taken uneven approaches to such localized extremist content. TikTok removed certain identified posts after being contacted about its moderation practices. A company spokesperson said there is no place on the platform for content that spreads beliefs or propaganda encouraging violence or hate, and that it blocks certain keywords from search suggestions when it finds they are being used as coded language. TikTok also said it consults Southeast Asian advisors on online safety.
Two people working on online-safety teams at TikTok told investigators they were not familiar with the existence of policies tailored to moderating posts that feature localized variants of white supremacist slogans. They were interviewed on condition of anonymity because they were not authorized to speak publicly.
Algorithms and youth vulnerability
Authorities and researchers highlight the role of algorithmic recommendation systems in amplifying extremist material to susceptible users. Singapore's ISD said an 18-year-old detained last year - identified as Nick Lee Xing Qiu - had been recommended far-right extremist content by unspecified platform algorithms. He was held on suspicion of plotting attacks against Singapore's Malay Muslim minority and is detained under a law that allows for detention without trial; authorities did not make a legal representative available for comment.
Experts who track online radicalization say many of the young people who become radicalized are disillusioned and lonely, and find in extremist online communities a nihilistic worldview and a sense of belonging. Pravin Prakash, who researches Southeast Asia for the Center for the Study of Organised Hate, said those drawn into far-right messaging often fit that profile.
In Singapore's cases, two youths self-identified as "East Asian supremacists" online and referenced the neo-Nazi "great replacement theory" in posts, claiming to be inspired to "fight back," according to ISD statements.
Content adaptation and coded language
Security analysts note that white supremacist ideology is adaptable to local contexts. Researchers at an institute in Singapore monitoring such content said that some posts use local motifs and coded phrases to mask genocidal messages. Anti-discrimination groups have previously documented Western white supremacist usage of similar coded acronyms advocating extermination of minority groups; local users are said to have adopted variants of these codes.
One example cited by a Singapore researcher was a hashtag used by an Indonesian creator that attracted more than half a million views before the content was removed. The researcher said the shorthand was interpreted as calling for violence against specific minority groups, aligning with how extremist communities repurpose and localize slogans to attract and mobilize new adherents.
Rehabilitation and prevention efforts
Authorities in Indonesia and Singapore are employing different approaches to prevent further incidents and to rehabilitate young people who have been detained or placed under watch. Indonesian officials said they are concerned about the risk that teenagers radicalized by exposure to violent extremist content could become targets for recruitment by organized terror groups.
Many of the youths under scrutiny are minors, and most have not carried out violent acts. The Jakarta bombing suspect is being held by child protective services while investigators build their case; he has not been charged and has not entered a plea, authorities said. A family member, who gave only one name, asked that he receive counselling rather than punishment.
Indonesia has announced plans to restrict social-media access for children under 16, a policy that authorities say will help combat youth radicalization though officials acknowledge it is not a complete remedy. In Singapore, some detainees have been referred to the Religious Rehabilitation Group (RRG), a non-profit established by Muslim scholars in 2003 originally to rehabilitate suspected Islamist militants. RRG volunteers counsel young detainees and help them prepare for national examinations, according to an RRG counsellor who is also an expert on radicalism at a research institute in Singapore.
RRG worked with Singapore's first far-right extremist detainee, who was held in 2020 at age 16 for allegedly planning machete attacks on two mosques; that individual was released from rehabilitation in 2024, the counsellor said.
Longer-term concerns and international echoes
Officials warn that rehabilitation groups will confront a fast-moving phenomenon as extremist scenes gain influence across borders. Authorities pointed to a disturbing echo outside the region: one month after the Jakarta bombing, a 15-year-old in Russia was accused of stabbing a migrant child to death in the Moscow area. That Russian suspect published a manifesto on Telegram that researchers with the Global Project on Hate and Extremism authenticated; in it he called the Indonesian teenage suspect a hero and urged white supremacists to carry out more such attacks.
Indonesian counter-terrorism officials said they are coordinating with counterparts in neighboring countries, cooperation they described as the first of its kind on this specific type of radicalization. Singaporean and Indonesian security and police agencies said they were coordinating their responses to the trend.
Voices from experts and officials
Experts who advise governments and tech platforms cautioned that corporate moderation efforts have historically focused on Islamist extremist content in Southeast Asia, sometimes overlooking far-right content that adapts quickly to local symbols and coded language. Munira Mustaffa, who has advised governments and social media firms on extremism, said neo-Nazi concepts can be repurposed to fit local narratives and that young perpetrators often seek status within online communities for carrying out violent acts.
Indonesian police and security officials emphasized that Telegram groups gave some youths a sense of belonging. Telegram officials reiterated that calls to violence are forbidden on their platform and said they remove reported content that violates their terms. Meanwhile, social-media companies have said they block certain keywords from appearing in search suggestions and consult regional advisors on online safety.
Closing note
Authorities in Southeast Asia say they face an evolving challenge: a wave of teenagers drawn to violent far-right content that has crossed borders and platforms, blended with local motifs, and attracted young people who are often socially isolated. Governments and civil-society groups are turning to detention, monitoring, content moderation and rehabilitation as immediate tools, while acknowledging that the speed of online influence and the international circulation of extremist messages complicate prevention and recovery efforts.