TikTok Reveals The Platform Algorithm Promoting Teen Suicide Using Dark Content

In this photo illustration, an 11-year-old boy looks at the TikTok app on a smartphone screen in the village of St Jean d'Aulps on April 04, 2023, near Morzine, France. A study revealed that teens are being targeted by TikTok's algorithm with dark content. MATT CARDY/GETTY IMAGES


By Alberto Arellano

The latest study from Ekō, an international consumer watchdog group, examined the safety of young TikTok users between the ages of 13 and 17 and the platform's role in teen suicide.

Teen suicide has put TikTok in the spotlight as the company's CEO, Shou Zi Chew, testified before Congress regarding data privacy. He was confronted with cases of teen suicide, including that of a teen who took his own life after watching dark TikTok videos.

TikTok CEO Shou Zi Chew testifying before the House Energy and Commerce Committee during a hearing on data privacy and child protection in Washington, DC, on March 23, 2023. NATHAN POSNER/GETTY IMAGES

“According to a recent study conducted by Ekō, TikTok had a higher rate of suicidal content than any other social media platform,” said Yaron Litwin, the Chief Marketing Officer of Canopy, “with 35.3% of the sample featuring references to self-harm or suicide.”

The controversy surrounding TikTok's reputation surfaced in the congressional testimony on teen suicide: Chase Nasca died at the age of 16 after the platform allegedly sent the teenager more than 1,000 videos promoting self-harm, hopelessness, and suicide.

The Nascas filed a wrongful death lawsuit against TikTok and its parent company, ByteDance, over their son's death.

“There are several reasons why suicide hashtags may get overlooked, including algorithmic biases, lack of resources to monitor content, and a general lack of awareness about the issue,” Litwin said of the hashtags. “It's important for social media platforms to prioritize the safety and well-being of their users and take steps to prevent harmful content from circulating.”

According to the Ekō data Litwin pointed to, #sh had the most posts at 926,000, with 6 million views. The hashtag #wubbalubbadubdub had the most views at 87 million across 14,000 posts, while #ihatemyself came in with 68 million views on 17,000 posts. The hashtag #sigmamale had 5 million views across 177,000 posts, and #alphamale had 2 million views across 111,000 posts.

TikTok's promotion of videos about loneliness and pain has alarmed parents, who say the videos found under these hashtags have affected their children's mental health.

Ekō researchers set up accounts registered as 13-year-olds to investigate the content being fed to teenagers, content linked to suicides and mental health issues.

They found that multiple types of harmful content were available to teens, with triggering hashtags and dark videos pushing some toward suicide.

“The study found that the most common hashtags related to teen suicide on TikTok were #suicide, #selfharms, #depressaion, and #sad,” Litwin said of the hashtags. “These hashtags expressing feelings of sadness and self-harm can be a cause for concern among parents and social media users.”

TikTok's algorithm pushed this content directly to children: a viral video called “We Can Still Be Friends” appeared on the For You page showing a loaded gun alongside text suggesting suicide. The video drew massive engagement, with 1.1 million views, 186,000 likes, and over 3,200 shares.

Another case in the study involved a young man named “Joey” simulating his own suicide on his birthday. A caption read, “feeling like joe fr (for real).” The video drew over 1.9 million views in less than 24 hours, including over 3,600 comments and over 4,600 shares. It was eventually removed from the platform, but its comments had included multiple suicide dates, times, and locations.

“Signs of teen suicide can include changes in mood or behavior, such as withdrawal from friends and activities, irritability, and increased sadness,” Litwin said about possible signs of suicide. “To prevent suicide, parents and social media users should educate themselves on the warning signs, encourage open communication, and seek professional help when needed.”

Incel and manosphere content is also readily available to children, including videos celebrating controversial figures such as the late Elliot Rodger and retired kickboxer Andrew Tate.

“Parents should be aware of influencers who promote harmful or dangerous behaviors, such as self-harm or suicide,” Litwin said about the dangers of following figures like Tate. “It's important for parents to have open conversations about the potential risks of following certain influencers.”

Dean Nasca, whose son Chase allegedly committed suicide after receiving unsolicited suicidal videos on TikTok, listens as TikTok CEO Shou Zi Chew testifies before the House Energy and Commerce Committee hearing on “TikTok: How Congress Can Safeguard American Data Privacy and Protect Children from Online Harms,” on Capitol Hill, March 23, 2023, in Washington, DC. JIM WATSON/GETTY IMAGES

One TikTok video celebrated Rodger with the caption “To My fav Murderer.” Another showed a longer version of the video manifesto he recorded the day before his killing spree, in which he said, “Tomorrow is the day of retribution, if I can't have you, girls, I will destroy you.” Combined, the videos have 2.9 million views and 197,000 likes. This type of content has been linked to gun violence and suicide.

In one video, Andrew Tate imitated Laurence Fishburne's character, Morpheus, in “The Matrix,” presenting the red pill and the blue pill. Tate used the red pill as an attack on women and a call for men to live up to hypermasculine ideals. The red pill movement has been criticized as bigoted and misogynistic and for damaging followers' self-esteem.

“Moderation and enforcement of community guidelines on the platform can help address these issues. Additionally, education and awareness among users about the impact of their actions online can play a role in prevention,” Litwin said about the chain of events that can lead to suicide among children.

TikTok's algorithm was pushing incel content, which originated in other online communities such as 4chan and Reddit, to children. An Ekō researcher set up a 13-year-old's account on TikTok and was shown manosphere and incel content under hashtags such as #sigma, #alphamale, and #modernwomen, including celebrations of the movements' icons.

This content included a scene from a Jake Gyllenhaal film in which a character says, “Shoot me. Shoot me in the (expletive deleted) face.” The video amassed 440,000 likes, 2.1 million views, over 7,000 comments, and over 11,000 shares. One commenter suggested they would attempt their own suicide within hours.

“Social media companies need to take a proactive approach to moderate the content on their platforms and enforce their community guidelines. It is equally important for parents to educate their children about responsible online behavior and how to report concerning content.”

Ekō's research concluded that young teens are indeed targeted by TikTok's algorithm. The study advocated reviving the Platform Accountability and Transparency Act, which would require platforms to provide data on viral content.

“The study did not specifically identify the leading causes of suicide on TikTok,” Litwin said of the potential chain of events. “However, it did note that a significant amount of content related to suicide on the platform was graphic and potentially triggering.”

Hashtags related to suicide drew high numbers of likes and engagement, some of it tied to cyberbullying. Today, TikTok has blocked searches for the word “suicide,” instead directing users to the 988 suicide hotline.

The post TikTok Reveals The Platform Algorithm Promoting Teen Suicide Using Dark Content appeared first on Zenger News.
