According to a recent study, TikTok suggests potentially harmful content on topics such as self-harm and eating disorders to new users within minutes of joining the platform. The app is overwhelmingly popular with teens.
CBS News reports that a study released on Wednesday by the Center for Countering Digital Hate (CCDH) revealed that TikTok's algorithms are recommending self-harm and eating disorder content to new users shortly after they join the platform.
For the study, researchers created TikTok accounts posing as 13-year-old users interested in body image and mental health content. The results showed that within just 2.6 minutes of joining the app, TikTok's algorithms began recommending suicide-related content to the new accounts. Eating disorder content was recommended within as little as eight minutes.
During the course of the study, researchers discovered 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views. James P. Steyer, Founder and CEO of Common Sense Media, commented on the study stating:
The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform’s dangerous algorithmic amplification.
TikTok's algorithm is bombarding teens with harmful content that promotes suicide, eating disorders, and body image issues that is fueling the teens' mental health crisis.
Chinese company ByteDance launched TikTok in 2017, and the social media app has quickly become one of the fastest-growing in the world, reaching one billion monthly active users by 2021. TikTok operates through algorithms informed by personal data, such as a user's likes, follows, watch time, and interests.
Breitbart News has reported extensively on the dangers of TikTok. In a column advocating for the Chinese app’s banning from the United States, reporter Alana Mastrangelo wrote:
The Danger for Kids and Teens
China is a hostile foreign country that is using TikTok to get U.S. teens to participate in trends that are dangerous and life-threatening.
In September, the FDA warned parents of a deadly new TikTok challenge that involves children cooking chicken in NyQuil, “presumably to eat.” The trend on the China-owned app was just the latest example of a dangerous stunt spread to young Americans.
Another challenge seen on TikTok in 2020 involved urging users to take large doses of the allergy medication Benadryl (diphenhydramine) to induce hallucinations. The challenge resulted in reports of teens being rushed to the hospital, and in some cases, dying.
Last year, school officials spoke out against TikTok’s “bathroom challenge,” which encouraged students to vandalize school restrooms. The challenge, also known as “devious licks,” involved videos depicting vandalism of trophy cases, hallways, and classrooms, as well as theft of fire extinguishers, school signs, and other property.
To add insult to injury, the Chinese app also rewards U.S. users by letting them go viral or become "TikTok famous" for behaving foolishly. Sometimes this means posting frivolous dance videos, but other times it means popularizing dangerous trends or posting sexual content.
TikTok’s algorithm also makes it easier for teenagers to go viral and gain internet fame when they post sexualized videos. This has also taken a toll on teens’ mental health, according to mental health professionals. And the Chinese company seemingly tailors its algorithms to have different impacts on various communities. For example, a lawsuit filed in July alleges that TikTok pushes especially violent content on black teenagers.
Unlike other social media platforms — such as Facebook, Instagram, YouTube, and Twitter — TikTok makes it much easier and faster for teens to obtain what they perceive as “fame” on the app. This fame can come with a terrible price.
Read more at CBS News here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan