China-owned TikTok is facing claims that its algorithm steers violent videos towards black users rather than white users in a lawsuit blaming the platform for the death of a 14-year-old girl named Englyn Roberts.
Bloomberg reports that a recently filed lawsuit alleges that TikTok's algorithm purposefully steers violent content to black teens over white teens. The claim comes as part of a lawsuit over the death of a 14-year-old black girl named Englyn Roberts. The complaint also names Facebook, Snapchat, and TikTok's parent company ByteDance as defendants.
The lawsuit is the latest in a long line of suits blaming social media companies for teens becoming addicted to their platforms and products. The parents of Englyn Roberts, who died in September 2020 approximately two weeks after she attempted to take her own life, allege that TikTok is aware of biases in its algorithm relating to race and socioeconomic status.
The complaint, filed on Wednesday in San Francisco federal court, claims that Roberts would not have seen and become addicted to the harmful content that contributed to her death if not for TikTok's biased algorithm.
“TikTok’s social media product did direct and promote harmful and violent content in greater numbers to Englyn Roberts than what they promoted and amplified to other, Caucasian users of similar age, gender, and state of residence,” the lawsuit alleges.
The Seattle-based advocacy group Social Media Victims Law Center filed the lawsuit. Representatives of TikTok, Facebook, and Snap did not respond to requests for comment.
Read more at Bloomberg here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan