Appeals Court Rules China’s TikTok Not Shielded by Section 230 in ‘Blackout Challenge’ Death Lawsuit

TikTok CEO interview (Christopher Goodney/Bloomberg/Getty)

An appeals court has revived a lawsuit against China’s TikTok, reversing a lower court’s ruling that Section 230 immunity shielded the app from liability after a child died participating in the dangerous viral “blackout challenge.”

In a significant ruling with potential implications for all social media platforms, Ars Technica reports that the Third Circuit Court of Appeals has determined that TikTok is not protected by Section 230 of the Communications Decency Act (CDA) in a lawsuit related to the death of a child who participated in the “blackout challenge.” The challenge, which encouraged users to choke themselves until losing consciousness, tragically claimed the lives of several children, including a 10-year-old Pennsylvania girl.

Tawainna Anderson, the mother of 10-year-old Nylah Anderson, who died after taking part in the challenge, sued TikTok in 2022. The district court dismissed the case, however, ruling that Section 230 shielded TikTok from liability for recommending the video that led to Nylah’s death.

The appeals court has now overturned that decision. Judge Patty Shwartz wrote in her opinion that Section 230 does not bar Anderson from arguing that TikTok’s algorithm, which curates and recommends content to users, is an “expressive product” that communicates to users that the selected videos will be of interest to them. That reasoning rests on a recent Supreme Court decision holding that a platform’s algorithm reflecting “editorial judgments” about compiling and presenting third-party content is the platform’s own “expressive product” and is thus protected by the First Amendment.

Crucially, while Section 230 shields platforms from liability for third-party speech, it does not protect a platform’s own speech or “expressive activity.” Because TikTok’s For You Page (FYP) algorithm decides which content to include, exclude, and organize, that curation counts as TikTok’s own “expressive activity” and is therefore not shielded by Section 230, according to the ruling.

The case now returns to the district court, which will rule on Anderson’s remaining claims. Judge Paul Matey, concurring in part, noted that TikTok was aware of the dangers posed by the “Blackout Challenge” but took inadequate action to prevent it from spreading and being shown to children on their FYPs. He urged a narrower interpretation of Section 230, stating that it should not permit companies like TikTok to display “casual indifference to the death of a 10-year-old girl.”

Anderson’s lawyers have previously stated that the Communications Decency Act was not intended to allow social media companies to send dangerous content to children and that they will continue advocating for the protection of youth from an industry that “exploits youth in the name of profits.”

This ruling marks a major development in the ongoing debate over the scope and application of Section 230, with potential ramifications for how social media platforms curate and recommend content to their users, particularly minors. As the case progresses, it may set a precedent for holding platforms accountable for the real-world consequences of their algorithms and content promotion practices.

Read more at Ars Technica here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
