A recent study by Stanford’s Internet Observatory has revealed a disturbing prevalence of child sexual abuse material (CSAM), commonly referred to as child pornography, on Mastodon. The decentralized social media platform has been a popular destination for leftists fleeing Twitter since Elon Musk purchased that company.
The Verge reports that the social network Mastodon has an alarmingly high prevalence of CSAM, according to a recent analysis by Stanford’s Internet Observatory. The platform, often touted as a viable alternative to Twitter, has been found to harbor a significant amount of child pornography, raising concerns about its moderation policies.
In a two-day research period, the team discovered 112 instances of known child pornography across 325,000 posts on Mastodon. Alarmingly, the first instance of such material was identified within just five minutes of beginning the investigation. “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said David Thiel, one of the report’s researchers.
The researchers employed Google’s SafeSearch API, which identifies explicit images, and PhotoDNA, a tool that matches images against a database of known CSAM, to conduct their investigation. Their search yielded 554 pieces of content that matched hashtags or keywords frequently used by child sexual abuse groups online, all of which were flagged as explicit with the “highest confidence” by Google SafeSearch.
The study also revealed 713 uses of the top 20 CSAM-related hashtags across the Fediverse on posts that contained media. Additionally, there were 1,217 text-only posts that pointed to “off-site CSAM trading or grooming of minors.” The researchers noted that the open posting of CSAM is “disturbingly prevalent.”
The decentralized nature of Mastodon, which is part of its appeal to many users, also presents a unique challenge when it comes to moderation. Unlike mainstream sites such as Facebook, Instagram, and Reddit, where moderation is handled centrally, Mastodon leaves each instance in control of its own moderation. This can lead to inconsistency across the Fediverse, the broader network of decentralized social media platforms to which Mastodon belongs.
In light of these findings, the researchers suggest that networks like Mastodon give moderators more robust tools, and recommend integrating PhotoDNA and CyberTipline reporting to help combat the spread of child pornography.
Read more at The Verge here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan