Facebook (now known as Meta) has pledged to intensify its efforts to combat the promotion of pedophilia content on its Instagram platform. This commitment comes in the wake of a disturbing report that revealed Instagram’s algorithm was aiding the spread of child pornography and pedophilia accounts.
Breitbart News recently reported that Instagram failed to stop the connection and promotion of a vast network of accounts involved in the creation and purchase of child pornography, according to recent investigations by researchers at Stanford University and the University of Massachusetts Amherst.
Now Engadget reports that Instagram's parent company, Mark Zuckerberg's Facebook, has promised to step up its efforts to stop the promotion of pedophilia content on its platform.
Researchers found that a sizable network of accounts dedicated to sharing child sexual abuse content was being facilitated by Instagram's systems. The platform's algorithm not only facilitated these activities but actively promoted them.
In response to these troubling findings, Facebook has launched a number of initiatives to address this problem. “Child exploitation is a horrific crime,” a spokesperson for Facebook stated. “We’re continuously investigating ways to actively defend against this behavior.”
Facebook claims it has created an internal task force to address this issue as part of its action plan. The company is also attempting to block networks that disseminate child pornography. According to reports, Facebook has blocked thousands of associated hashtags and dismantled 27 pedophile networks over the past two years.
The report did draw attention to some problematic elements of Instagram’s algorithms, though. It was discovered that the platform frequently connected potential abusers and recommended terms related to child pornography. Researchers who took part in the study said that after looking at just one account, they were given suggestions for potential child porn sellers and buyers.
Facebook has come under fire for the way it handled user reports of child sex content. The company has now removed the capability to view search results linked to child pornography, but despite these initiatives, the report contends that Instagram’s recommendations unintentionally assisted in the reconstruction of the network that the platform’s safety staff was attempting to destroy.
In response to these findings, Meta said it actively works to ban such users, and in January alone, it took down 490,000 accounts for violating child safety rules. Alex Stamos, the company’s former security director, called for additional spending on human investigators to address the problem. “That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, underscoring the need for more robust internal measures. “I hope the company reinvests in human investigators.”
UMass Rescue Lab director Brian Levine commented: “Instagram is an onramp to places on the internet where there’s more explicit child sexual abuse.” The Stanford research group also found that child pornography was “particularly severe” on Instagram, stating: “The most important platform for these networks of buyers and sellers seems to be Instagram.”
Read more at Engadget here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan