Elon Musk’s Twitter dissolved its Trust and Safety Council on Monday night. This comes days after three members of the council announced their resignation, saying that, contrary to Elon Musk’s claims, “the safety and wellbeing of Twitter’s users are on the decline.”
The Washington Post reports that Twitter dissolved its Trust and Safety Council on Monday. In an email to council members, the company wrote: “As Twitter moves into a new phase, we are reevaluating how best to bring external insights into our product and policy development work. As part of this process, we have decided that the Trust and Safety Council is not the best structure to do this.”
The email adds: “We are grateful for your engagement, advice and collaboration in recent years and wish you every success in the future.” This move comes just days after three members of the council announced their resignation, stating that they were doing so because “it is clear from research evidence that, contrary to claims by Elon Musk, the safety and wellbeing of Twitter’s users are on the decline.”
One council member told the Washington Post that Twitter is throwing away “years of institutional memory that we on the council have brought” to the company. “Getting external experts and advocates looking at your services makes you smarter,” they added.
Larry Magid, chief executive of ConnectSafely, a Silicon Valley nonprofit that advises consumers about children’s internet use, commented: “By disbanding it, we got fired instead of quit. Elon doesn’t want criticism, and he really doesn’t want the kind of advice he would very likely get from a safety advisory council, which would likely tell him to rehire some of the staff he got rid of, and reinstate some of the rules he got rid of, and turn the company in another direction from where he is turning it.”
Despite the lofty pronouncements of the former Trust and Safety Council, the company has done a terrible job of stopping child pornography from appearing on the platform. In fact, its inability to stop illegal material caused the company to change its business plans.
Twitter reportedly considered allowing porn stars to monetize their adult content via an OnlyFans-style subscription feature, but the plans were quickly shelved after internal teams found that the company was unable to effectively police child sexual abuse material (CSAM) already on its platform.
The Verge reports that earlier this year, Twitter was developing a new revenue idea: an OnlyFans-style subscription feature for pornography on its platform. The company quickly put together a specialized team, referred to as the “Red Team,” to “pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly.”
However, the Red Team quickly discovered a major obstacle: Twitter could not safely allow porn performers to sell subscriptions because the company was still unable to effectively police harmful sexual content, including child sexual abuse material, on its platform.
Read more at the Washington Post here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan