In her speech at the UN in New York on Thursday, British Prime Minister Theresa May warned social media companies that if they do not clamp down on terrorist content on their sites, she will introduce new legislation making them liable for any extremist content posted; failure to remove it would result in fines.
Terrorist material is openly available online, with guides on how to carry out truck, knife, and bomb attacks accessible within a few clicks. ISIS also runs a massive social media operation, posting around 27,000 videos from January to May of this year alone, mostly aimed at radicalizing impressionable people. ISIS makes around 180 new posts per day on Facebook, Twitter, and YouTube, and two-thirds of the dissemination of this material takes place within two hours of uploading, spreading the content like wildfire.
Social media organizations have insisted they are cracking down on terrorist content. Twitter revealed it had taken down over a million accounts linked to terrorist activity in the past two years, while Google claimed it was making “significant progress” on the issue. Despite this, the companies have repeatedly been criticized for leaving up content from proscribed groups. The Solicitor General for England and Wales, Robert Buckland, claimed that Google might be in breach of the Terrorism Act for failing to take down material from proscribed extremist organizations. Facebook also came under fire recently for exposing the identities of its moderators to terrorists.
The Prime Minister originally announced plans to make social media companies liable back in June this year, after meeting with the then-newly elected President of France, Emmanuel Macron. Macron, along with Italian Prime Minister Paolo Gentiloni, is backing May’s plans. In her speech to the UN, she argued that “a fundamental shift in the scale and nature of our response — both from industry and governments — [is needed] if we are to match the evolving nature of terrorists’ use of the internet.”
If the companies do not act within one month to put in place a method of removing material quickly, they will be made legally liable for the content and forced to remove it within two hours of uploading or face fines. The industry will then be expected to cut this down to one hour, and eventually to automate “the detection and removal of terrorist content online” while “developing technological solutions that prevent it being uploaded in the first place,” the Prime Minister said.
However, there has been blowback against the government’s plans to issue monetary penalties to companies. Max Hill QC, the independent reviewer of terrorism legislation for the government, questioned whether it was “absolutely necessary” for fines to be imposed for non-compliance, explaining that the tech companies regularly co-operate with the police in their investigations, and that the real problem is the sheer “bulk of the material.”
Jack Hadfield is a student at the University of Warwick and a regular contributor to Breitbart Tech. You can like his page on Facebook and follow him on Twitter @JackHadders or on Gab @JH.