March 17 (UPI) — Facebook removed 1.5 million videos of the mass shootings at two mosques in New Zealand in the day after the attack Friday.
Mia Garlick, director of policy for Facebook New Zealand, issued a statement saying that Facebook removed 1.5 million videos of the shooting, including 1.2 million that were blocked at upload, in the first 24 hours after the attack.
Garlick did not say why the remaining 20 percent of the videos, roughly 300,000, were not blocked at upload.
“Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content,” she added.
The accused shooter, who has since appeared in court, live-streamed Friday's attack on two mosques in the city of Christchurch, in which 50 people were killed and, as of Saturday, 36 remained hospitalized.
Facebook said police alerted it to a video of the shooting on the platform shortly after the shooter started the live stream on his Facebook profile. The social media giant said it “quickly removed both the shooter’s Facebook and Instagram accounts and the video.”
Garlick said Facebook will also remove “any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
“We will continue working directly with New Zealand Police as their response and investigation continues,” she said.
New Zealand Prime Minister Jacinda Ardern said there were “further questions to be answered” by Facebook and other social media platforms about their response to the shooting and its aftermath.
On Sunday, New Zealand’s government informed Facebook, Twitter, YouTube and other online platforms that sharing any version of the footage, including those edited to remove graphic content, is a violation of the law.
Ardern said New Zealand did as much as it could do to “remove or seek to have removed some of the footage” but the primary task of preventing the spread of the videos was “up to those platforms.”