March 17, 2019 – In Breaking News – CBS News
As New Zealand reels from a terrorist attack against two mosques in Christchurch, Facebook announced it deleted 1.5 million videos of the shootings in the first 24 hours following the massacre.
The tech company said in a tweet late Saturday that 1.2 million of those videos were blocked at upload to its platform, which has more than 2.2 billion global users.
However, that implies roughly 300,000 copies of the video were available to watch for at least short periods of time before Facebook removed them.
(Reuters reports Facebook Inc said it removed 1.5 million videos globally of the New Zealand mosque attack in the first 24 hours after the attack. Facebook said in a tweet late Saturday “In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload…” The company said it is also removing all edited versions of the video that do not show graphic content out of respect for the people affected by the mosque shooting and the concerns of local authorities. Courtesy of Wochit News and YouTube. Posted on Mar 17, 2019.)
It also reveals how quickly such provocative and graphic content circulates online, and the challenges social media companies such as Facebook face as they try to stamp it out.
Video of the brutal attack was livestreamed on Facebook by the suspected gunman Brenton Tarrant, an Australian native who appeared in court this weekend and has been charged with murder.
Tarrant is likely to face more charges when he appears before the Christchurch High Court on April 5. An online manifesto attributed to him spewed a message of hate, replete with references familiar to extremist chat rooms and internet trolls.
Video of the attack showed the gunman taking aim with assault-style rifles painted with symbols and quotes used widely by the white supremacist movement online.
(At least 49 people were killed and dozens more injured in mass shootings at two mosques in Christchurch, New Zealand, Friday. Courtesy of ABC News and YouTube. Posted on Mar 15, 2019.)
Facebook told CNET on Friday that it had removed the footage and begun pulling down posts expressing “praise or support” for the shootings not long after the attack began.
It said it was working with police on the investigation.
In a tweet Friday, YouTube, which is owned by Google, also said it has been “working vigilantly to remove any violent footage.”
(At least 49 people were killed in mass shootings at two New Zealand mosques. The mosques were full of worshippers attending Friday prayers. The attacks were carried out by a suspect intent on killing who had written a rambling manifesto. Courtesy of seattletimesdotcom and YouTube. Posted on Mar 15, 2019.)
AST strives to meet a 3 STAR trustworthiness rating, based on the following criteria:
- Provides named sources
- Reported by more than one notable outlet
- Includes supporting video, direct statements, or photos