Facebook says it will restrict live-streaming to try to curb online violence in the aftermath of the Christchurch mosque massacre.
The company said on Wednesday it would introduce a “one-strike” policy for use of Facebook Live, temporarily restricting access for anyone who has faced disciplinary action for breaking its most serious rules.
Facebook and other social media giants have been under pressure to do more to curb online violence following the live-streaming of the March shootings, in which 51 people died.
It was New Zealand’s worst peacetime shooting and spurred calls for tech companies to do more to combat extremism on their services.
Facebook has said it removed 1.5 million videos globally that contained footage of the attack in the first 24 hours after it occurred and had identified more than 900 versions of the video.
Wednesday’s announcement came as New Zealand Prime Minister Jacinda Ardern prepared to co-chair a meeting in Paris with French President Emmanuel Macron that seeks to have world leaders and tech company chiefs sign the “Christchurch Call”, a pledge to eliminate violent extremist content online.
The meeting will be on the sidelines of a Tech For Humanity summit attended by digital ministers from G7 nations. Leaders of global tech giants, including Google, Facebook, Microsoft and Twitter, will be among those attending.
In an opinion piece in The New York Times on Saturday, Ms Ardern said the “Christchurch Call” would be a voluntary framework that committed signatories to introduce measures to prevent the uploading of terrorist content.
Ms Ardern has not made specific demands of social media companies, but has called for them “to prevent the use of live streaming as a tool for broadcasting terrorist attacks”.
Facebook’s vice-president of integrity, Guy Rosen, said first-time offenders would be suspended from using Live for set periods. The company will also broaden the range of offences that qualify for one-strike suspensions.
“Starting today, people who have broken certain rules on Facebook – including our ‘dangerous organisations and individuals policy’ – will be restricted from using Facebook Live,” Mr Rosen said.
The policy includes people who are involved in or support terrorist activity, organised hate, mass or serial murder, human trafficking, or organised violence or criminal activity.
“We will now apply a ‘one strike’ policy to [Facebook] Live in connection with a broader range of offences,” Mr Rosen said.
“From now on, anyone who violates our most serious policies will be restricted from using Live for set periods of time – for example, 30 days – starting on their first offence.
“For example, someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time.”
A Facebook spokeswoman said the alleged Christchurch shooter would not have been able to use Live on his account under the new rules.
The company said it planned to extend the restrictions to other areas in the coming weeks.
It also said it would fund university research on techniques to detect manipulated media, which Facebook’s systems struggled to spot in the aftermath of the mosque attack.