A group of tech companies that control some of the world’s largest online platforms has signed a new pledge to work more closely with governments and with each other to combat violent extremism on the internet, two months after a live-streamed mass shooting in New Zealand.
Facebook, Twitter, Google, Microsoft and Amazon said Wednesday they have signed the agreement, dubbed the Christchurch Call to Action after the New Zealand city where the mass shooting took place.
Among other efforts, the companies said they would work with governments and NGOs to establish “crisis protocols” for responding to active events so information can be shared and acted on faster. They also pledged greater industry support for research into online hate and offline violence, and committed to continue improving the technology used to catch extremist content.
“Terrorism and violent extremism are complex societal problems that require an all-of-society response,” the tech companies said in a joint statement provided to CNN Business. “The commitments we are making today will further strengthen the partnership that Governments, society and the technology industry must have to address this threat.”
The call to action was formally unveiled Wednesday as part of a meeting of government and industry leaders in Paris, including French President Emmanuel Macron and New Zealand Prime Minister Jacinda Ardern, the latter of whom has emerged as a leader in the effort to crack down on online extremism since the shooting that left dozens dead.
The Trump administration thanked Ardern and Macron “for organizing this important effort,” but declined to join the call to action. In the past, President Donald Trump has accused social media companies of “censorship” when they’ve banned certain users who are extremists.
“Technology is moving quite quickly. We may not have all of the responses we need now, but we need to work on them and we need to work together,” Ardern said in an interview this week with Christiane Amanpour. “That’s why this call to action is not just about regulation, but instead about bringing companies to the table [and saying] you have a role too and we have expectations of you.”
In mid-March, a mass shooting at a mosque in New Zealand was live-streamed on Facebook. Copies of the video spread rapidly across Facebook, YouTube and Twitter, with the social media companies struggling to remove them all. Nearly two months after the shooting, copies were still circulating on some of these sites.
In an opinion piece earlier this week, Ardern called it a “horrifying new trend” of an attack that “was designed to be broadcast on the internet.” She called on nations and private businesses to collaborate to stop the internet from being used as a “tool for broadcasting terrorist attacks.”
Separately, Facebook announced plans to change its livestreaming rules. Starting Wednesday, people who break Facebook’s “most serious policies” will be immediately banned from using Facebook Live for a period of time, such as 30 days.
Under the new policy, the alleged Christchurch shooter would not have been able to livestream the massacre from his account in March, a Facebook spokesperson told CNN Business. The spokesperson did not say which rules he had previously broken.