Malicious information operations are not going away, and to deal with them society must learn that not all voices are equal, according to Facebook’s chief security officer.
Speaking to a NATO-affiliated cyber security conference in Tallinn, Estonia, Alex Stamos stressed that the social media company wanted to defend democracy and open societies.
He said that the company had learned a lot over the past two years about how to tackle people and organisations using Facebook as part of their information operations.
However, the company has not found a solution to individuals “aggressively re-sharing” fake news to reinforce their own beliefs, or to the copying and pasting of memes and false stories.
Facebook has had to come to terms with the ways in which its stated mission – connecting people and providing them with a voice – is being undermined by the illegitimacy of some voices.
Individuals were often unwittingly becoming engaged in information operations, said Mr Stamos, noting that this was difficult to approach as a security issue.
“Part of freedom is the freedom to be wrong,” he said.
Individual participants were the fourth of four key threat actors in a “very simplified taxonomy” of groups seeking to use Facebook to disrupt democracies, according to Mr Stamos.
His colleagues were developing a more detailed system for modelling people who could abuse Facebook to attack democracies, although this was not yet ready to be shared.
The first group was commercially driven fake news organisations: clickbait publications that create false content purely to generate advertising revenue.
The second – which has dominated news coverage of Facebook in recent years – was foreign influence campaigns.
In January, the company admitted it was “too slow to recognise” the Russian government’s use of Facebook to interfere in the 2016 US presidential election.
Mr Stamos said that the foreign influence campaigns on the social media site were part of hybrid offensive activities, and had many forms beyond the 80,000 posts seen by up to 126 million Facebook users ahead of the presidential election.
He cited the DC Leaks website, which he said was an information operation run by Russian military intelligence that released information stolen by hackers in order to create and amplify useful narratives in the media.
These tactics were also at play in the third category of malicious use of Facebook: domestic influence operations.
These information operations are intended to manipulate domestic audiences on behalf of the ruling party in countries where such parties exert substantial control, including Russia.
They tend to amplify state-controlled media and suppress dissent either by intimidation or by maliciously reporting legitimate content to Facebook’s moderators.
To address these issues, Facebook is introducing more transparency for its advertisements and content so that people can see who is funding and producing them.
The measures to boost transparency will be introduced in the UK by July.