Fake News

Imperial College Business School researchers have identified key characteristics by which to spot fake news and the accounts that share it

We all know there’s no such thing as neutral information when it comes to politics, but using social media to manipulate information has become an increasingly sophisticated operation – one that’s hard to identify unless you know what you’re looking for. As we come to rely on social media for our news, so we risk falling prey to the fake accounts that are set up to play to our psychological biases.

The power of social media to influence political opinions became obvious during the 2016 US presidential election. “Hillary Clinton sold weapons to ISIS” and “Pope Francis Shocks World, Endorses Donald Trump for President” were among the surprising (and largely untrue) headlines flung around.

How can we be sure what is real and what is fake, and how can we spot political misinformation? Our team of researchers looked at the systematic way political misinformation is engineered, and how those efforts are designed to exploit our existing biases. What we found was that tweets containing misinformation and the accounts posting them have distinct features.

Political misinformation spread faster than any other type

We looked at a dataset of tweets collected just after the 2016 presidential election, each of which had been retweeted at least 1,000 times. We found that posts containing misinformation were distinctive in their use of language: they made greater use of exclamation marks, capitalisation and digits.
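To make that concrete, here is a minimal Python sketch of how such surface cues might be counted for a single tweet. The function name, feature set and example are our own illustrative assumptions, not the exact measures used in the study.

```python
def surface_features(text: str) -> dict:
    """Count simple surface cues in a tweet's text (illustrative only)."""
    tokens = text.split()
    return {
        "exclamation_marks": text.count("!"),
        "digits": sum(ch.isdigit() for ch in text),
        # Fully upper-case words of length >= 2, e.g. "FAKE"
        "all_caps_words": sum(1 for t in tokens if len(t) >= 2 and t.isupper()),
        "length": len(text),
    }

print(surface_features("So much FAKE NEWS!!! 100% proof here"))
# {'exclamation_marks': 3, 'digits': 3, 'all_caps_words': 2, 'length': 36}
```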

We also identified differences in the sentiments expressed by tweets spreading misinformation and those that were not. The former generally exhibited less joy, trust and other positive emotions than the latter, but did show more surprise. That’s important because research has shown novelty attracts our attention – it’s how we update our understanding of the world. So, if we see a tweet saying Venus flytraps can count to three, or that presidential nominee Hillary Clinton is disqualified from holding federal office, we are more likely to retweet it than, say, one about what a friend had for breakfast.
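One simple way to measure such emotional differences, sketched below, is a lexicon-based tally: count how many words in each tweet fall into categories such as joy, trust and surprise. The tiny word lists here are placeholders invented for illustration; a real analysis would rely on a full emotion lexicon or a trained sentiment model rather than this toy sample.

```python
# Placeholder emotion lexicon, invented for illustration only
EMOTION_LEXICON = {
    "joy": {"happy", "celebrate", "delighted", "wonderful"},
    "trust": {"reliable", "honest", "proven", "official"},
    "surprise": {"shocks", "unbelievable", "stunning", "unexpected"},
}

def emotion_counts(text: str) -> dict:
    """Count how many distinct words in a tweet fall into each emotion category."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return {emotion: len(words & vocab) for emotion, vocab in EMOTION_LEXICON.items()}

print(emotion_counts("Pope Francis Shocks World, Endorses Donald Trump for President"))
# {'joy': 0, 'trust': 0, 'surprise': 1}
```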

When it came to political misinformation, such tweets were engineered so effectively that they spread faster than any other type of misinformation.

The accounts sending out these tweets also have distinctive characteristics. They tend to have been created more recently, are less likely to be verified and post fewer updates than accounts sharing other types of information. They are also more likely to use unusual characters in both their screen name and description, and have fewer followers, tending instead to follow others. They also tend to create directed links to other profiles as a way of encouraging those users to link back to them.
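The sketch below shows how such account-level cues might be turned into features, assuming a Twitter-style user record with fields like created_at, verified, statuses_count, followers_count and friends_count (as in the classic v1.1 user object). It illustrates the kind of signals described above rather than the study’s actual feature set.

```python
from datetime import datetime, timezone

def account_features(user: dict) -> dict:
    """Derive simple account-level cues from a Twitter-style user record (illustrative)."""
    # Twitter v1.1 timestamps look like "Sat Feb 01 12:00:00 +0000 2020"
    created = datetime.strptime(user["created_at"], "%a %b %d %H:%M:%S %z %Y")
    age_days = (datetime.now(timezone.utc) - created).days
    # Characters other than letters, digits, spaces and underscores in the name and bio
    odd_chars = sum(not (c.isalnum() or c in " _")
                    for c in user["screen_name"] + user.get("description", ""))
    return {
        "account_age_days": age_days,
        "verified": user["verified"],
        "statuses_count": user["statuses_count"],
        "non_standard_chars": odd_chars,
        "followers_per_following": user["followers_count"] / max(user["friends_count"], 1),
    }
```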

Novelty attracts our attention – it's how we update our understanding of the world

There’s a new presidential election later this year, and the fake news machine is already gearing up. President Trump will be hoping for re-election, and he has established Twitter as a key platform from which to influence public opinion. Most recently, US Attorney General William Barr faced accusations of having bowed to presidential pressure after the Department of Justice reduced its sentencing recommendation for Roger Stone, a former advisor to and long-time friend of Donald Trump. The President had used social media to criticise Stone’s conviction on counts of witness tampering, obstructing an official proceeding, and making false statements.

The use of social media to sway public opinion is a serious issue, particularly when what is being said is untrue and when it seeks to influence democratic processes. Combatting it will be a big undertaking, but by examining how fake news is created and how it is spread, we may have a better idea of where we should be looking.

As the President put it, there is “So much FAKE NEWS!”

This article draws on findings from “Not All Lies are Equal. A Study into the Engineering of Political Misinformation in the 2016 Presidential Election” by Axel Oehmichen, Kevin Hua, Julio Amador Diaz Lopez (Imperial College Business School), Miguel Molina-Solana (University of Granada), Juan Gómez-Romero (University of Granada) and Yi-Ke Guo (Imperial College London, Faculty of Engineering).

About Julio Amador

Junior Research Fellow
Dr Julio Amador is currently dedicated to studying advertising in social networks, microfinance and data-mining algorithms.

You can find the author's full profile, including publications, at their Imperial Professional Web Page