During a week in which the US sparked a global trade war, North Korea’s missile-happy leader agreed to a face-to-face with Trump, and a porn star sued our President, you could be forgiven for missing arguably the week’s biggest news. MIT revealed the results of a comprehensive study showing that fake news – lies by any other name – travels six times faster on social media than the truth.

In an age when we often consume little more than tweets, headlines, posts and soundbites, this is one research article worth reading in its entirety, for the simple reason that it may transform how you think about truth and lies in our modern age. Three MIT scientists – Soroush Vosoughi, Deb Roy and Sinan Aral – conducted the detailed, large-scale study, published in the March 9 issue of Science. They examined 126,000 stories tweeted by 3 million people more than 4.5 million times, gauging truth and lies against the verdicts of multiple fact-checking organizations (Snopes, PolitiFact, FactCheck.org and others).

Lies, wrote the researchers, spread like wildfire, diffusing “significantly farther, faster, deeper, and more broadly than the truth.” Lies inspired “fear, disgust and surprise” in contrast to true stories, which inspired “anticipation, sadness, joy, and trust.” Lies beat the truth by a factor of six to one in how fast they spread, and “falsehood also reached far more people than the truth.” Humans, wrote the researchers, not bots, are mainly to blame. We’re simply programmed to react more intensely to lies because lies are novel and tap into our desire to be surprised: “we do find that false news is more novel and that novel information is more likely to be retweeted.”

Not surprisingly, the study found that lying about politics works better than any other kind of lying (such as rumors about terrorism, natural disasters, science, and other topics). George Orwell, the author of 1984, knew this in his bones 70 years ago: “Political language … is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”

The MIT study provides an empirical, modern basis for what dictators and demagogues have known for centuries and what Joseph Goebbels, Hitler’s henchman, reputedly said long ago: “If you tell a lie big enough and keep repeating it, people will eventually come to believe it.” This is why Trump kept repeating a bald-faced lie about Obama’s birthplace. No matter how many times Obama or his defenders called it a lie, the lie still snowballed, developing its own momentum, impervious to the truth. Today’s social media engines are optimized for lies because of their peer-to-peer nature and their almost total lack of checks and balances. “The spread of falsehood was aided by its virality,” wrote the researchers, “meaning that falsehood did not simply spread through broadcast dynamics but rather through peer-to-peer diffusion characterized by a viral branching process.”

History, as seen through the lens of MIT’s revealing new evidence, raises a critical challenge: fighting lies with truth has always been a bit like trying to put out a five-alarm fire with a garden hose. What’s more, the study points to a telling problem the authors don’t address. Snopes, PolitiFact, and FactCheck.org are all too little, too late, emblematic of today’s stacked deck: a structural imbalance that favors rumor over reality. Fact-checking after the fact, in the age of tweets, bots, and damaging viral untruths, might make us feel better, but by then we’ve already lost. The people (or bots) who retweet specious claims are, by definition, not verifying the information first; they’re merely knee-jerking. And sites that raise red flags hours or days after a lie has gone global, however well-intentioned, offer no programmatic solution.

Studies and investigations of Russian interference are not enough. Let’s mobilize tech to address this existential problem. It’s past time for government, institutions, Twitter, Facebook and Google to join together and put in place technical barriers against this social malaise that threatens nations. Tackling the corrupting influence of propaganda requires the same sort of aggressive campaign we would mount against other threats to our national security. Just as countries need soldiers and weapons to defend themselves, we need talented people and new tools to identify and squelch falsehoods in the moment, through technology and crowdsourcing.

Let’s kill the lies at the source before they kill us.

 
