
Oct 31, 2017

The Behavioral Science of Political Deception in the 2016 Election


tags: psychology,politics,2016 election,Fake News,alternative facts,gleb tsipursky,post-truth,behavioral science


Dr. Gleb Tsipursky is the author of The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge.

Caption: Head with brain and puzzle pieces (Geralt/Pixabay)


How did Donald Trump win, when he used so many misleading statements and outright deceptions? Couldn’t people see through them? As an expert in brain science, I want to share why his followers fell for his lies and what can be done to address this situation in the future.

First, let’s get the facts straight. Politifact.com, a well-known non-partisan website, rates only about 4 percent of statements by Trump as fully “True” and over 50 percent as either completely “False” or what it calls ridiculously false – “Pants on Fire” – with the rest in the middle. By comparison, it rates 25 percent of Hillary Clinton’s statements as fully “True” and only 12 percent as either “False” or “Pants on Fire.”

The Washington Post, one of the most reputable newspapers in the country, wrote that “There’s never been a presidential candidate like Donald Trump — someone so cavalier about the facts and so unwilling to ever admit error, even in the face of overwhelming evidence.” The paper’s fact-checking editors gave 64 percent of the Trump statements they evaluated their worst rating, Four Pinocchios. By contrast, statements by other politicians tend to receive the worst rating 10 to 20 percent of the time.

These sentiments are representative of other prominent news media and fact-checking outlets, yet according to an ABC News/Washington Post poll, most voters on the eve of the election perceived Donald Trump as more trustworthy than Hillary Clinton. This false perception arose because the Trump campaign built on previous Republican criticism of Clinton – much of it misleading, some of it accurate – to manipulate many voters into believing that Clinton was the less honest candidate, despite the evidence that she was far more honest than Trump. The campaign did so by exploiting the illusory truth effect, a thinking error that occurs when false statements are repeated many times and we begin to see them as true. In other words, simply because a statement is repeated, we perceive it as more accurate.

You may have noticed the last two sentences in the previous paragraph had the same meaning. The second sentence didn’t provide any new information, but it did cause you to believe my claim more than you did when you read the first sentence.

The Biology of Truth Vs. Comfort

Why should the human brain be structured so that mere repetition, without any more evidence, causes us to believe a claim more strongly? The more often we are exposed to a statement, the more comfortable it seems. The fundamental error most people make is mistaking statements that make them feel comfortable for true statements.

Our brains cause us to believe something is true because we feel it is true, regardless of the evidence – a phenomenon known as emotional reasoning. This tendency is easily explained by some basic biology of how our brains work.

When we hear a statement, the first thing that fires in our brain, within a few milliseconds, is our autopilot system of thinking, composed of our emotions and intuitions. Also known as System 1, the autopilot system is one of the two systems of thinking that the Nobel Prize-winning scientist Daniel Kahneman described in his 2011 book Thinking, Fast and Slow, and it represents the more ancient part of our brain. It protected us in the ancestral environment by making us feel bad about dangerous threats such as saber-toothed tigers, and it drew us toward what we needed to survive, such as food and shelter, by making us feel good about them. The humans who survived learned to heed the autopilot system’s guidance, and we are their descendants.

Unfortunately, the autopilot system is not well calibrated for the modern environment. When we hear statements that go against our current beliefs, our autopilot system perceives them as threats and causes us to feel bad about them. By contrast, statements that align with our existing beliefs cause us to feel good and we want to believe them. So if we just go with our gut reactions – our lizard brain – we will always choose statements that align with our current beliefs.

Meme saying “Lizard brain thinking is killing democracy – Please think rationally” 
 (Ed Coolidge, made for Intentional Insights)

Where Do We Get Our News?

Until recently, people got most of their news from mainstream media, which meant they were regularly exposed to information they didn’t like because it did not fit their beliefs. Budget cuts and consolidation of media ownership over the last decade have made mainstream media increasingly less diverse, as described in Eli Noam’s 2009 Media Ownership and Concentration in America. Moreover, according to a 2016 survey by the Pew Research Center, many people increasingly get their news mainly or only from within their own personalized social media filter bubble, which tends to exclude information that differs from their beliefs. Their existing beliefs are thus reinforced, and it seems as though everyone shares the same views.

This trend builds on a traditionally strong trust in friends as sources of reliable recommendations, documented in the 2015 Nielsen Global Trust in Advertising Report. Our brains tend to spread the trust we associate with friends to other sources of information we see on social media. This thinking error is known as the halo effect: a positive assessment of one element of a larger whole transfers to its other elements. We can see this in research showing that people’s trust in social media influencers has grown over time, to nearly the level of trust in their friends, as shown by a 2016 joint study by Twitter and the analytics firm Annalect.

Even more concerning, a 2016 study from Stanford University demonstrated that over 80 percent of students, who are generally experienced social media users, could not distinguish a news story shared by a friend from a sponsored advertisement. In a particularly scary finding, many of the study’s participants judged a news story to be true based on irrelevant factors such as the size of the photo, as opposed to rational factors such as the credibility of the news outlet.

The Trump team knew that many people have difficulty distinguishing sponsored stories from real news, which is why it was at the forefront of targeting voters with sponsored advertorials on social media. In some cases it used this tactic to motivate its own supporters, and in others as a voter suppression tactic against Clinton supporters. The campaign’s Republican allies created fake news stories that got millions of shares on social media, and the Russian propaganda machine also used social media to manufacture fake news stories favorable to Trump and critical of Clinton.

Additionally, Trump’s attacks on mainstream media and fact-checkers, both before and after the election, undercut the credibility of news outlets. As a result, trust in the media among Republicans dropped to an all-time low of 14 percent in a September 2016 Gallup poll, less than half its 2015 level. Fact-checking fares even worse among Republicans, with 88 percent expressing distrust in a September 2016 Rasmussen Reports poll.

All this combined to produce an unprecedented reliance on, and sharing of, fake news by Trump’s supporters on social media. A study by the Center for Media and Public Affairs (CMPA) at George Mason University, using PolitiFact rulings, found that since the rise of the Tea Party, Republicans have tended to make many more false statements than Democrats. Lacking trust in the mainstream media and relying on social media instead, a large segment of Trump’s base indiscriminately shared whatever made them feel good, regardless of whether it was true. Indeed, one fake news writer, in an interview with The Washington Post, said of Trump supporters: “His followers don’t fact-check anything — they’ll post everything, believe anything.” No wonder Trump’s supporters mostly believe his statements, according to polling. By contrast, another creator of fake news, in an interview with NPR, described how he “tried to write fake news for liberals — but they just never take the bait,” because they practice fact-checking and debunking.

Meme saying “People are most comfortable dealing with reality in terms of black or white, but reality tends to like shades of grey” 
 (Wayne Straight, made for Intentional Insights)

This fact-checking and debunking illustrates that the situation, while dismal, is not hopeless. Such truth-oriented behaviors rely on our other thinking system, the intentional system or System 2, as described by Chip and Dan Heath in their 2013 book Decisive: How to Make Better Choices in Life and Work. The intentional system is deliberate and reflective. It takes effort to use, but it can catch and override the thinking errors committed by System 1, so that we do not believe something is true simply because we feel it is true, regardless of the evidence.

Many liberals associate positive emotions with empirical facts and reason, which is why their intentional system is triggered into doing fact-checking on news stories. Trump voters mostly do not have such positive emotions around the truth, and believe in Trump’s authenticity on a gut level regardless of the facts. This difference is not well recognized by the mainstream media, who treat their audience as rational thinkers and present information in a language that communicates well to liberals, but not to Trump voters.

To get more conservatives to turn on the intentional system when evaluating political discourse, we need to speak to emotions and intuitions – the autopilot system, in other words. We have to get folks to associate positive emotions with the truth first and foremost, before anything else.

To do so, we should understand where these people are coming from and what they care about, validate their emotions and concerns, and only then show, using emotional language, the harm people suffer when they believe in lies. For instance, for those who care about safety and security, we can highlight how it’s important for them to defend themselves against being swindled into taking actions that make the world more dangerous. Those concerned with liberty and independence would be moved by emotional language targeted toward keeping themselves free from being used and manipulated. For those focused on family values, we may speak about trust being abused.

These are strong terms with deep emotional resonance. Many may be uncomfortable using such emotional appeals, but we have to remember the end goal of helping people orient toward the truth. This is a case where the ends do justify the means. We need to be emotional to help people grow more rational – to make sure that while truth lost the battle, it will win the war.

P.S. To learn more about truth-seeking strategies in politics and other life areas, check out the article author’s book, The Truth-Seeker’s Handbook: A Science-Based Guide.


