Benjamin Disraeli (UK Prime Minister, 1874-1880) is famously credited with the line “There are three kinds of lies: lies, damned lies, and statistics.” Everybody lies to a certain extent. It is difficult to imagine a society where everybody is utterly honest in everything they say or do – where it is impossible to lie. The 2009 film “The Invention of Lying” tried to imagine just that, but the result is not something we would recommend watching.
Lying is an integral part of our world, across all sectors, from finance to personal relationships. There are many reasons why people lie – to preserve their own self-interest, to avoid conflict, or to protect others, for instance – but how do people rationalise lying, and what effect does it have on our brains and future behaviour?
In 1959, the psychologists Festinger and Carlsmith set up an experiment with Stanford University students to test how people rationalise lying. They assigned a particularly dull and pointless task to the test subjects, then asked them to present the task to the next participant (who was actually an accomplice) in a positive light – specifically, to say that it was great, fun and even riveting. In short, to lie their socks off. The final part of the experiment was a questionnaire designed to measure the subjects’ real opinion of the task.
The subjects were split into three groups: the first was paid the mighty sum of $1 for lying, the second $20, and the third was neither asked to lie nor paid – it acted as a control.
The point of the experiment was to compare the answers to the questionnaire and find out which group was the most enthusiastic when rating the pointless, dull task (filling out the questionnaire was not remunerated; it was designed to reveal how the different groups really felt about the task).
It turned out to be the first group, and the psychologists who ran the study came up with an interpretation that has since been extended to a whole range of other situations. They observed that the test subjects had been made to act in contradiction to their own feelings about the task, by having to lie about its nature. This produces a state of tension that must be reduced in order to stay “rational”: a cognitive dissonance. The term refers to the contradiction that occurs when a subject’s behaviour is not in line with their personal values. It is a kind of “bug” that has to be resolved, because most people need to feel rational in their own eyes.
In the experiment, this means that the subject has to find a way to justify the act of lying. For the second group it is very simple: “I got paid” (this was 1959, after all – $20 then is worth about $167 today). For the first group, however, it was not that simple: being paid a measly dollar was not enough to justify the lie on its own. The solution for this group was to change their attitude to the task, telling themselves that they did not do it for the money – they did it because the task was interesting.
For these subjects the real lie was not the one they were paid to tell, but the one they were telling themselves to avoid being irrational and to avoid cognitive dissonance.
Festinger, in his book “When Prophecy Fails”, also observed this phenomenon among members of cults that predict the end of the world. The followers believe the leader’s increasingly far-fetched explanations, even after the apocalypse deadline has passed without incident time and time again – and then keep going out to spread the word.
It is by this rationalisation mechanism that we may be inclined to prefer a more expensive wine to a cheaper one of equal quality, or be convinced that the latest smartphone is definitely so much better than all the previous models, and so on. The marketing applications of this mechanism are boundless: it can be used to build brand loyalty, or to make consumers convince each other that a higher price means higher quality.
In 2009, neuroscientists from the University of California sought a better understanding of this phenomenon, using fMRI to see what happens in a person’s brain as they go through a version of the Festinger and Carlsmith experiment.
The results showed that two brain regions associated with detecting conflict were highly active when the subjects were made to lie, and increasingly so in proportion to the scale of the lie.
When the participants were not paid for their lie, the regions in question were even more active, as the dissonance is stronger without an obvious (financial) reason for lying. The answers to the questions were in line with those of the original 1959 study: the non-remunerated subjects claimed to have enjoyed the experiment more than the other groups did.
Our brain seems to modify our original perception so as to avoid making us feel like hypocrites about our current behaviour, and to stop us feeling bad about the lie. The more we lie, the more comfortable we are with lying, and this can lead to a slippery slope towards compulsive or pathological lying – defined as the habit of lying uncontrollably, where the individual is more comfortable lying than telling the truth. Pathological liars are thought to lie with a clear motive in mind (this seems to be the main distinction from compulsive liars).
In “The brain adapts to dishonesty”, Garrett, Lazzaro, Ariely and Sharot show that the more an individual lies, the greater their propensity for dishonesty becomes, and that the amygdala (a part of the brain associated with emotions, survival instincts and memory) adapts to consistent dishonest behaviour.
They show that, all else being equal, the scale and frequency of dishonesty gradually increase the more a person lies – with the reduced amygdala signal consistently predicting the increase in dishonesty at the next decision in the experiment.
When dishonesty was in the name of someone else’s benefit, the level of dishonesty was more or less constant over time. However, when people lied in their own self-interest, the researchers observed an escalation of dishonesty – a slippery slope.
The most interesting thing about this neurological study is that it differentiates between motivations for lying, and concludes that for lying to escalate, it needs to be done in one’s own self-interest.
Certain physiological and neurological signals are observed when people engage in self-serving dishonesty. This was shown by a study that gave students a drug which reduced or blocked the physiological and neurological signals associated with the guilt of lying: those who took the drug were twice as likely to cheat on an exam as those who took a placebo. The unease felt when lying for one’s own benefit seems to be what keeps people from lying most of the time, rather than any deeply held moral convictions.
Repeatedly engaging in self-serving dishonesty seems to have the same effect as the drug given to the students in the previous study: repeated exposure to the feelings of guilt and unease caused by these lies reduces their strength, and thus increases the propensity to lie in one’s own self-interest.
On the other hand, as Elim Garak of Star Trek: Deep Space Nine said:
“Lying is a skill like any other and if you want to maintain a level of excellence, you have to practice constantly.”
The rational agent “Homo Economicus” must act selfishly in his own best interests, without worrying about the welfare of others – so if lying is beneficial to him, he must lie. However, dishonesty at an aggregate scale has devastating effects, particularly on the economy. Mauro (1995) shows a significant negative link between corruption and growth. The paper takes the example of Bangladesh: if it were to improve “the integrity and efficiency of its bureaucracy” by one standard deviation on the bureaucratic efficiency index (bringing it to the level of Uruguay), annual GDP growth would rise by over half a percentage point.
We all tell ourselves lies, big or small, to rationalise our lives at some point. In the classic TV series The Wire, Omar Little kills on a daily basis to steal drugs from drug dealers, but consistently defends himself by saying “I do some dirt too, but I never put my gun on no one who wasn’t in the game [the drug trade]”. By telling himself that his victims are part of the “game”, he distances himself from repeated acts of murder in the name of profit; he rationalises them by believing he abides by a certain code, diminishing the act of murder in his own mind.
Sometimes we rationalise our lies and actions by telling ourselves further lies. But, on a simpler note, Mark Twain once said: “If you tell the truth, you don’t have to remember anything.”
by Guillaume Communay & Tristan Salmon