To Thine Own Self Be Truer
By Jon Jeckell
Lying is far more pervasive than most people want to admit. People naturally want to believe that they are honest and well intentioned, and that only bad people lie, cherry-pick facts for an argument, or pass themselves off as more certain about something than they really are.
There are many forms of lying and deception, and nearly all of them are dangerous to the competitiveness of an individual or organization. Sun Tzu warned leaders to “know themselves” as well as the enemy, because knowing the true state of your capabilities is crucial to understanding how your organization will perform. Commanders and corporate leaders run inspection programs to guard against embellished record keeping, which is as prevalent in some parts of the world today as it was when Sun Tzu (allegedly) wrote those words. Likewise, the fourth item in the Standing Orders for Rogers’ Rangers tacitly recognizes the human tendency to lie or embellish for various reasons, but warns Rangers about the hazards of passing along false information, no matter what the reason.
One of the most corrosive cousins of the lie is bullshit, as outlined in Harry Frankfurt’s book On Bullshit: people engaged in bullshit are more concerned with looking smart than with saying something true or useful. The dangerous difference between a lie and bullshit is that the bullshitter neither knows nor cares whether what they are saying is true.
Falsehoods extend to cognitive biases, cherry picking, and other ways of framing a situation in our own favor. This has been a well-known hazard in military planning and the intelligence community for some time, and it is exacerbated by working with incomplete information and often by the inability to conclusively prove what was true even after the fact.
The larger an organization gets, the more it depends upon its members to coordinate internally using correct information. Large corporations, government agencies, and other big bureaucracies therefore rely upon the honesty of their members to function. This is why, in places where corruption is rampant and personal relationships are how things get done, government agencies are in shambles and large corporations (other than family-run ones) are uncommon.
The democratic process itself also relies upon social trust to assure people that the outcome is fair and just, that there will be no reprisals against the losing parties, that transitions of power will be safe, and that losing power is not a permanent loss. Grafting elections and the trappings of democratic society onto a society that lacks impersonal trust in the news, in government institutions, and in the mechanism for electing officials does not work. If people believe that those being elected, or the process itself, are fraudulent, the legitimacy of the government and the people in it dwindles, and enforcing laws, collecting taxes, and every other interaction with the government becomes commensurately more difficult.
So fake news and divisive propaganda are particularly corrosive to democratic societies, eroding trust in each other and in the institutions and processes that allow rule by consensus. Although many blame the Russians and other foreign powers for driving a wedge into our national unity, there is no shortage of domestic sources of fake news and divisive discourse, and unfortunately it only works because the American public (and that of other Western nations) seems particularly receptive to it. In fact, simple Twitter bots and other unsophisticated techniques seem to be more effective overall than elaborate fake news or DeepFakes.
The key to the effectiveness of fake news, propaganda, and DeepFakes is a willingness to believe the message. What is troubling is that this is not confined to the stupid and gullible, but is a pervasive behavioral trait that may afflict the most intelligent of us, regardless of political, socio-demographic, or educational background. Cognitive biases, including cherry picking of evidence, are a well-known hazard in the intelligence community and in military planning. But while those institutions have mechanisms to structurally enforce intellectual honesty, the general public does not.
The willingness to believe lies is related to the ability to fool oneself, and the more intelligent and creative a person is, the better they can rationalize their beliefs and actions. Most people believe that other people lie or are gullible, but nearly everyone believes they themselves are honest. Behavioral economist Dan Ariely found that most people lie only to the extent that they can still rationalize to themselves that they are a good person, and wrote about his findings in the book The Honest Truth About Dishonesty. The experiments described in the book revealed that deception and self-deception are pervasive. Worse, Ariely discovered that deception is a social contagion: exposure to a lie makes one more likely to lie oneself. Ariely also found evidence that consciousness of bad behavior can snowball into bigger transgressions through the “what the hell” effect.
Another recent study discovered that people who reject science, on both the right and the left, do not do so because they are stupid or ignorant. When confronted with information that conflicts with their interests and beliefs, they cherry-pick pieces of factual information and creatively distort the rest. As before, their willingness and ability to do so scales with their ability to rationalize to themselves that they are still a good person. This also means that they are unlikely to change unless they are confronted with evidence that they are not a good person for holding those beliefs, and not just with evidence that their beliefs are wrong. Moreover, influence is most effective when it comes from peers they respect, not from any other source targeting them directly, according to Alex Pentland in Social Physics.
This fits with Robert Cialdini’s assessment of how a person’s social network is critical to changing their beliefs. Three of Cialdini’s principles of influence, consistency, social proof, and liking, are all oriented on a person’s relationships with others who matter to them. Consistency acts like social inertia: people want to appear consistent in their beliefs and resist efforts to change them. The need to appear consistent can lead people to double down on their beliefs when confronted with conflicting evidence, particularly in front of their peer group. This is probably why Facebook seems to host such nasty debates compared with platforms offering more anonymity or social distance, which let people regroup or give ground without losing face with their core peer group. Consistency is also at the core of why the impacts of fake news, DeepFakes, and other tricks persist long after they have been debunked. However, consistency also applies among a person’s own beliefs. Social proof is the tendency to mimic the behavior of those around us, particularly people we admire and respect. Finally, the need to be liked makes people curb or modify their beliefs and actions to fit in with those around them.
So what can we do about it? There is a lot we can do about individual honesty in transactions and within institutions. It may sound corny, but Ariely found that oaths and pledges (moral reminders) actually work by reminding people of their obligation to be honest and fair, particularly when it matters most. Ariely found that putting a pledge and signature at the top of an exam, for example, resulted in far less cheating. Interestingly, he found that personal benefit and the likelihood of being caught were relatively unimportant. However, conflicts of interest, a personal stake, or the chance to help someone in your social network do make it easier to rationalize bad behavior. Cracking down on dishonesty, and on the perception of it, also helps arrest the social contagion. Being well rested, in good condition, and in a good mood when making important decisions also makes for more honest decisions. Ariely also suggested that a resetting ritual (such as a secular counterpart to Catholic confession) can reset the “what the hell” effect by allowing people to face their smaller transgressions, remind themselves of their obligations, and do better.
So how do you inoculate yourself against these biases? Knowing how to recognize cognitive biases and fallacies can help. Recognizing deliberate social engineering is difficult because people who are good at it know how to deploy those techniques when you are most vulnerable to them. Nevertheless, Cialdini’s principles of influence are good to keep in mind. Putting off important decisions or actions when you are tired, hungry, agitated, or otherwise depleted helps you avoid getting into situations you will later regret. Unfortunately, most people cannot be so selective about when they engage with their friends, social media, or the news. A visceral emotional reaction may also be a sign to pause before engaging with something. Recognizing the signs that you have been duped, or being confronted about it, can work, but can also drive a further retreat into other trusted connections and sources.
As for democratic society as a whole, I recommend Peter Turchin’s Ages of Discord to understand the multiple cycles at work driving people together into closer cooperation and trust, or pulling them apart. This trust and unity is at the core of what makes societies susceptible to having internal divisions turn malignant. Turchin uses data from throughout American history as a case study and found a link between the erosion of social cohesion and the rise of political violence. Turchin identified a number of forces driving integration and disintegration; unsurprisingly, a dangerous external threat is one of the most potent drivers bringing people together, and the effect is stronger the more closely people identify with each other and the more alien the enemy appears. The perception of scarcity combined with conspicuous consumption is corrosive to unity, as is aggressive competition among elites.
So while fake news, DeepFakes, and propaganda seem like tractable problems to attack, they are effective only because they land on fertile ground where our trust in each other and in our institutions has already been weakened. But that is a topic for another time.