I’m sure we've all seen it before. The conversation around the family dinner table turns political, and immediately the guests are divided into two camps. You manage to keep the debate civilized and friendly, but not everyone may be as educated on the subject as you are. Having read several articles on the topic at hand, you feel confident enough to introduce some cold, hard facts and statistics into the conversation. Objective, unbiased data are sure to convince the other side, right? Yet they seem reluctant to change their minds. Why aren't facts enough to change our minds?
It’s time we admitted that humans are not rational creatures. What, then, motivates us, if not the cold hard facts?
This irrational streak of the human mind was first documented in studies conducted at Stanford University in the 1970s. Since then, countless other studies have shown that our irrationality is deeply wired into us by evolution and that we act on emotion rather than on rational facts.
Our lack of logic stems from our evolutionary desire to belong. In the wild, straying too far from the herd, even in search of shelter or food, immediately lowers your chances of survival. By sticking to the opinions of our group, we subconsciously signal to our surroundings that we are an inherent part of the herd. We belong. In this respect, trying to convince someone to change their mind is equivalent to asking them to leave their tribe. How would they survive without a community?
Another biological element of the irrational mind is that our brain has mechanisms that constantly work to protect our ego, worldview, and sense of identity. When our worldview is challenged, the same part of the brain that processes physical danger is activated. This may explain why arguments can sometimes turn aggressive.
This is especially interesting when you realize that the voice of reason itself developed through evolution. As Homo sapiens evolved language, they gained the ability to cooperate. Cooperation is difficult to establish and sustain, which is why “reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.” Source
Then there’s also the “illusion of explanatory depth,” a term used to describe a person who knows relatively little about a certain field yet imagines themselves competent, masterful, and skillful in it.
Think of it like this: you probably operate a zipper daily. You know how it works, don’t you? You grab the handle, you move it, and the zipper, well, zips. But how does it join the two sides together? When you stop to think about it, you realize that you don’t really know that much about zippers. Steven Sloman and Philip Fernbach, two cognitive scientists, explain it like this: “People believe that they know way more than they actually do. What allows us to persist in this belief is other people.”
When it comes to everyday things, like operating a zipper or a toilet, we know that another human designed it so that we can all operate it easily. “We have been relying on one another’s expertise ever since we figured out how to hunt together […] So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others begin […] There’s no sharp boundary between one person’s ideas and knowledge and those of other members of the group.” Source
When it comes to new technologies, this kind of ignorance is actually empowering. When it comes to politics, it divides us and creates trouble. We can easily operate a zipper without knowing how it works, but backing a political stance without understanding it is another matter.
Another physiological angle on our biased mind comes from a study by Jack and Sara Gorman, a psychiatrist and a public-health specialist respectively. They found that processing information that supports our beliefs gives us a surge of dopamine, often called the happiness hormone.
A drastic ideological shift is not very realistic, so your best strategy is to tell someone where they’re right before telling them where they’re wrong. Point out where you agree, and then offer a gentle alternative perspective rather than a groundbreaking fact.
When you find common ground, you essentially communicate to the other side that you are in the same tribe, and that if they change their mind, they won’t be alone. This is also a great way to defuse aggression.
The more beliefs you share with someone, the more likely they are to change your mind. Writer James Clear put it like this: “You already agree with them in most areas of life […] The closer you are to someone, the more likely it becomes that the one or two beliefs you don’t share will bleed over into your own mind and shape your thinking.”
If you run into smug opposition, try to knock some practicality into the argument. Ask them to describe the implications of their radical beliefs: what would life look like for all of us if we went with their idea? When running this kind of simulation, it's very important to stick to facts only; otherwise, the conversation will stray from the point.
If the argument becomes heated, that's a sign someone is being irrational. Consider the possibility that you may be wrong yourself: what you know, think, or believe is only right based on what you know now, and your beliefs may change as you acquire new knowledge. Maybe the facts are not on your side, in which case admitting it will show the other person that it’s okay to be wrong.
When trying to convince someone, use impersonal language. Instead of saying “your argument,” for instance, say “the argument.” Your goal is to change minds, not to criticize or attack. Use your curiosity and your kindness. Argue to learn, not to win.