The Three Biases Secretly Controlling Us
“Maintaining one’s vigilance against biases is a chore, but the chance to avoid a costly mistake is sometimes worth the effort.” — Daniel Kahneman
We are much more easily influenced than we'd like to admit. We think we have free will, that we make decisions using complete information, that we look through a crystal-clear lens with nothing but our own judgment guiding our choices.
However, as Daniel Kahneman describes in his book Thinking, Fast and Slow, we are deeply mistaken.
Kahneman outlines the ways our cognitive biases take the wheel and prevent us from making clear, sound decisions. It starts with our decision-making being split into two main systems:
System 1: Our fast decision-making faculty
System 2: Our slow decision-making faculty
Our Brain as a Newspaper Office
Kahneman uses the analogy of the brain as a newspaper office. System 1 produces stories, writing them very quickly without checking facts, and hands them off to the editor, System 2, who is meant to verify and publish them. Often, though, the editor doesn't have enough time to fact-check and critically analyze every story. So they publish somewhat blindly, and quickly, and then end up defending stories that might not be correct, because they're already published.
This is how we make decisions. We make snap judgments, quickly commit to them without investigating them closely, and then end up defending these opinions without questioning them. As much as we would like to think that we are all humble, stoic, and open-minded creatures with no ego to prevent us from admitting when we are wrong — this isn’t true.
We often defend something we don’t actually agree with simply because we’ve already committed to the argument.
If we trace it back though, it quickly becomes apparent that there were many factors playing into that decision which we may not have detected. Kahneman points out three main sources of biases that lead to fast decisions which make it through System 2 undetected.
Frequent Exposure Bias
“A reliable way to make people believe in falsehoods is frequent repetition because familiarity is not easily distinguished from truth.” — Daniel Kahneman
What you repeatedly see and hear, you will be more likely to approve of. This is Marketing 101. This tactic is constantly used to manipulate our decision-making, both intentionally and unintentionally.
In elections, if you don't have a specific preference, you're more likely to vote for the name you have heard or seen the most. You're more likely to order Coca-Cola than any other beverage on a menu because of how frequently it has been marketed to you. Frequent exposure bias is extremely hard to detect because we still feel as though we have ownership over our decisions, when in reality we are making them primarily out of familiarity.
To avoid being subject to frequent exposure bias, we must ask ourselves:
Where did this decision come from?
By paying closer attention to the nature of our decisions, habits, tendencies, and beliefs, we can more closely detect when we’re merely doing something because we’re used to hearing it or seeing it. It’s remarkable how significantly we can be influenced by simply bumping into something enough times that we begin to identify with it.
Once I became alert to this, I found myself succumbing to frequent exposure bias more often than I had realized, starting with my relationship with social media. A couple of years back, I went completely radio silent on social media, deleted all the apps, and stopped logging into all of my accounts.
Upon coming back to it several months later, I realized how many people I followed whom I thought I related to or aligned with, but with whom I actually had nothing in common. The sole reason I had been consuming their content was that I was used to it. That's it. Once I realized this, I did a complete cleanse of anyone or anything on my digital platforms that had been hanging around purely because I had become familiar with seeing them and had gotten used to allowing their lives to penetrate mine.
If we're not careful, digital overwhelm, especially in a time like this, can be detrimental to our personal well-being. We have to be vigilant about protecting ourselves, refusing to let things that do not serve us influence our decisions.
Before buying someone’s e-book or signing up for their program, we should ask ourselves:
Is this truly something I would want if I hadn’t seen this product or this person before?
Status Quo Bias
“People will choose unhappiness over uncertainty.” — Tim Ferriss
Why do we stay in jobs we don't like or relationships that aren't good for us? Because we are inclined to think that what we have is better than what we could have. We overweight losses and overvalue what we already have, leaving us trapped by the past and destined to maintain the status quo.
Unless we intentionally set out to think clearly, to weigh options honestly and holistically, we cannot, because we are constantly subject to the status quo bias.
Those who refuse to conform to the status quo open themselves up to more possibilities, creativity, freedom, and innovation.
You cannot get ahead if you are constantly trying to maintain the same pace as everyone else.
Individuals who are particularly resistant to being influenced by others are extremely rare. For most of us, it takes significant effort and active debugging of our brain to detect when we’re acting out of the status quo bias instead of intention.
To innovate, we need to look ahead, break out of the status quo bias, and ask ourselves:
What am I losing by maintaining the status quo right now?
This question leads to independent thinking, breakthroughs, and decisions free of bias. It is this kind of introspection and constant reflection that keeps us on a positive, independently chosen path, regardless of the norms around us.
When I was a senior in high school, I hadn't yet strengthened my resist-the-status-quo muscle, and I succumbed to society's norms instead of thinking independently. I was good at math and science. I wasn't sure what I wanted to do in university, but it didn't seem to matter, because the automatic decision was to go into science or engineering since I had strengths in the STEM fields. What I didn't consider at the time was that I was also strong in the artistic subjects, and that my creative side, while not getting the same public recognition or praise, might have actually been much stronger than my inclination for the more reason-based, quantitative fields. But I went into engineering because that was the status quo for any student who "could" pursue math or science.
Everyone told me that an Arts degree wouldn’t be as prestigious or give me the same credibility in society — and perhaps they were right about that. But what people don’t tell you about going into a subject you don’t enjoy is that the tedious, day-to-day work that should be somewhat interesting, or at the very least tolerable, becomes painful and treacherous. That is much of what my engineering degree became. Not only was I studying extremely challenging courses, but I didn’t have a natural interest in them which made them even more difficult to get through — let alone excel at.
But I felt trapped by the status quo. What was I going to do, switch degrees in my third year of university, and start over?
It seemed impossible to me. And so I stayed. And now I’m finishing my engineering degree (my last exam is today!), and I’ll never know what it would have looked like to break out of the status quo with what I chose to study in undergrad.
But now I'm starting a job in a completely different field from engineering, I've allowed my creative side to blossom through writing, and I've read about subjects I'm genuinely interested in throughout my degree. My engineering degree has still been an incredible learning experience in many ways, but the most important thing it taught me was this:
Do not let society make the big decisions in life for you. It’s not worth it to fit in if you resent what it takes to do so.
“What You See Is All There Is” Bias
“We often fail to allow for the possibility that evidence that should be critical to our judgment is missing — what we see is all there is.” — Daniel Kahneman
Our brain tries to conserve space and time in our decision-making faculties. As a result, it uses limited information to make decisions and blocks out conflicting information to avoid clogging System 2 with more decision-making responsibilities. System 1 infers causes and intentions, neglects ambiguity, and suppresses doubt. Our judgment gets clouded by the small pieces of information we have latched onto to prop up our argument. We refuse to take off our blinders and remain committed to our premature decisions despite the contrary evidence available to us.
To avoid getting caught by this tunnel vision bias, we must ask ourselves:
Why might the opposite be true?
This question alone will help tear down the walls we use to shelter us from reality — from the other side of the argument, a different decision, a new opinion. By asking ourselves why we might be wrong, we can begin to see a more holistic picture of the topic and make better, slower, and smarter decisions.
My brother is especially good at avoiding this bias for some reason, and it has always impressed me. I sometimes see a flashy news headline and spit it out immediately to whoever is around me to share the information. Most people ooh and ahh over it like I do and we begin a full-fledged discussion over it before we have even read the article — completely convinced that what we just read is true.
Not my brother though. He always asks me: where did you hear that? Have you even read the article? Did you read the other side of the story on the same topic?
My answer always used to be a shy no — and often still is — as I sit in disbelief that I’ve been duped by a clever journalist’s headline once again. But as I’ve gotten older, and as times have gotten direr, I’ve realized that invoking that sort of reaction is that person’s job, and to avoid it you need to actively work against it. You need to investigate further. You need to make an effort to understand what you don’t yet see.
It’s our natural instinct to trust others — it’s how society functions smoothly. But we need to question the information we are fed because we can never be too sure that we’ve seen all we need to see before deciding on something. If it seems like someone is trying to get you to form an opinion without showing you all the evidence, it is likely that the opinion they’re convincing you of is not the most fact-based one. Look closely before making your decisions about what you believe. There is an entire industry out there trying to rush us into opinions using the What You See Is All There Is Bias and the only way to avoid it is to take out a magnifying glass and look for more.
Ultimately, as Daniel Kahneman has so carefully pointed out, we are not perfect decision-makers, not even close. We are biased, flawed, overconfident, risk-averse, and generally too fast with our decision-making. While we cannot expect to perfect the art of decision-making, we can equip ourselves against these three hidden biases and avoid being controlled by our conditioning, our ignorance, and our environment.