"

3.6. Cognitive Biases

By Michael Ireland, adapted by Marc Chao and Muhamad Alif Bin Ibrahim


Many of the fallacies we have discussed are so persuasive because they tap into our cognitive biases, which are built-in tendencies that make us vulnerable to flawed reasoning, incomplete evidence, ambiguous language, and irrelevant distractions. These biases can significantly interfere with our ability to think critically and rationally, which is precisely the focus of this chapter. One of the primary reasons we often fail to reason effectively is our blindness to these cognitive biases and the subtle ways they shape our judgement.

Cognitive biases are systematic patterns of deviation from rational judgement, where the framing or context of information distorts how we perceive, evaluate, and decide. As we have already explored, humans rely on mental filters and shortcuts (heuristics) to process information quickly and efficiently. These mental shortcuts evolved not to guarantee accuracy or correctness, but to help us make fast and generally useful decisions in survival-oriented contexts. However, while heuristics can simplify complex decision-making, they also introduce errors and distortions, making our reasoning more prone to mistakes.

While informal fallacies and cognitive biases both lead to flawed reasoning, they differ in their origins and how they manifest. Informal fallacies are errors in the structure or content of an argument, often resulting from poor reasoning, misrepresentation, or misuse of evidence. They are typically identifiable within the framework of a specific argument. In contrast, cognitive biases are deeply ingrained psychological tendencies, which are systematic patterns of thought and judgement that operate subconsciously. In short, fallacies appear in the arguments themselves, while biases are internal habits of thought that influence how we interpret and construct those arguments in the first place.

At this point, it might feel like this chapter is turning into something of a ‘listicle’, which is a term often used for articles that are structured as lists rather than fully developed discussions. That is because, just like informal fallacies, cognitive biases are commonly presented in categorised lists. A quick online search will return titles like “The Top 10 Cognitive Biases You Need to Know” or “5 Cognitive Biases That Shape Your Thinking”. However, the goal here is not to overwhelm you with an exhaustive catalogue but to introduce a foundational set of key biases that will enable you to recognise and understand others more easily.

In cognitive psychology, particularly in the context of cognitive therapy, the term ‘cognitive bias’ takes on a more specific meaning. It refers to habitual patterns of distorted thinking, often referred to as cognitive distortions, that contribute to and exacerbate emotional distress, anxiety, and depression. If you are studying psychology, you will likely encounter these concepts in more detail. For the purposes of this chapter, however, we are using the term ‘cognitive bias’ in a broader sense, referring to general thinking habits that can lead to reasoning errors.

One of the most intriguing and ironic aspects of cognitive biases is our tendency to easily spot them in others while remaining oblivious to them in ourselves. This blind spot is itself a cognitive bias. It is incredibly common to notice flawed reasoning, selective interpretation, or emotional decision-making in someone else while overlooking the same patterns in our own thought processes.

For this reason, it is crucial to remain open to feedback from others when they point out potential biases in your reasoning. If you are like most people, and you are, you likely have a few cognitive biases that you are unaware of. Recognising and acknowledging them is the first step toward more accurate, balanced, and reflective reasoning.

The Dunning-Kruger Effect

The Dunning-Kruger Effect describes a fascinating and counterintuitive phenomenon: the less someone knows about a topic, the more likely they are to overestimate their knowledge of it. You have likely encountered this effect in everyday life, even if you were not aware it had a name.

This cognitive bias was first formally described by psychologists David Dunning and Justin Kruger in their influential 1999 study titled “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” In essence, the Dunning-Kruger Effect highlights a troubling truth: the less knowledge or skill someone has in a given area, the less capable they are of recognising their own limitations.

At its core, the effect shows that ignorance breeds overconfidence. People with limited understanding of a subject often overestimate their expertise because they lack the self-awareness to recognise their own gaps in knowledge. On the other hand, individuals who possess genuine expertise tend to be more cautious and humble in their assessments, often underestimating their own competence because they are keenly aware of what they do not know.

Even Charles Darwin observed a version of this effect long before it was given a formal name, writing that “Ignorance more frequently begets confidence than does knowledge.”

The Dunning-Kruger Effect serves as a powerful reminder of the importance of intellectual humility and the need for self-awareness in evaluating our own abilities and knowledge. Understanding this bias can help us approach unfamiliar topics with a more balanced perspective and encourage more thoughtful self-assessment.

Confirmation Bias

Confirmation bias is one of the most widespread and influential cognitive biases. At its core, confirmation bias refers to our tendency to seek out, interpret, remember, and give more weight to information that supports our existing beliefs, while ignoring or downplaying evidence that contradicts them.

This bias stems from our deep psychological attachment to being right. Changing our minds is uncomfortable, mentally taxing, and often feels like admitting failure. As a result, we become emotionally invested in our existing beliefs and subconsciously filter the world to reinforce them.

In the Information Age, confirmation bias has become an even bigger problem. With unlimited access to information online, it is incredibly easy to find articles, studies, or opinions that align with whatever belief we already hold. A quick Google search can provide endless “evidence” to validate almost any position, no matter how flawed or incorrect. This creates an echo chamber effect, where we shield ourselves from opposing viewpoints and avoid confronting the possibility that we might be wrong.

But confirmation bias does not just affect what information we seek out; it also influences how we process and remember evidence. When presented with two pieces of evidence, one that supports our belief and one that challenges it, we are far more likely to scrutinise and dismiss the contradictory evidence while readily accepting the supporting evidence. Furthermore, we are more likely to remember the confirming evidence and forget or distort the contradicting information over time.

Understanding confirmation bias is essential for critical thinking because it reminds us to approach evidence and opposing viewpoints with intellectual humility and an open mind. Overcoming this bias requires a conscious effort to actively seek out disconfirming evidence, question our assumptions, and evaluate all evidence with equal scrutiny, regardless of whether it aligns with our preexisting beliefs.

Self-Serving Bias

Self-serving bias refers to our natural tendency to attribute positive outcomes to our own actions or character while blaming negative outcomes on external factors. In essence, we protect our self-esteem by taking credit for successes and shifting blame for failures onto something, or someone, outside of ourselves.

This bias serves an important psychological function: it helps us maintain a positive self-image and avoid feelings of guilt, shame, or inadequacy. When something goes well, we are quick to attribute the success to our skills, intelligence, or effort. However, when something goes wrong, we are equally quick to point to bad luck, other people’s mistakes, or uncontrollable circumstances as the cause.

A classic example can be seen in the aftermath of a car accident. Both drivers involved are far more likely to blame the other party, even if they themselves contributed to the collision. Similarly, in academic or professional settings, people often credit their hard work and intelligence for their successes but blame unfair teachers, bad bosses, or difficult circumstances for their failures.

Interestingly, self-serving bias does not operate the same way in everyone. Individuals with low self-esteem or depression may actually experience a reversed self-serving bias. In these cases, people are more likely to blame themselves for negative outcomes and attribute positive outcomes to luck or external factors, further reinforcing their negative self-perception.

Understanding the self-serving bias is essential for developing self-awareness and accountability. Recognising when we are falling into this pattern can help us take responsibility for our actions, learn from failures, and grow from our experiences rather than defaulting to protective but ultimately unproductive mental habits.

The Curse of Knowledge and Hindsight Bias

Knowledge, while valuable, comes with its own set of drawbacks. One of these is the curse of knowledge, sometimes referred to as the curse of expertise. This bias occurs when someone who is well-versed in a topic fails to recognise how much less others may know about it. Experts, such as lecturers or seasoned professionals, often assume that their audience shares their foundational understanding. This assumption can lead to poor communication, misunderstandings, and unrealistic expectations about how others will interpret or act on information.

The problem arises because once we have fully integrated a piece of knowledge into our understanding of the world, it becomes intuitive and seemingly obvious. Explaining something to someone without that foundational knowledge suddenly feels far more difficult than we realise. This creates barriers to teaching, collaboration, and even predicting how others might respond in certain situations.

Closely related is hindsight bias, which deals not with knowledge but with events and outcomes. After something significant happens, it feels inevitable in retrospect, even if it was not obvious beforehand. This bias convinces us that we “knew it all along” or that the outcome was clearly predictable. However, this false sense of foresight overlooks the uncertainty and complexity present before the event occurred.

A common example of hindsight bias can be found in studying historical events. When analysing the events leading up to World War I, for instance, it is tempting to wonder how experts at the time failed to see the impending crisis. With the clarity of hindsight, every detail appears to have been a clear warning sign, even though those living through the events were navigating ambiguity and incomplete information.

Both the curse of knowledge and hindsight bias highlight how our perspective on knowledge and events changes once we have additional information. Being aware of these biases can help us communicate more effectively, remain humble about what we “knew” beforehand, and approach complex situations with a clearer understanding of uncertainty and perspective.

Optimism and Pessimism Bias

Humans have a notoriously poor grasp of probability, and optimism and pessimism biases are clear examples of this shortcoming. These biases influence how we perceive the likelihood of positive or negative outcomes, and they often lead us to make flawed judgements about future events. Our ability to accurately assess probabilities is heavily influenced by factors such as our personality, mood, and the nature of the situation we are evaluating.

Optimism bias leads us to overestimate the likelihood of positive outcomes and underestimate the risk of negative ones. For example, many university students surveyed believe they are less likely to experience negative life events, such as divorce or alcohol addiction, compared to their peers. At the same time, they tend to overestimate their chances of positive outcomes, like owning a home or living past the age of 80. These skewed perceptions are often reinforced by confirmation bias, which makes it easy for us to focus on evidence that supports our optimistic expectations while dismissing contradictory information.

Conversely, pessimism bias causes people to overestimate the likelihood of negative outcomes and underestimate positive possibilities. While optimism and pessimism biases seem like opposites, they can actually coexist within the same person, depending on the specific scenario. For example, someone might feel overly optimistic about their career prospects while simultaneously being overly pessimistic about their health outcomes.

It is also worth noting that pessimism bias can be more pronounced in individuals with mental health conditions, such as depression. These individuals may consistently interpret future events with an exaggerated sense of risk or inevitability of failure.

Ultimately, both optimism and pessimism biases reveal how subjective and unreliable our assessments of probability can be. Recognising these biases helps us approach future planning with greater realism and balance, making us more aware of our tendency to lean too far in either direction when predicting outcomes.

The Sunk Cost Fallacy

The sunk cost fallacy occurs when we continue investing time, money, or effort into something simply because we have already invested so much, even when it no longer makes sense to do so. A sunk cost refers to any expense, whether financial, emotional, or in terms of time, that has already been incurred and cannot be recovered. Rational decision-making tells us that these past costs should not influence our future choices, but our emotions and psychological biases often override this logic.

This fallacy leads us to overvalue past investments while undervaluing future or ongoing costs. Essentially, the more we have invested in a decision, the harder it becomes to walk away, even when the most logical choice would be to stop. For example, someone might overeat at a buffet because they feel they need to “get their money’s worth”, even if they are uncomfortably full. Similarly, a business might continue funding a failing project simply because so much has already been poured into it, rather than redirecting those resources to something more promising.

At its core, the sunk cost fallacy exploits our emotional attachment to past investments. We are naturally resistant to the idea of “wasting” what we have already put in, even if persisting results in further losses. The belief that we must “see it through” creates a powerful psychological pull, making it feel like abandoning the effort is a failure rather than a wise strategic choice.

Recognising the sunk cost fallacy involves a shift in perspective: learning to evaluate decisions based on their future value and potential outcomes rather than past investments. By focusing on what can still be gained (or avoided) moving forward, rather than what has already been lost, we can make more rational and effective decisions.

Negativity Bias

Humans have a natural tendency to focus more on negative experiences and emotions than on positive ones, even when their intensity is the same. This phenomenon, known as negativity bias, means that negative events have a greater psychological impact on us than positive ones of equal significance.

For example, we are far more likely to fixate on an insult or mistake than to dwell on a compliment or success. Even if the praise and criticism are equally strong, the criticism tends to linger in our minds, while the praise fades more quickly. This imbalance is not just a quirk of personality; it is deeply rooted in how our brains process emotional experiences. Negative emotions and events are registered more intensely, are more easily recalled, and tend to dominate our thought patterns.

This bias serves an evolutionary purpose. Historically, paying close attention to potential threats or negative events increased our chances of survival. However, in modern contexts, this bias can distort our perspective, making us overly focused on problems, setbacks, or critical feedback while overlooking positive experiences and successes.

It is important to note that negativity bias is distinct from pessimism bias. While negativity bias focuses on how we process past and present events, pessimism bias is about our expectations for future events. Understanding this distinction helps clarify how these biases influence our thoughts and emotions in different contexts.

Recognising negativity bias allows us to be more intentional about balancing our focus on positive experiences and not letting negative events dominate our mental space. By consciously acknowledging and celebrating positive outcomes, we can counterbalance this natural tendency and develop a more even-handed perspective on our experiences.

The Backfire Effect

We like to think of ourselves as rational beings, ready to adjust our beliefs when presented with new facts and evidence. However, reality tells a different story, one that becomes painfully clear after five minutes on social media. Instead of welcoming evidence that contradicts our views, we often respond by digging in our heels even deeper.

The backfire effect describes this counterintuitive reaction. When confronted with evidence that challenges deeply held beliefs, instead of reconsidering our stance, we often become even more committed to our original position. It is as if admitting we were wrong is so uncomfortable, so threatening to our sense of self, that we would rather reject reason and evidence entirely.

Rather than softening our stance, new information can feel like an attack on our identity, triggering a defensive response where we double down on our original belief, as though preparing for a long and stubborn stand-off. This effect is especially common with beliefs tied to our identity, values, or worldview.

The irony of the backfire effect is that it often strengthens the very beliefs it seeks to challenge, making productive conversations around controversial topics incredibly difficult. It is a psychological bunker mentality: when challenged, we fortify our mental defences rather than opening the gates for reflection and growth.

Like many cognitive biases, the backfire effect is easy to spot in others but notoriously difficult to notice in ourselves. Recognising it in our own thinking requires humility, self-awareness, and a willingness to sit with the discomfort of being wrong, which is far easier said than done. Understanding this bias can help us approach difficult conversations with more patience, empathy, and a focus on collaboration rather than confrontation.

The Fundamental Attribution Error

The fundamental attribution error highlights a common imbalance in how we explain behaviour, both our own and others’. When judging our own actions, we tend to blame external circumstances, while when judging others, we often attribute their behaviour to their character or personality.

For example, if we accidentally cut someone off in traffic, we might excuse ourselves by saying, “I didn’t see them” or “I was in a hurry.” In other words, we justify our actions by pointing to the situation we were in. However, if someone else cuts us off, we are far less generous in our interpretation. Instead of considering that they might be rushing to an emergency or simply did not see us, we are more likely to think, “What a careless jerk!”

This bias occurs because it is easier to observe others’ behaviour than the situational factors influencing them. When it comes to our own actions, we have a deeper understanding of our intentions, pressures, and constraints. But with others, we only see the outcome of their actions, not the invisible situational context behind them.

This tendency to favour dispositional (personality-based) explanations over situational ones is also known as the correspondence bias or attribution effect.

Recognising this bias is crucial because it affects our judgement, empathy, and relationships. By reminding ourselves that everyone operates within their own set of circumstances, just as we do, we can approach others’ behaviour with more understanding and less immediate judgement.

In-Group Bias

Humans have a natural tendency to categorise the world into social groups, which is how we make sense of complex social dynamics. However, this inclination often leads to in-group bias, also known as in-group favouritism. This bias refers to our tendency to favour people we perceive as part of our own group, viewing them, and the group as a whole, more positively than those outside it.

At first glance, this might not seem like a bias. After all, if we did not view our group positively, we would probably just join a different one. Whether it is supporting a sports team, identifying with a cultural group, or aligning with a political party, our group affiliations are deeply tied to our sense of identity and self-esteem.

What is particularly striking about in-group bias is how easily it can emerge. Studies have shown that people start displaying favouritism even when they are randomly assigned to completely meaningless groups, with no shared history or meaningful connection. In these experiments, participants often favour their new group members in everything from distributing rewards to forming opinions, even though the group itself was created arbitrarily and holds no real significance.

This bias is not inherently malicious; it is a byproduct of our natural desire to belong and feel valued. Our group memberships help shape our self-identity and reinforce our sense of self-worth. However, unchecked in-group bias can lead to unfair treatment of others, reinforce stereotypes, and create unnecessary divisions between groups.

Being aware of in-group bias allows us to reflect on our assumptions and judgements, encouraging us to evaluate others based on their individual merits rather than group affiliations. In doing so, we can move towards more balanced and fair interactions across different social groups.

The Forer Effect (also known as The Barnum Effect)

The Forer Effect, sometimes called the Barnum Effect, refers to our tendency to believe vague and general personality descriptions are uniquely tailored to us. This bias explains why people often find horoscopes, personality tests, or fortune-telling surprisingly accurate, even when the descriptions are so broad they could apply to almost anyone.

For example, a horoscope might describe someone as “strong-willed yet sensitive, often drawn to creative pursuits and deeply caring for loved ones.” While these traits could easily apply to a wide range of people, individuals often interpret them as being specifically about themselves, especially if they align with their self-image.

This effect works because people tend to focus on details that feel personally meaningful while overlooking the generic nature of the description. Essentially, we are drawn to self-relevance, even in statements designed to be universally relatable.

A playful way to test the Forer Effect is to pretend you are a different zodiac sign when talking to someone who strongly believes in astrology. You will likely find they quickly identify traits from that sign in your behaviour or personality. And when you eventually reveal your true sign, they might simply brush it off with a comment like, “Oh, well, that sign is known for being deceptive!”

This interaction also highlights how confirmation bias and the backfire effect can reinforce these beliefs. Once someone feels their personality aligns with a description, they become resistant to evidence suggesting otherwise.

The Forer Effect reminds us how easily we can fall into the trap of seeing patterns and personal relevance in vague statements. Recognising this bias can help us approach generalised claims, whether in horoscopes, personality quizzes, or marketing materials, with a more critical and discerning mindset.


Chapter Attribution 

Content adapted, with editorial changes, from:

Mastering thinking: Reasoning, psychology, and scientific methods (2024) by Michael Ireland, University of Southern Queensland, is used under a CC BY-SA licence.

License


3.6. Cognitive Biases Copyright © 2025 by Marc Chao and Muhamad Alif Bin Ibrahim is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License, except where otherwise noted.
