In many ways, Cognitive Biases are what make us human. They are systematic errors in thinking that affect the decisions and judgments people make. We would not be able to survive without the simplifications these errors allow, but in many ways they also have an impact on our daily lives.
Some of these biases are related to memory. The way you remember an event may be biased for a number of reasons, and that in turn can lead to biased thinking and decision-making. Other cognitive biases are related to problems with attention. Since attention is a limited resource, people have to be selective about what they pay attention to in the world around them. Because of this, subtle biases can creep in and influence the way we see and think about the world. Just think about how much our perception of the recent Coronavirus events is being shaped by the biases of our mind, which make the issue much scarier than it is in reality.
We have seen how Bias plays a role in many decisions, and several books I’ve reviewed cover this topic in great detail. In Rebel Ideas, Matthew Syed looks at how Diversity is often hindered by biases. Iris Bohnet explains in Gender Equality how much Bias hinders the perception of women in many ways. Huib Wursten addresses bias in how we perceive mental models of other cultures. We have seen how Bias creates issues in the way we communicate with strangers. And Todd Rose shows us how much bias in numbers affects our perception of reality. In all these books, awareness is offered as the critical antidote to bias.
There are many catalogues of the biases discovered so far, and TitleMax has just issued a very neat Infographic that lists 50 Cognitive Biases.
Among the most recently studied forms of bias, three are interesting to note, especially because of their relationship with technology.
Automation Bias
Automation Bias happens when people tend to favour the outputs of information systems. The problem is not new (the Chernobyl Disaster has been partially linked to it), but with the recent developments in AI and Machine Learning, we increasingly take for granted the decisions delivered by information systems. Think about the recommendation engines that are so common today, for example on Spotify, Netflix and Amazon. This “personalisation” of the content on offer takes a toll on how we interact with these systems, and often creates issues. If you look at my Spotify suggestions now, you’ll notice a lot of relaxing music. Nothing wrong with that, but it comes from the fact that I often play a Dog Calming Playlist when not at home (sometimes for 4 hours at a stretch), and this has sneaked into my preferences (and also into my Top Listened songs…).
The risk here is trusting an output simply because it comes from a computer.
The Google Effect
The Google Effect, also called Digital Amnesia, is the tendency to forget things we can easily find on a search engine. This extends not just to news and information, but also to personal data (how many of you remember a phone number today?). It seems that our brain is re-prioritising the way it stores information because so much of it is now easy to access. This is happening in many domains; one critical example is navigation. Thanks to the ubiquitous availability of navigation software, people are losing the capacity to navigate from one place to another, often ending up in weird and dangerous situations.
The IKEA Effect
The IKEA Effect refers to the tendency to assign higher value to things we have created ourselves (or helped create). Retailers often exploit this bias, especially when they allow customers to “customise” products. A big part of the DIY market is also built on it.
While there may not seem to be anything inherently wrong with this bias, being aware of it can help us with some decisions as consumers, and also in the working environment.
The full list of 50 Cognitive Biases
The list of the 50 Cognitive Biases identified is as follows:
- Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
- Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
- In-Group Favoritism: We favour people who are in our in-group as opposed to an out-group.
- Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
- Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
- Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
- Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
- False Consensus: We believe more people agree with us than is actually the case.
- Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
- Spotlight Effect: We overestimate how much people are paying attention to our behaviour and appearance.
- Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
- Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
- Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
- Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
- Naïve Cynicism: We believe that we observe objective reality and that other people have higher egocentric bias than they actually do in their intentions/actions.
- Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
- Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
- Anchoring: We rely heavily on the first piece of information introduced when making decisions.
- Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.
- Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
- Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
- Confirmation Bias: We tend to find and remember information that confirms our perceptions.
- Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
- Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
- Belief Bias: We judge an argument’s strength, not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
- Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
- Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
- Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
- Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
- Gambler’s Fallacy: We think future possibilities are affected by past events.
- Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
- Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
- Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
- Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
- Authority Bias: We trust and are more often influenced by the opinions of authority figures.
- Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
- Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
- Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
- Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
- Zeigarnik Effect: We remember incomplete tasks more than completed ones.
- IKEA Effect: We place a higher value on things we partially created ourselves.
- Ben Franklin Effect: We like doing favours; we are more likely to do another favour for someone if we’ve already done a favour for them than if we had received a favour from that person.
- Bystander Effect: The more other people are around, the less likely we are to help a victim.
- Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
- False Memory: We mistake imagination for real memories.
- Cryptomnesia: We mistake real memories for imagination.
- Clustering Illusion: We find patterns and “clusters” in random data.
- Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
- Optimism Bias: We sometimes are over-optimistic about good outcomes.
- Blind Spot Bias: We don’t think we have a bias, and we see it in others more than in ourselves.
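To show how counterintuitive some of these biases are, here is a minimal Python sketch (my own illustration, not part of the infographic) of the Gambler’s Fallacy: even after a streak of five heads, the next fair coin flip is still roughly 50/50, because past flips do not affect future ones.

```python
import random

random.seed(42)

# Simulate one million fair coin flips (True = heads)
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the flip that immediately follows every streak of five heads
after_streak = [flips[i] for i in range(5, len(flips)) if all(flips[i - 5:i])]

# The Gambler's Fallacy predicts tails is "due"; the data says otherwise
rate = sum(after_streak) / len(after_streak)
print(f"Heads rate after five heads in a row: {rate:.3f}")  # ~0.5, not lower
```

Running this shows the heads rate after a streak stays close to 0.5, which is exactly what the fallacy leads our intuition to deny.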
Understanding the potential impact of Cognitive Bias is critical not just to address Diversity and Belonging issues, but in any organisational intervention. It is needed to deliver a strong business case for Creativity, and it supports many of the critical skills for the Future of Work, such as Curiosity.
I believe that Bias Awareness and Avoidance should be intentionally built into our organisation design, especially at the Governance level. Awareness in this field is key to ensuring failures are intercepted, especially in the case of Change and Digital Transformation.
And you? Are you aware of your biases?
Cover Photo by Andreas Kind on Unsplash