
Why do we talk about cognitive biases so much?

Last updated: October 11, 2021

Whether in academia (behavioural economics, neuromarketing...) or in your company's human resources department, hearing about unconscious biases has become quite mainstream. Why have they sometimes become the stars of press articles and popularised research? Are they simply buzzwords designating theories destined to be disproved, or notions that can actually help us understand some of our dysfunctions, such as, I don't know... climate inaction?

In this article, I have tried to recap what cognitive biases are, and why learning about them can help us improve ourselves in many ways.


What are cognitive biases?


Unconscious - or cognitive - biases are mental mechanisms that influence how we make decisions when we have too much information to process. They belong to the broader field of cognitive psychology (famous for the associated kind of therapy), which challenged behaviourism in the 1950s (E. E. Smith, 2001) and introduced a new study of mental processes, inspired by computer algorithms. Indeed, this era saw Alan Turing's first steps into the famous and mind-opening world of Artificial Intelligence.


Funnily enough (at least I find it funny), the first works on cognitive biases to reach academia seem to have been economics papers. The psychologists Daniel Kahneman and Amos Tversky suggested that the human decision-making process wasn't the perfect rational wonder we had based our neoclassical economics on: the so-called homo œconomicus was a flawed model, and individuals wouldn't always make the best choices in the stock market or in uncertain situations. Their 1974 paper Judgment under Uncertainty: Heuristics and Biases, for example, shows that subjects' mental shortcuts lead them to intuitions about predicting an outcome that often run contrary to rational calculation or probabilities.
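To make that gap between intuition and probability concrete, here is a minimal sketch in Python. The numbers are my own hypothetical illustration, not figures from the 1974 paper: a fairly accurate test for a rare condition still produces mostly false positives, which our mental shortcuts tend to miss.

```python
# Hypothetical illustration of base-rate neglect: intuition says a positive
# result from a 90%-accurate test means you're probably ill; Bayes' rule
# says otherwise when the condition is rare.

prevalence = 0.01           # 1% of people have the condition
sensitivity = 0.90          # P(positive | ill)
false_positive_rate = 0.10  # P(positive | healthy)

# Total probability of a positive test across the whole population
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)

# Bayes' rule: P(ill | positive)
p_ill_given_positive = sensitivity * prevalence / p_positive

print(f"P(ill | positive test) = {p_ill_given_positive:.1%}")  # ~8.3%
```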


A. Wilke and R. Mata recap cognitive biases as "people's systematic but purportedly flawed patterns of responses to judgment and decision problems." As in statistics, a bias is therefore an uncontrolled factor that makes us deviate from rational behaviour.
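To ground that statistical analogy, here is a minimal sketch (my own illustration, not from Wilke and Mata): a biased estimator, like a biased judgment, deviates from the true value in one systematic direction rather than just being noisy. The uncorrected sample variance, which divides by n instead of n - 1, reliably underestimates the true variance.

```python
import random

# Statistical bias in action: averaging many biased estimates does not
# recover the true value -- the error is systematic, not random noise.

random.seed(42)
n, trials = 5, 100_000
estimates = []
for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(n)]  # true variance = 1.0
    mean = sum(sample) / n
    estimates.append(sum((x - mean) ** 2 for x in sample) / n)  # / n, not / (n - 1)

print(f"Average estimate: {sum(estimates) / trials:.3f} (true variance: 1.0)")
# Comes out near (n - 1) / n * 1.0 = 0.8, systematically below the truth
```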


They notably intervene when we have a decision to make and the information to process is too complex, so we resort to heuristics. Heuristics are what allow you to intuitively know how to use a door you've never seen before, for example (M. Chammat, 2018). Kahneman and Tversky assert that heuristics are colored (biased) by several factors. These colors differ from one society or person to another and drive people to reach different conclusions or take different actions when faced with the same situation.


Why do they attract so much attention now?


As shown in the Google Trends graph at the beginning of the article, this is not only a matter that is increasingly researched (the Institute of Cognitive Studies of the French École normale supérieure was created in 2011, for example), but something we are more and more interested in. After Kahneman's Nobel Prize in 2002, awarded for the work he had done with the late Tversky, and especially since 2009, more articles and more interest have grown around this topic.


I believe that social alerts and crises, like misinformation or prejudice issues (sexism, racism...), have taught us - as social individuals - that some of us have thought patterns that can lead us to be irrational and harm others. The last great rush of information on the subject was probably due to the COVID-19 crisis and the governmental and environmental decisions that came with it. Gernot Wagner - and, honestly, me and probably lots of others - even considered these a model for understanding, in a more restricted time frame, the reactions and inaction surrounding climate change. This is the main reason I got interested in cognitive biases, and I believe understanding them is key to making better decisions about ourselves as individuals and as a group.


Cognitive biases play a significant role in climate inaction


The mapping below is possibly the most well-known categorisation of cognitive biases. Even though I will later study and write more about the impact of some of them on climate policies, here is a recap of what to know about them, and some examples of how they prevent us from acting immediately. They help us cope with four types of situations:

The cognitive biases mapping (John Manoogian III & Buster Benson)
  • Too much information: we try to reduce the flood of information we receive to the "essential". We'll better remember information that corresponds to a bias, and leave out the rest, considering it useless or even false. The confirmation bias - choosing to remember a piece of information supporting something we already thought was true, and not taking into account any contradictory signal - is one of the most famous and is widely studied in research and medical papers. A person who doesn't believe in climate change, for example, might only search for and believe information that tries to prove it wrong, and will discard any proof that their beliefs are mistaken. I personally know I have experienced it a lot, especially when choosing information on the Internet: when I don't know how to select sources, don't have the time to read all sides of an issue, or fear falling for yet another piece of "fake news". (Now, I try to read mainly research papers and documented studies.)

  • What should we remember: how we remember a past experience. Our memory stores bits of information that can differ from their initial form, because biases color them. Negativity bias, for example, explains why we remember bad news more intensely than good news. This is why one can be more afraid of losing than eager to gain, or why we feel more stimulated by bad news than good news in the media (see the short sketch after this list).

  • Need to act fast: our decisions to act are generally based on considerations other than "the right thing to do". An example I like is the status quo bias, the one that encourages us to stick to what is already in place or to what we know others are more likely to do. I believe this one is strongly linked to the observed behavioural inertia of climate inaction. It is also a way to save ourselves the energy of making an as-yet untried decision and risking unexpected side effects.

  • Not enough meaning: this is a particularly interesting category of biases, which I think shows our need to link our knowledge to ourselves and our story. We generally consider things, people or patterns we have experienced more, or that are close to us, to be more valid or better. An example is the normalcy bias. I wrote an article about the effect of normalcy bias on climate inaction, and some prospects we can have, that was reviewed and published in the Council of Business and Society's special 10th anniversary issue (pp. 62-67) - you can take a look at it here!
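As promised in the second bullet above, here is a small sketch of how the asymmetry between losses and gains is often modelled: the prospect-theory value function of Kahneman and Tversky weights losses more steeply than equivalent gains. The parameter values below are the ones commonly cited from their later work; the code itself is my own illustration, not theirs.

```python
# Prospect-theory-style value function: losses loom larger than gains.
# alpha = beta = 0.88 and lambda = 2.25 are the commonly cited estimates
# from Tversky & Kahneman (1992); this snippet is only an illustration.

ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses weigh ~2.25x gains

def subjective_value(outcome: float) -> float:
    """Perceived value of a gain or loss relative to a reference point."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * (-outcome) ** BETA

print(subjective_value(100))   # winning 100 feels like ~ +57.5
print(subjective_value(-100))  # losing 100 feels like ~ -129.3
```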


Learning to unlearn: cognitive debiasing material


Talking about imperfections, and about how these unconscious biases lead us to behaviours we should condemn or change but don't know how to (racism, sexism, climate inaction...), is uncomfortable. However, not talking about them and letting ourselves persevere in these behaviours would be a great problem, and can harm ourselves or those who are the victims of biased perceptions. Having a biased decision-making process can be really frustrating, especially if we feel the cognitive dissonance that arises from the gap between the urgency and our behaviour, and don't apprehend its source well enough to make less biased choices. I studied three separate - but linked - notions that I found helpful:

  • Cognitive debiasing - strategies studied in clinical research, notably for emergency room diagnosis, which, among other things, encourage us to raise awareness of our biases;

  • Mindfulness - not judging oneself, and experiencing one's emotions and environment more fully;

  • Unlearning - not being afraid to question one's own judgement, and considering doubt and disagreement as sources of knowledge instead of imperfections.


Cognitive debiasing through awareness


Some clinical research papers have already tried to understand how we could mitigate the risk of bad decisions caused by cognitive biases, mostly in emergency rooms. In an emergency context and an unstable environment, biases can cause misdiagnosis and decrease patient safety. Cognitive debiasing means switching from a Type 1 thinking process (pattern-recognition based, heuristic-driven and, all in all, quick and biased) to a Type 2 one (analytical, slower, less biased). Even though this can seem like a more short-term problem, whereas climate change is a long-term crisis, it is absolutely appropriate to study it here, as climate inaction often results from a series of punctual heuristic-driven decisions, whether individual or collective. I believe it also provides an ethical basis for studying the reasons to practice cognitive debiasing.


So what do I do now, Karen? I'll write about the concept in more articles, but the thing I want to emphasize here is the following: a necessary step in cognitive debiasing is identifying one's biases and the consequences that emerge from them. A lot of material to practice this exists. For example, I found that this course on LinkedIn Learning (a platform I never thought I'd use) - which inspired me to write this article - quickly sums up the basics of cognitive biases in the work environment, in a clear and rather insightful way.


Mindfulness and self-kindness


Mindfulness has several definitions, although I'll go with the following one: "that process of consciously making use of information relevant to the situation" (Langer, 1978). It means, all in all, that you're aware of your mental processes and can identify and correct their dysfunctions. Though mindfulness is a technique used to reduce anxiety and improve psychological well-being, I think the link with climate change is pretty direct: being more aware of our actions, emotions and decision-making can drive us to understand our biases and keep track of our initial goal.


So what do I do now? Although it is a term generally associated with meditation, I believe you don't have to meditate frequently to practice it. Having said that, having done yoga or meditation a few times before can help in understanding the philosophy behind it. What I try to do is notice when I have an emotional or way-too-quick response to a rational situation (if I'm upset someone said something, or anxious about a problem I don't know how to solve), and reflect upon it. I started the book How to cope with eco-anxiety by the psychologist Karine Saint-Jean, which guides me through this, and according to this quiz, I don't suck at it (I got a 72, I didn't expect that). Of course, you can also search for material on the matter, or even ask a therapist for advice.


Unlearning when necessary



The last notion I'd like to highlight is unlearning. Mariam Chammat, a neuroscience researcher at the French DITP (Interministerial Direction for Public Transformation), warns us in her TED Talk about the dangers of taking our knowledge for granted. Very often, we make assertions and decisions based on something we think we know, without questioning that knowledge. But this knowledge can be flawed or false, and the decisions we make accordingly will be just as flawed.


So what do I do now? To avoid this danger, M. Chammat encourages a "habit of doubt", to escape "illusions of knowledge". Mindfulness and unlearning should, in my opinion, go together: to be able to question one's knowledge, it is important to accept our limitations and not judge ourselves. Most false beliefs we hold don't come from ourselves or our identity, but rather from our education or our cognitive biases. They are not inherently bad, and we have learned to identify some of them through education or life experience, but climate change doesn't consistently call to us in such immediate ways.



There's always much more to know about these topics, so don't hesitate to tell me what article you expect next!

 

External Links

  1. Welcome to the Jungle, Series about cognitive biases in HR departments. Visit Online.

  2. M. S. Roul, (2021). Climate Inaction: Changing Normalcy. Global Voice magazine #18, 10th anniversary issue, Council of Business and Society, 62-67. Read Online.

  3. Samuelson, W., & Zeckhauser, R. (1988). Status Quo Bias in Decision Making. Journal of Risk and Uncertainty, 1(1), 7–59. Read Online.

  4. S. Gordon, (2017). Unconscious Biases. [Course]. LinkedIn Learning. Visit Online.


Bibliography

  1. Google Trends, number of searches (index 100), 18/09/2021. Visit Online.

  2. E. E. Smith, (2001). Cognitive Psychology: History. Read Online.

  3. Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. Read Online.

  4. Wilke, A., & Mata, R. (2012). Cognitive Bias. In V. S. Ramachandran (Ed.), Encyclopedia of Human Behavior (Second Edition), Academic Press, 531-535. ISBN 9780080961804. Visit Online.

  5. S. Behimehr, H. R. Jamali, (2020). Cognitive Biases and Their Effects on Information Behaviour of Graduate Students in Their Research Projects, Read Online.

  6. Giovanni Luca Ciampaglia, Filippo Menczer, (2018). Biases Make People Vulnerable to Misinformation Spread by Social Media. [Article], Scientific American, Read Online.

  7. Thomas H. Davenport, (2020). How to make better decisions about Coronavirus. [Article] MIT Sloan Management Review. Read Online.

  8. François Candelon, Yves Morieux, Michel Frédeau, Eric Boudier, Veronica Chau, and Rodolphe Charme di Carlo, (2020). A New Approach to the Intractable Problem of Climate Change. [Article], BCG. Read Online.

  9. Jm3, CC BY-SA 4.0, via Wikimedia Commons. See Online.

  10. List and descriptions of cognitive biases, The Decision Lab. [Website]. Visit Online.

  11. Croskerry, P., Singhal, G., & Mamede, S., (2013). Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ quality & safety, 22 Suppl 2(Suppl 2), ii58–ii64. Read Online.

  12. Croskerry, P., Singhal, G., & Mamede, S. (2013). Cognitive debiasing 2: impediments to and strategies for change. BMJ Quality & Safety, 22(Suppl 2), ii65–ii72. Read Online.

  13. Daniel, M., Carney, M., Khandelwal, S., Merritt, C., Cole, M., Malone, M., Hemphill, R. R., Peterson, W., Burkhardt, J., Hopson, L., & Santen, S. A. (2017). Cognitive Debiasing Strategies: A Faculty Development Workshop for Clinical Teachers in Emergency Medicine. MedEdPORTAL : the journal of teaching and learning resources, 13, 10646. Read Online.

  14. M. Chammat, (2018). Les illusions du savoir, un danger pour la collectivité. Read Online.

  15. Mindfulness Quiz [Link] Greater Good. Visit Online.



Follow @msoryae on Instagram to be informed of new articles!
