The mind behind the cognitive bias codex, and practical tips on tackling bias

Emergent Thinkers 5 minute Spotlight: with Buster Benson

Continuing my exploration of key future-proofing skills, I turn to critical thinking: thinking about how we think.

Research in cognitive science reveals that cognitive biases (originally Tversky & Kahneman, 1972)1 cause people to draw incorrect conclusions in countless ways. Arising from heuristics (shortcuts in reasoning), they heavily influence decision-making (Lockton, 2012). Getting to know Buster Benson, the man behind the inspired cognitive bias codex (fig 1.), which synthesises 200+ known biases into one cohesive story, has been on my professional bucket-list for over a year.

I asked Buster 9 questions: to share insight into one man’s mission to help us all see the forest for the trees (Goldminz, 2016), and to tap into his expertise on making the best use of the codex to improve learner and employee self-awareness of these decision-making influences.

Fig 1: The Cognitive Bias Codex: Buster’s work was illustrated by John Manoogian III (available from )

Insight into the mind behind the Cognitive Bias Codex

1. Buster, where does your interest in bias originate?

The answer to this is a bit weird. My drive to identify blind spots in my thinking was triggered when I was a kid (maybe 8 or 9 years old). I had recently moved to a new city, and a classmate invited me to an after-school youth group. We played a game where we gathered in a circle wearing blindfolds to receive instructions on things to do. Silly things mostly, but one that I recall involved peeling and eating a banana. After a while, I became aware of the sound of laughter. Eventually, I realised that everyone else had taken off their blindfolds and was laughing at me. I was super embarrassed, and ended up leaving, feeling a fool. The emotional weight of that experience built a defence mechanism against blind spots that continues to crank away 30+ years later.

Cognitive biases are one way that we can accidentally make fools of ourselves, so I’m interested in understanding them better. I told you it was a weird story! Now, one of my favourite questions to ask people is “What formative events in your life are you aware of that influenced building your character and values?” I love this question because many people haven’t thought about it, yet have some big stories to share.

2. You have a unique and broad skill set, spanning creative writing, data visualisation, and software engineering, as well as being a confessed obsessive self-tracker. What skills have facilitated your understanding of bias?

All of these interests are tied to my pursuit to understand bias. I also tend to make use of self-reflection, through private journaling, meditation and creative writing to explore the big questions. Self-tracking is also useful to remove the narrator in my head that comes up with incorrect stories. My first big self-tracking experiment was to test my prediction about how hard it would be to gain 10 pounds, then lose 10 pounds. It’s all about coming up with weird questions that you think will lead to surprising answers. If you’re looking for your blind spots, you’re always asking “what am I missing?” Each tool can provide a different glimpse into this mystery.

3. You explain there is a purpose to bias, that it can be positive as well as negative in the right contexts. Do you have any guilty-pleasure biases, and how do they add value in your life?

An interesting question I’ve never considered before! I do have guilty pleasure biases. One that comes immediately to mind is the bizarreness effect. This is based on our always-running pattern-matching attention filter latching onto bizarre things, promoting them to our conscious attention. I have a love of bizarreness, both as a result of this bias and because of its effect. I’m consciously and subconsciously aware that bizarre stories (like the one about the blindfold) are interesting, and that this bias incentivises me to share them.

I’m also obsessed with the mere-exposure effect because I think it can be used as a weapon so effectively. Ads are one way this happens. Frequent exposure to content means it becomes familiar, which increases our affinity to whatever the ad is for. The media is fumbling big time with this right now, by exposing us to every annoying thing that Trump does. We think we’re shaming him, but we’re just becoming more exposed and normalised to his content.

This approach can however be used productively, for example putting ideas that you want to absorb around your work space, or on your phone’s lock screen. I created a version of the cognitive bias cheat sheet for my phone that was pretty popular. If we become familiar with the idea of our cognitive limits, then we are one step closer to accepting them and developing a healthier relationship with them.

4. You provide very open documentation of your life manifesto and beliefs in your codex vitae. Why do you believe it to be so important that people openly ‘admit’ their biases and prejudices?

Admitting bias is very important! It is the only real action item that we can successfully achieve as it relates to improving our relationship with bias. I think of it in terms of incentives. Many of us know right from wrong, that racism and sexism are bad. At the same time, our desire to avoid such things creates blind spots because of confirmation bias. We avoid information we don’t want to believe and seek out that which confirms our beliefs. If we don’t want to believe we’re racist or sexist or at all complicit in systemic prejudices, then we will not see information that is evidence of that. By “admitting” bias publicly we begin to develop an honest bias, which is a chapter in my book that I’m currently writing. The basic idea is that we can’t avoid bias, so we should accept it, and be open to feedback and evidence that reveals our own bias so we can correct it. Such an approach commits us to be forever uncomfortable, which is the hard part.

Buster On Tackling Bias

5. Which of the biases do you believe are the most resilient, the most difficult to overcome?

A common misconception with biases is that you can overcome them. If you take a problem like information overload, you can see why it’s necessary for our brains to filter systematically, because if we noticed every piece of information, we’d become incapacitated by the firehose of mostly unimportant information. So if the filtering must happen before we notice it, it needs to happen with an “always on” part of our brains, which Kahneman calls System 1.

System 1 is fast, doesn’t require much energy, and yet is required to do a lot of the first pass filtering and analysing of the world. As a result, it uses simple heuristics such as depending on the context, working only with the things that come most easily to mind, amplifying the bizarre, noticing new and changing things more than old and unchanging things etc. Consequently, we access a much simpler trickle of information that we can practically use. If we tried to turn off our filter that only works with things that come easily to mind, we’d use up all of our energy thinking about unimportant tasks like deciding where to go for lunch.

Rather than focusing on the most resilient, I believe there’s a whole class of biases we can correct through practice. Take the Dunning-Kruger effect, which suggests that when we have low competency at a certain task, we’ll tend to overestimate our ability. Research has indicated that such over-estimation can be neutralised with attention.

6. Discounting the obvious psychopathic-type biases, if you could strike a bias off the list for the good of all humanity, what would it be and why?

In-group familiarity is the one I’d pick. Many of our cultural problems, both current and historical, are linked to us favouring our own group, whether along religious, national, racial, ethnic, or gender lines. If, at the flip of a switch, we moved from an ethnocentric mindset to a world-centric mindset, so we were all in the same tribe, I think 90% of our problems would go away relatively quickly.

7. Would humans, devoid of all forms of bias, become more machine-like?

No, not really. Machines have all the same biases, but unlike humans, machines and computers can scale a lot faster, thus causing more damage. So, if we somehow removed our biases, which is impossible, we wouldn’t become more machine-like, we’d just become more similar. What keeps us different is that we don’t like people different from us, and so we avoid them, or fight them, or try to do the opposite of what they’re doing. That leads to a wide diversity of cultures and people. If we were suddenly all cool with everyone else’s differences, then we wouldn’t put so much effort into being different to distinguish our identities. Such an outcome is a bizarre one I hadn’t considered before, and not fully thought through, but it’s the first thing that came to mind.

8. Addressing all the biases will take a lifetime of focus and effort. Which of the set would you recommend for priority attention, and why?

In the spirit of developing an honest bias, I’d start with the bias blind spot, and take heed of the following:

  1. Accept your own cognitive limitations
  2. Learn to be uncomfortable with the fact that you can’t give every option a fair shake
  3. Accept that your drive for meaning is going to conjure illusions that you think are real
  4. Recognise many decisions are made without all of the information
  5. Be aware your actions will always give preference to those you’re most fond of

After this, I talk about developing an honest bias in both a beginner mode and an advanced mode.

The beginner mode is to invite dissenting opinions to the table, welcome information that disconfirms your beliefs, and seek ways to see others as in your tribe rather than outside it. That’s not easy to do and could take half a lifetime. The beginner mode is more reactive, but you can do it alongside whatever you are already doing.

The advanced mode is to proactively seek to falsify your own beliefs by finding the most worthy opponents to them, to engage in fruitful disagreement. This mode requires putting aside some of the things you’re doing, and going on this wacky adventure instead.

9. Finally, any recommendations for students, educators or talent developers, to help address the negative biases that impact critical thinking?

I have two recommendations, applicable to everyone.

The first is to try to avoid exiling people and ideas. When we exile people and ideas from the conversation, we also warp them and make them look worse than they would appear if they were there to represent themselves directly. Avoid speculating about people and groups that you don’t like, as it triggers all kinds of uncharitable biases. I’ve written a short list of guidelines for fruitful dialogue.

Secondly I recommend keeping a public record of your beliefs and positions on current events and your topics of interest. I have been doing this for 7 years (see Buster’s codex vitae) and it’s been useful for 3 main reasons.

  1. When things happen in the world, instead of reacting immediately, it helps you take a step back and update your beliefs outside of the reactive moment, which is more likely to address the root problem
  2. You’ll quickly find that you don’t have very considered beliefs and positions on everything!
  3. It’s something you can put out there; ask others to hold you accountable to. This increases the likelihood that the people who challenge your beliefs understand your position more holistically, and encourages them to add to your belief system in a more productive way

I thank you, Buster, for sharing so enthusiastically and openly. Having access to the wonderful cognitive bias codex is one thing, but you have further provided very practical advice on how to make use of this tool, and your others, to influence how we actually think and make decisions. Good luck with the forthcoming book on how to conduct more honest and constructive arguments.

Benson, B. (2019) Why Are We Yelling?: The Art of Productive Disagreement. Published November 19, 2019.


1Amos Tversky and Daniel Kahneman published a series of seminal articles in the general field of judgment and decision-making, culminating in the publication of prospect theory in 1979 and contributing to Kahneman being awarded the Nobel Memorial Prize in Economics in 2002.


Benson, B. (2012- present) Beliefs: A living and dying record of my beliefs [blog]. Available at:  [Accessed 15/02/2019]

Benson, B. (2018) Guidelines for fruitful dialogue [blog]. Available at:   [Accessed 15/02/2019]

Goldminz, I. (2016) ‘How to make sense of Cognitive Biases [Benson]’ [blog]. Available at: [Accessed 5/03/2019]

Lockton, D. (2012) ‘Cognitive biases, heuristics and decision-making in design for behaviour change’, [Accessed 5/03/2019]

Kahneman, D. and Frederick, S. (2002) ‘Representativeness Revisited: Attribute Substitution in Intuitive Judgment’, in Gilovich, T., Griffin, D. and Kahneman, D. (eds.) Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press, pp. 51–52. ISBN 978-0-521-79679-8
