“Don’t believe everything you think.”
~ Bumper sticker
In the world of insights we tend to think about biases, heuristics, and System 1 and System 2 thinking in terms of how we interact with respondents and how we collect information from them. It is less common, but no less important, to consider how biases and heuristics influence our analysis and the insights we generate.
In this article we’ll look at the kind of impact these biases can have.
The problem is real
Bias kills. It is estimated that medical errors cause “100,000 unnecessary deaths” and “perhaps one million excess injuries” each year in the U.S. alone. One meta-analysis of investigations into the link between cognitive bias and misdiagnosis found cognitive bias to be implicated in, depending upon which study you looked at, between 37% and 77% of cases.
With so much at stake, it is comforting to know there are researchers in medicine seeking to find ways to combat biases in diagnosis and treatment. Similar work is occurring in the world of justice, where the number of wrongfully convicted continues to grow, revealing the extent to which bias can lead to outcomes that nobody wants.
By better understanding the role bias plays, the justice system has been able to move away from an unhelpful blame game and toward a more hopeful focus on achieving just outcomes. “Traditionally, prosecutorial decision making has been studied through a lens of fault, blame, and intentional wrongdoing,” writes law professor Alafair Burke. “Consistent with this lens, those who have studied the downsides of broad prosecutorial discretion have blamed bad prosecutorial decisions on overzealousness, flawed cultural and individual values, and a lack of moral courage.”
The problem is, the evidence doesn’t really support this. A simple story of malevolent prosecutors seeking convictions at any cost is attractive, because it follows a simple script of bad versus good, but the reality is more complex.
Burke is encouraged that a “growing literature seeks to attribute poor prosecutorial decision making to a set of information-processing biases that we all share, rather than exclusively to ethical or moral lapses. From this perspective, prosecutorial resistance to defense claims of innocence can be viewed as deep (and inherently human) adherences to the ‘sticky’ presumptions of guilt that result from various forms of cognitive bias that can impede the neutrality of prosecutors throughout their handling of a case.”
Having correctly identified the problem, it becomes possible to work toward solving it. In the insights industry we are not as advanced in combating bias as our fellow sense-makers in medicine or the law, and there is much we can learn from the solutions these disciplines, and others, have investigated.
The first step
The first step in confronting bias is creating awareness and admitting it is a problem. It is like step one in a twelve-step addiction program. I’ll start: “My name is Andrew and I have a problem with bias. All my analyses—including this one—are affected by my biases.”
Simply making people aware of bias, however, does not equip them to confront it. Education is necessary, but not sufficient. As Edinburgh-based physicians E.D. O’Sullivan and S.J. Schofield write, “while focused educational sessions seem an intuitive and practical approach to mitigating bias, the evidence to support this is mixed and there are certainly enough negative studies to suggest it would be a low-yield intervention at best.”
Burke confirms this is the case in the law as well: “commentators have continually called for increased prosecutorial training regarding the dangers of cognitive bias…. Unfortunately, the empirical evidence also suggests that cognitive bias is stubborn, and that education is an unlikely panacea.”
Basic education won’t change the situation. But education in the form of specific feedback on past decisions has been shown to be effective in reducing bias. “A systematic review of feedback across all medical areas (not solely diagnosis) concluded that feedback improves performance in selected settings, especially if the feedback is intensive,” Graber et al. report in their paper “Cognitive interventions to reduce diagnostic error: a narrative review.”
“Feedback also offers the potential to reduce errors by helping develop expertise,” they explain. “Feedback is also the key to reducing overconfidence, which in turn could open the door for clinicians to appreciate the possibility of their own errors and take actions to avoid them.”
Teamwork, doubt, and structured techniques such as checklists also hold promise as strategies for fighting bias. We’ll look at each of these in subsequent articles.
This article is adapted from Eureka! the science and art of insight, which is available now.