How Bias Affects Our Choices
We may think that our beliefs are based on solid facts and reason, but, in the words of George Gershwin’s old song, “It ain’t necessarily so.” The way our minds work actually limits our ability to discern the truth or make rational decisions. Psychologists have been studying the way we think for many years and have identified several biases that limit it.
Prejudices and preconceptions
We all tend to pick and choose information that supports our prejudices and preconceptions while ignoring a heap of evidence to the contrary. Right or wrong, climate change deniers provide the most obvious example today, focusing on a tiny minority of expert opinion and brushing aside the overwhelming consensus of climate scientists. Similarly, most of us don’t listen carefully to politicians’ speeches or analyze their policies in detail. Instead we pick up the fragments of information that reinforce our party preference.
This “confirmation bias” even penetrates scientific thinking. For instance, the evidence for the reality of paranormal phenomena is far stronger than the evidence for the effectiveness of many pharmaceutical drugs. And yet most scientists reject out of hand any suggestion that psychic abilities are real, because they don’t believe they’re possible. In less dramatic ways, confirmation bias distorts our thinking and decisions about many aspects of life. Despite my high level of education and career as an academic, I frequently catch myself doing this.
The attraction effect
Another source of bias in our thinking is the “attraction effect.” Imagine you’re comparing smartphone options, and are drawn to the cheaper Basic contract rather than the more expensive Advanced one because it meets your needs adequately. Now suppose you’re offered a third, Luxury, alternative that costs more than the Advanced contract but provides no additional benefits. Research shows that the presence of this third option increases the probability that you’ll choose the Advanced contract. One possible explanation is that the Luxury option makes the Advanced one easier to justify: you can claim you’ve got a bargain. Perhaps our decisions are normally biased towards options we can easily justify, rather than towards what is actually best for us.
The framing effect
Three decades ago, a third type of bias was identified. The “framing effect” leads us to make choices depending on how the information is presented. In one classic experiment, people were asked to imagine an outbreak of disease threatening a village of 600 people. Plan A would definitely save 200 lives, whereas Plan B would have a 1 in 3 chance of saving everyone, and a 2 in 3 chance of saving no one.
Most people chose Plan A. However, participants tended to make different decisions when the same Plans were presented another way. In this case, they were told that 400 people would die under Plan A, while Plan B remained unchanged. Surprisingly, most people then chose Plan B. The reason for this switch may be to do with the way we weigh risks, and, as with the attraction effect, the ease of justifying our choice. In the first case, there is the certainty of saving 200 lives versus a rather complex risk assessment that is hard to get our heads around. In the second case, it can be argued that Plan B might save 400 people from certain death.
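The two framings are statistically identical, which is what makes the switch in preference so striking. A minimal sketch of the arithmetic, using the numbers from the experiment above (the variable names are mine, not from the original study):

```python
VILLAGERS = 600

# Framing 1: "Plan A saves 200 lives for certain."
plan_a_saved = 200

# Plan B (same in both framings): a 1-in-3 chance everyone lives,
# a 2-in-3 chance no one does. Expected number saved:
plan_b_expected_saved = (1 / 3) * VILLAGERS + (2 / 3) * 0

# Framing 2: "400 people will die under Plan A" — the same outcome,
# just described in terms of deaths rather than lives saved.
plan_a_saved_reframed = VILLAGERS - 400

print(plan_a_saved, plan_b_expected_saved, plan_a_saved_reframed)
# All three work out to 200: the framings differ only in wording,
# yet most people prefer Plan A in the first and Plan B in the second.
```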
More kinds of bias
As if these biases were not enough, we have others too. One is the “sunk-cost fallacy” — our tendency to stick with a project even when it would be more rational to cut our losses and move on. Another is “feature creep” — a tendency to buy gadgets that have more features than we will ever use. More significant, perhaps, is the way emotions cloud our judgment. Inducing a sense of disgust, for example by showing people pictures that they find offensive, makes them more likely to make harsh moral judgments.
How can we best guard against these and other biases?
One approach that I try to apply to my own life is to live by my current beliefs, but to hold them lightly and to be aware of my emotional reactions. I try to remain open to the reality that I may be quite wrong, and to be flexible in the face of new information or ideas. It isn’t easy!
A second approach is to share our ideas with others and debate them. This can be very creative, provided we mix with people who don’t all think the same. When groups of five or six people take tests that involve logical deduction, the success rate is far higher than when individuals tackle them on their own. Even groups whose members have previously failed the test succeed by openly generating ideas and revising them in the light of criticism. The most successful groups have a clear goal of reaching agreement, encourage everyone to participate in the discussion, and consist of people who are sensitive to the feelings of others. The danger with this approach is that, if the members of a group are too similar in their beliefs, knowledge, and ways of thinking, the result can be “groupthink,” which stifles dissent, ignores alternative actions, and can lead to disastrous decisions.
For more information
See Dan Jones, “The Argumentative Ape,” New Scientist, 26 May 2012, pp. 33–36.
Malcolm Hollick is a QuestioningTheTruth.com contributor, the author of The Science of Oneness: A Worldview for the Twenty-First Century, and co-author, with Christine Connelly, of Hope for Humanity: How Understanding and Healing Trauma Could Solve the Planetary Crisis. For more information, visit their Human Solutions website.