Critical thinking and cognitive biases
MARK BATTERSBY
Department of Philosophy
Capilano University
North Vancouver, BC
Canada V7J 3H5
mbatters@capilanou.bc.ca
SHARON BAILIN
Education
Simon Fraser University
Burnaby, BC
Canada
bailin@sfu.ca
ABSTRACT: We argue that psychological research can enhance the identification of reasoning errors
and the development of an appropriate pedagogy to instruct people in how to avoid these errors. In
this paper we identify some of the findings of psychologists that help explain some common fallacies,
give examples of fallacies identified in the research that have not typically been identified in
philosophy, and explore ways in which this research can enhance critical thinking instruction.
1. INTRODUCTION
The obvious question, then, is: what, if anything, can such a descriptive cum explanatory account add to our understanding that might help us in thinking about and teaching critical thinking?
The findings of the various studies conducted by cognitive psychologists
detail an extensive range of cognitive errors which are common and predictable.
And many of the fallacies identified by informal logic can be seen as particular
instances or manifestations of certain of these cognitive biases. The fallacy of
popularity, for example, is likely an instance of the bandwagon effect -- the tendency
to do (or believe) things because many other people do (or believe) the same. And
the fallacy of hasty conclusion could result from any of the following: belief bias -- where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion; the clustering illusion -- the tendency to see patterns where none actually exist; and/or confirmation bias -- the tendency to search for or
interpret information in a way that confirms one's preconceptions. The elucidation
and detailing of various cognitive biases can give us a richer understanding of those
errors in reasoning which have already been identified by informal logicians.
Many cognitive biases, however, describe systematic errors in reasoning which are not among those traditionally highlighted by critical thinking theorists. Two examples are loss aversion – where the disutility associated with giving up an object is seen as greater than the utility associated with acquiring it – and recency bias – the tendency to weigh recent events more heavily than earlier events (such cognitive biases will be discussed in more detail in the next section). The cognitive bias literature can, then, add to the repertoire of reasoning errors which deserve attention from critical thinking theorists and instructors.
In addition to detailing a list of errors, the research on cognitive biases indicates that these errors are not only systematic and predictable but also extremely widespread and tenacious. These are not errors made occasionally by
people who have momentary lapses in their thinking. Nor are they necessarily the
result of people’s failure to understand the relevant logical norms. The research
provides convincing evidence that they are, rather, very common and extremely
difficult to resist. This is an aspect of cognitive biases that needs to be taken into
account in critical thinking instruction.
Another helpful contribution of the research is information about the conditions under which these errors are most likely to occur and about circumstances which can mitigate them. This type of information can
be useful for critical thinking instruction in providing a basis for the development of
strategies to help avoid these errors.
In addition to the guidance provided by the research itself, the explanatory
accounts offered by cognitive psychologists also give us a framework for understanding why we make these errors. The ubiquity and tenacity of cognitive
biases demonstrate that these are not simply errors in reasoning; they are errors
that persuade. The theoretical accounts offer an explanation for why it may be that
we are persuaded by them.
1 This type of thinking has been referred to variously as automatic, experiential, heuristic, implicit,
associative, intuitive, and/or impulsive (Evans, 2008).
2 This type of thinking has been referred to variously as controlled, rational, systematic, explicit,
analytic, conscious, and/or reflective (Evans, 2008). See Evans for an overview of a number of dual-
systems theories of reasoning and cognition.
When asked to recall the story afterwards, participants associated the word "pickpocket" with the story more frequently than they did the word "sights", despite the fact that "sights" appeared in the story while "pickpocket" did not. The juxtaposition of
the ideas of a lost wallet, New York, and crowds prompted participants to infer a coherent causal story to explain the loss of the wallet, despite the lack of any mention of theft in the story itself.
3 In Reason in the Balance (Bailin & Battersby, 2010), we define a fallacy as an argument pattern
whose persuasive power greatly exceeds its probative value (i.e., evidential worth). We then describe
each fallacy in terms of two aspects: 1. “logical error” – an explanation of why the argument has
limited or no probative value, and 2. “rhetorical effect” (which we would now choose to call “persuasive effect”) – an explanation of why the argument has a tendency to be persuasive.
The list of errors in reasoning identified by the cognitive science research which go
beyond those typically identified by Informal Logic is too lengthy to detail here. We
shall, instead, focus on one of the most striking discoveries by Kahneman and
Tversky, the phenomenon of anchoring -- the influence of irrelevant initial
information when estimating a value or making a judgment. In the standard research example, subjects are given a random number, one which they know to be random, and are then asked questions such as how many of the member states of the UN are African. Those given a larger number guess a relatively larger number of African states, and those given a smaller number estimate a smaller number (the sketch following this paragraph gives a schematic version of this design). We all recognize that when negotiating, it is common practice for the seller to price her object high and for the buyer to lowball. But these strategies, while they may exploit the phenomenon of anchoring, also introduce relevant considerations: they give us some idea of what price the seller or buyer is seeking. What is striking about the phenomenon of anchoring is that the anchoring numbers are known by the subjects to be irrelevant. This might seem to be just a curious quirk of human psychology, but a number of studies have demonstrated
that it is a phenomenon with profound social implications.
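The following sketch (in Python) is purely illustrative: it is not the original experiment or its data, but a schematic of how an anchoring study of the kind just described might be set up and summarized. The anchor values, the strength of the anchoring "pull", and the simulated estimates are all hypothetical.

    # Illustrative sketch of an anchoring-style design (hypothetical values throughout).
    import random
    import statistics

    def simulated_estimate(anchor: int) -> float:
        """One participant's estimate of how many UN member states are African,
        nudged toward an anchor the participant knows to be random."""
        unanchored_belief = random.gauss(30, 8)   # what they would say with no anchor (invented)
        pull = 0.35                               # hypothetical strength of the anchoring effect
        return (1 - pull) * unanchored_belief + pull * anchor

    random.seed(0)
    low_group = [simulated_estimate(anchor=10) for _ in range(100)]
    high_group = [simulated_estimate(anchor=65) for _ in range(100)]

    print("mean estimate, low anchor (10): ", round(statistics.mean(low_group), 1))
    print("mean estimate, high anchor (65):", round(statistics.mean(high_group), 1))
    # A systematic gap between the two group means, despite the anchors being known
    # to be random, is the signature of the anchoring effect described above.

The point of the sketch is only to make the structure of the finding concrete: any difference between the group means is driven entirely by a number that the participants themselves know to be uninformative.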
In one study, for example, German researchers examining the effects of
anchors on judicial decision-making were able to show that even trained judges, knowing that the information they were given was irrelevant, were still influenced by it in their sentencing decisions (Englich, Mussweiler & Strack, 2006).
In what ways might this research inform and help to enhance critical thinking
instruction? Cognitive psychological accounts suggest that noticing that we are
succumbing to the influence of a cognitive bias is actually quite difficult. As
Kahneman suggests, "The best we can do is … learn to recognize situations in which
mistakes are likely and try harder to avoid significant mistakes when the stakes are
high" (Kahneman, 2011, p. 28).
Recognizing certain inferences as errors is certainly a sine qua non for
avoiding such mistakes, and critical thinking pedagogy has focused effectively on
this task. It is not sufficient, however. The cognitive bias research has demonstrated
just how strong and ubiquitous these tendencies are. Thus we would argue that
helping students to see the naturalness and allure of cognitive biases would be
important for helping them to resist their pull. In particular, we have argued for the
need to teach students to identify fallacies not only in terms of the errors they
commit but also in terms of their persuasive power.4
4 See note #3.
One of the most important points to emerge from the cognitive bias literature
with implications for pedagogy is the necessity to put the brakes on our tendency to
rush to inference under certain circumstances. Dealing with complex mental
challenges and drawing complex inferences requires the kind of deliberate,
controlled, and effortful thinking characteristic of System 2 or slow thinking. Thus
what is required when trying to make a judgment is a conscious attempt to make
our thinking more deliberate. Strategies such as following a procedure or a set of
guiding questions (Bailin & Battersby, 2010, pp. 19-38) and consciously monitoring
our thinking process (Bailin & Battersby, 2010, pp. 201-202) are essential aspects of
rational decision making.
In addition, it is possible to institute strategies to counter the effects of some
of these quick inferential tendencies. The tendency to make confident judgments on
the basis of limited evidence seems to be particularly strong and one manifestation
of this tendency is the failure to look at both sides of an issue or to seek alternatives
(sometimes called “my side bias” by cognitive psychologists). The common habit of
philosophers of seeking counterexamples to any claim is a crucial antidote to this tendency. The strategy of actively seeking out counter-evidence to one's views, looking for and seriously considering the arguments on various sides of an issue, and deliberately considering alternative positions when making a judgment can go a long way toward countering this tendency to rush to judgment. The development
of the habit of considering counterexamples and alternatives is a crucial aspect of
critical thinking instruction and is necessary in order to frustrate the natural
tendency to leap to conclusions.
The cognitive bias research has also served to highlight the power of the
framing effect – the tendency to draw different conclusions from the same information, depending on how that information is presented (for example, people are more likely to accept a risk if they are told that there is a 10% chance of winning rather than a 90% chance of losing, even though the two descriptions are equivalent; see the brief check following this paragraph). Deliberately attempting to reframe or change
the way one views a situation may be helpful in countering this tendency. For
example, one can attempt to view marijuana use as a harm issue rather than as a
crime issue and see what effect this has on one’s judgment about the legalization of
marijuana. The question then becomes: how do the harms resulting from illegality
compare to any reasonably anticipated harms to health? When engaging in
argumentation, one can try to view the enterprise in terms of making the best
judgment rather than in terms of winning or losing. And trying to identify with being
reasonable rather than with a particular view can be a helpful strategy for
developing open-mindedness and fair-mindedness in inquiry (Bailin & Battersby,
2010, p. 201).
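As a minimal arithmetical check of the framing example above (in Python; the stakes are invented, and only the two probabilities come from the example), the two descriptions pick out the very same prospect:

    # The two framings from the example above describe the same prospect:
    # a 10% chance of winning is simply the complement of a 90% chance of losing.
    p_win, p_lose = 0.10, 0.90
    assert abs((p_win + p_lose) - 1.0) < 1e-9   # the descriptions are logically equivalent

    # With hypothetical stakes attached, the expected value is identical under either
    # framing, so any difference in willingness to accept the gamble is an effect of
    # presentation, not of the numbers themselves.
    win_amount, loss_amount = 100.0, -10.0      # invented for illustration
    expected_value = p_win * win_amount + p_lose * loss_amount
    print(f"Expected value under either framing: {expected_value:.2f}")   # 1.00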
The bias of overconfidence – the tendency to have more confidence in one’s
judgment than is warranted by the weight of evidence – is another common
cognitive bias which may be somewhat mitigated through deliberate efforts. The
strategies outlined above for promoting an examination of the full range of arguments on all sides of an issue are necessary in order to make a judgment with the appropriate degree of confidence, as is making students aware of the need to give explicit consideration to how much weight various arguments carry in making an overall judgment (Bailin & Battersby, 2010, pp. 180-181; Battersby & Bailin, 2011a, pp. 154-157).
An important concept which runs through the cognitive bias literature is that
of mental effort. Fast thinking is quick and easy, virtually effortless, but slower, more deliberate thinking requires more mental effort. Kahneman and others have suggested that our minds have a tendency to go for the easier route much of the time (Kahneman, 2011, pp. 39-49). For example, the research has shown repeatedly that people have a strong tendency to see an erroneous answer to a simple math problem as correct, or an invalid syllogism as valid when the conclusion is believable (the belief bias error) (Evans, 2008); a standard example is the invalid but believable syllogism "All roses are flowers; some flowers fade quickly; therefore some roses fade quickly." The intuitive answer suggests itself immediately, and people generally do not bother to check the reasoning. These are cases in which the reasoning could be checked without too much difficulty (see the brief sketch following this paragraph). Nonetheless, overriding the intuitive response requires some mental work, and most people do not appear to be initially inclined to put in this effort.
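One standard example of the kind of simple problem at issue is the bat-and-ball puzzle discussed by Kahneman (2011): a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball. The intuitive answer – that the ball costs 10 cents – can be checked in a line or two of arithmetic, as the following sketch (in Python, using the puzzle's own values) illustrates:

    # Checking the bat-and-ball puzzle: does a proposed ball price satisfy both
    # conditions (the two items total $1.10, and the bat costs $1.00 more than the ball)?
    def satisfies_puzzle(ball: float) -> bool:
        bat = ball + 1.00                        # the bat costs $1.00 more than the ball
        return abs((bat + ball) - 1.10) < 1e-9   # do the two really total $1.10?

    print(satisfies_puzzle(0.10))   # False: the intuitive answer fails the check
    print(satisfies_puzzle(0.05))   # True: the correct answer (5 cents) passes

The check is trivial; what the cognitive bias literature suggests is that fast thinking delivers the appealing answer so fluently that most people never perform it.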
An important idea for our pedagogical purposes is Kahneman’s argument
that this failure is due at least in part to insufficient motivation (2011, p. 46). Indeed,
the fact that many people willingly put considerable mental effort into certain
activities (e.g., Sudoku) when they find them interesting and engaging suggests that
a task can elicit mental energy when it is seen as being worth the effort. Thus one of
our challenges as educators is to help students to see thinking critically as being
worth the mental effort.
REFERENCES
Bailin, S. & Battersby, M. (2010). Reason in the Balance: An Inquiry Approach to Critical
Thinking. Whitby, Ont.: McGraw-Hill.
Battersby, M. & Bailin, S. (2011a). Guidelines for reaching a reasoned judgment. In J. A. Blair
and R. H. Johnson (Eds.). Conductive Argument: An Overlooked Type of Defeasible
Reasoning. London: College Publications, pp. 145-157.
Battersby, M. & Bailin, S. (2011b). Fallacy identification in a dialectical approach to teaching critical thinking. In OSSA 2011, The Ontario Society for the Study of Argumentation.
Brenner, L., Koehler, D. & Tversky, A. (1996). On the evaluation of one-sided evidence.
Journal of Behavioral Decision Making, 9: 59-70.
Englich, B., Mussweiler, T. & Strack, F. (2006). Playing dice with criminal sentences: The
influences of irrelevant anchors on experts’ judicial decision making. Personality and
Social Psychology Bulletin, 32: 188.
Evans, J. (2008). Dual processing accounts of reasoning, judgments and social cognition.
Annual Review of Psychology, 59: 255-278.
Hassin, R., Bargh, A. & Uleman, J. (2002). Spontaneous causal inference. Journal of
Experimental Social Psychology, 38: 515-22.
Kahneman, D. (2011). Thinking, Fast and Slow. London: Penguin.
Kahneman, D., Slovic, P. & Tversky, A. (Eds.). (1982). Judgment Under Uncertainty: Heuristics
and Biases. New York: Cambridge University Press.
Slovic, P. (1969). Analyzing the expert judge: A descriptive study of a stockbroker’s decision
processes. Journal of Applied Psychology, 53: 255-263.
Slovic, P., Fischhoff, B. & Lichtenstein, S. (1977). Behavioral decision theory. Annual Review
of Psychology, 28: 1 – 39.
Stanovich, K. (2011). Rationality and the Reflective Mind. Oxford: Oxford University Press.
Tversky, A. & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction
fallacy in probability judgment. Psychological Review, 90: 293-315.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty. Science, 185: 1124-1131.
Tversky, A. & Kahneman, D. (1973). Availability: A heuristic for judging frequency and
probability. Cognitive Psychology, 5: 207-232.
Walton, D. (2010). Why fallacies appear to be better arguments than they are. Informal Logic, 30(2): 159-184.
Wason, P. C. & Shapiro, D. (1971). Natural and contrived experience in a reasoning problem. Quarterly Journal of Experimental Psychology, 23: 63-71.
Wason, P. C. (1966). Reasoning. In B. M. Foss (Ed.), New Horizons in Psychology. Harmondsworth: Penguin.