Critical thinking and cognitive biases
MARK BATTERSBY
Department of Philosophy
Capilano University
North Vancouver, BC
Canada V7J 3H5
mbatters@capilanou.bc.ca

SHARON BAILIN
Education
Simon Fraser University
Burnaby, BC
Canada
bailin@sfu.ca

ABSTRACT: We argue that psychological research can enhance the identification of reasoning errors
and the development of an appropriate pedagogy to instruct people in how to avoid these errors. In
this paper we identify some of the findings of psychologists that help explain some common fallacies,
give examples of fallacies identified in the research that have not been typically identified in
philosophy, and explore ways in which this research can enhance critical thinking instruction.

KEYWORDS: critical thinking, psychological research, cognitive biases, reasoning errors

1. INTRODUCTION

A primary aim of critical thinking research and teaching is to improve human reasoning with the intent of getting people to be more rational with respect to their
beliefs and actions. For the Informal Logic/critical thinking community, this effort
has largely taken the form of analyzing the structure of arguments and identifying
certain types of errors or problems in reasoning, in particular those commonly
identified as fallacies. The focus is on exposing the nature of the error -- showing
why these particular arguments are fallacious. The pedagogical assumption
underlying this focus is that once people are aware of these errors, they will notice
them in the arguments of others and be able to resist them, and that they will avoid
making these errors themselves.
Much valuable work has been done in this area, including contributions to an
understanding of the nature of fallacies, the identification and characterization of a
growing number of fallacies, and innumerable rich ideas and strategies for teaching
critical thinking. The identification of reasoning errors, in this context, has been
based largely on the work of philosophers studying arguments and not on empirical
studies of reasoners. In addition, relatively little work has been done by

philosophers (with some notable exceptions, e.g., Walton, 2010) on trying to understand why these errors are so common and persuasive.
Since the 1970s, however, much important work on human reasoning has
also been done by psychologists who have undertaken systematic empirical studies
of reasoning errors and produced many insightful accounts of these errors (Wason,
1966, 1971; Tversky & Kahneman, 1974; Slovic, 1969, 1977; Kahneman, Slovic, &
Tversky, 1982; Stanovich, 2011; Kahneman, 2011). Some of these errors map onto
identified informal logic fallacies, but some of them have not been previously
identified by philosophers.
The critical thinking community has, however, by and large given little
attention to the work of these cognitive psychologists. It is our contention that this
work can make a contribution both to reflection on reasoning errors and to the
development of an appropriate pedagogy to instruct people in how to avoid these
errors.
In this paper, we explore some of the intersections between this
psychological research on reasoning and the work of critical thinking theorists, as
well as the implications of this research for conceptualizing and teaching critical
thinking. The paper addresses this theme in terms of the following aspects:
• what this work can add to our understanding of reasoning errors in general, and of the reasoning errors identified by critical thinking theorists in particular
• which reasoning errors identified by this research are not typically identified by the critical thinking community
• the ways in which this research can inform and help to enhance critical thinking instruction.

2. PSYCHOLOGICAL VERSUS PHILOSOPHICAL ACCOUNTS

Although both philosophers and psychologists offer detailed accounts of reasoning errors, there are important differences between the accounts. Philosophical
accounts are primarily normative. The work of philosophers has consisted in
specifying the norms of logical reasoning as well as identifying errors of reasoning
which are common in arguments and showing in what way they are logically
erroneous or epistemologically deficient.
The accounts of cognitive psychologists, in contrast, are largely descriptive,
and to some extent explanatory. Their work consists in conducting empirical studies
of people engaged in tasks that require reasoning and critical thinking. By means of
these studies, they have been able to identify errors that are commonly made,
identify patterns in the types of errors made which reflect cognitive biases (errors
which are systematic and predictable), amass evidence regarding the frequency and
tenacity of such errors, and investigate the circumstances which tend to be
correlated with their occurrence. In addition, based on the data accumulated, some
cognitive psychologists have also proposed explanatory accounts of these cognitive
biases in terms of their likely origins as well as a conceptual framework for
understanding how they function.


3. ENHANCED UNDERSTANDING OF REASONING ERRORS

The obvious question, then, is what, if anything, can such a descriptive cum
explanatory account add to our understanding that might help us in thinking about
and teaching critical thinking?
The findings of the various studies conducted by cognitive psychologists
detail an extensive range of cognitive errors which are common and predictable.
And many of the fallacies identified by informal logic can be seen as particular
instances or manifestations of certain of these cognitive biases. The fallacy of
popularity, for example, is likely an instance of the bandwagon effect -- the tendency
to do (or believe) things because many other people do (or believe) the same. And
the fallacy of hasty conclusion could be a result of any of: belief bias -- where
someone's evaluation of the logical strength of an argument is biased by the
believability of the conclusion; clustering illusion -- the tendency to see patterns
where actually none exist; and/or confirmation bias -- the tendency to search for or
interpret information in a way that confirms one's preconceptions. The elucidation
and detailing of various cognitive biases can give us a richer understanding of those
errors in reasoning which have already been identified by informal logicians.
Many cognitive biases, however, describe systematic errors in reasoning which are not among those traditionally highlighted by critical thinking theorists. Two examples are loss aversion, where the disutility associated with giving up an object is seen as greater than the utility associated with acquiring it, and recency bias, the tendency to weigh recent events more heavily than earlier events (such cognitive biases will be discussed in more detail in the next section). The cognitive bias literature can, then, add to the repertoire of reasoning errors which deserve attention from critical thinking theorists and instructors.
In addition to detailing a list of errors, what the research on cognitive biases
also indicates is that these errors are systematic and predictable, but also extremely
widespread and very tenacious. These are not errors that are made occasionally by
people who have momentary lapses in their thinking. Nor are they necessarily the
result of people’s failure to understand the relevant logical norms. The research
provides convincing evidence that they are, rather, very common and extremely
difficult to resist. This is an aspect of cognitive biases that needs to be taken into
account in critical thinking instruction.
Another helpful aspect that arises from the research is information regarding
under what conditions these errors are most likely to occur and whether there are
circumstances or conditions which can mitigate them. This type of information can
be useful for critical thinking instruction in providing a basis for the development of
strategies to help avoid these errors.
In addition to the guidance provided by the research itself, the explanatory
accounts offered by cognitive psychologists also give us a framework for attempting
to understand why we make these errors. The ubiquity and tenacity of cognitive
biases demonstrate that these are not simply errors in reasoning; they are errors
that persuade. The theoretical accounts offer an explanation for why it may be that
we are persuaded by them.


These accounts differ from those generally offered by philosophers, which tend to view the primary source of human unreason as the emotions (the
explanations of reasoning errors offered in contemporary textbooks, for example,
tend to be in terms of ego involvement or ethnocentrism). While not denying that
emotional sources can often be a cause of irrationality, the work of cognitive
scientists has shown that many reasoning errors are grounded primarily in natural
reasoning processes.
What many psychologists have argued is that humans have, over time, evolved a set of quick inference tendencies which allow a rapid, almost immediate response or reaction. Some examples of these quick inferences are detecting hostility in a voice, driving a car on an empty road, understanding a simple sentence, or answering a simple math problem. Some of these fast mental activities are innate and automatic while others are based on skills and knowledge which have become automatic through prolonged practice (e.g., driving on an empty road, solving a simple math problem) (Kahneman, 2011, pp. 21-24). This type of thinking is referred to by Kahneman (2011) as System 1 or fast thinking.1 This type of quick
inference-making is sufficiently reliable to stand us in good stead in many
circumstances, providing quick and generally appropriate initial reactions to
challenges under routine conditions. But such fast thinking can also lead to cognitive
biases as these immediate, unreflective inference-tendencies are not adequate to the
task of dealing with more complex challenges. Tasks such as performing complex
calculations, monitoring the appropriateness of one’s behaviour, comparing items
for overall value, or checking the validity of a complex logical argument require
attention, deliberate mental effort, and conscious reasoning. This type of more
deliberate, controlled, and effortful thinking is referred to by Kahneman as System 2
or slow thinking.2 According to Kahneman, slow thinking is required in order to
avoid cognitive biases.

1 This type of thinking has been referred to variously as automatic, experiential, heuristic, implicit, associative, intuitive, and/or impulsive (Evans, 2008).
2 This type of thinking has been referred to variously as controlled, rational, systematic, explicit, analytic, conscious, and/or reflective (Evans, 2008). See Evans (2008) for an overview of a number of dual-systems theories of reasoning and cognition.

So why are cognitive biases so persuasive? The two systems theory would
suggest that they persuade us because they arise from natural inferential
tendencies. These tendencies are quick and cognitively easy and are generally the
first line of attack when we are faced with cognitive challenges. Moreover, it is
rational in many circumstances to rely on these tendencies; they are what allow us
to function most of the time. But they can lead to errors in some circumstances and
it is important in such circumstances to institute strategies to become more
controlled and deliberate. The cognitive bias research suggests that this is not
always easy as fast thinking occurs automatically. But it is possible.
While these theoretical accounts provide a plausible explanation of the
persuasive power of cognitive biases in general, accounts of particular cognitive
biases may also help us understand why particular errors are persuasive. This is an
element that has been missing in most accounts of fallacies in the critical thinking literature. Fallacies are typically identified in terms of what is erroneous about them. But fallacies are not just any errors in reasoning; they are persuasive errors
(Battersby & Bailin, 2011; Walton, 2010). It is the existence of underlying cognitive
biases which makes the fallacious inferences tempting. Thus we would argue for the
need to conceptualize fallacies not only in terms of the errors they exemplify, but
also in terms of their persuasive power.3 Understanding why particular fallacies
persuade us provides us with a tool for helping us to resist their thrall.

3 In Reason in the Balance (Bailin & Battersby, 2010), we define a fallacy as an argument pattern whose persuasive power greatly exceeds its probative value (i.e., evidential worth). We then describe each fallacy in terms of two aspects: 1. “logical error” – an explanation of why the argument has limited or no probative value, and 2. “rhetorical effect” (which we would now choose to call “persuasive effect”) – an explanation of why the argument has a tendency to be persuasive.

For example, while philosophers have identified the error of making hasty
generalizations based on anecdotal evidence, cognitive psychologists have identified
the cognitive bias of the “availability heuristic”: estimating what is more likely by what is more available in memory, which is biased toward vivid, emotionally charged, or easily imagined examples (e.g., a plausible story). In a famous study, Tversky and Kahneman (1983) asked which was more likely:
1. a massive flood somewhere in North America this year, in which more than
1000 people drown
2. an earthquake in California sometime this year, causing a flood in which
more than 1000 people drown.
Despite the fact that what is described in statement #2 is included in statement #1, a
large percentage of people found statement #2 more likely since the latter provides
a more plausible and easily imagined story. The philosophical accounts identify this
reasoning as an error; the psychological accounts tell us that we tend to be
persuaded by this particular error because people generally have a strong tendency
to make judgments of likelihood on the basis of ease of imagining an event, an ease
which can be much facilitated by a plausible story (Kahneman, 2011, pp. 159-160).
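To make explicit why the popular answer cannot be right, here is a brief formal gloss in our own notation (it is not part of the original study): an earthquake-caused flood in California with more than 1000 drownings is itself a massive North American flood with more than 1000 drownings, so statement #2 describes the conjunction of an earthquake with the flood described in statement #1, and the conjunction rule of probability gives

\[
P(\text{earthquake} \wedge \text{flood}) \le P(\text{flood}),
\]

so statement #2 can never be more probable than statement #1, whatever values the individual probabilities take. The vivid earthquake scenario raises the availability of the story, not its probability.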
Another example is provided by the fallacy of questionable cause. This fallacy has long been pointed out by critical thinking theorists, but the tendency to commit it can be seen as grounded in the strong tendency, identified by psychologists, to see causal relationships even between unrelated events in order to make a coherent story. This phenomenon is nicely illustrated by an experiment by
Hassin, Bargh, & Uleman (2002) in which participants were given the following to
read:

After spending a day exploring beautiful sights in the crowded streets of New York, Jan discovered that her wallet was missing.

When asked to recall the story afterwards, participants associated the word
pickpocket with the story more frequently than they did the word sights despite the
fact that sights appeared in the story while pickpocket did not. The juxtaposition of
the ideas lost wallet, New York, and crowds prompted participants to infer a
coherent causal story to explain the loss of the wallet despite the lack of any evidence presented in the story to support this inference.
An important aspect of System 1 or fast thinking highlighted by cognitive
psychologists is that it is coherence-seeking – it is prone to construct a coherent
story out of whatever information is available, whatever its quality and however
limited. A common error in reasoning which is a result of this tendency is jumping to
conclusions (hasty conclusion), and a particularly troubling manifestation is the
failure to look at both sides of an issue or to seek alternatives. A striking illustration
of this phenomenon is provided by one study (Brenner, Koehler, & Tversky, 1996)
in which participants had to make a decision based on one-sided evidence. All the
participants were given the same scenarios providing background material to a legal
case, but then one group heard only a presentation by the defence lawyer, one group
heard only a presentation by the prosecutor, and one group heard both
presentations (each lawyer framed the issue differently but neither presented any
new information). Despite the fact that all the participants were fully aware of the
setup and could easily have generated the argument for the other side, the
presentation of the one-sided evidence had a significant effect on the judgments.
Moreover, the consideration of only one side of the issue also resulted in the
bias of overconfidence. The participants who heard one-sided evidence were more
confident of their judgments than those who heard both sides. This is not surprising
as it is easier to construct a coherent story with less information. The strength of
this tendency to make confident judgments based on limited evidence is a robust
and significant finding of the cognitive bias research and strongly suggests the need
for deliberate measures and strategies to counter this tendency.

4. IDENTIFYING ADDITIONAL ERRORS IN REASONING

The list of errors in reasoning identified by the cognitive science research which go
beyond those typically identified by Informal Logic is too lengthy to detail here. We
shall, instead, focus on one of the most striking discoveries by Kahneman and
Tversky, the phenomenon of anchoring -- the influence of irrelevant initial
information when estimating a value or making a judgment. In the standard
research example, subjects are given a random number, a number which they know is random, and are then asked questions such as how many of the member states in the UN are African. Those given a larger number guess a relatively larger number of African states, and those given a smaller number estimate a smaller number. We all recognize that when negotiating, it is common practice for the seller to price her object high and for the buyer to try to lowball. But these strategies, while they may exploit the phenomenon of anchoring, also introduce relevant considerations: they give us some idea what price the seller or buyer is seeking. What is striking about the phenomenon of anchoring is that the anchoring numbers are known to the subjects to be irrelevant. This might seem to be merely a quirky fact about human psychology, but a number of studies have demonstrated
that it is a phenomenon with profound social implications.
In one study, for example, German researchers examining the effects of
anchors on judicial decision-making were able to show that even trained judges, knowing that the information they were given was irrelevant, were still influenced in their decision-making in a manner similar to the naïve subjects described above.
The researchers ran a number of different experiments providing the judges with
information of varying degrees of relevance. In one example, participants were presented with a realistic case description of an alleged rape and were told that, during a court recess, they had received a telephone call from a journalist who asked, "Do you think that the sentence for the defendant in this case will be higher or lower than 1 (or 3) years?" Subsequently, they were asked for their own decision and also asked how certain they felt about the decision. Participants who had been exposed to the high anchor chose considerably higher sentences (mean 33 months, standard deviation 9.6) compared to those with the low anchor (mean 25 months, standard deviation 10), and participants generally felt fairly certain about the decision. Other experiments have yielded similar, troubling results (Englich, Mussweiler, & Strack, 2006).

5. ENHANCING CRITICAL THINKING INSTRUCTION

In what ways might this research inform and help to enhance critical thinking
instruction? Cognitive psychological accounts suggest that noticing that we are
succumbing to the influence of a cognitive bias is actually quite difficult. As
Kahneman suggests, "The best we can do is … learn to recognize situations in which
mistakes are likely and try harder to avoid significant mistakes when the stakes are
high" (Kahneman, 2011, p. 28).
Recognizing certain inferences as errors is certainly a sine qua non for
avoiding such mistakes, and critical thinking pedagogy has focused effectively on
this task. It is not sufficient, however. The cognitive bias research has demonstrated
just how strong and ubiquitous are these tendencies. Thus we would argue that
helping students to see the naturalness and allure of cognitive biases would be
important for helping them to resist their pull. In particular, we have argued for the
need to teach students to identify fallacies not only in terms of the errors they
commit but also in terms of their persuasive power.4

4 See note #3.

One of the most important points to emerge from the cognitive bias literature
with implications for pedagogy is the necessity to put the brakes on our tendency to
rush to inference under certain circumstances. Dealing with complex mental
challenges and drawing complex inferences requires the kind of deliberate,
controlled, and effortful thinking characteristic of System 2 or slow thinking. Thus
what is required when trying to make a judgment is a conscious attempt to make
our thinking more deliberate. Strategies such as following a procedure or a set of
guiding questions (Bailin & Battersby, 2010, pp. 19-38) and consciously monitoring
our thinking process (Bailin & Battersby, 2010, pp. 201-202) are essential aspects of
rational decision making.
In addition, it is possible to institute strategies to counter the effects of some
of these quick inferential tendencies. The tendency to make confident judgments on
the basis of limited evidence seems to be particularly strong and one manifestation
of this tendency is the failure to look at both sides of an issue or to seek alternatives
(sometimes called “my side bias” by cognitive psychologists). The common habit of
philosophers of seeking counterexamples to any claim is a crucial antidote for this
tendency. The strategy of actively seeking out counter-evidence to one’s views, looking for and seriously considering the arguments on various sides of an issue, and deliberately considering alternative positions when making a judgment can go a long way toward countering this tendency to rush to judgment. The development
of the habit of considering counterexamples and alternatives is a crucial aspect of
critical thinking instruction and is necessary in order to frustrate the natural
tendency to leap to conclusions.
The cognitive bias research has also served to highlight the power of the
framing effect -- the tendency to draw different conclusions from the same
information, depending on how that information is presented (for example, people
are more likely to accept a risk if they are told that there is a 10% chance of winning
rather than a 90% chance of losing). Deliberately attempting to reframe or change
the way one views a situation may be helpful in countering this tendency. For
example, one can attempt to view marijuana use as a harm issue rather than as a
crime issue and see what effect this has on one’s judgment about the legalization of
marijuana. The question then becomes: how do the harms resulting from illegality
compare to any reasonably anticipated harms to health? When engaging in
argumentation, one can try to view the enterprise in terms of making the best
judgment rather than in terms of winning or losing. And trying to identify with being
reasonable rather than with a particular view can be a helpful strategy for
developing open-mindedness and fair-mindedness in inquiry (Bailin & Battersby,
2010, p. 201).
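As a quick arithmetical gloss on the 10%/90% example above (our own illustration, not drawn from the paper's sources), the two descriptions pick out exactly the same prospect, since the chance of winning is simply the complement of the chance of losing:

\[
P(\text{win}) = 1 - P(\text{lose}) = 1 - 0.90 = 0.10,
\]

so any difference in how willing people are to accept the risk under the two descriptions is driven entirely by the frame, not by the information conveyed.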
The bias of overconfidence – the tendency to have more confidence in one’s
judgment than is warranted by the weight of evidence – is another common
cognitive bias which may be somewhat mitigated through deliberate efforts. The
strategies outlined above for promoting an examination of the full range of
arguments on all sides of an issue are necessary in order to make a judgment with the
appropriate degree of confidence, as is making students aware of the need to give
explicit consideration to how much weight various arguments carry in making an
overall judgment (Bailin & Battersby, 2010, pp. 180-181; Battersby & Bailin, 2010, pp.
154-157).
An important concept which runs through the cognitive bias literature is that
of mental effort. Fast thinking is quick and easy, virtually effortless, but slower, more deliberate thinking requires more mental effort. Kahneman and others have suggested that our minds have a tendency to go for the easier route much of the time (Kahneman, 2011, pp. 39-49). For example, the research has shown repeatedly
that people have a strong tendency to see an erroneous answer to a simple math
problem as correct or an invalid syllogism as valid when the conclusion is believable
(the belief bias error) (Evans, 2008). The intuitive answer suggests itself
immediately and people generally do not bother to check the reasoning. These are
cases when the reasoning could be checked without too much difficulty. Nonetheless
overriding the intuitive response requires some mental work, and most people do
not appear to be initially inclined to put in this effort.
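A standard illustration of this kind of case is Kahneman's (2011) bat-and-ball puzzle (an example we add here for concreteness; it is not discussed in the paper itself): a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball. The intuitive answer, 10 cents, fails a one-line check. Letting x be the price of the ball,

\[
x + (x + 1.00) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05,
\]

so the ball costs 5 cents; verifying this takes only a moment of System 2 effort, yet many respondents report the intuitive answer without performing the check.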
An important idea for our pedagogical purposes is Kahneman’s argument
that this failure is due at least in part to insufficient motivation (2011, p. 46). Indeed,
the fact that many people willingly put considerable mental effort into certain
activities (e.g., Sudoku) when they find them interesting and engaging suggests that
a task can elicit mental energy when it is seen as being worth the effort. Thus one of
our challenges as educators is to help students to see thinking critically as being
worth the mental effort.

REFERENCES

Bailin, S. & Battersby, M. (2010). Reason in the Balance: An Inquiry Approach to Critical
Thinking. Whitby, Ont.: McGraw-Hill.
Battersby, M. & Bailin, S. (2011a). Guidelines for reaching a reasoned judgment. In J. A. Blair
and R. H. Johnson (Eds.), Conductive Argument: An Overlooked Type of Defeasible
Reasoning. London: College Publications, pp. 145-157.
Battersby, M. & Bailin, S. (2011b). Fallacy identification in a dialectical approach to teaching
critical thinking, in OSSA 2011, The Ontario Society for the Study of Argumentation.
Brenner, L., Koehler, D. & Tversky, A. (1996). On the evaluation of one-sided evidence.
Journal of Behavioral Decision Making, 9: 59-70.
Englich, B., Mussweiler, T. & Strack, F. (2006). Playing dice with criminal sentences: The
influences of irrelevant anchors on experts’ judicial decision making. Personality and
Social Psychology Bulletin, 32: 188.
Evans, J. (2008). Dual processing accounts of reasoning, judgments and social cognition.
Annual Review of Psychology, 59: 255-278.
Hassin, R., Bargh, J. & Uleman, J. (2002). Spontaneous causal inference. Journal of
Experimental Social Psychology, 38: 515-522.
Kahneman, D. (2011). Thinking, Fast and Slow. London: Penguin.
Kahneman, D., Slovic, P. & Tversky, A. (Eds.). (1982). Judgment Under Uncertainty: Heuristics
and Biases. New York: Cambridge University Press.
Slovic, P. (1969). Analyzing the expert judge: A descriptive study of a stockbroker’s decision
processes. Journal of Applied Psychology, 53: 255-263.
Slovic, P., Fischhoff, B. & Lichtenstein, S. (1977). Behavioral decision theory. Annual Review
of Psychology, 28: 1-39.
Stanovich, K. (2011). Rationality and the Reflective Mind. Oxford: Oxford University Press.
Tversky, A. & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction
fallacy in probability judgment. Psychological Review, 90: 293-315.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185: 1124-1131.
Tversky, A. & Kahneman, D. (1973). Availability: A heuristic for judging frequency and
probability. Cognitive Psychology, 5: 207-232.
Walton, D. (2010). Why fallacies appear to be better arguments than they are. Informal
Logic, 30, 2: 159-184.
Wason, P.C. & Shapiro, D. (1971). Natural and contrived experience in a reasoning problem.
Quarterly Journal of Experimental Psychology, 23: 63-71.
Wason, P.C. (1966). Reasoning. In B. M. Foss (Ed.), New Horizons in Psychology. Harmondsworth: Penguin.
