Lily Fountain
L. Fountain (✉)
Faculty of Nursing, University of Maryland School of Nursing, Baltimore, MD, USA
e-mail: fountain@son.umaryland.edu
Takeaways
• Performance constructs such as critical thinking should be explicitly
defined and measures should align with these definitions.
• Previous critical thinking research has usually examined learners at only
one level of experience, such as students, new graduates, or experts.
Future studies can be strengthened by examining learners at more than one
level of experience.
• Domain-specific measures are preferable for performance assessment. For
example, in nursing education studies, individual interest, knowledge,
relational reasoning, prioritization, inference, and evaluation, along with
three contextual factors (patient assessment, caring, and environmental
resource assessment), were identified as common keywords.
• Professions education research is strengthened when AERA/APA/NCME
standards for reliability and validity are followed.
10.1 Introduction
Nurse Jennifer has been a maternity nurse for 5 years and has been fascinated by maternity
nursing since her basic education in nursing. Today, she stares thoughtfully at her patient
Mrs. Gablonsky. Nurse Jennifer sees something surprising. Mrs. Gablonsky’s condition
differs in a way the nurse does not expect for a woman who birthed a baby the previous day.
Nurse Jennifer wonders about what is causing Mrs. Gablonsky’s state and questions the
patient closely to find out if there were any symptoms that could help explain Mrs.
Gablonsky’s condition. Nurse Jennifer compares Mrs. Gablonsky’s condition to the other
postpartum women she has treated in her career. She searches her mental database for
knowledge about complications that could be consistent with the symptom that surprised
her. After a few moments of contemplation, Nurse Jennifer knows how to help her patient.
As this scenario illustrates, critical thinking using key cognitive processes is cen-
trally involved in the quality of care provided by maternity nurses and other healthcare
professionals. Leading policy and professional organizations such as the Institute of
Medicine, Carnegie Foundation, and American Association of Colleges of Nursing are
in agreement that critical thinking is a key competency for health professionals to
function in our complex health care environment (American Association of Colleges
of Nursing 2008; Institute of Medicine 2010; Cooke et al. 2010). There is far less
clarity in the professions education literature about what critical thinking and its
analog in practice, clinical reasoning, really mean, or about how best to measure
them to ensure competence in providers and students. Toward this end, this systematic
review was conducted to examine the quality of definitions and measures of critical
thinking and clinical reasoning within the literature pertaining to the health care
professions.
What is meant by critical thinking? Facione (1990) gives one commonly used
definition of critical thinking as “purposeful, self-regulatory judgment which results
in interpretation, analysis, evaluation, and inference” (p. 2). The present study also
examined the term clinical reasoning, defined by Higgs and Jones (2000) as “the
thinking and/or decision-making processes that are used in clinical practice”
(p. 194). Although other terms have been used to describe clinical thinking, such as
clinical judgment, problem-solving, and decision-making, the terms critical think-
ing (CT) and clinical reasoning (CR) were chosen as the basis for this systematic
review because a preliminary electronic database search indicated they were the
most commonly equated terms populating the recent research.
Specifically, from the 1980s to the present, critical thinking and clinical rea-
soning have been areas of intense research and clinical interest in nursing and
medicine, as documented by several recent reviews of the literature (Brunt 2005;
Chan 2013; Norman 2005; Ross et al. 2013; Simpson and Courtney 2002; Walsh
and Seldomridge 2006a). Critical thinking has been identified as a key construct in
core competencies for interprofessional collaborative practice and consensus
statements on critical thinking in health professions education (Huang et al. 2014;
Interprofessional Education Collaborative Expert Panel 2011). The most important
reason for studying critical thinking in the health professions is that health care
providers, educators, researchers, and policy-makers believe it leads to better health
care. It has been argued that critical thinking has the potential to reduce morbidity
and mortality for patients, and increase patient satisfaction with care. It can reduce
health care costs by avoiding mistakes, unnecessary procedures, and unnecessary
use of supplies (Benner et al. 2008; Kataoka-Yahiro and Saylor 1994).
However, problems with the definitions and measures used in critical thinking
and clinical reasoning have been recognized (Krupat et al. 2011; Walsh and
Seldomridge 2006a). Further, the quality of definitions and measures used in
educational research on critical thinking and clinical reasoning affects the ability to
evaluate which educational strategies are effective at promoting these skills (Brunt
2005). In addition, the context of health education research affects the quality of
research (Ovretveit 2011). Merriam-Webster defines context as “the interrelated
conditions in which something exists” (Context n.d.). The types of participant
samples and research designs used in health education affect the ability of
researchers to clearly define and measure constructs in the health professions (Waltz
2010).
In order for professions education research to function with the same level of rigor
as clinical practice (Harden et al. 2000), clear definitions (Creswell 1994, 2014) and
sound measures (Ratanawongsa et al. 2008) have been identified as essential. Given
this increased focus on evidence-based
practice during education and clinical practice, in combination with the other
pressures on health care and education, it is vital that a shared base of terminology
and psychometrically sound assessment tools be identified.
There is long-standing literature on the problems with definitions of CT terms,
which fall into the categories of clarity, domain specificity, and equivalency of term
usage. The lack of explicit, clear definitions of terms has been cited as a
particular did not show expected improvements in critical thinking over the course
of professional education (Walsh and Seldomridge 2006a).
Thus, the need exists for a detailed and systematic description of the types and
quality of definitions and measures of critical thinking and clinical reasoning used
in the health professions. This study seeks to address that gap. The aim of this
review is to lay the foundation for improving the conceptualization and opera-
tionalization of critical thinking and clinical reasoning in the health professions by
addressing the following research questions:
1. What is the nature of the context in which critical thinking and its analog,
clinical reasoning, have been examined?
2. How and how well are definitions for critical thinking and clinical reasoning
specified in the literature?
3. How and how well are measures of critical thinking and clinical reasoning used
in the literature?
10.2 Methods
[Fig. 10.1 Summary of literature search and review process for primary literature.
The PsycINFO electronic database search yielded 224 records; after removal of 46
duplicates, 178 titles and abstracts were reviewed and 105 were excluded for
violating the selection criteria (not health sciences research about the thinking of
health professionals, 56; not an analysis of thinking skills used by individual
humans, rather than computers or groups, in direct patient care or during education
to provide patient care, 34; did not measure thinking as indicated by measures, or
measured dispositions including confidence or self-efficacy, 11; violated a
delimiter, 4). Of the 73 articles reviewed in full, 9 duplicates were removed and 8
articles identified by hand searching reference lists were added; 29 were excluded at
the article level for the same four reasons (6, 3, 12, and 8, respectively), leaving
43 included studies.]
Figure 10.1 summarizes the literature search and review process. As the figure
indicates, 224 abstracts were produced by the search terms; after title, abstract,
and full article review, 43 articles met the criteria for inclusion in the review.
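These screening counts reconcile arithmetically. The following is a minimal check, sketched in Python; the variable names are illustrative and not part of the review protocol:

```python
# Screening arithmetic from Fig. 10.1 (counts as reported in the review).
retrieved = 224                              # records from the PsycINFO search
duplicates = 46                              # removed before title/abstract review
excluded_title_abstract = 56 + 34 + 11 + 4   # four exclusion reasons = 105

full_text = retrieved - duplicates - excluded_title_abstract   # 73 articles
full_text_duplicates = 9                     # duplicates found at article review
hand_searched = 8                            # added from reference-list hand searching
excluded_article = 6 + 3 + 12 + 8            # same four reasons at article level = 29

included = full_text - full_text_duplicates + hand_searched - excluded_article
assert included == 43                        # matches the reported pool of studies
```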
In order to clarify the definitions and measures used for critical thinking research,
an explicit coding scheme was used. I developed a protocol based on the coding
typology used in prior research (Alexander and Murphy 2000; Dinsmore et al.
2008), and the recommendations of the Best Evidence Medical Education (BEME)
collaboration (Harden et al. 2000). The coding typology for Dinsmore et al. (2008)
was adapted to classify explicitness or clarity of definitions, and the alignment of
measures with the definitions. The categories of study design, study size, target
outcomes, and level of experience of participants were chosen from the BEME
categories to evaluate the quality of medical education research studies.
Overall, the study variables were categorized as relating to the contextual aspects
of the studies, the definitions, or the measures. A general description of the coding
is given here; Appendix A details the codebook that was developed, which specifies
all resulting codes and was used to establish interrater agreement on the coding
scheme.
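For concreteness, the coding scheme described here can be pictured as a nested set of variables and their permitted codes. The Python sketch below is illustrative only; the category labels follow the chapter, but the structure is not the actual Appendix A codebook:

```python
# Illustrative representation of the coding protocol (not the Appendix A
# codebook itself). Each variable maps to its permitted codes.
CODING_SCHEME = {
    "context": {
        "purpose": ["program evaluation", "teaching technique",
                    "admission/course/progression decisions",
                    "description of students, faculty, or providers"],
        "participant_domain": ["nursing", "medicine", "other health domain",
                               "multidisciplinary"],
        "participant_experience": ["student/prelicensure",
                                   "new graduate/resident",
                                   "practicing provider", "multiple levels"],
        "sample_size": ["small", "moderate", "large"],
        "research_design": ["preexperimental", "quasi-experimental",
                            "experimental", "qualitative"],
    },
    "definition": {
        "clarity": ["explicit shared", "explicit idiosyncratic",
                    "implicit conceptual", "implicit referential",
                    "implicit measurement"],
        "domain_specificity": ["domain-specific", "domain-general"],
        "equivalency": ["present", "absent"],
    },
    "measure": {
        "commonality": ["study-specific", "commonly used", "standardized"],
        "alignment": ["full", "partial", "minimal", "not applicable"],
    },
}
```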
Context. Variables coded for context include purpose, participant variables, and
research design.
Purpose. The constructs critical thinking and clinical reasoning were examined
for several purposes. McCartney et al. (2006) caution that care must be exercised
when a measure is used for a purpose other than the one intended, whether clinical
evaluation, screening, or research. In this study, the purposes were categorized into
four types: 1—evaluation of a program of education; 2—evaluation or description
of a teaching technique; 3—evaluation of admission, course performance, or pro-
gression decisions; 4—evaluation or description of students, faculty, or providers.
Participants. The participants in the study pool were coded by professional
domain, level of experience, and number. For this study, professional domain was
determined at the level of licensure, such as nursing or medicine; the study may
have focused on a subspecialty of the domain, such as emergency nursing or cardiac
medicine, but these were coded according to the overarching domain category. In
this study, medicine refers to the profession of doctors or physicians. Nursing refers
to the profession of nurses, and includes registered nurses at all education levels. In
addition to medicine and nursing, articles from veterinary medicine, kinesiology,
health sciences, and occupational therapy were produced by the search. Since the
goal for this study was an examination of terms used in critical thinking across
multiple health care domains, these articles, which included references to medicine
or nursing, were retained if they met the delimiters and selection criteria. In addi-
tion, two articles examined more than one profession, and were labeled multidis-
ciplinary. Each study was coded for the level of experience of participants, either
(a) student or prelicensure, (b) new graduate or residents, (c) practicing provider, or
(d) multiple levels. Sample sizes were categorized as small (fewer than 30
participants), moderate (31–100 participants), or large (more than 100 participants).
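As a sketch, the participant-number coding reduces to simple cut points. The function below is hypothetical; note that the boundary case of exactly 30 participants is not specified in the chapter and is treated as moderate here:

```python
def code_sample_size(n: int) -> str:
    """Categorize a study's sample size using the bands stated above.

    Illustrative only: the chapter specifies fewer than 30 as small,
    31-100 as moderate, and more than 100 as large; exactly 30 is
    unassigned in the text and treated as moderate here (an assumption).
    """
    if n < 30:
        return "small"
    if n <= 100:
        return "moderate"
    return "large"

# Example: a study with 45 participants would be coded moderate.
assert code_sample_size(45) == "moderate"
```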
Research designs. The studies were categorized by type of research design.
Preexperimental designs included one group pretest/posttest and cross-sectional
studies. Quasi-experimental designs included separate sample pretest/posttest
design, and separate sample pretest/posttest control design. Examples of experi-
mental designs include pretest/posttest control group design and Solomon
four-group design. Case studies were coded as qualitative.
10.2.4 Definitions
For each study, the definition or descriptive data about critical thinking or clinical
reasoning was coded for clarity, domain specificity, and equivalency.
Clarity. In this study, clarity refers to whether the definition was explicitly or
implicitly defined in the study. A definition was coded as explicit if the author
explicitly stated the definition of critical thinking used in the study. For example, in
Funkesson et al. (2007), the following definition for clinical reasoning was
explicitly stated: “In this paper, clinical reasoning is seen as a cognitive process,
where both theoretical knowledge and personal experience are used in a unique care
situation aiming to achieve a desired outcome for the person in focus” (p. 1110).
In order to analyze the lack of distinction in CT/CR term usage, the explicit
category was further delineated. If the definition was explicitly stated, the definition
was analyzed as to whether it was a definition shared by other researchers in
published research, or an idiosyncratic definition used by the researcher for this
specific study. For example, Blondy (2011) stated this definition: “We understand
critical thinking to be purposeful, self-regulatory judgment which results in inter-
pretation, analysis, evaluation and inference… and inductive and deductive rea-
soning” (p. 182), and this was coded as explicit shared. Appendix B contains a list
of shared definitions.
Forneris and Peden-McAlpine (2007), on the other hand, explicitly stated their
own unique definition: “Grounded in these theoretical views, critical thinking is
defined as a process of reflective thinking that goes beyond logical reasoning to
evaluate the rationality and justification for actions within context…Using the work
of these theorists, four attributes of critical thinking: reflection, dialog, context, and
time” (p. 411). This definition was coded as explicit idiosyncratic. Idiosyncratic
explicit definitions were definitions that contained components that were specific to
this study, not captured by previously published definitions.
Another example of an explicit but idiosyncratic definition is “critical thinking:
problem identification, problem definition, exploration, applicability, and integra-
tion” (Schell and Kaufman 2009). The keywords “applicability” and “integration”
were not found in common CT definitions, so this study was coded as explicit
idiosyncratic.
If the definition was only implicitly defined, the definitional data were further
analyzed as to the manner in which the construct was discussed and was coded as
conceptual, referential, or measurement. If the construct was discussed through the
use of related concepts, it was coded as implicit conceptual. For example, Mamede
et al. (2007) stated:
Judicious judgements [sic] and effective decision making define successful clinical problem
solving. Two different approaches for processing clinical cases, nonanalytical and analyt-
ical, have been shown to underlie diagnostic decisions. Experienced doctors diagnose
routine problems essentially by recognizing similarities between the actual case and
examples of previous patients. This pattern-recognition, nonanalytical form of clinical
reasoning is largely automatic and unconscious. In the second, analytical form of case
processing, clinicians arrive at a diagnosis by analyzing signs and symptoms, relying on
biomedical knowledge when necessary. (p. 1185)
Mamede’s discussion used many concepts that are part of the discussion of
clinical reasoning but no definition was stated. Thus, this definition was coded as
implicit conceptual.
If the author did not clarify which definition was being used but cited definitions
from other literature, the definition was coded as implicit referential. For
example, Göransson et al. (2007) stated that, “deductive content analysis was fol-
lowed, using the thinking strategies described by Fonteyn and Cahill (1998) from a
long-term TA study” (emphasis added). If the author only defined the construct
through the use of the measure, the clarity was coded as implicit measurement. For
example, Wolpaw et al. (2009) measured the following outcomes: summarizing
patient findings; providing a differential diagnosis; analyzing possibilities in dif-
ferential diagnosis; expressing uncertainties and obtaining clarification; discussing
patient management; and identifying case-related topics for further study. Because
they did not define clinical reasoning but measured these aspects of it, this study
was coded as implicit measurement.
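Taken together, the clarity codes follow a two-step decision: first, whether a definition is explicitly stated; then either its source (shared vs. idiosyncratic) or the manner of implicit discussion. A hypothetical sketch, with argument names chosen here for illustration:

```python
def code_clarity(explicitly_stated: bool,
                 matches_published: bool = False,
                 implicit_mode: str = "") -> str:
    """Illustrative decision logic for the clarity coding described above.

    implicit_mode applies only to implicit definitions and is one of
    "concepts" (related concepts only), "citation" (cites others'
    definitions without choosing one), or "measure" (construct defined
    only through the measure).
    """
    if explicitly_stated:
        return "explicit shared" if matches_published else "explicit idiosyncratic"
    return {"concepts": "implicit conceptual",
            "citation": "implicit referential",
            "measure": "implicit measurement"}[implicit_mode]

# Example: Blondy (2011) restated a definition shared with prior published
# research, so that study was coded as explicit shared.
assert code_clarity(True, matches_published=True) == "explicit shared"
```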
Domain specificity. Domain specificity refers to whether the definition used was
specific to a domain or was generally applicable. In this study, if a definition was
specific to a domain, it was coded as domain-specific. By contrast, if the definition
could be used by other professions it was coded domain-general. For instance,
Johansson (2009) defined clinical reasoning as “the cognitive processes and
strategies that nurses use to understand the significance of patient data, to identify
and diagnose actual or potential patient problems, to make clinical decisions to
assist in problem resolution and to achieve positive patient outcomes” (p. 3367).
The specification of the definition as pertinent to nurses rendered this a
domain-specific definition. On the other hand, Ajjawi and Higgs (2008) stated
“clinical reasoning is defined as the thinking and decision-making processes
associated with professional practice,” and this was coded as domain-general.
Equivalency. Equivalency refers to how the numerous terms used to describe
critical thinking were used in the study. Up to 43 terms for critical thinking have
been identified (Turner 2005). Often authors would state that one term was
equivalent to another by using such phrases as “also known as” and “or.” This
intermingling of terms was analyzed and coded as “equivalency.” For example,
Funkesson et al. (2007) stated “clinical reasoning…can be named critical thinking,
reflective reasoning, diagnostic reasoning, decision making, etc.” (p. 1110). This
was coded as equivalency present. For the purposes of this study, the terms ana-
lyzed for statements of equivalency were critical thinking, clinical reasoning,
clinical judgment, decision-making, and problem-solving.
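Coding for equivalency can thus be thought of as scanning for sentences that chain two or more of these five terms with a linking phrase. A hypothetical screening aid follows; the phrase list is an assumption, and human judgment would still make the final call:

```python
# The five terms analyzed for equivalency statements in this study.
TERMS = ("critical thinking", "clinical reasoning", "clinical judgment",
         "decision-making", "problem-solving")
# Linking phrases of the kind quoted above; illustrative, not exhaustive.
LINKERS = ("also known as", "can be named", " or ")

def flags_equivalency(sentence: str) -> bool:
    """Flag a sentence for human review if it links two or more terms."""
    s = sentence.lower()
    mentions = sum(term in s for term in TERMS)
    return mentions >= 2 and any(link in s for link in LINKERS)

print(flags_equivalency(
    "Clinical reasoning can be named critical thinking, reflective "
    "reasoning, diagnostic reasoning, decision making, etc."))  # True
```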
10.2.5 Measures
To evaluate the contextual aspects of the studies, findings regarding purpose,
participants, and research design are reported and discussed here.
Purpose. Nearly half (49 %) of the articles in the resulting pool of 43 articles
were about critical thinking, and 51 % had clinical reasoning as the focus.
Table 10.3 Frequency and percentage of domain specificity of clinical thinking studies by construct

Domain specificity    All studies         Critical thinking    Clinical reasoning
                      n     Percent (%)   n     Percent (%)    n     Percent (%)
Specific              24    56            5     24             19    86
General               19    44            16    76             3     14
Total                 43    100           21    100            22    100
Table 10.4 Frequency and percentage of equivalency of term usage presence in clinical thinking studies by construct

Equivalency presence   All studies         Critical thinking    Clinical reasoning
                       n     Percent (%)   n     Percent (%)    n     Percent (%)
Present                30    70            13    62             17    77
Absent                 13    30            8     38             5     23
Total                  43    100           –     –              –     –
Table 10.5 Frequency and percentage of commonality of measure usage among clinical thinking studies by construct

Measure commonality   All studies         Critical thinking    Clinical reasoning
                      n     Percent (%)   n     Percent (%)    n     Percent (%)
Study-specific        17    40            5     24             12    55
Commonly used         16    37            6     29             10    45
Standardized          10    23            10    48             0     0
Total                 43    100           21    100            22    100
Table 10.6 Frequency and percentage of alignment of definitions with measures of clinical thinking

Category   All studies         Critical thinking    Clinical reasoning
           n     Percent (%)   n     Percent (%)    n     Percent (%)
Full       20    47            11    52             9     41
Partial    10    23            4     19             6     27
NA         13    30            6     29             7     32
Total      43    100           21    100            22    100
Researchers are likely creating new measures more often than is necessary or
optimal; searching for instruments from other domains could yield a larger stable of
domain-specific, well-validated instruments.
Alignment. For the measure of alignment adapted from Alexander and Murphy
(2000) and Dinsmore et al. (2008), less than half of the studies (47 %) showed full
alignment between the definition of critical thinking and its operationalization
through the measure. As Table 10.6 shows, partial alignment occurred in 16 % of
the studies and minimal alignment in 7 % (23 % combined). Thirty percent of the
studies could not be rated, either because the definition was implicit referential or
implicit measurement, or because the measurement was not adequately described.
Interrater reliability for the coding of alignment of the measures was established
at α = 0.90.
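The chapter reports this agreement coefficient without naming the statistic. For two coders assigning nominal codes such as full/partial/NA, a chance-corrected agreement coefficient can be computed; the sketch below uses Cohen's kappa as one common choice, with invented data for illustration:

```python
from collections import Counter

def cohen_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Chance-corrected agreement between two coders (Cohen's kappa)."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Probability that both coders assign the same code by chance.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented alignment codes for six studies, rated by two coders.
a = ["full", "partial", "full", "NA", "full", "partial"]
b = ["full", "partial", "full", "NA", "partial", "partial"]
print(round(cohen_kappa(a, b), 2))  # 0.74 for this invented example
```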
Operationalization of the construct was a substantial problem for this pool of
studies, with fewer than half achieving complete alignment; the distribution was
similar for CR and CT. The lack of clear or explicit definitions is a likely driver:
researchers often listed a menu of definitions or attributes of critical thinking
without specifying which definition or model of thinking they were using to
measure it.
Reliability. Reliability documentation was absent in 7 % of studies (Table 10.7).
Reliability was obtained from a previous dataset in 23 % of studies. Reliability was
obtained for the current sample in 54 % of studies. Reliability was obtained for both
There were several limitations to the generalizability of this study. It was conducted
by a single researcher, with a second researcher for interrater reliability of key
variables. Systematic review technology calls for a team of researchers at each step
References
Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., & Zhang,
D. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage
1 meta-analysis. Review of Educational Research, 78, 1102–1134. doi:10.3102/0034654308326084
Ajjawi, R., & Higgs, J. (2008). Learning to reason: A journey of professional socialization.
Advances in Health Sciences Education, 13, 133–150. doi:10.1007/s10459-006-9032-4
Alexander, P. A., & Murphy, P. K. (2000). The research base for APA’s learner-centered
psychological principles. In N. Lambert & B. McCombs (Eds.), How students learn
(pp. 25–60). Washington, DC: American Psychological Association.
American Association of Colleges of Nursing. (2008). Essentials of baccalaureate education for
professional nursing practice. Washington, DC: American Association of Colleges of Nursing.
American Educational Research Association, American Psychological Association, National
Council on Measurement in Education. (1999). Standards for educational and psychological
testing. Washington, DC: American Educational Research Association.
American Educational Research Association, American Psychological Association, National
Council on Measurement in Education. (2014). Standards for educational and psychological
testing. Washington, DC: American Educational Research Association.
Bashir, M., Afzal, M. T., & Azeem, M. (2008). Reliability and validity of qualitative research.
Pakistan Journal of Statistics and Operation Research, IV(1), 35–45.
Benner, P. E., Hughes, R. G., & Sutphen, M. (2008). Clinical reasoning, decision-making, and
action: Thinking critically and clinically. In R. G. Hughes (Ed.), Patient safety and quality: An
evidence-based handbook for nurses. AHRQ Publication No. 08-0043. Rockville, MD: Agency
for Healthcare Research and Quality.
Blondy, L. C. (2011). Measurement and comparison of nursing faculty members’ critical thinking
skills. Western Journal of Nursing Research, 33, 180–195. doi:10.1177/0193945910381596
Brunt, B. A. (2005). Critical thinking in nursing: An integrated review. Journal of Continuing
Education in Nursing, 36, 60–67.
Carpenter, C. B., & Doig, J. C. (1988). Assessing critical thinking across the curriculum. New
Directions for Teaching and Learning, 34, 33–46. doi:10.1002/tl.37219883405
Chan, Z. C. Y. (2013). A systematic review of critical thinking in nursing education. Nurse
Education Today, 33, 236–240.
Cook, D. A., & Beckman, T. J. (2006). Current concepts in validity and reliability for
psychometric instruments: Theory and application. The American Journal of Medicine, 119,
166.e7–166.e16.
Cooke, M., Irby, D. M., & O’Brien, B. C. (2010). Educating physicians: A call for reform of
medical school and residency. San Francisco: Jossey-Bass.
Cooper, H., Hedges, L. V., & Valentine, J. C. (2009). The handbook of research synthesis and
meta-analysis (2nd ed.). New York: Russell Sage Foundation.
Creswell, J. W. (1994). Research design: Qualitative and quantitative approaches. Thousand
Oaks, CA: Sage Publications.
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods
approaches. Thousand Oaks, CA: Sage Publications.
Crocker, L. M., & Algina, J. (2006). Introduction to classical and modern test theory. Mason, OH:
Wadsworth Group/Thomas Learning.
Cruz, D. M., Pimenta, C. M., & Lunney, M. (2009). Improving critical thinking and clinical
reasoning with a continuing education course. The Journal of Continuing Education in
Nursing, 40, 121–127. doi:10.3928/00220124-20090301-05
Dinsmore, D. L., Alexander, P. A., & Loughlin, S. M. (2008). The impact of new learning
environments in an engineering design course. Instructional Science, 36, 375–393.
doi:10.1007/s11251-008-9061-x
McAllister, M., Billett, S., Moyle, W., & Zimmer-Gembeck, M. (2009). Use of a think-aloud
procedure to explore the relationship between clinical reasoning and solution-focused training
in self-harm for emergency nurses. Journal of Psychiatric and Mental Health Nursing, 16,
121–128. doi:10.1111/j.1365-2850.2008.01339.x
McCartney, K., Burchinal, M. R., & Bub, K. L. (2006). Best practices in quantitative methods for
developmentalists. Monographs of the Society for Research in Child Development, 71, 285.
Chapter II, Measurement issues. doi:10.1111/j.1540-5834.2006.00403.x
Nikopoulou-Smyrni, P., & Nikopoulos, C. K. (2007). A new integrated model of clinical
reasoning: Development, description and preliminary assessment in patients with stroke.
Disability and Rehabilitation, 29, 1129–1138. doi:10.1080/09638280600948318
Norman, G. (2005). Research in clinical reasoning: Past history and current trends. Medical
Education, 39, 418–427. doi:10.1111/j.1365-2929.2005.02127.x
Ovretveit, J. (2011). Understanding the conditions for improvement: Research to discover which
context influences affect improvement success. BMJ Quality & Safety, 20(Suppl. 1), i18–i23.
doi:10.1136/bmjqs.2010.045955
Patel, V. L., Arocha, J. F., & Zhang, J. (2004). Thinking and reasoning in medicine. In Keith
Holyoak (Ed.), Cambridge handbook of thinking and reasoning. Cambridge, UK: Cambridge
University Press.
Pintrich, P. R., Smith, D. A., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the
Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: University of
Michigan National Center for Research to Improve Postsecondary Teaching and Learning.
Ralston, P. A., & Bays, C. L. (2013). Enhancing critical thinking across the undergraduate
experience: an exemplar from engineering. American Journal of Engineering Education
(AJEE), 4(2), 119–126.
Ratanawongsa, N., Thomas, P. A., Marinopoulos, S. S., Dorman, T., Wilson, L. M., Ashar, B. H.,
et al. (2008). The reported reliability and validity of methods for evaluating continuing medical
education: A systematic review. Academic Medicine, 83, 274–283. doi:10.1097/ACM.0b013e3181637925
Rogers, J. C., & Holm, M. B. (1983). Clinical reasoning: The ethics, science, and art. American
Journal of Occupational Therapy, 37, 601–616. doi:10.5014/ajot.37.9.601
Ross, D., Loeffler, K., Schipper, S., Vandermeer, B., & Allan, G. M. (2013). Do scores on three
commonly used measures of critical thinking correlate with academic success of health
professions trainees? A systematic review and meta-analysis. Academic Medicine, 88(5), 724–734.
Schell, R., & Kaufman, D. (2009). Critical thinking in a collaborative online PBL tutorial. Journal
of Educational Computing Research, 41, 155–170. doi:10.2190/EC.41.2.b
Simmons, B. (2010). Clinical reasoning: Concept analysis. Journal of Advanced Nursing, 66,
1151–1158. doi:10.1111/j.1365-2648.2010.05262.x
Simpson, E., & Courtney, M. D. (2002). Critical thinking in nursing education: A literature review.
International Journal of Nursing Practice, 8(April), 89–98. doi:10.1046/j.1440-172x.2002.00340.x
Tanner, C. A. (1997). Spock would have been a terrible nurse and other issues related to critical
thinking. Journal of Nursing Education, 36, 3–4.
Turner, P. (2005). Critical thinking in nursing education and practice as defined in the literature.
Nursing Education Perspectives, 26, 272–277.
Walsh, C. M., & Seldomridge, L. A. (2006a). Critical thinking: Back to square two. Journal of
Nursing Education, 45, 212–219.
Walsh, C. M., & Seldomridge, L. A. (2006b). Measuring critical thinking: One step forward, one
step back. Nurse Educator, 31, 159–162.
Waltz, C. F. (2010). Measurement in nursing and health research (4th ed.). Philadelphia: Springer.
Watson, R., Stimpson, A., Topping, A., & Porock, D. (2002). Clinical competence assessment in
nursing: A systematic review. Journal of Advanced Nursing, 39, 421–431.
Wolpaw, T., Papp, K. K., & Bordage, G. (2009). Using SNAPPS to facilitate the expression of
clinical reasoning and uncertainties: A randomized comparison group trial. Academic
Medicine, 84, 517–524. doi:10.1097/ACM.0b013e31819a8cbf
Context. (n.d.). In Merriam-Webster.com. Retrieved April 11, 2012, from
http://www.merriam-webster.com/dictionary/context
Alexander, P. A., Dinsmore, D. L., Fox, E., Grossnickle, E., Loughlin, S. M., Maggioni, L.,
Parkinson, M. M., & Winters, F. I. (2011). Higher order thinking and knowledge:
Domain-general and domain-specific trends and future directions. In G. Schraw & D.
H. Robinson (Eds.), Assessment of higher order thinking skills (pp. 47–88). Charlotte, NC:
Information Age Publishing.
Fonteyn, M. E., & Cahill, M. (1998). The use of clinical logs to improve nursing students’
metacognition: A pilot study. Journal of Advanced Nursing, 28(1), 149–154.