Understanding the Assessment of Clinical Reasoning
Omar S. Laynesa
ABSTRACT
• Clinical reasoning ability depends on a health professional’s knowledge and knowledge organization rather
than a general thinking process.
• Clinical reasoning is context-specific; clinician or trainee characteristics account for only a small amount of
the variance in diagnostic accuracy.
• Determining the validity of any clinical reasoning assessment method is challenging, due in part to the situation-specific nature of clinical reasoning (context specificity).
• Most clinical reasoning assessment methods provide adequate reliability for high-stakes examinations, provided there is adequate sampling (see the worked example below).
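One way to see why sampling matters is the Spearman-Brown prophecy formula, which estimates how reliability grows as more cases are added to an examination. The sketch below is purely illustrative; the single-case reliability of 0.15 is an assumed figure, not a value from the chapter.

    # Spearman-Brown prophecy formula: estimated reliability when a test is
    # lengthened k-fold. r1 is the reliability of a single case (assumed 0.15
    # here for illustration only); k is the number of cases sampled.
    def spearman_brown(r1: float, k: int) -> float:
        return (k * r1) / (1 + (k - 1) * r1)

    single_case_r = 0.15  # hypothetical single-case reliability
    for k in (1, 5, 10, 20, 40):
        print(f"{k:>2} cases -> estimated reliability {spearman_brown(single_case_r, k):.2f}")
    # Under these assumptions, roughly 30-40 cases are needed before the
    # examination approaches the reliability usually expected for high-stakes use.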
Expertise as a state argues that clinical reasoning performance is specific to the patient, the other health professionals on the team, the environment, and their emergent interactions (i.e., the specific situation). Thus, the content specificity first described by Elstein et al. (1978) has been renamed context specificity to capture the notion that something besides the clinical content of a case influences diagnoses and therapy.
Situativity theory has emerged to explain context specificity and to expand understanding of clinical reasoning beyond the limits of information processing theory. Situativity theory posits that the knowledge of the health professional is only one of several factors, rather than the sole factor, that predict clinical reasoning success.
11.3 The Construct and Process of Clinical Reasoning
Concept map - a technique for visually representing a learner’s thinking or knowledge organization. In a concept map, the learner connects a number of ideas (concepts) with specific phrases (linking words) to demonstrate how they put their ideas together (a small illustrative sketch follows these definitions).
Self-regulated learning - defined as a set of processes that learners use to moderate their own learning and performance, typically divided into a number of elements in each of three stages: forethought (before), performance (during), and reflection (after).
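Because a concept map is in effect a small labelled graph, a data-structure sketch may make the definition concrete. Everything in the example below (the concepts, linking words, and the proposition-counting helper) is invented for illustration and is not drawn from the chapter.

    # A concept map as a list of (concept, linking phrase, concept) triples.
    # Each triple reads as one proposition, e.g. "ischaemia --causes--> chest pain".
    ConceptMap = list[tuple[str, str, str]]

    learner_map: ConceptMap = [
        ("myocardial ischaemia", "causes", "chest pain"),
        ("chest pain", "is evaluated with", "ECG"),
        ("ECG", "may show", "ST-segment changes"),
    ]

    # One simple way to look at knowledge organization is to count propositions,
    # i.e. how many concept-link-concept units the learner has connected.
    def count_propositions(cmap: ConceptMap) -> int:
        return len(cmap)

    for source, link, target in learner_map:
        print(f"{source} --{link}--> {target}")
    print("propositions:", count_propositions(learner_map))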
NEUROBIOLOGICAL CORRELATES
• In designing an IPE assessment, a series of key questions needs to be posed and addressed, including: What is the purpose of the assessment? What is one going to assess? How is the assessment to be performed?
• Development of an assessment blueprint is vital to linking proposed learning outcomes with methods of assessment.
• A focus on collaborative performance using competency domains such as communication, collaboration and
professionalism can be an effective approach to IPE assessment.
• The use of an assessment matrix can effectively collate key elements related to the assessment of IPE.
• Entrustable professional activities and milestones are promising techniques to use in IPE assessment.
INTERPROFESSIONAL EDUCATION (IPE)
IPE focuses on learning activities designed to enhance the attitudes, knowledge, skills, and
behaviors for effective interprofessional practice (Barr et al. 2005).
Through the use of IPE, it is anticipated that improvements in the quality of care delivered
to patients/clients and families will be achieved (e.g., Reeves et al. 2010, 2013; Institute of
Medicine 2013, 2014).
12.2 Assessment Development: Key Principles
1. COLLABORATIVE PERFORMANCE
• Within an IPE program or curriculum, working together collaboratively is the core issue. In introducing an IPE curriculum, agreement was needed for both examination standards and assessment criteria, and these should be equivalent across all programs.
• As these assessment issues are complex, it was not surprising that the leaders of the different programs initially agreed on a formative approach to assessment.
2. WHAT TO ASSESS
• Within each learning activity, assessment must include knowledge, application of knowledge, performance of the knowledge, and what we do in reality, which together develop professional competence in practice: ‘knows’; ‘knows how’; ‘shows how’; and ‘does’ (Miller 1990).
• This approach can also be defined as content-specific assessment or domain-specific assessment (van der Vleuten 2008).
3. DEVELOPMENT OF A BLUEPRINT
• A blueprint links learning outcomes with methods of assessment and the target phase of learning in which the learning outcome should be achieved, and also maps the assessment to practice (a hypothetical data sketch follows this list).
4. ASSESSING COLLABORATIVE PERFORMANCE
• Attempting to assess collaboration can be problematic, especially when a group of learners is brought together, often without preparation, and required to perform together as an interprofessional team.
• In practice, however, interprofessional teams may have worked together for many years.
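To make principle 3 concrete, below is a minimal sketch of what a blueprint might look like as data, linking each learning outcome to a Miller level, a curriculum phase, and an assessment method. The outcomes, phases, and methods shown are hypothetical examples, not content from the chapter.

    # A hypothetical assessment blueprint: each learning outcome is linked to the
    # Miller level targeted ('knows', 'knows how', 'shows how', 'does'), the phase
    # in which it should be achieved, and the assessment method used.
    from collections import defaultdict

    blueprint = [
        {"outcome": "Describes the roles of other professions",
         "miller": "knows", "phase": "Year 1", "method": "MCQ paper"},
        {"outcome": "Plans care collaboratively for a paper case",
         "miller": "knows how", "phase": "Year 2", "method": "Case-based written exam"},
        {"outcome": "Communicates within a simulated team",
         "miller": "shows how", "phase": "Year 3", "method": "Team OSCE station"},
        {"outcome": "Collaborates during clinical placement",
         "miller": "does", "phase": "Year 4", "method": "Workplace-based assessment"},
    ]

    # Collating by Miller level shows whether every level is actually sampled.
    by_level = defaultdict(list)
    for row in blueprint:
        by_level[row["miller"]].append(row["method"])
    for level, methods in by_level.items():
        print(f"{level:>10}: {', '.join(methods)}")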
12.2 Assessment Development: Key Principles
1. Structure-Function-Outcome
2. Individuals-Team-Task (an illustrative matrix crossing these two dimensions follows below)
6. Entrustable Professional Activities
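Reading items 1 and 2 above as the two dimensions of the assessment matrix mentioned in the abstract is an assumption; on that assumption, a minimal sketch of such a matrix is shown below, with invented placeholder entries in the cells.

    # Hypothetical IPE assessment matrix crossing Individuals-Team-Task (rows)
    # with Structure-Function-Outcome (columns). Cell entries are invented
    # placeholders showing what could be assessed at each intersection.
    rows = ["Individuals", "Team", "Task"]
    cols = ["Structure", "Function", "Outcome"]

    matrix = {
        ("Individuals", "Function"): "contribution to team discussion",
        ("Team", "Structure"): "clarity of professional roles",
        ("Team", "Outcome"): "quality of the joint care plan",
    }

    width = 34
    print(f"{'':>12}" + "".join(f"{c:>{width}}" for c in cols))
    for r in rows:
        print(f"{r:>12}" + "".join(f"{matrix.get((r, c), '-'):>{width}}" for c in cols))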