Aleven et al. 1998
Abstract The PACT Geometry tutor has been designed, with guidance from
mathematics educators, to be an integrated part of a complete, new-standards-
oriented course for high-school geometry. We conducted a formative evaluation of
the third “geometric properties” lesson and saw significant student learning gains.
We also found that students were better able to provide numerical answers to
problems than to articulate the reasons that are presumably involved in finding
these answers. This suggests that students may provide answers using superficial
(and possibly unreliable) visual associations rather than reason logically from
definitions and conjectures. To combat this type of shallow learning, we are
developing a new version of the tutor’s third lesson, aimed at getting students to
reason more deliberately with definitions and theorems as they work on geometry
problems. In the new version, students are required to state a reason for their
answers, which they can select from a Glossary of geometry definitions and
theorems. We will conduct an experiment to test whether providing tutoring on
reasoning will transfer to better performance on answer giving.
Introduction
A problem for many forms of instruction is that students may learn in a shallow way [Burton
and Brown, 1982; Miller et al., 1997], acquiring knowledge that is sufficient to score
reasonably well on some test items, but that does not transfer to novel situations. One
manifestation of shallow learning is that students construct superficial domain heuristics that
may allow them to solve some problems quite well, even though that “knowledge” is
ultimately not correct. For instance, in the context of geometry, most students will learn
how to find the measures of unknown quantities in diagrams. However, they may rely on the
fact that certain quantities look equal rather than reason from geometric definitions and
theorems. Such superficial perceptual strategies, enriched with some correct geometric
knowledge, can be very serviceable and lead to correct solutions on many naturally-occurring
problems. However, they fall short on more complex problems or when students are asked to
discuss reasons for their answers.
Superficial strategies occur in many domains. In physics problem solving, consider a
problem where students are asked to draw an acceleration vector for an elevator coming to a
halt while going down. They often draw a downward arrow, assuming, incorrectly, that the
acceleration has the same direction as the velocity. Also, when asked to categorize physics
problems, novices tend to do so on the basis of surface level features, while experts use the
deeper physics principles involved [Chi, et al., 1981].
We can interpret the shallow learning problem within the ACT-R theory of cognition and
learning [Anderson, 1993], as follows: In the ACT framework, learning a procedural skill
means acquiring a set of production rules. Production rules are induced by analogy to prior
experiences or examples. Superficial knowledge may result when students pay attention to
the wrong features in those experiences or examples, features that may be readily available
and interpreted, but that do not connect to deeper reasons. However, not much is known
about what types of instruction are more likely or less likely to foster shallow learning.
Evaluations of cognitive tutors indicate that they can be significantly more effective than
classroom instruction [Anderson, et al., 1995; Koedinger, et al., 1998]. In spite of this
success, cognitive tutors (and other computer-based learning environments) may not be
immune to the shallow learning problem. It is important to determine to what degree
students come away with shallow knowledge when they work with cognitive tutors. This
may help to find out how these tutors can be designed to minimize shallow learning and be
even more effective.
We study these issues in the context of the PACT Geometry Tutor, a cognitive tutor
developed by our research group and used in four schools in the Pittsburgh area. In a formative
evaluation study, we found that the instructional approach of which the tutor is a part leads to
significant learning gains. We also found evidence of a form of shallow learning: Students
cannot always give a reason for their answers to geometry problems, even if the answer itself
is correct. Such a reason would be, for example, a definition or theorem applied in calculating
certain quantities in a diagram. We have redesigned the tutor in an attempt to remedy this
kind of shallow learning. Currently, we are pilot-testing the new tutor.
In this paper, we give a brief overview of the PACT Geometry tutor. We present results
from our formative evaluation study that motivated the redesign of the tutor. We describe
how we have modified the tutor, and finally discuss why the changes may lead to a more
effective tutor, in geometry and potentially also in other domains.
Fig. 1. PACT Geometry tutor curriculum, organized by lessons, sections, and skills.
Students enter answers in the Table Tool, a spreadsheet-like device shown on the left,
second window from the top. The table cells correspond to the key quantities in the problem,
such as the givens, the target quantities, or intermediate steps. As students enter values into
the table, they receive immediate feedback indicating whether their answer is correct or not.
When students are stuck, they can ask for hints, which the tutor presents in a separate
Messages window (see Figure 2, left, third window from the top). The hints become more
specific as students repeat their request for help. Students can move on to the next problem
only when they have entered correct values for all quantities in the table.
The Diagram Tool presents an abstract version of the problem diagram (Figure 2, bottom
right) with cells for students to record the measures of angles, as one often does when solving
geometry problems on paper. This makes it easier for students to relate the quantities in the
problem to the entities in the diagram and to keep track of information. For problems that
involve difficult arithmetic operations, students can use the Equation Solver (not shown, see
[Ritter and Anderson, 1995]), a tool which helps students to solve equations step-by-step.
Finally, the PACT Geometry Tutor provides a skillometer, which displays the tutor’s
assessment of the student, for the skills targeted in the current section (see Figure 2, bottom
left). The skillometer helps students keep track of their progress. The skills for which a
student has reached mastery level are marked with a “✓”. When students have reached mastery
levels for all skills, they graduate from the current section of the curriculum.
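The mastery estimates behind the skillometer are computed with knowledge tracing [Corbett and Anderson, 1995]. The following is a minimal sketch of the standard Bayesian update for a single skill; the parameter values and the mastery threshold are illustrative assumptions, not the tutor's actual settings:

```python
# Bayesian knowledge tracing: update P(known) for one skill after each answer.
# All parameter values below are illustrative, not the tutor's actual settings.
P_LEARN = 0.2   # chance the skill is learned at an opportunity to apply it
P_GUESS = 0.2   # chance of a correct answer without knowing the skill
P_SLIP = 0.1    # chance of an error despite knowing the skill
MASTERY = 0.95  # threshold at which the skillometer would show a checkmark

def update(p_known: float, correct: bool) -> float:
    """Posterior P(known) after observing one answer, then a learning step."""
    if correct:
        evidence = p_known * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_known) * P_GUESS)
    else:
        evidence = p_known * P_SLIP
        posterior = evidence / (evidence + (1 - p_known) * (1 - P_GUESS))
    return posterior + (1 - posterior) * P_LEARN

p = 0.3  # prior estimate for, e.g., the triangle-sum skill
for answer in [True, True, False, True, True]:
    p = update(p, answer)
print(f"P(known) = {p:.2f}, mastered: {p >= MASTERY}")
```

On this scheme a correct answer raises the estimate and an error lowers it, so a student graduates from a section only after a run of performance that the model can no longer attribute to guessing.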
The PACT Geometry Tutor is a cognitive tutor, an approach based on the ACT theory
which has proven to be effective for building computer tutors for problem-solving skills
[Anderson, 1993; Anderson et al., 1995]. The PACT Geometry tutor is based on a production
rule model of geometry problem solving, organized by lessons and sections as shown in
Figure 1. The model, which contains 77 geometry rules and 198 rules dealing with equation-
solving, is used for model-tracing and knowledge tracing. The purpose of model-tracing is to
monitor a student’s solutions and to provide feedback and hints. When the student enters a
value into the Table Tool, the tutor uses the model output as a standard to evaluate the
student’s answer. To provide hints, the tutor applies its production rule model to the current
state of problem-solving and displays the hint messages associated with the applicable rule
that has highest priority.
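The model-tracing cycle just described can be sketched as follows. The rule representation and names here are hypothetical, but the logic mirrors the description above: a student entry is flagged correct if some applicable rule derives that value, and hints come from the highest-priority applicable rule, growing more specific on repeated requests.

```python
# Sketch of one model-tracing cycle; rule structure and names are hypothetical.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    priority: int
    hints: list      # increasingly specific hint messages
    apply: callable  # state -> dict of quantity -> value ({} if not applicable)

def applicable(rules, state):
    """Rules that derive at least one still-unknown quantity from the state."""
    return [r for r in rules if any(q not in state for q in r.apply(state))]

def trace_entry(rules, state, quantity, value):
    """Flag the entry correct iff some applicable rule derives that value."""
    for r in applicable(rules, state):
        if r.apply(state).get(quantity) == value:
            state[quantity] = value
            return "correct"
    return "incorrect"

def next_hint(rules, state, hint_level):
    """Hint from the highest-priority applicable rule; repeats get more specific."""
    best = max(applicable(rules, state), key=lambda r: r.priority)
    return best.hints[min(hint_level, len(best.hints) - 1)]

# Example: a triangle-sum rule deriving the third angle from the other two.
triangle_sum = Rule(
    name="triangle-sum",
    priority=1,
    hints=["The angles of a triangle sum to 180.",
           "Subtract the two known angles from 180."],
    apply=lambda s: ({"angle3": 180 - s["angle1"] - s["angle2"]}
                     if "angle1" in s and "angle2" in s and "angle3" not in s
                     else {}),
)

state = {"angle1": 35, "angle2": 35}
print(next_hint([triangle_sum], state, 0))
print(trace_entry([triangle_sum], state, "angle3", 110))
```

Once the entry is accepted, the derived quantity becomes part of the problem state, so subsequent rules can build on it, which is how the model follows a student through a multi-step solution.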
Fig. 2. The PACT Geometry Tutor provides tools and intelligent assistance
[Fig. 3 diagram: test question with a given angle of 35°.]

∠1. Correct: “triangle sum, isosceles triangle”; “third # to the sum of triangle”; “you
subtract 180 - 35 - 35 and get 110”. Incorrect: “only angle besides congruent angles;
opposite congruent sides”; “linear pair”; “alt. interior angles are congruent”.

∠2. Correct: “vertical pair of angles”; “it is opposite ∠1”; “corresponding angles are
congruent”. Incorrect: “because it is a linear pair with 1”.

∠3. Correct: “Opposite Angles of a Parallelogram are equal”; “Interior angles on same
side of transversal are supplementary”; “all the lines are parallel so there will be 2 pairs of
equal angles”. Incorrect: “parallel lines --> Alt. Int. angles are congruent”; “ANG 2 & 3
are CONG cause of parallel sides”; “same as m∠2”.

Fig. 3. Sample test question with correct answers and reasons (top) plus a sample of
reasons given by students for correct answers on the post-test.
Each test involved four multi-step geometry problems in which students were asked to
calculate certain measures in a diagram and also to state reasons for their answers, in
terms of geometry theorems and definitions. Students were given a sheet listing relevant
definitions and theorems and were told that they could use it freely. We used two
different test forms, each given randomly to about half the students on the pre-test and
post-test, to counterbalance for test-form difficulty. An example question is
shown in Figure 3, together with correct and incorrect reasons that students gave for correct
numeric answers (e.g., correct angle measures), when they took the post-test.
The criterion for grading the reasons was whether students were able to justify their
answers in terms of geometry definitions and theorems, possibly stated in their own words.
For example, to calculate the measure of angle 1, one needs to apply the isosceles triangle
theorem and the triangle sum rule, as shown in Figure 3, first correct reason for angle 1.
Since the grading was lenient, listing only one of the two rules was deemed correct, as can be
seen in the second correct reason for angle 1. Even a procedural description which did not
mention any geometry rules was deemed (borderline) correct, as in the third correct reason for
angle 1. Some of the incorrect reasons, likewise, were more incorrect than others: some
are very close to being correct (e.g., the first incorrect reason for angle 1), some are
plainly wrong (e.g., the first incorrect reason for angle 2 mentions the wrong theorem),
and some are in between.
As shown in Figure 4, students’ test scores improved from pre-test to post-test. Numeric
answer-finding increased from an average of 0.74 on the pre-test to 0.86 on the post-test;
reason-giving improved from 0.43 on the pre-test to 0.60 on the post-test. A two-factor
ANOVA with test-time (pre vs. post) and action type (numeric answer vs. reason) as
within-subjects factors revealed significant main effects of both test-time (F(1, 70) = 39.1,
p < .0001) and action type (F(1, 70) = 191.4, p < .0001). This indicates that students
improved significantly from pre-test to post-test.

[Fig. 4. Proportion correct (student means) by test-time and action type.]
words are stored in memory both perceptually and verbally and these redundant codes make
retrieval of them more likely than equally frequent abstract words which are stored in memory
only in a verbal code. In our case, providing more practice in deliberate rule use should
complement students’ natural inclination toward perceptual encoding of geometric properties
and encourage students toward a second verbal encoding that can enhance the probability of
reliable recall. In both these arguments, we are not suggesting stamping out the use of
perceptual encodings and heuristics in geometric reasoning, but rather bolstering them with
complementary verbal-logical encodings.
to the current problem. The tutor’s production rule model has been extended to accommodate
these changes. In all other respects, the tutor is the same as the previous version.
Thus, the new tutor provides opportunities to work with English descriptions of the rules
and to some extent forces students to do so. We have begun pilot-testing the new tutor.
Initial reactions of two geometry teachers and a student were favorable. The student remarked
that he liked the tutor better than the original version, “because you have the reasons.”
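Checking a Glossary selection can reuse the same machinery that checks numeric answers: the tutor accepts a stated reason when it names a rule its model applied to derive the quantity. The sketch below is hypothetical; the Glossary entries and rule names are illustrative, not the tutor's actual data.

```python
# Hypothetical sketch of checking a Glossary reason against the model.
# Glossary entries and rule names are illustrative, not the tutor's actual data.
GLOSSARY = {
    "Triangle Sum": "The measures of the angles of a triangle sum to 180 degrees.",
    "Isosceles Triangle": "The base angles of an isosceles triangle are congruent.",
    "Vertical Angles": "Vertical angles are congruent.",
    "Linear Pair": "The angles of a linear pair are supplementary.",
}

def check_reason(selected: str, rules_applied: set) -> bool:
    """Accept a reason iff it names one of the rules the model used for this step."""
    return selected in GLOSSARY and selected in rules_applied

# Angle 1 in Figure 3 is derived via triangle sum and the isosceles triangle theorem:
rules_for_angle1 = {"Triangle Sum", "Isosceles Triangle"}
print(check_reason("Triangle Sum", rules_for_angle1))  # True: matches an applied rule
print(check_reason("Linear Pair", rules_for_angle1))   # False: wrong theorem
```

Because a quantity may be justified by more than one rule, the check accepts any member of the set of rules the model applied, mirroring the lenient grading criterion used on the paper tests.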
Acknowledgments
Ray Pelletier and Chang-Hsin Chang contributed to the development of the PACT Geometry tutor.
The PACT Geometry curriculum was developed by Jackie Snyder. The research is sponsored by the
Buhl Foundation, the Grable Foundation, the Howard Heinz Endowment, the Richard King Mellon
Foundation, and the Pittsburgh Foundation. We gratefully acknowledge their contributions.
References
Anderson, J. R., 1993. Rules of the Mind. Hillsdale, NJ: Lawrence Erlbaum.
Anderson, J. R., A. T. Corbett, K. R. Koedinger, and R. Pelletier, 1995. Cognitive tutors: Lessons learned. The
Journal of the Learning Sciences, 4,167-207.
Anderson, J. R., and R. Pelletier, 1991. A development system for model-tracing tutors. In Proceedings of the
International Conference of the Learning Sciences, 1-8. Evanston, IL.
Burton, R. R., and J. S. Brown, 1982. An Investigation of Computer Coaching for Informal Learning Activities. In
D. H. Sleeman and J. S. Brown (eds.), Intelligent Tutoring Systems, 79-98. New York: Academic Press.
Chi, M. T. H., P. J. Feltovich, and R. Glaser, 1981. Categorization and representation of physics problems by
experts and novices. Cognitive Science, 5, 121-152.
Corbett, A. T., and J. R. Anderson, 1995. Knowledge tracing: Modeling the acquisition of procedural knowledge.
User Modeling and User-Adapted Interaction, 4, 253-278.
Holland, J. H., K. J. Holyoak, R. E. Nisbett, and P. R. Thagard, 1986. Induction: Processes of Inference, Learning,
and Discovery. Cambridge, MA: The MIT Press.
Koedinger, K. R., and J. R. Anderson, 1990. Abstract planning and perceptual chunks: Elements of expertise in
geometry. Cognitive Science, 14, 511-550.
Koedinger, K. R., and J. R. Anderson, 1993. Reifying implicit planning in geometry. In S. Lajoie and S. Derry
(eds.), Computers as Cognitive Tools, 15-45. Hillsdale, NJ: Erlbaum.
Miller, C. S., J. F. Lehman, and K. R. Koedinger, 1997. Goals and learning in microworlds. Submitted to Cognitive
Science.
NCTM, 1989. Curriculum and Evaluation Standards for School Mathematics. National Council of Teachers of
Mathematics. Reston, VA: The Council.
Paivio, A., 1971. Imagery and verbal processes. New York: Holt, Rinehart, and Winston.
Ritter, S., and K. R. Koedinger, 1997. An architecture for plug-in tutoring agents. Journal of Artificial Intelligence
in Education, 7 (3/4), 315-347. Charlottesville, VA: AACE.
Ritter, S., and J. R. Anderson, 1995. Calculation and strategy in the equation solving tutor. In Proceedings of the
Seventeenth Annual Conference of the Cognitive Science Society, 413-418. Hillsdale, NJ: Erlbaum.