

Evaluating the Educational Impact of Visualization
Report of the Working Group on “Evaluating the Educational Impact of Visualization”

Thomas Naps (co-chair), U Wisconsin Oshkosh, naps@uwosh.edu
Guido Rößling (co-chair), TU Darmstadt, Germany, roessling@acm.org
Jay Anderson, Franklin & Marshall College, jay.anderson@fandm.edu
Stephen Cooper, St. Joseph's University, scooper@sju.edu
Wanda Dann, Ithaca College, wpdann@ithaca.edu
Rudolf Fleischer, Hong Kong U Sc. & Techn., rudolf@cs.ust.hk
Boris Koldehofe, Chalmers U Techn., Sweden, khofer@cs.chalmers.se
Ari Korhonen, Helsinki U Techn., Finland, archie@cs.hut.fi
Marja Kuittinen, Joensuu University, Finland, marja@cs.joensuu.fi
Charles Leska, Randolph-Macon College, cleska@rmc.edu
Lauri Malmi, Helsinki U Techn., Finland, lma@cs.hut.fi
Myles McNally, Alma College, mcnally@alma.edu
Jarmo Rantakokko, Uppsala U, Sweden, jarmo@tdb.uu.se
Rockford J. Ross, Montana State University, ross@coe.montana.edu

ABSTRACT

The educational impact of visualization depends not only on how well students learn when they use it, but also on how widely it is used by instructors. Instructors believe that visualization helps students learn. The integration of visualization techniques in classroom instruction, however, has fallen far short of its potential. This paper considers this disconnect, identifying its cause in a failure to understand the needs of a key member in the hierarchy of stakeholders, namely the instructor. We describe these needs and offer guidelines for both the effective deployment of visualizations and the evaluation of instructor satisfaction. We then consider different forms of evaluation and the impact of student learning styles on learner outcomes.

Categories and Subject Descriptors

K.3.2 [Computers and Education]: Computer & Information Science Education – Computer Science Education

General Terms

Algorithms

Keywords

Visualization, Animation, Pedagogy

1. INTRODUCTION

The educational impact of visualization includes two components: the enhancement of learning with visualization, and the deployment of visualization in the classroom. Previous surveys [35] show a significant disconnect between the intuitive belief that visualization enhances a student's learning and the willingness and ability of instructors to deploy visualization in their classrooms. A key impediment to the adoption of visualizations by instructors is the time required to learn, install, and develop visualizations and then integrate them into a course. Additionally, there is also a perceived lack of effective visualization development tools and software.

Whereas studies have begun to show the conditions under which visualization enhances student learning [35], the overall educational impact of visualization is and will be minimal until more instructors are induced to integrate visualization techniques in their classes. By continuing to explore the effect of visualization on student learning, and by overcoming the impediments to the deployment of visualization, we expect to raise the positive impact of visualization in computer science education.
Understanding the disappointing integration of visualization techniques in the educational process requires that we identify the stakeholders in the instructional use of visualization. We recognize four different roles associated with visualization, similar to Price et al. [39]:

• Visualization Tool Developers develop tools for visualizing contents, such as algorithms, data structures or program execution. Such tools may provide visualizations directly, such as Jeliot 2000 [32], or may be meta tools that allow others to design desired visualizations, for example JAWAA [2] or Animal [43].

• Visualization Designers specify the mapping from the abstract concepts to their visual representation by creating specific visualizations – either by using visualization meta tools or by other more direct methods.

• Instructors incorporate visualizations into their teaching materials and methodology.

• Learners view and hopefully interact with the visualization in one or more of the engagement levels described in [35].

An individual may take on many of these roles. For example, instructors will often also act as visualization designers to generate their own specialized visualizations.

[Figure 1 (not reproduced): boxes labeled Vis. Tool Developer, Concept, V Tool, Visualization Designer, Visualization, Instructor, and Learner, connected by arrows.]
Figure 1: Schematic View of User Roles.

Figure 1 shows the different roles and how they can interact. Each role has specific expectations of visualizations. To gain more widespread usage, visualization tool developers are interested in optimizing their tool for the other three roles. Visualization designers strive to design visualizations that are valuable to a large audience. Instructors want to be able to integrate visualizations into their course materials to make both their teaching more satisfying and improve student motivation and learning. Students hopefully learn the concepts better or in a way that is "more fun" because of the visualizations.

Given this analysis of the stakeholders in instructional use of visualization, the reason for visualization's lack of impact begins to emerge. Visualization research has focused on the developer and designer while research in CS education has focused on the effectiveness of visualization to improve student learning. In contrast, virtually no research has focused on the needs of the instructor. As a result, support for deploying visualizations in the classroom is nearly always lacking.

Instructors face substantial impediments when they try to use visualization in their teaching. Based on a survey of SIGCSE members done by the 2002 Working Group [35], the top five impediments (listed by percentage of responses) are:

• 93%: time required to search for good examples

• 90%: time it takes to learn the new tools

• 90%: time it takes to develop visualizations

• 83%: lack of effective development tools

• 79%: time it takes to adapt visualizations to teaching approach and/or course content

Clearly, the amount of time involved in learning how to use visualization tools and in developing demonstrations and interactive lab materials discourages the use of visualization. Our thesis is that these impediments can be overcome by providing high quality support materials for instructors. The availability of good support materials will lead to increased instructor satisfaction, which consequently will lead to more widespread usage of visualization.

Working from this basic thesis, Sections 2 and 3 will offer guidance for enhancing the effective deployment of visualization tools in the classroom. Since impacting CS education requires both widespread use and improved student outcomes, such guidance will be broken down along these two lines. Section 2 will focus on the instructor satisfaction issues, and Section 3 will cover measurement of student outcomes. Section 2 will be of particular interest to developers and designers who seek feedback on how effective their tool is in teaching situations.

In future years, the combined results emanating from the evaluation techniques described will provide a more informative measure of the two aspects that combine to influence educational impact.

2. ADDRESSING INSTRUCTOR NEEDS

The introduction has made the case that visualization systems are perceived by instructors as being potentially beneficial to learning outcomes and motivation. However, such systems will see widespread use in the computer science curriculum only if instructors can be enticed to incorporate them into the fabric of their courses. This implies in turn that visualization tool developers must recognize and address the impediments instructors face in integrating visualizations into their teaching. In Section 2.1, we rely on the literature (see, for example, [7] and [33]) in providing a review of these impediments.

In Section 2.2, we offer advice to visualization system and tool designers for overcoming these impediments. Section 2.3 is devoted to techniques for making visualization software easy to locate and obtain. Finally, Section 2.4 explores ways of evaluating how successful a visualization system or tool is in meeting the needs of instructors.
2.1 Impediments Faced by Instructors

Without question, the main impediment to instructor adoption of visualizations for use in teaching and learning computer science concepts is time. This includes time to:

• Find
• Download
• Install
• Learn
• Develop visualizations (if the tool is one that assumes that the instructor will design his or her own visualizations via the tool)
• Adapt and integrate into a course
• Teach students to use visualizations
• Maintain and upgrade

Exacerbating this situation is the fact that this effort may all be done for just a couple of lectures and might need to be repeated for each concept to be visualized. Indeed, of the nine top impediments cited by instructors in the survey, six were time issues.

A second major impediment identified in the literature is platform dependence. If visualization systems are designed to run on a particular platform (for example, a PC with Windows), it precludes their use on another system (for example, a PC with Linux). Indeed, platform dependence has many more subtle nuances that a visualization tool designer must address (for example, version of operating system or browser used).

A third major impediment highlighted in the references is course integration issues. How easy is it to incorporate a visualization into the fabric of a course and to adapt it to the way the concepts are presented in the classroom and in the accompanying textbooks or other class resources? If the visualization does not integrate well into a course, it will most likely not be used.

Notice that although the course integration issue is highlighted separately in the literature, it is actually captured in the time impediment list as well, as adaptation and integration of a visualization system into a course are rightly also identified as substantial time sinks.

2.2 Advice

Visualization systems run a wide gamut. At the two extremes are:

• Standalone, single purpose, undocumented, platform dependent visualization systems that do not engage the student beyond passive viewing

• Complete teaching and learning resources that incorporate visualizations seamlessly and become an integral resource used for an entire course

Many visualizations have fallen into the first category and are precisely the ones that lead to instructor frustration as measured in the survey. They should be avoided by designers of such systems at all costs. Systems like this, while they may be usable by the designer in a local course for a specific purpose, are sure to not gain widespread use.

Hypertextbooks are an example of the other end of the spectrum. They are envisioned to be complete teaching and learning resources that complement—or indeed supplant—traditional course teaching and learning resources (for example, textbooks). Hypertextbooks may include standard text, illustrations, pictures, video clips (for example, QuickTime movies), audio, and various paths through the material based on different learning needs. Most importantly, they would also include embedded visualization applets of key concepts and models that engage the learner in active learning of the subject. In this case, since the hypertextbook is the course teaching and learning resource, and since the hypertextbook runs in standard browsers (and is likely distributed on CD or DVD), the issues of finding, downloading, installing, adapting and integrating into a course, and maintaining and upgrading are moot. Since the visualization applets are part of the fabric of the hypertextbook, they will be used quite naturally by instructors and students alike. The applets themselves must be designed so that they are easy to learn by both instructor and students – issues we discuss below. More can be read about hypertextbooks in [6, 40, 41, 27].

Between these two extremes lie a variety of other possibilities for visualization systems and visualization tool development that address many of the impediments listed earlier. It should be noted that these suggestions are not mutually exclusive. Many can be combined to address more of the impediments. We provide some suggestions next.

Design for platform independence. This will, obviously, eliminate the impediment of platform dependence. Platform independence is an elusive and likely unattainable goal in its ideal sense, but there are some choices that are better than others. For example, designing systems for the Web (or, more precisely, the Java Virtual Machine) is one possibility. An alternative is to ensure that a visualization system runs on all of the major platforms likely to be available in academic settings around the world. Some visualization systems designed in this manner have come to untimely ends. Those likely to be successful are those that are based on widely accepted and standardized software tools that themselves have been ported to many platforms, such as OpenGL for graphics.

Capture larger concepts. This will ameliorate the time impediments of searching for, downloading, installing, and learning a new tool for each new concept. A visualization system is likely to be more widely used if it allows for visualizations of an entire "module" of related concepts, such as all of searching and sorting rather than just a single algorithm or two. Furthermore, the treatment of each visualized concept will have a similar "look and feel" that allows instructors and students to focus on learning concepts instead of learning how to use a tool. For example, systems like Alice [9] provide a resource around which a course can be designed and that can be used for most, if not all, of a course.
Map to existing teaching and learning resources. Extending the previous observation, providing a visualization package that corresponds to an existing resource (for example, a textbook) will make the package more appealing to users of that textbook. If done well, the seamless integration of the book with the visualization system will also eliminate most of the time required to adapt and integrate a visualization system into a particular course. It has the drawback, of course, of being primarily useful to those who choose to use that textbook. Integrating the system with text materials can further improve the adoption of the system. For example, Brown et al. state that a large part of the success of the BALSA system was due to its tight integration with the teaching materials [?].

Design for flexibility. One can make a visualization tool so flexible that it can easily be adapted to many different presentation styles and course resources (for example, textbooks). This is a worthy goal but, of course, more difficult to achieve. For example, there are many different implementation nuances that affect how a particular algorithm (for example, a Quicksort algorithm) actually runs. Virtually every textbook provides a different version. Adapting a given visualization of an algorithm to the precise way in which the algorithm is presented in the textbook may be difficult or time-consuming. The discrepancy in content may lead to confusion on the part of students. Making adaptation easy due to system flexibility can therefore play a key role in a successful adoption of a visualization system.

Provide comprehensive, integrated support. To eliminate the frustrating time impediments of learning a visualization resource and teaching students to use it, a comprehensive support structure should be part of the tool. This support should include a very carefully designed GUI that is novice-friendly and easily learned. Documentation on the use of the tool as well as tutorials illustrating its use should also be part of the software. The entire support structure should be refined based on feedback from the user community, both learners and instructors.

Develop a supporting Web site. A carefully designed Web site for a visualization system can do much to address the time impediments that frustrate instructors. Choose a clever name for the visualization tool that is catchy, informative, and will be easily found in a Web search. A Web site that provides a place to download the visualization software should include many other things as well, such as sample lectures, exercises, and PDF documents for hard-copy instructions that can be used by students who are learning to use the system. Community forums and FAQs that allow users of the tool to interact with other instructors who are using the system also make adoption of the system more likely. A good Web site is, in fact, such an important aspect of this discussion that we elaborate on it in Sections 2.3 and 2.4.

Register the tool in repositories. To help overcome the impediment of time to find visualizations on the Web, the systems should be registered in relevant repositories. Unfortunately, there is currently no single authoritative repository for registering visualization tools. However, there are various competing repositories or link collections that can be accessed when searching for visualization tools or content. These include the Complete Collection of Algorithm Animations [7], the forthcoming Algorithm Animation Repository [10], CITIDEL [8], a prototypical repository of visualization resources [42] as well as the "SIGCSE Education Links" on the ACM SIGCSE Web site [1].

Publicize. It is probably safe to say that most instructors do not actively seek visualization systems to use in their courses. This could be because they are satisfied with their way of teaching, they are unaware of visualizations in general, or they have tried visualizations in their courses before without success. It is mandatory that, in addition to simply registering their work in a repository, visualization tool designers publicize their work in venues such as the annual SIGCSE symposium (through papers or posters), in general educational journals such as Inroads and in area-specific journals (for example, if the tool visualizes aspects of the theory of computing, in relevant theory journals), and any other appropriate media.

The above advice is not meant to be comprehensive, but rather illustrative. Once the impediments are known and some examples of ways to surmount these impediments have been discussed, we are certain that visualization designers will become more adept at providing the community with systems that address the issues that have slowed the widespread use of the tools and thus promote the use of visualizations in a positive way throughout the curriculum.

2.3 Disseminating Visualization Tools

How a visualization tool designer disseminates a system plays an important role in how widely the system will be used. In this section, we present a suggested outline of a standard Web site for this purpose. The site should make it easy for Web surfers to find the tool, learn about it, download it, and install it. The site should further provide a mechanism for obtaining feedback from those who choose to download the system. Feedback is used to measure the level of satisfaction of those who use it. In what follows, the word "tool" may mean either software for designing visualizations (examples: Animal, Alice, Matrix, . . .), or a collection of pre-prepared visualizations (examples: QuickTime movies or Java applets).

The portal. Acknowledging principles of good Web page design (see, for example, [36, 37]), we recommend that the entry page, or portal, to the Web site be attractively designed and that it provide clear information describing:

• The name of the tool
• Author contact information
  – Names
  – E-mail addresses
  – Institutions
• A short, clear description of the tool that will let visitors know whether the system is of interest (so that they can decide whether to investigate further or abandon this particular search)
• Other pages that provide in-depth details for interested visitors, including:
  – A detailed description of the tool
  – Documentation on the use of the tool
  – Supporting materials for the instructor
  – Evaluation instruments and results of prior evaluations
  – Download information

We elaborate further on the links listed above.

The description page. The detailed description page should provide:

• A comprehensive description of the tool and its use
• The levels of targeted learners
• References to the algorithms or concepts being visualized so that instructors can determine whether the visualization integrates with their way of teaching
• Further links to any existing publications describing the tool and its use

The documentation page. The documentation page should provide:

• Documentation on how to use and install the tool
• A statement about whether the tool is still maintained
• A printable tutorial for students to use when learning to use the tool

The support page. The support page should provide, where possible and appropriate:

• Suggestions on the use of the tool
• Lecture support material (such as PowerPoint slides)
• Sample exams and quizzes
• A set of exercises for use with the tool

The evaluation page. The purpose of the evaluation page is twofold: (1) to provide feedback to the tool designer on the level of instructor and student satisfaction with regard to use of the tool, and (2) to provide visitors with results of earlier evaluations. As the tool matures, it may even be possible to include formal studies on the effects of the tool on student learning, but in this section we just provide suggestions for measuring instructor and student satisfaction with an eye towards making the tool more enticing.

Thus, what we suggest is that this page include links to online statistics-gathering instruments for:

• Obtaining feedback from instructors who download and use the system
• Obtaining feedback from students who use the system

This information is so vital to the ongoing success of the tool that we devote Section 2.4 to an elaboration of possible instruments for these items.

The download page. This is a crucial page. It should not only supply an easy way to download the tool, but it should also be designed to elicit information from those who download. The first part of this page should be a genuine plea to the person downloading the tool that asks for help with the project. It should be clearly explained that this is an academic (rather than commercial) project and that the continued success and improvement of the tool depends crucially on voluntary feedback from the user community. The download process should thus have a mandatory registration procedure that requests information about:

• E-mail and other contact information
• The background of the person downloading the software (instructor, student, or other)
• The purpose of the download (for use in a course, independent learning, simple inquisitiveness)
• The person's willingness to receive future e-mails about the use of the tool (along with a clear statement that contact information will not be used for other purposes)

The information on this page should point to the evaluation instruments (see the next section) to provide the person with a clear idea of the kinds of information that might be requested in the future. It is also always good to provide a free-response box to allow the person to provide additional comments as well.
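As one concrete illustration of the registration data just described, the following minimal sketch shows how such a record might be represented on the server side. This is our own assumption about a reasonable shape for the data; the class, enum, and field names are illustrative and do not come from any particular tool's site.

    /** Illustrative sketch only: one registration record from a tool's download page. */
    public class DownloadRegistration {
        public enum Background { INSTRUCTOR, STUDENT, OTHER }
        public enum Purpose { COURSE_USE, INDEPENDENT_LEARNING, CURIOSITY }

        private final String email;           // e-mail and other contact information
        private final Background background;  // who is downloading the tool
        private final Purpose purpose;        // why the tool is being downloaded
        private final boolean mayContact;     // willingness to receive follow-up e-mails
        private final String comments;        // free-response box for additional comments

        public DownloadRegistration(String email, Background background, Purpose purpose,
                                    boolean mayContact, String comments) {
            // Contact information is the one mandatory field in this sketch.
            if (email == null || email.isBlank()) {
                throw new IllegalArgumentException("registration requires contact information");
            }
            this.email = email;
            this.background = background;
            this.purpose = purpose;
            this.mayContact = mayContact;
            this.comments = comments;
        }
    }

Keeping the record this small is deliberate: the page should ask for no more than the items listed above, so that registration does not itself become an impediment to downloading the tool.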
2.4 Evaluation

In this section, we propose sample items to include in evaluation instruments intended to measure instructor and student satisfaction with the tool. The purpose is to provide feedback to the tool designer that will allow modification of the tool to improve instructor and student satisfaction. These instruments will be filled out by instructors and students after the tool has been used. Thus, the evaluator needs to maintain contact with instructors after they have first downloaded the system.

2.4.1 Evaluation instruments

The evaluation instruments should be extremely easy to fill out so that as great a return as possible is obtained. They should be administered online and be automatically accumulated in a database belonging to the tool designer. Requests for written answers should be carefully thought out and not used excessively so as not to be too time-consuming for those filling out the form.

The evaluation instrument for instructors should include questions that obtain:

• The instructor name and contact information
• The content and level of the course in which the tool was used
• Course enrollment
• Assumed prerequisites

Scaled questions using the traditional Likert scale with values such as strongly disagree, disagree, neutral, agree, strongly agree include:
• The tool is easy to obtain
• The tool is easy to install
• The tool is easy for an instructor to use
• The tool is easy to show and teach to students
• The tool is easy for students to learn
• The tool works reliably
• The tool contributes to good learning outcomes

Multiple-choice questions which don't fit on a scale like the one above include:

• How did you learn about the tool (private communication, from a conference, from a Web site, from a Web search, in a book, . . .)?
• How often did you use the tool in the course (one or two times, regularly for part of the course, regularly throughout the course, . . .)?
• In what context did you use the tool (classroom presentation, closed lab exercises, open lab assignments, . . .)?
• In this context, was the use of the tool (required, optional)?
• How did students interact with the tool (watched, answered questions, provided input or parameters, designed their own visualizations, gave a presentation in which the tool played a role, . . .)?

A student evaluation instrument would ask different questions. It is hoped that the tool designer, upon contacting an instructor who downloaded the tool, could encourage the instructor to require students to fill out this separate student evaluation instrument on the tool Web site. Scaled questions for measuring student satisfaction include:

• I enjoy using the tool
• I feel I understand the concept better when using the tool
• The tool is easy to use
• The tool works reliably for me

Multiple-choice questions for students include:

• For any given assignment, how much time did you spend with the tool on average (about 5 minutes, about 10 minutes, about 15 minutes, about 30 minutes, about an hour, more than an hour, . . .)?
• How many exercises or assignments did you do with this tool?
• How did you use the tool (watched in class, used in lab, used in university work area, used on my own computer, . . .)?

The evaluation instrument, whether for instructor or student, should always provide an open field for additional comments at the end. Further, since an instructor may use visualizations in different ways in different courses, the instructor should be encouraged to complete an evaluation for each course. We assume that instructors who were willing to fill out the instruments would not mind being contacted again if clarification of responses is needed.

3. EVALUATION OF LEARNER OUTCOMES

Ultimately, the application of visualization techniques in the teaching and learning process will become widespread only if instructors have concrete evidence that student performance improves and/or that student interest and motivation in the subject are enhanced when visualizations are used.

Since we cannot directly measure the learning process, the focus of this section is on measuring learning outcomes and how student attitudes are affected by the use of visualization techniques. We provide suggestions for evaluating student learning outcomes and attitudes when visualization tools are employed. We also offer guidance for the visualization tool designer, the visualization designer, and instructors who desire to study the effects of using visualization tools in their courses. Throughout, we attempt to describe experiments that support the selection of tools for everyday teaching situations (hereafter TS) as well as experiments that are designed for more formal education research purposes (hereafter RP).

3.1 Different forms of evaluation

Summative evaluations are those that occur after students' use of the tool is completed for the study in question. Formative evaluations occur during the study and are meant to determine whether project-related objectives are being met as the study progresses.

3.1.1 Formative evaluations

Formative evaluations typically involve qualitative assessment. For more details about formative evaluations, see [17, 16]. This section discusses formative evaluations in general, but focuses on those formative evaluations we believe are particularly well-suited for visualization.

1. Student attention to a visualization
By studying in depth how the student uses the specific visualization, we can determine if the student is using it in ways the instructor intended. Possible implementation ideas include observations (where the student is watched, either directly or through a one-way mirror, performing a specified task) and eye-tracking cameras (where it can be determined on which parts of the visualization tool the student is focusing).

2. Time-on-task
This evaluation may be thought of as both formative and summative. The purpose is to keep track of how long the student spends working with the visualization tool in an assignment. The simplest approach is to have the tool record the time at startup and when it is shut down. A more detailed implementation involves the generation of a log by the visualization tool of all student interactions with the tool. While generating a log is more difficult to implement (and analyze), it does allow for a more detailed analysis of the student's interaction with the tool. This approach may be used in a formative manner. For example, a log would allow the instructor to see if the student is having difficulties using the tool. Such a formative evaluation allows the instructor to adjust lecture materials, or perhaps provide a modified tutorial on the visualization tool's use. (A minimal sketch of such an interaction log appears after this list of techniques.)
This is especially true in the case of virtual learning environments, where the learner may lack direct feedback. Here monitoring the overall student performance in a time-on-task sense is a bit more easily adopted. The feature was recently incorporated, for example, into the electronic textbook illustrated in [27].
Software that allows an instructor to watch the screen of a student at a remote workstation may also be used when the instructor wants to focus on one particular student's interaction with the visualization tool.

3. Intermediate student feedback
Anonymous surveys may be employed during the actual usage of the visualization tool in a class to get students' impressions of the tool. Students may be asked to solve a problem to gauge their comprehension, how much time they are spending using the tool, how much time they are spending on the course without involving the tool, their opinion of the materials, the lecture, the visualization tool itself, and so forth.
One way to improve the outcome of the survey is to compare the survey data to actual facts known from the tool (from a log automatically generated by the visualization tool, as discussed in item 2 above). Thus, the impression expressed by the students can be "verified" by comparing their opinions with actual test results of the performance. These are not absolute values but merely comparable with each other, for example to determine the learner's opinion on the tools versus other resources used in the course.
Another approach of interest in obtaining intermediate student feedback is the use of Classroom Assessment Techniques (CATs) [45]. While not directly related to the use of visualization tools in class per se, CATs appear to be a promising mechanism for obtaining ongoing student feedback, as well as helping to contextualize course content for the students.

4. Peer reviews of curricular materials
Peer review is a widely accepted technique for examining the content, construct, and criterion validity of instructional materials [3]. This is particularly valuable when the instructors' materials are evaluated by the visualization tool creator, and by other visualization experts in the field, as well as by "expert teachers" in the specific content area where the visualization is being used. The instructor may receive valuable feedback ensuring that proposed use of the visualization tool in class is pedagogically sound.

5. Student interviews
Interviewing a random subset of the students who have used the specific visualization tool and its associated materials can provide valuable feedback. By focusing on the students' experiences with the tool, the instructor can gain a detailed understanding of the students' comprehension, student attitudes towards the visualization tool and the subject being studied, students' comfort with using the tool, students' suggestions for tool/lecture improvements, and so forth. Interviewing may be done in an individual and/or a small group format.
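To make the time-on-task technique in item 2 above concrete, the following minimal sketch shows one way a visualization tool could record session start, shutdown, and intermediate interaction events in a tab-separated log file. It is our own illustration, not code from any of the systems cited in this report; the class, method, and event names are hypothetical.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;
    import java.time.Duration;
    import java.time.Instant;

    /** Minimal sketch of a per-session interaction log for time-on-task measurement. */
    public class InteractionLogger {
        private final Path logFile;
        private final String studentId;   // hypothetical identifier supplied by the tool
        private Instant sessionStart;

        public InteractionLogger(Path logFile, String studentId) {
            this.logFile = logFile;
            this.studentId = studentId;
        }

        /** Record the time the visualization tool is started. */
        public void startSession() throws IOException {
            sessionStart = Instant.now();
            append("SESSION_START");
        }

        /** Record any interaction event, e.g. "STEP_FORWARD" or "ANSWERED_QUESTION". */
        public void logEvent(String event) throws IOException {
            append(event);
        }

        /** Record shutdown and return the elapsed time-on-task for this session. */
        public Duration endSession() throws IOException {
            append("SESSION_END");
            return Duration.between(sessionStart, Instant.now());
        }

        private void append(String event) throws IOException {
            String line = Instant.now() + "\t" + studentId + "\t" + event + System.lineSeparator();
            Files.writeString(logFile, line, StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }

The same log serves both uses described above: the difference between the SESSION_START and SESSION_END timestamps gives a summative time-on-task figure, while the intermediate events can be scanned formatively for signs that a student is struggling with the tool.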
3.1.2 Summative evaluations

Summative evaluations summarize the effectiveness and/or results of the study after students have completed their use of the tool. Generally quantitative methods are used in summative evaluations. More details about summative evaluations may be found in [16]. Like the formative evaluation subsection, the focus here is on those summative evaluations most applicable to visualization.

1. Analysis of learner understanding using mental models
Evaluation of student outcomes using mental models is an analysis technique used primarily in formal education research studies. Since we have no analysis method based on mental models for visualizations, we describe an analysis method that has been developed for mental models used by programmers. It would be important to try to find out what constitutes good comprehension of algorithms (instead of programming in general). By knowing this we would have a solid basis for the evaluation of learning outcomes. One possibility for determining the quality of algorithm comprehension is to use similar study techniques as Pennington [38] did with programming.
Pennington [38] studied comprehension strategies in programming. Good [18] continued Pennington's work and developed a classification for analyzing novice program comprehension. Her classification has been applied by Good and Brna [19] and Sajaniemi and Kuittinen [44]. These studies describe the method and the procedure for analyzing the mental models of novice programmers. Thus, they can be used as examples of how to use mental models for evaluating learning outcomes.
Pennington had 40 expert programmers who were the top and bottom quartile comprehenders (20 each) from her previous study (see the details in [38]). The programmers had to make a modification to a program normally maintained by another programmer. After studying the program they were asked to write a summary of the program, respond to a set of comprehension questions and explain their answers. Then they were asked to make the modification and, after that, to summarize the program and respond to another set of comprehension questions. Answers to the questions as well as the program summaries were the basis for the measurement of programmers' comprehension, which then could be classified as good or poor comprehension based on the subjects' quartile.
The basic idea of Good's analysis method is that comprehension can be studied using program summaries which are analyzed using two different methods: information type analysis and object description analysis. According to Good and Brna [19], the information types classification is used to code summary statements on the basis of the information types they contain. The types include eleven categories: function, actions, operations, state-high, state-low, data, control, elaborate, meta, unclear, and incomplete. For a definition of these terms, refer to [19]. The object classification looks at the way in which objects are described. This classification has seven categories: program only, program, program – real-world, program – domain, domain, indirect reference, and unclear.
The information types described above help to divide information into low level information and high level information. High level types are function, action, state-high, and data, while low level types include operation, state-low, and control. Elaborate, meta, and unclear cannot be classified as either high or low level types.
There are two ways to use these analytical methods: students can be asked to write program summaries or they can be asked to answer very carefully designed questions. Program summaries are difficult to analyze and therefore these kinds of tasks may not be suitable in TS. For RP, the analysis method is very useful for finding what kind of effect different teaching methods may have on students' mental models. Asking questions is more suitable for TS because the questions can be designed to make the analysis of the answers easy.
Next we will discuss the design of the questions. Both high and low levels of information types should be used when asking questions, although the high level questions measure deeper understanding and therefore should be more valuable. It is possible that a student whose comprehension is on a high level cannot correctly answer the questions on a low level. This is not unusual and not necessarily bad, since high level comprehension replaces low level comprehension when learning is deep enough [38].
Object descriptions enable us to ask three kinds of questions: questions about the program code itself, questions about the domain of the program, and cross-reference questions in which the student needs to combine both the program code and the domain. Those students who can answer the cross-reference questions have a deep comprehension of the program while the others have only a surface level comprehension (that is, they probably recognize the code and/or the domain but may not understand the connection between them). From the evaluation point of view the best situation is when a student is able to answer all three kinds of questions. The next best is that a student can answer cross-reference questions (deep level) even though program or domain questions are not answered.

2. Analysis of learner understanding using levels in Bloom's taxonomy
Bloom's taxonomy [5] is another common tool used in formal education research studies. The taxonomy offers a way to systematically design test tasks for analyzing learner understanding. Six comprehension levels are provided, each of which characterizes an aspect of the learning process. The idea is to reduce the qualitative improvement in learning to discrete levels of comprehension and, moreover, to measurable scores. The six comprehension levels are knowledge, comprehension, application, analysis, synthesis, and evaluation. The experiment can be designed to measure the learner outcome at each of these levels, starting from knowledge and ending up with evaluation.
The 2002 Working Group report [35] provides more detailed examples of the different Bloom levels. The context used in that report is a course in data structures and algorithms.

3. Pre- and post-content tests
To help determine content mastery where a visualization tool is used, pre- and post-tests are particularly effective. Typically, the same test is used for both tests. The purpose of the pre-test is to determine the level of prior knowledge the student has. It is of utmost importance to ask the "right" questions. If in the case of a study designed for TS it is not possible to use identical exams for the pre- and post-tests, the questions on the pre-test may be a proper subset of those on the post-test.

4. Attitude survey
Attitude surveys are generally given before and after students' use of a visualization tool. They are used to determine changes in students' attitudes, confidence, motivation, and so forth as a result of their experience with the visualization. The two most widely accepted surveys for determining student attitudes towards computers and computer science, Francis [15], and Loyd and Gressard [33], are somewhat dated. These survey instruments are most appropriate for use in introductory classes. There is certainly a need for the development of a newer, more relevant, survey instrument to be tested and validated. Instructors of upper-level courses will need to create their own survey instruments. The difficulty is that the survey instrument itself will not be experimentally validated. A good attitude survey provides valuable feedback about the students' impression of the visualization tool, and is typically administered as a pre- and post-test.

5. Retention and attraction
Particularly for visualization tools used in lower-level courses, it is interesting to monitor the change in student retention in the computer science major and minor. For those tools that impact courses for non-majors, it is important to examine whether or not the use of the visualization helps to encourage students to become computer science majors or minors. Retention and attraction statistics are often combined with student attitude surveys to help gauge student reaction to their experiences with the visualization tool.

6. Grades
Student grades are a measure of student success in a course. While they are generally not as useful as pre- and post-tests to gauge student content mastery, grades are easy to collect and analyze.

7. Time-on-task
While described in the formative subsection, time-on-task may also be used as a summative measure. As an example, consider the following. A student's use of a visualization tool as part of an assignment can be timed. For example, if a student does an assignment in an environment where no clock is available, the student can be asked how much time elapsed while working on the assignment. If the student thinks that a smaller amount of time passed than actually did, this indicates that student's interest in using the visualization tool. See [21] or [4] for more details.
3.1.3 Which of these evaluation methods should be used?

This is, in general, a difficult question to answer. The first question to answer is whether the particular evaluation of a study is primarily for teaching or research purposes. A research study typically requires significantly more assessment than a study primarily interested in improving a teaching situation. Additionally, the specific visualization tool and its associated visualizations are key determinants of the specific evaluation strategies to be used. For several sample studies, see the case studies described in Section 4 of this paper.

3.2 Covariant factors

If we are going to set up a study where we compare different tools and assignments, we should recognize that learning style may affect the results. Consider as a trivial example a study with only two students, A and B, in which the results of two assignments given to both are compared. Suppose that in the first assignment, A gets a grade of 1 out of 5 and B gets a grade of 5 out of 5. In the second assignment, suppose A gets 5 out of 5 and B gets 1 out of 5. Obviously, the average results are the same, and we could claim that we found no difference in results for the assignments. Suppose, however, that A is a highly verbal and B a highly visual learner and that the assignments were visual and verbal, respectively. By considering the learning style as a covariant factor in the study, we would observe a major change in students' performance.

Several covariant factors are listed in last year's working group report [35]. Here we discuss in detail some factors as they pertain to evaluation.

1. Learning styles
Students have different styles of taking in and processing information. Thus, some students are more comfortable with reading text while others prefer grasping information from figures and visualizations. Other students like to process information actively through experimentation and observations while others prefer processing information in their minds through intuition and reflection.
It is important that we as teachers recognize these differences, since they very much affect how our students feel about studying and how well they succeed at various activities and tasks we design. A conflict often occurring in undergraduate education is that teachers explain the topic deductively, first concentrating on principles and theories, and then proceed to explain how these theories and principles are applied to analyzing phenomena and solving practical problems [12]. Most undergraduate students, however, seem to process information inductively, learning details first and then proceeding to understand principles. As an example, in introductory programming courses, most students have difficulties with syntax and semantic details and find it hard to understand the relevant principles in the background, even if they are explicitly explained.
Differences in learning styles can be expressed with learning models, which are general frameworks for characterizing different aspects of learners' activities. Examples of such models include Kolb's experiential learning [23] and the Felder-Silverman learning model [13]. In the following, we discuss the latter model in some more detail since it seems particularly suited to science education.
Felder and Silverman identify four dimensions of student behaviors, each of which has two extremes.
Sensory vs. Intuitive – what type of information does the student preferentially perceive. Sensing learners prefer collecting information by observation. They often like facts, working with details, and memorizing data. They also prefer a practical approach, experimentation and solving problems using standard methods. Intuitive learners like conceptual thinking, such as theories and principles, and grasping new concepts.
Visual vs. Verbal – how is the sensory information most effectively perceived. Visual learners better adopt visual information and verbal learners prefer information in written or spoken form.
Active vs. Reflective – how does the student prefer to process information. Active learners learn by trying things out and prefer interaction. Reflective learners prefer examining and manipulating information introspectively.
Sequential vs. Global – how does the student progress towards understanding. Sequential learners like to proceed linearly with small steps using the given instructions. Global learners like to get a holistic view of acquired knowledge.
Initially, Felder and Silverman also had a dimension inductive vs. deductive learners, but they decided to drop it since the model should not promote the existing conflict between deductive teachers and inductive learners [12].
Obviously, all these axes are continuous, that is, each student lies somewhere between the extremes. Their orientation can be evaluated by exposing the student to a simple questionnaire, such as presented in [22]. However, we note that orientation may change over time and may depend on the context in which students are working.
The general goal is that our students should extend their skills of adopting and processing information within all four axes. To accomplish this goal, the teacher may push the change by setting up activities that train the weaker side of a student's behavior.
Next we present a few examples of assignments that demonstrate how to request different types of activities in this context. The examples are presented in more detail in [30].
Consider an assignment that requests the user to trace how various binary tree traversal algorithms work, that is, list the order in which the nodes are traversed when a specific traversal algorithm is applied. A visual form of this exercise could include a picture of a binary tree, allowing the student to click the nodes on the screen in the appropriate order.
Alternatively, the assignment could be given in a verbal form by giving the tree as an adjacency list and asking the student to write the keys in the nodes as a list according to their traversal order. A sensing learner could solve the exercise by applying a simple mnemonic: a line is drawn around the tree starting from the left side of the root, and each key is listed when the corresponding node is passed on the left (preorder), from below (inorder), or on the right (postorder). An intuitive form of the exercise is that we give the tree and the pseudo code of the traversal algorithm and ask the student to list the nodes in the order the given algorithm visits them. (A short code sketch of these three traversal orders follows this list of factors.)

2. Student background surveys
Introductory student surveys are useful for obtaining background data (such as previous classes taken, previous experience, mathematics background, and general entrance exam scores) as well as information such as why they registered for this section of the course. This data is particularly useful in helping to determine whether the student's background is a factor in the student's success either in the particular course, or with the particular visualization tool.

3. Time-on-task
Time-on-task has already been discussed in the formative evaluation subsection. We have also included it as a covariant factor because it might be expected that increased time spent using visualization may be a factor in performance.
lation [29]. The learner receives immediate feedback
that increased time spent using visualization may be
around the clock on his or her performance. The sys-
a factor in performance.
tem has features that traditional instruction cannot
3.3 Testing with human subjects provide. Thus, the study was pursued to research the
We wish to issue a note of warning for those instructors quality, advantages and limitations of this novel ap-
planning to run studies in their classes. In many countries, proach. One of the conclusions was that the learning
approval of “human subjects review boards” is required for environment was as good as if they were solving the
all studies involving students. Written consent from stu- same exercises in class room with human tutors giving
dents involved in studies will also often be required. The the feedback. CP, SO-RP.
case studies in Section 4 provide examples of successful ap-
• Ari Korhonen and Lauri Malmi – This study [34] presents
plications that others have used in broaching this sensitive
some experiences and observations made during 10
area within their institutions. Such applications may serve
years of using the TRAKLA system [26]. It is in-
as useful templates for instructors to use for gaining “human
evitable that learners perform better when they are
subjects” approval at their own schools.
allowed to resubmit their work. However, this is not
the whole story; it is also important to organize the
4. CASE STUDIES course so that the skills and challenges of the learner
Many members of the working group have been and are coincide. Moreover, the grading policy seems to have a
involved in studies that demonstrate the efficacy of the guid- major impact on performance. The study summarizes
ance given in Sections 2 and 3. These are collected online the results and points out changes in learner perfor-
and provide concrete realizations of the principles that are mance under the several changes the course has gone
described in this report. The letter legends used indicate through during the time period. CP, SO-TS.
whether the study is completed and published (CP), com-
pleted but not yet published (CNP), in progress (IP), ori- • Ari Korhonen and Lauri Malmi – This study focuses
ented toward measuring instructor satisfaction (IS), oriented on the effect of different learner engagement levels on
toward measuring student outcomes with respect to every- learning. The ITiCSE 2002 working group report [35]
day teaching situations (SO-TS) or more formal education on visualization prepared a taxonomy of engagement
research (SO-RP). in order to identify and differentiate various types of
Presently, this list includes the following: actions the learners may perform while using visual-
izations. The plan for this study will appear at the
• Stephen Cooper and Wanda Dann – This study ex- Web site of the Computer Science Education Research
amines the use of program visualization for introduc- Group (http://www.cs.hut.fi/Research/COMPSER/)
ing objects and their behaviors using Alice [11], a 3D at Helsinki University of Technology. IP, SO-TS.
program visualization environment. Statistical data
is collected to show evidence of student performance • Marja Kuittinen and Jorma Sajaniemi – This study
and retention in CS1 for test and control groups. Early [31, 44] evaluates the use of variable roles and role-
summaries may be found in [9]. CNP, SO-TS, SO-RP. based animation in teaching introductory programming.
The role concept captures tacit expert knowledge in a form that can be taught to novices. Details about the project and its evaluation studies can be found at [47]. CP, SO-RP.

• Charles Leska – This study examines the impact of using visualization on student performance in a unit of a computer literacy course. The unit focuses on the binary representation of data and on the interplay of the processor components during the fetch-execute cycle. Details about the study can be found at [48]. IP, SO-TS.

• Scott Grissom, Myles McNally, and Tom Naps – This study compares the effect of three different learner engagement levels with visualization in a course module on introductory sorting algorithms. Results of the study have been published in [20]. CP, SO-TS.

• Jarmo Rantakokko – This study evaluates the effects of using algorithm visualization in parallel computing. The study focuses on student learning outcomes. Details about the project and this particular study can be found at [49]. IP, SO-RP.

The working group’s URL [46] contains updated information on these studies and on others that have been added since the preparation of this report.

5. CONCLUSION
This report has been based on the premise that visualization can significantly impact CS education only if two goals are met: widespread use and positive student outcomes. Visualization designers have typically not researched the factors that influence instructor satisfaction, and Section 2 of this report has offered guidance for beginning this process. Section 3 has provided an overview of student outcome measurement techniques, including approaches suitable for informal everyday teaching situations as well as for more formal education research studies. We hope this will help both visualization tool developers and visualization designers who are not specialists in such evaluation but who are nonetheless required to validate their products by demonstrating their effectiveness.

6. REFERENCES
[1] ACM Special Interest Group on Computer Science Education. SIGCSE Education Links. Available online at http://www.sigcse.org/topics, 2003.
[2] Akingbade, A., Finley, T., Jackson, D., Patel, P., and Rodger, S. H. JAWAA: Easy Web-Based Animation from CS 0 to Advanced CS Courses. In Proceedings of the 34th ACM SIGCSE Technical Symposium on Computer Science Education (SIGCSE 2003), Reno, Nevada (2003), ACM Press, New York, pp. 162–166.
[3] American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. Standards for Educational and Psychological Testing. American Educational Research Association, Washington, D.C., USA, 1999.
[4] Bederson, B. Interfaces for Staying in the Flow. Keynote at IEEE Human-Centric Computing Languages and Environments. Available online at http://www.cs.umd.edu/~bederson/talks/HCCKeynote-Sept2002.ppt (seen July 14, 2003).
[5] Bloom, B. S., and Krathwohl, D. R. Taxonomy of Educational Objectives; the Classification of Educational Goals, Handbook I: Cognitive Domain. Addison-Wesley, 1956.
[6] Boroni, C. M., Goosey, F. W., Grinder, M. T., and Ross, R. J. Engaging Students with Active Learning Resources: Hypertextbooks for the Web. In Proceedings of the 32nd ACM SIGCSE Technical Symposium on Computer Science Education (SIGCSE 2001), Charlotte, North Carolina (2001), ACM Press, New York, pp. 65–69.
[7] Brummund, P. The Complete Collection of Algorithm Animations, 1998. Available online at http://cs.hope.edu/~alganim/ccaa/ (seen July 14, 2003).
[8] CITIDEL - Computing and Information Technology Interactive Digital Educational Library. Available online at www.citidel.org (seen July 14, 2003).
[9] Cooper, S., Dann, W., and Pausch, R. Introduction to OO: Teaching Objects-First in Introductory Computer Science. In Proceedings of the 34th ACM SIGCSE Technical Symposium on Computer Science Education (SIGCSE 2003), Reno, Nevada (2003), ACM Press, New York, pp. 191–195.
[10] Crescenzi, P., Faltin, N., Fleischer, R., Hundhausen, C., Näher, S., Rößling, G., Stasko, J., and Sutinen, E. The Algorithm Animation Repository. In Proceedings of the Second International Program Visualization Workshop, HornstrupCentret, Denmark (2002), pp. 14–16. Available online at http://www.daimi.au.dk/PB/567/PB-567.pdf.
[11] Dann, W., Cooper, S., and Pausch, R. Making the Connection: Programming With Animated Small World. In Proceedings of the 5th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2000), Helsinki, Finland (2000), ACM Press, New York, pp. 41–44.
[12] Felder, R. M. Author’s preface – June 2002. Available online at http://www2.ncsu.edu/unity/lockers/users/f/felder/public/Papers/LS-1988.pdf (seen July 14, 2003), 2002.
[13] Felder, R. M., and Silverman, L. K. Learning Styles and Teaching Styles in Engineering Education. Engineering Education 78, 7 (1988), 674–681.
[14] Fleischer, R. COMP 272: Theory of Computation – A proposed study on the learning effectiveness of visualizations, 2003. Manuscript.
[15] Francis, L. Attitude towards Computers Scale. Computers in Education 20, 3 (1993), 251–255.
[16] Frechtling, J. The 2002 User Friendly Handbook for Project Evaluation. National Science Foundation, 2002.
[17] Frechtling, J., and Sharp, L. User-Friendly Handbook for Project Evaluation. National Science Foundation, 1997.
[18] Good, J. Programming Paradigms, Information Types and Graphical Representations: Empirical Investigations of Novice Program Comprehension.
PhD thesis, University of Edinburgh, 1999.
[19] Good, J., and Brna, P. Toward Authentic Measures of Program Comprehension. In Proceedings of the Fifteenth Annual Workshop of the Psychology of Programming Interest Group (PPIG 2003) (2003), pp. 29–49.
[20] Grissom, S., McNally, M., and Naps, T. L. Algorithm Visualization in Computer Science Education: Comparing Levels of Student Engagement. In Proceedings of the First ACM Symposium on Software Visualization, San Diego, California (2003), ACM Press, New York, pp. 87–94.
[21] Jenkins, J., and Visser, G. Making Learning Fun. Available online at http://www.imaginal.nl/learningFun.htm (seen July 14, 2003).
[22] Keirsey, D. M. Keirsey Temperament and Character Web Site. Available online at http://www.keirsey.com (seen July 14, 2003), 2002.
[23] Kolb, D. A., Ed. Experiential Learning: Experience as the Source of Learning and Development. Prentice-Hall Inc, New Jersey, USA, 1984.
[24] Koldehofe, B., Papatriantafilou, M., and Tsigas, P. LYDIAN, An Extensible Educational Animation Environment for Distributed Algorithms. In Proceedings of the 5th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2000) (July 2000), ACM Press, New York, p. 189.
[25] Koldehofe, B., Papatriantafilou, M., and Tsigas, P. Integrating a Simulation-Visualization Environment in a Basic Distributed Systems Course: A Case Study Using LYDIAN. In Proceedings of the 8th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2003) (June 2003), p. 226.
[26] Korhonen, A., and Malmi, L. Algorithm Simulation with Automatic Assessment. In Proceedings of the 5th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2000) (Helsinki, Finland, 2000), ACM Press, New York, pp. 160–163.
[27] Korhonen, A., Malmi, L., Mård, P., Salonen, H., and Silvasti, P. Electronic Course Material on Data Structures and Algorithms. In Proceedings of the Second Annual Finnish / Baltic Sea Conference on Computer Science Education (October 2002), pp. 16–20.
[28] Korhonen, A., Malmi, L., Myllyselkä, P., and Scheinin, P. Does it Make a Difference if Students Exercise on the Web or in the Classroom? In Proceedings of the 7th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2002) (Århus, Denmark, 2002), ACM Press, New York, pp. 121–124.
[29] Korhonen, A., Malmi, L., Nikander, J., and Silvasti, P. Algorithm Simulation – A Novel Way to Specify Algorithm Animations. In Proceedings of the Second International Program Visualization Workshop, HornstrupCentret, Denmark (2002), pp. 28–36. Available online at http://www.daimi.au.dk/PB/567/PB-567.pdf.
[30] Korhonen, A., Malmi, L., Nikander, J., and Tenhunen, P. Interaction and Feedback in Automatically Assessed Algorithm Simulation Exercises. Accepted for publication in Journal of Information Technology Education (2003).
[31] Kuittinen, M., and Sajaniemi, J. First Results of an Experiment on Using Roles of Variables in Teaching. In EASE & PPIG 2003, Papers of the Joint Conference at Keele University (2003), pp. 347–357.
[32] Levy, R. B.-B., Ben-Ari, M., and Uronen, P. A. An Extended Experiment with Jeliot 2000. In Proceedings of the First International Program Visualization Workshop, Porvoo, Finland (July 2001), University of Joensuu Press, Finland, pp. 131–140.
[33] Loyd, B. H., and Gressard, C. P. Computer Attitude Scale. Journal of Computing Research 15, 3 (1996), 241–259.
[34] Malmi, L., Korhonen, A., and Saikkonen, R. Experiences in Automatic Assessment on Mass Courses and Issues for Designing Virtual Courses. In Proceedings of the 7th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2002) (Århus, Denmark, 2002), ACM Press, New York, pp. 55–59.
[35] Naps, T. L., Rößling, G., Almstrum, V., Dann, W., Fleischer, R., Hundhausen, C., Korhonen, A., Malmi, L., McNally, M., Rodger, S., and Velázquez-Iturbide, J. Á. Exploring the Role of Visualization and Engagement in Computer Science Education. ACM SIGCSE Bulletin 35, 2 (June 2003), 131–152.
[36] Nielsen, J. Designing Web Usability. The Practice of Simplicity. New Riders Publishing, 1999.
[37] Nielsen, J. ”Top Ten Mistakes” Revisited Three Years Later. Available online at http://www.useit.com/alertbox/990502.html (seen July 14, 2003), May 1999.
[38] Pennington, N. Comprehension Strategies in Programming. In Empirical Studies of Programmers: Second Workshop (1987), G. M. Olson, S. Sheppard, and E. Soloway, Eds., Ablex Publishing Company, pp. 100–113.
[39] Price, B., Baecker, R., and Small, I. An Introduction to Software Visualization. In Software Visualization, J. Stasko, J. Domingue, M. H. Brown, and B. A. Price, Eds. MIT Press, 1998, ch. 1, pp. 3–27.
[40] Ross, R. J. Hypertextbooks for the Web. In Proceedings of the First International Program Visualization Workshop, Porvoo, Finland (July 2001), University of Joensuu Press, Finland, pp. 221–233.
[41] Ross, R. J., and Grinder, M. T. Hypertextbooks: Animated, Active Learning, Comprehensive Teaching and Learning Resources for the Web. In Software Visualization (2002), S. Diehl, Ed., no. 2269 in Lecture Notes in Computer Science, Springer, pp. 269–284.
[42] Rößling, G. Algorithm Animation Repository. Available online at http://www.animal.ahrgr.de/ (seen July 14, 2003), 2001.
[43] Rößling, G., and Freisleben, B. Animal: A System for Supporting Multiple Roles in Algorithm Animation. Journal of Visual Languages and Computing 13, 2 (2002), 341–354.
[44] Sajaniemi, J., and Kuittinen, M. An Experiment on Using Roles of Variables in Teaching Introductory Programming. Submitted to Empirical Software Engineering (2003).
[45] Schwarm, S., and VanDeGrift, T. Making Connections: Using Classroom Assessment to Elicit Students’ Prior Knowledge and Construction of Concepts. In Proceedings of the 8th Annual ACM SIGCSE/SIGCUE Conference on Innovation and Technology in Computer Science Education (ITiCSE 2003) (2003), ACM Press, New York, pp. 65–69.
[46] Visualization Working Group Home Page. Available online at http://www.algoanim.net, 2003.
[47] Kuittinen, M., and Sajaniemi, J. Home Page of the Study on Variable Roles and Role-Based Animation. Available online at http://cs.joensuu.fi/~saja/var_roles/, 2003.
[48] Leska, C. Homepage of the Visualization Study in Computer Literacy. Available online at http://faculty.rmc.edu/cleska/public_html/algvizstudy.htm, 2003.
[49] Rantakokko, J. Homepage of the Visualization Study in Parallel Computing. Available online at http://user.it.uu.se/~jarmo/hgur.html, 2003.