Table 1 (continued):
Labor Economics: correlation between the two multiple choice sections (c) = 0.63; correlation between the multiple choice and essays/problems sections (c) = 0.74; p-value (d) = 0.15
Principles of Micro, section 1: 41 12 12 24 21; correlation between the two multiple choice sections (c) = 0.39; correlation between the multiple choice and essays/problems sections (c) = 0.56; p-value (d) = 0.47
Principles of Micro, section 2: 33 12 13 26 24; correlation between the two multiple choice sections (c) = 0.52; correlation between the multiple choice and essays/problems sections (c) = 0.49; p-value (d) = 0.12
Notes:
a. The total number of multiple choice questions was 50 in the Labor Economics class and 40 in the Principles classes; they were evenly divided between new and previously answered questions.
b. The total possible score in the essays/problems section was 50 in the Labor Economics class and 40 in the Principles classes.
c. Correlation coefficients between individual scores in the two multiple choice sections.
d. The p-value is for a two-sample t-test for differences in mean values.
(lines a and b in the diagram; correlation coefficient 0.63), a two-sample t-test failed to reject the hypothesis of equal mean numbers of correct answers between the two groups of multiple choice questions (p-value = 0.15) (Table 1).
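For readers who wish to reproduce this kind of comparison with their own gradebook records, the sketch below illustrates a two-sample t-test of the kind reported in Table 1. It is a minimal illustration only: the score vectors and variable names are hypothetical placeholders, not the study's data, and it assumes the SciPy library is available.

```python
# Minimal sketch of a two-sample t-test comparing mean numbers of correct
# answers on new versus previously answered multiple choice questions.
# The data below are hypothetical placeholders, not the study's records.
import numpy as np
from scipy import stats

# Per-student counts of correct answers in each group of questions
new_questions = np.array([18, 20, 15, 22, 19, 17, 21, 16])
seen_questions = np.array([19, 21, 16, 23, 18, 18, 22, 17])

t_stat, p_value = stats.ttest_ind(new_questions, seen_questions, equal_var=False)
print(f"t = {t_stat:.2f}, p-value = {p_value:.2f}")
# A large p-value (such as the 0.15 reported in Table 1) means the null
# hypothesis of equal mean scores cannot be rejected.
```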
To further assess the results, the instructor compared the overall performance in the two combined multiple choice sections of the exam with records from classes taught in three previous academic years. Grading policy and exam difficulty were consistent across all years because the multiple choice questions had been extracted from the same textbook test bank and were either identical or tested the same concepts at the same difficulty level (except for the exam administered during the experiment, with half of the multiple choice questions extracted from the homework online quizzes). Figure 3 reflects this analysis.
Once again, t-tests were conducted to test for differences in mean scores, and the differences in mean performance across the different academic years were not statistically significant (all p-values greater than 0.26). The opportunity to preview and practice online the material covered in the text did not result in better performance: students did not show a significantly greater ability to answer correctly the multiple choice questions they had already been exposed to, and their overall performance in the exam did not improve.
Because students found out about the surprise component of the experiment (they realized that they had been tested on questions they had already answered as homework), it was not possible to replicate this experiment with other exams in the Labor Economics class. The same experiment was therefore conducted the following year, 2007, with two sections of a Principles of Microeconomics class.
[Figure 3: Correct answers to multiple choice questions (upper level class). Mean and median number of multiple choice questions answered correctly, by academic year (2003 to 2006).]
There were, however, also some dissenting voices, noting how online quizzes may be difficult to monitor and how quizzes do not really test the depth of students' understanding:
I did them each time; it was useful because it made you read the textbook.
It was helpful for me but if someone wanted to they could easily cheat on it. I don't think it's the best representation of what students understand.
I have completed every quiz assigned. I found it somewhat helpful because the midterm exam was based strongly on these types of questions. However, I tend to retain information more so when assigned analytical, essay type questions and problems. I did not find it very helpful in studying and understanding concepts because there is very little room for personal relevance. I do find online assignments useful because in any industry these tools are essential. Familiarity with technological application to assignments will better prepare us for a professional career.
When I did the online quizzes I simply looked at the question and tried to go back and find the answer in my book. No problem there. A problem arose, for me, when it came time to decide which of these questions were important or not. (Which ones should I study?) I could not decide. If I cannot decide which questions need more attention in order for me to grasp the concepts, then I will not know what to study for when it comes time for the final or midterm. I actually felt the questions asked were not important because they did not relate so much to the material which was taught in class, and being a student, I focused on what the professor focused on in class. I realize now that I am a fool for underestimating these questions.
In addition, only three out of the 14 survey respondents reported that they used
the additional online textbook resources, and it was always just to read the
summaries of the textbook chapters.
Discussion and conclusions
This study aims to assess the value of online textbook resources. In particular, it focuses on the learning that occurs through the use of online textbook quizzes. It makes use primarily of records collected in a Labor Economics class. However, because of the small number of students observed in this upper level class, a problem that is often encountered in analyses that explore students' learning in web-enhanced or intermediate courses (Anakwe, 2008; Dahlgran, 2008; Mukherjee and Cox, 2001; Smolira, 2008), this study's main results were also tested on two larger sections of an introductory level Principles of Microeconomics class.
This produced a total of 74 additional students. Again, students were previously assigned online quizzes as homework, which could be found on the textbook website. Without students knowing, some of these quizzes were then included in the multiple choice section of a mid-term exam. Table 1 compares the results of this experiment across the different classes. This time, the average number of correct answers was actually the same or higher for the new multiple choice questions, but again the mean score was not statistically different from that for the previously answered online quizzes. Compared to the upper level class, however, I found lower correlation coefficients between the two sets of multiple choice questions (0.39 and 0.52) and between the whole multiple choice section and the problems and essay sections of the exam (0.56 and 0.49).
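As a companion to the correlation coefficients reported above, the following sketch shows how such coefficients can be computed from per-student scores. It is again an illustration only: the score vectors are hypothetical placeholders, not the study's records.

```python
# Sketch of the Pearson correlation between individual scores in the two
# multiple choice sections (the coefficients labelled c in Table 1).
# The score vectors are hypothetical placeholders for illustration only.
import numpy as np
from scipy import stats

seen_scores = np.array([12, 15, 9, 18, 14, 11, 16, 13])   # previously answered questions
new_scores = np.array([11, 16, 10, 17, 15, 10, 18, 12])   # new questions

r, p = stats.pearsonr(seen_scores, new_scores)
print(f"correlation coefficient = {r:.2f} (p-value = {p:.2f})")
```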
These results lead to the conclusion that while multiple choice tests are reasonable predictors of students' overall performance, homework consisting of online quizzes did not have any significant effect on students' performance: most students who did well in the exam did so regardless of whether or not they had already encountered and answered some of the exam questions in their online homework.
Students' feedback
A possible explanation of these results could be that students had not really applied themselves seriously to the solutions of the homework online quizzes. After all, they knew that what counted was their participation in this homework and not their final performance on it: the instructor had decided not to make the performance on the online quizzes a component of the grade because the textbook website technology permitted students to revise their answers before submitting them. A different textbook website could have been smarter in this regard, but it still would not have solved the problem of monitoring whether students were answering online quizzes on their own or with other people's help. Had this been the case, then online quizzes would have represented neither a valid performance measure nor a real learning moment for students. Surprisingly, however, students' assertions suggested the opposite. The instructor had decided to administer an open-response survey at mid-semester. The goal was to better understand the use that students had made of the textbook online resources. Specific questions were asked about the textbook online quizzes. 82% of the students returned the survey. Their responses could be summarized in this quote:
I did the online quizzes when they were assigned. I found them helpful for pulling out the key points in the chapter. I would go back and reread the sections in the book that contained the answers for the questions. I know if I had gone back over the questions again before the test I would have done better. I didn't mind these assignments; they are easy enough to get to and use.
2001; Buckles and Siegfried, 2006), although this study confirms that multiple choice tests are reasonable predictors of students' overall performance (Walstad and Becker, 1994; Mukherjee and Cox, 2001; Anakwe, 2008). But multiple choice tests are unlikely to disappear (Schaur et al., 2008), given the heavy teaching load faced by faculty in many academic institutions. These are often the same schools where, given students' socio-economic status and conflicting time demands, online work may legitimately be seen as a tool to increase students' learning and retention. The results of this study confirm that web homework increases students' participation, but that textbook online multiple choice quizzes do not represent the best investment for time spent online. They are not the useful learning tools that textbook publishers and, even more so, instructors would like them to be. Students need more comprehensive and interactive online assignments.
References
Agarwal, R. and Day, A. E. (1998) The impact of the internet on economic education, Journal of Economic Education, Vol.29, pp. 99–110.
Anakwe, B. (2008) Comparison of student performance in paper-based versus computer-based testing, Journal of Education for Business, Vol.84, pp. 13–17.
Becker, W. E. and Johnston, C. (1999) The relationship between multiple choice and essay response questions in assessing economics understanding, Economic Record, Vol.75, pp. 348–357.
Becker, W. and Watts, M. (2001) Teaching economics at the start of the 21st century: Still chalk-and-talk? American Economic Review, Vol.91, pp. 446–451.
Biktimirov, E. N. and Klassen, K. J. (2008) Relationship between use of online support materials and student performance in an introductory finance course, Journal of Education for Business, Vol.83, pp. 153–158.
Brown, B. W. and Liedholm, C. E. (2002) Can web courses replace the classroom in principles of microeconomics? American Economic Review, Vol.92, pp. 444–448.
Buckles, S. and Siegfried, J. J. (2006) Using multiple-choice questions to evaluate in-depth learning of economics, Journal of Economic Education, Vol.37, pp. 48–57.
Chickering, A. and Ehrmann, S. (1996) Implementing the seven principles: Technology as lever, AAHE Bulletin, October, pp. 3–6.
Dahlgran, R. A. (2008) Online homework for agricultural economics instruction: Frankenstein's monster or robo TA? Journal of Agricultural and Applied Economics, Vol.40, pp. 105–116.
Devadoss, S. and Foltz, J. (1996) Evaluation of factors influencing student class attendance and performance, American Journal of Agricultural Economics, Vol.78, pp. 499–507.
Dufresne, R., Mestre, J., Hart, D. M. and Rath, K. A. (2002) The effect of web-based homework on test performance in large enrollment introductory physics courses, Journal of Computers in Mathematics and Science Teaching, Vol.21, pp. 229–251.
Elliott, C. (2003) Using a personal response system in economics teaching, International Review of Economics Education, Vol.1, pp. 80–86.
The use of textbook resources was part of the more general implementation of a course website. The introduction of the course website by itself had a very positive effect in increasing students' commitment to their coursework. The instructor noticed a remarkable increase both in terms of completion of the required assignments and in terms of the use students made of their textbook and of additional written resources. However, participation in online textbook assignments did not make any significant difference in the students' ability to score higher grades on their written exams. This happened despite the fact that the exam included several quizzes that had been previously assigned as online homework. The analysis of qualitative data collected through an open-response survey, however, suggested that many students found the online homework to be quite a useful study tool, feedback similar to what has been documented in other studies (Tse et al., 2007; Smolira, 2008).
These results could possibly be explained by the fact that, because of the features of the textbook online quizzes (the website technology permitted students to check and revise their answers before submitting them), the instructor had decided to monitor and reward participation in online homework, but not the scores. It is therefore possible that many students did not review their computerized homework quizzes for errors. The effort of going over homework quizzes has been found to positively influence performance (Johnson et al., 2002), although there is also mixed evidence about the benefits of grading homework, giving feedback and providing access to homework solutions (Peters et al., 2002; Biktimirov and Klassen, 2008; Hadsell, 2009). In addition, research has indeed shown that when unsupervised online quizzes are tied to incentives for participation (e.g. grades) students may be induced to cheat (Kibble, 2007; Passow et al., 2006).
A different or additional explanation could be, however, that the large majority of observed students were heavily involved in work activities outside of school: 61% of students enrolled in the upper level class had responded that they were registered as full-time students but also employed for more than 36 hours per week. This type of student is known to face difficulties with school work (Devadoss and Foltz, 1996; Kirby and McElroy, 2003; Stinebrickner and Stinebrickner, 2003), and for them classroom time constitutes the main learning component of their college experience. The introduction of a course website and the required online homework quizzes increased their dedication and participation in the class. The availability of online textbook resources did not significantly change the effectiveness of the time students actually dedicated to studying, however.
Students' learning is unlikely to be fully captured by one simple assessment tool such as multiple choice questions (Becker and Johnston, 1999; Krieg and Uyar,
Tse, M. M. Y., Pun, S. P. Y. and Chan, M. F. (2007) Pedagogy for teaching and learning cooperatively on the web: A web-based pharmacology course, CyberPsychology & Behavior, Vol.10, pp. 32–37.
Walstad, W. B. and Becker, W. E. (1994) Achievement differences on multiple-choice and essay tests in economics, American Economic Review, Vol.84, pp. 193–196.
Acknowledgments
I thank Mary Beaudry, Alease Bruce, Steven Tello and the participants in the
University of Massachusetts Lowell workshop on 'Reflective Teaching with
Technology.' They all offered many useful insights and comments.
Contact details
Monica Galizzi
University of Massachusetts Lowell
Falmouth 302 F
One University Avenue
Lowell, MA 01854
Tel: (978)934-2790
Fax: (978)934-3071
Email: Monica_Galizzi@uml.edu
Goffe, W. L. and Sosin, K. (2005) Teaching with technology: May you live in interesting times, Journal of Economic Education, Vol.36, pp. 278–291.
Hadsell, L. (2009) The effect of quiz timing on exam performance, Journal of Education for Business, Vol.84, pp. 135–141.
Harter, C. L. and Harter, J. F. R. (2004) Teaching with technology: Does access to computer technology increase student achievement? Eastern Economic Journal, Vol.30, pp. 507–514.
Johnson, D. L., Joyce, P. and Sen, S. (2002) An analysis of student effort and performance in the finance principles course, Journal of Applied Finance, Vol.12, pp. 67–72.
Katz, A. and Becker, W. E. (1999) Technology and the teaching of economics to undergraduates, Journal of Economic Education, Vol.30, pp. 194–199.
Kibble, J. (2007) Use of unsupervised online quizzes as formative assessment in a medical physiology course: Effects of incentives on student participation and performance, Advances in Physiology Education, Vol.31, pp. 253–260.
Kinzie, S. (2006) Swelling textbook costs have college students saying 'Pass', Washington Post, 23 January.
Kirby, A. and McElroy, B. (2003) The effect of attendance on grade for first year economics students in University College Cork, Economic and Social Review, Vol.34, pp. 311–326.
Krieg, R. G. and Uyar, B. (2001) Student performance in business and economics statistics: Does exam structure matter? Journal of Economics and Finance, Vol.25, pp. 229–241.
Lass, D., Morzuch, B. and Rogers, R. (2007) Teaching with technology to engage students and enhance learning, University of Massachusetts Amherst, Department of Resource Economics, Working Paper 2007-1.
McClure, J. and Spector, L. (2003) Behavior and performance in the economics classroom, Educational Research Quarterly, Vol.27, pp. 15–23.
Mukherjee, A. and Cox, J. (2001) Using electronic quizzes to promote self-reliance in minicase analysis in a decision support systems course for MIS majors, Journal of Education for Business, Vol.76, pp. 221–225.
Palocsay, S. W. and Stevens, S. P. (2008) A study of the effectiveness of web-based homework in teaching undergraduate business statistics, Decision Sciences Journal of Innovative Education, Vol.6, pp. 213–232.
Passow, H., Mayhew, M., Finelli, C., Harding, T. and Carpenter, D. (2006) Factors influencing engineering students' decisions to cheat by type of assessment, Research in Higher Education, Vol.47, pp. 643–684.
Peters, M. H., Kethley, R. B. and Bullington, K. (2002) The relationship between homework and performance in an introductory operations management course, Journal of Education for Business, Vol.77, pp. 340–344.
Schaur, G., Watts, M. and Becker, W. (2008) Assessment practices and trends in undergraduate economics courses, American Economic Review, Vol.98, pp. 552–556.
Siegfried, J. J. (1996) Teaching tools: How is introductory economics taught in America? Economic Inquiry, Vol.34, pp. 182–192.
Smolira, J. C. (2008) Student perceptions of online homework in introductory finance courses, Journal of Education for Business, Vol.84, pp. 90–94.
Stinebrickner, R. and Stinebrickner, T. (2003) Working during school and academic performance, Journal of Labor Economics, Vol.21, pp. 473–491.