SCIENCE EDUCATOR
THE NATIONAL SCIENCE EDUCATION LEADERSHIP ASSOCIATION JOURNAL
Fall 2009 • Volume 18, Number 2
Science Educator
EDITOR: Jack Rhoton
East Tennessee State University
Fall 2009 Vol. 18 No. 2
Officers and Members of the Executive Committee
National Science Education Leadership Association
Executive Director
Susan Sprague
2011 Holly Drive
Prescott, AZ 86305
Phone: (928) 771-1030
Fax: (480) 240-4188
susansprague@yahoo.com

President
Brenda Shumate Wojnowski
Wojnowski & Associates, Inc.
6318 Churchill Way
Dallas, Texas 75225
Phone: (214) 288-9779
bwojnowski@gmail.com

Past President
Linda Atkinson
3100 Monitor Ave, Suite 200
Norman, OK 73072
Phone: (405) 623-2365
Fax: (405) 325-7592
latkinson@ou.edu

Director Region D
LA, TX, OK, AR, MS, AL, NM
Jeffrey Patterson
Norman Public Schools
Science Coordinator
131 South Flood
Norman, OK 73069
Phone: (405) 366-5821
Fax: (405) 573-3505
eallan@ucok.edu

Nominations & Elections Committee Co-Chair
Pat Shane
CB# 3500, 309 Peabody Hall
UNC Center for Math/Science Education
Chapel Hill, NC 27599-3500
Phone: (919) 966-3092
Fax: (919) 962-0588
pshane@unc.edu

Director Region E
IL, IN, MI, MN, WI, KS, OH, CO, IA, MO, NE, ND, SD, MT, WY
Susan Koba
1219 North 54 St.
Omaha, Nebraska 68132
Phone: (402) 561-0176
skoba@cox.net
Nominations & Elections Committee Co-Chair
Jerry Valadez
Central Valley Science Project
1231 S. Waverly Land
Fresno, CA 93727
Phone: (559) 288-4953
jdvscience@aol.com

President-Elect
Janey Kaufmann
K-12 Curriculum Coordinator
16467 North 109th Way
Scottsdale, AZ 85255
Phone: (480) 484-8026
Fax: (480) 484-6887
jkaufmamm@susd.org

Director Region F
CA, WA, OR, ID, AK, HI, NV, AZ, UT, US Territories & International Chapters
Xan Simonson
Mesa Public Schools
Biotech Coordinator
702 W Emelita Ave.
Phone: (480) 472-5783
Fax: (480) 472-5990
rhinoxan@cox.net

Secretary
Trisha Herminghaus
5530 East Northern Lights
Anchorage, AK 99504
Phone: (907) 345-5512
Fax: (907) 742-4581
herminghaus_trisha@asdk12.org

Membership Committee Chair
Beth Snoke Harris
98 West Lake Avenue
Hendersonville, NC 28739
Phone: (828) 692-9875
Fax: (801) 659-3351
beth@seven-oaks.net

Treasurer
Karen Charles
RTI International
3040 Cornwallis Road
Research Triangle, NC 27709
Phone: (704) 876-1575
kcharles@rti.org

Awards Committee
Thomasena Woods
NSF Directorate, Education and Human Resources
4201 Wilson Boulevard, Suite 875
Phone: (703) 292-7850
Fax: (703) 292-9047
twoods@nsf.gov
Director Region A
CT, MA, NH, RI, VT, ME
Joyce Tugel
Maine Mathematics & Science Alliance
Southern NH Office
6 Birch Lane
Barrington, NH 03825
Phone: (207) 899-8661
jtugel@mmsa.org

Director Region B
NY, NJ, PA
Eric Walters
Director of Science and Technology
The Marymount School of New York
1026 Fifth Avenue
New York, NY 10028
Phone: (212) 744-4486, Ex. 142
eric_walters@marymount.k12.ny.us

Director Region C
DE, DC, MD, VA, NC, SC, WV, FL, GA, TN, KY
Dianne Affleck
UNC Chapel Hill NC-MSEN Pre-College Program
309 Peabody Hall, CB# 3500
Chapel Hill, NC 27599
Phone: (919) 962-9362
Fax: (919) 962-0588
affleck@email.unc.edu

Informal Science Committee
Cheryl Lani Juarez
Director of Professional Development
Center for Interactive Learning
Miami Science Museum
1320 South Dixie Highway, Suite 720
Coral Gables, FL 33146
Phone: (305) 284-2757
cheryl@miamisci.org

Multicultural Committee Chair
Ana Lopez
3132 E. Fairmont, Building 5
Fresno, CA 93726
Phone: (559) 248-7181
Fax: (559) 227-5404
aglopez@comcast.net

Webmaster
Beth Snoke Harris
98 West Lake Avenue
Hendersonville, NC 28739
Phone: (828) 692-9875
Fax: (801) 659-3351
beth@seven-oaks.net

Safety Compliance Officer, Parliamentarian, ICASE Representative
Kenneth R. Roy
Glastonbury Public Schools
330 Hubbard Street
Glastonbury, CT 06033
Phone: (860) 652-7200, Ex. 2002
Fax: (860) 652-7275
royk@glastonburyus.org
NSTA Representative
Christine Royce
Associate Professor of Education
P.O. Box 305
Newberg, PA 17240
Phone: (717) 477-1681
Fax: (717) 477-4046
gpollock@casscomm.com

Position Statements Committee
Maria Alicia Lopez Freeman
California Science Project
3806 Geology Building, UCLA
Los Angeles, CA 90095-1567
Phone: (310) 794-4861
mafreema@ucla.edu

Professional Development Chair
Mary Louise Bellamy
The Science House, NCSU
K-12 Outreach Coordinator
104 Grey Horse Drive
Cary, NC 27513
Phone: (919) 513-3419
Fax: (919) 515-7545
mlbellam@unity.ncsu.edu

Publications Committee Co-Chair, The Navigator Editor
Linda Atkinson
3100 Monitor Ave, Suite 200
Norman, OK 73072
Phone: (405) 623-2365
Fax: (405) 325-7592
latkinson@ou.edu

Publications Committee Co-Chair, Science Educator/NSELA Books
Jack Rhoton
East Tennessee State University
Division of Science Education
Box 70684
Johnson City, TN 37614
Phone: (423) 439-7589
rhotonj@etsu.edu

Public Relations Committee
Janis Slater
Science Programs Coordinator
K20 Center
University of Oklahoma
3100 Monitor Ave., Suite 200
Norman, OK 73072
Phone: (405) 325-2683
Fax: (405) 325-7592
jslater@ou.edu

NSELA Office Manager
Judy Hamilton
NSELA
2260 Thumb Butte Road
Prescott, AZ 86305
Phone: (928) 420-3774
judyhamilton51@gmail.com
Table of Contents

Program for International Student Assessment (PISA)
2006 and Scientific Literacy: A Perspective
For Science Education Leaders* ....................................1
Rodger W. Bybee
Opening the Classroom Door: Professional
Learning Communities in the Math and
Science Partnership Program .......................................14
James E. Hamos, Kathleen B. Bergin, Daniel P. Maki,
Lance C. Perez, Joan T. Prival, Daphne Y. Rainey,
Ginger H. Rowell, and Elizabeth VanderPutten
3D Multi-User Virtual Environments: Promising
Directions for Science Education ..................................25
Yufeng Qian
Analysis of Student Responses to Peer-Instruction
Conceptual Questions Answered Using an
Electronic Response System: Trends by
Gender and Ethnicity ....................................................30
David Steer, David McConnell, Kyle Gray,
Karen Kortz, Xin Liang
No Child Left Behind and High School Astronomy ........39
Larry Krumenaker
Improving Science Achievement Through
Changes in Education Policy ........................................49
Tara M. Owens
Impact of Inquiry-Based Professional Development
on Core Conceptions and Teaching Practices:
A Case Study...................................................................56
Mahsa Kazempour
Editorial Board

Hans Andersen, Indiana University
Herb Brunkhorst, California State University
Dawn Del Carlo, University of Northern Iowa
Gary Hedges, Montgomery County Public Schools, MD
Ella Ingram, Rose-Hulman Institute of Technology
Paul Keller, University of Illinois
Christine Lotter, University of South Carolina
Gerry Madrazo, Hawaii Department of Education
James MaKinster, Hobart and William Smith Colleges
James E. McLean, The University of Alabama
Ken Miller, Montana State University
LaMoine Motz, Waterford, Michigan
Brenda Wojnowski, National Inventors Hall of Fame
Robert Yager, University of Iowa
Ellen Yezierski, Grand Valley State University
Ningfeng Zhao, Department of Chemistry, East Tennessee State University
Manuscripts: Submit manuscripts to the Editor, Jack Rhoton, East Tennessee State University, Center of Excellence in Mathematics and Science Education, Box 70301, Johnson City, TN 37614-1701. Refer to the information for authors elsewhere in this journal. The opinions and statements published are the responsibility of the authors, and such opinions and statements do not necessarily represent the policies of NSELA. Annual membership dues for NSELA are $75.00; an individual journal-only subscription is $30. Non-members will be assessed a charge of $10.00 per issue. For subscription orders and customer service, call (423) 439-7589.

The Science Educator (ISSN 1094-3277), copyrighted 2009 by the National Science Education Leadership Association (NSELA), is published biannually with provisions for quarterly publication in the future. Printed at the University Press, East Tennessee State University, Johnson City, TN 37614, the journal serves as a forum for presentation and discussion of issues pertaining to science education leadership. It is indexed in the ERIC Clearinghouse for Science, Mathematics, and Environmental Education. General inquiries may be directed to: Executive Director Susan Sprague <susansprague@yahoo.com>, Membership Co-Chair Beth Snoke Harris <beth@seven-oaks.net>, or Jack Rhoton, Editor, East Tennessee State University, Box 70684, Johnson City, TN 37614 (Ph: (423) 439-7589; e-mail: rhotonj@etsu.edu).
Rodger W. Bybee
Program for International Student
Assessment (PISA) 2006 and
Scientific Literacy: A Perspective
For Science Education Leaders*
This article describes the idea of scientific literacy as defined in PISA, discusses relevant results of PISA, and clarifies meaningful relationships between PISA data and scientific competencies of U.S. students. Finally, the author includes insights and recommendations for contemporary leadership in science education.

* Portions of this article were presented in March 2008 at an Education Reform International Symposium organized by the National Institute for Educational Policy (NIER), Tokyo, Japan, and in June 2009 at a National Center for Education Statistics PISA Research Conference, Washington, DC, USA.
PISA’s 2006 measurement of
scientific literacy has connections to
several themes that President Obama
has included in his discussions of
scientific issues and visions for
science education. The President
consistently includes themes of
economic development, energy
efficiency, environmental quality,
health maintenance, and the importance
of scientific knowledge in national
policy. In science education, the
President has indicated that, over
the next decade, achievement of
American students must move from
the middle to the top on international
assessments, including PISA. The
description of scientific literacy in
PISA 2006, the dismal results of U.S.
students, and themes described by the
current administration have clear, but
challenging, implications for science
education in the United States.
What Is Meant by Scientific
Literacy?
Scientific literacy has become the
term used to express the broad and
encompassing purpose of science
education. The use of the term in the
U.S. most likely began with James
Bryant Conant in the 1940s (Holton,
1998) and was elaborated for educators
in a 1958 article by Paul DeHart
Hurd entitled “Science Literacy: Its
Meaning for American Schools.”
Hurd described the purpose of
scientific literacy as an understanding
of science and its applications to
social experience. Science had such
a prominent role in society, Hurd
argued, that economic, political,
and personal decisions could not be
made without some consideration of
the science and technology involved
(Hurd, 1958). Hurd made a clear
connection between science and
citizenship; yet, even today most
school science programs emphasize
content and methods that represent
preparation for a professional career in
science. In contrast, scientific literacy
as it should be manifest in educational
policies, programs, and practices has
the explicit goal of preparing students
for life and work—as citizens.
The historical perspective of
scientific literacy, however, is not
the reality of contemporary science
education. Academic researchers
debate the real meaning of the
term, classroom teachers claim
their students are attaining scientific
literacy, and national and international
assessments provide evidence that,
somewhere between the abstract
purpose and concrete practice, the
science education community has
failed to achieve this goal.
Several authors have clarified the
curricular orientation and instructional
emphasis of scientific literacy as a
purpose of science education. George
DeBoer (2000) has provided an
excellent historical and contemporary
review of scientific literacy. Robin
Millar (2006) addressed historic and
definitional issues of the term before
outlining the role of scientific literacy
in a contemporary curriculum—
Twenty First Century Science.
Two other essays stand out when
discussions turn to contemporary
science education and the challenges
of attaining higher levels of scientific
literacy in the U.S. In “Science
Education for the Twenty First
Century,” Jonathan Osborne (2007)
makes a clear case that although
scientific literacy is stated as a goal,
contemporary science education is
primarily “foundationalist” in that
it emphasizes educating for future
scientists more than educating future
citizens. Douglas Roberts published
a chapter on scientific literacy in the
Handbook of Research on Science
Education (Abell & Lederman,
2007). Roberts describes a long
history of political and intellectual
tension between scientific literacy
and foundational science. The two
politically conflicting emphases can be
stated in a question: Should curriculum
emphasize science subject matter
itself, or should it emphasize science
in life situations in which science plays
a key role? Curriculum designed to
answer the former, Roberts refers to as
Vision I, and the latter he refers to as
Vision II. Vision I looks within science,
while Vision II uses external contexts
that students are likely to encounter
as citizens. These two visions of
science are evident in contemporary
discussions of core content for national
standards for science.
How Is Scientific Literacy
Defined in PISA 2006?
In PISA 2006, scientific literacy
referred to four interrelated features
that involve:
• Scientific knowledge and use
of that knowledge to identify
questions, to acquire new
knowledge, to explain scientific
phenomena, and to draw
evidence-based conclusions
about science-related issues;
• Understanding of the
characteristic features of
science as a form of human
knowledge and inquiry;
• Awareness of how science and
technology shape our material,
intellectual, and cultural
environments; and
• Willingness to engage in
science-related issues, and
with the ideas of science, as a
constructive, concerned, and
reflective citizen (Organization
for Economic Co-operation and
Development [OECD], 2006).
In PISA 2006, scientific literacy was
perceived as a continuum from less
developed to more developed scientific
competencies that include levels of
proficiency. So, for example, the
student with less developed scientific
literacy might be able to recall simple
scientific factual knowledge about a
physical system and to use common
science terms in stating a conclusion.
Students with a more developed
scientific literacy demonstrate the
ability to use conceptual models
to explain natural phenomena, to
formulate explanations, to evaluate
alternative explanations of the same
phenomena, and to communicate
explanations with precision.
How Was Scientific Literacy
Assessed in PISA 2006?
PISA 2006 situated its definition
of scientific literacy and its science
assessment questions within a
framework that used the following
categories: scientific contexts (i.e.,
life situations involving science
and technology), the scientific
competencies (i.e., identifying
scientific issues, explaining phenomena
scientifically, and using scientific
evidence), the domains of scientific
knowledge (i.e., understanding of
scientific concepts and the nature of
science), and attitudes toward science
(i.e., interest in science, support for
scientific inquiry, and responsibility
toward resources and environments). These four aspects of the PISA 2006 conception of scientific literacy are illustrated in Table 1.
The scientific contexts align with various issues citizens confront. PISA 2006 Science items were framed within a wide variety of life situations involving science and technology, primarily: "health," "natural resources," "environmental quality," "hazards," and "frontiers of science and technology."
The PISA 2006 science competencies required students to identify scientific issues, explain phenomena scientifically, and use scientific evidence. These three key scientific competencies were selected because of their relationship to the practice of science and their connection to key abilities such as inductive/deductive reasoning, systems-based thinking, critical decision making, transformation of data into tables, construction of arguments and explanations based on data, thinking in terms of models, and use of mathematics. Table 2 describes the features of the three competencies.
The scientific competencies can be illustrated with a contemporary example. Global climate change has become one of the most talked about and threatening global issues. As people read or hear about climate change, they
must separate the scientific reasons for
a response from economic, political,
and social issues. Scientists explain,
for example, the origins and material
consequences of releasing carbon
dioxide into the Earth’s atmosphere.
This scientific perspective has been
countered with an economic argument
for continued use of carbon-based fuels
and against reduction of greenhouse
gases. Citizens should recognize
the difference between scientific
and economic positions.
Table 1: Framework for PISA 2006 Science Assessment

Context: Life situations that involve science and technology. These require you to demonstrate the competencies.
Competencies: Identify scientific issues; explain phenomena scientifically; use scientific evidence. How you do so is influenced by your knowledge and attitudes.
Knowledge: What you know, both about the natural world (knowledge of science) and about science itself (knowledge about science).
Attitudes: How you respond to science issues (interest, support for scientific enquiry, responsibility).
Table 2: PISA 2006 Scientific Competencies
Identifying scientific issues
• Recognizing issues that are possible to investigate scientifically
• Identifying keywords to search for scientific information
• Recognizing the key features of a scientific investigation
Explaining phenomena scientifically
• Applying knowledge of science in a given situation
• Describing or interpreting phenomena scientifically and predicting changes
• Identifying appropriate descriptions, explanations, and predictions
Using scientific evidence
• Interpreting scientific evidence and making and communicating conclusions
• Identifying the assumptions, evidence, and reasoning behind conclusions
• Reflecting on the societal implications of science and technological developments
Further, as people are presented with more, and sometimes conflicting, information
about phenomena, such as climate
change, they need to be able to access
collective scientific knowledge and
understand, for example, the scientific
basis for evaluations by bodies such
as the Intergovernmental Panel on
Climate Change (IPCC) versus the
basis for perspectives by individuals
representing oil, gas, or coal companies.
Finally, citizens should be able to use
the results of scientific reports and
recommendations about issues such as
health, prescription drugs, and safety
to formulate arguments supporting
their decisions about scientific
issues of personal, social, and global
consequence.
In PISA 2006 Science, scientific
literacy also encompassed both
knowledge of science and knowledge
about science itself. The former
includes understanding fundamental
scientific concepts; the latter includes
understanding inquiry and the nature
of science. Because PISA describes
the extent to which students can apply
their knowledge in contexts relevant
to their lives, assessment material was
selected from the major domains of
physical, life, Earth, and technology
systems. Knowledge of science is
required by adults for understanding
the natural world and for making sense
of experiences in the personal, social,
and global contexts.
PISA 2006 Science used two
categories for knowledge about
science: “scientific inquiry,” which
centers on inquiry as the central
process of science, and “scientific
explanations,” which are the results
of scientific inquiry. Inquiry is the
means of science (how scientists get
evidence) and explanations are the
goals of science (how scientists use
evidence).
Finally, attitudes toward science
underlie an individual’s interest in,
attention to, and response to science and
technology. The inclusion of attitudes
and the specific areas of attitudes
selected for PISA 2006 Science is
supported by and builds upon reviews
of attitudinal research (OECD, 2006).
The PISA 2006 Science assessment
evaluated students’ attitudes in three
areas: interest in science, support for
scientific inquiry, and responsibility
towards resources and environment.
How Did U.S. Students Do
On PISA 2006 Science?
The PISA 2006 Science survey
provided an opportunity to compare
U.S. students’ scientific literacy with
that of students in other countries,
both our economic competitors
in the Organization for Economic
Cooperation and Development
(OECD) and twenty-seven other
countries. This discussion begins
with a summary of the U.S. position
among OECD countries and other
non-OECD countries that participated
in PISA 2006 Science. The discussion
continues with a review of student
performance on the proficiency levels
used to clarify degrees of scientific
literacy.
The United States was one of
57 countries participating in PISA
2006 Science. This number includes
30 OECD countries and 27 partner
countries. Students in the U.S. scored
an average of 489 points, which is 11
points below the OECD average of
500 points. U.S. students ranked 17th
among other industrialized (OECD)
countries.
The U.S. trend in science should be
a great concern to those in leadership
positions. The U.S. dropped from
14th in science on PISA 2000 to 19th
in 2003 and to 21st in 2006. These
results provide a reason to address
the general scientific literacy in the
U.S.
Table 3: PISA 2006 Science Survey: OECD Jurisdictions
PISA Results
Science
OECD average score ....................500
OECD Jurisdictions
Average is measurably higher than
the U.S. average
Finland .......................................... 563
Canada ......................................... 534
Japan ............................................ 531
New Zealand ................................ 530
Australia........................................ 527
Netherlands .................................. 525
South Korea.................................. 522
Germany ....................................... 516
United Kingdom ........................... 515
Czech Republic ............................ 513
Switzerland ................................... 512
Austria .......................................... 511
Belgium ........................................ 510
Ireland ........................................... 508
Hungary ........................................ 504
Sweden ........................................ 503
Average is not measurably higher or
lower than U.S.
Poland .......................................... 498
Denmark ....................................... 496
France .......................................... 495
Iceland .......................................... 491
UNITED STATES ........................... 489
Slovak Republic............................ 488
Spain ............................................ 488
Norway ......................................... 487
Luxembourg ................................. 486
Average is measurably lower than
the U.S. average
Italy ............................................... 475
Portugal ........................................ 474
Greece .......................................... 473
Turkey ........................................... 424
Mexico .......................................... 410
In practical terms, U.S. scientific literacy translates to the supply of scientists and engineers, skilled workers, and technological innovation, as well as to economic growth in general.
Overall U.S. performance. To say
the least, U.S. results on PISA 2006
Science were disappointing. U.S. 15-year-olds lag behind the majority of
developed nations that participated in
the survey. Out of 30 OECD countries
participating, 16 countries’ average
score was significantly higher than
the U.S. average (See Table 3). The
average score for Finland, the highest
achieving country, was 74 points above
the U.S. Other high achieving countries
included Canada, Japan, New Zealand,
and Australia. Among non-OECD
countries, six countries’ average
scores were significantly higher than
the U.S. Those countries were: Hong
Kong-China, Chinese Taipei, Estonia,
Liechtenstein, Slovenia, and Macao-China (see Table 4).
Science literacy scores for racial/
ethnic groups. On the combined science
literacy scale, Black students and
Hispanic students scored significantly
lower, 409 and 439 respectively, than
the OECD average (500) and lower
than White (523), Asian (499), and
students of more than one race (501).
This pattern of performance for racial
and ethnic groups was similar to that
reported by PISA in 2000 and 2003
(Baldi, Jin, Skemer, Green & Herget,
2007; Lempke et al., 2001; Lempke
et al., 2004).
Proficiency levels for scientific
literacy. In PISA 2006 Science,
performance levels were defined for
the purpose of describing in greater
detail the scientific competencies
and overall scientific literacy. Student
scores in science were grouped into six
proficiency levels. Level 6 represents
the most difficult tasks, and Level 1
represents the least difficult tasks. The
grouping into proficiency levels was
undertaken on the basis of combining
scientific knowledge and abilities
underlying scientific competencies.
Proficiency at each of the six levels
can be understood in relation to
descriptions of the kind of scientific
competencies that students need to
attain the respective levels. Table 5
summarizes the levels and represents
a synthesis of individual competencies
for overall science literacy. The
percentage of OECD students and
the percentage of U.S. students at the
respective levels also are displayed
in Table 5.
U.S. students at higher levels of
proficiency. At Level 6, for example,
students can consistently identify,
explain, and apply both knowledge
of science and knowledge about
science in a variety of complex
situations involving science. For
OECD countries, 1.3% of students
perform at Level 6 on the science
literacy scale. In the U.S., 1.5% reach
Level 6. If we consider both Level 5
and 6, the U.S. is the same as the OECD
average, 9.0%. This is the good news.
Table 4: PISA 2006 Science Survey: Non-OECD Jurisdictions
PISA Results
Science
OECD average score ....................500
Non-OECD Jurisdictions
Average is measurably higher than
the U.S. average
Hong Kong ................................... 542
Chinese Taipei .............................. 532
Estonia.......................................... 531
Liechtenstein ................................ 522
Slovenia ........................................ 519
Macao........................................... 511
Average is not measurably higher or
lower than U.S.
Croatia .......................................... 493
Latvia ............................................ 490
*United States .............................. 489
Lithuania ....................................... 488
Russia ........................................... 479
Average is measurably lower than
the U.S. average
Israel ............................................. 454
Chile ............................................. 438
Serbia ........................................... 436
Bulgaria ........................................ 434
Uruguay ........................................ 428
Jordan .......................................... 422
Thailand ........................................ 421
Romania ....................................... 418
Montenegro .................................. 412
Indonesia ...................................... 393
Argentina ...................................... 391
Brazil............................................. 390
Colombia ...................................... 388
Tunisia .......................................... 386
Azerbaijan ..................................... 382
Qatar............................................. 349
Kyrgyz Republic ........................... 322
* The United States is an OECD country. It has been included in this table for
comparison purposes.
However, other countries have much higher percentages at Levels 5 and 6;
for example, Finland (20.9%), New
Zealand (17.6%), and Japan (15.1%).
These countries have a very high
potential for creating scientists and
engineers and promoting scientific
literacy among all citizens.
U.S. students at lower levels of
proficiency. In PISA 2006 Science,
Level 2 was designated as the
baseline for the competencies. This
is the level at which students begin to
demonstrate science competencies that
will allow them to participate actively
as citizens. Students at Level 2 can
identify key features of a scientific
investigation, recall concepts, and use
results of an investigation represented
in a data table to support a personal
decision. Across the OECD, 19.2%
of students are categorized as below
the baseline, Level 2. For the U.S.
this average is 24.5%. Below Level
2, students may confuse key features
of an investigation, apply incorrect
scientific information, and confound
scientific evidence with personal
opinions and beliefs. These results
indicate that about one quarter (24.5%)
of U.S. students do not demonstrate
the competencies that will allow them
to productively engage in science and
technology related life situations.
Science proficiency levels for racial
and ethnic groups. Black, Hispanic,
and American Indian/Native Alaskan
students scored below the OECD
average. Scores for White students
were above the OECD average. On
average, the mean scores for White,
Asian, and students of more than one
race were in Proficiency Level 3; the
mean scores of Hispanic, American
Indian/Native Alaskan, and Native
Hawaiian/Other Pacific Islander
students were in Proficiency Level 2;
and the average mean score for Black
students was at the top of Proficiency
Level 1 (Baldi et al., 2007).
U.S. students and science
competencies. Among the unique
insights gained from PISA 2006 Science
is information on student performance
on three scientific competencies:
identifying scientific issues, explaining
phenomena scientifically, and using
scientific evidence. Examining the
scientific competencies individually
suggests possible areas to emphasize
in school science programs. One way
to think of the science competencies is
in terms of a sequence that individuals
might go through as they encounter and
solve science-related problems. First,
they must identify the scientific aspects
of a problem, then apply appropriate
scientific knowledge to that problem,
and, finally, they have to interpret
and make sense of their findings and
use them to support a decision or
recommendation. Traditional science
courses in the U.S. tend to concentrate
on the middle segment—explaining
phenomena scientifically—and give much less emphasis to identifying scientific issues and using scientific evidence.

Table 5: Summary Descriptions for the Six Levels of Proficiency on the Combined Science Scale

Table 6: Identifying Scientific Issues: Summary Descriptions of the Six Proficiency Levels

Table 7: Explaining Phenomena Scientifically: Summary Description for the Six Proficiency Levels
On identifying scientific issues,
U.S. students ranked 15th among
OECD countries, which was not
statistically significantly different
from the OECD average. U.S. students
were statistically significantly below
the OECD average on explaining
phenomena scientifically and using
scientific evidence. There were gender
differences in that girls performed
better on identifying scientific issues
and using scientific evidence and
boys performed better on explaining
phenomena scientifically. This finding
was consistent with performance by
other OECD countries.
The percentage of U.S. students performing at the highest levels, 5 and 6, was about equal to that of all OECD students: 9.7% for OECD students and 9.3% for U.S. students. Below
the baseline, the U.S. had 21.6% on
identifying scientific issues. (See
Table 6.)
On explaining phenomena
scientifically, the U.S. had slightly
more students in the upper two levels
of proficiency, 11.8% (U.S.) and 11.6%
(OECD). However, the U.S. had 26.3%
of students below the baseline. (See
Table 7).
Finally, for using scientific evidence, 13.7% of U.S. students achieved at the top levels of this proficiency. However, this percentage was lower than that of OECD students (14.2%).
The disappointing result was at the
lower level. Twenty-six percent of U.S.
students were below the baseline. This
is compared to 21.9% of all OECD
students (see Table 8).
Table 8: Using Scientific Evidence: Summary Descriptions for the Six Levels of Proficiency
Selected Implications of
PISA for Science Education
Leaders
This concluding section presents
implications of PISA for science
education in general and several
themes emphasized by President
Obama in his clearest discussion of
science and science education, the 27
April 2009 remarks at the National
Academy of Sciences.
Fostering scientific literacy. In
the United States, most school
science programs do not emphasize
scientific literacy as described in
PISA 2006. Consistently, the term
scientific literacy is stated as the
purpose of science education, but
school programs primarily emphasize
facts, information, and knowledge
of the science disciplines and only
secondarily emphasize the applications
of science related to citizens’ life
situations. The distinction may seem
subtle, but it is basic and essential to
understand the difference as it relates
directly to curriculum, instruction,
and assessments at local, state, and
national levels.
A critical challenge for science
education leaders centers on the
difference between the two perspectives
of science curriculum and teaching
described earlier. One perspective
is foundationalist and internal to
science itself. This is the perspective
currently emphasized in most state
standards, assessments, and school
science programs. In this perspective,
educational policies, programs, or
practices center on questions such
as: What knowledge of science
and its processes should students
have? What facts and concepts from
physics, chemistry, biology, and the
Earth sciences should be the basis for
school science programs? In contrast,
there is the external perspective that
begins with science-related situations
that citizens might encounter. When
thinking about educational policies,
programs, and practices from this
perspective, questions center on:
What science should students know
and be able to do as future citizens?
What contexts could be the basis for
introducing science and technology?
The difference between these two
perspectives is significant, because
the emphasis of curricula, selection
of instructional strategies, design
of assessments, and professional
education of teachers differ depending
on the perspective.
Based on this discussion of PISA
2006, I point out what is perhaps the
single most significant challenge
facing leaders who wish to foster
scientific literacy in the U.S. Most
science educators hold the internalist
perspective that school science
programs should first, foremost,
and exclusively emphasize the basic
knowledge and processes of science
and secondarily and incidentally
make some links to social issues such
as health, environment, resources,
and energy efficiency. If time and
opportunity permit—which usually
they do not—the science-related social
issues might be taught.
If the United States wants to foster
higher levels of scientific literacy, then
it is essential to begin recognizing
the perspective that includes science-related social issues and accept the
importance of incorporating scientific
literacy into standards, assessments,
and school programs for science.
In order to realize the President’s
vision, science and scientific literacy
must be added to initiatives being
undertaken by the National Governors
Association, the Council of Chief State
School Officers, and Achieve (2008),
as well as those initiated by the National
Research Council, National Science
Teachers Association, and others so
that core standards and assessments
can be revised and improved upon. By
clearly identifying the ability to apply
science to social issues as a goal of
basic science education, we can move
towards the goal of achieving higher
levels of science understanding for all
citizens of our nation.
Socioeconomics and science
achievement. One major insight from
PISA 2006 Science is the fact that
poverty had a greater effect on science
scores in the U.S. than in other nations.
Socioeconomic background accounted for 18% of the variation in U.S. student
achievement. This finding should
cause alarm about the importance of
scientific literacy as it relates to social
inequities and the connections between
social inequities and racial and ethnic
groups. It is clear that students of lower
socioeconomic status do not have the
same opportunities to learn science
as students in higher socioeconomic
groups. The U.S. system of education
does not provide underprivileged
students with demanding science
curricula, high quality science teachers,
and other resources, such as well-equipped, modern science laboratories.
To place this in contemporary terms,
in spite of the No Child Left Behind
legislation, we are leaving some
children behind, and they tend to be
the less privileged.
The difficulty with the
socioeconomic problem is that it is a
huge, complex social issue. Schools
can only contribute to changes in
the larger social problem, but policy
makers and educators can respond
to inequities within the educational
system. I refer to those mentioned above:
curriculum, instruction, teachers,
and the allocation of resources. The
current administration has proposed
to allocate $5 billion for states that
make a commitment to improve math
and science achievement. States are
competing for these funds under an
initiative titled “Race to the Top.” The
double entendre of this title should
not be lost in discussions that center
on competitions with other countries
and educational standards, curriculum,
and teacher education. Reducing the
discrepancies of achievement among
racial and ethnic groups must be a part
of contemporary reform in the U.S.
A new generation of science
curricula. Assuming that the themes
related to scientific literacy, such
as “science in personal and social
perspectives” (National Research
Council, 1996), are included in
standards, then it is clear there is a
need for instructional materials aligned
to the standards. This new generation
of curriculum materials for grades
K-12 could include strategies that
help students develop 21st century
workforce skills and abilities in
modernized laboratories for science
and technology.
Let me be very clear about this
implication. I propose that this new
generation of curriculum materials be
designed, developed, and implemented
as a complement to current programs.
Rather than a complete reform of
the current fundamental curriculum,
the proposed new generation would
account for 4-6 weeks of activities
in the school year. These activities
would give students opportunities to
apply their scientific knowledge to
local, national, and global problems
of energy, environment, resources, and
health while developing 21st century
skills and abilities.
In conclusion, I have used my
perspective of PISA 2006 Science
to provide both an orientation and
rationale for reinvigorating American
science education. The President has
indicated that a decade is a reasonable
amount of time within which to realize
this reform. The results from PISA
2015, when science will again be
emphasized, represent a reasonable
benchmark for U.S. progress in our race
from the middle to the top. The stage
has been set. We have clear indications
of our standing as a nation, and we have
clearly defined measures of success.
We have identified problems with the
way science is currently being taught,
and solutions to resolve those problems
have been proposed, as has funding to
implement those solutions. Now we
must take action.
References
Abell, S., & Lederman, N. (2007).
Handbook of Research on Science
Education. Lawrence Erlbaum
Associates.
Baldi, S., Jin, Y., Skemer, M., Green, P.J., &
Herget, D. (2007). Highlights from PISA
2006: Performance of U.S. 15-year-old
students in science and mathematics
literacy in an international context
(NCES 2008-016). National Center
for Education Statistics, Institute of
Education Sciences, U.S. Department
of Education. Washington, DC.
Bybee, R.W., & McCrae, B.J. (2009). PISA Science 2006: Implications for Science Teachers and Teaching. National Science Teachers Association, Arlington, Virginia.
DeBoer, G. (2000). Scientific literacy:
Another look at historical and
contemporary meanings and its
relationship to science education
reform. Journal of Research in
Science Teaching, 37(6): 582-601.
Holton, G. (1998). 1948: The new
imperative for science literacy.
Journal of College Science Teaching,
8, 181-185.
Hurd, P.D. (1958). Science literacy:
Its meaning for American schools.
Educational Leadership, 16: 13-16.
Lempke, M., Calsyn, C., Lippman, L.,
Jocelyn, L., Kastberg, D., Liu, Y.Y.,
Roey, S., Williams, T., Kruger, T.,
and Bairu, G. (2001). Outcomes
of learning: Results from the 2000
program for international student
assessment of 15-year-olds in reading,
mathematics, and science literacy
(NCES 2002-115). National Center for
Education Statistics, U.S. Department
of Education. Washington, DC.
Lempke, M., Sen, A., Pahlke, E., Partelow,
L., Miller, D., Williams, T., Kastberg,
D., and Jocelyn, L. (2004). International
outcomes of learning in mathematics
literacy and problem solving: PISA
2003 results from the U.S. Perspective
(NCES 2005-003). National Center
for Education Statistics, Institute of
Education Sciences, U.S. Department
of Education. Washington, DC.
Millar, R. (2006). Twenty First Century
Science: Insights from the design
and implementation of a scientific
literacy approach in school science.
International Journal of Science
Education, 28(13): 1499-1521.
National Governors Association (NGA),
the Council of Chief State School
Officers (CCSSO), & Achieve, Inc.
(2008). Benchmarking for success:
Ensuring U.S. students receive a
world-class education. A report by
the National Governors Association,
the Council of Chief State School
Officers, and Achieve, Inc.
National Research Council (NRC). (1996).
National science education standards.
Washington, DC: National Academy
Press.
Obama, Barack. (2009, April 27). Remarks
as Prepared for Delivery. National
Academy of Sciences. Washington,
DC.
Organization for Economic Co-operation
and Development (OECD) (2006).
Assessing scientific, reading and
mathematical literacy. Paris: OECD.
Osborne, J. (2007). Science education
for the twenty first century. Eurasia
Journal of Mathematics, Science &
Technology Education, 3(3): 173-184.
Rodger Bybee is chair, Science Forum and
Science Expert Group, PISA 2006 Science.
He is the executive director (retired) of
the Biological Sciences Curriculum Study
(BSCS). Before this he was executive director
of the Center for Science, Mathematics
and Engineering Education at the National
Research Council. Author of numerous journal
articles, chapters, books, science curricula,
and textbooks, he also directed the writing
of the content standards for the National
Science Education Standards. Honors included
the National Science Teachers Association
Distinguished Service Award and Robert
Carleton Award. Correspondence concerning
this article can be sent to <RBybee@bscs.org>.
James E. Hamos, Kathleen B. Bergin, Daniel P. Maki, Lance C. Perez,
Joan T. Prival, Daphne Y. Rainey, Ginger H. Rowell, and Elizabeth VanderPutten
Opening the Classroom Door:
Professional Learning Communities
In the Math and Science
Partnership Program
This article highlights examples of professional learning communities (PLCs)
in the National Science Foundation (NSF) Math and Science Partnership
program.
Students come marching into the
classroom and take their seats … the
bell rings … the teacher closes the
door and thinks, “This is my time with
the kids. I have a lesson plan that I
prepared, and they’ll learn what I have
to offer.” The teacher never talks to
other teachers about what to teach or
how to teach, and the only time that
anyone visits the classroom is when
an administrator comes to evaluate the
teacher once a year.
Although such a reality typified
many classrooms in the 20th century,
in the 1990s and the first decade of this
21st century, a new exemplar of K-12
teacher professional development has
evolved—the professional learning
community (PLC). This paper looks at
how PLCs have become an operational
approach for professional development with potential to de-isolate the
teaching experience in the fields of
science, technology, engineering, and
mathematics (STEM). We offer a short
synopsis of the intellectual origins of
PLCs, provide multiple examples of
PLCs employed in projects funded
by the National Science Foundation
(NSF) through its Math and Science
Partnership (MSP) program, and consider benefits for varied aspects of the
teaching and learning environment.
Origins
Much has been written about PLCs.
Fuller histories are available elsewhere
(e.g., see Feger & Arruda, 2008), and
countless articles and synopses are
found online. The term ‘learning community’ began to enter the educational
vernacular broadly in the early 1990s,
following the publication of Peter
Senge’s book The Fifth Discipline
(1990). Senge’s philosophy called
for a radical restructuring of business
management strategies. The purpose
of this restructuring was to transform
corporations into learning organizations. Learning organizations were
characterized by a shared vision among
employees and management with team
learning through group discussion
of goals and problems. The concept
of an environment in which “people
are continually learning how to learn
together” (Senge, 1990, p. 3) caught
fire in the educational world.
Soon, the term was modified to
‘learning communities’ as educational
practitioners and researchers began
to create a collection of literature on
this topic (Hord, 1997a, 1997b; Senge
et al., 2000). In this period, Richard
DuFour and Rebecca DuFour, along
with an array of collaborators, became
broadly influential as they popularized
the term ‘professional learning communities’ and edited the seminal book
On Common Ground: The Power of
Professional Learning Communities
(DuFour, Eaker & DuFour, 2005).
Today, PLCs are used to describe a
variety of circumstances that include
bringing administrators and teachers
together into discussion groups, envisioning the classroom environment
as a community, and enhancing the
classroom experience by including
the broader community. Moreover,
STEM educators have not been absent
from the work with PLCs, and this is
captured well in a recent volume edited
by Mundry and Stiles (2009).
While educators in the United
States exhibit a growing enthusiasm
for participating in PLCs, it is interesting to recognize that educators
across the globe already identify collaboration with peers as a common
mode of practice (Wong, Britton &
Ganser, 2005). Perhaps the strongest
example of a learning community is
the cultural norm among Japanese
teachers to participate in lesson study
groups as described, for example, by
Stigler and Hiebert in The Teaching
Gap (1999). Acculturating new teachers into learning communities is also
well-developed in nations outside of
the United States. As Britton notes,
“Although all teachers in Shanghai
and Japan participate in learning
communities, beginning teachers receive particularly essential help from
participating in them at the outset of
their practice … What we observed
in Shanghai and Japan contrasts with
what we saw generally in the U.S. We
have noticed places where lesson study
groups exist as professional development for experienced teachers, but
beginning teachers often are omitted”
(2007, p. 9). To overcome this reticence
on the part of American educators, the
National Commission on Teaching and
America’s Future (Fulton, Yoon & Lee,
2005) has adopted recommendations
that new teachers become deeply
engaged in learning communities
during the induction phase of their
careers. Such efforts are meant to address the observation noted by Wong,
Britton, and Ganser that “isolation is
the common thread and complaint
among new teachers in U.S. schools.
New teachers want more than a job.
They want to contribute to a group”
(2005, p. 384).
As the notion of PLCs has entered
the mainstream, concerns about the
fundamental definition of the term
have emerged. DuFour notes that
“the term has been used so ubiquitously that it is in danger of losing all
meaning. The professional learning
community model has now reached
a critical juncture, one well known to
those who have witnessed the fate of
other well-intentioned school reform
efforts. In this all-too-familiar cycle,
initial enthusiasm gives way to confusion about the fundamental concepts
driving the initiative, followed by
inevitable implementation problems,
the conclusion that the reform has
failed to bring about the desired
results, abandonment of the reform,
and the launch of a new search for
the next promising initiative” (2004,
p. 6). Michael Fullan identifies several “reasons to be worried about the
spread of professional learning communities. First, the term travels faster
and better than the concept. Thus we
have many examples of superficial
PLCs—educators simply calling what
they are doing professional learning
communities without going very deep
into learning and without realizing
they are not going deep … Second,
people make the mistake of treating
professional learning communities as
the latest innovation. Of course in a
technical sense it is an innovation to
the people first using it, but the moment
you treat it as a program innovation,
you run two risks. One is that people
will see it as one innovation among
many—perhaps the flavor of the year,
which means it can be easily discarded
once the going gets rough and as other
innovations come along the following
year” (2006, p. 10).
The Math and Science
Partnership program
Launched in 2002, the Math and
Science Partnership program at the
National Science Foundation is a
research and development effort to
build capacity and integrate the work of
higher education, especially its STEM
disciplinary faculty, with that of K-12
to strengthen and reform mathematics
and science education. Ultimately,
the MSP program seeks to improve
student achievement in mathematics
and science for all students, at all K-12
levels. MSP projects are expected to
incorporate creative, strategic actions
that extend beyond commonplace approaches in order to improve the depth
and quality of K-12 mathematics and
science education. A primary goal of
MSP projects is to develop and embellish strategies that deal with issues of
teacher quality, quantity, and diversity.
Because the preparation and diversity
of future teachers is also of concern,
many MSP projects strive to improve
undergraduate and graduate education
for those seeking to enter the teaching
profession.
The first call for proposals, MSP
Solicitation 02-061, remarked that
“teachers require support throughout
the professional education continuum
from recruitment, through preparation,
induction and continued professional
development in order to create and
sustain an excellent teaching force”
(NSF, 2002). Proposals were encouraged to offer solutions that would
“[s]trengthen the mathematics and
science teaching profession, especially in underserved areas, through
(a) recruitment of qualified individuals to become teachers, (b) preparation of future teachers in significant
content and pedagogy, (c) support of
the teacher certification process, (d)
policies that impact where teachers
are employed, (e) induction into the
field, and (f) continuing professional
development.” It is noteworthy that in
2002, and even in later years, PLCs
were emphasized as a significant
strategy for engaging K-12 teachers
and higher education faculty in only
a few proposals, including those that
succeeded through the merit review
process and thus were awarded funding. This has been true even though
the intent of the MSP program is to
forge partnerships among individuals
and institutions.
However, as MSP-funded projects
began to unfold and add to their repertoire of strategic interventions, it
became clear at conferences of the
MSP community and in early prepublications from project investigators
that PLCs have become a relatively
common vehicle for professional
development. Most often, PLCs have
been implemented as school-based
communities of teachers with a common purpose for their professional
development, and they often also
include higher education STEM and
education faculty. PLCs occur in many
of both the mathematics-focused and
the science-focused projects of the
MSP program, and with well over
100 MSP projects awarded to date, it
is clear that STEM PLCs are new exemplars of professional development
in the lives of thousands of teachers.
This article discusses several examples—made available by investigators
and staff—of science-focused MSP
projects from across the nation.
North Cascades and Olympic
Science Partnership (NCOSP),
led by Western Washington
University
During the first three years of the
project (2003-2006), the NCOSP
focused on developing a highly competent cadre of approximately 160
teacher leaders by increasing their
knowledge and skills concerning: (a)
science content, (b) considerations
related to effective science teaching
and learning (Bransford, Brown &
Cocking, 1999), (c) tools for effectively structuring collaborations
among teachers that aid in improving
student learning (such as Lesson Study,
Curriculum Topic Study, Formative
Assessment Probes, and Looking at
Student Work Protocols), and (d) strategies to develop effective professional
learning communities (Garmston &
Wellman, 1999). The teacher leaders
were given opportunities to practice
leadership through presenting, facilitating, coaching, and consulting with
teachers.
Subsequently, in Summer 2007, 105
out of the 160 NCOSP teacher leaders
involved in the project expanded the
scope of the partnership by developing
PLCs within their respective schools.
Each teacher leader collaborated with
higher education faculty and other
teacher leaders for one week in July
to develop three-day professional development activities that met the initial
needs of his/her school-based PLC.
Using what they had learned during
their first three years with NCOSP,
the teacher leaders focused their initial
three-day professional development
events on developing teachers’ science
content knowledge and understanding
of the ways in which people learn.
During the 2007-08 school year,
most of the PLCs used Curriculum
Topic Study, Formative Assessment
Probes, and Looking at Student Work
Protocols in a coherent sequence to
better understand students’ thinking
and determine ways to improve classroom instruction and student learning
in science. In the Summer of 2008,
the teachers from the PLCs attended
a week-long content immersion in
physical science while their teacher
leaders and administrators worked on
developing an action plan to guide the
work of the PLCs during the 2008-09
school year.
NCOSP examined the PLCs’
working processes and impacts on
teachers in order to obtain formative
and summative evaluation data that
the partnership could use to make
programmatic decisions and that
the PLCs could use to improve their
foci and practices. Multiple methods
were developed and used, including
a Professional Learning Community
Observation Protocol and a School
Capacity for Improvement—Survey
of Science.
A case study of one of the NCOSP
schools illustrates the process through
which a PLC became a key school
advisory board. During the 2007-08 school year, the NCOSP teacher
leader “Conny” provided leadership
for the science PLC at an elementary
school in rural northwest Washington
State. The PLC included one teacher
representative from each grade of the
K-5 school. The principal participated
in a few PLC meetings but mainly
supported the work of the PLC by
providing the teachers time to meet
as a group. During the initial three-day professional development event
that Conny developed and led for the
teachers in the PLC in August 2007,
she shared NCOSP tools and resources,
made the case for science reform with
a Minds of Our Own video, discussed
the research on How People Learn,
and had the teachers participate in a
one-day content immersion on light.
During this first professional development event and over the course of the
school year, the PLC teachers were
very willing to explore new content
and their own misconceptions in
order to develop further their content
knowledge in science. They quickly
determined their goals for the year
and initially focused on overcoming
the limited amount of science being
taught at the school.
The school already had FOSS science kits (see <www.fossweb.com>)
available at each grade level, so the
PLC recommended to the principal that
science be reintegrated in the school by
requiring that each K-5 teacher implement one FOSS kit per year. Conny, as
the school’s science specialist, would
teach a second FOSS kit at each grade
every year. As the FOSS kits began
to be used fully, thus increasing the
amount of time devoted to teaching
science, the PLC shifted its focus to
work on improving classroom assessment and grading in science, and began
exploring ways to improve teachers’
ability to assess students’ understanding through the use of science notebooks, formative assessment probes,
and questions similar to those on the
statewide assessment that would better
prepare students for these exams. In
the spring of 2008, the school decided
to deepen its emphasis in science by
having a building-wide science fair in
which the lower elementary students
presented their results from whole
class science projects and the upper
elementary students shared their
individual or group science projects.
This inaugural science fair brought
together teachers, students, parents,
and community members at the school
one evening in May. By this time, science appeared to permeate all aspects
of the school. As the principal wrote
in a school newsletter to all staff and
parents, “I am not kidding when I say
science is the bedrock subject that we
hang all of our teaching on, we know
science rules and we want our children
to think like scientists.”
The PLC had become a key advisory body in the school because it had the support of the principal and was structured so that each grade was represented, with each teacher representative facilitating the sharing of information between the PLC and the grade-level teams. The group had made considerable progress in increasing the amount of science
instruction. Although it is difficult to
make a direct attribution, the percentage of 5th grade students proficient on
the state science assessment increased
by 19.6% following the increase in
science instruction that took place
during the 2007-08 school year. This
finding encourages the continued use
of PLCs to increase emphasis on and
awareness of teachers’ roles in teaching and assessing science.
Boston Science Partnership
(BSP), led by the University of
Massachusetts - Boston
The BSP employs a professional
learning community model called
Collaborative Coaching and Learning
in Science (CCLS). CCLS is adapted
from a model originally developed
for the Boston Public Schools to support teaching of literacy. In the CCLS
model, a group of 3-8 science teachers
in a building meets once or twice per
week for an 8-16 session cycle. Each
group is led by a teacher and supported
by an “apprentice facilitator,” both of
whom receive training from the Boston
Public Schools Science Department.
A full CCLS cycle includes a course
of study about science teaching and
learning chosen by the participants,
research, observations and debriefs, a
review of student work, and reflective
documentation. Recent topics have
included writing in science, using
notebooks, assessing student understanding, using evidence to support
claims, student misconceptions, and
analyzing standardized test results.
CCLS groups were designed to depend far less on external staff resources than the groups
in the original Boston literacy model.
To accomplish this, Boston Public
Schools Science Department staff
members spend much of their time
providing specific on-site support
to CCLS groups as needed, including co-facilitating and/or providing
quarterly training sessions for teacher
facilitators. Three part-time staff
members support 30-35 CCLS groups
each year. As a result of these efforts,
some CCLS groups have successfully
become independent, self-sustaining
communities.
CCLS is an extremely flexible and adaptable model that can be tailored to a particular mission of the school or district. CCLS has
changed the nature of how teachers
teach and reflect on teaching and learning science. This was accomplished by
providing a context and culture that
supports on-going, research-informed,
in-depth conversations about science
teaching and learning. The external
evaluation, which consisted of observations, surveys, and interviews
of participants, administrators, and
district staff, found changes in teachers’ feelings about their effectiveness
in the classroom as well as a change
to the overall community of science
teachers across Boston. CCLS was
shown to expand teachers’ knowledge
of the science curriculum, advance an
atmosphere of professionalism, and
raise awareness among teachers and
administrators of the resources available from the district’s science department. Teachers also reported learning
about and implementing new teaching
strategies, focusing more on student
success and student understanding,
and gaining content knowledge. By
the spring of 2010, the BSP will have
findings that look at student outcomes
as a function of teacher participation in
CCLS; however, the formative evaluations, feedback from participants, and
informal observations indicate that
there have been important changes
to the community of science teachers
in Boston. Teachers feel they have
support and connections across the
district. They are familiar with their
peers’ teaching and are known by their
peers. They have a structured format
in which to talk about teaching and
learning in science. Teachers at all
stages of the professional continuum
can participate equally. Furthermore,
opportunities for professional growth
and recognition, such as the facilitator
and apprentice facilitator positions, are made accessible through the accompanying training and support. Participation in the
BSP (CCLS and other programs) is
a statistically significant contributor
to teacher retention. CCLS provides
an incentive to remain in Boston
by supporting a vibrant community
of practice. A core group of teacher
leaders in the district, many of whom
were first recruited through CCLS,
even formed a monthly science social
rotation that is hosted each month by
teachers from a different school. The
socials have continued for two years
now, and 50 to 100 science teachers
from across the district, as well as
STEM faculty and BSP project staff,
typically attend each social. Teachers
credit their desire to remain in the
school district to the professional
atmospheres of their schools and the
cohesive learning communities they
have formed.
BSP evaluators have found that
there are several characteristics common to successful CCLS groups. These
include: 1) support by school administrators, 2) a course of study chosen by
the teachers participating in the CCLS
group and alignment of that course of
study with the school’s mission, 3) a
sincere desire by teachers to participate
and development of trust among the
teachers in a CCLS group, 4) effective
facilitation and clear structure in CCLS
meetings, 5) authentic feedback offered by peers that includes both praise
and challenges with discussions that
focus on improving teaching practice,
and 6) recognition by participants
of connections between the chosen
course of study and the lessons observed. Implementation of CCLS has also encountered challenges that mirror most of these characteristics; where a characteristic is weak or absent, groups tend to struggle.
Three key contextual considerations
emerged as the most critical factors
necessary for successful implementation of CCLS: 1) at least a minimal
level of administrative support, 2) a
trained facilitator with the ability to
effectively lead a CCLS group, and
3) the prior existence of a moderately
well functioning science program in
the school. Lastly, it is critical that
someone with an understanding of
high-quality instruction serve as a facilitator or participant in the group so that high-quality, productive conversations can occur.
Institute for Chemistry Literacy
through Computational Science
(ICLCS), led by the University of
Illinois - Urbana-Champaign
A significant component of ICLCS,
which is now entering its fourth
year, has been the use of the Virtual
Professional Learning Community
(VPLC) to support rural high school
chemistry teachers who reside in different geographic areas across Illinois.
Among the ICLCS Fellows, i.e., the
teachers participating in ICLCS institutes, 24% are the only science teacher
in their small district. The project
has used Moodle, an open-source
course management application, as a
platform for a vibrant, active learning
community in which the emphasis
is on learning and the purpose of
professional development is student
achievement. ICLCS Fellows partner
with University of Illinois faculty,
students, and researchers as equals
to improve student achievement. The
total of 44,712 logins (June 2007-May
2009) and 16,428 postings among
100 Fellows, faculty, and ICLCS staff
shows that the VPLC has become
a powerful tool in the continued
professional development of ICLCS
Fellows.
The flexibility in time and space
provided by the asynchronous communication of the VPLC is important,
because it 1) allows for in-depth investigation and analysis of discussion
topics, which promotes deep thinking/
learning, and 2) creates opportunities
for more teachers and faculty to participate in the same discussion session,
which enhances collaboration and
social interaction. It also effectively
creates a network of experts and peers
who communicate regularly. Through
the use of social network analyses, the
interwoven web of communication is
being further studied over the remaining years of the project as ICLCS continues to gather longitudinal data on
the VPLC. However, there are definite
indications of early success. As one
Fellow noted, “[t]he networking with
others in my field has meant a great
deal to me. I have taught chemistry in
Illinois for over twenty years and knew
virtually no other chemistry teachers.
Now I have a HUGE network of fellow teachers I can use for support and
resources.”
The project implemented a randomized selection research design to measure the impact of ICLCS strategies
on students in participant classrooms.
Over the past two years, ICLCS has
observed a significant difference in
achievement between students of
Cadre I teachers (treatment group) and
those of the control group (Cadre II).
The Cadre I Fellows had completed
a full year of professional development, including participation in the
VPLC. In the following year, using an
American Chemical Society standardized test, ICLCS found that students
of Cadre I teachers had a 45% greater
gain in terms of content acquisition
than students of the Cadre II teachers.
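As a concrete illustration of the kind of treatment/control gain comparison reported here, the sketch below computes student gain scores and compares the two cadres; the file name, column names, and the use of Welch's t-test are assumptions made for illustration and are not details of the ICLCS analysis.

```python
# Hypothetical sketch of a treatment/control gain comparison; not the ICLCS analysis itself.
# Assumes a CSV with one row per student and columns 'cadre' ("I" or "II"),
# 'pre', and 'post' for scores on a common standardized test.
import pandas as pd
from scipy import stats

df = pd.read_csv("student_scores.csv")        # hypothetical file
df["gain"] = df["post"] - df["pre"]           # simple gain score per student

gain_treatment = df.loc[df["cadre"] == "I", "gain"]
gain_control = df.loc[df["cadre"] == "II", "gain"]

# Relative difference in mean gain; a value of 1.45 would correspond to a 45% greater gain
relative_gain = gain_treatment.mean() / gain_control.mean()

# Welch's t-test as one plausible check that the difference is unlikely to be due to chance
t_stat, p_value = stats.ttest_ind(gain_treatment, gain_control, equal_var=False)
print(f"Relative gain (Cadre I / Cadre II): {relative_gain:.2f}, p = {p_value:.3f}")
```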
ICLCS staff is continuing to examine
these trends and the VPLC at large in
order to understand the impact of its
interventions on teacher learning and
student achievement.
Project Pathways, led by Arizona
State University
In their original design, Pathways
staff included PLCs as part of the intended plan. However, the project team
initially underestimated the support
that teachers in PLCs would need to
shift their instruction to have a primary
focus on student thinking and learning
while utilizing inquiry as a primary
mode of instruction. Pathways also did
not anticipate the many school-based
obstacles that emerged during its effort
to establish PLCs in the schools. Over
the past four years, the Pathways PLC
research team has utilized qualitative
methods to code videos of PLC meetings in order to identify the essential
attributes of highly effective content-based PLCs.
The Pathways PLCs are composed
of 3-7 teachers who teach the same
course. These teachers meet weekly
to discuss issues of knowing, learning,
and teaching the ideas that are central
to that course. The PLCs are initially
structured with an agenda that aids the
facilitator in promoting meaningful
reflection and discourse among all
members of the PLC. In the absence
of a PLC facilitator who holds teachers
to high standards for verbalizing the
processes involved in knowing, learning, and teaching content, Pathways
research has revealed that PLC discussions tend to be superficial and teachers
make little progress in shifting their
classroom practices (Carlson, Moore,
Bowling & Ortiz, 2007). As a result,
Pathways PLCs currently designate a
lead teacher to serve as a facilitator.
All facilitators within a school attend a
four-day facilitator training workshop
and weekly coaching meetings that are
designed to support them in learning to
guide the PLC conversations so as to
assure that teachers “speak meaningfully” about the processes involved
in knowing and learning the content
(Clark, Carlson & Moore, 2007). If a
teacher is vague in expressing what it
means to understand, learn, or teach an
idea, the facilitator is responsible for
posing questions that will encourage
members of the PLC to clearly express their ideas about the issue under discussion.
A good facilitator must have strong
content knowledge about the subject
area that is the focus of the PLC. The
facilitator must also be interested and
able to inquire into the thinking of other
members of the PLC. This requires the
facilitator to be a good listener who is
able to make sense of the meanings
conveyed by others (Carlson, Moore,
Bowling & Ortiz, 2007).
Pathways researchers have found
that before teachers are ready to
develop new lessons to improve the
teaching of specific ideas, they must
first inquire into: 1) student thinking
relative to these specific ideas, 2)
the processes involved in learning
the specific ideas, and 3) the degree
to which their students are currently
learning about the specific ideas. In the
most recent iteration, Pathways found
that after one year of meeting weekly
in PLCs that emphasized content, the
teachers were ready for extended work
in the summer that prepared them to
make substantive shifts in their curriculum, assessments, and pedagogical approaches. At this stage of their
development, the teachers also express
willingness to videotape their new lessons and present video clips from their
classrooms as artifacts for discussion
with other members of their PLC.
Additionally, the Pathways team
has found that the school principal and
STEM department chairs are critical to
the institutionalization of PLCs within
a school. For the purposes of institutionalizing PLCs, desirable qualities
of a principal include: 1) willingness
to rearrange schedules to accommodate content-focused, school-based
PLCs for one hour during the work
week, 2) support of inquiry-based and
conceptually-oriented teaching, and 3)
willingness to work through logistical
obstacles to facilitate participation
by all teachers in the workshop or
course and weekly PLC meetings.
The researchers have concluded that
shifts in secondary mathematics and
science teaching practice are achieved
when teachers have opportunities to
re-conceptualize and revise their curriculum and instructional practices to
align with inquiry-based instruction.
Research on the practices of secondary mathematics and science teachers
has revealed that teachers’ images of
teaching and curriculum are deeply
rooted in their experiences and that
these experiences have often consisted predominantly of stand-and-deliver, procedurally oriented instruction. Because of their deep-rooted beliefs about
teaching and learning and previous
experiences, teachers typically need
an external support system in addition
to more developed content knowledge
in order to realize substantive shifts in
their classroom practices.¹
¹ In the Pathways project, teachers either enrolled in a two-course graduate sequence or attended eight half-day workshops focused on improving teachers’ content knowledge for teaching.
Vertically Integrated Partnerships
K-16 (VIP K-16), led by the
University System of Maryland
VIP K-16 has brought together
several Maryland institutions of
higher education and high schools in
the Montgomery County (Maryland)
Public Schools district in order to
promote inquiry-based learning in
the sciences, both in high schools and
at the undergraduate level. Learning
communities became the commonly
accepted strategy for teachers and
faculty to exchange information, interact and observe instruction, share
research endeavors, reflect on teaching practices, and reform curriculum
at all levels. Although several PLCs
included only faculty or only high
school teachers (usually because of
geographical limitations), several
had participants from across the K-16
spectrum.
In one example of developing PLCs,
project leaders at the University of
Maryland, Baltimore County developed bi-annual colloquia that brought
faculty, graduate students, and high
school teachers together to explore
inquiry instruction in science. Nearly
80 people were involved in three colloquia. At the first such colloquium,
participants self-selected into smaller,
sustained PLCs that met as small
groups (of 1-2 faculty and 1-3 teachers) throughout the year. Ultimately,
7 faculty and 10 teachers participated
in these groups. The PLCs designed
inquiry-based lesson plans for high school and undergraduate courses, and some teachers and faculty spent time visiting each other’s classes.
One PLC contributed to the development of a graduate teaching assistant
training program for the mathematics
department.
Another type of PLC was designed
by project leaders at the University of
Maryland Biotechnology Institute.
Over a four-year period, the program placed nearly 40 high-school
teachers (8-10 each year) in research
laboratories during the summer and
supplemented their experience with
a pedagogical learning community
that was established to help teachers
translate their laboratory experiences
into inquiry lessons in the classrooms.
During the summer and in follow-up
meetings during the academic year,
the teachers and faculty (in science
and in science education) met regularly to talk about and challenge their
own notions of scientific inquiry and
redesign their instructional practices in
response to those discussions. Survey
instruments and learning-community
observations were employed as well
as an inquiry-teaching rubric modified
from Llewellyn (2002). The results
indicate significant increases in teachers’ understanding and use of inquiry
instruction over the course of the year.
This strategy, dubbed “ExPERT”
(Extended Professional Experiences in
Research for Teachers), was one of the
most successful learning community
strands in the project.
Measuring PLCs
Implicit in the design of MSP
projects offering professional development for teachers is the belief
that these projects will result in new
learning among the teachers, which
will then translate into improved learning opportunities for students. How
do investigators know that creating
PLCs results in new and meaningful
interactions among teachers or that
PLCs result in changes in classroom
practice that benefit students? As part
of a national research and development
effort, MSP projects are expected to
collect data to document their work
and use that data to inform future
directions and provide insights to the
field on methods of analysis that are
effective at measuring indicators of
success. Rigorous assessment of the
impact that professional development
has on teachers and their students
requires the development of tools and
instruments accompanied by piloting,
revision, and field-testing.
Two of the projects discussed above
have developed instruments for observing PLCs. In NCOSP, the investigators
developed and used the Professional
Learning Community Observation
Protocol, which is an instrument structured around the project values that
had been identified as key elements
of an effective PLC: Shared Vision
and Ways of Working, Collaboration,
and Reflective Dialogue. These three
elements combine to help foster
open communication among group
members so that they develop common norms, vision, and goals. The
two main purposes of this protocol
are to: 1) build and deepen a shared
understanding of what it means to work
effectively as a PLC, and 2) provide a
meaningful tool for self-monitoring a
PLC’s development.
Project Pathways researchers
are currently refining their Learning
Community Observation Protocol
(LCOP), which is a tool used by
project staff to determine the degree
to which a PLC is engaging in genuine
inquiry and meaningful conversations
about knowing, learning, and teaching
specific content (Sutor, Oehrtman &
Carlson, in preparation). The LCOP
is being designed to determine if PLC
members are “productively engaged”
during sessions and if group members
reflect on and discuss problems related
to student thinking and understanding,
problems of teaching practice, ways
to unpack mathematical or scientific
ideas, and/or problems related to communication with peers. The Pathways
team has found that productive engagement in PLCs is characterized by members contributing to the discussion in meaningful ways and encouraging others to do the same, by the group engaging in a reflective rather than routine way, and by the group treating important issues as problematic.
In contrast, unproductive characteristics appear when the PLC group
works routinely through the agenda
without reflective engagement with
the material, allowing a) some members not to be engaged in the intended
activity of the group, b) exclusion of
some members of the group by more
engaged members, c) an excess of time
to be spent on extraneous discussion,
and/or d) a failure of the group to
value the time spent in the learning
community.
Other projects funded by the MSP
program have developed additional
methods to measure impacts of PLCs,
and it is anticipated that this research,
such as the two examples that follow, will be made available to others
across the nation who are interested in
assessing the effects of professional
development.
Partnership for Reform in Science
and Mathematics (PRISM), led by
the University System of Georgia
PRISM is a large-scale project
with state and regional partners. The
state partners include the University
System of Georgia, which is the public
higher education state agency, and the
Georgia Department of Education,
which is the K-12 state agency. Four
regional P-16 (‘P’ is for pre-Kindergarten) partnerships include at least one
institution of higher education (IHE)
and one K-12 system, which results
in a total of 6 IHEs and 15 school
districts participating in PRISM. In
order to increase the quality of science and mathematics teaching and
learning in Georgia, PRISM initiated
10 focal strategies. One of the strategies is to “engage higher education
and P-12 faculty in learning communities” (see <http://www.gaprism.
org/about/strategies.phtml>). Over
multiple years, PRISM has developed
evidence showing consistent, positive effects of PLCs on teaching and
learning practices (Monsaas, 2006;
Hessinger, 2009).
To provide evidence about the
impact of PLCs, PRISM has used the
Inventory of Teaching and Learning
(ITAL), which is a self-report survey
that was developed by a team of
PRISM evaluators to assess teachers’ reported emphasis on reformed
teaching and learning practices (Ellett
& Monsaas, 2007). Reformed teaching was characterized as primarily
learner-centered, whereas more traditional teaching was characterized
as primarily teacher-centered. The
inquiry questions on the ITAL were
derived from the observation categories and assessment indicators of
the Reformed Teaching Observation
Protocol (RTOP) developed at Arizona
State University (Sawada et al., 2000).
Additional items were developed to
assess teachers’ reported use of standards-based teaching and learning
practices and traditional practices.
The inquiry items reflected reformed
teaching and learning activities (e.g.,
encouraging students to evaluate their
own thinking throughout the lesson)
and the traditional scale reflected
more traditional teaching practices
(e.g., evaluating learning and performance on the basis of right and wrong
answers). Teachers used a six-point
scale ranging from 1=No Emphasis
to 6=Very Strong Emphasis to rate the
extent to which they emphasized each
ITAL teaching and learning activity in
their classrooms. Principal components
analyses supported three subscales of
the ITAL: Inquiry-Based Teaching and
Learning (30 items), Standards-Based
Teaching and Learning (10 items), and
Traditional Teaching and Learning
(12 items) (Ellett & Monsaas, 2007).
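As a rough illustration of how a subscale structure of this kind can be examined and scored, the sketch below runs a principal components analysis on simulated six-point responses and then averages items into three subscale scores. The item names, groupings, and simulated data are placeholders; they do not reproduce the actual ITAL items or the evaluators' procedures.

```python
# Illustrative sketch only: examining item structure with PCA and scoring subscales.
# The item blocks and data are simulated placeholders, not the actual ITAL instrument.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_teachers = 300
# Simulated 1-6 responses for three hypothetical item blocks (30, 10, and 12 items)
items = {f"inq_{i}": rng.integers(1, 7, n_teachers) for i in range(30)}
items.update({f"std_{i}": rng.integers(1, 7, n_teachers) for i in range(10)})
items.update({f"trad_{i}": rng.integers(1, 7, n_teachers) for i in range(12)})
responses = pd.DataFrame(items)

# PCA on standardized items; in practice one inspects the loadings to see whether
# the leading components correspond to the intended subscales.
z = (responses - responses.mean()) / responses.std()
pca = PCA(n_components=3).fit(z)
print("Variance explained by first three components:", pca.explained_variance_ratio_)

# Subscale scores as item means on the original 1-6 scale
scores = pd.DataFrame({
    "inquiry": responses.filter(like="inq_").mean(axis=1),
    "standards": responses.filter(like="std_").mean(axis=1),
    "traditional": responses.filter(like="trad_").mean(axis=1),
})
```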
In addition to the ITAL questions
about teaching and learning practices,
several demographic questions (e.g.,
grade level and science and/or mathematics courses taught) and questions
about participation in PRISM activities
were asked, including whether the responding teacher participated in a PRISM learning community and whether a higher
education faculty member participated
in the PLC.
The ITAL has been given to thousands of teachers across Georgia,
including those who participated
in PRISM PLCs and those who did
not, and statistical analyses were run separately in the springs of 2006, 2007, 2008, and 2009. The dependent
variables were the three subscales of
the ITAL and the independent variable was participation in a PRISM
PLC. The results were consistent over
the four collection times and showed
that participation in a PRISM PLC is
associated with greater emphasis on
standards-based teaching and learning practices in both mathematics and
science K-12 classrooms. Moreover,
the PRISM team also found that participation of an IHE faculty member
has an additional, positive impact on
teachers’ reported use of inquiry-based
teaching and learning.
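A minimal sketch of the year-by-year comparison described here might look as follows; the data layout, variable names, and the use of Welch's t-test are illustrative assumptions rather than the PRISM evaluators' actual method.

```python
# Hypothetical sketch: comparing ITAL subscale means by PLC participation, separately by year.
# Column names and the analysis choice are assumptions, not the PRISM evaluation code.
import pandas as pd
from scipy import stats

ital = pd.read_csv("ital_responses.csv")      # hypothetical file, one row per teacher
subscales = ["inquiry", "standards", "traditional"]

for year, cohort in ital.groupby("year"):
    in_plc = cohort[cohort["prism_plc"] == 1]
    not_in_plc = cohort[cohort["prism_plc"] == 0]
    for scale in subscales:
        diff = in_plc[scale].mean() - not_in_plc[scale].mean()
        t_stat, p_value = stats.ttest_ind(in_plc[scale], not_in_plc[scale], equal_var=False)
        print(f"{year} {scale}: mean difference = {diff:.2f}, p = {p_value:.3f}")
    # A further split on whether an IHE faculty member participated in the PLC
    # could be examined the same way within the participating group.
```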
Developing Distributed
Leadership, led by Northwestern
University in collaboration with
the Math in the Middle Institute
Partnership of the University of
Nebraska - Lincoln
This collaboration between an
MSP-funded research project and a
partnership project (entitled Math in
the Middle) focuses on PLCs for mathematics education (see Pustejovsky,
Spillane, Heaton & Lewis, 2008), examining different dimensions
of middle school mathematics by
comparing them to other subjects
(e.g., Language Arts). One component
of this work explored the validity
of a social network instrument (the
Social Network Survey) for studying
subject-specific leadership and social
influence in schools, with particular
attention to question-order effects
(Pustejovsky & Spillane, 2008; Pitts
& Spillane, 2009).
The Social Network Survey was
administered to all certified staff in
each of the ten middle schools in the
partnership. School-level response
rates ranged from 70% to 94% for
teaching staff and were slightly lower
for administrators and other certified staff. The survey collected data
on different dimensions of the PLC.
Seven sets of measures, comprising
46 items in total, measured teachers’
views on the social norms within their
school, including:
• Trust among teachers (6 items)
• Trust between teachers and the
Principal (8 items)
• Teachers’ evaluation of the
Principal’s instructional leadership (7 items)
• Collective responsibility for
student learning: peer-assessed
(7 items)
• Collective responsibility for
student learning: self-assessed
(7 items)
• Teachers’ control over classroom practice (5 items)
• Openness to innovation (6
items)
Network data were collected in
order to measure the structural and
content aspects of the PLC, and
network ties (i.e., linkages between
individuals) were measured by asking
respondents to list the people “to whom
they go for advice and information”
about several topics. All teachers were
asked about mathematics and reading/
writing/language arts. Additionally, all
subject-specific teachers were asked
about their primary subject. For each
tie listed by a respondent, data was collected on the tie’s designated role, the
frequency of contact between respondent and advisor, the influence of the
advisor on the respondent’s practice,
and the content of the interaction between respondent and advisor. Content
was measured along five dimensions:
deepening content knowledge, planning or selecting course content and
materials, approaches for teaching
content to students, strategies specifically aimed at assisting low-performing students, and assessing students’
understanding of the subject.
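To make the tie data described above concrete, the sketch below loads hypothetical respondent-advisor pairs into a directed graph and computes two quantities of the sort discussed later in this section: how often a person is named as an advisor and how many of a respondent's advisors sit outside his or her school. The file, column names, and use of the networkx library are assumptions for illustration only.

```python
# Illustrative sketch: building a subject-specific advice network from survey ties.
# The edge list and column names are hypothetical, not the actual Social Network Survey data.
import pandas as pd
import networkx as nx

# One row per reported tie: who asked, whom they named, and the school of each person
ties = pd.read_csv("math_advice_ties.csv")    # hypothetical file
G = nx.DiGraph()
for _, row in ties.iterrows():
    G.add_edge(row["respondent"], row["advisor"], frequency=row["frequency"])

# How often each person is named as an advisor (in-degree)
named_as_advisor = dict(G.in_degree())

# For each person, count named advisors located outside that person's school
school_of = {**dict(zip(ties["respondent"], ties["respondent_school"])),
             **dict(zip(ties["advisor"], ties["advisor_school"]))}
external_ties = {
    person: sum(1 for adv in G.successors(person)
                if school_of.get(adv) != school_of.get(person))
    for person in G.nodes
}
```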
The collaboration’s ongoing analyses suggest that there is considerable variation across schools in the
structure of the PLCs, even though
the norms and substance of PLCs
appear to be relatively homogeneous
across schools (e.g., regarding norms,
between-school variation ranges from
only 2% for teacher control over classroom practice to 7% for instructional
leadership). Although school-level
averages do not vary greatly, there
do appear to be differences in the
homogeneity of attitudes within each
school; respondents in some schools
have a high level of agreement about
the principal’s instructional leadership,
while respondents in other schools
show a much greater range of opinions.
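The between-school variation figures quoted above can be read as the share of total variance in a scale that lies between schools. One conventional way to estimate that share is a one-way random-effects intraclass correlation, sketched below; the data frame and column names are placeholders rather than the study's data.

```python
# Hypothetical sketch: estimating the between-school share of variance for one survey scale.
# The data frame and column names are placeholders, not the Distributed Leadership data.
import pandas as pd

def between_school_share(df: pd.DataFrame, scale: str, school: str = "school") -> float:
    """One-way random-effects ICC(1) via the classic ANOVA estimator."""
    groups = df.groupby(school)[scale]
    k = groups.size().mean()                  # average number of respondents per school
    n_schools = groups.ngroups
    grand_mean = df[scale].mean()
    ss_between = (groups.size() * (groups.mean() - grand_mean) ** 2).sum()
    ss_within = ((df[scale] - groups.transform("mean")) ** 2).sum()
    ms_between = ss_between / (n_schools - 1)
    ms_within = ss_within / (len(df) - n_schools)
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# A returned value of 0.07, for example, would correspond to roughly 7% between-school variation:
# share = between_school_share(survey, "instructional_leadership")
```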
There is also considerable variation
in terms of the structure of PLCs,
both by school and across subject areas. Schools varied in the degree to
which the subject-specific networks
spanned the formal organization of
the school and the degree to which
teachers’ networks reached outside the
school to access advice and information. Schools and subject areas also
varied in their network concentration.
For example, math networks generally appeared more concentrated than
reading/writing/language arts. Finally,
the researchers observed that Math in
the Middle associates are prominent
brokers of information both within
schools and between schools and their
environment. The associates tended
to be named as advisors by more
individuals within their schools as
compared to other teachers in similar
roles. Moreover, associates sought
advice from more sources outside of
their schools, compared to their colleagues, and many of their external ties
were with other Math in the Middle
associates at different schools. All
in all, this work on PLCs in schools
shows great promise. The collaboration of the research and partnership
projects is now exploring relationships
between teacher networks and student
achievement.
Conclusion
Over the past decade, professional
learning communities have been identified by many schools as an effective
approach to increasing collaboration
among educators. As such, PLCs
challenge the stereotype that teachers
work in isolation and, instead, open the
classroom door wide so that teachers
can discover ways to improve their
craft through group effort, discuss with
others ways to improve the education
of all students, and generally create
a culture of mutual support within
school walls. A literature on PLCs in
science education has begun to appear, and the projects of the National
Science Foundation’s MSP program,
which emphasizes partnerships within
and across schools as well as with
institutions outside of schools such
as colleges and universities, have
become especially fruitful sources of
new experiments with PLCs in varied
manifestations. With the development of new tools and instruments to
measure their impact, MSP projects
anticipate identifying additional outcomes from their work and, thus, will
inform the decisions that all educators
must make to improve teaching and,
ultimately, learning.
References
Bransford, J. D., Brown, A. L., and Cocking, R. R. (eds.) (1999). How people
learn: Brain, mind, experience, and
school. Washington, DC: National
Academy Press.
Britton, E.D. (2007). Roles of Communities in Mathematics and Science
Teacher Induction: Issues and International Examples. Presented at the
National Commission on Teaching and
America’s Future Conference on the
Induction of Science and Mathematics
Teachers into Professional Learning
Communities, Wingspread Conference
Center, Racine, WI. <http://www.nctaf.
org/resources/events/documents/nctafbritton012908.final.pdf>.
Carlson, M. P., Moore, K., Bowling, S.,
and Ortiz, A. (2007). The role of the
facilitator in promoting meaningful
discourse among professional learning
communities of secondary mathematics and science teachers. In T. Lamberg
& L. R. Wiest (eds.), Proceedings of
the 29th annual meeting of the North
American Chapter of the International
Group for the Psychology of Mathematics Education (841-8). Reno, NV:
University of Nevada.
Clark, P. G., Carlson, M., and Moore, K.
(2007). Documenting the emergence
of “speaking with meaning” as a sociomathematical norm in professional
learning community discourse. In T.
Lamberg & L. R. Wiest (eds.), Proceedings of the 29th annual meeting
of the North American Chapter of the
International Group for the Psychology of Mathematics Education (862-4).
Reno, NV: University of Nevada.
DuFour, R. (2004). What Is a “Professional
Learning Community”? Educational
Leadership, 61 (8), 6-11.
DuFour, R., Eaker, R., and DuFour, R.
(eds.). (2005). On Common Ground:
The Power of Professional Learning
Communities. Bloomington, IN: Solution Tree.
Ellett, C. D. and Monsaas, J.A. (2007).
Summary of the Development and
Use of the Inventory for Teaching
and Learning (ITAL) in the External
Evaluation of the Georgia Partnership
for Reform in Science and Mathematics
(PRISM). In the MSP Toolbox / Materials of MSPnet. <http://hub.mspnet.
org/index.cfm/14286>.
Feger, S. and Arruda, E. (2008). Professional learning communities: Key
themes from the literature. Review
conducted by The Education Alliance,
Brown University, Providence, RI.
<http://www.alliance.brown.edu/pubs/
pd/PBS_PLC_Lit_Review.pdf>.
Fullan, M. (2006). Leading professional learning: Think ‘system’ and
not ‘individual school’ if the goal is
to fundamentally change the culture
of schools. School Administrator,
63(10), 10-14.
Fulton, K., Yoon, I. and Lee, C. (2005).
Induction into learning communities.
Policy Paper of the National Commission on Teaching and America’s
Future, Washington, DC. <http://www.
nctaf.org/documents/NCTAF_Induction_Paper_2005.pdf>.
Garmston, R. and Wellman, B. (1999).
The adaptive school: A sourcebook for
developing collaborative groups.
Norwood, MA: Christopher-Gordon
Publishers, Inc.
Hessinger, S. (2009). Professional Learning Communities. In J.S. Kettlewell
and R.J. Henry (eds.), Increasing the
competitive edge in math and science
(101-120). Lanham, MD: Rowman and
Littlefield.
Hord, S.M. (1997a). Professional learning communities: What are they and
why are they important? Issues about
Change, 6 (1).
Hord, S.M. (1997b). Professional learning communities: Communities of
continuous inquiry and improvement.
Austin, TX: Southwest Educational
Development Laboratory.
Llewellyn, D. (2002). Inquire Within:
Implementing Inquiry-Based Science
Standards. Thousand Oaks, CA: Corwin Press.
Monsaas, J.A. (2006). Engaging Higher
Education Faculty in K-16 Learning
Communities to Improve Teaching and
Learning in Science and Mathematics
in the K-12 Schools. Paper presented at
the MSP Evaluation Summit II, Minneapolis, MN, October 4-5, 2006. <http://
hub.mspnet.org/media/data/Monsaas.
pdf?media_000000002242.pdf>.
Mundry S. and Stiles, K.E. (eds.). (2009).
Professional learning communities
for science teaching: Lessons from
research and practice. Arlington, VA:
National Science Teachers Association.
National Science Foundation (2002).
Math and Science Partnership (MSP).
Arlington, VA. <http://www.nsf.
gov/publications/pub_summ.jsp?ods_
key=nsf02061>.
Pitts, V. and Spillane, J. (2009). Using Social Network Methods to Study School
Leadership. International Journal of
Research and Method in Education,
32(2), 185-207.
Pustejovsky, J.E. and Spillane, J.P. (2008).
Question-Order Effects in Social Network Name Generators. Working paper
of the Distributed Leadership Study,
Northwestern University, Evanston, IL.
<http://www.sesp.northwestern.edu/
docs/Question-order_effects_in_social_network_name_generators.pdf>.
Pustejovsky, J.E., Spillane, J.P., Heaton,
R.M. and Lewis, W.J. (2008). Understanding Teacher Leadership in Middle
School Mathematics: A Collaborative
Research Effort. Working paper of
the Distributed Leadership Study,
Northwestern University, Evanston, IL.
<http://www.sesp.northwestern.edu/
docs/Understanding_Teacher_Leadership_in_Middle_School_Mathematics.pdf>.
Sawada, D., Piburn, M., Turley, J., Falconer, K., Benford, R., Bloom, I., and
Judson, E. (2000). Reformed Teaching
Observation Protocol (RTOP) Training
Guide. ACEPT Technical Report No.
1N00-2, Tempe, AZ: Arizona Collaborative for Excellence in the Preparation of Teachers. <http://physicsed.
buffalostate.edu/pubs/RTOP/RTOPTrgGd_IN002.pdf>.
Senge, P.M. (1990). The fifth discipline:
The art and practice of the learning organization. New York, NY:
Currency Doubleday.
Senge, P., Cambron-McCabe, N., Lucas,
T., Smith, B., Dutton, J., and Kleiner, A.
(2000). Schools that learn: A fifth discipline fieldbook for educators, parents,
and everyone who cares about education. New York, NY: Doubleday.
Stigler, J.W. and Hiebert, J. (1999). The
teaching gap: Best ideas from the
world’s teachers for improving education in the classroom. New York, NY:
Free Press.
Wong, H. K., Britton, T., and Ganser, T.
(2005). What the World Can Teach
Us About New Teacher Induction.
Phi Delta Kappan, January 2005,
379-384.
James E. Hamos, Kathleen B. Bergin, Daniel
P. Maki, Lance C. Perez, Joan T. Prival,
Daphne Y. Rainey, Ginger H. Rowell, and
Elizabeth VanderPutten are Program Directors, Division of Undergraduate Education,
National Science Foundation, Arlington, VA
22230. Correspondence concerning this article
should be sent to <jhamos@nsf.gov>.
Acknowledgement: The authors heartily acknowledge the intellectual and written
contributions of the investigators and staff of
the following Math and Science Partnership
projects: Boston Science Partnership; Developing Distributed Leadership: Understanding
the Role of Boundary Tools in Developing and
Sustaining Leadership for Learning Networks;
Institute for Chemistry Literacy through
Computational Science; North Cascades and
Olympic Science Partnership; Partnership for
Reform in Science and Mathematics; Project
Pathways; Vertically Integrated Partnerships
K-16.
We have been pleased to work with an exceptional team of staff members: Philis Hauser,
Michael Jugan, Darnita Kizer and Jessica
Slater. We also acknowledge the considerable
contributions of the following senior Program
Officers who worked in the Math and Science
Partnership program in earlier years: Costello
Brown, Deborah Crawford, Joyce Evans,
and Diane Spresser. Lastly, we acknowledge
Kanwal Singh who helped us first identify, in
2003, professional learning communities in
projects of the Math and Science Partnership
program.
Opinions and conclusions expressed in this
article are those of the authors and do not
necessarily reflect the views of the National
Science Foundation.
Yufeng Qian
3D Multi-User Virtual Environments:
Promising Directions for
Science Education
Centered on the theme of scientific inquiry, this article describes a number
of 3D multi-user virtual environment programs and their potential for
improving science learning.
Our nation’s students fall short
in science. The Department of
Education’s 2000 National Assessment
of Educational Progress (NAEP), also
known as “The Nation’s Report Card,”
showed no improvement in student
science performance between 1996
and 2000 in grades four and eight,
and a slight decline in performance by
twelfth-graders. While results from the
2005 NAEP indicated improvement
for elementary school students in
science achievement over the last
decade, middle school scores have
remained flat, and high school scores
have continued to decline since 1996,
in sharp contrast to the large gains in
math, and slower but still significant
gains in reading (National Assessment
of Educational Progress, 2005). A
recent report from the National Center
for Education Statistics revealed that
American students scored below
average on science literacy in the 2006
Program for International Student
Assessment (PISA), trailing their peers
in 16 of 30 industrialized countries
(National Center for Education
Statistics, 2007).
Lamenting the “statistically and
morally significant” fall in science
results, Rod Paige, former Education
Secretary, warned that “(e)veryone
should be concerned—82% of our
high school seniors are not performing
at the proficient level in science,”
(Leath, 2001) which could threaten
the country’s economic future and
damage national security in the long
run. In reaction to U.S. students’
science performance in 2006 PISA,
Senta Raizen, Director of WestEd’s
National Center for Improving
Science Education, pointed out that
U.S. students “seem to lack a strong
grasp of the nature of science, and of
science’s important role in society”
(Cavanagh, 2007).
What obstacles have hindered U.S.
students’ performance in science?
While studies have identified a number
of factors that have contributed to the
decline of science education, such as
shortage of highly qualified teachers
and inadequate support from the
public system and community, two
pressing issues are in need of a rapid
response. First, compared to reading
and math, which by law are the
nation’s educational priority, much less
attention and thus relatively limited
time are devoted to science teaching,
especially in elementary schools.
The results of the National Survey of
Science and Mathematics Education:
Trends from 1977 to 2000 showed
that, while mathematics continues
to be taught virtually every day in
grades 1-12, only about 70 percent
of elementary classrooms receive
science instruction every day (Smith,
Banilower, McMahon, & Weiss, 2002).
A more recent study investigated
the status of science education in
California Bay Area elementary
schools, a region that is home to much U.S. innovation in science and technology but whose state ranked second lowest of all states in science on the 2005 NAEP. This
study showed a diminishing amount
of time spent on science since the
enactment of No Child Left Behind,
and schools in program improvement
status reported that little to no time was
being spent on science at all because
of their need to show improvements
in the tested subjects of language arts
and mathematics (Dorph, Goldstein,
Lee, Lepori, Schneider, & Venkatesan,
2007).
Parallel to the time constraints, the
dominant science instruction pedagogy
is problematic. Heavily influenced by
the high-stakes tests and standards-based curriculum and exacerbated
by time constraints, science teachers
have focused primarily on delivering
the outcomes of science to their
students, as opposed to engaging them
in the inquiry process. This deviates
from the nature of science learning.
Compared to their peers in high-achieving countries (such as Japan,
Australia, and the Netherlands), U.S.
science teachers tend to present science
content as a collection of discrete facts,
definitions, and algorithms rather than
as a connected set of ideas, and high-interest activities and real-life issues
are usually designed and introduced
as a side-bar to motivate and engage
students, rather than being used as
tools for developing concepts (Roth &
Garnier, 2007). The National Science
Education Standards emphasize
that scientific inquiry is at the heart
of science and science education
(National Research Council, 1996); the
National Science Teachers Association
(2004) suggests that all K-16 teachers
embrace scientific inquiry. Although
inquiry has been strongly recommended as a best practice in science education for more than a decade, its implementation in the classroom is, unfortunately, often misguided. Many
teachers, unclear about how to
implement inquiry in the classroom,
substitute real scientific inquiry with
traditional “cookbook” experiments
(Wallace & Louden, 2002).
Exemplary 3D Multi-User
Virtual Environments for
Science Education
Leveraged by federal support to
address the crisis in science
education, scientists, science education
researchers, and school teachers have
started to join efforts to explore how
to maximize the use of emerging
technologies to improve science
teaching and learning. There has been a
surge of interest in the use of emerging
three-dimensional (3D) multi-user virtual
environment (MUVE) technology
to engage and motivate learners,
support authentic scientific inquiry,
and facilitate students’ construction of
science knowledge and development
of inquiry skills in a socially situated
and distributed environment. Made
popular by SecondLife, the 3D MUVE
is an immersive 3D virtual space
where people, entering the space via
avatars, meet and interact with one
another and learn in the multi-user
environment in real time. A variety of
3D MUVE programs has rapidly burst
into the limelight in science education,
including Harvard University’s
River City, Indiana University’s
Quest Atlantis, Cornell University’s
SciCentr, and North Dakota State
University’s Geology Explorer and
Virtual Cell.
River City
River City, funded by the National
Science Foundation and developed
by the Graduate School of Education
at Harvard University, is a scientific
inquiry-based 3D MUVE that targets
middle school students. Adopting a
storyline—a familiar game design
scheme among mainstream games—
River City is set in a 19th-century city
with a river running through it, and
its citizens face a chronic illness. The
students’ task is to find out why the
residents of River City are getting sick
and what can be done to help them.
The problems are interdisciplinary and
integrate aspects of science, history,
and social studies, allowing students
to experience real world inquiry skills
that are required when disentangling
multi-causal problems in a complex
environment.
Centered on the scientific inquiry
skills and on the content in biology
and ecology that are embedded within
historical, social, and geographical
contexts, River City guides students
through making observations, posing
questions, developing hypotheses,
investigating, explaining, predicting,
proposing answers, and communicating
the results in the form of a letter to
the Mayor of River City. River City
has been implemented successfully
in twelve states in the U.S., and has
involved approximately 100 teachers
and over 5,000 students in 2007-2008
(Harvard University, 2008).
Quest Atlantis
Quest Atlantis, funded by the
National Science Foundation and
MacArthur Foundation and developed
by the Center for Research on Learning
& Technology at Indiana University,
is another widely cited innovative
science learning program. Similar
to River City, Quest Atlantis is a 3D
multi-user online learning community
intended to engage children ages 9-12
in science learning. Its legend is that the
people of “Atlantis” face an impending
disaster; their world is slowly being
destroyed through environmental,
moral, and social decay. The task of the
project is to save Atlantis. Leveraging
3D technologies and game-based
methodologies, the problems are
presented in an interactive narrative
in which the ‘‘reader’’ has agency in
co-determining how the story unfolds
(Barab, Sadler, Heiselt, Hickey, &
Zuiker, 2007).
To echo the national call for inquiry-based math and science learning,
Quest Atlantis has been designed
to support children’s learning and
thinking through the use of scientific
inquiry. Its inquiry-based activities
begin with an interesting problem
that is grounded in real-world issues.
Students are involved in refining
questions, gathering data, evaluating
information, developing plausible
interpretations, and reflecting on
their findings. Similar to other multi-user virtual worlds, Quest Atlantis
is a globally distributed community
with more than 20,000 participants
from four continents (Barab, Arici, &
Jackson, 2005).
SciCentr
SciCentr, an outreach program of
Cornell University’s Cornell Theory
Center, is a 3D multi-user chat-enabled
online museum developed to engage
young people in science, technology,
engineering, and mathematics subjects.
As opposed to River City and Quest
Atlantis, which focus on the guided
inquiry method of learning, SciCentr
is based on constructivism in that it
promotes children’s exploration of
scientific topics of their own choice
and provides a virtual platform for
them to share their passion and
knowledge of a particular topic with
the science community.
Since 2001, SciFair, a portion of
the online SciCentr museum, has
involved more than 1,000 middle
school students and teachers annually.
Participants build their own virtual
knowledge spaces that combine science
exhibitions with game interactions.
SciFair was designed to target a wide
variety of settings, especially in terms
of cultural diversity, that include
underserved, rural, and minority
communities. For example, SciFair
has been successfully implemented
as a science communication program
with Native American students in
Washington and urban middle school
students in New York and Virginia
(Corbit, Bernstein, Kolodziej, &
McIntyre, 2006).
Geology Explorer and
Virtual Cell
Developed by North Dakota
State University’s World Wide Web
Instructional Committee, Geology
Explorer is a multi-user role-playing
virtual environment that provides
secondary and post-secondary students
the means and equipment to carry out
geologic investigation of a mythical
planet called “Planet Oit”. This planet
is described as similar to Earth but located on the opposite side of the Sun from Earth. In a role-based "learn by doing"
environment, students take on the role
of a geologist and learn fundamental
concepts of geology and inquiry
strategies used by geologists through
exploration, experimentation, and
guided collaboration.
Along with Geology Explorer,
Virtual Cell is a similar 3D MUVE
for learning fundamental concepts
of cell biology and strategies for
diagnostic problem-solving. Similar
to Geology Explorer, the pedagogical
approaches are to provide students
with authentic problem solving
experiences that include elements
of practical experimental design
and decision making, while learning
science content at the same time
(Slator & Beckwith, 2006). These two
programs have significantly facilitated
science students’ learning of abstract
concepts in geology and cell biology
via 3D visualization and modeling.
Promising Directions for
Science Education
As evidenced in these pioneering
3D MUVE programs for science
education, this emerging technology
holds great promise and opportunities
for improving science learning and
is potentially a viable solution to
the pressing issues facing science
education in schools.
1. Platform for Scientific Inquiry
The 3D MUVE provides a viable
platform to support the authentic
scientific inquiry process and help
learners acquire inquiry skills defined
by the National Science Education
Standards (1996). As in River City and
Quest Atlantis, the scientific inquiry
process and skills are seamlessly
embedded in the immersive probing
environments. In such environments,
students are first exposed to a
complex, authentic problem, such
as finding solutions to save Atlantis
from problems similar to those being
faced on Earth. In order to disentangle
the complex multi-causal problems,
students need to go through the process
of making observations, refining
questions, gathering data, evaluating
information, developing plausible
interpretations, and reflecting on their
findings—a set of inquiry activities
that are at the heart of science and
science education. Additionally, the
science content and skills specified
in the curriculum are embedded in
the inquiry activities, which provides
an opportunity for the assessment of
students’ mastery of these contents
and skills.
The unique technological
affordances of 3D MUVE offer
a variety of tools for conducting
scientific inquiry. One of 3D MUVE’s
salient features is its ability to
construct a virtual space that can not
only resemble but go beyond the real
world, and provide an experience that
is not accessible, possible, or practical
in reality. In Geology Explorer, for
example, students are able to access
and examine almost 100 different
rocks and minerals that are normally
not readily available, and use nearly
40 scientific instruments and geology
tools (e.g., “streak,” “scratch,” “hit,”
“view,” “taste,” and “touch,” etc). This
greatly enhances students’ inquiry
experiences by providing exploration
opportunities similar to those of a
real geologist. In addition, in most
3D MUVEs, students can teleport
instantly from one place to another,
“physically” (via avatar) visiting a
place thousands of miles away or
even on the other side of the globe
and meeting and chatting with people
and content experts from around the
world. These capabilities can create
a profound sense of motivation and
engagement conducive to a rich and
deep inquiry experience.
2. Gateway to Engaged Learning
Our schools are faced with the
challenge of engaging this generation
of students in formal learning in the
classroom. Studies over a span of
two decades reveal a consistently low
level of engagement in the classroom,
which has resulted in widely reported
boredom and an escalated high school
drop-out rate. One reason may be
related to the widening gap between
the tech-savvy students and the print-centric schools. Children today are
growing up in a rapidly evolving
digital media environment where using
cutting-edge gadgets has become an
integral and important part of their
growing and learning experience.
Despite children’s massive use of
digital technologies outside of the
classroom, schools still continue to
operate within a print-based cultural
logic.
Yet as revealed by studies on
the above-discussed programs, the
3D MUVE is a motivationally rich
gadget that deeply engages students
in an enjoyable and fervent game-like environment. Results from the
implementations of River City in
public school classrooms indicate that
students, both boys and girls, are highly
motivated by the 3D MUVE program,
with students reporting that they “felt
like a scientist for the first time”
(Clarke & Dede, 2004). Similarly,
students in SciCentr rate their learning
experience in 3D MUVE significantly
higher than in a traditional science
teaching environment, stating that they
have more fun and have learned more
(Norton, Corbit, & Ormaechea, 2008).
Moreover, SciCentr appears to have
the greatest impact on students who
begin with neutral or negative attitudes
toward science. This echoes the results
of River City, which showed a greater
impact on learning for low achieving
students in inner-city schools (Dede &
Ketelhut, 2003). These findings make
it obvious that, if well designed and
wisely used, the 3D MUVE would be
a viable platform to increase student
engagement, which is important to
students’ achievement and to their
social and cognitive development.
3. Bridge between Formal and
Informal Science Learning
As discussed previously, science
learning in a school setting is subject
to time constraints, in addition to
the added complexity of classroom
management introduced by technology. The 3D MUVE appears to be an
ideal supplemental tool that connects
formal science learning in class and
learning in an informal setting, such
as after-school programs, or leisure
time playing at home. A consistent
theme among the existing 3D MUVE
science programs is that they are being
implemented with great success in the
formal setting, as teaching aids or
supplemental activities in K-12 science classes.
Aside from the above-mentioned exemplary 3D MUVE science
programs, there have been few efforts
to leverage the energy, passion, and
engagement children show for the 3D
game world in their time outside of
school. Children’s enthusiasm with 3D
MUVE should be harnessed and linked
to the science content and inquiry
skills required in the curriculum.
Instead of grousing about and competing with reading and math for a share of the limited amount of time available
in school, science educators should
make the most of the 3D MUVE’s
abundant features and popularity, and
connect it with in-school activities.
By building a continuum between
classroom instruction and after-school
or at-home activities in 3D MUVE,
the passion and informal learning
that occur in the 3D game-playing
environment will transfer into the
classroom and significantly increase
student engagement in the formal
learning setting.
Conclusions
The fact that digital gaming is a multi-billion-dollar industry rivaling Hollywood's cultural influence shows that digital games are now a dominant play culture, and they are increasingly
affecting kids’ development and
informal learning outside school. It
is becoming ever more evident that
technologies make access to children’s
interests, passion, and preferred
learning styles quick and easy. To
harness the power of 3D MUVE
and leverage the passion and energy
children have for this medium, schools
need to consider seriously the role of
3D MUVE in science education. As
we have seen in the pioneering 3D
MUVE science programs designed
by forward-thinking science education
researchers, the 3D MUVE definitely
holds great potential and opportunities
for improving science learning
and points to a new, promising
direction for science education. It
is a viable platform for conducting
scientific inquiry, a gateway to an
engaging, socially distributed learning
environment, and a bridge to connect
and blend formal and informal science
learning. After more than a decade of science education reform efforts with marginal results, it may be
time to sit down and watch how our
children play and learn with the new
media, experience their enthusiasm
and creativity in the digital world, and
ask ourselves how we can make this
happen in the classroom.
References
Barab, S., Arici, A., & Jackson, C.
(2005). Eat your vegetables and do
your homework: A design-based
investigation of enjoyment and
meaning in learning. Educational
Technology, 45, 15-21.
Barab, S., Sadler, T., Heiselt, C., Hickey,
D., & Zuiker, S. (2007). Relating
narrative, inquiry, and inscriptions:
Supporting consequential play. Journal
of Science Education and Technology,
16, 59-82.
Cavanagh, S. (2007). U.S. students fall
short in math and science. Education
Week, December 4, 2007.
Clarke, J., & Dede, C. (2005). Making
learning meaningful: An exploratory
study of using multi-user environments
(Muves) in middle school science.
Paper presented at the annual meeting
of the American Educational Research
Association, Montreal.
Corbit, M., Bernstein, R., Kolodziej, S.,
& McIntyre, C. (2006). Student project
virtual worlds as windows on scientific
cultures in CTC SciFair. Available
online at <http://www.scicentr.org/cgibin/download.aspx?get=Resources/
CorbitEtAlPCST2006.pdf>.
Dede, C., & Ketelhut, D. (2003).
Motivation, usability, and learning
outcomes in a prototype museum-based
multi-user virtual environment. Paper
presented at the annual meeting of
the American Educational Research
Association, Chicago.
Dorph, R., Goldstein, D., Lee, S., Lepori,
K., Schneider, S., & Venkatesan, S.
(2007). The status of science education
in Bay Area elementary schools:
Research brief. Available online at
<http://www.lawrencehallofscience.
org/rea/bayareastudy/pdf/final_to_
print_research_brief.pdf>.
Harvard University. (2008). The River
City project. Available online at
<http://muve.gse.harvard.edu/
rivercityproject/contributors/
contributors.html>.
Leath, A.T. (2001). National survey
of student performance in science.
Available online at <http://www.aip.
org/fyi/2001/143.html>.
National Assessment of Educational
Progress. (2005). The nation’s report
card: Science. Available online at
< http://nationsreportcard.gov/science_
2005/>.
National Center for Education Statistics.
(2007). Highlights from PISA
2006: Performance of U.S. 15-year-old students in science and
mathematics literacy in an international
context. Available online at <http://
nces.ed.gov/pubsearch/pubsinfo.
asp?pubid=2008016>.
National Research Council. (1996).
National science education standards:
Observe, interact, change, learn.
Washington, DC: National Academy
Press.
National Science Teachers Association.
(2004). NSTA position statement:
Scientific inquiry. Available online at
<http://www.nsta.org/about/positions/
inquiry.aspx>.
Norton, C., Corbit, E., & Ormaechea, L.
(2008). A comparison of self-directed
learning in a virtual world environment
to traditional science teaching methods.
Paper presented at the annual meeting of
the National Association for Research
in Science Teaching, Baltimore.
Roth, K., & Garnier, H. (2007). What
science teaching looks like: An
international perspective. Educational
Leadership, 64, 16-23.
Slator, B., & Beckwith, R. (2006). Electric
worlds in the classroom: Teaching and
learning with role-based computer
games. New York: Teachers College
Press.
Smith, P.S., Banilower, E.R., McMahon,
K.C., & Weiss, I.R. (2002). The national
survey of science and mathematics
education: Trends from 1977 to 2000.
Available online at <http://2000survey.
horizon-research.com/reports/trends/
trends_report.pdf>.
Wallace, J., & Louden, W. (2002).
Dilemmas of science teaching:
Perspectives on problems of practice.
New York: Routledge.
Dr. Yufeng Qian is assistant professor in the
School of Leadership Studies at St. Thomas
University, where she teaches educational
research and instructional technology courses.
Author of a number of journal articles and book
chapters on emerging learning technologies,
she is also a member of the doctoral faculty in
the Ed.D. program in Educational Leadership.
Correspondence concerning this article can be
sent to <yqian@stu.edu>.
David Steer, David McConnell, Kyle Gray, Karen Kortz, Xin Liang
Analysis of Student Responses to
Peer-Instruction Conceptual Questions
Answered Using an Electronic Response
System: Trends by Gender and Ethnicity
This descriptive study investigated students’answers to geoscience conceptual
questions answered using electronic personal response systems. Answer
patterns were examined to evaluate the peer-instruction pedagogical
approach in a large general education classroom setting.
Over the past decade, it has become
apparent that effective learning occurs
in Science, Technology, Engineering
and Mathematics (STEM) classrooms
that use student-centered, active
approaches that allow interactive
exchange between and amongst
students and instructors (American
Geophysical Union, 1994; National
Research Council, 1997, 2000;
National Science Foundation, 1996).
Such exchanges are facilitated when
students use electronic personal
response systems to answer conceptual
multiple choice questions, called
conceptests by Mazur (1997) and
referred to as think-pair-share exercises
in some disciplines (McTighe &
Lyman, 1988). Conceptests are
repetitive measures designed to
explore student depth of understanding
(both individual and group), and they
often include answers with known
preconceptions. Students consider
the question and respond individually.
Crouch and Mazur (2001) suggest that
an initial correct response rate of 35%
- 70% is optimal for these questions.
Peer instruction is a practice in which
students work together in pairs and
small groups to discuss and defend
their responses (Mazur, 1997), and
this discussion may be followed by
a second round of student responses.
The use of conceptests is formative,
because they provide timely feedback
that the instructor and student can use
to improve their performance. Much
has been written about the ways in
which this technique can be used by
faculty (Cox & Junkin, 2002; Crouch
& Mazur, 2001; Green, 2003; Hake,
1998; Mazur, 1997; McConnell, Steer,
Owens & Knight, 2006; Pilzer, 2001;
Rao & DiCarlo, 2000; Sokoloff &
Thornton, 1997). The evidence is
also compelling that this technique
improves student learning from a
course perspective (Crouch & Mazur,
2001; King & Joshi, 2008; Lasry,
Mazur, & Watkins, 2008; Smith et al.,
2009) and that the technology is well
received by students (MacGeorge
et al., 2008a). Less is known about
the impact this technique has on
subpopulations of students based on
gender and race.
The conceptests used in this study
were taken from a large database of
questions for the geosciences that were
developed by more than 30 geoscience
faculty members with multiple years
of experience teaching introductory
courses in a variety of settings (e.g.
community college, small 4-year,
and public universities). Those
faculty members used their personal
experiences and a review of the
published literature to develop lists of
geoscience concepts that are difficult
for students to grasp and are discussed
in most typical introductory geoscience
courses for non-majors. Some of these
concepts include plate tectonics,
geologic time, the rock cycle, and
the water cycle. The conceptests were
generated according to good practices
for writing multiple choice questions
(Haladyna, Downing, & Rodriguez,
2002) by focusing on a single concept,
using simple language or graphics,
and including 3-4 short answers
that require few or no calculations.
The distracters (incorrect answers)
also include alternative conceptions,
misconceptions, or incorrect intuitive
responses. The conceptests probe
student understanding at various
cognitive levels, emphasizing the comprehension and application levels (“understanding” and “applying” in Anderson and Krathwohl [2001]) through the analysis and evaluation levels of cognitive processing (Bloom,
1956).
This study focuses explicitly
on conceptual questions at the
understanding, applying, and
analyzing cognitive levels (Anderson
& Krathwohl, 2001), because these are
the most appropriate levels to assess
using multiple-choice formats. The
questions are posed as text-, diagram-,
or graph-based problems, and they are
similar to questions on the summative
exams. At the understanding level,
students demonstrate they are able to
convert concepts learned as text to an
illustration or vice versa. Students are
also asked to compare and contrast
objects or concepts, select reasons,
compare solutions, or make predictions
(see Figure 1). At the applying level,
students apply rules or principles to
new situations, use known procedures
to solve problems, or demonstrate
that they know how to do something.
When working at the analyzing
level, students select answers that
explain how something works or distinguish fact from opinion. Questions that require students to scrutinize graphical data or images are interpreted as analysis questions, especially if the students have not previously seen the graph (see Figure 2).
Figure 1: Example of a diagram-based, understand-level conceptest related to the orographic lifting of air. [Diagram not reproduced; it shows wind, a mountain profile with a dashed line marking a lower elevation, and a location X.]
In the landscape pictured, how would the amount of rainfall change at location X if the mountain eroded down to the dashed line?
a) Rainfall would increase
b) Rainfall would decrease
c) Rainfall would stay the same
Figure 2: Example of a graph-based, analysis-level conceptest related to the rock cycle. [Graph not reproduced.]
The graph illustrates how the temperature changed with time for part of the rock cycle. Which of the following is best represented by the graph?
a) Sand is lithified to form sandstone
b) Limestone is metamorphosed to form marble
c) Marble is uplifted to Earth's surface
d) Magma cools to form granite
e) Shale is heated and converted to magma
Methods
The data used for this study represents 4712 responses to conceptests collected from 242 students enrolled in four earth science classes for non-science majors and one physical geology class at a community college. These classes were taught by three instructors, each with over five years of teaching experience using active learning strategies. In addition to incorporating conceptests using peer instruction (Mazur, 1997; McConnell, Steer, Owens, & Knight, 2006), classes were taught using a variety of learner-centered activities including the use of student-manipulated physical models (Gilbert & Ireton, 2002), lecture tutorials (Kortz, Smay, & Murray, 2008), and predictive demonstrations (Sokoloff & Thornton, 1997). Students earned participation points for responding to conceptests, regardless of whether the answers were correct or incorrect. Three classes occurred in spring 2008, and two classes occurred in fall 2008.
This study reports conceptest response trends for paired answers from students who answered from 10-26 questions each over the course of the semester. The questions are assumed to be valid for content since they were
developed by geoscience educators
and have been reviewed for content
validity by 12 experts across multiple
institutions. Reliability and validity
testing was completed for the questions
using responses collected in spring
2006 from a large-format, general
education introductory earth science
class (155 students). Responses
were analyzed for predictability,
construct validity and gender
reliability assuming a statistically
normal response distribution. Correct
response rates for the questions as a
whole were not gender biased (p>0.35,
n=55). Three individual questions
appeared to show bias even after
addition of response data for the same
questions from fall 2005. As a set, the
52 remaining conceptest questions
used in this study met predictive
validity requirements. The percentage
of students correctly responding to
comprehension-, application- and
analysis-level questions decreased
with increasing question cognitive
level (p<0.0001; 67%, 52% and 36%
respectively).
Student responses from conceptests
answered during lessons that used the
peer instruction technique were scored
using a rubric (Table 1). Those scores
were used to evaluate the efficacy of
this pedagogical technique for various
populations (male, female, Caucasian,
and minority). Students in selected
courses completed a 15 question,
Geoscience Concept Inventory (GCI)
test (Libarkin & Anderson, 2005) as an
independent assessment of geoscience
conceptual understanding. The GCI
is a valid and reliable assessment
designed to assist geoscience faculty
in evaluating teaching and learning
(Libarkin & Anderson, 2005). Its
purpose and design are similar to the
Force Concept Inventory (Hestenes,
Wells, & Swackhammer, 1992) that is widely used in physics education.
Table 1: Scoring rubric for student responses to conceptest questions. [Rubric not reproduced.] Note that, as averaged over all questions, 33% of students recorded incorrect responses after peer instruction, and the remainder recorded a correct answer on the second attempt.
Student engagement was determined
by dividing the number of student
answers to conceptests by the total
number of questions posed. For
example, a score of 70% on student
engagement was recorded by a student
who answered 70% of the conceptests
analyzed in the study. These scores
were a proxy indicator of attendance.
Average conceptest scores were
calculated by dividing the number
of correct answers by the number of
questions asked, and no deduction
was made for unanswered questions.
Individual student response rates for
each question category (Table 1) were
calculated by dividing the number of
responses in a category by the total
number of questions answered by
that student. Final course grades and
post-course GCI scores were also used
as summative assessments of student
success. Response data were grouped
by gender and ethnicity for analyses.
African American, Asian, Pacific
Islander, and Hispanic were combined
under the ‘minority’ classification.
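To make these ratio definitions concrete, the short sketch below (written in Python; it is not the authors' code, and the function name, category labels, and sample counts are all illustrative) computes engagement, the average conceptest score, and the category response rates for a single hypothetical student, under the assumption that the average is taken over the questions the student answered and that a response counts as correct when the final attempt is correct.

def student_metrics(answers_by_category, questions_posed):
    """Per-student ratios described above.
    answers_by_category: counts of the student's paired responses in each
    Table 1 category; questions_posed: total conceptests posed in the course."""
    answered = sum(answers_by_category.values())

    # Engagement: share of all posed conceptests the student answered (a proxy for attendance).
    engagement = answered / questions_posed

    # Average conceptest score: correct answers over questions answered, with no
    # deduction for unanswered questions (assumes "correct" means correct on the final attempt).
    correct = (answers_by_category.get("twice_correct", 0)
               + answers_by_category.get("incorrect_correct", 0))
    conceptest_average = correct / answered

    # Category response rates: each category's count over the questions the student answered.
    category_rates = {cat: n / answered for cat, n in answers_by_category.items()}
    return engagement, conceptest_average, category_rates

# Hypothetical student who answered 14 of 20 posed conceptests.
print(student_metrics({"twice_correct": 6, "incorrect_correct": 4,
                       "twice_incorrect": 3, "correct_incorrect": 1}, 20))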
All data fields were not available
for all students (due to student absence
during administration of the GCI,
missing self-reported data, failure
to complete the course, etc.). In all,
five variables (pre-GCI, post-GCI,
final grades, average proportion of
correct answers on conceptests, and
engagement) were analyzed for each
of the four populations (minority
male, minority female, Caucasian
male, Caucasian female). Pearson’s
correlation coefficients (δ) were
calculated for the 20x20 matrix
with values of 0.1-0.3 considered of
small significance, 0.3-0.5 moderate,
and 0.5-1.0 large. Comparisons
between larger populations (male-female, minority-Caucasian) were also
completed using ANOVA or statistical
T-tests using Cohen’s d values for
effect sizes, and values of p<0.05 were
considered significant.
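For readers unfamiliar with the two statistics named above, the following sketch (in Python, with invented numbers; it is not the authors' analysis script) shows how a Pearson correlation coefficient and a Cohen's d effect size with a pooled standard deviation are computed.

from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def cohens_d(group1, group2):
    """Effect size: difference of group means over the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    pooled_sd = (((n1 - 1) * stdev(group1) ** 2 + (n2 - 1) * stdev(group2) ** 2)
                 / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled_sd

# Invented illustration: engagement versus final grade, and a two-group comparison.
engagement = [0.55, 0.70, 0.80, 0.90, 0.65]
final_grade = [68, 74, 81, 88, 72]
print(round(pearson_r(engagement, final_grade), 2))   # 0.1-0.3 small, 0.3-0.5 moderate, 0.5-1.0 large
print(round(cohens_d([0.41, 0.45, 0.39], [0.33, 0.36, 0.31]), 2))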
Data
Data were sorted by both gender and
race (Figure 3) to show how student
responses were distributed in the four
paired-response categories (correct-incorrect, twice incorrect, incorrect-correct, twice correct; see Table 1). The
total response database included 6%
minority male (n = 282), 8% minority
female (n = 385), 52% Caucasian male
(n = 2451), and 34% Caucasian female
(n = 1594) responses.
Figure 3: Percent of student conceptest responses by demographic group. [Figure not reproduced.]
Correct-Incorrect: Overall,
approximately 5% of responses
showed students answered
conceptest questions correctly the
first time the question was posed and
incorrectly on the second attempt
(Table 1; Figure 3). There were not
enough responses in this answer
category for meaningful analyses
between population groups.
Twice-Incorrect: About 28% of
all responses were incorrect on both
attempts (Table 1; Figure 3). As a
percentage of their responses, male
minority students were most likely
to answer in this way (over 36% of
their responses, Figure 3). Female
students of both demographic
groups answered in this fashion
in approximately equal proportions
(32%), and Caucasian males answered
twice incorrect 26% of the time. Comparisons in twice incorrect response rates between minority populations and for Caucasian females compared to other populations showed small effect sizes (d = 0.0 to 0.3). There were moderate effect sizes when comparing male Caucasian response rates in the twice incorrect category to minority males and females (d = 0.5 and 0.6).
Twice Correct: The largest differences between populations were noted when analyzing the 41% of twice-correct responses (Table 1: score 4; Figure 3). Caucasian male students were most likely to answer correctly both times (45% of responses). Their female counterparts answered in this fashion about 39% of the time. Female minority students were least likely to answer twice correct (26% of responses), and minority males answered in this way 32% of the time. Effect sizes were small to moderate when comparing female Caucasian students to males (d = 0.3 for minority males; 0.4 for Caucasian males) and when comparing minority males and females (d = 0.4). Effect sizes were larger when analyzing Caucasian males to both minority populations (d = 0.6 for males; 1.3 for females) and when comparing female populations (d = 0.9).
Incorrect-Correct: Overall, 26% of the responses were incorrect on the first attempt, but correct after peer instruction (Table 1: score 3). At this level, minority females fared better than their minority male peers (35% versus 27% of responses) and slightly better than Caucasian students (26% for Caucasian females and 24% for males). Effect sizes were small when comparing minority males to both Caucasian populations (d = 0.0 for males; 0.2 for females). All other response rate comparisons in the incorrect-correct category displayed moderate effect sizes (d = 0.5 to 0.7).
Combined Responses: When average response rates for individual students by demographic group were compared to other course variables (pre- and post-GCI, final grades, conceptest average, and engagement), several trends appeared (Table 2).
Table 2: Pearson's correlation coefficients for studied populations. [Correlation matrix not reproduced. Note: Correlations for unrelated variables are removed from the table.]
Male minority students: Minority male conceptest averages were strongly correlated with post-course GCI scores (δ=0.9; Table 2: row D, column B) and moderately correlated with final grades (δ=0.5; Table 2: row D, column C). Pre-course GCI scores (Table 2: column A) were strongly correlated to final grades (Table 2: row C) and post GCI scores (Table 2: row B) for this population (δ=0.6 and 0.7). Engagement (Table 2: row E) displayed a moderately negative correlation with post-GCI scores (Table 2: column B) and moderately positive correlations to final grades and average conceptest scores (δ=0.4; Table 2: columns C and D).
Female minority students: Minority female average conceptest responses (Table 2, row I) displayed a strong negative correlation with pre-GCI scores (δ=-0.6; Table 2: column F). Final course grades (Table 2: row H)
were moderately correlated (δ=0.5)
to pre-GCI scores (Table 2: column
F) and engagement (Table 2: row J,
column H).
Male Caucasian students:
Male Caucasian students recorded
moderately correlated engagement
and final course grades (δ=0.5; Table
2: row O, column M). Pre- and post
course GCI scores (Table 2: row L,
column K) were also moderately
correlated to post GCI results (δ=0.4;
Table 2: column L), as were average
conceptest scores (Table 2: row N).
Female Caucasian students:
Female Caucasian student data showed
only one strong correlation (δ=0.6),
and that was between engagement
(Table 2: row T) and final grades
(Table 2: column R). All other within-group correlations were small or
insignificant.
Between Group Correlations:
Moderately significant correlations
were found when variables were
compared between population groups.
Male and female minority student pre-GCI scores were correlated (δ=0.5;
Table 2: row F, column A), and male
minority post-GCI scores (Table 2:
column B) were negatively correlated
with post-GCI scores of all other
demographic groups. Pre-GCI scores
for female minority students (Table 2:
column F) were correlated with both
Caucasian males and females (δ=0.4
and 0.5). Other correlations between
groups were either between variables
that had no practical relationship and
were not shown (e.g. minority male
pre-GCI scores and minority female
post-GCI scores) or were of little or
no significance.
Interpretation and
Discussion
Correct/Incorrect Responses: We
considered an initial correct response
followed by an incorrect answer choice
to be the least desirable response
sequence. The 5% of responses for
which students answered correctly
initially but changed to an incorrect
response following peer instruction
was similar to the 6% rate reported
by Crouch and Mazur (2001). These
data suggest that such responses should
be expected regardless of ethnicity or
gender (Figure 3). The 5% rate closely
matches a four-answer multiple choice
question occurrence probability of 6%
for random guessing on two identical
questions (probability increases to
about 10% for a three answer question).
Since students were awarded credit for
answering the questions (whether
correct or incorrect), it is possible that
some students were simply guessing or
answering randomly to fulfill course
requirements (King & Joshi, 2008). It
is also possible some of these responses
simply represent input error. Such
an error was possible, because the
electronic response software provided
signals when student responses were
received, but did not display individual
responses. However, ineffectual peer
instruction also can not be ruled out. If
guessing and input error accounted for
most correct-incorrect responses, those
answers provided little information
relevant to student assessment or
teaching. Additional studies are
necessary to determine if correct-incorrect responses are important
indicators of student learning when
using this technology.
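As a point of arithmetic, the guessing baseline quoted in the preceding paragraph can be reproduced as follows; the event assumed here (landing on one particular option on both of two independent random guesses) is our reading, chosen because it matches the quoted 6% and roughly 10% figures, not a calculation spelled out by the authors.

# Chance of selecting one particular option on both of two independent random
# guesses on a k-option multiple-choice question; this reading reproduces the
# 6% (four-option) and roughly 10% (three-option) figures quoted above.
def same_option_twice(k):
    return (1 / k) ** 2

print(round(same_option_twice(4), 4))  # 0.0625, about 6%
print(round(same_option_twice(3), 4))  # 0.1111, about 10%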
Correct/Correct Responses: A
twice-correct answer was considered a
positive result, because such a response
suggested students initially understood
the concepts and then validated
that understanding by answering
correctly a second time. The overall
twice-correct answer rates found here
closely matched the 40% rate reported
by Crouch and Mazur (2001) and
played a major role in understanding
similarities and differences between
populations. Since Caucasian male
and female students were more
likely to answer twice correct, their
other major answer categories had
proportionally fewer responses than
those of minority students (Figure
3). Such an observation supports the
contention that differences within
diverse populations can be more
important than differences between
populations (Harper, 2009). When
student data were sorted into two
groups (>40% and <40% of responses
twice correct), there was a strong
correlation between engagement and
final grade for both the high- and
low-performing groups (δ=0.6). Such
a finding was not surprising, because
engagement is a proxy for attendance,
which has been previously correlated
to course success (Newman-Ford,
Fitzgibbon, Lloyd & Thomas, 2008;
Scott, 2000).
Incorrect/Incorrect: As with the
correct-incorrect answer, a twice
incorrect response was considered
a negative outcome, because it
suggests the peer-learning technique
was not effective for the students
that answered in this fashion. The
28% overall response rate for twice-incorrect questions was higher than
was reported for physics (22% in
Crouch & Mazur, 2001). The finding
that over one quarter of all responses
were incorrect after peer discussion
was particularly troubling in light of
the fact that 40% of responses were
twice correct, because this suggests
that more correct responses result
from other learning than from peer
instruction. Students were randomly
organized into four-person learning
teams to encourage in-class discussion
during the peer instruction phase
of the class. The correct answer for
most of the conceptests was also the
most popular answer when students
were polled on the first attempt.
Armed with that information and
group discussion support, such a high
level of twice incorrect answers was
considered problematic. ANOVA and
correlation analyses showed that there
were indistinguishable differences
(p >0.05) and correlations (-0.2<=
δ <=0.2) between students who
frequently answered in this fashion
(>25% of registered twice incorrect
responses) as compared to those
who did so less often (< 25%) for
all analysis variables. If a significant
number of students in these classes
were not actually discussing answers,
there may have been little propensity
for students to change their answers.
Perhaps students simply failed to
change answers to questions if they did
not understand the concepts and dialog
was not effective enough to clarify
understanding. Additional research
that focuses on group interactions
during peer discussion is necessary to
determine the extent to which group
communication affects twice incorrect
response patterns.
Incorrect/Correct Responses:
The type of response sought when
using conceptests with peer instruction
was that of changing from an incorrect
to a correct response (Table 1: score
3). Approximately 26% of student
responses in this study were of this
type, which is lower than the 32%
reported by Crouch and Mazur (2001).
These data suggest that peer instruction
was nearly equally effective for all
populations, but perhaps slightly more
so for female minority students (who
were ~7% more likely than any other
demographic group to answer this
way). Since minority students were
more likely to miss these questions
on the first attempt than Caucasian
students, they were in a better position
to benefit from this approach.
Overall Responses: Combined
analyses of all the response data
suggested that there were similarities
and differences in the ways that diverse
populations respond when using this
technology and pedagogical approach.
When comparing males and females,
all meaningful variable correlations
were small or insignificant, which
supports the suggestion made by
King and Joshi (2008) that electronic
response systems did not significantly
hinder male or female student success
in engineering. Within the male
population, moderate correlations
between pre- versus post-GCI scores,
engagement versus final course grades,
and conceptest averages versus post-GCI scores were again identified.
Within the female population,
engagement was strongly correlated
to final grades (δ=0.6), and other
variables correlated poorly. When
the responses of all minority students
were compared to the responses of all
non-minority students, all meaningful
variable correlations related to
performance were insignificant
or small, which suggests that this
pedagogy provided an inclusive
approach to formative assessment.
Strong to medium correlations related
to GCI scores and engagement suggest
that prior knowledge and attendance
played the most important role in
minority students’ course success.
This finding supports the use of this
technology with these populations if
doing so encourages attendance, as
has been noted in previous studies
(MacGeorge et al., 2008b).
All populations could benefit
if twice incorrect responses were
minimized. This pedagogy relies on
the positive group synergies known
to be generated when learning with
peers in a low-stakes environment
(Mazur, 1997). Placing students in groups working toward a common goal, as is implicit in peer learning, provides a pseudo-organizational structure with social norms. Because
of this, organizational learning theory
(Argyris & Schön, 1996) may be
an appropriate lens through which
to view student response patterns.
Central to such learning is the ability
to detect errors (wrong answers)
and take appropriate action (select
correct answers) when responding
to future opportunities (questions).
This requires that members of the
learning team work effectively and
that the culture of the group be
conducive to constructive dialog
between all members of the team
(Bensimon, 2005). An environment
that is conducive to constructive
dialogue is one in which all students
are comfortable asking questions of
their group members when they are
not certain of the correct answer or
when they consistently answer twice
incorrect. The social dialog presumed
to occur during peer instruction
is known to result in successful
performance among minority students
(Quaye, Tambascia, & Talesh, 2009).
However, the twice incorrect data
presented here suggest that the optimal
type of dialog was not occurring as
often as desired for all populations.
Clearly, all student groups have high
and low performers. More detailed
observations of student discussions
are needed to better understand the
dynamics and implications of dialog
occurring in these groups and the
impact of those peer discussions on
response distributions.
This is the first study to examine
the contrasts in student performance
by both gender and ethnicity using
electronic response systems in large
classes. Given the ubiquity of this
technology on college campuses,
these data are available in electronic
archives for a wide range of classes.
We encourage others to analyze their
data to determine if the trends reported
here apply more widely.
Conclusions
The similarities and differences
in conceptest response patterns
found here illustrate how data from
electronic response systems can
be used to evaluate a pedagogical
technique such as peer-instruction.
The relatively small percentage of
correct-to-incorrect responses may
simply be a function of operator error
or lack of interest in the class activity.
As a percentage of all responses
within populations, males’ and
females’ answers show very similar
distributions, which implies that
the pedagogical technique is gender
neutral. Furthermore, the distribution
for answer changes from incorrect to
correct suggests that all demographic
groups benefit nearly equally from
peer discussions. Perhaps as expected,
students who answer conceptual
questions correctly the most often tend
to score highest in course grades, and
correct response rates are a moderate
function of prior knowledge and
attendance. However, the consistently
high rate of twice incorrect answers
for all groups, and particularly among
minority males, is cause for concern.
Better dialog within groups appears
to be necessary for diverse student
populations to benefit most effectively
from this intervention.
Acknowledgements
Partial support for this work was
provided by the National Science
Foundation’s Geoscience Education
and DUE CCLI programs under
Award Nos. 0506518 and 0716397.
The authors thank Julie Libarkin and
Suttida Rakkapao for their evaluation
of statistical techniques used in this
study. Any opinions, findings, and
conclusions or recommendations
expressed in this material are those
of the authors and do not necessarily
reflect the views of the National
Science Foundation.
References
American Geophysical Union. (1994).
Scrutiny of Undergraduate Geoscience
Education: Is the Viability of the
Geosciences in Jeopardy? Washington,
DC: American Geophysical Union.
Anderson, L.W., & Krathwohl (Eds.).
(2001). A taxonomy for learning,
teaching, and assessing: A revision
of Bloom’s taxonomy of educational
objectives. New York: Longman.
Argyris, C. & Schön, D.A. (1996).
Organizational Learning II: Theory,
method and practice. Reading, MA:
Addison-Wesley.
Bensimon, E.M. (2005). Closing the
achievement gap in higher education:
An organizational learning perspective.
In A.J. Kezar (Ed.), Organizational
Learning in Higher Education. New
Directions for Higher Education: No.
131, 99-111. San Francisco: Jossey-Bass.
Bloom, B.S. (Ed.) (1956). Taxonomy
of educational objectives: The
classification of educational goals:
Handbook I, cognitive domain. New
York ; Toronto: Longmans, Green.
Cox, A.J., & Junkin, W.F., III. (2002).
Enhanced student learning in the
introductory physics laboratory.
Physics Education, 37 (1), 37-44.
Crouch, C.H., & Mazur, E. (2001). Peer
Instruction: Ten years of experience and
results. American Journal of Physics,
69, 970-977.
Gilbert, S.W. & Ireton S.W. (2002).
Understanding models in Earth and
space science. 129. Arlington, VA:
NSTA Press.
Green, P.J. (2003). Peer instruction for
Astronomy. 178. New York, NY:
Prentice Hall.
Hake, R.R. (1998). Interactive-engagement
versus traditional methods: A six-thousand-student survey of mechanics
test data for introductory physics
courses. American Journal of Physics,
66, 64-74.
Haladyna, T.M., Downing, S.M. &
Rodriguez, M.C. (2002). A review
of multiple-choice item-writing
guidelines for classroom assessment.
Applied Measurement in Education,
15(2), 309-334.
Harper, S.R. (2009). Institutional
seriousness concerning black male
student engagement: Necessary
conditions and collaborative
partnerships. In S.R. Harper & S.J
Quaye (Ed.s), Student engagement
in higher education: Theoretical
perspectives and practical approaches
for diverse populations: (137-156).
New York, NY: Routledge Taylor and
Francis Group.
Hestenes, D., Wells, M. & Swackhammer,
G., (1992). Force concept inventory.
The Physics Teacher, 30, 141-158.
King, D.B. & Joshi, S. (2008). Gender
differences in the use and effectiveness
of personal response devices. Journal
of Science Education and Technology,
17(6), 544-552.
Kortz, K.M., Smay, J.J. & Murray,
D.P. (2008). Increasing learning in
introductory geoscience courses
using lecture tutorials. Journal of
Geoscience Education, 56(3), 280-290.
Lasry, N., Mazur, E. & Watkins, J. (2008).
Peer instruction: From Harvard to the
two-year college. American Journal
of Physics, 76(11), 1066-1069.
Libarkin, J.C. & Anderson, S.W. (2005).
Assessment of learning in entry-level
geoscience courses: Results from
the Geoscience Concept Inventory.
Journal of Geoscience Education,
53(4), 394-401.
MacGeorge, E.L., Homan, S.R., Dunning,
J.B., Elmore, D., Bodie, G.D., Evans,
E., Khichadia, S., & Lichti, S.M.
(2008a). The influence of learning
characteristics on evaluation of
audience response technology. Journal
of Computing in Higher Education,
19(2), 25- 46.
MacGeorge, E.L., Homan, S.R., Dunning,
J.B., Elmore, D., Bodie, G.D., Evans,
E., Khichadia, S., Lichti, S.M., Feng,
B. & Geddes, B. (2008b). Student
evaluation of audience response
technology in large lecture classes.
Educational Technology Research and
Development, 56, 125-145.
Mazur, E. (1997). Peer instruction: A user’s
manual. 253. Upper Saddle River, NJ:
Prentice Hall.
McConnell, D.A., Steer, D.N., Owens,
K.D. & Knight, C.C. (2006). How
students think: Implications for
learning in introductory geoscience
courses. Journal of Geoscience
Education, 53(4), 462-470.
McTighe, J. & Lyman, F. (1988). Cueing
thinking in the classroom: The promise
of theory-embedded tools. Education
Leadership, 45(7), 18-24.
National Research Council. (1997).
Science teaching reconsidered. 88.
Washington, D.C.: National Academy
Press.
National Research Council. (2000). How
people learn: Brain, mind, experience
and school. Washington, D.C.: National
Academy Press.
National Science Foundation. (1996).
Shaping the future: New expectations
for undergraduate education in
science, mathematics, engineering,
and technology. 76. Arlington, VA:
National Science Foundation.
Newman-Ford, L., Fitzgibbon, K., Lloyd,
S. & Thomas, S. (2008). A large-scale
investigation into the relationship
between attendance and attainment: A
study using an innovative, electronic
attendance monitoring system. Studies
in Higher Education, 33(6), 699-717.
Pilzer, S. (2001). Peer instruction in
physics and mathematics. Primus,
11(2), 185-192.
Quaye, S.J., Tambascia, T.P. & Talesh,
R.A. (2009). Engaging racial/ethnic
minority students in predominately
white classroom environments. In S.R.
Harper & S.J Quaye (Ed.s), Student
engagement in higher education:
Theoretical perspectives and practical
approaches for diverse populations,
157-178. New York, NY: Routledge
Taylor and Francis Group.
Rao, S.P., & DiCarlo, S.E. (2000). Peer
instruction improves performance
on quizzes. Advances in Physiology
Education, 24(1), 51-55.
Scott, V. (2000). The significance of
attendance in large college earth
science classes. GSA Abstracts with
Program, 32(7), 491.
Smith, M.K., Wood, W.B., Adams, W.K,
Weiman, C., Knight, J.K., Guild, T.T.
& Su, T.T. (2009). Why peer discussion
improves student performance on in-class concept questions. Science, 323,
122-123.
Sokoloff, D.R., & Thornton, R.K.
(1997). Using interactive lecture
demonstrations to create an active
learning environment. The Physics
Teacher, 35, 340-347.
David Steer is associate professor, Department
of Geology and Environmental Science, The
University of Akron, Akron, OH 44325-4101.
Correspondence concerning this article can be
sent to <steer@uakron.edu>.
David McConnell is professor, Department
of Marine, Earth and Atmospheric Sciences,
North Carolina State University, Raleigh, NC
27695-8208
Kyle Gray is assistant professor, Department
of Earth Science, University of Northern Iowa,
Cedar Rapids, IA 50614-0018
Karen Kortz is associate professor, Department
of Physics, Community College of Rhode
Island, Lincoln, RI 02865.
Xin Liang is associate professor, Department
of Education Foundation & Leadership, The
University of Akron, Akron, OH 44325-4208.
Larry Krumenaker
No Child Left Behind and
High School Astronomy
This article examines the impact of the No Child Left Behind Act on the high
school astronomy course.
Astronomy was a required subject
in the first American secondary level
schools, the academies of the 18th
century. When these were supplanted
a century later by public high schools,
astronomy still was often required,
subsumed into courses of Natural
Philosophy. Reasons given at that time
to support astronomy as a part of general
education included "training of minds,"
“mental discipline,” and the practical
aspects of geography, commerce,
navigation and the refinement of a
civilized person (Bishop, 1977).
The "Committee of Ten" changed this situation in 1892 by revising college admission standards so that the study of astronomy was no longer considered favorable. By 1930, only 0.06% of
all students in the whole country would
take an astronomy class (Bishop,
1980). The launch of Sputnik I in
1957 created a brief renaissance of
astronomy education, but eventually
enrollment slipped back down to 1%
in the 1980s, which was when the last
significant nationwide examination
of high school astronomy was done
through Philip Sadler’s 1986 survey
(Sadler, 1992).
After Sadler, an era of budget
cutbacks and increases in high stakes
standardized testing began, and this
became a dominating influence in
2001 with No Child Left Behind
(NCLB) and its emphasis on reading
and mathematics. Today astronomy is
taken by about 4% of all high school
students (Krumenaker, 2008). Despite
the meager growth that this enrollment
represents, it remains important to
re-examine the subject of high school
astronomy as well as the effects that
NCLB has had on the availability and
quality of these courses.
The results indicate that
high school astronomy
courses are far more
affected by NCLB indirectly
than directly.
This mixed-methods study looked
at fully independent, self-contained astronomy courses available to students
in grades 9-12. Therefore, courses,
such as physics or earth science, that
contain some astronomy units were
not considered in this study. The data
came from high school astronomy
teachers via a survey available to them
on a Webpage and as a Word file. (See
Appendix A for an overview of the
research procedures.) The study mirrored but greatly enlarged the scope
of the Sadler study. Quantitative and
categorical questions included diverse
topics such as instructors' backgrounds, planetarium and telescope
availability, financial support, course
content, student demographics, school
AYP status, and other items. Also
included were open-ended survey
questions, such as requests for recommendations about ways to go about
starting a course, and these responses
were coded and treated with qualitative or quasi-quantitative analyses. A
detailed quantitative summary can be
read in Krumenaker (2009a).
With an initial estimate of between
2500 and 3000 possible teachers
derived from a national listing called
the National Registry of Teachers
supplied by the National Science
Teachers Association, our initial
survey sample of about 600 teachers
represents 20-25% of the astronomy
teacher population. In order for a
sample of a small population to be
considered reliable, Tuckman (1999)
claims that it must consist of at least
10% of the target population, and these data exceed that standard.
The survey had an overall response
rate of about 40%, and out of this initial
sample 237 surveys were deemed
usable.
Additionally, in the Fall of 2007
the same questionnaire was sent in
a second survey by postal mail, and
this generated numerically half as
many responses. All of these results
are essentially identical to those of the larger,
first survey, which strengthens the
conclusions and removes concerns
relating to possible selection effects
arising from the methods of solicitation
and response (Krumenaker, 2009b).
Schools’ AYP Status
and Sizes
This article concerns itself with
the part of the study that deals with
the perceived effects that the No
Child Left Behind Act may have had
on high school astronomy courses.
One key parameter to investigate is
the AYP status of each school. AYP
stands for Adequate Yearly Progress
and is a measure of compliance with
NCLB that relies mostly on high stakes
testing scores. Filtering the results
to include only currently employed
public school teachers yielded 114
public schools with a Pass grade, 30
with a grade of Needs Improvement,
and 5 with a Failing AYP, or a rate of
77% Pass, 20% Needs Improvement,
and 3% Fail. The NCES (2007) values
for 2005-2006 indicate the comparable
national percentages were 60% Pass,
14% Needs Improvement, and 26%
Fail. This shows that high schools
with astronomy are more likely to
be schools that Pass AYP. Needs
Improvement percentages for schools
with astronomy are also higher than the
norm. The percentage of schools with
astronomy that Failed are substantially
lower than the national value.
This supports, though does not
prove, studies that say that electives
like astronomy are eliminated when a
school does not pass AYP. Speculatively
speaking, this also supports the
contention that schools that pass AYP
have the luxury of offering an elective
like astronomy. But one must now ask:
have the teachers, therefore, not felt
any effects from NCLB?
Have Astronomy Courses
Been Affected by NCLB?
Among the open-ended questions in
the survey was “What, if any, positive
or negative effects have you felt in the
astronomy course from the No Child
Left Behind Act? (And, why do you
feel this way?)”.
Analysis of this (and other similar
questions on the survey) was done
by coding the responses in a manner
akin to grounded theory techniques
developed by Strauss and Corbin
(1997). Each sentence in an answer,
regardless of grammar or size, was
given a code word or phrase indicative,
in this case, of the purported effect
of NCLB. The sentences were sorted
alphabetically by code word, then
grouped under larger headings;
these might include ‘administration’,
‘justifications’, ‘support’, and so on.
No presupposed groupings were used;
each grouping would appear when
a ‘critical mass’ of similar answers
was gathered. As general themes
became evident, larger groups could
be split into smaller ones, and smaller
ones could be combined into larger
groups.
In addition, simple descriptive
statistics were performed on most
qualitative questions by counting and
comparing the sizes and numbers of
groups or themes.
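To make this counting step concrete, a minimal sketch is shown below; the code words and the headings they are grouped under are illustrative stand-ins, not the study's actual codebook:

    from collections import Counter

    # Hypothetical coded sentences: (respondent id, code word) pairs.
    coded_sentences = [
        (1, "fewer sections"), (2, "remediation pull-out"), (3, "no state test"),
        (4, "more literacy work"), (5, "fewer sections"), (6, "no state test"),
    ]

    # Illustrative grouping of code words under larger headings; in practice the
    # groupings emerge only once a 'critical mass' of similar answers accumulates.
    heading_of = {
        "fewer sections": "enrollment",
        "remediation pull-out": "administration",
        "no state test": "justifications",
        "more literacy work": "support",
    }

    theme_counts = Counter(heading_of[code] for _, code in coded_sentences)
    for theme, size in theme_counts.most_common():
        print(theme, size)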
Out of the 237-teacher pool, 30
belong to private schools where NCLB
has no effect or standing. Others did not
respond at all to this question or gave
answers not related to their course.
Of the remaining 139 responses, 83
teachers (60%) claimed that NCLB
had no effect on their course. Forty-six
made statements that can be construed
as negative effects. Only 10 teachers
gave responses that could be construed
as positive. Of those that claim some
effect from the Act, the resulting
balance is clearly negative, 33%,
versus 7% positive (Figure 1).
Figure 1: Teachers Reporting Effects of No Child Left Behind on Astronomy Courses (pie chart: None 60%, Negative 33%, Positive 7%).
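The split shown in Figure 1 can be recomputed from the counts above (83 reporting no effect, 46 negative, and 10 positive, out of 139 responses):

    # Reported counts among the 139 responses to the NCLB-effect question.
    counts = {"None": 83, "Negative": 46, "Positive": 10}
    total = sum(counts.values())   # 139
    for effect, n in counts.items():
        print(f"{effect}: {100 * n / total:.0f}%")   # None 60%, Negative 33%, Positive 7%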
Why do so many astronomy
teachers find themselves apparently
immune from NCLB? One of the
two most direct answers coming from
survey results is that NCLB itself and
the “high stakes testing” that is NCLB-inspired but directly controlled by
state departments of education only
apply to Math and Language Arts,
not science, as of the time of this
survey. The other common answer
is that some states, as exemplified by Texas’ TEKS (Texas Essential Knowledge and Skills), have few or no high school astronomy standards at all. Therefore the courses are not tested and, consequently, they are ignored.
In the detailed discussion that follows, additional evidence about these perceptions is presented through representative quotations. In them, a “Pass,” “Passing,” “Fail,” or “Failing” comment indicates the school’s AYP status. Numbers alone, such as “1.5K” or “1500,” refer to the number of students in the school. Where this information is not listed, either the status is unknown or it is not considered relevant to the discussion. Also, quotes are left intact as typed into the surveys by the respondents, including misspellings and grammatical errors.
Negative Effects
Negative effects due to NCLB,
or related state high stakes testing
or curriculum changes caused by
NCLB pressure, manifest themselves
in six areas: enrollment numbers,
course cancellations, redeployment
of teachers and certification issues, a
change in the makeup of the courses’
student bodies, loss of status as a
science course, and loss of funding.
Numbers
Teachers report a decline in
enrollment due to a strong and
increasing emphasis on biology,
chemistry, and physics. As these
courses become more state-tested, and
therefore more required of students,
fewer students have time available
for electives, and student scheduling
abilities become more limited. Other
studies such as Hunt (2006) indicate
the same problem for other science
electives. The October 2007 NSTA
Reports found that “investment
in these programs (environmental
education) came to a screeching
halt …” (NewsBits, 2007). Survey
responses show similar concerns.
NCLBA will cause course to be
canceled after this year. School
will concentrate on Biology
which is the only state science
test in Arizona. —Self-described
pessimist, soon-to-retire Arizona
teacher in a 0.6K student Passing
high school whose class is open
to all grade levels.
Cancellations
In addition to drops in enrollment,
sometimes the courses themselves are
dropped, or are expected to be dropped,
for other reasons.
However, our administration has
told us that IF our API scores
drop in the future or we do not
meet the benchmarks that have
been set by the State, we will
have to remediate these students
someway. That will cause the
teachers of elective courses
(including science electives) to
become overseers of remedial
courses. Regular class enrollment
will drop and courses will be
eliminated as we have to add
remedial sections. —Teacher in
a 2.2K, Passing Oklahoma high
school.
Student academic levels
Teachers reported substantial
increases in the numbers of students
of lesser academic abilities in their
classes. Comments indicated that
more special education students are
believed to be placed in astronomy
classes, despite the fact that often there
are prerequisites of passing grades in
math, especially algebra, and other
sciences that these students do not
meet. There were more comments on the effects of inclusion than on any other individual negative code.
… the emphasis of inclusion
has resulted less in includiung a
few special education students
into regular education classes
and more in classes becoming
special education. … the math
prerequisites for the astronomy
course are ignored for special
education students. With time,
more and more students enroll in
the course without the necessary
math background thereby
requiring drastic alterations to
the curriculum. For example,
students are not proficent with
measuring angles and solving
one variable algebra problems.
—Teacher in a 1.2K student
Passing Pennsylvania high school
with a planetarium.
Teaching qualifications
Another effect seen by surveyed
teachers, and the only effect directly
caused by NCLB, is change in, or
elimination of, teaching assignments,
particularly because of the ‘highly
qualified’ specification. Some teachers
wrote that they have had to make
choices in what they can, or will,
teach. Sometimes the change has been
forced upon teachers. This particular
certification issue was further and
vividly exemplified in an email from a
responding teacher that came just after
the formal end of the survey.
Well I thought I would update
you to a new road block to having
astronomy in our classrooms. One
of the provisions of No Child Left
Behind ( No Teacher Left Standing)
is that a teacher must be “Highly
Qualified” in every subject they
teach. In most states including
mine, that means you have to take
a test to prove you are qualified.
Having a degree no matter what
your GPA doesn’t count. If you
haven’t taken such a test you have
to go through all sorts of “hoops”
to earn enough points to prove you
know your subject.
Since there is no Astronomy
test then the process is overly
complicated for any teacher to
attempt starting out a new program.
I my case I have both a BA and
Master of Education. Although I
am considered highly qualified
in Biology, a course I have never
taught, I am not in Astronomy since
it isn’t recognized on any state
list. I have taught astronomy for
27 years. Awarded [a prestigious
award from a renown society but
name removed to keep letter writer
anonymous] … for teaching high
school astronomy but can not get the
state of [omitted] to acknowledge I
am highly qualified.
Loss of status
Teachers report that astronomy is
being ‘left behind’ other sciences.
Students are required to take
Biology and two other Science
electives. NCLB does not empasize
the importance of taking any Earth
and Space courses. Earth/Space
seems to take a ‘back seat’ to
Chemistry and Physics. —Teacher
in a 1.3K Minnesota Passing high
school.
Secondary effects
Secondary negative consequences are also mentioned. Teachers claim they cannot call in as many outside resources.
We [astronomy club members]
have seen a drop off in the number
of request for the club to come out
to schools and put on star parties.
Teachers are commenting that they
are so under presure to meet NCLB
mandated standardized tests that
they don’t have time to cover much
astronomy. —A private school
teacher in Hawaii who also is in
an astronomy club.
Teachers can’t go to a workshop
if it doesn’t fit NCLB. Can’t
make a workshop, can’t write to
state standards, must be federal.
Attendance is down. —Former
small school Maine teacher who
gives workshops.
Further evidence of reduced professional development opportunities comes from Pennypacker (2008), who coordinates a global version of the Hands-On Universe (HOU) teacher training program. His chart of the number of teachers who have taken the HOU training program over the years shows a rising trend line that is abruptly curtailed at the 2001 mark and then descends in 2004, just after the War in Iraq began (Figure 2). We cannot claim a clear cause and effect here, but the HOU graph clearly parallels the effects reported by the surveyed teachers.
It is also reported that there are
fewer opportunities for collaborations
and, consequently, a purported stifling
of teacher creativity.
Loss of funds
Financial resources are also diminished, according to some reports.
the courses have been de-emphasied by the administration because it is not testable material and uses resources better spent on improving test scores. —Self-described pessimistic former teacher from a small 400-student, Passing Wisconsin high school.
So many financial resources are directed to remediation of these that materials funding has been cut past the bone. I get about one dollar per student for the year. —An Alaska teacher at a 2K student Passing high school.
Figure 2: The number of teachers taking the HOU workshops, 1994-2007, from Pennypacker (2008), used with permission. (Line chart of US HOU teachers trained yearly and the cumulative total trained, with some attrition, over Years 1-13, approximately 1994 = Year 1; vertical axis 0-1400 teachers.)
Positive Effects
Fewer positive effects are reported than negative effects. Two of them are at odds with some previously mentioned negative effects. One positive effect is that courses are actually experiencing increased enrollments.
Since No Child Left Behind
analyzes our failure rates, it has
caused an increase in the astronomy
enrollment due to students trying
to make up lost science credits.
—Teacher in a 2K, high minority,
Needs Improvement, planetarium
equipped New Mexico school.
The other at-odds positive effect is
the paradoxical increase in the amount
of time spent on astronomy, but not
in astronomy courses. Instead, the
astronomy courses themselves are
eliminated, but more astronomy is
taught in geoscience courses, so the net
effect is that more students, at a lower
non-capstone level, are taught more
astronomy than prior to NCLB.
Positive effects, besides increasing
enrollment at some schools, include
having more literacy work and
math work included in courses that,
presumably, had previously been more
conceptual.
I firmly believe in the intent of
No Child Left Behind. Reading
and Writing in the context of
Astronomy improves the students
abilities in all courses. I approach
the math component usign the
Read/Analyze/Compute/Evaluate
(R.E.A.D.) method. The honors
Geometry classes have visited
my astronomy classes to see first
hand how the fundamentals of
mathematics came into being.
Holding the students to a high
level is essential to improve their
attitudes about learning and gives
them confidence. The students will
be doing several major term papers
each semester. There is a rich history
behind the science that helps to
students see the interconnections
between science in general and
their other core classes. —Teacher
in a 1.8K student, high minority,
Needs Improvement school in New
Mexico.
Why ‘No Child’ Has No Effect
The results indicate that high school
astronomy courses are far more affected
by NCLB indirectly than directly.
Enrollments often drop not because of a shift in student interest, but because students are increasingly channeled into the main three sciences (shades of the Committee of Ten effect) and into state-mandated and state-tested courses, leaving fewer students, and less schedule time, available for an astronomy course. As a result, fewer sections are offered, and this can lead to outright elimination of the course. Because NCLB does not mandate that Earth/Space Science classes be tested, funding for these courses is reduced, which in turn makes teachers unable to bring in outside resources or to obtain professional development related to astronomy. Consequently, astronomy teachers report a loss of status compared to teachers of other sciences.
Because the existence of state standards pertaining to a specific content area often corresponds to state-mandated testing of student performance, another paradoxical
situation arises. The existence or
lack of astronomy standards affects
administrative perspectives about
whether the course is worthy of being
offered. Astronomy course standards
may not be specified by the state, and,
therefore, the courses are ignored by
administrations that must be more
concerned with achieving acceptable
pass rates in math, language arts, and
state-tested sciences like biology or
physical science. However, in other
schools located in states that lack
astronomy standards, that situation
results in the termination of the
course:
They cancelled my course because
it wasn’t tested! —Self-described
pessimistic teacher at a large 2.5K-student Passing Texas school, with
a portable planetarium.
Yet, in still another paradoxical
situation, this untested specification
can cause an increase in enrollment,
because this science is perceived
(incorrectly at times) to be an easier
path for students who have difficulty
with the tested sciences. It also means
more students are placed into these
courses without sufficient academic
background, which adds to the
perception that these courses are less
academically challenging. Astronomy
course standards created using NCLB-“approved” standards (whatever they
may be) can be helpful to the course’s
survival. This tactic has had some
positive effects, such as increasing
math and literacy-based work.
However, the presence or absence
of standards does not uniformly predict
the existence of astronomy courses
throughout the country. Because the
overall survey indicated that large
schools are more likely to have
astronomy than small schools, size of
the school may be a significant factor
with regard to astronomy course
availability, usually in the form of a
capstone class, whether or not there are
established standards (Krumenaker,
2009a).
When all of the reasons given for
the lack of direct effect on the course
by NCLB are examined, it is found
that, currently, astronomy hangs on
primarily as a capstone course that is
offered only to seniors or upper division
students who have essentially passed
all NCLB-created hurdles, such as
graduation requirements and mandated
end-of-course tests. Specifically, 75%
of astronomy courses are offered
exclusively to upperclassmen. These
factors may be putting astronomy
beyond the target range of NCLB.
Should science become a factor in
AYP, this relationship is likely to
change, and the indirect effects would
be overshadowed by direct ones.
Defending the Course
Even though quite a few teachers
appear to have avoided being directly
negatively impacted by NCLB, there
are documented cases included in this
study of courses being cancelled or
curtailed. In other cases, the course
has been threatened, but teachers
have been able to defend the course
successfully.
To find out the tactics and rationale
that teachers have used (or suggest
should be used) to defend the existence
of the course, the following open-ended question was asked: “If you
should have to defend or justify the
course at some future date, what
arguments would you use? Why?”
The question generated 428
responses, where ‘response’ indicates
a particular defense mechanism. Six
primary themes are listed in Table 1;
percentages do not add up to 100 due
to rounding.
Table 1: Themes Teachers Use to Defend the Course
Theme                                        Number of Responses   Percentage
Defending with the nature of the course             137                32
Defending with student interest                       88                21
Defending with cultural linkages                      78                19
Helps improve students, school, AYP                   54                13
Defend with traits of science                         24                 6
Institutional benefits                                22                 5
Other                                                 25                 6
By far, the largest theme is “Defending with the nature of the course.” The most common reason given is that astronomy, by nature, is interdisciplinary in that it involves math, other sciences, logic, history, and more. “An integrated course” is given as all or part of a response in a full one-third of all the Nature of the Course responses.
The second largest defense theme is
“Defending with student interest.” The
most common sub-theme is “students
are interested in astronomy, often
more than for any other science, so
we should teach it,” and it was given
by 53% of those teachers providing
responses that fell within the scope
of this theme of defense.
Closely following is the theme of
“Defending the course with its cultural
linkages.” This defense mechanism
utilizes historical, sociological, and
philosophical arguments and intangible
connections that astronomy has with
human thoughts and societies. A
common sub-theme is that astronomy
teaches students about their place in
the universe and about the wonder
of it all. The historical argument
that astronomy is the first science
or the foundational science is listed
frequently. More tangible linkages
include ways that astronomy is part of
everyday life, for example, as cultural
myths, the origin of the calendar, and
so on.
The next largest theme is
“Astronomy helps improve students,
school, and AYP measures.” This
uses the defense that astronomy helps
schools meet state standards, helps
students pass state end-of-course
and school graduation tests, and
provides alternatives for students
who have difficulties passing the
biology, chemistry, and physics course
sequence.
The last two themes are less common and roughly equal in frequency. “Defending the course
with traits of the science” utilizes the
arguments that astronomy is physically
and mentally more accessible to
students and that astronomy is less
static than other sciences. “The
Institutional Defense” promotes the
idea that astronomy courses help
the school’s image and economics.
Finally, there are responses that do not
fit any of the stated defenses, including
a few unique defense strategies that
will not be discussed here.
In order that other astronomy
instructors may find these useful, a
discussion of each defensive argument
follows.
Nature of the Course
The most common theme mentioned
in the survey that is useful to defend
the course from external threats is
the argument that astronomy is the
most integrated, interdisciplinary,
multidisciplinary science course that
can be offered. Astronomy involves
mathematics, literacy and language,
and other sciences such as chemistry,
physics, various life sciences, and
geosciences. The argument is given
that astronomy is the only capstone
course that inherently incorporates
all the listed subjects, or at least that
it can, if the curriculum is designed
to do so. Because a capstone course
culminates a sequence, it can also
reinforce prior learning.
Requires mastery of all disciplines
and integrates these like no other
course can. My students learn more
history than in some history classes.
They use trig to rediscover Kepler’s
laws as well as analyze many
articles about current research.
—Teacher in a minority, Needs
Improvement, 1.7K student public
high school in Georgia.
Astronomy is truly a multidisciplinary course in which the
different sciences may be blended,
but also one in which students may
see direct application of other course
content as well. For example, math is
obviously required, but government
policy/legislation with respect to
aerospace expendatures, aerospace
spinoffs that help solve Earth-bound
problems, ELA communication of
important findings and discoveries
to the general public, understanding
the environment by working to
create closed ecosystems for
colonization, etc., etc., etc. Beyond
all this, it is a wonderful venue for
teaching problem-solving skills
because space exploration is still
in its infancy. —First year teacher
of astronomy in a large 2.8K public
high school in Texas.
Astronomy at the high school level
should now integrate many other
areas of science and mathematics.
We can now do comparitive
geologies, meteorologies, and
possibly some day comparitive
biologies to better understand our
Earth’s systems. —Teacher in
a 500-student Wisconsin public
school.
Knowledge about the universe
has changed rapidly over recent
years. Consequently, the content and
textbooks used in early science classes
are likely to no longer be current, and
the high school course may be the last
chance that the education system has
to correct misconceptions gained in
elementary and middle schools.
Astronomy courses can incorporate
many science and inquiry skills.
Depending on the curriculum design,
these courses can be taught from a
very descriptive, low-math perspective
or one that incorporates higher-level
physics and math.
Astronomy has no academic level
restriction.
The course was taught in an inner
city school with students that had
low math skills and generally were
not science kids (not also enrolled
in courses like AP chem or AP
Physics), yet this course got them
excited and enthusiastic about
science. Kids joined the astronomy
club and were INTERESTED! This
is/was very uncommon for the
school, and definitely encouraged
many minority and minority female
students to take a science class
and join a science club. —Former
teacher from a high minority,
Connecticut, 1.2K-student public
high school.
Astronomy courses can be directly
and concretely beneficial. For example,
a Washington-state teacher made
arrangements for transferable college
credit. She wrote “they can get 5
University of Washington credits for
taking the course (at a price of $293)
through the UW in the High School
Program.”
Student Interests
Astronomy courses have interest and
appeal among students. Representative
comments include:
• “Many students are interested
in Astronomy and it therefore
is an excellent medium for
teaching fundamental science
principals (i.e., science inquiry,
nature of science, etc.”
• “Students enjoy the course; it is
sometimes the only advanced
science course some students
take;”
Because of this attraction, some surveyed teachers found that students who normally resist science voluntarily take their class; others found that astronomy changed students’ attitudes by drawing them into the field of science and even caused some to become scientists. Student interest in
astronomy also benefits the teacher by
increasing or maintaining enrollments
or keeping the course on the schedule.
Additionally, the success in college of
prior astronomy students is considered
another top defense argument.
Cultural Linkages
Astronomy is a part of every
culture, not just Western. There are
sky legends from other cultures, the
calendar, the way things are named for
celestial objects, and more everyday
connections to students’ lives and
backgrounds.
Astronomy also has direct relevancy
to modern society. While the subject
matter, unlike chemistry, physics,
or earth science, may be physically
distant from the students, it is still
relevant to their everyday lives.
As one of the oldest sciences,
astronomy has influenced our
lives through use of calendars,
vocabulary, and the scientific
thought process. Most recently, the
‘demotin’ of Pluto to a dwarf planet
has engendered much discussion
about how science changes as
improved technology brings new
information to us. —Teacher in
a 1K student public school in
Massachusetts.
Helping with AYP Issues
With science possibly becoming
a factor in determining AYP status,
teachers have noted schools seem to
be seeking more options.
Astronomy as an elective provides
an interesting and exciting 4th year
of science. Students will opt out of
science if it isn’t something they are
interested in. —Teacher at an AYP
Passing tiny 150-student Arizona
public high school.
An appropriately designed
astronomy course will meet a variety
of states’ standards and national ones
as well.
Honors Astronomy involves all of
the important skills that virtually
all state and national teaching
standards emphasize: critical
thinking, application of math
and computer skills, project-based learning, development of
presentation skills. —Grades 10-12
astronomy course teacher at a high
minority, 1.8K student Passing
California public school.
We use a variety of technology
(telescopes, CCD imagers,
computers) and software (Hands-On Universe, Adobe Photoshop,
TheSky, Starry Night Pro) to aid
the state mandate to make sure
all students are technologically
literate. —Teacher at a Failing
school in West Virginia.
It can even substitute for some of
the regular science courses; it does
so in at least two states: New Mexico
and Wyoming.
Administrators should find
astronomy helps raise test scores in
science. We note that no state reported
having astronomy end-of-course tests,
but many astronomy concepts do
appear in other tests, including some
national ones.
Kentucky’s Core Content has a
subsection based on astronomy.
According to KSTA, the lowest
scores in the state deal with the
universe’s formation. Since our
state’s test is one the engines that
drives this train here at [deleted
school name] this fact will always
make a good case for my astronomy
class. —Teacher at a 1.4K student
Passing public high school.
Furthermore, AYP status depends on language arts, and astronomy can play a role there, as when “the students are required to produce research papers and other analytic essays,” as exemplified by a teacher in a small Pennsylvania school.
Institutional Benefits
Good public relations is always a
positive reason to have a course.
[When the Oregon Department of
Education said schools ranked an
“F” for astronomy in the state,]
Our Superintendent immediately
told the press/ public about our
thriving Astronomy courses and
his commitment to continue to
teach this relevant and stimulating
course. —Teacher at an Oregon
public high school.
The existence of an astronomy course can be attention-getting to school-shopping parents.
As a selling point to prospective
students/parents. Few other schools
are doing astro. —A Georgia 400-student private school teacher.
In a strictly economic sense, a
very common response from one
particular group of teachers—those
with planetariums—is that such an
expensive resource should not be
wasted.
The Science Itself
Many teachers believe that
astronomy is more accessible to
student minds than other sciences.
Also its one of the few courses
that you can learn something that
day and use that knowlege that
night. —Teacher at a 1000-student,
Minnesota public high school.
Astronomy is the rare science in
which amateurs do make significant,
valid, and valuable contributions,
and this can be a real jumpstart to a
college career. Students can actually
contribute original research—some
have discovered new asteroids, for
example—to astronomy, and this
allows high school students to feel an
ownership of the material.
This course gives students an
opportunity to contribute to the
school and astronomy research.
Many of my students are nonathletes who really love astronomy.
They are involved in several reseach
programs through NASA and
get their observations publicized
frequently. They have the same
pride in contributing to astronomy
as athletes do in sports. —Teacher
at a small, 400-student Kansas
public high school, with a portable
planetarium and an observatory.
Conclusions
Most teachers of astronomy in
American high schools claim not to
have been directly affected by the
No Child Left Behind Act but do
say they have suffered indirectly
and negatively, notably by effects of
the Passing or Failing of math and
language arts high stakes testing and
an emphasis on moving more students
into biology, chemistry, and physics
courses which have testable standards.
The indirect effects include drops in
course enrollment, number of courses
offered, cancellation of courses,
and redeployment of teachers. Losses of funds, status, collaboration, and professional development are also reported. The only major direct
effect appears to be that of meeting
the ‘highly qualified’ status, which
is difficult to achieve because no
state offers teaching certification in
astronomy. A few other teachers have
allowed NCLB to positively, directly
affect their classes by incorporating
more math and literacy exercises
than before. One of the most common
reasons astronomy courses are able
to avoid deleterious effects is the
frequency with which they are offered
as capstones for seniors who have
already completed the courses that
are directly examined for AYP status.
Additionally, in many states (but not
all and not always), a lack of state
standards means a lack of oversight
for the course. However, sometimes
that lack of standards means a course
is not considered important enough to
keep on the schedule, and sometimes
astronomy enrollment increases only
because the course is made into an
alternative source of science credits for
students who have difficulty passing
the mandated courses.
Astronomy, if it exists, is usually
in an AYP Passing school or Needs
Improvement school, and schools
offering astronomy are often larger
than average in student body size.
Furthermore, schools with astronomy
generally have higher Pass and Needs
Improvement status rates than the
nation as a whole.
References
Bishop, J. (1977). U.S. astronomy
education: past, present and future.
Science Education, 61, 3, 295-305.
Bishop, J. (1980). Astronomy education in
the U.S.: out from under a black cloud.
Griffith Observer, 44, 3, 2.
Hunt, J. (2006). Impact of the failure
to make adequate yearly progress
on school improvement and staff
development efforts. Downloaded
June 30, 2007 from <http://cnx.org/
content/m14097/1.1/>.
IPS (International Planetarian Society).
(2005). The IPS Directory [Data file].
Smith, D. (Ed.)
Krumenaker, L. (2008). Unpublished
doctoral dissertation.
Krumenaker, L. (2009a). The Modern
U.S. High School Astronomy Course,
Its Status and Makeup, in the Era of
No Child Left Behind. Astronomy
Education Review, 8,1, (December).
Krumenaker, L. (2009b). The Modern
U.S. High School Astronomy Course,
Its Status and Makeup II: Additional
Findings. Astronomy Education
Review, 8, 1, (December).
NCES (National Center for Education
Statistics). (2007). Table 1.6. State
assignment of school ratings, percent
of schools not making adequate yearly
progress, and percent of schools
identified as in need of improvement,
by state: 2005-2006. Downloaded
June 30, 2007 from <http://nces.
ed.gov/programs/statereform/saa_
tab6.asp?referrer=tables>.
NewsBits. (2007). NSTA Reports, 19,
2, p. 17.
Pennypacker, C. (2008, January 10).
Global hands-on universe. Paper
presented at the Winter 2008 American
Astronomical Society meeting.
Sadler, P. (1992). In Pennypacker, C. (Ed.).
High school astronomy: characteristics
and student learning. Proceedings of
the workshop on hands-on astronomy
for education. (pp. 52-62). Singapore:
World Scientific Publishing.
Strauss, A. and Corbin, J. (Eds.) (1997).
Grounded theory in practice. Thousand
Oaks, CA: Sage Publications.
Tuckman, B. (1999). Conducting
educational research, 5th Ed. Belmont,
CA: Wadsworth.
Larry Krumenaker is a long-time astronomy
and science educator, and recent Ph.D. from
the University of Georgia. He is editor-in-chief of the new publication The Classroom
Astronomer. As a science journalist he
has written astronomy, science, education
and technology stories for numerous trade
magazines and newspapers in the United
States, Germany and elsewhere. He currently
lives, works and teaches both students and
teachers in Atlanta, GA. Correspondence
concerning this article can be sent to <larryk@
toteachthestars.net>
Appendix A:
Survey Procedure
The courses’ teachers were gathered
from announcements in such venues
as the email mailing lists/discussion
groups or print newsletters of
astronomy and science educational
associations, including the National
Science Teachers Association (NSTA),
the American Association of Physics
Teachers (AAPT), the Astronomical
Society of the Pacific (ASP), and
the American Astronomical Society
(AAS), as well as state and regional
association discussion groups for
physics, earth science, and general
science teachers. Also used were other
discussion mailing lists that have
interested astronomy teachers, such
as Dome-L for planetarium teachers,
the 200,000-subscriber newsletter for
the “Starry Night” software program,
and the newsletter for StarLab portable
planetarium operators. Several state
science coordinators and educators
who work with astronomy teachers
passed along our invitation in their
own media, including their own
discussion or ‘news broadcast’ lists.
The “science brokers” at NASA, who worked with teachers and maintained contact lists, were of enormous help
(sadly, after our study, the “science
brokers” program was terminated).
Additionally, educational personnel
associated with NASA-operated
missions, such as Cassini, broadcast
our appeal for qualified survey
respondents to the larger community,
as did national observatories and
other programs, including SETI and
NRAO.
The teachers gathered by these
means were labeled our ‘hot’ group,
because they volunteered to take
part. A voluntary response group
is not necessarily the best research
design, because there may be other
factors at play, such as extreme views or over-willingness to respond, that may not be representative of the whole population.
To counteract the non-probabilistic
‘hot’ group, a more randomly selected
sample, which is labeled the ‘cold’
group, was created. These teachers
were invited through our direct
email solicitation. Their names and contact information were obtained primarily from lists given to us by personnel at astronomy-related conferences, publishers, and some state departments of education, or by private individuals who volunteered names.
Names were also amassed through
searches on the Internet, which yielded
lists of planetariums and high school
astronomy clubs obtained from the
Sky and Telescope magazine website,
the International Planetarian Society
Directory (IPS, 2005), and several
American regional planetarium
groups. Finally, snowball sampling—
having responders recommend other
people to survey—was also used.
The spring 2007 survey started
with over 600 names, evenly split
between ‘hot’ and ‘cold’ groups. The
237 usable responses included seventy
from the ‘cold’ group, which resulted
in a response rate of about 24%. The
‘hot’ group responded at a 60% rate.
The second, printed postal survey took
place in the Fall of 2007. Participants
were acquired via a 2176-piece postal
mailing using primarily a mailing list
from the National Registry of Teachers
plus about 600 names and addresses
acquired in the spring without
email addresses. Eighty-five percent of respondents returned the survey via a prepaid, pre-addressed envelope; the remainder answered using a Web questionnaire as in the first survey. The number of responses was about half that of the spring survey, and the response rate was proportionally much smaller.
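As a rough consistency check on these figures, the arithmetic can be laid out as below; the exact fall response count is not reported, so half of the spring total is assumed here purely for illustration:

    # Approximate response-rate comparison of the two surveys.
    spring_invited, spring_responses = 600, 237   # spring 2007 figures from the text
    fall_invited = 2176                           # fall 2007 postal mailing
    fall_responses = spring_responses // 2        # assumed: "numerically half" of spring

    print(f"Spring 2007 response rate: {spring_responses / spring_invited:.0%}")  # ~40%
    print(f"Fall 2007 response rate:   {fall_responses / fall_invited:.0%}")      # ~5%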
Tara M. Owens
Improving Science Achievement
Through Changes in Education Policy
The author reviews current science education policies in the United States
and offers perspectives about ways that these policies can be changed to
improve student science achievement.
Concerns over science education in
the United States continue to grow due
to the increasing global demands and
competitiveness for careers in science
and technology. In addition, current
education policy will be scrutinized
more rigorously as the Obama
administration begins to implement
their vision of public education, which
includes recruiting new teachers and
rewarding effective teachers. The
effectiveness of science teachers
is often measured by the success
of the students. In order to ensure
student success in science, research
about how students learn science
and how teachers should be teaching
science must be taken into account
by policy makers. Accomplishing
the goal of improving student science
achievement in the United States is
necessary in order to increase overall
science literacy amongst the U.S.
population and ensure preparedness
for the growing science and technology
demands of the 21st century.
The current education policy in the
United States is strongly influenced
by the No Child Left Behind Act of
2001. One of the primary goals of
No Child Left Behind (NCLB) is
stronger accountability for results
(U.S. Department of Education, 2004),
and, consequently, schools are now
being held responsible for the quality
of education they provide to students.
In order to ensure accountability and
higher performance of students, NCLB
required states to implement a method
of assessing student knowledge of
the core content areas. Although not
mandatory, most states have opted to
use a multiple-choice, standardized
test, because this type of assessment
is most cost effective to administer
and score (Wenning, Herdman, Smith,
McMahon, & Washington, 2003).
This type of large-scale, high-stakes
testing is now taking place in all fifty
states and is administered to all high
school students between the tenth
and twelfth grades. The assessments
are termed “high-stakes” because the
results are used to determine which
students will graduate from high
school. Students must pass the test by
the end of their senior year in order to
receive a diploma.
These statewide test results have also
become the basis for holding schools
accountable and forming funding
and policy decisions. States use data
from the test results to determine
if schools and districts are meeting
their established achievement goals.
If schools and districts fail to meet
these goals, they can face sanctions that
include reduced funding, mandatory
reallocation of funds, and vast
overhauls of curriculum (Wenning,
et al., 2003).
While NCLB was passed by the Bush administration under a Republican-led Congress, new controversies over the policy are emerging under the new Obama administration and a Democratic-led Congress. Current education policy in the U.S. and the effectiveness of NCLB are hot topics for debate among politicians and the general public. NCLB has
significantly influenced state policy,
and this, in turn, has affected what is
being taught in the classroom. NCLB
calls on states to implement a more
rigorous science curriculum that is
more closely aligned with national and
state standards for science education.
The goal is to prepare students for
success beyond high school (U.S.
Department of Education, 2004).
As states attempt to make their
science curriculum more rigorous
in compliance with NCLB, state
and national science standards call
for students to learn a vast amount
of scientific information, and this
knowledge is assessed during the
statewide tests. As a consequence,
teachers have been forced to alter their
methods of instruction to conform
to the assessment. Teaching to the
test has become more commonplace
as pressures mount on teachers to
ensure they cover everything that
their students need to know in order
to succeed on the state test. The pace
at which content is covered has been
accelerated to an extent that permits
only superficial coverage of topics with
little regard to student comprehension
or depth of knowledge. Effective
teaching strategies are giving way to
quicker, fact-based instruction due
to reductions in the amount of time
allotted for each topic to be covered.
While national and state standards
call for inquiry-based science
instruction, teachers are finding it
increasingly difficult to meet this
expectation and still expose students to
all of the content they need to know to
succeed on the state test. Furthermore,
short-term assessments are geared
more towards preparing students with
questions that are similar in structure
to those on the statewide test. All of
these factors combine to demonstrate a
clear discrepancy between the national
and state expectations of quality
science instruction and what is actually
happening in science classrooms
across the United States.
Some may argue that perhaps
students are not really underachieving
in science, and it is actually the method
of assessment that is inherently flawed.
Most states use standardized, mostly
multiple-choice tests to assess student
proficiency in science. It could be
argued that these types of tests do not
accurately assess student knowledge
about science because they are not
aligned with the ways in which
students are being taught. While these
arguments are compelling, the validity
of the NCLB mandated statewide
tests and national assessments of
student proficiency in science is
an issue that extends beyond the
scope of this article. However, even
under the constraints of NCLB and
our current methods of assessment,
efforts can be made to improve overall
understanding of science, which
should translate into improvements
in science achievement.
How do we measure
achievement?
While NCLB calls for accountability
for results, results are not easily
identifiable under our current system.
Individual states have the flexibility to develop their own science curriculum and assessment and to set their own performance standards for proficiency in science (Wenning et al., 2003).
Proficiency in science is defined as
a threshold of performance on the
state test, but with each state having
a different curriculum, a different
test, and a different set of criteria to
measure proficiency, comparisons
of proficiency from state to state are
meaningless. For this reason, the
national assessment results are used
to discuss science achievement among
U.S. students. At the national level, the
National Assessment of Educational
Progress (NAEP) defines proficiency
in science to be a raw score of 178 out
of a possible 300 points on the national
assessment. The NAEP includes not only multiple-choice and constructed-response questions but also assesses students as they engage in actual science investigations (Loomis & Bourque, 2001). Nationally, only eighteen percent of twelfth graders performed at or above the proficient level on the 2005 NAEP science test, a figure that is unchanged from the 2000 results and represents a decrease in performance from 1996 (Grigg, Lauko, & Brockway, 2006).
Another method for measuring
student science achievement is to
compare U.S. students with students
from other countries around the world.
The Third International Mathematics
and Science Study (TIMSS) is an
assessment of both science and
mathematics achievement in fourth
and eighth grade students from
various countries. The study has been
conducted four times since 1995, with
the most recent assessment occurring
in 2007. Results from TIMSS showed
that U.S. students performed at the
same level or below students of other
developed nations (Stigler & Hiebert,
1999). The situation has not improved
in recent years. Results from the 2007 TIMSS show that the United States falls behind 9 other countries in science achievement among 4th graders and ranks 11th in 8th grade science achievement (Martin et al., 2008). Countries outperforming U.S.
students in science are primarily Asian
nations, including Singapore, China,
and Japan. Furthermore, these results
reflect no measurable improvement
in U.S. student science achievement
since 1995 and illustrate that there is
a decline in U.S. student performance
in science between the fourth and
eighth grades in comparison with
other countries (Martin, Mullis, &
Foy, 2008).
There are several possibilities as to
why students are not demonstrating
improvements in science achievement.
One theory is that our system of
education ignores the research about
how students learn science. A second
theory is that teachers are aware of the
research but, for various reasons, are
unable to implement these practices in
their classrooms. This may be related
to another theory, which purports
that our current system of education,
including standards, curriculum,
and education policy as a whole, is
not conducive to effective science
instruction. Whatever the reason, the
results of the national and international
assessments of student achievement
in science demonstrate the need for a
re-evaluation of the ways that students
learn science and ways that we should
be teaching science content in order to
increase student achievement.
How Do Students Learn
Science?
Students learn science in many
different ways. Some students are able
to learn from reading about science
concepts while other students are
auditory learners. Other students may
learn better when given opportunities
to move and manipulate objects, or
see concepts represented visually.
Students come to the classroom with
different skills, ways of thinking, and
learning styles. For this reason, there is
not one set way that all students learn
science. However, current research
into science learning has identified
several widely accepted ways in
which students come to understand
science.
Inquiry
One of the most important things
to consider when examining how
students learn science is that students
learn science by doing science. This
means that students learn when they
engage in the process of science. The
process of science involves prediction,
observation, collecting evidence, using
evidence to develop explanations, and
repeating investigations and revising
explanations. Science learning through
doing has been termed “inquiry” and
has been recognized as important
for student learning of science since
the 1960s (Gallagher, 2007). Inquiry
is also emphasized by the National
Science Education Standards (2003).
According to the Standards, learning
science is an active process, and
students should be participants in
the learning process rather than passive
recipients of knowledge. While
engaging in the scientific process,
students are required to use critical
thinking to come up with explanations
that aid in the development of
student understanding of science.
Overwhelmingly, educational research
supports the idea that engaging in
inquiry is one of the most important
components to learning science.
Peer-to-peer interactions
Students also learn science when
they discuss their ideas with their
peers. Since communication is a main component of the scientific process, and research has shown that engaging in science as a process aids in learning science, it stands to reason that students must also engage in communication as part of that process. When students work
in groups to formulate explanations
and reach a consensus, they are doing
what scientists do. In addition, peer-to-peer communications can help clear
up misunderstandings. Since students
relate to one another on a more equal
level, peers can explain complex
ideas to one another in a way that
may have more meaning (Moreno &
Tharp, 2006).
Incorporation of prior knowledge
and connection of ideas
Students also learn science when
they are able to make meaningful
connections. When students are able
to connect new information with
something they already know, the
new knowledge becomes much more
meaningful and easier to incorporate
into their current knowledge
framework. Students come to the
classroom with prior knowledge about
how the world works. This knowledge
is formed by students’ experiences
in the world. Based on their prior
experiences, students come to the
classroom with their own, although
sometimes faulty, explanations for
scientific phenomena. A student’s
prior knowledge can be deeply
ingrained and very difficult to change.
Students can only learn science when
their prior knowledge is considered
and integrated into the learning of
new concepts. If presented with
observations or data that is consistent
with their prior ideas, the students’
current knowledge framework is
reinforced. However, if presented with
new experiences that are contrary to
their prior ideas, students will have
to either explain the new information
within their current framework, or
alter their knowledge framework
to incorporate the new information
(Moreno & Tharp, 2006). Students
also learn science when they apply
new knowledge to new situations,
develop their own explanations for
science phenomena, and reflect on
their own learning.
Knowing how students come
to learn and understand science
is important. However, for this
knowledge to be useful, it needs to be
applied to classroom instruction. In
other words, the way science is taught
in the classroom needs to be reflective
of the ways in which students learn
in order for the instruction to be truly
effective.
How should science be
taught?
Teachers are undeniably crucial
to student learning. Teachers set the
tone for the learning environment. Teachers who have a positive,
enthusiastic attitude towards science
are more successful in helping their
students learn (Moreno & Tharp,
2006). Creating an open, student-centered learning environment that
encourages curiosity and exploration
is more conducive to learning science
than the traditional teacher-centered
approach to instruction.
In addition to creating an
environment suitable for learning
science, accomplished science
teachers use a variety of instructional
approaches to guide learners toward
knowledge about science. There is
no cookie-cutter strategy of teaching
that reaches all students all of the
time. Therefore, it is important to use
a variety of instructional strategies to
address the unique needs and interests
of individual students. Utilization
of a variety of teaching techniques
provides students with the most
opportunities to learn and refine their
conceptual framework.
Introducing science content in
a way that engages students is one
key strategy that helps students
learn. Relating science content to
students’ real life experiences can
be very effective in motivating
learning. This can be accomplished
by developing analogies between
new science ideas and concepts with
which students are more familiar as
a result of their own experiences.
Student interest can also be piqued
by posing intriguing problems and
challenging students to come up with
solutions to the problem. In addition,
using open-ended questioning rather
than soliciting simple one-word, right
or wrong answers requires students to
use higher level thinking strategies
instead of simple, rote memorization,
and this leads to a deeper conceptual
understanding (Moreno & Tharp,
2006).
Using guided inquiry as a method
of instruction has been shown to be
an effective teaching strategy. In
this teaching method, the teacher
establishes guidelines for a scientific
investigation. Guidance can be very
direct, such as posing the problem
to be solved in the investigation, or
very limited, such as simply helping
students select a topic of appropriate
scope for an investigation. As students
gain the skills necessary to do scientific
investigations, the teacher’s role can
become increasingly more limited. In
the process of guided inquiry, students
improve their problem-solving skills
and their abilities to use evidence to
formulate explanations. In addition,
the inquiry process provides students
with the opportunity to work together
and share ideas with one another, all
of which leads to greater conceptual
knowledge about science concepts
(Moreno & Tharp, 2006).
In scientific inquiry, students
need to be given opportunities to
engage in discourse with one another.
Because science is a social endeavor,
it involves consensus building,
peer review, and communication
in many forms. Verbal and written
discourse is crucial to developing
scientific knowledge. Students should
be given opportunities to work in
groups to conduct investigations,
evaluate evidence, and formulate
explanations. Having students develop
their own explanations of scientific
events helps them to integrate new
knowledge with existing knowledge
and make connections between science
concepts.
Implications for Science
Education and Policy
Teaching
The most immediate application of
science education research can occur
at the classroom level. In Teaching
Science in the 21st Century, Bybee
states that “… how much students
learn is directly influenced by how they
are taught” (2006, p. 25). Therefore,
if teachers implement effective
teaching strategies that correlate with
the ways in which students learn,
performance on assessments should
naturally improve, because students
will have a deeper understanding of the
fundamentals of science (Gallagher,
2007). Although teachers can adjust
their teaching methods to promote
student understanding of science,
there are limitations to how much
they can do under the constraints of
the educational system in which they
teach, including the curriculum and the
amount of content they are required
to cover.
Teacher education
In order to deliver the best science education, teachers need to be trained to provide excellent instruction to students. Both pre-service and in-service teacher training should be
geared towards development of
science content knowledge and
effective teaching strategies.
However, as Banilower, Heck, and
Weiss (2007) point out, particularly
in grades K-8, science education
tends to be a low priority. This is
partially due to the emphasis placed
on mathematics and reading because
of high-stakes testing in those two
subject areas in the elementary
grades. This is problematic, because
NCLB holds schools accountable for
student achievement in science during
the high school years. Therefore,
the foundations of a solid science
education must be established earlier
in the student’s career, and the value
of science education at the elementary
level must be reinforced.
In addition, teachers need to
be given more opportunities to
observe modeling of effective science
teaching techniques so that they can
be implemented in the classroom.
This can be accomplished during
undergraduate teacher education or
through professional development
programs.
Increasing pedagogical content
knowledge will help science teachers
of all grade levels refine and enhance
their teaching methods, which will
lead to more effective instruction and,
ultimately, result in greater science
literacy among students. To provide
the most effective science instruction,
teachers need to be educated about how
students learn so that they can adjust
their teaching strategies to achieve
greater student understanding.
Standards and curriculum reform
More significant advancements
in science achievement can be made
through fundamental changes to our
current science standards and science
curriculum. National and state science
curricula too often place greater
importance on quantity of knowledge
than the quality of knowledge.
Students are encouraged to memorize
and learn scientific facts rather than
explore and engage in science as a
process (National Research Council,
2007).
In accordance with research-based understanding about how
students learn and how science should
be taught, the scope of the science
standards needs to be reduced to
allow for more in depth treatment of
core scientific concepts. In addition,
emphasis needs to be placed on
connections between core concepts
and the building of science knowledge
over all grade levels. In order to provide
the time needed to explore science and
acquire essential knowledge and skills,
the sheer amount of material that
today’s science curriculum includes
must be significantly reduced. The
focus needs to be on the big ideas of
science rather than the minute details of
every concept in science. To improve
science achievement in the U.S., the
curriculum should focus more on the
progression of learning and making
connections between concepts and
less on covering a wide range of
individual topics. Learning should
follow a logical, coherent progression
(National Research Council, 2007).
Teaching should be designed to
assist students in understanding these
core concepts and the relationships
between them. Students should have
the opportunity to experience a variety
of learning activities and develop
meaningful science understanding.
However, engaging in a variety of
application and problem-solving
experiences takes time. Furthermore,
the teaching materials and resources
used in the classroom are also limited,
which further supports the idea of
limiting the number of topics covered
and, instead, focusing on depth of
coverage.
In formulating science education
policies and curriculum, much can
be gained by looking towards the
practices of countries that are having
greater success in their science
education programs. Our current
science curriculum focuses too
heavily on breadth of content and not
enough on depth, development, and
the connections between concepts in
science. The Third International Mathematics and Science Study (TIMSS) found that U.S. students were outperformed in science by students in many other participating countries (Stigler & Hiebert, 1997).
Valverde and Schmidt (1997) analyzed
the results of TIMSS by comparing
the science curriculum in the U.S.
with that of the 10 highest-achieving
countries in science and found
profound differences between them.
U.S. science curricula tend to focus
on broad coverage of science topics
and shallow depth, and connections
between concepts are given little
attention (National Research Council,
2007).
Current research and examples
of effective science instruction from
high-achieving countries should
be used to shape U.S. education
policy and curriculum. As Vitale
and Romance point out in their
analysis of TIMSS, “… the curricula
of high-achieving countries was
characterized as focused around big
ideas, conceptually coherent, and
carefully articulated across grade
levels. In contrast, the curricula in
low-achieving countries (including the
U.S.) emphasized superficial, highly fragmented coverage of a wide range of
topics with little conceptual emphasis
or depth” (2006, p. 336).
However, changes such as these must begin in the earlier grades. U.S. fourth graders' science achievement is on par with that of students in high-achieving countries, but as students progress through the U.S. education system, the discrepancy between U.S. students and their international counterparts becomes more glaring. One explanation for
this is that up until the fourth grade,
expectations for student learning are
similar between the United States and
other countries. However, as students
in the U.S. progress through the upper
grades, more time is spent on repeating
previously learned concepts instead
of providing in-depth coverage of
new concepts. As a result, the list
of topics that need to be covered at
subsequent grade levels continues to
grow (Valverde & Schmidt, 1997).
Interestingly, as a part of NCLB,
programs of instruction and teaching
practices are supposed to be aligned
with research about effective
instruction (Mundry, 2006). While
NCLB calls for the use of research
in making decisions about science
education, it does not provide any
recommendations about ways that this
research can be practically applied in
the classroom. One way to incorporate
current research into teaching science
is to revise and restructure curricula.
Because science education research is
ongoing, so is the pursuit of a more
effective curriculum. This process
has to begin at the national level with
reform of our education policies.
In addition, while our current
system of education allows each state
to individually develop standards,
curriculum, and assessments, countries
like Japan that have high-achievement
in science have a national science
curriculum (Stigler & Hiebert, 1999).
A more nationalized approach to
science education would eliminate
the inconsistencies in expectations
and execution of science education
throughout the nation. With this
approach, new research about student
learning or effective teaching practices
could be implemented on a much
broader scale, since the curriculum
would be consistent throughout the
entire country. A national proficiency
test for scientific literacy that is aligned
with revised national standards should
be developed and used in place of
the statewide tests. A national test
that could be administered in schools
nationwide would give educators
and policy makers better data with
which to make comparisons between
states or regions of the country and
make it easier to identify areas which
need improvement. A nationalized
approach to science education would
also help to alleviate some of the
problems that arise when students
move into different school districts,
because the expectations would be the
same regardless of the school being
attended.
Implications for development of
a national science curriculum are
wide-reaching. The Benchmarks for Science Literacy is a valuable resource for developing a national science curriculum, because it describes the levels of understanding and abilities expected at each grade level. The focus of Benchmarks is on science literacy, which is considered to be a broad base of scientific knowledge and understanding rather than detailed factual knowledge about specific science disciplines (American Association for the Advancement of Science, 1993). For this reason,
Benchmarks offers good guidelines
for the creation of effective national
science curriculum programs that
address the interconnectedness of
knowledge and ways to build upon
that knowledge across grade levels.
New curriculum programs should reduce the amount of content and emphasize core concepts and the connections between them across grade levels and disciplines.
Connections between concepts need to
be identified and explicitly outlined and
mapped in curriculum programs.
Conclusion
Educators in the United States must look for ways to increase science proficiency and
overall science literacy. Research about
how students learn science should be
used to develop teaching strategies that
facilitate student learning. With better
teaching methods and improvements
in science instruction, students will
develop deeper understanding of
science concepts, which should
translate into better performance on
assessments (Gallagher, 2007).
Higher levels of science
achievement can be attained through
an understanding of how students
learn science, as well as development
and implementation of more effective
instruction. It is also known that
learning science is a progression
throughout the years and that a more
thorough understanding of science
concepts occurs when depth is
emphasized over breadth of content. In
addition, the content must be organized
in a conceptual framework that allows
for the retrieval and application of
scientific knowledge. This needs to
begin in the early stages of science
education and not just at the secondary
education level.
The subject of improving student
achievement in science is becoming
increasingly important, because
districts and schools are now being
held accountable for student success
under NCLB. Students are also feeling
the pressure to achieve as they are faced
with passing a large-scale, high-stakes
science assessment in order to graduate
from high school. Our current science
education policy may have put the cart
before the horse by expecting results
without allowing time to institute
changes in practice. Instituting even
a few of the proposed changes to
our current system of education
could have significant impacts on
student learning and achievement in
science. However, it will take time
to implement such changes, and the
results may not become apparent
for many years. Nonetheless, policy makers need to review the current research in science education and assist educators in acquiring the tools to help our students achieve success in science. In the long run, this will benefit not only individual students but also our communities and our country by preparing our youth to compete in the global economy.
Finally, science educators have
the responsibility to provide the most
effective science education possible
to our students so that they have the
skills necessary to be successful adults.
As the 21st century economy becomes
more global, American students need
to be more competitive with their
foreign counterparts, and to accomplish this they must have the scientific knowledge necessary to secure work in the growing
fields of science, engineering, and
technology.
References
American Association for the Advancement
of Science (1993). Benchmarks for
science literacy: Project 2061. New
York: Oxford University Press.
Banilower, E.R., Heck, D.J., & Weiss, I.R.
(2007). Can professional development
make the vision of the Standards a
reality? The impact of the National
Science Foundation's Local Systemic Change through Teacher Enhancement
Initiative. Journal of Research in
Science Teaching, 44(3), 375-395.
Bybee, R.W. (2006). The science
curriculum: Trends and issues. In
Rhoton, J. & Shane, P. (Eds.). (2006).
Teaching science in the 21st century
(pp. 21-37). Washington, D. C.: NSTA
Press.
Gallagher, J.J. (2007). Teaching science
for understanding: A practical guide
for middle and high school teachers.
Upper Saddle River, NJ: Pearson
Education, Inc.
Grigg, W.S., Lauko, M.A., & Brockway,
D.M. (2006). The nation’s report card:
Science 2005. Washington, DC: U.S.
Government Printing Office.
Loomis, S.C. & Bourque, M.L.
(Eds.). (2001). National assessment
of educational progress achievement
levels, 1992-1998 for Science.
Washington, DC: National Assessment
Governing Board.
Martin, M.O., Mullis, I.V.S., & Foy, P. (2008). TIMSS 2007 international science report: Findings from IEA's Trends in International Mathematics and Science Study at the eighth and fourth grades. Chestnut Hill, MA: Boston College.
Moreno, N.P. & Tharp, B.Z. (2006). How do students learn science? In Rhoton, J. & Shane, P. (Eds.). (2006). Teaching science in the 21st century (pp. 291-305). Washington, D. C.: NSTA Press.
Mundry, S. (2006). No child left behind: Implications for science education. In Rhoton, J. & Shane, P. (Eds.). (2006). Teaching science in the 21st century (pp. 243-255). Washington, D. C.: NSTA Press.
National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, D. C.: National Academy Press.
Stigler, J.W. & Hiebert, J. (1999). The
teaching gap. New York, NY: The
Free Press.
U.S. Department of Education. (2004).
Four pillars of NCLB. Retrieved June
11, 2008, from No Child Left Behind
Web site: <http://www.ed.gov/nclb/
overview/intro/4pillars.html>.
Valverde, G.A., and Schmidt, W.H.
(1997). Refocusing U.S. math and
science education. Issues in Science
and Technology, 14(2), 60-66.
Vitale, M.R. & Romance, N.R. (2006).
Research in science education: An
interdisciplinary perspective. In
Rhoton, J. & Shane, P. (Eds.). (2006).
Teaching science in the 21st century
(pp. 329-351). Washington, D. C.:
NSTA Press.
Wenning, R., Herdman, P.A., Smith,
N., McMahon, N., & Washington,
K. (2003). No child left behind:
Testing, reporting, and accountability.
ERIC Digest. New York, NY: ERIC
Clearinghouse on Urban Education.
Tara M. Owens is a biology and physical
science teacher at Whitmer High School in
Toledo, Ohio. She received her Bachelor of
Arts degree in psychology from Ohio Wesleyan
University, and has completed the Master’s
degree program in Curriculum & Instruction
at the University of Toledo. Tara is interested
in improving science education in the United
States through more effective teacher education
and reforms to education policy with an
emphasis on nationalization of standards and
testing. Correspondence concerning this article
can be sent to <tara-owens@bex.net>.
Mahsa Kazempour
Impact of Inquiry-Based Professional
Development on Core Conceptions and
Teaching Practices: A Case Study
This case study focused on changes in teachers’ core conceptions and the
translation of such changes to classroom practices needed to enhance
students’ science learning experiences.
Introduction
Teaching science through inquiry-based, student-centered instructional
methods has been consistently
emphasized by science education
reform documents such as the National
Research Council’s (NRC, 1996)
National Science Education Standards
(NSES), and practically all states have
adopted inquiry standards. The NSES emphasize inquiry both as content to be learned and as a way to learn science. In treating inquiry as content, the NSES
encourage students’ participation in
activities and learning opportunities
that allow them to experience the
process of scientific inquiry by posing
questions, developing and carrying
out experiments, gathering and
analyzing results, and communicating
findings with their peers. Through
this process, they also gain a better
understanding of the nature of science
and the importance of collaboration
and communication in science.
As an approach to teaching and
learning science, inquiry-oriented
instruction, based on the constructivist
theory of learning, emphasizes the
active role of students in the learning
process. In this model, teachers must
pay attention to and access students’
prior understanding and experiences,
which, in turn, should shape the
direction of instruction. Furthermore,
teachers need to guide and facilitate
the learning experience by allowing
students to take an active role in their
learning and construct their own
understanding through first-hand
experience, discourse, and reflection.
Assessment plays a critical role in
an inquiry-based classroom, because
it can help in diagnosing students’
prior knowledge, gauging students’
understanding throughout the learning
experience and guiding instruction,
and measuring their understanding
and knowledge at the completion of
the learning experience.
In order for science education
reforms to succeed, it is necessary
for teachers to be familiar with and
utilize inquiry-based practices in
their classrooms; however, this is not
the case in many classrooms around
the country (Weiss, Pasley, Smith,
Banilower, & Heck, 2003). Although
there may be numerous explanations
to account for this unfortunate
phenomenon, one of the most important
reasons to recognize and address is
teachers’ lack of familiarity with and
inability to effectively employ inquiry-
based instructional methods in their
classrooms. Inquiry-based teaching is
simply an abstract idea to teachers who
never encountered this type of teaching
during their own K-16 education and
did not learn to teach in this fashion as
part of their teacher preparation.
Prior studies (e.g. Cronin-Jones, 1991;
Hashweh, 1996; Keys & Bryan, 2001;
Thompson & Zeuli, 1999; Wallace
& Kang, 2005) have indicated that
teachers’ knowledge and beliefs about
(1) science, (2) the learning process,
(3) their students, and (4) effective
teaching influence their classroom
instructional practices. Hence, it is
evident that instigating changes in
teachers’ classroom practices requires
a transformation in their beliefs and
understanding with regard to the
abovementioned areas. Literature
on professional development (PD)
suggests that such changes, especially
improving teachers’ understanding
of how science operates and use of
inquiry-based teaching techniques,
can be achieved through effective
professional development programs
(Bazler, 1991; Caton, Brewer, &
Brown, 2000).
Professional development as a
tool to enhance teaching is especially
stressed in science education reform
documents (e.g. NSES) that emphasize
inquiry teaching; however, as
suggested by prior studies, not all
professional development experiences
can be defined as successful and
fruitful. For instance, Hawley and
Valli (1999) propose that short
PD models that simply “teach”
teachers how to teach through lecture
rather than involving them as active
participants in the process fail to be
effective. It is recommended that
PD programs be directed more by
the participating teachers and based
on teachers' long-term reflections on
their own conceptions and practices.
Professional development programs
should model inquiry-based instruction
and allow teachers opportunities to
experience science inquiry in an active,
collaborative setting and through
authentic inquiry research (Loucks-Horsley et al., 2003; Thompson &
Zeuli, 1999).
Beginning in 2003, one such
professional development program has
allowed high school science teachers
in a particular Midwestern state to
have opportunities to experience
science inquiry first-hand and learn
about inquiry-based teaching. Several
studies have focused on the participants
completing this program (Bonner,
Lotter, & Harwood, 2004; Lotter,
Harwood, & Bonner, 2007). The
case-study research by Lotter et al.
(2007) involving the three high school
science teachers who participated in a
two-week inquiry-based professional
development workshop reported on
the type and degree of change in
four core conceptions: conceptions
of science, conceptions of students
and student learning, conceptions
of effective teaching practices (esp.
inquiry), and conceptions about the
purpose of education (esp. science
education). It also indicated that the
type and amount of inquiry instruction
performed in the classrooms were both
positively and negatively influenced
by the participating teachers’ core
conceptions. Furthermore, these
findings alluded to internal and external
constraints that impeded participating
teachers' implementation of inquiry-based instruction in their classrooms.
Some of the key constraints, previously
mentioned by Tobin and McRobbie
(1996), include a perceived lack of
time, the need to prepare students for
state exams, and the need to cover
all of the material mandated by state
standards or school districts.
Purpose
The above case study focused on
changes in teachers’ core conceptions
and the translation of such changes
to classroom practices with regard
to only one specific course for each
participant. The instructional practices
of the study’s three cases and the
core conceptions that were found to
influence their instruction fell into
three categories: 1) teacher-guided
inquiry and few instructional changes,
2) real world inquiry-based units and
reflective teaching, and 3) controlled
inquiry and cautious change. It would
be valuable to extend these findings by
exploring other program participants’
core conceptions and instructional
practices and whether they fit any
of the mentioned categories. The
current study focuses on a participant who attended the same professional development program two years later and
whose teaching assignments included
three different courses. The aim of the
study is to explore the changes in the
core conceptions and instructional
practices of this teacher with regard to
all three courses. Furthermore, factors
that aid or inhibit the implementation
of inquiry-based teaching in these
different courses are examined.
Methodology
Context of Study
Beginning in 2003, a group of
science and science education faculty
at a large Midwestern university
took part in a collaborative effort
aimed at improving K-16 science
education. One component of this
multi-tiered project, which was funded
by a Howard Hughes Medical Institute
grant, included an inquiry-oriented
professional development (PD) for
high-school science teachers from
across the state. The PD consisted of
a two-week summer workshop and
three follow-up workshops during the
academic year. The summer workshop
was divided into morning and afternoon
sessions. In the morning sessions,
participants engaged in a variety of inquiry-oriented activities
and discussions. The first week was
devoted to teachers developing a
7-step plan aimed at solving their
students’ “bottlenecks,” which refer to
concepts that students have difficulty
comprehending (Bonner et al., 2004; Lotter et al., 2006). Teachers then
developed inquiry-based lessons to
address their selected bottlenecks.
During the second week, each
participant presented their inquiry-based lesson to the rest of the group,
followed by a discussion in which
facilitators and other participants
provided feedback on the lessons and
ideas about ways to improve them.
Each day, participants completed
readings on topics addressed in
the workshop and reflected on
the workshop activities as well as
their own learning and beliefs. The
afternoon sessions allowed teachers
to work in authentic settings alongside
assigned science faculty conducting
research in biology, chemistry, or
physics. Participants also completed
daily reflections on their experiences
in the labs.
Research Design
The current study is part of a
larger, ongoing study exploring the
experiences of the science teachers
participating in these annual workshops.
Since the aim of this study is to better
understand the experiences, changes in
conceptions and practices, and factors
influencing classroom practices of one
particular teacher, a qualitative case
study approach was deemed most
appropriate. This approach allows
for in-depth examination of data from
various sources in order to provide
a rich and holistic description and
picture of the particular case (Merriam,
1988). These data sources included:
a brief questionnaire on participants’
views and instructional practices both
before the workshop and during the
instructional year, field observations
in all three classes several times a
week for a period of four weeks during
the following academic year, and a
semi-structured interview and several
informal conversations during the
observation period.
Sample
The case in focus, referred to as
Seth from this point forward, was a
high school science teacher in the same
town as the university in which the
summer workshops were held. Seth,
who had been teaching for 17 years,
received his undergraduate degree
in geology and completed an M.A.T. program with a major in biology. He was
teaching College Preparatory Biology
(i.e. regular biology), Life Science
(remedial, lower level course), and
Advanced Environmental Science
(Junior/Senior level course) in a school
that was one of the few in the state to
receive a distinguished Great Schools
rating of 8 out of 10. The school has
slightly over 1500 students, 83% of
whom are white, 6% black, 4% Asian/
Pacific Islander, and 2% Hispanic, with 21% of the student body eligible for free or reduced-price lunch. The school's math and English scores are above the state averages. The classes are arranged in a Block 8 schedule with four 85-minute classes alternating every other
day. Seth taught two biology and an
advanced environmental science class
on one day and one biology and two
lower level life science classes the
next day with each class consisting
of 22-27 students. Seth’s proximity
to the University and the number of
different types of courses taught were
the main reasons that he was selected
for this study.
Data Analysis
Interview data were analyzed
using the constant comparative
method (Glaser & Strauss, 1967;
Denzin & Lincoln, 2000) to identify
themes regarding Seth’s four core
conceptions as identified by Lotter et al. (2007) and factors that influence
the implementation of inquiry-based
instruction. Observation logs were
analyzed in order to document
emergent patterns regarding Seth’s
instructional practices in the three
courses. The process of analyzing
the data involved several iterations
of reading and coding as well as
discussion of themes between the
authors to identify patterns.
Findings and Discussion
The following sections describe 1)
changes in Seth’s core conceptions,
2) changes in his instructional
practices, 3) factors that facilitated the implementation of inquiry-based teaching in his classroom,
and 4) factors that impeded such
instruction.
Conception of Science
Seth explained that, although a few
of his M.A.T courses had addressed the
nature of science to some degree, he
had continued to view science mainly
as a body of facts about the world
around us. He further explained that
his own experiences with learning and
teaching science had left him thinking
about science mainly as terminology,
facts, equations, and theories he had
memorized or learned superficially,
and he admitted that this influenced
his actions in the classroom. His main
focus had always been on science as
a content to be mastered. However,
upon completion of the summer PD,
he began to view science as more than
just facts and unrelated content as
described in the following quote:
I had always known that science
was more than just facts, but the
classes I have attended and those
I taught have caused me to lose
touch with many important aspects
of science and to overlook them in
my teaching. In my classes, facts
and terminology were always
emphasized, but now I see, and
try to help my students see, that
science is more than that. It is really
about posing questions and solving
problems. It is about thinking
critically and trying different things
and being active in the pursuit of
answering questions.
Furthermore, Seth’s understanding
of the scientific process expanded
from a simplistic, unrealistic scientific
method to a more cyclical and
integrated model of inquiry that
involves continued iterations of posing
questions, making observations,
collecting and gathering data, and
analyzing and communicating results.
Seth explained that he had always
begun his courses by introducing
the scientific method and followed
that specific protocol in the few labs
his students would do in class. He
emphasized that, although students’
thinking was of great value to him,
he had, up to that point, mainly done
cookbook confirmation type of labs
that allowed little room for obtaining
a unique answer. Seth explained:
Up to last year, my students
probably could all tell you the steps
of the scientific method. Sometimes
I would see some of them struggle
with the order of the steps or
become frustrated because they
did not get the “correct” answer.
But now I think back, and I see that
students can arrive at solutions to
problems in different ways. (Pause)
I had taken out the creative and
imaginative aspect of science. Even
though I had asked my students to
always base their conclusions on
evidence, I had invariably pushed
them to come up with the results
that confirm what I had taught
them. Now I want my students to
think outside the box. I want them
to be able to not be scared to state
that their results were inconclusive
or that their results do not support
their original predictions. It is still
difficult for them to do that because
they are not used to it, but at least
now I find myself pushing for that
mindset.
This indicates that, after the PD, Seth had a deeper understanding of
the nature of science and the process
of scientific inquiry. The PD discussions highlighted the inaccuracy of a rigid, linear model of scientific inquiry and presented the scientific process as fluid, because each step may lead to further questions, observations, and experiments. As a result, Seth replaced his conception of the scientific method as an inflexible set of rules with a cyclical model of scientific inquiry. These
changes make Seth’s conceptions of
science more consistent with ideas
presented in science education reform
documents. However, there were
some minor inconsistencies in his
responses that should be mentioned.
Although he indicated understanding
science as more than content and
the importance of science process
skills, he added that this was not the
case in all his classes. For example,
he explained that in his College
Preparatory (CP) biology course he
could not and did not emphasize the
more accurate depictions of the nature
of science and how science is done as
much as in his environmental science
course. He pointed to the continued
importance of presenting facts and
content information in that class in
order to prepare students for the state
exam and college. He further described
the current structure of the life science
course as also inconsistent with some
of the changes he had mentioned.
He added that he found it difficult
to portray an accurate depiction of
the nature and processes of science
to these students, because he had not
yet incorporated much change into the
techniques used for this class.
Conceptions of Students
Similar to his beliefs about science,
Seth’s conceptions of students also
underwent change as a result of the
summer PD. Seth admitted that,
prior to the PD, he did not take into
consideration the unique needs of
every group of students and taught all
of his classes in the same way without
regard for the diversity of learners in
his classroom. Because he had always
emphasized science as facts and
content to absorb and put to memory,
he had not paid attention to differences
in his students’ abilities, learning style
preferences, prior experiences, and
processes of cognition. As described
in the post-PD interview, he began to
view students as an important variable
in the equation.
I have come to realize that students
are not “blank slates” to be
injected with information. They
come to my classes with different
abilities, experiences, and levels
of understanding which I need
to acknowledge in my teaching.
I have also come to realize how
important their prior understanding
and experiences are, not only to
themselves, but to others in class.
There have been so many instances
where they have shared something
that has been valuable to our class
discussions and lessons. Instead of
saying ‘here is something new, let’s
learn about it’, it’s like ‘what do we
already know about this?’ So it is
more of an immediate connection
to their own experiences.
He continued to explain that “students
in the regular and higher level courses
are capable learners who should be
actively involved in their own learning
and given the freedom to explore their
own questions and discover content for
themselves with teacher guidance.”
Here again, a slight conflict in his views emerged as he proceeded to
comment: “Of course, students in the
lower level classes may be able to do
so too but need to be guided more and
should be given the tasks to complete
and the instructions to follow, because
they may have difficulty otherwise.”
Seth indicated a lack of trust in these
students’ abilities and a hesitancy to
allow them more autonomy in their
learning.
Conceptions of Effective Teaching
Seth’s new understanding about the
role of students in the learning process
partly describes his newly formed
beliefs about effective teaching. In
reference to his old teaching methods,
he described himself as “a usual
lecturer with frequent worksheets
and occasional labs and hands-on
experience.” When asked about his
post-PD views on effective science
instruction, he displayed plenty of
enthusiasm for the inquiry-based
method of instruction and mentioned
that he had moved away from traditional methods.
He also mentioned the importance
of “incorporating inquiry opportunities
for students to pose questions and
investigate them and use science
process skills and problem solving
skills in order to discover more about
various class topics.” He placed
emphasis on engaging students in the
learning process by making learning
personal and capturing their attention
and interest early in the process.
The following quote from one of
the informal conversations further
clarifies the change in his beliefs
regarding effective teaching:
Making it personal and relevant
and capturing students’ attention,
(pause) that was something that was
modeled in a lot of the workshop
sessions. Let’s say we have a demo,
what do we know about what’s
going on here? So trying to pull out
from them the knowledge and you
can guide that and add things to it
and it becomes a teachable moment
based on something they already
know instead of saying ‘here you
go, here is some knowledge’—I
think it is more engaging to them,
immediately captures their interest,
makes it more personal.
He also added:
I think it’s basically getting students
involved in coming up with their
own questions and directing their
own learning and engage them
more in the process of the lesson.
I think that is most valuable. One
aspect that I most like about it is the
gathering of common knowledge, in
a group. Students find that exciting
and empowering a lot of times.
He continued with his emphasis on
the importance of engaging students
in the learning and the “mind capture”
approach, as it had been referred to in
the workshops. It was also clear that,
in his revised view, lectures played
a less important role and were to be
limited to discussions that should
follow active exploration of concepts rather than precede it.
Capturing their interest is very
important, (pause) get them excited
about the lesson instead of just
me saying, ‘here, we are going to
lecture on a topic and then now we
are going to do a lab on it’. I had
always tried to introduce an idea
and then do a lab. This PD has
kind of changed my idea a little bit
(pause) pose a question and have the
problem present itself, then do the
lab, and then discuss the concept
at the end.
Furthermore, he viewed inquiry-based
teaching as an investigative approach
and defined any learning activity in
which “groups of students work as
collaborative teams to explore and
think through problems” as inquiry.
He continued: “In an inquiry-based
classroom, students may be presented
a problem or an action and be asked to
figure out why.” The PD workshops
heavily emphasized that this type of
inquiry could occur outside the walls
of the laboratory.
Conception of purpose of learning
The final category of beliefs
examined in this study was views about
the purpose of learning. In response
to questions related to the purpose
of learning, Seth described how the
PD had “opened my eyes” to realize
how in the past he had “incorrectly
viewed the purpose of teaching to be
for students to gain knowledge that
they can use in their future classes and
careers they pursue” without much
attention to anything besides content.
He stated that “scientific critical
thinking and problem solving are the
two most important goals of science
education” and added that possessing
these two capabilities “applies to every
student’s daily life and will continue
to be used in adulthood, regardless
of direct involvement in science.” He
emphasized the importance of giving
students the opportunity to “learn to
do science and think in a way that
scientists think—like looking at data
and interpreting them without help … .
to get to a point where they make those
judgment calls.” Finally, Seth made
a comment regarding his CP biology
course that indicated he had not yet
completely abandoned some of his
previous ideas. He described his CP
biology course as more content-driven
because of the “standards and the state
test.” He added that it is important that
“students come away from that class
with knowledge of certain vocabulary,
processes, and concepts that they will
encounter in their lives, college, or on
the state exam.”
Classroom practice
The second research question
is concerned with the ways in
which Seth’s four core conceptions
translate into teaching practices in
the classroom. Field observation and
interview data were used to provide
a rich description of his classroom
practice and evaluate the extent to
which his instruction was aligned with
science education reform initiatives
that call for inquiry-based teaching.
When asked about his teaching
practices since the PD, he described a
continuous process of reflecting on his
instruction and modifying lessons and
activities to make them more inquiry-based and student-centered. He stated:
“Since the workshops, I find myself
constantly thinking about changes. As
a teacher, I am looking at everything
so differently now.” Seth indicated that
although unable to create changes in
every aspect of his teaching or re-do
everything he had done so far, he was
attempting changes and thinking about
aspects he might handle differently in
the near future.
I can’t do it (inquiry) everyday,
especially with three different
classes that I need to teach, but
whenever I am really rethinking
a lesson that’s always in the back
of my mind ‘how can I do this in a
more inquiry manner’?
Observing Seth’s classrooms
clarified several items. First, the
rethinking and tweaking of lessons
and activities Seth had mentioned
were indeed occurring. Second, there
were noticeable differences in Seth’s
instructional practices, including the
incorporation of inquiry-based teaching
techniques in the three classes which
will be described below. The Advanced
Environmental Science course, based
on Seth’s own accounts and the
classroom observation data, was the
most inquiry-based class. Students
often worked collaboratively in teams.
Seth’s lectures had been replaced with
class discussions, video presentations,
and team presentations. Students
participated in projects and long-term
experiments rather than occasional,
brief, cookbook labs, which had been
the case previously. One example of a
long-term investigation that had been
introduced after the PD involved the
study of lemna. In previous years, Seth
had merely discussed and lectured
about population growth, and then
the class reproduced a simple lemna
population growth laboratory exercise
out of the textbook. However, this
year, he turned this one-time cookbook
lab into a year-long investigation that
spanned two semesters and addressed
other topics besides population growth,
including ecosystems. His description
of the project follows:
This year I wanted to do something
different and thought the lemna
project might be the best route.
First semester we explored the
population growth of lemna in a
more guided inquiry where I was
still the one directing students’
attention to the question and gave
them some directions for the
investigation and data collection.
But they were really into it. We were
able to address not only population
growth but also how to make data
tables and show data on graphs. It
was very successful! We got some
of the best growth curves I have ever
seen. Then this semester I thought it
would be cool to continue with the
lemna population activity and allow
my students more freedom this
time around. So I used the previous
project as a baseline study and
had my students think about how
the introduction of various things
into the environment might affect
the population growth of lemna. It
has been great! They have really
surprised me.
Classroom observations coincided
with the last week of the open-inquiry
lemna investigation. Students were
seen walking into the classroom and
going straight to their stations to check
on their lemna population and collect
data. This time was also used to carry
out routine procedures such as adding
more of the “contaminant”, checking
temperature, and adding water. Each
team was doing something different
in accordance with their investigation
design. Seth circulated around the
classroom and observed teams at
work. Occasionally, he would ask
members of a particular team questions
about their protocol, observations,
or other matters relevant to their
study. When in need of guidance, the teams would ask him questions; he listened carefully and, in return, asked further questions to guide the students rather than giving them the answers. Seth described his role in the
classroom as such:
It (lemna activity) is an ongoing
activity. So at the beginning of
each class I wander around as they
collect data and solve problems like
‘our lemna died what we do?’ I try
to get them to think and redirect
questions. ‘OK what should we do?’
They pose ideas such as ‘mess w/
the concentrations? Let’s try with
half and see what happens?’ So it
takes some thinking on your part
and not giving them the answer but
drawing it out of them.
After their initial period of
observation and data collection,
students returned to their seats and
had a brief chance to discuss their
findings and possible next steps with
their teams. Seth continued to facilitate
discussions. Afterwards, he asked
them to begin thinking about how to
analyze their data and present their
findings to the class. Students worked
in their teams to draw graphs, check
journal articles for prior studies similar
to their own, and discuss conclusions
and the implications of their study.
Several days were spent on this phase
of the project, and then several class
sessions were devoted to presentations
of the individual projects. Each
presentation was followed with a
question and answer session in which
the audience would pose questions
or make suggestions for improving
the study, and this would develop
into whole class discussions on the
implications of the findings. This
project was extremely student-driven
and engaging. Students were constantly
active in exploration, discussion,
analysis of data, collaboration, and
communication.
Seth also described another project
he had developed for this course that
had taken place prior to the observation
period. He had reflected on and
modified the recipe-type forest density
lab and created a more student-centered
investigation of the successional stages
of trees. He describes the forest
ecology investigation in this way:
I had always done the lab in the
book, and, although students used to
have fun going outside and looking
at the trees, I did not think they were
thinking much (pause). This time
we went out there, and we said ‘ok
let’s pose a problem—how can we
figure out what’s going on here?
Who are the dominant species?’ Got
some of their ideas, we came back
and talked and shared those ideas
and came up with pros and cons
of each. Came up with ideas that
were pretty similar to what we’ve
done in the past, but I felt this year
they had a better understanding of
what they were doing—whereas
in the past they were plugging in
numbers into equations and not
really understanding what those
equations were.
It was evident from the interview
and observation data that the most
changes had occurred in this class.
Seth described his interest in making
learning relevant to students and
allowing them to experience science
firsthand. He also described courses
such as environmental science as his
“favorite” and “ideal,” because they
gave him plenty of flexibility to teach
in this fashion.
My ideal classroom would
be outside. In some ways my
environmental science I tried to
make my ideal class. Part of it is
because there are not set standard.
I try to take them outside and
with field trips and look at local
resources and ecosystems to make
it more applicable and conducive
to their lives.
In his biology classrooms, the
tweaking and slow process of change
and reflection he referred to in his
interview were evident during the
observation period as well. This class
consisted of some inquiry-oriented
activities, but Seth mentioned that he
had not yet dramatically changed any
lab, activity, or unit in this classroom.
When asked why he had not yet taken
steps to modify this class in the same
manner as the advanced environmental
science course, he identified the
quantity of content material that
needed to be covered as the chief
factor preventing the more immediate
implementation of change.
With bio you need to cover. So
much of it is just vocabulary and
the concepts behind the vocab.
There is a time limit. Not easy to
do long-term experiments. You feel
like you have to cruise through the
material/units quickly, so you have
to modify the inquiry.
This is not to say that he did not reflect
on or change his instruction at all.
Instead of large, sweeping changes,
Seth had resorted to changing small
components of the course, such as
doing more class discussions in place
of lectures, using attention-grabbing
demonstrations or discrepant events,
posing problems to engage students
in the learning, and asking more
questions during activities and class
discussion. As he put it, “a lot of it is
not changing the lab, but how I present
the lesson and the topic—for example,
brainstorm before we start. Regular
lesson, but they introduce it.” Although
there were more instances of teacher-directed instruction in this class, Seth
attempted to maintain his facilitator
role during student activities. He also
gave students the opportunity to share
and discuss the results of the labs and
activities instead of simply doing the
activities and moving on.
As for Seth's third course, there was little to no change in the life science classes. Little inquiry-based learning occurred in this class, and it continued to be dominated by teacher-led lectures, occasional cookbook labs,
worksheets, and bookwork. When
asked about the life science course,
Seth explained that he had spent the
least amount of time changing that
course. Since the workshops, he found
himself thinking about his teaching
mainly in the other two courses. He
continued: “maybe it is because I am
used to using the set activities from
before that are shared with the other
instructor teaching the same course.”
Other possible reasons for the lack of
change in this course will be described
in a subsequent section.
Seth had, however, included some
demonstrations to catch students’
attention and interest. For example,
when discussing osmosis and diffusion,
he did a demo that involved placing
an egg in three different solutions:
water, vinegar, and corn syrup. He
also occasionally utilized video clips
of Bill Nye the Science Guy and other
educational videos to partially replace
his lectures. However, there was little
change in terms of the students’ role in
the learning process. They continued
to play a passive role in the learning
process in that they were most often
observed listening to Seth, watching
videos, and taking notes. They did
work in teams for their labs, but
this teamwork did not involve much
collaboration or communication, and
the conversations that did occur were
usually about procedural details.
There were hardly any discussions
of the steps being carried out or the
data gathered. Collaboration was not
extensive; in most teams there were
some students who were participating
less than others. Collaboration was
limited to following prepared steps,
reading out loud the instructions,
copying down the data, and cleaning
up. There were few or no questions
for students to think about and discuss
to guide their learning. Seth did try
to facilitate team discussions, but
these were limited to procedures and
observations. A significant portion of
the teams’ results were confirmatory
of his lectures and the textbook
information. As a result, students
often simply repeated his statements
or regurgitated information from the
textbook.
One such example occurred when
students looked at some slides of cells
under the microscope. This lab was
prefaced with reading the textbook
section on cells and a lengthy lecture
with transparency slides of plant and
animal cells and cell organelles. When
students were looking at the slides
under the microscope, they simply
scribbled a drawing of the slides
without much discussion. They were
often having off-topic conversations
about their personal lives, other
classes, and so forth. There was hardly
any discussion of their observations,
the differences between the types
of cells, or the organelles. The only
recognizable features of the cells in
their drawings were the cell wall,
the cell membrane, and the nucleus.
In addition, most students copied
down a few other lines or shapes
that they struggled to label. When
Seth approached one of the teams
and inquired about their drawings,
students began checking their lecture
notes in order to point to and name
the organelles that they had observed.
His conversations with the teams were
very limited and brief.
Factors promoting change
Seth was cognizant of the
changes that had occurred in his core
conceptions and the ways in which they were beginning to take shape in his teaching. He also repeatedly
mentioned that his understanding of
the processes of teaching and learning,
especially inquiry-based instruction,
had been enhanced as a result of his
participation in the PD.
I just feel very good about the
PD. I learned a lot, more than I
can describe. It will take me some
time to be able to digest all of it
and apply it in my classes. Like I
said before, I am finding myself
thinking about my teaching all
the time. I am incorporating some
changes here and there, and, even
though it may not be much, I have
learned so much!
Similar to the participants in the Lotter
et al. (2007) study, Seth cited numerous
aspects of the PD experience that he
had found beneficial to enhancing
his understanding. He felt several
features of the two-week summer
workshops were especially valuable.
First, the workshops modeled effective
instructional methods rather than just
informing participants about them.
Seth noted, “It was nice not to be told
or trained on what to do but rather
shown by the action of the facilitators
themselves. It was more powerful that
way.” Similarly, he found it immensely
useful to be an active participant and
experience inquiry-based learning
first-hand.
I felt like my students. I was
doing things in this workshop
rather than being given lots of
information. We went out and made
observations, we did the inquiry
activity with the bread facilitated
by the science facilitator, and so
forth. I found myself constantly
thinking and active. We then put to
use the information we had gained
about inquiry-based teaching and
looked for ways to change one of
our current lessons. I could not
imagine participating this much
and applying my knowledge so
quickly.
He also discussed the importance of
being in a group of peers and having
ample opportunity to discuss ideas with
them. He found the large and small
group discussions and conversations
“very stimulating and encouraging.”
Finally, he considered the readings,
activities, and discussions regarding
the inquiry process especially useful,
because “gaining a better understanding
of the process of science meant that I
would also try to portray science more
accurately in my classes and would
also try to have my students’ learning
mimic inquiry.” He added:
Learning about inquiry-based
learning and all the other stuff
we learned wouldn’t have made a
difference if we had not addressed
our misconceptions about the
scientific inquiry process first. I
used to drill the scientific method
into my students’ head. My teaching
of science was dry and linear and
mimicked the unrealistic scientific
method rather than the more
accurate model of the inquiry wheel
that we learned about.
Seth also discussed the importance
of the second portion of the summer
workshops, the research experience.
He felt it was extremely interesting
and valuable to join science research
laboratories and to work alongside
science faculty and graduate research
assistants. He mentioned that he thought
the afternoon sessions “complemented
the morning activities and discussions”
by allowing participants to “see and
experience science inquiry first-hand.”
He noted the experience equipped him
with a better understanding of science
content, investigative techniques and
equipment, and the process of doing
science.
Although I like to stay up to date
with information in my field, I found
much of the stuff I experienced
in lab very interesting and eye
opening. I had no clue about some
of the procedures or equipments.
It was so different to see these
scientists in action and to have some
part in their work during that short
period of time.
The experience also allowed him
to be more cognizant of his students’
experiences in science.
This was a great way for us teachers
to step out of the teacher mode
and see things from our students’
perspective. At times when I
couldn’t understand what was
going on around, I could totally
sympathize with my students. Do
they understand when I am lecturing
them or is the information just way
beyond them? At other times, I
found myself thinking ‘how can I
do this in the class’ or ‘how can I
apply this to my teaching so that
my students get to enjoy their
experience and learn from it as I
am’.
Finally, he noted the importance of
the experience in helping him to
better understand the value of working
collaboratively and communicating
effectively. They were able to “bounce
around ideas, share frustrations, and
explain things to each other.” He
added:
It was great to see the collaboration
amongst ourselves and also the
scientists that we were observing
or working alongside. I used to
have my students work in teams
but not enough and not effectively.
I am hoping I have gained a thing
or two in the PD.
At the time of the post-PD interview
and class observations, Seth had
already participated in one follow-
up workshop. He felt the follow-up
session had been necessary. He
indicated that going back to the schools
and trying to implement the lessons
learned in the summer was not an
easy process and expressed gratitude
for the opportunity to discuss those
experiences with his peers. The sharing
of lessons and the stories of successes,
failures, struggles, and means of
coping with the difficulties was cited
as extremely valuable. He referred
to the significance of feeling a sense
of community that allowed members
to benefit from sharing experiences,
ideas, and feedback.
It was just fabulous. We got a
chance to come back and just talk
for a while and discuss what we had
done and what it had been like. It
was amazing some of the similar
situations we had experienced. It
was also great to share how our
lessons went and share other lessons
we had come up with.
Seth also found the additional inquiry
activities that were modeled in the
workshop to be a good refresher of the
summer workshop. One such activity
Seth referred to was the modeling of
the 5E learning cycle that involved
investigating the process of burning a
candle and factors that affect the rate at
which that occurred. Seth added: “The
candle activity allowed me to see 5E
as a model of inquiry teaching that we
had learned about. This process made
sense, and I got to understand it even
better because I was experiencing it
like a student.”
Constraints to inquiry teaching
As noted above, Seth’s views and
core conceptions had undergone
major changes as a result of the PD,
but the implementation of inquiry-based teaching in his classrooms was neither as evident nor as consistent.
During the course of the interview and
informal conversations, Seth alluded
to several constraints and offered
a number of explanations for not
incorporating more changes and doing
so consistently in the three courses.
Figure 1 depicts the four main factors: 1) lack of support, 2) lack of time, 3) lack of resources, and 4) lack of flexibility, as well as the interconnections between them.
The overarching factor that directly or indirectly influenced many other areas was the lack of support that Seth described receiving from his peers, the department, school administrators, and the state. Seth explained that state-mandated tests and requirements had led to a series of practices at the district and school levels that inevitably made teachers like him feel a lack of flexibility and autonomy in their classrooms, especially in courses such as CP biology, which has a state-mandated exit exam.
Seth expressed feeling overwhelmed by the amount of content to be covered in the course as well as the need to prepare students for the state test and the end-of-unit tests that were created and used jointly by all biology teachers at the school. This inflexibility, paired with “a dearth of available inquiry-based curriculum material,” caused him to feel that he had insufficient time for both the planning and execution of inquiry-based lessons.
I have tried looking for inquiry
lessons to no avail. It is time-consuming and often unproductive.
I just do not feel I have the creativity,
energy, and the time to do more than
a few inquiry-based lessons at a time
or bring about more changes than
Figure 1: Four main constraints to implementing inquiry-based teaching
FALL 2009
VOL. 18, NO. 2
65
I have. It is just too much to try to
do, and I have really tried.
I really feel the lack of resources in
the other two classes.
I have to be honest that my lower
level bio class is getting the least
attention this year as I try to change
my teaching. I have a hard time
finding the time to change the other
two courses, and so I find myself
not paying as much attention to
changing this class and resort to
the old material I already have
preplanned. Maybe next year I can
spend more time on this course
too.
Finally, Seth felt a sense of isolation
and frustration, because he was the
only one in his department who had
undergone the PD experience, knew
about inquiry-based learning, or cared
much for it. He felt that he did not
have the necessary support from his
peers to be able to collaborate and
bring about changes on a wider scale
to science instruction in the school.
This lack of support also contributed to the issues already mentioned above, such as the overemphasis on content and the pressure to prepare students for school and state exams.
In my bio class, I just try to tweak
here and there and do some inquiry
whenever I can, but I just feel that
it is very hard to do in that class,
because I don’t have enough time
to cover everything. All the biology
teachers give the same test at the end
of the unit, and, no matter what I do, I need to make sure my students
are ready for those tests. There is
hardly any time to do long-term
projects and investigations.
Besides the lack of resources for
planning inquiry-based teaching,
there was also a lack of resources for
carrying out inquiry-based instruction.
For example, microscopes had to be
shared by three different classes, other
equipment and materials needed for
various activities and projects were
not available, and there was no funding
to purchase such resources or pay for
requested field trips.
Even in my environmental science,
I feel I could do a whole lot more
if I had the funding for purchase
of equipment or to fund a field
trip or two my students and I have
been interested in taking. At least
I can take them outside and use the
areas around school to explore, but …
Conclusions and Implications
This case study provided further support for the need for effective inquiry-based professional development opportunities that help teachers bring about the changes in their views and practices necessary to enhance students’ science learning experiences.
(e.g. Huberman, 1995; Lotter et al.,
2007), changing teachers’ views and
instructional practices is a slow and
intricate process that is dependent
on a variety of factors, as has been
illustrated to some degree in this study.
Seth’s case further demonstrated that
professional development experiences
should 1) occur over an extended period
of time, 2) involve active participation
of teachers by immersing them in
authentic scientific inquiry, inquiry-based activities, and discussions,
3) model effective inquiry-based
instruction, and 4) allow teachers
opportunities for continuous reflection
on their beliefs and practices during
the PD and in their classrooms in
order to identify areas that could be
improved upon and implement the
necessary revisions. There is also an immense need to provide PD participants with the means for continued communication and collaboration
in an effort to 1) share ideas and
inquiry-based lessons, 2) discuss
frustrations, obstacles, and successes
faced during the implementation of
inquiry-based instruction, and 3)
facilitate communal reflection on ways
to further enhance students’ science
learning experiences.
Beyond the PD-specific components, Seth’s case illustrated numerous additional factors in the school environment that influence the implementation of inquiry-based instruction and, therefore, require serious consideration. One such factor is the pressure of state-mandated tests and requirements on schools, some of which, as illustrated in Seth’s case, place tremendous emphasis on testing and coverage of content material, leaving little flexibility and time to plan and carry out inquiry-based lessons.
Additionally, as schools strive to raise test scores, little attention is given to professional development for teachers, and the promotion of inquiry-based instruction is virtually nonexistent.
Furthermore, science teachers within a department are often instructed to keep their instruction uniform, especially in the core courses on which students are tested, and to administer the same end-of-unit exam for all sections of a course. These exams, along with the state-mandated tests, often overemphasize content and vocabulary and are frequently unaligned with the inquiry-based instruction that PD participants wish to incorporate in their classrooms.
It is imperative that school
administrators realize the power of
inquiry-based learning in enhancing
student learning and science
experiences. The emphasis on testing and content-driven curricula must be replaced with an emphasis on enhancing student learning through experience in order to develop a science-literate student population
as defined by the NSES (NRC, 1996,
p. 22). School administrators must
play an active role in encouraging
inquiry-based teaching and learning in
all aspects of the school by providing
teachers with the encouragement and
support necessary for participation
in professional development and
implementation of inquiry-based
instruction.
In addition to the lack of flexibility and time for inquiry-based instruction, the scarcity of time available to devote to creating and planning inquiry-based lessons makes achieving these goals extremely challenging. Indeed,
inquiry-based curriculum materials
are scarce, and many teachers, such as
Seth, find it difficult, time-consuming,
and sometimes unproductive to
undertake the process of converting
to inquiry-based instruction. The
science education community must
strive to equip teachers with inquiry-based curriculum materials and aid
teachers in finding resources and
planning out their own lessons and
units. Teachers who participate in
PD experiences may find themselves
struggling to concomitantly meet
school requirements, adopt inquiry-based instruction, and create a
community of change within their
schools. In order for these teachers to
be successful, they must be provided
assistance along the way in the form
of peer and expert coaching (Fullan &
Stiegelbauer, 1991; Thiessen, 1992).
Another underlying issue is the
lack of support and the sense of
isolation PD teachers feel when
they return to their schools and find
themselves surrounded by colleagues
who may not be familiar with
inquiry-based learning or have no
interest in non-traditional methods of
teaching. Several steps, in addition to
continued communication with the PD
facilitators, must be taken to alleviate
this sense of isolation and helplessness.
First, PD facilitators should encourage
and assist participants in finding a
means of staying in communication
with one another upon their return
to their schools. This could be done
by arranging group meetings or
through social networks. Services
such as Twitter and Facebook or
online discussion forums provide
a convenient, low-cost medium
through which members can stay
abreast of group activities and share
lessons, ideas, problems, and so forth.
These communities could even be extended, through the formation of critical friends groups, to include other teachers from across the country who have gone through similar experiences. Second, many of the
previously mentioned obstacles may
be eliminated if PD planners focus
on teachers from the same schools or
districts so that they are all equipped
and better prepared to promote and
instigate changes in science instruction
once they return to their buildings.
Moreover, this will enable these
teachers to work collaboratively in
planning lessons, creating assessments
that are aligned with the curriculum,
receiving feedback on their instruction
from each other, and discussing issues
and obstacles that they may continue
to face. Focusing on “communities of
practice” and building a “professional
culture” allow for supportive and
nurturing environments that are key
to the adoption of inquiry-based
and effective instructional practices
(Loucks-Horsley et al., 2003, p. 91). If the ultimate goal is to better prepare a science-literate citizenry, we must begin our work not only by enhancing the instructional capacity of teachers through effective professional development, but also by calling attention to the culture of the educational institutions to which they return and to the needs that may arise after the PD experiences.
References
Bazler, J. A. (1991). A middle school
teacher summer research project.
School Science and Mathematics,
91(7), 322-324.
Bonner, J. J., Lotter, C., & Harwood, W. S. (2004). Improving student learning, one bottleneck at a time. The Science Teacher, December 2004, 26-29.
Caton, E., Brewer, C., & Brown, F. (2000).
Building teacher-scientist partnerships: Teaching about energy through
inquiry. School Science and Mathematics, 100(1), 7-15.
Cronin-Jones, L. L. (1991). Science
teacher beliefs and their influence
on curriculum implementation: Two
case studies. Journal of Research in
Science Teaching, 28, 235-250.
Denzin, N. K., & Lincoln, Y. S. (2000). Introduction: The discipline and practice of qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 1-28). London: Sage.
Fullan, M. G., & Stiegelbauer, S. (1991).
The new meaning of educational
change (2nd ed.). New York: Teachers
College Columbia University.
Glaser, B. G., & Strauss, A. L. (1967).
The discovery of grounded theory.
Chicago: Aldine.
Hashweh, M. Z. (1996). Effects of science
teachers’ epistemological beliefs
in teaching. Journal of Research in
Science Teaching, 33, 47-63.
Hawley, W. D., & Valli, L. (1999). The
essentials of effective professional
development: a new consensus. In L.
Darling-Hammond & G. Sykes (Eds.),
Teaching as the learning profession
(pp. 127-150). San Francisco: Jossey-Bass.
Huberman, M. (1995). Networks that
alter teaching: Conceptualizations,
exchanges and experiments. Teachers and Teaching: Theory and Practice,
1(2), 193-211.
Keys, C. W., & Bryan, L. A. (2001). Co-constructing inquiry-based science
with teachers: Essential research for
lasting reform. Journal of Research in
Science Teaching, 38(6), 631-645.
Lotter, C., Harwood, W. S., & Bonner, J. J. (2006). Overcoming a learning bottleneck: Inquiry professional development for secondary science teachers. Journal of Science Teacher Education, 17, 185-216.
Lotter, C., Harwood, W. S., & Bonner, J. J. (2007). The influence of core teaching conceptions on teachers’ use of inquiry teaching practices. Journal of Research in Science Teaching, 38(6), 650-661.
Loucks-Horsley, S., Love, N., Stiles, K., Mundry, S., & Hewson, P. W. (2003). Designing professional development for teachers of science and mathematics (2nd ed.). Thousand Oaks, CA: Corwin Press, Inc.
Merriam, S. B. (1988). Case study research
in education: A qualitative approach.
San Francisco: Jossey-Bass.
National Research Council (NRC). (1996).
National science education standards.
Washington, D. C.: National Academy
Press.
Reiff, R., Harwood, W. S., & Phillipson,
T. (2002). A scientific method based
upon research scientists’ conceptions
of scientific inquiry. Paper presented
at the Association for the Education
of Teachers in Science, Greenville,
N.C.
Thiessen, D. (1992). Classroom-based
teacher development. In A. Hargreaves
& M. G. Fullan (Eds.), Understanding
teacher development (pp. 85-109).
New York: Teachers College Press.
Thompson, C. L., & Zeuli, J. S. (1999). The
frame and the tapestry. In L. Darling-Hammond & G. Sykes (Eds.), Teaching as the learning profession (pp. 341-375). San Francisco: Jossey-Bass.
Tobin, K., & McRobbie, C. J. (1996).
Cultural myths as constraints to the
enacted curriculum. Science Education,
80(2), 223-241.
Wallace, C. W., & Kang, N. (2004). An
investigation of experienced secondary
science teachers’ beliefs about
inquiry: an examination of competing
belief sets. Journal of Research in
Science Teaching, 41(9), 936-960.
Weiss, I. R., Pasley, J. D., Smith, P. S., Banilower, E. R., & Heck, D. J. (2003). Looking inside the classroom: A study of K-12 mathematics and science education in the United States. Chapel Hill, NC: Horizon Research, Inc.
Mahsa Kazempour is a Visiting Assistant
Professor of Science & Math Education at
Fairfield University in Fairfield, CT. She has
co-authored several papers and presented at
national conferences including NARST, ASTE,
and SSMA. She serves as the faculty mentor for
the Fairfield University NSTA student chapter.
Correspondence concerning this article can be
sent to <mkazempour@fairfield.edu>.
Information for Authors
The Science Educator, official journal of the National Science Education Leadership Association (NSELA), is a refereed journal which seeks manuscripts dealing with topics and issues of interest to professionals involved in science education leadership across a variety of roles, institutions, and agencies. The journal is published bi-annually with provisions for quarterly publication. The journal serves as a vehicle for the exchange of information on current theory, research, and classroom applications. Articles by both practitioners and researchers are encouraged. Generally, articles are sought that aid practitioners in making more informed decisions about current or future operations, and which improve science teaching and learning, at all levels, in our nation’s schools. Of particular interest are topics that deal with the following issues: curriculum and instruction, science education reform, science education leadership, preservice and inservice teacher enhancement/professional development, teaching and learning, needs assessments/research and evaluation, and policies and practices.
To have a manuscript considered for publication, please review the guidelines below:
1. All manuscripts should be sent electronically to <rhotonj@etsu.edu>; manuscripts should not exceed 5000 words. Samples of previously published articles in the Science Educator can be viewed on the NSELA homepage <www.nsela.org>.
2. Manuscripts should be typed double-spaced on standard 8.5 x 11 inch paper. Figures, tables, and pictures must be submitted in camera-ready condition. Tables should be typed on separate pages at the end of the manuscript.
3. Manuscripts must be concisely written. It is important to maintain coherence of thought.
4. The first page of the manuscript should contain the article title, author’s name, affiliation, and address to which correspondence and proofs should be sent. The author should not place his/her name on the other manuscript pages so that anonymity can be maintained during the review process.
5. References cited in the manuscript should be listed at the end of the manuscript. Provide accurate and complete information. For questions of format, follow the Publication Manual of the American Psychological Association (Fifth Edition) and Webster’s College Dictionary.
6. Authors who are not members of the National Science Education Leadership Association (NSELA) will
be assessed a publication fee of $50.00 per
article. Nonmembers may avoid this charge by joining NSELA prior to their manuscripts being published. The institutional rate is $75; the individual journal-only rate is $30. Membership dues are
payable at the beginning of each calendar year.
Information and application for membership can
be obtained from the NSELA Executive Director
<susansprague@yahoo.com> or Membership
Chair <beth@seven-oaks.net>.
7. Receipt of manuscripts will be acknowledged.
All submitted manuscripts will be reviewed by
at least three members of the Editorial Board.
Authors will be notified by the Editor regarding the
recommendation on publication of the manuscript.
Transfer of the author’s copyright to the National
Science Education Leadership Association is a
condition for publication.