NCTQ Reading Policy Action Guide 2024
How States Can Implement and Sustain Strong Reading Instruction
January 2024
CONTENTS
Introduction
Action 1: Set specific, detailed reading standards for teacher prep programs
Action 2: Review teacher prep programs to ensure they teach the science of reading (and do NOT teach contrary practices)
Action 3: Adopt a strong elementary reading licensure test
Action 4: Require a high-quality reading curriculum and train teachers on how to use it skillfully
Action 5: Provide professional learning and ongoing supports to sustain implementation of the science of reading
Common themes
Acknowledgements
Appendix: Guidelines for alternatives to licensure tests
Endnotes
INTRODUCTION
The last few years have marked a watershed moment for reading instruction. Many states have
passed new policies to support effective reading instruction, and more states may soon follow
suit. (To read more about the policy changes states across the country have made, see State of
the States 2024: Five Policy Actions to Strengthen Implementation of the Science of Reading.)
The states most successful in leveraging policy to improve reading outcomes for students have
taken a cohesive and comprehensive approach focused on improving teachers’ capacity to
deliver great reading instruction.
Each policy action in isolation can make a difference for students. But when done in concert,
these policy actions build upon and bolster the others, leading to state policies that are
greater than the sum of their parts in their ability to boost reading outcomes for students.
This action guide also shares stories of states that have leveraged these policy actions to
support greater teacher effectiveness in reading. Each section explores how state leaders
invested in teacher prep programs and teachers, the pitfalls they faced, and how they
overcame them.
ON IMPROVING LITERACY
“If you want improved outcomes for literacy, there’s hard work
involved. We didn’t realize that half the battle was getting there,
and the other half is staying there.”
Sean Ross
Executive Director, Arizona State Board of Education
States need to clearly and explicitly communicate what teacher prep programs must teach
their candidates, or programs will likely fall short of making important and necessary
changes. Listing components of reading without providing more detail gives programs a
great deal of leeway and undermines the efficacy of these standards. States and districts are
investing hundreds of millions of dollars in professional development for teachers, often on
skills they should have learned in teacher prep programs. To prevent states and districts from
having to repeat these investments, every new cohort of teachers should enter the classroom
well versed in scientifically based reading instruction (SBRI).
Clear standards are part of the chain of strengthening teachers’ knowledge: they help
programs identify what their candidates need to know and be able to do, give states the
criteria through which to hold programs accountable, and set candidates up for success on
aligned reading licensure tests that provide a final check on their knowledge before becoming
teachers of record.
CURRENT PRACTICE
[Figure: Number of states]
State leaders who updated their teacher prep standards consistently shared
that they achieved greater success when they brought many people to the table.
Successful states include literacy experts, as well as the people most affected by
these standards: prep program leaders and faculty, district leaders and teachers,
and parents and caregivers. Teacher prep programs need to be able to implement
these standards in their courses. State leaders should engage them in building the
standards and give them an opportunity to weigh in so that they understand what
the standards should look like in practice and feel more invested.
Standards should do more than list the components of reading. Consider what
teachers need to know and to be able to do and what strong reading instruction
looks like.
No matter where they teach, every elementary teacher is likely to teach English Learners (ELs)
and students who struggle to read. Standards for prep programs should clearly
delineate what teacher candidates should know to support these groups of
students in becoming proficient readers.
Outcome data can help identify where programs are effectively teaching
standards and where they need to strengthen instruction. For example, Florida
publishes data linked back to teacher prep programs, including whether
program completers achieve learning gains for their students (as a whole and for
specific groups of students such as those who are economically disadvantaged)
and completers’ scores on teacher evaluations.1 Outcome data can include pass
rates on reading licensure tests and student growth data connected back to their
teachers’ prep programs. Evidence of success can also help build the case for
keeping this focus on SBRI going.
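To make the idea of program-level outcome data concrete, here is a minimal sketch, assuming a simple candidate-level data layout (the field names, sample values, and 80% flag threshold are illustrative assumptions, not NCTQ's or any state's actual schema), of how first-attempt pass rates might be summarized by prep program:

```python
# Minimal, hypothetical sketch: program-level first-attempt pass rates on a reading
# licensure test. Field names, sample data, and the 80% flag are illustrative assumptions.
from collections import defaultdict

candidate_records = [
    # (prep_program, passed_on_first_attempt)
    ("Program A", True),
    ("Program A", False),
    ("Program A", True),
    ("Program B", True),
    ("Program B", True),
]

totals = defaultdict(lambda: {"taken": 0, "passed": 0})
for program, passed in candidate_records:
    totals[program]["taken"] += 1
    totals[program]["passed"] += int(passed)

for program, t in sorted(totals.items()):
    rate = t["passed"] / t["taken"]
    status = "needs review" if rate < 0.80 else "on track"
    print(f"{program}: {rate:.0%} first-attempt pass rate ({status})")
```

A real state data system would add student growth measures and subgroup breakdowns, but the basic step of grouping outcomes back to each prep program is the same.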
HOW TO DO IT
UTAH
Utah turned its attention to reading because one out of every two of its children could not read
proficiently. While this statistic puts Utah above many states, education leaders felt it was not
nearly good enough.
“If we want to improve outcomes for kids and their life success,
reading is critical... This is today’s civil rights movement.”
Jennifer Throndsen
Director of Teaching and Learning, Utah State Board of Education
The state began by gathering information about the current context. In 2016, they surveyed
practicing educators to ask how confident they were in teaching the five core components
of reading, finding that teachers were least confident in phonemic awareness and phonics.
Addressing this problem would require a focus on the teacher pipeline, so the state convened
the eight teacher prep programs across the state to enlist their help. Through those
convenings, literacy faculty from the teacher prep programs worked with district literacy
specialists and state education leaders to establish clear, specific standards for what teacher
candidates need to learn about literacy.
While these standards were intended as suggested guidelines for programs, the prep program
faculty themselves asked the state to put these standards into Board Rule (which has the
effect of law) to require the programs to follow the standards. Now faculty are pushing the
state to go further and create a requirement for the number of classes programs must devote
to reading instruction.
These standards have laid the foundation for other steps the state is taking to strengthen
literacy outcomes for children, including increasing its teacher prep program approval focus
on literacy and implementing a strong licensure test (see Action 3).
The collaborative process between the state and the teacher prep programs has fostered a
continuous improvement approach. Faculty can now opt into a regular convening of faculty
from across preparation programs, a group which has swelled to represent about half of all
core literacy faculty from the eight prep programs. Utah state staff facilitate these meetings
in coordination with participating teacher prep faculty. They use the meetings to learn
from one another and share what is and is not working in their programs. Participants in
the convenings can also opt into training to strengthen their own understanding of SBRI,
called Language Essentials for Teachers of Reading and Spelling (LETRS). These convenings
are driven by data, as the leaders use Foundations of Reading licensure test data to identify
programs that have strong outcomes in certain areas so that the faculty from those programs
can share how they are achieving results.
Utah has steadily pushed prep programs toward stronger literacy instruction, but at a pace
that has allowed the prep programs themselves to take partial ownership over the work and
to build their own capacity to follow the state’s lead. This approach is one that Throndsen
characterized as “gentle pressure, relentlessly applied.”
Phonics

Understand that phonics is the connection between graphemes and phonemes and how they form words.
    Understand the connection between the sounds and corresponding letters.

Know and apply strategies for organizing word recognition and spelling lessons by following an explicit instruction phonics lesson plan.
    Use an explicit phonics lesson framework that includes review of a previously learned skill or concept, introduction of a new skill or concept, supported practice, independent practice, and fluent application to meaningful reading and/or writing.

Know the structure of English orthography patterns and rules that inform the teaching of single- and multisyllable regular words.
    Define key terms (e.g., grapheme, phoneme, syllable, suffix), and identify examples of each. Map regular words by phoneme-grapheme (or grapheme-phoneme) correspondences.
Or consider the short text in the Council for the Accreditation of Educator Preparation
(CAEP)’s standard 2.a—the only CAEP standard on literacy—which several states rely upon to
serve as their teacher prep program standards:
The other national accrediting body, Association for Advancing Quality in Educator
Preparation (AAQEP), has no specific standards related to reading instruction and makes no
mention of the five core components of reading, instead relying upon the specific knowledge
candidates must demonstrate based on the state’s licensure requirements and standards and
on national standards from the International Literacy Association.4
In Michigan, the state education agency was concerned when hearing repeatedly from
districts that their first-year teachers were coming in poorly prepared. At the same time, the
governor had established a commission on literacy and the legislature had allocated several
million dollars to update the state’s licensure tests. The state also learned through formal
conversations with teachers, administrators, and people from teacher prep programs that the
current certification structure was seen as too broad and not deep enough, so that teachers
were not strong in everything they were licensed to teach. In response to the concerns raised,
the state created narrower certification bands, coupled with more aligned teacher prep
standards, and then developed new licensure tests.
To revise the early literacy standards, the state gathered 75 stakeholders (representing varied
roles, including teachers, parents, teacher prep leaders and faculty, from a range of different
locations and backgrounds) along with literacy researchers. The group decided to create
new teacher prep standards from scratch, outlining what teacher prep programs should be
preparing candidates to do for each licensure band. This framework is based on domains of
early literacy (e.g., phonics), grounded in the four critical aspects that teachers need to know
about each domain: (1) What is it, (2) How does it develop in a child, (3) How do you teach it,
and (4) How do you assess it?5
The state built the PreK-3 certification framework and related teacher prep standards,
then moved on to the grades 3-6 certification framework, engaging many of the same
stakeholders. Since then, they have turned their attention to implementation. These finalized
standards served as the foundation for the state’s new licensure tests. The state worked
extensively with prep programs to help them implement these standards and align their
coursework, using the program approval process to review programs for their alignment
with these revised standards. These steps included:
Monthly drop-in program revision workshops hosted by the state team for prep
program faculty to discuss the standards and how they are evident in coursework
across programs.
It is too soon to track the effect on the state’s literacy outcomes, but already the state has seen
a shift in program coursework and higher pass rates on licensure tests.
Many states’ standards now address what teacher candidates need to know to support
a diverse range of learners, including English Learners and struggling readers.
CALIFORNIA
California’s new literacy standards6 for multiple subject programs (the state’s general
elementary certification) specifies that coursework and field experiences should include
attention to struggling readers and English Learners, and provides specific skills to
support these students (e.g., for struggling readers, screening students for potential
learning disabilities including dyslexia; for English Learners, basing foundational skills
instruction on their previous literacy experiences in their home language, helping them
use English to access academic content across all subjects, and developing oral language
proficiency).
COLORADO
Colorado has an entire set of standards7 related to teaching English Learners that applies to
all teacher prep programs, and which is intended to be followed in addition to (not instead
of) the state’s Culturally and Linguistically Diverse (CLD) Endorsement. These standards
apply to every prep program (for example, not only elementary education, but also
computer science and secondary mathematics). Programs preparing for program approval
processes can complete a matrix in which they indicate what coursework satisfies each
standard and substandard.
FLORIDA
Resources
Ten maxims: What we’ve learned so far about how children learn
to read by Dr. Reid Lyon
WHY THIS ACTION MATTERS
Every teacher knows that class rules are important, but they quickly become meaningless if
teachers fail to enforce them. Similarly, well-defined and clearly communicated standards
for teacher prep programs have limited value if states do not hold programs accountable for
meeting them. Program approval offers this enforcement mechanism. It requires programs
to verify that their coursework is aligned with the state's standards, that they adequately address scientifically based reading instruction (SBRI), and that they omit content contrary to
research-based practices. When programs are out of alignment, the program approval and
renewal process is the state’s opportunity to either compel programs to improve or to levy
consequences that should include the possibility of closing a program down.
States that have established a stronger program approval system, using detailed standards and
including reviews of syllabi and licensure pass rates during program renewal, are seeing prep
programs teach SBRI much more consistently than before these changes were implemented.
CURRENT PRACTICE
Number of states:
Uses BOTH syllabi and pass rates: 16
Uses EITHER syllabi or pass rates: 19
Uses NEITHER syllabi nor pass rates: 16
States should use multiple available data sources to gather a holistic understanding of the
strength of programs’ instruction in reading.
Some states defer to national accrediting bodies for program approval rather than
conducting program reviews themselves. When states instead maintain control
of program approval, they can check programs’ alignment with the state’s own
standards and can give more attention to reading and other top priorities within
the state.
Syllabi provide insight into what instructors intend to teach and how they plan to
provide practice opportunities and assess candidates’ knowledge.
While programs should receive ample support and opportunities to improve, there
may be rare instances where they do not align with the state’s standards for reading
instruction. In the event that this happens and programs have been given sufficient
time to improve, states should be ready and willing to close down a program and to
guide that program’s candidates to other, more effective prep programs.
HOW TO DO IT
RHODE ISLAND
Rhode Island has implemented several reading policy changes simultaneously as part of a
Right to Read package of legislation. The legislation requires that teacher prep programs
align their coursework with SBRI.
Teacher prep programs in Rhode Island have to prepare candidates for two different levels
of understanding of reading instruction: For teachers who are not likely to teach early
literacy (e.g., secondary teachers), they must meet an “awareness” level of familiarity,
marked by completing about 10 hours of preparation. Elementary teachers, K-12 special ed
teachers, and others likely to teach early literacy must meet a much higher “proficiency”
bar. Rhode Island is currently reviewing plans from teacher prep programs for evidence
that they meet these levels, and the state will also include this review during the regular
cycles of the program approval process.
The Rhode Island Department of Education (RIDE) met with prep programs directly
to explain what the scope of the review would be and what the expectations were for
each program.
RIDE provided a folder for each prep program in Google Drive with examples of rubrics
and the types of paperwork they would need to complete, as well as a matrix showing
what candidates would complete throughout their program.
The state engaged with the Collaboration for Effective Educator Development, Accountability, and Reform (CEEDAR Center) to provide programs with a syllabi refinement tool. The tool is designed to help prep programs review their syllabi for alignment with Rhode Island's competencies, which are outlined in the state's Right to Read Act guidelines for educator prep programs, and to make a plan for updating courses to ensure that candidates complete the program with a robust understanding of the competencies.
Programs could work with a coach (a Rhode Island prep program faculty member who
had helped develop some of these program resources) who had already gone through
the review process.
Programs were invited to submit their materials for program approval several months early so that they could receive feedback from the state and make changes.
As of fall 2023, the state was in the process of reviewing programs’ submissions and providing
feedback. They are still seeing some reading practices that are not aligned to the state’s
expectations, as well as instances where programs’ syllabi say they are covering the science of
reading but their PowerPoint slides contradict that. This approval process allows the state to look
more deeply and with greater specificity at what programs are teaching their future teachers.
READING INSTRUCTION
The state has developed trainings targeted to teachers of specific groups of students. It worked
with external vendors on a Spanish-English program that meets proficiency requirements
for dual-language teachers, as well as a program targeted to educators of multi-language
learners, and it worked with a prep program to develop a course series for teachers of students
with severe intellectual disabilities, which meets the reading proficiency expectations.
Colorado also revamped its program approval process, building on the state’s new (and
more explicit) literacy standards, issued in 2016, as well as the greater authority provided by
Colorado’s 2012 READ Act. To effectively apply its new authority to ensure that programs’
reading instruction was aligned with state standards, the Colorado Department of Education
(CDE) created a detailed matrix for programs to complete prior to their site visit. In this
matrix, programs provide evidence about the “level of implementation” for each standard
and sub-standard, ranging from candidates having the opportunity to learn information
through course readings to candidates receiving feedback and reflecting on their practice.
The program approval process includes literacy experts who attend program approval visits,
review syllabi, sit in on literacy classes, and give feedback on programs’ alignment to state
standards. Reviewers also interview faculty, teacher candidates, and recent graduates to
gauge their understanding of SBRI.
When the state began its new review process, CDE realized that under its approval structure,
program review had only two possible end points: approval or probation. Putting a program
on probation prevents that program from enrolling new candidates, making it an unpalatable
option. Instead, CDE worked with the state board of education to create a new category,
conditional reauthorization, which they codified in policy. Conditionally approved programs
received a list of specific changes to make within one year. If they did so, they could be
recommended for full approval.
Between 2018 and 2023, CDE conducted 23 reauthorization site visits with programs that
have scientifically based reading standards in one or more endorsement areas (elementary,
early childhood, special education). Seven programs were subsequently put on conditional reauthorization to address the need for deeper SBRI content and understanding among their
candidates. Evidence from NCTQ’s 2023 Teacher Prep Review: Reading Foundations report
found that after only a few years, Colorado programs are now among the best in the country
for teaching SBRI, with almost no evidence of contrary practices.
Ohio recently passed legislation that provides programs with feedback, an opportunity to
improve, and then a high-stakes audit coupled with public transparency. The new legislation
requires the Ohio Department of Education to complete the following actions:
Revoke approval for programs that are not in alignment and have not yet addressed the findings of the initial audit.
Develop and publish annual summaries of literacy instruction strategies and practices
for all prep programs.
Develop a dashboard with first-time pass rates on the reading licensure test.
Even in states with explicit standards on reading for teacher prep programs, prep programs’
quality of reading instruction varies widely.10 Licensure tests, especially when used in concert
with strong standards for prep programs and a robust program approval process, offer an
important check of teachers’ knowledge of reading instruction. These tests also send a clear
directive to prep programs that they are expected to teach candidates this essential content and
provide helpful feedback to programs on where candidates are strong and where they struggle.
States that have implemented high-quality reading licensure tests are more confident
in incoming teachers’ knowledge, and, when aligned with teacher prep standards and
coursework, they are seeing higher pass rates than on the older tests.
CURRENT PRACTICE
[Figure: Number of states]
A reading licensure test should adequately address the core components of the
science of reading, as well as how to teach a range of diverse students (e.g.,
English Learners, struggling readers). The test should focus only on reading (or on
reading and English language arts), rather than combining reading with other
subjects (which makes it hard to discern teachers’ knowledge of reading
specifically). And the test should not include content contrary to research-based
practices (e.g., three-cueing) unless it makes clear that these are undesirable
practices. (See which commonly used reading licensure tests are rated
acceptable or strong.)
State education agency staff and teacher prep program leaders and faculty should
learn how to use these data systems to explore trends in the data in their program,
institution, or state. The training should empower them to identify candidates
who need additional instruction, identify areas in which prep programs need to
strengthen their reading preparation, and identify programs that excel in an area
and may serve as a model to other programs.
UTAH
Utah is phasing in a new reading licensure test, the Foundations of Reading, over a four-year
period. In year 1, programs could opt into taking the test. In year 2, everyone had to take
it, but there was no cut score (or programs could set their own). In year 3, everyone was
required to take it and would be held to the state's cut score, but passing the test was not
required for a teaching license. In year 4, candidates must pass the test to earn a teaching
license—and programs will be responsible for helping candidates succeed on the exam.
This test reinforced the new standards that Utah implemented (see Action 1) and has been a tool
to give data back to programs so they can identify their strengths and areas for growth. Some
programs have taken this exam more seriously, setting a minimum passing score that they
communicate to candidates; others have not, and that lack of emphasis is reflected in their low
passing rates (only 50% of candidates are passing the exam at these institutions).
SUPPORTING SUCCESS
“Give programs a grace period [with licensure tests], but also give
them the data to show them how they’re doing in reality.”
Jennifer Throndsen
Director of Teaching and Learning, Utah State Board of Education
Reviewing data from this test has fostered greater collaboration among institutions. Four years
ago, programs did not share data of any kind. Now the state education agency and instructors
and leaders from prep programs join the Utah Council of Education Deans’ group at least twice
a year to examine data and talk about outcomes.
A four-year rollout gives programs time to build capacity and revamp coursework.
The state engages prep programs in closely tracking candidates’ data and using that data
to identify strong programs that can train other program faculty.
The state pays for aspiring teachers’ first test attempt so the requirement does not pose
an excessive burden on candidates.
Utah engages their testing company to provide additional training on how to use the data
management system to further explore the data.
Utah law requires that prep programs provide candidates with additional support (e.g.,
course modules, tutoring) free of charge until they pass the Foundations of Reading, up to
ARIZONA
Two years ago, Arizona passed legislation requiring that all K-5 teachers of reading (e.g.,
general elementary teachers, special education teachers, English Learner teachers) earn a
K-5 literacy endorsement.
For in-service teachers, this requires coursework in the science of reading and in reading instruction (including interventions for struggling readers and students with dyslexia), as well as passing the Foundations of Reading licensure test. The state has provided a list of courses and trainings that meet the criteria for this endorsement. In-service teachers have until 2028 to earn this endorsement.
Pre-service teachers, who have until 2025 to meet this requirement, must take relevant
coursework in their teacher prep programs and also pass the Foundations of Reading test.
To ensure that candidates were prepared to not only pass the test but to teach reading, prep
programs had to add two new courses, one focused on the science of reading and one on the
science of reading with a focus on reading intervention for struggling readers and students
with dyslexia.
BUILD BUY-IN
“Let them learn, let them talk, let them see an exemplar.”
Sean Ross
Executive Director, Arizona State Board of Education
The state heavily emphasized collaboration and building buy-in from teacher prep programs.
Before the new legislation passed, the state held a series of convenings with teacher prep
programs. The first convening previewed the imminent law and gave prep programs a chance
to share their concerns. At the second, the state invited reading expert Louisa Moats to explain
what the science of reading is and what it is not, to address misconceptions about the term,
and to clarify that the science of reading and culturally relevant curricula can go hand in hand
(a specific concern raised by programs). At the third meeting, Arizona invited Dr. Angela
Rutherford from the University of Mississippi, who led the university’s transition to the science
of reading. She “spoke the same language” as the prep programs, explaining that many of her
colleagues resisted the emphasis on science of reading when it first rolled out, but they now
understand its value. This three-part convening series worked.
To further support the transition, Arizona invited the state’s testing company, Pearson, to
provide an overview of the material on the test so that faculty knew what to focus on in courses.
To ease the cost to teacher candidates, the state offers everyone in Arizona one free attempt.
They intentionally offer only one free attempt to encourage candidates and teachers to complete
the training and coursework first (since passing the test allows people to bypass the training).
MICHIGAN
Michigan used funding earmarked for updating its licensure test system to first update its
teacher prep standards and then build licensure tests to match (for more detail, see Action
1). The state saw higher pass rates on its new licensure test aligned with SBRI because
candidates’ preparation was more closely aligned. Moreover, the state found a benefit to
using a non-compensatory test (where candidates have to separately pass a subtest in each
subject): Candidates were ultimately more successful because when they struggled in one
area, they only needed to retake a subtest in that area, rather than studying for and paying
for the entire test again.
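As a minimal sketch of the scoring distinction only (the subtest names and cut scores below are invented for illustration and are not Michigan's actual values), the two rules can be contrasted like this:

```python
# Illustrative only: contrast compensatory vs. non-compensatory pass rules.
# Subtest names and cut scores are invented for the example.
subtest_scores = {"reading": 210, "mathematics": 245, "science": 250}
cut_score = 220  # assumed per-subtest cut

# Non-compensatory: every subtest must meet its cut; only failed subtests are retaken.
non_compensatory_pass = all(score >= cut_score for score in subtest_scores.values())
retakes_needed = [name for name, score in subtest_scores.items() if score < cut_score]

# Compensatory: a strong subtest can offset a weak one via the total score.
compensatory_pass = sum(subtest_scores.values()) >= cut_score * len(subtest_scores)

print(non_compensatory_pass, retakes_needed, compensatory_pass)
# False ['reading'] True: under the non-compensatory rule, this candidate retakes only the reading subtest.
```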
What are the strengths and weaknesses of the licensure test my state
currently uses?
Who is required to take the test and who isn’t? What does this mean
for student learning?
How long should my state take to roll out changes to licensure tests?
What is the right balance to strike between giving programs time
to understand the new requirements and adjust coursework and
ensuring that elementary students have teachers entering with a
strong understanding of reading instruction as soon as possible?
NCTQ blog post: How states are making licensure tests free to
aspiring teachers
NCTQ blog post: How some states use licensure test pass rate data to
build a stronger, more diverse teacher workforce
Curriculum materials aligned to the science of reading can make a real difference to students.
In fact, some researchers estimate that the effect of using high-quality curriculum materials
could be greater than the difference between a brand new teacher and one with three years of
experience.11 There are dozens, if not hundreds, of literacy curricula on the market, but their
quality and adherence to the science of reading vary widely. Some of the most popular tend
to use balanced literacy principles, methods that run contrary to the research.12 Good curricula
can also help ensure more equitable access to strong instruction. The adoption of a new
curriculum should be coupled with professional learning for teachers on how to implement
it effectively.
CURRENT PRACTICE
[Figure: Number of states]
Ensure the list aligns with the science of reading and is free of content contrary
to research-based practices (e.g., three-cueing, miscue analysis). Be sure the
curriculum includes resources to teach ELs and struggling readers. Providing a list
of high-quality required curricula (which can be based on existing reviews from
resources like EdReports) takes the guesswork out of curriculum selection for
districts and greatly cuts down on the time and energy district leaders must
devote to examining curriculum options. If this is not feasible, provide a
recommended list—and couple that guidance with a tool and training to enable
districts to vet curricula on their own.
Core curricula should be designed to meet the needs of all learners, including
English Learners and struggling readers. States should examine whether core
curricula meet these needs, and if they do not, states should identify supplemental
curricula that could augment the core materials. For example, Texas includes
“supports for all learners” in its rubric for evaluating course materials, and Rhode
Island provides a list of “non-negotiables” for selecting curricula that support
multilingual learners.
Prep programs often ask candidates to design lessons or entire units from scratch.
Yet as more districts and states are moving toward high-quality instructional
materials, teachers need to know less about how to create lessons and more about
how to implement or adapt pre-developed, research-based lessons. The Council of
Chief State School Officers (CCSSO) has developed a new set of competencies and
standards for teacher prep program coursework and clinical practice, which can
be applied as program approval standards, among other uses.
Several states have partnered with other organizations such as EdReports, which conduct
independent reviews of curricula, and then use these reviews to create websites identifying
which districts in the state use which curricula, along with information about the quality of
those curricula. Some states are investing in coaches to support teachers in implementing
new curricula.
RHODE ISLAND
The Rhode Island Department of Education (RIDE) created a “Curriculum Visualization Tool” that pulls in information
from EdReports to determine a quality rating for each curriculum. The tool allows users
to drill down by local education agency (LEA) or school to see what the curriculum is in
math and English language arts at each grade and whether it meets the expectations set by
EdReports, is locally developed, or has not been rated. This makes it easy to scroll through
and see, for example, the one district in the state that is using a curriculum that “does not
meet expectations for high quality.”
ARKANSAS
Arkansas considers how all literacy systems work together, in what Secretary of the
Arkansas Department of Education Jacob Oliva described as an “educational house.” In this
metaphor, the concrete foundation is strong standards outlining what students need at
each grade level. The floor of this house is the curriculum that teachers use to teach those
standards, the walls are training for teachers, and the roof is how the state and schools
measure student learning.
The state has sought to build a stronger house over the last few years, starting with revising
literacy standards for students. The new standards are more grounded in the science of
reading and more explicit about the types of texts with which students should engage. Now
the state is evaluating instructional materials using ratings from EdReports to ensure their
alignment with SBRI, resulting in a list of approved curricula.
To support teachers’ use of these curricula, the state has hired about 80 literacy specialists
to work in the highest-needs schools, focusing on training teachers on reading and on how
to teach these curricula. The state’s law is very explicit that literacy specialists can go into
classrooms and provide direct coaching and support (which does not factor into teacher
evaluation), since in some states, teacher contracts have prohibited literacy specialists from
providing direct feedback to teachers.
The state enforces curriculum requirements by tying funding directly to whether districts
use approved curricula. To monitor districts’ use of curricula, districts provide assurances
in annual reports about which curricula they use as their primary instructional tool(s).
Arkansas is also considering how to support all students, including English Learners. The
state is part of several national collaboratives that engage in this work, including
evaluating supplemental curricula for the needs of English Learners, specifically looking for
evidence of explicit attention to listening, speaking, reading, and writing.
“We have to act with urgency. These kids are in their critical
foundational years. This can’t be a 20 year plan.”
Jacob Oliva
Secretary, Arkansas Department of Education
The state continues to work on building the “roof” of its instructional house, developing
new assessments that are aligned to its elementary standards. The state is also working on a
coordinated progress monitoring tool to provide a snapshot of student performance in K-3.
Does my state tell the public about the curricula in use and
whether they are high quality?
Many teachers have not learned scientifically based reading instruction (SBRI) and are eager to
learn to increase their positive impact on students. States can select strong curricula17 that are
aligned with the science of reading, but teachers cannot effectively implement them if they do
not understand SBRI. Teachers need to be ready to identify misconceptions, provide scaffolded
support, and redirect or correct students when needed. No curriculum can anticipate every response or differentiation teachers will need; teachers themselves need to be
familiar with the research on reading so that they can build upon their curricula.
States that emphasize teacher training are seeing enthusiastic responses from teachers and are
making progress in tracking data on student outcomes.
CURRENT PRACTICE
Number of states: Yes, 30; No, 21
1 Secure funding:
Training teachers requires money. Many states were able to leverage Elementary
and Secondary School Emergency Relief (ESSER) funds to provide training to at
least a portion of teachers. If states were not able to use ESSER funds, consider
dedicating funds to provide professional learning now. With training on SBRI,
more than 90% of children can learn to read.18 The cost of providing training to
teachers is likely to be far less than the cost of providing remediation (also known
as Tier 2 and Tier 3 instruction) to students who struggle.19 If you cannot afford
to train all teachers at once, consider how to target teachers. Should teachers of
certain grades be prioritized? Teachers in high-need schools or districts? A literacy
specialist in each school who can share the instruction with their fellow teachers?
This decision should be informed by data about your state’s current performance
and areas of need.
2 Choose a training program:
Numerous training programs are on the market (see Resources, below, for
a link to a list of recommended programs) and vary in their cost and time
requirements. Some states have selected a single program, while other states give
teachers a choice between several options. States should play a role in vetting
these programs, especially when they are providing the funding. For a list of
professional learning programs recommended by NCTQ’s expert panel, see The
Four Pillars, page 8.
3 Limit the burden on teachers:
Teachers already have a full plate, and completing a training program requires
more time and energy. Consider steps to limit this burden, such as providing
teachers plenty of time to complete the requirement, providing the training at
different times (e.g., during the summer, on weekends or after hours, or during
the school day with substitute coverage), and aligning the training with the credit
hour requirements teachers must complete to renew their license.
4 Track outcomes:
Track data on teachers’ completion rates and feedback, and then on students’
reading outcomes after their teachers have completed the training. Tracking this
data can help identify what’s working well and what needs to change and can
make the case for future investments in training.
5 Share successes:
Especially for states that are rolling out training requirements gradually, sharing
success stories from teachers and their students is a powerful tool to build buy-in
and encourage more teachers to sign up for the training—and to take it seriously.
HOW TO DO IT
Both the District of Columbia and Arizona started small and scaled up. They leveraged
available funding, identified successes, and used early wins to make the case to bring
training to a wider scale of teachers.
DISTRICT OF COLUMBIA
The District of Columbia Office of the State Superintendent of Education (OSSE)20 used
ESSER funds from the pandemic to purchase training from TNTP on the science of reading.
This training required a one-time payment of nearly $1 million from OSSE, and the training
now lives on the state’s learning management system, where it is offered asynchronously
to teachers across the state at no cost. The state also leveraged federal ESSER funding and
Comprehensive Literacy State Development grant funding to provide Language Essentials
for Teachers of Reading and Spelling (LETRS) training for a large share of teachers and
administrators, with a heavier emphasis on early childhood teachers (D.C. offers free
universal preschool). Between the money from the state and an additional push for this
training from District of Columbia Public Schools (the traditional public school district
within D.C.), altogether, nearly 10% of the teacher workforce has either started or
completed training in SBRI.
OSSE has encouraged teachers to take either of these SBRI trainings by offering a stipend
($1,000 for the TNTP training and $1,200 for the more time-intensive LETRS training). D.C.
has seen high rates of teachers completing the training, as well as anecdotal evidence that
teachers enjoy the training and find it valuable. The state’s communication team is sharing
these stories to build further teacher engagement. OSSE is also building out a data system
that will allow them to track whether they see greater student outcomes for teachers who
went through the trainings.
“We are very cognizant that reading training takes time, and time is a
finite resource. We’re being as strategic as possible, and leaning into
existing activities when possible. We’re providing more resources and
structure to educators, which will enable them to be more successful,
feel more successful, and sustain them in their profession.”
Elizabeth Ross
Assistant Superintendent, Teaching and Learning, OSSE
ARIZONA
Arizona recently passed legislation requiring all elementary teachers to earn a K-5 literacy
endorsement, which necessitates additional training. But even before that legislation passed,
the state had begun a steady effort to train teachers, especially in high-need areas. Arizona
attributes much of its success to collaboration and bringing the right people to the table. A
decade ago, Read On Arizona assembled people from across the state to coordinate action
on literacy, often doing work that the government is not able to, such as fundraising. The
state also collaborated with other successful states through “learning collaboratives,” where
leaders from the states visited each other, exchanged lessons learned, and problem solved
(with Mississippi in 2016–17 and later with Florida).
The state started small, training a few hundred teachers in SBRI. Based on early successes, the
state education agency was able to work with the governor’s office and state legislature to
designate ESSER funding to train 4,000 teachers in LETRS, and many districts used their own
ESSER money to train more teachers.
Since Arizona could not afford to train all teachers at once, the state took an ingenious approach to identifying which districts and teachers to focus on first. They worked with a group called the
Maricopa Association of Governments (MAG), which houses data in numerous fields (e.g.,
transportation, environmental) for local governments across the state, but did not yet have
education data. The Arizona SEA and Read On Arizona provided MAG with education data,
and they built a cross-sector data set that used variables such as chronic absenteeism and
standardized test data, as well as census data, average age of first doctor visit, unemployment
claim data, and COVID outbreak numbers to identify “hot zones” across the state that would most
benefit from intensive reading instruction.
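The guide does not detail MAG's actual model, so the following is only a hedged sketch of the general approach described here: standardize several need indicators and combine them into a composite index. The indicator names, sample values, and equal weighting are assumptions for illustration.

```python
# Hedged, hypothetical sketch of a cross-sector "hot zone" index; this is NOT MAG's
# actual model. Indicator names, sample values, and equal weighting are assumptions.
from statistics import mean, pstdev

# One row per geographic area; higher values indicate greater need.
areas = {
    "Area 1": {"chronic_absenteeism": 0.22, "below_reading_benchmark": 0.55, "unemployment_claims_per_1k": 40},
    "Area 2": {"chronic_absenteeism": 0.10, "below_reading_benchmark": 0.30, "unemployment_claims_per_1k": 15},
    "Area 3": {"chronic_absenteeism": 0.18, "below_reading_benchmark": 0.48, "unemployment_claims_per_1k": 35},
}
indicators = list(next(iter(areas.values())))

def standardize(indicator):
    """Return z-scores for one indicator across all areas."""
    values = [areas[a][indicator] for a in areas]
    mu, sigma = mean(values), pstdev(values) or 1.0
    return {a: (areas[a][indicator] - mu) / sigma for a in areas}

z = {indicator: standardize(indicator) for indicator in indicators}
composite = {a: mean(z[indicator][a] for indicator in indicators) for a in areas}

# Rank areas from highest to lowest composite need.
for area, score in sorted(composite.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{area}: composite need index {score:+.2f}")
```

In practice, the choice of indicators and weights matters far more than the mechanics of combining them.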
With the newly required literacy endorsement, which all in-service and pre-service K-5
teachers of reading (including special education teachers) must earn, all elementary teachers
must now complete literacy training (though in-service teachers have until 2028 to do so).
In-service teachers can choose from a list of state-approved trainings to meet the 90-hour
training requirement. To ease the burden of this policy, the state is aligning the required
credit hours with the number of credits teachers already need to acquire for recertification
during that time period, making the training free to teachers, and offering the training at a
number of different times so that teachers can take the training on a schedule that works for
them. (For more detail about these requirements, see Action 3).
Perhaps best-known are the efforts of Tennessee and Mississippi, which both undertook
expansive efforts to retrain teachers on the science of reading.22 As a recent FutureEd report
details, Mississippi began by training literacy coaches, then extended LETRS training to all
K-3 teachers and K-8 special education teachers, though this training was only required for
“teachers in schools with the lowest literacy results.”23 A subsequent research study found
that after completing the LETRS training, teachers had increased scores on a measure of
their knowledge of early literacy skills. Ratings of teachers’ quality of instruction, teaching
competencies (e.g., planning, classroom management), and student engagement all increased
compared with those of teachers who had not started the training.24 Tennessee contracted with TNTP
to provide 60 hours of professional development on reading to elementary teachers across
the state; teachers were paid a $1,000 stipend for completing the training, and teachers in
grades K-2 also received curriculum materials. Both states have seen substantial increases in
teachers’ knowledge of literacy skills.
Resources
Build collaboration and collective impact: State leaders repeatedly shared that when they involved stakeholders in the design of new policies, those stakeholders were more invested and more likely to follow the new laws and policies, and the policies themselves were better because of that collaboration.
Give stakeholders time to adjust to new policies, but backstop these changes with
consequences for those who are unwilling to change: Gradual implementation allows time
for prep programs to change course requirements, for districts to purchase new curricula,
and for candidates to study for new licensure tests. But at some point, states have to rely on
policy and on enforcing accountability measures for those who refuse to comply.
When prioritizing limited resources, identify the top needs and start there: States have
found creative ways to stretch their dollars and prioritize where they go. And given
limited resources, all investments should come with evaluation of their impact.
Recognize that literacy starts before kindergarten: While NCTQ’s analysis in this
report focuses on the elementary years, literacy starts at birth—and more states are
recognizing this through their increasing emphasis on early childhood programs.
ACKNOWLEDGEMENTS
Project Leadership
Shannon Holston, Hannah Putman, and Heather Peske
Communications
Ashley Kincaid, Lane Wright, and Hayley Hardison
Design Team
Teal Media
The views expressed are those of the authors and do not necessarily reflect positions or policies of the project funders.
Suggested Citation
Guidelines for considering alternatives to licensure tests
While states have historically used licensure tests to assess teachers’ knowledge before
entering the classroom, many states have loosened their requirements over the last few years.
One way that states have lowered requirements is by offering a choice of different licensure
tests, which often vary in quality. Another way states have done this is by offering other
measures in lieu of licensure tests, such as portfolios, transcript reviews, or completion of a
teacher prep program.
The ultimate goal of any of these measures should be to ensure that every person who
becomes a licensed teacher has a thorough understanding of the science of reading, among
other content and skills, and is prepared to help their students become proficient readers.
Any measure of teachers’ knowledge of reading should be scrutinized for its ability to meet
that goal.
When considering alternative measures, states should answer the following questions:
1 How fully does this measure address the range of knowledge that candidates need?
Without clear guidelines, a portfolio would also allow candidates a great deal
of flexibility in the types of lesson plans, student work, and other evidence
they provide, and would not guarantee coverage of all components of reading.
Similarly, teacher prep programs’ coursework varies in its attention to the science
of reading,25 so using a transcript review or program completion may need to
be coupled with a thorough program approval process that reviews programs’
reading instruction.
The most commonly required reading subtest (the Praxis 5002) is part of an
elementary content test that costs $180 for the first attempt and then $64 to
retake any subtest afterward. Other approaches offer a range of costs. Portfolios
require substantial time but may only require the cost of supplies (and perhaps
not even that if done virtually). Performance assessments tend to be more costly
than content licensure tests; the edTPA costs $300 for the first attempt and then between $100 and $300 for subsequent retakes.
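As a rough worked comparison using the figures above, and assuming a candidate who needs exactly one retake in each case:

\[
\text{Praxis 5002: } \$180 + \$64 = \$244
\qquad
\text{edTPA: } \$300 + (\$100 \text{ to } \$300) = \$400 \text{ to } \$600
\]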
A measure of teachers’ knowledge should meet the basic properties of being valid
(measuring the constructs that it intends to measure, in this case knowledge of
the science of reading, rather than creativity, classroom management, etc.) and
reliable (scoring is consistent over time, between raters, etc.). These expectations
have been described in more detail by the Council for the Accreditation of
Educator Preparation (CAEP)’s criteria for assessments used by teacher prep
programs,26 but the same principles should apply to measures states use as
well. Some forms of assessments, such as portfolio reviews, may be difficult to
standardize so that they are valid and reliable measures. Others, such as formal
performance assessments, may face reliability challenges despite being run
by testing companies. For example, the edTPA has been critiqued as having
insufficient reliability in its scoring process.27
Guidelines for the evaluation and scoring of any qualitative materials, such as
essays, videos of instruction, or portfolios, should include specific “look-fors”
for each standard and clear guidelines about what happens if those are absent or
inadequate in the portfolio.
ENDNOTES
1. RMC Research Corporation. (2023). Florida teacher preparation programs: 2022 annual program performance reports summary and analysis. Florida Department of Education. https://www.fldoe.org/core/fileparse.php/7502/urlt/23AnnTeachPrepReport.pdf
3. Council for the Accreditation of Educator Preparation. (2021). CAEP 2018 K-6 elementary teacher preparation standards: Initial licensure programs. https://caepnet.org/~/media/Files/caep/standards/2018-caep-k-6-elementary-teacher-prepara.pdf?la=en
4. Association for Advancing Quality in Educator Preparation. (2023). Guide to AAQEP accreditation (p. 17). https://aaqep.org/files/2023%20Guide%20to%20AAQEP%20Accreditation.pdf
5. Find the full set of Michigan standards for preparation of teachers of lower elementary (PK-3) education here: https://www.michigan.gov/-/media/Project/Websites/mde/educator_services/prep/standards/approved_lower_elementary_pk3_education_preparation_standards.pdf?rev=19ca40ad8ac548aaa85bd0dd6595f96e
8. While generally quite strong, one drawback of these competencies is that they include running records among potential informal assessments. NCTQ encourages the use of more reliable progress monitoring tools, such as an oral reading fluency test, even when used for informal assessments.
9. Past research has found that teacher prep faculty have an uneven understanding of SBRI (Joshi, R. M., & Hougen, M. (2012). Peter effect in the preparation of reading teachers. Scientific Studies of Reading, 16(6), 526-536; Joshi, R. M., Binks, E., Hougen, M., Dahlgren, M. E., Ocker-Dean, E., & Smith, D. L. (2009). Why elementary teachers might be inadequately prepared to teach reading. Journal of Learning Disabilities, 42(5), 392-402; Kurtz, H., Lloyd, S., Harwin, A., Chen, V., & Furuya, Y. (2020). Early reading instruction: Results of a national survey. Editorial Projects in Education.), and a more recent survey by EdWeek similarly found that some reading faculty in teacher prep programs hold misconceptions about reading. For example, this survey found that 65% of postsecondary instructors still teach three-cueing, a discredited process, while only 57% of instructors would first tell a student to sound out a word that they don't know. For this reason, several states have offered reading training to faculty as part of their efforts to improve literacy instruction.
10. Ellis, C., Holston, S., Drake, G., Putman, H., Swisher, A., & Peske, H. (2023). Teacher Prep Review: Strengthening elementary reading instruction. Washington, DC: National Council on Teacher Quality. https://www.nctq.org/review/standard/Reading-Foundations
11. Kane, T. J. (2016). Never judge a book by its cover - use student achievement instead. Brookings. https://www.brookings.edu/articles/never-judge-a-book-by-its-cover-use-student-achievement-instead/
12. A national survey found that several curricula that teach content contrary to research-based practices, including Fountas & Pinnell and Units of Study, were also among the most popular. [EdWeek Research Center. (2020). Early reading instruction: Results of a national survey. https://epe.brightspotcdn.com/1b/80/706eba6246599174b0199ac1f3b5/ed-week-reading-instruction-survey-report-final-1.24.20.pdf]. Evaluations of these programs are available from EdReports at https://www.edreports.org/.
13. EdReports approaches its reviews from the perspective of alignment and fidelity to College and Career Ready standards, with reviews specific to each grade level. On the topic of early reading, the What Works Clearinghouse (WWC)—part of the Institute of Education Sciences (IES)—gives effectiveness ratings to interventions, such as reading programs, based on the number of high-quality studies done on those interventions and the findings from those studies. WWC intervention reports document the cost of commercial products, if known, but do not discuss their alignment with scientific research on early reading.
14. Doan, S. & Shapiro, A. (2023). Do teachers think their instructional materials are appropriately challenging for their students? Findings from the 2023 American Instructional Resources Survey. RAND Corporation. https://www.rand.org/pubs/research_reports/RRA134-21.html#:~:text=Three%20in%20ten%20K%E2%80%9312,the%20majority%20of%20their%20students.
15. EdReports approaches its reviews from the perspective of alignment and fidelity to College and Career Ready standards, with reviews specific to each grade level. On the topic of early reading, EdReports considers a program's adherence to foundational skills as well as the capacity of materials to build knowledge in young readers, a reasonable proxy for efficiency and effectiveness. EdReports also recently added a new "science of reading snapshot." This review does not consider cost or the academic outcomes reported by various studies.
16. The What Works Clearinghouse (WWC)—part of the Institute of Education Sciences (IES)—
gives effectiveness ratings to interventions, such as reading programs, based on the number of
high-quality studies done on those interventions and the findings from those studies. WWC
intervention reports document the cost of commercial products, if known, but do not discuss
their alignment with scientific research on early reading.
17. For over a decade, NCTQ's Teacher Prep Review has consistently found that prep programs are inconsistent and often lacking in their attention to SBRI. For the most recent report, see Ellis, C., Holston, S., Drake, G., Putman, H., Swisher, A., & Peske, H. (2023). Teacher Prep Review: Strengthening elementary reading instruction. Washington, DC: National Council on Teacher Quality. https://www.nctq.org/review/standard/Reading-Foundations
A 2019 EdWeek survey found that more than a quarter of K-2 and elementary teachers mistakenly thought that sight word recognition was one of the five components of reading identified by the National Reading Panel, nearly 20% of teachers could not correctly identify how many phonemes are in the word "shape," about 80% of teachers had the misconception that skilled readers rely on context and visual cues to know what a word says (whereas in reality skilled readers are far more likely to sound out words), and 35% of teachers entered the classroom feeling "somewhat" or "completely unprepared." EdWeek Research Center. (2020). Early reading instruction: Results of a national survey. https://www.edweek.org/research-center/research-center-reports/early-reading-instruction-results-of-a-national-survey
18. Torgesen describes this finding in Torgesen, 2004. Specifically, the analyses he describes were based on the proportion of students reaching the "low average level" of word reading skills by second grade. While word reading is not the same as reading comprehension, it is a necessary precursor to comprehension, and measures of word reading fluency (and gains in that fluency) are predictive of broader student reading performance (Smith, J. L. M., Cummings, K. D., Nese, J. F., Alonzo, J., Fien, H., & Baker, S. K. (2014). The relation of word reading fluency initial level and gains with reading outcomes. School Psychology Review, 43(1), 30-40.). For more on studies finding that 90% or more of students can read with proper instruction, see: Torgesen, J. K. (2004). Preventing early reading failure. American Educator, 28(3), 6-9; Torgesen, J. K. (1998). Catch them before they fall: Identification and assessment to prevent reading failure in young children. American Educator, 22(1-2), 32-39. www.aft.org/sites/default/files/periodicals/torgesen.pdf; Lyon, G. R. (1998). Overview of reading and literacy initiatives (Report to Committee on Labor and Human Resources, U.S. Senate). Bethesda, MD: National Institute of Child Health and Human Development, National Institute of Health. https://files.eric.ed.gov/fulltext/ED444128.
19. For one cost estimate of reading interventions, see Shrestha, P., Tracy, T., Mazal, M., Blakeney, A., Kennedy, N., & May, H. (2022). A cost analysis of Reading Recovery and alternate interventions under the i3 Scale-Up. Paper presented at the Annual Conference of the American Education Research Association (AERA). https://drive.google.com/drive/folders/1R4eZlidReG-1zFA4LKL9nX9sPbkM-t0q
20. While the District of Columbia is not a state, for the purposes of this analysis, we refer to D.C. or the Office of the State Superintendent of Education (OSSE) as the "state," to distinguish it from DC's traditional public school district, District of Columbia Public Schools, or DCPS. OSSE oversees both the traditional public school district, DCPS, as well as a large number of charter school local education agencies, which teach about half of public school children across the city.
21. National Center for Education Statistics. (2022). State average scores. https://www.nationsreportcard.gov/reading/states/scores/?grade=4
22. Olson, L. (2023). The reading revolution: How states are scaling literacy reform. FutureEd. https://www.future-ed.org/wp-content/uploads/2023/06/The-Reading-Revolution.pdf
24. Folsom, J. S., Smith, K. G., Burk, K., & Oakley, N. (2017). Educator outcomes associated with implementation of Mississippi's K-3 early literacy professional development initiative. REL 2017-270. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. https://ies.ed.gov/ncee/edlabs/regions/southeast/pdf/REL_2017270.pdf
25. Ellis, C., Holston, S., Drake, G., Putman, H., Swisher, A., & Peske, H. (2023). Teacher Prep Review: Strengthening elementary reading instruction. Washington, DC: National Council on Teacher Quality. https://www.nctq.org/review/standard/Reading-Foundations
26. Council for the Accreditation of Educator Preparation. (2017). CAEP evaluation framework for EPP-created assessments. Washington, DC: CAEP. Retrieved from http://caepnet.org/~/media/Files/caep/accreditation-resources/caep-assessment-tool.pdf?la=en; Regional Educational Laboratory at Marzano Research. (2019). Examining the reliability and validity of teacher candidate evaluation instruments [PowerPoint slides]. REL Central. Retrieved from https://ies.ed.gov/ncee/edlabs/regions/central/pdf/slides-reliability-and-validity.pdf
27. SCALE and Pearson refuted the concerns in a response: Stanford Center for Assessment, Learning and Equity. (2019). Affirming the validity and reliability of edTPA: A response authored by the Stanford Center for Assessment, Learning, and Equity (SCALE) and Pearson. Retrieved from https://cga.ct.gov/ed/tfs/10000001_Archived%20-%20edTPA/20200115/Chair%20Alfano%20Report/Affirming-Validity-and-Reliability-of-edTPA.pdf. In a follow-up article, Gitomer, Martínez, and Battey provided some additional context for their concerns, reiterated the limitations of the edTPA, and raised some new concerns about the Technical Advisory Committee. Gitomer, D. H., Martínez, J. F., & Battey, D. (2021). Who's assessing the assessment? The cautionary tale of the edTPA. Phi Delta Kappan, 102(6), 38-43. Retrieved from https://kappanonline.org/whos-assessing-assessment-cautionary-tale-edtpa-gitomer-martinez-battey/. An annual edTPA Administrative Report set to be released in summer 2021 intends to support the test's validity and reliability.
28. For example, a review of the edTPA found that Hispanic test takers systemically scored lower than white test takers. Goldhaber, D., Cowan, J., & Theobald, R. (2017). Evaluating prospective teachers: Testing the predictive validity of the edTPA. Journal of Teacher Education, 68(4), 377-393. Another study found that differences in scores between Black and white test takers were growing over time. Petchauer, E., Bowe, A. G., & Wilson, J. (2018). Winter is coming: Forecasting the impact of edTPA on Black teachers and teachers of color. The Urban Review, 50(2), 323-343. However, disparities in scores do not necessarily indicate bias in the instrument.