District Data Toolkit
MODULES
Introduction
Getting Ready
Inquiry
Information
Knowledge
Action
Results
INTRODUCTION
Welcome to the District Data Team Toolkit. This Toolkit is designed to help a
district establish, grow, and maintain a culture of inquiry and data use that can
inform decisions that impact teaching and learning, and ultimately improve the
achievement of all students. This short introduction will help you understand and
navigate the tools and resources available to support this work.
These activities can help build the capacity of a District Data Team to engage in
inquiry and use data to inform district-level decisions. Over time, the Team can
engage the entire staff in using multiple data sources to continuously improve
teaching and learning throughout the district. Districts that engage with the
Toolkit should plan for a multi-year commitment to increase and embed a
capacity for effective data use.
The Toolkit is designed around a theory of action, the Data-Driven Inquiry and
Action Cycle (see diagram on the next page), which provides a foundation for
effective data use. The Cycle provides a structure that takes data use from
asking the right questions to getting results. It is an iterative process in which the
district uses data to target and support continuous improvement. A disciplined
application of this kind of data-driven approach can build a district and school
environment that is focused on continuous improvement grounded in evidence.
This Cycle is also the basis for the ESE Education Data Warehouse trainings,
which further provide excellent tools to access and analyze data. But analyzing
data alone will not result in continuous improvement. Concrete actions that are
grounded in evidence and continually monitored through the collection and
analysis of multiple forms of data are critical to achieve improved results.
The ESE District Data Team Toolkit can help district staff:
Engaging with this Toolkit can help a district identify and/or refine a focus for
improvement, including determining if current improvement efforts are having the
desired effect on student learning outcomes. For example, a district may frame
an inquiry process around one aspect of an existing District Improvement Plan as
a means to delve deeply into questions about the impact of the related initiatives.
Once a District Data Team has built its own capacity for data use and a culture of
inquiry, it will be better poised to support such efforts with principals, teachers,
and other stakeholders in the district.
6. The Results module shares methods for monitoring the work, evaluating the
impact, making mid-course corrections if necessary, and communicating
the outcomes to stakeholders.
Each of the modules provides specific tools and activities to implement the steps
of the inquiry process. Some tools are best used electronically. It is important to
understand, however, that superimposing a process does not necessarily yield a
positive result. A district must be mindful of doing what it can to embed a culture
of inquiry and data use that goes beyond technical compliance with processes.
Tools are templates, protocols, organizers, or other items that the Team
will work with to build its knowledge and expertise.
district in engaging with ones that will be most useful to its work.
If in doubt, a district might gain the most value from starting in Module 1: Getting
Ready and working through the Toolkit sequentially, committing to a multi-year
process of building robust use of data at the district level. If the District Data
Team has been in existence for several years and needs to work on refining
processes and policies that support data use, it may find it useful to go directly to
certain tools in the Toolkit.
This self-assessment can help a district determine its strengths and needs,
and how best to use this Toolkit to support inquiry and data use.
(0.2.1T: District Data Team Self-Assessment)
Many thanks to all the individuals who contributed to the creation of this Toolkit.
For more information on this and other district support resources, or to share
feedback on this tool, visit http://www.doe.mass.edu/sda/ucd/ or email
districtassist@doe.mass.edu.
Module 2 (Inquiry) will help a District Data Team use the above roles and vision to:
Formulate questions to drive an inquiry process
Create and present effective data displays and data overviews
Identify the data needed to answer the questions
Module 3 (Information) will help a District Data Team use the above questions and data to:
Collect and organize data relevant to the inquiry process
Distinguish between observations and inferences
Make inferences from multiple sources of data
Module 4 (Knowledge) will help a District Data Team use the inferences generated above to:
Clearly articulate a problem statement
Identify and explore root causes of the problem
Cross-reference solutions with research and local knowledge
Begin to capture information on the district’s improvement efforts
Module 5 (Action) will help a District Data Team use the knowledge generated above to:
Craft a logic model or theory of action to guide subsequent action and evaluation
Articulate meaningful measures of implementation and change
Develop action plans, if necessary, to implement new strategies or to implement existing
strategies more effectively
Module 6 (Results) will help a District Data Team use the action plan generated above to:
Decide what to evaluate
Develop an evaluation plan
Analyze evaluation data
Identify and develop a communication strategy
Continue the process of inquiry
Module 0: Introduction
0.1.1R: Objectives for All Modules
0.1.2R: Tools and Resources for All Modules
0.2.1T: District Data Team Self-Assessment
4.2.2T: 20 Reasons
4.3.2R: Educational Research Websites
For more information on this and other district support resources, or to share feedback on these tools,
visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Background: This tool is designed to give a District Data Team an indication of its strengths and
challenges in a variety of areas related to promoting a district-wide culture of inquiry and data use. The
self-assessment comprises six short surveys, each aligned to one of the six modules in the
Toolkit and to one of the six steps in the Data-Driven Inquiry and Action Cycle. Each survey has a number of
selected-response questions grouped by Data Team practice. The possible responses are described in
the rubric below.
1. Print each page of this self-assessment (including this page so that the rubric is readily available to
anyone taking the survey) and provide a full copy to each member of the group.
2. Individually complete the survey, assigning a rating from the rubric below to each indicator.
3. As a group, discuss each page of the survey and agree on a rating for each indicator. It is not
necessarily best to average the individual scores to get this final rating. If responses among individuals
vary widely, engaging in a discussion about which rating best represents the level of practice can help
the Team begin the hard work of developing a common understanding of the work.
4. Enter the final rating for each indicator into the spreadsheet version of this survey.
5. Print out the Graphs page (or use a projector to display it on the wall), and as a group talk through the
discussion questions for each graphical display.
1 No Evidence: There is no evidence that this indicator is in place within the district.
2 Emerging Evidence: There is some evidence of this indicator in the district, but the evidence indicates that the practice is far from standard procedure and has clear room for improvement in both quality and frequency.
3 Adequate Evidence: This indicator has clear evidence of existence in the district and is consistently practiced in many places. There is room for improvement in either quality or frequency.
A radar chart, also known as a spider chart or a star chart because of its appearance, plots the values of
each category along a separate axis that starts in the center of the chart and ends on the outer ring. In the
example below, each step of the Data-Driven Inquiry and Action Cycle is plotted on an axis. This makes it
possible to compare your survey results across the steps. When your surveys are complete, your results
will be displayed in a ring plotted on the chart. A district performing consistently across all of the steps will
be displayed in a near circle. A district performing higher in some steps than others will be displayed in a
more free form shape.
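For readers who want to reproduce the chart outside the spreadsheet, the geometry is straightforward. The sketch below is hypothetical illustration, not part of the Toolkit: it computes where a ring of six step ratings would be plotted, one evenly spaced axis per step.

```python
import math

# Illustrative sketch (not part of the Toolkit): compute the vertices of
# the ring a radar chart draws from one rating per Cycle step. Each step
# gets its own axis, spaced evenly around the circle; the rating is the
# distance from the center along that axis.

STEPS = ["Getting Ready", "Inquiry", "Information", "Knowledge", "Action", "Results"]

def radar_points(ratings):
    """Map one rating per step to the (x, y) vertices of the plotted ring."""
    n = len(ratings)
    points = []
    for i, rating in enumerate(ratings):
        angle = math.pi / 2 - 2 * math.pi * i / n  # start at the top, go clockwise
        points.append((rating * math.cos(angle), rating * math.sin(angle)))
    return points

# A district rated 2 on every step plots as a regular hexagon (a near circle);
# uneven ratings produce a more free-form shape.
ring = radar_points([2, 2, 2, 2, 2, 2])
```

A plotting library such as matplotlib can draw the same ring directly on a polar axis; the point here is only that equal ratings land equidistant from the center, which is why consistent performance reads as a near circle.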
The horizontal bar charts will display your Team's perceptions of strength within each step. The questions
in each step are grouped by practice. The chart displays the averages of the responses within each step,
allowing you to view a disaggregated depiction of performance. When viewing these charts, you may find
it valuable to go to the modules themselves to find the tools, resources, and activities that contribute to the
effective implementation of that practice. Each of the practices surveyed in this instrument is supported
in the modules.
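The averages behind those bar charts are simple means of the agreed-on indicator ratings within each practice. A minimal sketch follows; the function and data names are hypothetical, not part of the Toolkit.

```python
# Illustrative sketch (hypothetical names): compute the practice averages
# that feed a "Strengths by Practice" bar chart. Ratings use the survey
# rubric scale, e.g., 1 = No Evidence through 3 = Adequate Evidence.

def practice_averages(ratings_by_practice):
    """Return the mean rating for each practice, rounded to one decimal place."""
    return {
        practice: round(sum(ratings) / len(ratings), 1)
        for practice, ratings in ratings_by_practice.items()
    }

# Example: one Team's agreed-on ratings for two practices.
getting_ready = {
    "Vision for Data Use": [2, 1],
    "Data Displays": [3],
}

print(practice_averages(getting_ready))  # {'Vision for Data Use': 1.5, 'Data Displays': 3.0}
```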
Getting Ready
Strengths by Practice
The district’s vision for data use is widely understood and accepted by all stakeholders.
The vision is supported by district policies and published expectations that support the use of inquiry and data for instructional, school, and district improvement.

Data Displays
The IT staff and District Data Team create user-friendly data displays (charts, tables, and reports) that facilitate meaningful conversations and promote new insights on the work of the district in service of teaching and learning.

Data analysis at the district level is conducted collaboratively within and among departmental teams.
Data analysis results in the identification of specific problems or questions that need to be addressed, e.g., problems at the student level, classroom level, school level, or district level.

Root cause analysis helps the Team decide on the one potential factor that, if addressed, would eliminate or dramatically alleviate the problem.
Potential root causes and proposed solutions are also investigated through the consultation of local knowledge or expertise to construct strong inferences about possible solutions/action steps.
Potential root causes and proposed solutions are also investigated through the consultation of information on programs and practices (including data on instruction) to construct strong inferences about possible solutions/action steps.
The District Data Team encourages collection, dissemination, and active use of one or more forms of documentation of lessons learned and promising practices from improvement efforts in a library of local knowledge.

Action plans identify the available resources necessary to carry out the action steps.
The district can justify to stakeholders how it uses resources to achieve desired outcomes.
District personnel can articulate the district's program goals.
District personnel can articulate their role in achieving program goals.

The district uses the results of program evaluations to inform the development of new programs.
The district has a process for codifying best practices at the district, school, or classroom level.
The process for communicating results creates opportunities to solicit feedback to inform the development of new focusing questions.
Guiding Questions
What observations do you have as you view the data displays?
What additional information do you gain when looking at the responses to indicators in the survey?
Strengths by Steps
[Radar chart: survey results plotted on one axis per step of the Data-Driven Inquiry and Action Cycle (Getting Ready, Inquiry, Information, Knowledge, Action, Results), on a 0.0 to 4.0 scale]
[Horizontal bar charts: Strengths by Practice for each step (Getting Ready, Inquiry, Information, Knowledge, Action, and Results)]
Table of Contents
Introduction
Where Are We Now?
Module Objectives
Taking Stock
Taking Stock of Current Data and Processes
Types of Data That Inform Inquiry
Data Inventory
Data Collection
Data Dissemination and Access
Data Literacy
Managing the Change Process
Why Is Change Management Necessary?
Module Summary
MODULE 1: GETTING READY
MODULE OBJECTIVES
The Getting Ready module will help a district:
disseminate data
This activity will help a district begin thinking about the role(s) the
District Data Team will fill and who should serve on the Team.
BUILDING A CULTURE OF INQUIRY AND DATA USE
To make the most of the data available within a district, there must be
something more. In highly successful data use initiatives, there is a
cultural shift that causes people to want to work differently, where teams
of educators will meet regularly to analyze data, ask questions, and dig
deeply to understand and fix problems. In all cases, there is a process
that drives this kind of work and collaboration.
Having a culture of inquiry means having people within a district who are
regularly asking questions about what all students should know and be
able to do, how best to teach content and skills, and what student
demonstrations will be acceptable ways to measure learning. The
leadership that a District Data Team can provide is central to creating this
district-wide culture of inquiry.
The modules in this Toolkit will help a district establish or enhance its
District Data Team, as well as build the foundations to create a culture of
inquiry and data use. One key to creating this culture is to understand
what might be getting in the way of the district developing a thriving
culture of inquiry and data use. Debra Ingram (2004)1 and others
uncovered seven barriers to the use of data to improve practice:
Cultural Barriers:
Technical Barriers:
Political Barriers:
Use this activity to begin thinking about the challenges the Team will
address to improve data use in the district.
The district likely has a mission statement that answers the question,
“Why do we exist?” and serves as a clear statement of purpose for
everyone in the district. At its core, the statement puts a stake in the
ground and declares why you exist—to educate children.
A district vision statement for data use should derive from the district’s
overarching mission and vision. It will have a slightly different tone that
focuses on data use, while still connecting in some way to improving
performance, taking action, or doing things differently than they have been
done in the past. The vision statement should define the future so it can
serve as a guidepost for all data use efforts.
If there are people anywhere in the district who aren’t sure of what data
are available, what actions data should inform, or why certain data are
even collected, this may be a sign that the district lacks a clear and
shared vision for data use. Consider for a moment the data use in your
own district.
How prepared are principals to use inquiry and data to inform their
own work?
How prepared are principals to lead their staff and teachers in
inquiry and data use? To what extent do they actually do this?
How prepared are staff and teachers to use inquiry and data to
inform their work? To what extent are they actually engaged in
data use?
To what extent can principals, teachers, and others in the district
articulate how data inform their practice and further the district’s
mission for educating its students?
Open the Vision for Data Use document and complete the activities to
either assess the district’s existing vision for data use, or craft a new
one.
Using the results of 1.1.1T: Functions of a District Data Team and 1.3.1T:
Vision for Data Use as a guide, identify the departments and people who
will be essential in helping the District Data Team fulfill all five key
functions necessary to support data use in the district, which were
previously noted in this module.
The Team must also have a data manager who is in charge of the more
technical aspects of the work, such as:
It is also critical that the superintendent shows support for the inquiry
process and the work of the District Data Team by modeling data use and
visibly responding to the needs of the Team.
Core Data Team members will be determined within the context of the
local setting, but could include:
Subject-area directors
Principals
Grants director
Union leadership
The Norm Setting Protocol will help the Team articulate and agree on
ways of working together in order to foster risk-taking and effective
communication during tricky conversations. The templates provide
models for agendas and for capturing meeting minutes, in order to ensure
productivity and high-quality communication during and after meetings.
TAKING STOCK
Any data team, but a District Data Team in particular, has a responsibility
to consider data from multiple sources in order to gain an understanding
of the quality of work being done in service of teaching and learning—not
only in the classrooms, but in all areas of the district. This may involve
comparing different forms of the same type of data, or the Team may
compare two entirely different types of data.
The graphic below, based on the work of Victoria Bernhardt4, outlines four
primary domains of data: student outcomes, demographics, perceptions,
and school (or district) processes. This lens highlights the fact that
student achievement data provide only one view on the work of a district.
The Data Team must also analyze data related to processes such as
hiring, procurement, and even facilities maintenance, or perceptions of
stakeholders, in order to gain new insight on the supports needed from
the district to take teaching and learning to the next level. This may mean
looking for data in forms other than numbers that can be easily counted,
and also considering data generated by what one sees (such as through
Learning Walkthrough site visits) or hears (such as through stakeholder
surveys and focus groups).
This diagram also describes the interaction of data from the four primary
domains and the kinds of inferences that can be drawn from the
intersections.
It is important to note that of these four domains, only one can be directly
modified by a District Data Team (or anyone else, for that matter), and
that is processes. It is only by changing the way adults interact and
conduct business that a district can hope to shift the evidence it sees in
the realms of demographics, perceptions, and student outcomes.
DATA INVENTORY
With this lens of the four domains, the District Data Team can inventory
the data available in the district, when they are available, how readily they
can be accessed by the Team for consideration in the inquiry process,
and how they are being used in service of teaching and learning.
Completing this inventory serves multiple functions. It can help a district:
This activity will help determine current availability and use of data and
DATA COLLECTION
For the available data to further the inquiry process, they must be
complete, accurate, and timely. Collection and distribution tools and
processes need to be efficient and effective to ensure that these criteria
are met.
Understand and are invested in what the data will be used for
Understand how the data they collect will be integrated into other
systems
Participate in the creation of and agree to the use of a common
Data Collection Practices handbook
Are adequately trained to complete the task
Have appropriate tools to support the collection process
Work in an environment free from distraction
Are provided the time to collect the data and ensure the data’s
integrity
Without this support, it is highly likely that the district will not get valid
information, which in turn would detract from its ability to make quality
evidence-based decisions.
The District Data Team can contribute to the effective collection and
distribution of data by continually monitoring the needs of the district; the
effectiveness of the tools in place for data collection, storage, and
dissemination; and the training of those who are responsible for data
collection and input. One of the most important things that members of
the Team can do is listen and respond to the needs of the staff in charge
of the data collection process.
However, the Team must also pay close attention to who is given access
to what data, and why. Federal, state, and local regulations determine
who can have access to personally identifiable data. The Massachusetts
Department of Elementary and Secondary Education has published
guidelines regarding access to data that comply with these regulations.
Each district also has privacy policies to inform decisions regarding
access to data. Specific access guidelines have been developed by the
ESE for the ESE Education Data Warehouse. These guidelines can serve
as a model for the development or critique of locally developed guidelines
for data access.
Beyond compliance with federal, state, and local data access regulations,
the District Data Team must consider the logistics involved in providing
appropriate data in a user-friendly and timely manner to those who need
it. Faithful use of the data dissemination schedule will ensure that a large
segment of the community will be provided with the data that it needs.
Some may have access to data through the student information system,
while others will gain access through use of the ESE Education Data
Warehouse. It is important for the District Data Team to be sensitive to
the data needs of the district as a culture of systemic data use evolves,
and to act to meet those needs.
The data dissemination activity can help a District Data Team construct
and publish a schedule for the distribution and use of major data
elements.
DATA LITERACY
To effectively use the data available to them, principals, teachers, district-
level staff, and the community need certain knowledge and skills. It is
particularly important that the members of the District Data Team have
competencies in data and assessment literacy. Additionally, each of these
stakeholders needs to develop a shared understanding of the purposes
and uses of various data as they pertain to their roles in serving students.
Stakeholders must understand what data to use when, the uses and limits
of specific assessments, ways to interpret and use the various reports
produced by those assessments, and specific statistical terminology and
calculations used in those reports.
For each standardized assessment used in the district, there are unique
details about test and item construction that must be communicated to
and understood by all consumers of the test. This includes teachers,
principals, and other staff as they analyze results in preparation to take
action, as well as parents and students as they receive reports designed
to inform them of specific areas of strength, challenge, and progress
toward attaining proficiency in core curriculum standards.
These resources will help a District Data Team develop its assessment
literacy, as well as that of other stakeholders in the district.
The challenge is to implement the change while also managing the
change process. When introducing or enhancing a cultural norm of a truly
collaborative learning community—one where all members regularly ask
questions about their practice and what more can be done in service of
student learning and achievement—the Team must pay attention to the
human element, the students and adults who are being asked to
approach work differently in order to achieve new outcomes.
A District Data Team can use the following framework as a guide when
introducing the inquiry process district-wide. The guidelines suggest steps
that the Team can follow to support school-level data teams and other
teams within the district as it initiates the first steps in the collaborative
Data-Driven Inquiry and Action Cycle.
Build Awareness
Understand Concerns
Talk openly with staff at all levels in the district about stress they
may experience as change is implemented
Actively listen: solicit and act upon the concerns of staff members
to facilitate the change process
Acknowledge losses that people may feel as they shift established
habits and approach their work in new ways
Build Capacity
Develop a broad base of support among all stakeholders
Lead a discussion of how the vision can be realized through the
action of school-level teams
Involve staff in collaborative and objective analysis of data to
answer the high-interest questions that they have developed
Help schools and district offices form data teams
Provide support to school-level teams as they utilize the resources
of the Toolkit
Provide professional development to help district personnel build
assessment literacy and use relevant data warehouses
Assist schools as they learn to prepare local data for upload to
centralized data warehouses
Provide professional development activities to build assessment
literacy
Celebrate Success
Positively reinforce movement toward desired goals for a culture
of inquiry and data use as well as improved student achievement
Over time, with patience, perseverance, and strategic action, the District
Data Team can help the district as a whole establish and/or enhance a
cultural norm in which inquiry and data use is a regular part of everyone’s
work, where data are regarded as impartial evidence that can spark a
question, trigger an idea, or measure a result.
This protocol can help a District Data Team gain a better understanding
about the concerns of stakeholders as it engages in this work.
MODULE SUMMARY
This module explores the roles and functions of a District Data Team to
set the course for data use in the district and support the establishment of
a culture of inquiry. It discusses the value of a vision statement for data
use and provides guidance on how to create or refine one. Setting and
communicating the vision for how the district will use data to make
decisions is key to success with the inquiry process outlined in the
remainder of the Toolkit’s modules.
If you have not yet done so, consider administering the District Data
Team Self-Assessment at this time (0.2.1T). It will help the Team identify
its strengths and challenges related to an inquiry process, providing
guidance on how to use the remaining resources in the Toolkit.
REFERENCES
1. Ingram, D. S. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.
2. Curtis, R. E., & City, E. A. (2009). Vision: Keeping the end in mind. Chapter 4 in Strategy in Action. Cambridge, MA: Harvard Education Press.
3. http://www.merriam-webster.com/dictionary/data (December 24, 2009)
4. Bernhardt, V. L. (2004). Data Analysis for Continuous School Improvement. Larchmont, NY: Eye on Education.
5. Sagor, R. (1992). How to Conduct Collaborative Action Research. Alexandria, VA: Association for Supervision and Curriculum Development (ASCD).
For more information on this and other district support resources, or to share feedback on
this tool, visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Purpose: To understand the role and functions your District Data Team fulfills to support a culture of data use.
Related Documents: 1–Getting Ready Module
Description: Team members will review the functions of a District Data Team and think specifically about how these tasks are accomplished within its district. The Team will also identify gaps that might exist on the Team and begin thinking about how to address them.
Time: 45 minutes to an hour.
STEP 1) Use the table below to brainstorm the specific tasks already being performed in the district within each function. Allow individual think/work time before sharing and charting everyone’s ideas. (See Module 1: Getting Ready for explanations of the five functions.)
1) In which functional area(s) is the district performing particularly well? What is the
evidence?
2) Which function is currently the district’s biggest challenge? What is the evidence? What
is getting in the way of success in those areas?
3) What key tasks, if any, are missing from the list of tasks currently being performed?
4) What tasks are the greatest priority for the coming year, given the district’s strategic
priorities and improvement plan?
6) Which tasks are currently being performed exclusively within one department? (Note
which departments).
7) If the District Data Team is going to fill all of these key tasks and functions, whom does
the Team need to have as members? (Identify by name and/or role).
Purpose: To identify barriers or problems your Team might face regarding data use.
Related Documents: 1–Getting Ready Module
Description: The District Data Team will make a list of possible barriers or problems that might slow its progress. You will also begin to think about solutions to them.
Time: 30 minutes.
As a team, brainstorm a list of the barriers the district currently faces in creating and/or maintaining a culture of inquiry that is
embedded in everyone’s work. Try to identify a range that includes cultural, technical, and political barriers. Then identify the
barriers that are most significant, the ones that, if addressed, would result in the greatest shift toward an embedded culture of
inquiry. For each of these prioritized barriers, identify possible strategies and people who can help address them. Try to think
outside the box in identifying these people, looking beyond titles and positions. Keep this list accessible as the Team works
through this module and the rest of the Toolkit. These barriers will become areas of focus for the District Data Team.
Barrier | Cultural, Technical, or Political? | Possible Strategy to Address | People to Involve in the Solution
Purpose: To develop a shared vision for data use that will guide the District Data Team’s work.
Related Documents: 1–Getting Ready Module
Description: Team members will develop a shared vision for data use in the district and craft a vision statement to drive the Team’s work.
Time: About 2 hours. (Can be done in two blocks.)
If the district already has a vision statement that incorporates data use, locate it and use this
guide to assess and revise it if necessary. If there is no reference to data use in the existing
district vision, use this guide to draft a vision statement to guide the work of the Team.
If the Team has completed 1.2.1T: Barriers to Effective Data Use, it may want to have those
notes available for reference.
1. Let the Team know that the purpose of the activity is to begin to articulate a vision for the
work of the Team. The actual writing of the vision statement will come later.
2. Provide each person on the Team with a large note card or sticky note and instruct them to
individually write a vision for data use in the district. Guide them by asking a specific
question such as:
If this team were to be successful in promoting data use, what would that look like?
What do we want the future of data use in the district to look like?
You might also ask members to consider:
Provide about 5 minutes of silent work time and let them know that responses will be shared.
3. Once everyone has had a chance to write his or her vision statement, let the group know
that the next step is to work together to sort the notes in relative order, so that the most
immediate aspirations come early (Vision 1 or 2), while the longer-term aspirations are near
the end of the horizon line (Vision 3 or 4).
In preparation for this, each individual should review what he or she wrote and write each
separate thought onto a different card or sticky note.
[Diagram: a curved horizon line with Vision 1 nearest the person, followed by Vision 2, Vision 3, and Vision 4 farther along it. The curved line represents the future, while under the person’s feet is current reality.]
5. Once all Team members have completed their visions and separated statements onto separate cards (if necessary), they attach their notes to the diagram.
6. Review all the statements, discuss and arrange the notes until all members of the Team are
satisfied with the order.
7. As a Team, review the assembled statements and add any key ideas that seem to be
missing. Also ask if anyone has any questions or concerns about any of the ideas, and
whether anyone would have a hard time getting ‘on board’ with them.
The Team now has a view of a shared strategic focus. The diagram outlines priority areas of
need to be addressed by the District Data Team and is beginning to paint a picture for data
use in the district.
If the district has a vision for data use already written, compare it to the array of ideas the
Team has just created. Determine if the existing vision is in alignment with the shared
strategic focus the Team just developed. If there is not alignment, consider whether it is the
existing vision or the Team’s strategic focus that may need revision.
Note: This process can be modified for use in other settings, such as crafting or revising a
district’s vision for education. The key is to articulate a clear guiding question to focus the initial
work in step 1.
2. Ask each member of the Team to write a statement that incorporates the Team’s shared
strategic focus using the sentence starter as a guide.
3. Record each person’s draft vision statement on chart paper (or an electronic document
displayed with a projector).
4. Review the statements as a Team. Look for opportunities to combine similar ideas and
identify unique ideas.
5. Merge all of the ideas into a clear statement of the district’s vision for data use. The
statement may be multifaceted or bulleted, but it should include the essential elements
of the original sentence starter.
a. A timeframe
b. Accomplishments or goal statements
c. Methods or strategies that will be used to achieve the vision
6. Refine the statement until all members of the Team are satisfied that it captures the
Team’s priorities and vision for data use in the district.
7. Consider the authority with which the District Data Team has been charged. Does the
vision need to be approved by another team? How will this vision be finalized and
communicated to district leadership, schools, and other stakeholders?
Purpose: Tools for launching or supporting the work of a District Data Team.
Description: This protocol will establish the norms under which the District Data Team will operate. Setting norms helps keep unproductive behaviors in check while fostering risk-taking and effective communication during tricky conversations.
Time: 15–30 minutes.
Related Documents: 1–Getting Ready Module; 1.4.2T: Data Team Meeting Agenda; 1.4.3T: Data Team Meeting Minutes
3–5 minutes
Invite people to reflect in writing on the question: “In order to reach our vision, what norms will we need?” Explain that norms are guidelines for an interaction or meeting, and can include both process (e.g., start and end on time) and content (e.g., taking risks with our questions and ideas).
5–10 minutes
Invite people to share norms. It’s sometimes best to do this round-robin style so that you hear
one from each person, and then open it up for other ideas. Record the norms on chart paper
or using a computer and projector. You don’t need to write these exactly as stated—just
capture the idea.
10–20 minutes
Ask if there are any norms people have a question about (sometimes people will ask a
clarifying question about what something means) or couldn’t live with during future meetings.
You may need to rephrase or reframe norms to pose them in a way that everyone is
comfortable with. When everyone seems clear and comfortable with the list, ask if there is
anyone who can’t live with and support these norms.
Note: Norms are only valuable if the Team regularly references them and holds each other
accountable for upholding them. Consider establishing a few rituals to keep the Team’s norms
alive, such as:
Purpose: Tools for launching or supporting the work of a District Data Team.
Description: This template is a good model for meeting agendas that lead to productive meetings.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.4.1T: Norm Setting Protocol; 1.4.3T: Data Team Meeting Minutes
Location:
Meeting Date:
Agenda
Resources
Data Team Norms: (List all norms established and recorded by the Data Team—this list
should appear on all meeting agendas.)
Purpose: Tools for launching or supporting the work of a District Data Team.
Description: To improve the effectiveness and efficiency of quality communication, it is a good idea to capture meeting minutes accurately and efficiently. This template can serve as a good model to follow.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.4.1T: Norm Setting Protocol; 1.4.2T: Data Team Meeting Agenda
Location
Meeting Date
Submitted by (name)
Submitted date
Agenda Item #
Subject
Discussion
Agenda Item #
Subject
Discussion
Purpose: To develop an inventory of currently available data and how the data are being used in service of teaching and learning.
Description: Complete the attached templates to determine current availability and use of data in the district.
Time: 1–2 hours to review template; 1–2 weeks to gather information, with ongoing upkeep.
Related Documents: 1–Getting Ready Module; 1.5.2T: Data Inventory Template: SIMS and EPIMS Data; 1.5.3R: ESE Data Resources
NOTES:
The sections of this Data Inventory align to the four domains of data described in the text of the Getting Ready module:
demographics, district and school processes, stakeholder perceptions, and student outcomes.
1.5.2T has been pre-populated with all the data elements collected for SIMS and EPIMS.
If the Team has completed 1.8.1T: Data Literacy Training Catalog or 2.2.1T: Inventory of District and School Initiatives, it may want to have those documents available for reference.
A Team might want to copy and paste these tables into Excel in order to be able to sort and group the information.
DIRECTIONS:
1. Organize data elements into district-wide and school-based data:
a. District-wide Data is common across all schools, a set of grade-alike schools (e.g. elementary) or at least across a given
population (e.g. all 5th graders, all English Language Learners, or all students who receive free or reduced lunch)
b. School-based Data is that which is not necessarily collected in other schools in the district, such as data a principal decides on his
or her own to collect and use with school personnel, or data for unique programs such as Expanded Learning Time or pilot schools.
2. For each assessment or element, provide the indicated information in the columns to the right.
a. Location/Owner of Data refers to the physical location of the data, e.g., Education Data Warehouse or school paper files.
b. Access refers to the degree to which the data are available to District Data Team members. (1 = hard to access; 4 = easily accessible).
c. Current Data Use describes how the data are used to inform decisions at the district, school, and/or classroom level.
3. Consider involving others in the data collection:
a. Ask personnel such as the SIS data manager, assessment coordinator, or guidance director to contribute information.
b. Consider having each school complete sections A2, B2, C2, and D2, in order to learn what schools collect and how the data are
used.
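Where a Team prefers scripting to spreadsheets, the sort-and-group step suggested in the directions above can be sketched in a few lines of Python. Everything in this sketch is illustrative: the domain names, element names, and access ratings are hypothetical stand-ins for a district’s actual inventory.

```python
# Hypothetical sketch of a Data Inventory kept as plain records, so it can
# be sorted and grouped like the Excel approach the notes suggest.
from itertools import groupby

inventory = [
    # domain, data element, access rating (1 = hard to access; 4 = easily accessible)
    {"domain": "Demographics", "element": "DOE009 - Gender", "access": 4},
    {"domain": "Student Outcomes", "element": "State assessment ELA results", "access": 3},
    {"domain": "Demographics", "element": "DOE019 - Low-Income Status", "access": 2},
]

# Group elements by domain, listing hardest-to-access data first within each
# group, so the Team can see where collection effort is most needed.
inventory.sort(key=lambda r: (r["domain"], r["access"]))
for domain, rows in groupby(inventory, key=lambda r: r["domain"]):
    print(domain)
    for row in rows:
        print(f"  {row['element']} (access: {row['access']})")
```

The same records can be filtered by audience (district-wide vs. school-based) or joined with the training catalog once those inventories are complete.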
Purpose: To develop an inventory of currently available data and how the data are being used in service of teaching and learning.
Description: The attached template is pre-populated with the data elements collected for SIMS and EPIMS. Districts can use this as a starting point to determine current availability and use of this data in the district, in conjunction with other data identified in the district’s Data Inventory.
Time: 1–2 hours to review template; 1–2 weeks to gather information, with ongoing upkeep.
Related Documents: 1–Getting Ready Module; 1.5.1T: Data Inventory Template; 1.5.3R: ESE Data Resources
Notes:
This tool is meant to be used in conjunction with 1.5.1T: Data Inventory Template, which has further guidance and directions.
– The DOExxx series represents data elements from the Student Information Management System (SIMS)
– The IDxx, SRxx, and WAxx series represent data elements from the Education Personnel Information Management System (EPIMS)
For descriptions and more information on these data elements see:
– SIMS Version 2.1 Data Handbook: http://www.doe.mass.edu/infoservices/data/sims/DataHandbook.pdf
– EPIMS Data Handbook: http://www.doe.mass.edu/infoservices/data/epims/
Directions:
1. For each data element identified, confirm the information that is pre-populated, and complete the remaining columns.
a. Location/Owner of Data refers to the physical location of the data, e.g., Education Data Warehouse or school paper files, and/or
the person or department who is responsible for collecting the data and ensuring their quality.
b. Access refers to the degree to which the data are available to District Data Team members. Rate Access on a scale of 1–4
(1 = hard to access; 4 = easily accessible).
c. In the Current Data Use column, describe how the data are currently used to inform district-level decisions. The Team can
decide if it also wants to describe how the data are used to inform decisions at the school or classroom levels.
1.5.2T: Data Inventory Template: SIMS and EPIMS Data—Version 1.0 1/11
<<District Name>> Data Inventory
SECTION A1— Demographic Data: District-wide Measures (SIMS and EPIMS Data)
Data Element | Grade Level(s) | Program/Department/Content Area(s) | Date Available | Location/Owner of Data | Access (1–4) | Current Data Use
(The remaining columns are left blank for the district to complete.)
The DOExxx series represents data elements from the Student Information Management System (SIMS).
DOE001 – Locally Assigned Student Identifier (LASID) | All | All
DOE002 – State Assigned Student Identifier (SASID) | All | All
DOE009 – Gender | All | All
DOE019 – Low-Income Status | All | All
DOE028 – Title I School Choice Participation | All | All
DOE033 – High School Completer Plans | 12 | All
DOE034 – Special Education Placement, ages 6–21 | 1–12 | All
DOE035 – Career/Vocational Technical Education – Type of Program | 6–12 | All
DOE036 – Special Education – Nature of Primary Disability | All | All
DOE037 – Graduate, Completed Massachusetts Core Curriculum | 12 | All
DOE038 – Special Education – Level of Need | All | All
DOE040 – Special Education Evaluation Results | All | All
DOE046 – Number of Out-of-School Suspensions | All | All
DOE052 – Student Truancy | All | All
The IDxx, SRxx, and WAxx series represent data elements from the Education Personnel Information Management System (EPIMS).
ID04 – Staff Date of Birth | All
ID05 – Staff Gender | All
ID06 – License/Certification Number | All
ID07 – Local Employee Number | All
SR01 – Massachusetts Education Personnel Identifier (MEPID) | All
SR09 – Employment Status at Time of Data Collection | All
SR10 – Reason for Exit | All
SR11 – Date of Hire | All
SR12 – Federal Salary Source 1 | All
SR13 – Percent of Federal Salary Source 1 | All
SR14 – Federal Salary Source 2 | All
SR15 – Percent of Federal Salary Source 2 | All
SR16 – Federal Salary Source 3 | All
SR17 – Percent of Federal Salary Source 3 | All
SR18 – Degree Type 1 | All
SR19 – Degree Institution 1 | All
SR20 – Degree Subject 1 | All
SR21 – Degree Type 2 | All
SR22 – Degree Institution 2 | All
SR23 – Degree Subject 2 | All
SR24 – Degree Type 3 | All
SR25 – Degree Institution 3 | All
SR26 – Degree Subject 3 | All
WA01 – Massachusetts Education Personnel Identifier (MEPID) | All
WA06 – District/School Identification Number | All
WA07 – Job Classification | All
WA08 – Teacher/Paraprofessional Assignment | All
WA09 – Grade | All
WA10 – Subject Area-Course Code | All
WA11 – Class Section | All
WA12 – Full Time Equivalent (FTE) (as per DSSR) | All
WA13 – NCLB Instructional Paraprofessional Requirements | All
WA14 – Highly Qualified Teacher Status | All
WA15 – Subject Matter Competency | All
ESE DATA RESOURCES 1.5.3R
Purpose: This group of resources and tools can help a District Data Team understand what data is made available to districts by the Massachusetts Department of Elementary and Secondary Education, what it means, how to access it, and how to use it effectively.
Description: Website links to the most current information on a variety of data sources.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.5.1T: Data Inventory Template; 1.5.2T: Data Inventory Template: SIMS and EPIMS Data
The majority of data available from the ESE resides in one of two locations:
1. Education Data Warehouse: A collaborative effort of ESE and local school districts to centralize K–12 educational performance
data into one state-coordinated data repository hosted by the Department. It contains the SIMS and MCAS data for every district
in the state and will soon contain the EPIMS data for every district in the state. Data are available at the level of the state, district,
group, and individual student. Over 30 reports exist to compare data from individual schools and districts to state totals. After
receiving appropriate training, districts can load local data into the EDW and write their own reports. EDW training materials and
data can be accessed via the security portal. EDW Quick Tips and the EDW User Guide from the Information Service’s
Education Data Warehouse are available for download on the EDW webpage: http://www.doe.mass.edu/infoservices/dw/.
2. School/District Profiles: Sortable data reports on a variety of information, including enrollment, teacher data, and MCAS results.
Data are available at the level of the state, district, and group, but not at the level of individual student. Directories and reports
from individual organizations can also be found here: http://profiles.doe.mass.edu/.
The MA ESE also publishes an annual data collection schedule which includes forms, descriptions of data, technical guidance, and
access information for data that are transmitted from districts to the state:
http://www.doe.mass.edu/infoservices/data/schedule.html.
Data Element and Brief Description | Release Schedule | Source of Data | User Guides and Other Resources

Statistical Reports. The ESE provides statistical reports in the following areas:
– Graduation rates
– Grade retention reports
– Dropout rates
– Educator data
– Enrollment data
– Plans of high school graduates
– Student exclusions
– School and district data reports
– State profile
Release Schedule: Varies
Source of Data: http://finance1.doe.mass.edu/statistics/
User Guides and Other Resources: Links to other ESE web pages with data and additional information on each type of statistical report. Note: The Select Report list for School and District Data Reports has some but not all of the same reports available in the Quick Statewide Reports list on the School and District Data Reports page itself.

Statistical Comparisons. The ESE provides statistical comparisons for districts in the following areas:
– Per pupil expenditure reports
– Enrollment trends
– Average teacher salaries
– Special education direct expenditure trends
– School and district data reports
– State profile
Release Schedule: Varies
Source of Data: See the links under User Guides and Other Resources.
User Guides and Other Resources: Links to data reports and supporting resources. Some data sets allow for easy comparison to similar districts, and can be easily downloaded as Excel files.
Directions
1. The District Data Team should familiarize itself with the self-assessment instrument and delete items as necessary to adapt it to the Team’s local situation.
2. As a team, determine who should participate in the survey process.
3. Distribute the instrument to the target audience.
4. Collect and tabulate the results.
5. Analyze the results to determine the effectiveness of the data collection, storage, and dissemination systems.
6. Recommend changes to improve the system as necessary.
This survey is designed to gather your perception of the efficiency and effectiveness of data collection, storage, and dissemination in
your district. Please share your perceptions by indicating your degree of agreement with the following statements.
Data Storage | Data are added to the student information system in a timely manner. | 1 2 3 4 N/A
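Step 4 of the directions above ("collect and tabulate the results") can be prototyped with a short script. This is a hypothetical sketch, not part of the Toolkit: the item text matches the sample above, but the response values and the `tabulate` helper are invented for illustration, with `None` standing in for an "N/A" response.

```python
# Hypothetical tally for the perceptions survey: responses use the 1-4
# agreement scale, with None standing in for "N/A".
def tabulate(responses):
    """Return (number of rated responses, mean rating) per survey item,
    ignoring N/A responses."""
    summary = {}
    for item, values in responses.items():
        rated = [v for v in values if v is not None]  # drop N/A
        mean = sum(rated) / len(rated) if rated else None
        summary[item] = (len(rated), mean)
    return summary

# Illustrative responses from five survey participants (one marked N/A).
responses = {
    "Data are added to the student information system in a timely manner.":
        [4, 3, None, 2, 3],
}
print(tabulate(responses))
```

Items with low mean agreement are candidates for the "recommend changes" step; items with many N/A responses may signal that respondents do not interact with that part of the system.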
Purpose: To communicate data availability and use broadly throughout the district.
Description: This is an example of a Data Dissemination Schedule.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.7.2T: Data Dissemination Schedule Template; 1.7.3R: ESE Policies for Data Access
2009–2010
September
Data: Middle school performance of incoming ninth graders by team
Recipients: Principal; team leaders; Grade 9 teachers
Purpose: Identify populations in need of intervention
Action: Conduct a data overview for each school-level team

Data: Lists of students in at-risk populations, e.g., Grade 8 Failures; Grade 8 Poor Attendance
Recipients: Grade 9 team leaders and teachers
Purpose: Identify specific students for intervention
Action: Develop interventions as necessary for identified students at the team or grade level
Purpose: To communicate data availability and use broadly throughout the district.
Description: In this activity, a District Data Team will construct and publish a schedule for the distribution and use of major data elements.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.7.1R: Data Dissemination Schedule Example; 1.7.3R: ESE Policies for Data Access
<<School Year>>
August
September
October
November
December
January
February
March
April
May
June
July
Purpose: To connect districts to current ESE policies for data access in order to inform district policies on data access and dissemination.
Description: These documents should be reviewed, and corresponding district policies developed, prior to disseminating data within the district.
Time: N/A.
Related Documents: 1–Getting Ready Module; 1.7.1R: Data Dissemination Schedule Example; 1.7.2T: Data Dissemination Schedule Template
These ESE Education Data Warehouse resources provide background information necessary
for a District Data Team, in conjunction with the district’s Information Technology Department, to
assign data access consistent with federal, state, and local regulations.
Purpose: This group of resources and tools will help a District Data Team build district-wide capacity to use data.
Description: This template will help the Team identify the training and support the district currently provides for data literacy.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.8.2R: Assessment Glossary
Directions
1. Begin by listing the data used in your district. If you have completed 1.5.1T: Data Inventory, you can use that list of data elements. If you
have not yet completed a data inventory, you may want to begin by focusing this activity on the most commonly used data elements.
2. After you have listed the data, complete the next three columns.
3. As you consider each data element, make note of any opportunities for improvement.
Data Element | Training or Resource Provided | Audience | Department Responsible
Purpose: This group of resources and tools will help a District Data Team build district-wide capacity to use data.
Description: This document outlines general assessment terminology.
Time: Ongoing.
Related Documents: 1–Getting Ready Module; 1.8.1T: Data Literacy Training Catalog
Note: These terms apply primarily to student assessment data, but could be extrapolated to apply to
other forms of data, such as those related to adult practice or district systems and processes.
Aggregated Data: Data that are presented in summary (as opposed to student-level data or data broken
down by student group).
Alignment: Judgmental procedures undertaken to ensure the content of state tests appropriately reflects
the knowledge, skills, and abilities articulated in the state’s content standards for each grade level and
subject area.
Benchmark: A standard against which something can be measured or assessed.
Cohort: A group of individuals sharing a particular statistical or demographic characteristic.
Decile: One of ten segments of a distribution that has been divided into tenths. The ninth decile shows
the number (or percentage) of the norming group that scored between 80 and 90 NCE.
Disaggregation: Summary data split into different subgroups, e.g., gender, race, ethnicity, lunch status.
Distractor: An incorrect option in a multiple choice test item.
Equating: A set of statistical procedures undertaken in order to a) adjust for differences in the difficulty of
different test forms for the same subject area and grade level from year-to-year (horizontal equating), or
b) scale test scores (and/or performance levels) so they have a consistent meaning across adjacent
grade levels (vertical equating, vertical scaling, vertical articulation or moderation).
Formative: Assessments at regular intervals of a student’s progress designed to provide information to
improve the student’s performance.
Gain Score: The difference between the scores from two administrations of the same test. A student can have either a positive or a negative gain score.
Inference: A conclusion that is drawn from a data set. The process of using data from a sample of students to generalize to other similar groups of students, such as assuming the observed three-year upward trend in 10th grade mathematics achievement will continue next year.
Measure: Outcome data that can be used to measure the performance of a student or group of students.
Median: The score that is the midpoint in a series of scores; half of the data values are above the median and half are below it.
Slaughter, R. (2008). Assessment literacy handbook: A guide for standardized assessment in public
education. Portsmouth, NH: Public Consulting Group, Inc.
Massachusetts Department of Elementary and Secondary Education. (2008). Data Warehouse 102
Handbook: Understanding MCAS Reporting. Malden, MA: Author.
Note: Keep in mind that this activity is a good faith attempt to take into account the concerns of
constituents, but there is no way to know for sure without asking them directly. A District Data
Team may choose to follow this protocol on its own, or to engage different stakeholders in the
process through surveys or focus groups. While the former approach may take less time, the
latter could generate valuable perspectives and ideas that the Team may not think of on its own.
Directions:
1. As a group, identify the stakeholders who will likely be impacted by the district’s change effort.
2. Individually review the seven stages of concern that individuals commonly experience in
response to a change effort. Record specific concerns the various stakeholders may
have for each of the stages. Be sure to include yourself and your own concerns.
Individuals may ask themselves, “What am I hearing from the field?”
4. Prioritize the concerns: give each member a number of votes that represents 1/3 to 1/2 of the total ideas generated, e.g., if 21 ideas were generated, each member could get 8 votes. Members cast one vote for each idea they see as a priority. Tally the votes to determine which ideas are seen as the highest priorities.
5. As a group, brainstorm ways to mitigate the impact of each of the prioritized concerns.
Record the suggested strategies on a new sheet of chart paper.
Note: Again, it may be useful to reference the strategies generated in 1.2.1T:
Barriers to Effective Data Use
6. Discuss, prioritize, and come to agreement on the strategies that make the most sense
to pursue at this time. Document these strategies and revisit them periodically, noting
concerns that get resolved, and new ones that may emerge. The group may want to
retain all of the notes from this discussion for future reference as well.
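The vote-allotment arithmetic in the prioritization step can be sketched as a small helper. This is a hypothetical aid, not part of the Toolkit: `votes_per_member` picks the low end of the 1/3-to-1/2 range (the module's own example allots 8 votes for 21 ideas, which also falls within that range), and `tally` counts the dot votes cast by members.

```python
# Hypothetical dot-voting helper: each member receives a number of votes
# between 1/3 and 1/2 of the ideas generated, and the tally surfaces the
# prioritized concerns.
import math
from collections import Counter

def votes_per_member(n_ideas):
    """Smallest whole number of votes that is at least 1/3 of the ideas."""
    return math.ceil(n_ideas / 3)

def tally(ballots):
    """ballots: one list of chosen idea labels per member; returns the
    ideas ranked by vote count."""
    counts = Counter(idea for ballot in ballots for idea in ballot)
    return counts.most_common()

print(votes_per_member(21))  # 7 votes each, within the 1/3 to 1/2 range
```

Any allotment within the stated range works; the point is simply that each member cannot vote for everything, which forces prioritization.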
Introduction
Where Are We Now?
Module Objectives
Culture of Inquiry
The Data-Driven Inquiry and Action Cycle
Asking the Right Questions
Types of Questions
Question Formulation
Module Summary
Inquiry
INTRODUCTION
MODULE OBJECTIVES
The Inquiry module will help a District Data Team:
Formulate questions to drive an inquiry process
Create and present effective data displays and data overviews
Identify the data needed to answer the questions
CULTURE OF INQUIRY
The Data-Driven Inquiry and Action Cycle drives the effective use of data
to answer critical questions about teaching and learning that result in
school improvement and higher achievement for all students. If the Team
asks the right questions, collects and analyzes appropriate data to
address the questions, views the information it has gathered in the
context of findings on research and practice to form an appropriate
knowledge base, and takes action on the knowledge it has gained, the
district and its schools will improve and its students will perform at higher
levels.
The modules in this Toolkit take you step-by-step through the Data-Driven
Inquiry and Action Cycle and provide you with the tools and resources
necessary to effectively implement this collaborative data use framework.
Module 2: Inquiry initiates this activity.
TYPES OF QUESTIONS
Educators ask questions about their district, schools, and students all the
time. The questions are based on their observations, experience, gut, and
hopefully, on data. The challenge is to craft meaningful questions to drive
the inquiry process that are based on all of these sources and that, if
answered, will significantly improve teaching and learning in the district.
Questions about factors that districts and schools can influence form the basis for the action step of the Data-Driven Inquiry and Action Cycle.
The Team may also ask questions about factors that can have an effect on teaching and learning, but that cannot be influenced or changed by districts and schools. These questions are more descriptive in nature and help educators develop a better understanding of their students. This understanding can provide insight into structures and strategies that can be implemented which place teaching and learning in the context of students’ experiences.
QUESTION FORMULATION
Questions that a Team might want to explore can be formulated based on
some of the following considerations: demographics, perceptions, school
processes, and student outcomes. Each category provides the district a framework from which it can begin to craft both focusing and clarifying
questions from the data gathered. Districts can then use those questions
to guide the next steps in the data review process.
Focusing Questions
Broad questions are called focusing questions. Focusing questions
provide a starting point to help a Team identify the data it will need to
begin its inquiry. By beginning with the broad categories above, a district
can begin the process of looking at data across sets of schools.
For example: Are the programs for special populations effectively meeting their goals?
Throughout this Toolkit, the Team will use protocols to guide productive
discussions on a variety of topics. The Question Formulation Protocol
will help the District Data Team develop, organize, and prioritize
questions to structure its inquiry.
Clarifying Questions
Narrower questions are called clarifying questions. Clarifying questions are generated in response to the
analysis of the initial data set and often require the collection and analysis
of additional data. In turn, based on this subsequent data collection and
analysis, original clarifying questions can become focusing questions for
the next phase of inquiry.
– Help the Data Team coordinate efforts with other existing teams
– Help the Team identify data that might be available to inform the inquiry process
Districts and schools have many initiatives in place at one time. Adding
a new initiative that addresses a focusing question may be redundant if
the question is already being effectively addressed by an existing
initiative. The Inventory of District and School Initiatives will identify
current initiatives and will provide data on the effectiveness of the
implementation of those initiatives.
To achieve this objective, the District Data Team must build user-friendly
data displays that tell a valid and interesting story about the focusing
question. The District Data Team must then involve stakeholders in the
collaborative analysis of the data and the creation of clarifying questions.
The specific content of a data overview will vary based on the audience
(administrative team, full faculty, department faculty, specialists), but the
purpose and structure remain constant. A typical data overview meeting
will contain the following sections.
The data overview should result in at least two specific outcomes. The set
of clarifying questions developed through the brainstorming protocol and
the identification of related data help guide the next steps in the inquiry
process. Additionally, as the group engages with the data and formulates
hypotheses and clarifying questions, they increase their capacity for
inquiry and become invested in the process. This buy-in is critical for
subsequent processes and is crucial toward creating a district-wide
culture of inquiry.
In this activity, you will review and critique a sample data overview
presented by the Scenic Cove District Data Team. Review the
PowerPoint presentation and use the Data Overview Checklist to
determine if all of the essential elements are present. As a District Data
Team, discuss how the Scenic Cove School District ELA Data
Overview could be improved.
a framework for the Team to assess the quality of the data displays it
creates. The Types of Data Displays and More Data Display Resources
provide some ideas for different ways that data can be represented.
With the data displays freshly made and assessed and the presentation
assembled, the Team is ready to engage a larger audience in the inquiry
process. Plan the meeting and deliver the data overview that the Team
has created.
These tools will help the Team deliver a data overview and follow up
afterwards. The results of this work will lead the Team into Module 3:
Information and serve as the foundation for the rest of the inquiry cycle
throughout this Toolkit.
MODULE SUMMARY
The module includes tools to help a District Data Team build its capacity
to design meaningful data displays and present them in an effective data
overview to targeted audiences. This approach can help engage
stakeholders in the inquiry process, as well as inform the generation of
clarifying questions that refine the focus of the inquiry and the
identification of the data needed to provide insight on those questions.
A District Data Team should emerge from this stage of the process with
clearly articulated focusing and clarifying questions, as well as a list of
data it plans to collect and analyze to answer those questions.
REFERENCES
For more information on this and other district support resources, or to share feedback on
this tool, visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Directions
1. Identify an issue in your district that you as a District Data Team wish to address. Write
the issue on the top of a piece of chart paper. It can be formulated as a statement or
question. Your issue/question should be related to student outcomes.
5 minutes
2. As a Team, brainstorm questions that stem from the original question/statement. Write
the questions as stated on the chart paper. All items must be phrased as questions.
Your questions should be related to student outcomes.
15 minutes
3. From this group of questions, identify three questions that deal with issues that the
district has control over and which, if positively resolved, will have a significant impact on
teaching and learning. Out of these three, identify the top priority question.
10 minutes
4. Your top priority question should serve as the focusing question to initiate the Data-
Driven Inquiry and Action Cycle.
Purpose: To identify data sources and help avoid redundancy in relation to district/school
initiatives that address the focusing question the District Data Team has decided to investigate.
Related Documents: 2–Inquiry Module
Description: Districts and schools may have multiple initiatives in place at one time. Adding a new
initiative that is potentially related to or has an impact on the focusing question may be
redundant if the question is already being effectively addressed by an existing initiative.
The Inventory of District and School Initiatives will identify current initiatives and will provide
data on the effectiveness of the implementation of those initiatives.
Time: Approximately 1 hour.
Directions
As a District Data Team, think about the initiatives/programs that are currently part of the improvement efforts in your district. List
each initiative/program in the Inventory of District and School Initiatives on the following page. For each, provide the information
indicated in the columns to the right of the initiative name. The District Data Team may have to call upon others in the district to help
provide the required information.
After you have gathered the required data on each initiative, determine which initiative/program(s) is directly related to your
focusing question.
For the related initiatives/programs, consult the Effectiveness of Implementation and Desired Outcomes columns (3–5) to
determine which appear to be addressing your focusing questions effectively.
If, as a District Data Team, you feel you need to gather more data to determine effectiveness, collect the data and re-evaluate
the initiatives.
If the consensus of all relevant parties is that the initiative is achieving the desired result, select a new focusing question. If
not, move forward with the inquiry.
District Name:
The inventory table includes the following columns:
Name of Instructional Initiative
Staff Responsible for Implementation
Teachers Implementing: 4 = All (100%); 3 = Most (>75%); 2 = Some (25–75%); 1 = Few (<25%)
Extent of Implementation: 4 = Complete; 3 = Progressing; 2 = Partially/Weak; 1 = Just beginning
Evidence of Desired Outcomes
Other Evidence that Would be Helpful to Collect
Purpose: To provide the District Data Team with an example of a data overview presentation.
Related Documents: 2–Inquiry Module; 2.3.1T: Data Overview Checklist; 2.3.2R: Scenic Cove
School District ELA Data Overview
Description: In this activity, you will review and critique a sample data overview presented by the
Scenic Cove District Data Team. Review the PowerPoint presentation and use the Data Overview
Checklist to determine if all of the essential elements are present. As a District Data Team,
discuss how the Scenic Cove School District ELA Data Overview could be improved.
Time: Approximately 1 hour.
2.3.2R: Data Overview Example—Version 1.0

Why is the performance of ELLs in some grades closer to the state?

MCAS ELA performance, district vs. state, 2009 (tests administered as values):

Grade (N)      Level   District   State   Difference
3 (N = 70)     A/P      27.1%     26.2%      0.9%
               NI       38.6%     40.7%     -2.1%
               W/F      34.3%     33.0%      1.3%
4 (N = 82)     A/P      14.6%     18.0%     -3.4%
               NI       43.9%     46.8%     -2.9%
8 (N = 116)    A/P       7.8%     12.9%     -5.1%
               NI       20.7%     27.0%     -6.3%
               W/F      71.6%     60.1%     11.5%
10 (N = 66)    A/P       3.0%     21.8%    -18.8%
               NI       19.7%     36.8%    -17.1%
               W/F      77.3%     41.4%     35.9%

(One additional W/F row, 61.1% district / 52.2% state / 8.9% difference, lost its grade
label in extraction; grade 4's W/F row was not recovered.)

Source: Data Warehouse > Public Folders > ESE Cubes > MCAS Official Release 2009
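For Teams that check figures with a short script rather than by hand, the Difference column in a table like the one above is simply the district percentage minus the state percentage. A minimal Python sketch, using the grade 3 and grade 10 ELA rows from the example (the code itself is illustrative and not part of the Toolkit):

```python
# Sketch: recomputing the "Difference" column (district minus state) for a
# data overview table. Figures are from the example rows; everything else
# here is illustrative.

rows = [
    # (grade, performance level, district %, state %)
    ("3",  "A/P", 27.1, 26.2),
    ("3",  "NI",  38.6, 40.7),
    ("3",  "W/F", 34.3, 33.0),
    ("10", "A/P",  3.0, 21.8),
    ("10", "NI",  19.7, 36.8),
    ("10", "W/F", 77.3, 41.4),
]

diffs = [round(district - state, 1) for _, _, district, state in rows]

for (grade, level, _, _), diff in zip(rows, diffs):
    # A positive gap on W/F (Warning/Failing) means the district has a larger
    # share of low-performing students than the state does.
    print(f"Grade {grade:>2}  {level:<3}  district-state gap: {diff:+.1f} pts")
```

The large positive W/F gap at grade 10 (+35.9 points) is the kind of "problem revealed by the data" the overview is meant to surface.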
2009 MCAS results by elementary school, tests administered as values
(school names were lost in extraction):

Grade   Level   School 1   School 2   School 3   School 4   School 5
3       N          4          21          0          0         32
        A/P     100.0%       23.8%       --         --       21.9%
        NI        0.0%       23.8%       --         --       56.3%
        W/F       0.0%       52.4%       --         --       21.9%
4       N          6           2          4          2         49
        A/P      16.7%        0.0%       0.0%       0.0%     18.4%
        NI       83.3%        0.0%     100.0%     100.0%     34.7%
        W/F       0.0%      100.0%       0.0%       0.0%     46.9%
5       N          3           5         15         42          0
        A/P     100.0%        0.0%       0.0%       2.4%       --
        NI        0.0%        0.0%      60.0%      31.0%       --
        W/F       0.0%      100.0%      40.0%      66.7%       --

*Minimum 10 Students
Source: Data Warehouse > Public Folders > ESE Cubes > MCAS Official Release 2009
2009 MCAS results, middle schools, tests administered as values:

Grade   Level   Coastal MS   Rock MS
6       N           53           8
        A/P       13.2%         --
        NI        34.0%       25.0%
        W/F       52.8%       75.0%
7       N           70           2
        A/P       10.0%         --
        NI        30.0%         --
        W/F       60.0%      100.0%
8       N           95          18
        A/P        2.1%       27.8%
        NI        20.0%       22.2%
        W/F       77.9%       50.0%

*Minimum 10 Students
Source: Data Warehouse > Public Folders > ESE Cubes > MCAS Official Release 2009
2009 MCAS results, Ebb Tide HS, tests administered as values:

Grade   Level   Ebb Tide HS
10      N           66
        A/P        3.0%
        NI        19.7%
        W/F       77.3%

*Minimum 10 Students
Source: Data Warehouse > Public Folders > ESE Cubes > MCAS Official Release 2009
Spring 2009 MEPA results for Scenic Cove (SC): percent of students by MEPA performance
level (columns presumably run from the lowest to the highest of the five levels; the
original column header was not recovered):

All Students - SC      20    18    32    23     7
First Year - SC        45    23    22     6     3
Second Year - SC       10    19    34    27    10
Third Year - SC         4    16    38    35     8
Fourth Year - SC        3     6    38    38    16

Source: Massachusetts English Proficiency Assessment (MEPA) Statewide Results: Spring 2009; Spring 2009 MEPA Results by District
http://www.doe.mass.edu/mcas/mepa/results.html
Brainstorm Groups
Purpose: To build data displays based on data related to the focusing question.
Related Documents: 2–Inquiry Module; 2.4.2R: Data Display Rubric; 2.4.3R: Types of Data
Displays; 2.4.4R: More Data Display Resources
Description: This activity enables District Data Team members to apply the principles of data
display construction to tell a story related to the focusing question.
Time: Approximately 1 hour.
During this activity, the District Data Team will work collaboratively to create a strong data display for
the selected focusing question. Before beginning, review and become familiar with the related tools
and resources listed above. These should be used each time you prepare a data display.
Directions
1. As a District Data Team, restate the focusing question you crafted in 2.1.1T: Question
For example: How did the achievement of the population of English language learners
vary across all schools in the Scenic Cove School District?
2. Examine a few sources of high-level district data, such as MCAS/AYP reports or student
growth data. Think individually, then discuss as a Team: What do you see in the data that
relates to the focusing question? As a Team, brainstorm and chart the collective observations
for all to see. Be sure to make only factual observations and interpretations about what the
data appear to say—don’t make inferences from the data.
3. Each member of the District Data Team should now sketch on a piece of chart paper a data
display that illustrates what s/he thinks are the most important observations. Refer to 2.4.3R:
Types of Data Displays for guidance regarding the construction of data displays. Post the data
displays for the whole District Data Team to review.
4. Each District Data Team member should present one data display to the balance of the Team
or to one of the small groups. Number the data displays to identify them later.
5. Each presenter should ask group members what they see in the data (observations, not
inferences). Presenters should record observations on chart paper for each display. (5–10
minutes).
6. Each presenter should then share with the group:
What story s/he wanted to tell with the display that s/he selected
What clarifying questions the display elicits for him/her (5–10 minutes)
7. After each presentation, each person, including the presenter, fills out the Data Display
Rubric for that data display.
9. Think individually and discuss as a Team: How do the sketches compare? Be sure to record
answers for future reference.
10. Regroup as a District Data Team. Review the feedback on each data display. Spend about
5–10 minutes digesting and summarizing the feedback. Note common themes across data
displays. Discuss the various sketches that Team members created and reach consensus as
a District Data Team on the data displays that best communicate the story to be told about the
data.
Alternative Approach
1. Have District Data Team members work in pairs to collaboratively develop each data display.
2. Often there is more than one story to be told by a set of data. See how many different valid
and interesting stories about the data can be told using different data displays.
Purpose: To assess the quality of data displays and gain feedback to improve them.
Related Documents: 2–Inquiry Module; 2.4.1T: Building Data Displays Protocol; 2.4.3R: Types
of Data Displays; 2.4.4R: More Data Display Resources
Description: This rubric can be used to assess the quality of a data display. It can be used with
the Data Display Feedback Protocol to gain group input, or can be used as a tool for individual
reflection.
Time: 15 minutes.
Use the scale provided below to rate each of the following statements about the data display.
4 = Excellent: No change needed
3 = Good: Some changes needed
2 = OK: Moderate changes should be made
1 = Not So Good: Needs extensive rework
Question Response
4. What do you like about this data
display?
school year?
Either form of stacked bar chart can help answer questions such as:
Which performance category has the highest concentration of students receiving FRPL?
Which grade level has the highest concentration of lower-performing students?
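For Teams preparing their own displays, the two forms of stacked bar chart differ only in data preparation: one stacks raw counts, the other ("100% stacked") converts each bar's segments to percentages of that bar's total. A minimal Python sketch with invented counts, not part of the Toolkit itself:

```python
# Sketch: data preparation for the two forms of stacked bar chart. Raw
# counts stack directly; the "100% stacked" form converts each bar's
# segments to percentages of that bar's total. Counts are invented.

counts = {
    # grade -> students per performance level, ordered (A/P, NI, W/F)
    "Grade 3": [19, 27, 24],
    "Grade 4": [12, 36, 34],
}

def to_percent_stack(segments):
    """Convert one bar's raw counts to percentages that sum to ~100."""
    total = sum(segments)
    return [round(100.0 * s / total, 1) for s in segments]

for grade, segs in counts.items():
    print(grade, "raw:", segs, "percent:", to_percent_stack(segs))
```

The raw-count form preserves group sizes; the percent form makes grades of different sizes directly comparable, which is why both are worth keeping on hand.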
Multiline Chart
A multiline chart is similar to a clustered bar chart,
except that the data are represented with lines
rather than bars. As with the single line chart, some
people like to use multiline charts when
representing data across a time scale (as in the
example).
A multiline chart can help answer questions like:
Are we closing the achievement gap?
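The gap question a multiline chart answers visually can also be checked numerically: the achievement gap at each point in time is the vertical distance between two lines. A minimal Python sketch with invented percentages (not Scenic Cove data):

```python
# Sketch: reading an achievement gap off a multiline chart. The gap at each
# year is the distance between the two lines; if it shrinks every year, the
# gap is closing. All percentages are invented for illustration.

years        = [2007, 2008, 2009]
all_students = [55.0, 58.0, 62.0]  # % Proficient or higher, all students
ell_students = [30.0, 36.0, 44.0]  # same measure for the ELL subgroup

gaps = [round(a - e, 1) for a, e in zip(all_students, ell_students)]
narrowing = all(later < earlier for earlier, later in zip(gaps, gaps[1:]))

for year, gap in zip(years, gaps):
    print(year, f"gap = {gap:.1f} points")
print("Gap narrowing each year:", narrowing)
```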
Correlation Chart
A correlation chart allows you to
examine the relationship between two
different measures using two different Y
axes. The first measure appears as a
bar chart whose scale is on the left Y
axis. The second measure appears as a
line chart whose scale is on the right Y
axis.
A correlation chart allows you to answer
questions such as:
What is the distribution of % Correct compared to the number of tests administered across
grade levels?
What is the relationship between the number of correct items and the number of possible
items?
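As a numeric companion to a correlation chart, a Team might compute the correlation coefficient between the two measures being plotted (in matplotlib terms, the second Y axis itself would come from `ax.twinx()`). A minimal Python sketch with invented values:

```python
# Sketch: Pearson's r as a numeric summary of the relationship a correlation
# chart shows visually. All values below are invented for illustration.

from math import sqrt

pct_correct = [62.0, 58.0, 55.0, 49.0, 47.0]   # % Correct by grade level
tests_admin = [70.0, 82.0, 90.0, 116.0, 66.0]  # tests administered

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(pct_correct, tests_admin)
print(f"r = {r:.2f}")
```

As always, correlation is not causation; a strong r should prompt a clarifying question, not a conclusion.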
Pie Chart
A pie chart shows part-to-whole relationships.
Pie charts show the relative distribution of
performance for a specific population across
performance categories, which sum to 100%.
Pie charts can answer questions such as:
What was the relative distribution of student
scores across performance levels for a
specific subgroup?
Which subgroup had the highest proportion
of students achieving Proficiency?
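Preparing pie chart data is a part-to-whole calculation: convert raw counts to percentages of the total. A minimal Python sketch; the counts are chosen to mirror the Ebb Tide HS grade 10 row (N = 66) but are otherwise illustrative:

```python
# Sketch: part-to-whole data preparation for a pie chart. Counts are chosen
# to mirror the Ebb Tide HS grade 10 figures (N = 66) for illustration.

counts = {"A/P": 2, "NI": 13, "W/F": 51}  # students per performance level
total = sum(counts.values())

# Each slice is its share of the whole; the slices sum to ~100%.
segments = {level: round(100.0 * n / total, 1) for level, n in counts.items()}
largest = max(segments, key=segments.get)

print("Slices:", segments)
print("Largest slice:", largest)
```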
3D Bar Chart
A 3D bar chart is helpful when you want to visually represent a data set across multiple
categories. It allows you to see the relationships and trends in a data set across three
dimensions.
A 3D bar chart allows you to answer
questions such as:
Where are our greatest
achievement gaps?
What do year-to-year trends tell
us about the learning needs of
different subgroups of students?
In which subject areas and grade
levels do we have the greatest
concentration of lower performing
students?
Purpose: To provide a structure that enables all members of the target audience to become
familiar with the focusing question, engage with relevant data, and help further the inquiry
process.
Related Documents: 2–Inquiry Module; 2.5.2T: Focusing Question Investigation Template
Description: Use this protocol to facilitate the center of your data overview presentation. The
brainstorming activity provides an opportunity for the target audience to collaboratively
interact with the data displays associated with the focusing question. Through this
collaborative inquiry, the audience will identify problems revealed by the data, develop
hypotheses about the cause of the problem, craft clarifying questions to extend the inquiry
process, and identify data needed to address those questions.
Time: 30–45 minutes.
Directions
Divide the target audience into groups of 4–5 people. Provide sticky note pads, chart paper,
markers, and large copies of the data displays for each group. Provide a facilitator for each group.
1. Write the focusing question on the top of a sheet of chart paper. Check to make sure
each person understands the question.
2. Post the large copy of the data display (or displays) for the group to view. These may
have been created in 2.4.1T: Building Data Displays Protocol.
3. Ask individuals to silently observe the data and record objective, factual observations
about what the data say in the data display. Ensure that all have adequate time to
process the information and ask clarifying questions if necessary.
4. Ask individuals to share their observations with the group. Record the observations on
chart paper with the focusing question next to the display. Highlight observations that
represent "problems" revealed by the data.
5. On a new sheet of chart paper, the group should write the title "Hypotheses about
Possible Causes." They then brainstorm hypotheses about the causes of the "problem(s)"
revealed by the data and record them on the chart paper.
6. As a group, then write the title "Clarifying Questions" at the top of a new sheet of chart
paper.
8. As a group, review the questions and group similar questions together if possible.
Develop a title for each group such as: Questions about Achievement; Questions about
Relationships among Variables; etc.
9. Reach consensus on the clarifying questions that seem most appropriate to move the
inquiry deeper. Record these questions on a new piece of chart paper. Leave room
between questions on the chart paper, or put each question on a separate page.
10. Under each question, identify the evidence (data elements) that needs to be collected to
address each of the clarifying questions. If possible, note where each piece of data can
be found and how it can be collected.
11. Share the clarifying questions and additional data elements needed with the whole
group. The District Data Team will record the questions and data elements on a sheet of
chart paper for the whole group to see.
12. Use template 2.5.2T: Focusing Question Investigation Template to record the key ideas
for future reference.
Note: The District Data Team may choose to close the meeting at this point, or the Team
may ask the group to help prioritize the clarifying questions that would be most useful and
meaningful to extend the inquiry process. Either way, the Team should clarify next steps for
how the inquiry process will move forward, and how the stakeholders in attendance at this
data overview may be impacted.
Focusing Question
1.
2.
3.
4.
2.
3.
4.
*Access refers to the degree to which the data are available to District Data Team members. Rate
Access on a scale of 1–4 (1 = hard to access; 4 = easily accessible) or N/A if the needed data element is
not currently being collected.
Table of Contents
Introduction—1
Where Are We Now?—1
Module Objectives—1
Preparing Data For Analysis—2
Analyzing Data—4
Analyzing Assessment Data—5
Incorporating Other Types of Data—6
Facilitating the Process—7
Module Summary—8
Information
3.1.1T: Data Collection Planning Tool
3.2.1T: Practice Making Valid Inferences
3.3.1T: Data Analysis Protocol

Also revisit tools from Inquiry:
2.4.1T: Building Data Displays Protocol
2.4.2R: Data Display Rubric
2.4.3R: Types of Data Displays
2.4.4R: More Data Display Resources
District Data Team Toolkit—Version 1.0 Module 3: Information
INTRODUCTION
Getting Ready | Inquiry | Information | Knowledge | Action | Results
Raw data alone do not support the inquiry process. Central to turning
raw data into information is the process of data analysis. The Information
module can help a District Data Team build its capacity to analyze data
by considering the appropriate use of assessment results and the
formation of valid inferences.
Through the data overview process introduced in the Inquiry module, the
District Data Team identified the data needed to move the inquiry process
forward. Next the Team must collect and organize the data in order to be
poised to analyze the data and make meaning from them.
MODULE OBJECTIVES
The Information module will help a District Data Team:
At this stage of the process, the District Data Team should develop a list
of data needed to address the clarifying questions related to its focus of
inquiry. Now the Team must actually collect the data and organize these
in a meaningful way that promotes rigorous analysis. The Focusing
Question Investigation Template (2.5.2T) and the Data Inventory
Template (1.5.1T) can be useful in taking this next step.
This tool will help guide the collection of specific data needed to answer
a focusing question and related clarifying questions.
Once data are collected, the Team will want to display the data in a
meaningful way that prompts curiosity and allows viewers to make
comparisons and inferences about causality. Displays should show as
much information as possible in as small an area as possible, without any
distractions or extraneous information. The Inquiry module introduced a
variety of data displays that may be useful for the Team to revisit at this
stage in the process. In addition, the Team may want to consider the
following questions:
The Data Display Rubric provides a framework for the Team to assess the
quality of the data displays it creates. The Types of Data Displays and
More Data Display Resources documents provide ideas for different ways
that data can be represented.
The first step in data analysis, as described in the data overview process
in the Inquiry module, is the objective description of what the data say.
What patterns and trends are evident in the data? It is very important to
focus on this first step before making inferences or drawing conclusions
from the data, because clarifying questions often need to be posed and
additional data collected before valid inferences can be made.

Approaching data with a truly open mind takes practice and discipline.

Colleagues on a District Data Team can play an important role in helping
each other use language that is as specific and objective as possible when
discussing information and data. For example, helping each other
distinguish between observations and inferences:
Example: About one third of our students are not on track to meet the
mathematics criteria for graduation.
Both observations and inferences play crucial roles in the data analysis
process. What is important is to distinguish between the two. The Team
should be sure to rigorously examine the data for patterns, trends, and
outliers that can be factually explained, prior to making any inferences or
conclusions about what those patterns may mean.
During this activity, the Team will view multiple data displays and check
the inferences made by another data team for validity.
This activity also appears in the ESE Data Warehouse course DW 102.
The Data Displays used are all “Pre-defined Reports” from the ESE
Data Warehouse. You may want to revisit Activity 1.5 Assessment
Literacy.
Questions the Team might ask when triangulating across data sets
include:
These protocols can guide the District Data Team in the process of
analyzing data from non-traditional and/or multiple sources.
Be prepared to be surprised
Think ahead about what the group might want to report out to
others in the district and how, and look for ways to generate
reports and visuals as part of the discussion process. For
example, might the group want to leave certain flip charts up for
display and public comment? Would it help to type notes directly
into a laptop so they don’t need to be rewritten later?
MODULE SUMMARY
The Information module provides concepts and tools that enable the
District Data Team to further the inquiry process introduced in the
preceding module, Inquiry. It offers guidance for collecting data
specifically related to the focusing and clarifying questions generated in
the Inquiry module, and revisits tools from that module to guide
meaningful displays of that data.
The District Data Team should emerge from this stage in the process with
inferences or conclusions drawn from the data analysis process, and
perhaps also with some new questions for consideration.
All of this work sets the stage for the next module, Knowledge, which will
help the Team place the information that it gathers in the context of
research and practice literature, as well as local knowledge and expertise.
This will help the Team narrow and refine its focus even further as it
moves toward identifying strategies and actions steps to address the
problems that it has identified.
References
1 Adapted from Tufte, E. (2009, November 10). Presenting Data and Information. Boston, MA.
2 Adapted from Bernhardt, V.L. (2004). Data Analysis for Continuous School Improvement. Larchmont, NY: Eye on Education.
For more information on this and other district support resources, or to share feedback on
this tool, visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Purpose: To guide the collection of specific data needed to answer a focusing question and its
related clarifying questions.
Related Documents: 3–Information Module
Description: Use this template to identify who will collect specific data needed for analysis.
Time: 30–90 minutes.
Instructions: In the table below, begin by listing the specific data elements needed in order to address each of the clarifying
questions in your inquiry process. If the Team has completed 2.5.2T: Focusing Question Investigation Template, it can simply use
the list of data documented there.
For each data element, indicate the required information. A Data Inventory (1.5.1T) can help identify the location/owner of the data.
For this stage in the process, the most important details to note are who will collect the information, by when, and in what format.
Data Element Needed | Location/Owner | Who Will Collect It for the Team? | By When? | In What Format? (Paper, Electronic, etc.)
Related Documents: 3–Information Module
Description: This activity can be used within your Data Team or with other audiences to
improve data analysis skills. During this activity, you will have the chance to view multiple
data displays and check the inferences made by another data team for validity.
Time: About 30 minutes.
SCENARIO
The Data Team in District A wanted to examine the performance of 8th grade
students on the 2007 MCAS ELA and Mathematics tests. The Team posed this
focusing question.
FOCUSING QUESTION
How did our 8th graders, district-wide, perform on the 2007 MCAS tests?
District Performance Distribution Report (R-303)
Statement | Observation or Inference? | True | Not Necessarily True | False
1A. Our students are smarter in
English than they are in
mathematics.
1B. Compared to the state, our
students performed poorly in
mathematics.
1C. About one third of our students
performed below proficient in
mathematics.
SCENARIO
CLARIFYING QUESTION
How did the mathematics performance of the 8th graders in our district change
over the past three years?
3.2.1T: Practice Making Valid Inferences—Version 1.0
District Distribution by Year Report (R-305)
Statement | Observation or Inference? | True | Not Necessarily True | False
2A. Students who were in 8th grade
in 2007 made gains from year-
to-year since 2005.
2B. 8th grade performance has
improved from year-to-year.
2C. Our year-to-year trend
performance follows the state’s
trend performance.
SCENARIO
The District Data Team reviewed the longitudinal performance of the district’s
students and concluded that the percent of students scoring at the lowest level
decreased each year and the percent scoring at the Advanced level increased
dramatically in 2007. This was encouraging, but the Team felt that performance
could still be improved. The Team formulated the following clarifying question.
CLARIFYING QUESTION
With which specific strands and standards did the students have the most
difficulty?
District Standards Summary Report (R-306)
Statement | Observation or Inference? | True | Not Necessarily True | False
3A. Our students performed better
than students statewide in each
of the strands.
3B. Out of all the strands, our
students performed worst in
Measurement.
3C. Compared to student
performance statewide in the
strand Patterns, Relations,
and Algebra, our students
performed the best on the
symbols standard.
Scenario #4
SCENARIO
The review of the District Standards Summary Report helped the Team
determine specific areas where the students were weak on the 2007 test. The
Team delegated several members to review this report for the three prior years
to see if these strands were problems for the students on those tests.
The Data Team also wanted to learn more about the performance of subgroups
on specific test items. The Team posed the following clarifying question.
CLARIFYING QUESTION
How did the ELL students in our LEP program perform across all test items as
compared to all students in the district and in the state?
Statement | Observation or Inference? | True | Not Necessarily True | False
4A. The performance pattern of all
students in our district follows
the state more closely than the
pattern for LEP students.
4B. Item 36 is the most difficult
item.
4C. Our LEP program is not
preparing our students
adequately.
Instructions:
Reflecting on the statements that have been made using the data reports:
• Which are Observations and which are Inferences?
• Which are True, False, or Not Necessarily True (more data needed)?
• What clarifying questions would help make better inferences?
Scenario #1
1A. Inference – (NNT) The English (reading comprehension and writing) and mathematics
assessments are developed to assess completely different knowledge and skills, so a
mathematics score cannot be directly compared to an English score.
1B. Inference – (F) The statement is false because our students performed BETTER than the state at each
performance level. It is an inference because poorly is a conclusion that is not factually precise.
1C. Observation – (T) 29% of our students performed below proficient.
Scenario #2
2A. Inference – (NNT) While the data do show increases in MCAS performance from 2006 to 2007, there
could be slight differences in the student cohort due to changes in population since 2005. Additionally,
the data do not reflect individual student growth over time, only MCAS scores for a class from one year
to another. You would need more data to be sure.
2B. Observation – (T) The percent of students below proficient decreased over time. Caveat–again, these
are different groups of students.
2C. Observation – (F) For the first two years, our students showed a small decrease in the percent of
students below proficient, while the state stayed at about the same level (percent at warning actually
increased slightly). In the most recent year, there was a decrease in percent below proficient at the
state level, but a much larger decrease among the tested students in District A.
Possible Answers (continued)
Instructions:
Reflecting on the statements that have been made using the data reports:
• Which are Observations and which are Inferences?
• Which are True, False, or Not Necessarily True (more data needed)?
• What clarifying questions would help make better inferences?
Scenario #3
3A. Observation – (T) A larger percentage of District A students was successful in each strand than
students statewide.
3B. Observation – (T) Relative to all other strands, our students did indeed score the poorest in
Measurement.
3C. Observation – (F) They performed best in Models relative to the state (10 percentage points
difference).
Scenario #4
4A. Observation – (T) LEP student pattern is up and down and district pattern and state pattern are very
similar overall. Stress that this is probably due to the relatively small size of the population. Smaller
populations show greater variation.
4B. Inference – (NNT) A factual interpretation (observation) is that the LEP group and the State scores
are lowest for Item 36, but not for the District. It is an inference that this is the most difficult item for
these groups, as there could be other reasons why so many students scored low on it.
4C. Inference – (NNT) You can’t infer this from the data. For example, the most difficult items for the
LEP group may be those that have the most language, such as story problems. The next step is
looking at the actual items and drawing conclusions about what might have made the items difficult
for the LEP subgroup. Also, the LEP students did better than the other two populations on several
items.
DATA ANALYSIS PROTOCOL 3.3.1T
1. Write the question(s) being analyzed at the top of a piece of chart paper. Check to make
sure each person understands the question. (1–5 minutes)
2. Distribute copies of the data in either graphical or numerical displays to each member
of the Team. Ask each person to silently observe the data by taking notes and jotting
observations. (5 minutes)
Note: By this point, the Team may have three levels of data: high-level data that spurred
the inquiry in the first place; data used in the data overview (2.3.1) to generate clarifying
questions; and even more specific data collected subsequently to address these
clarifying questions. In some cases, the first two data sets may be fairly similar.
Engaging with all data sets simultaneously can better poise the group to see patterns,
trends, and outliers that had not previously been evident.
3. Observe: (15 minutes) Ask Team members to take turns (round-robin fashion) and
report one of their observations. Observations should be facts or evidence that can be
readily seen in the data and stated without interpretation.
Instruct participants to use a sentence starter like one of the following to keep the
observations factual:
I see…
I observe…
I notice…
Capture the observations in list form on the chart paper as quickly as possible and
without comment. Capture questions on a separate sheet. Continue until all Team
members have reported all of their observations. (Note: During this step, it is acceptable
for Team members to make observations based on those made by others in the group.
Allow the process to proceed as long as logical and factual observations can be made.)
Note: It is often helpful to make a very distinct transition from the observation stage to
the interpretation stage, clarifying when the group can begin to allow statements that
may not be factually based.
4. Interpret: (20 minutes) Ask each Team member to review the entire list of
observations. Working together, code (or group) the observations into categories of
findings. To facilitate this process ask questions such as:
What assumptions might be underneath what we are noticing in the data?
What clues help explain why a certain population is meeting or missing targets?
Which of these observations are most relevant and important to our inquiry?
And finally:
Based on our observations, what do we know now?
5. Extend: (10 minutes) On a new piece of chart paper, write "New Questions and
Conclusions." Work as a group to identify new questions that this analysis has raised
and any possible conclusions that have been identified. The questions may serve as the
basis for another round of analysis, so it may be helpful to conclude by prioritizing them.
Any conclusions will become the basis for subsequent action.
Table of Contents
Introduction
Where Are We Now?
Module Objectives
Clarifying the Problem
Writing a Problem Statement
Root Cause Analysis
Understanding Root Causes
Overview of Root Cause Protocols
Facilitating the Process
Connecting to Research and Local Knowledge
Connecting to Research and Local Expertise
Cataloguing Problems Under Investigation
Module Summary
Knowledge
INTRODUCTION
[Module banner: Getting Ready → Inquiry → Information → Knowledge → Action → Results]
The most important part of this transition from analysis to action is
taking time to make sure all members of the group clearly agree on the
problem being addressed, and that an effort is made to connect the
problem to research and to other district efforts to solve the same
problem. Being purposeful during this step helps a Team avoid repeating
past mistakes and strengthens its ability to take effective action.
MODULE OBJECTIVES
The Knowledge module will help a District Data Team:
"A problem statement can help the Team focus its work prior to validating
potential solutions with research and then moving on to action."

Teams that want to explore the questions that emerge from the data
analysis may want to engage with the root cause activities outlined later
in this module to gain new perspectives on the factors that may explain
the patterns, trends, or aberrations evident in the data. If this process
does not help the Team gain agreement on the problem to be addressed,
then it will likely reveal a need for more data or different questions, which
would cycle the Team back to the Inquiry stage of the process.

Teams that emerge from the data analysis in the Inquiry module with
strong conclusions may be ready to move toward planning action by first
crafting a problem statement.
From this angle, when identifying a solution to address the underlying
problem, a district would be wise to first consider how it can reallocate
existing resources and improve existing initiatives. The inclination is often
to identify new strategies or initiatives, but the Team should first evaluate
the efficacy and impact of current initiatives before adding new ones. A
new initiative should be added only if the need for it is unarguable.
The collaborative tools shared here are meant to help the Data Team
understand and agree on the issues that are most responsible for the
problems it has identified, in order to begin planning well-considered
and researched strategies and engage people in the process of changing
practice. These activities are not intended to be used to place blame on
anyone in the system, but rather to understand where the most energy
and attention should be placed in order to get different results.
It should be noted that root cause activities are useful for analyzing the
factors that contribute to success, as well as those that contribute to a
problem. For example, if an initiative produced very strong results,
engaging in these activities could help the Team capture lessons to scale
up in other areas of the district’s work.
In choosing the approach that is best for the situation, the Team will want
to consider the complexity of the problem and the depth of additional
analysis needed in order to gain agreement on the root cause. It will also
want to ensure it has the time and facilitation skills required to conduct the
activity successfully.
After completing any of the Root Cause Analysis Protocols, the Team
should return to the Writing a Problem Statement worksheet and prepare
a newly aligned view of the problem and potential solution.
These activities can facilitate the Team's discussion of root causes in a
collaborative way. The Team should select the approach that seems
best for its particular group or situation, or create its own using these as
templates.
(4.2.2T: 20 Reasons)
The Team should think strategically about which groups to involve in the
process of root cause analysis. While the District Data Team on its own
could likely generate valuable insight on a problem, it is often best to
engage those closest to the problem in the identification of the root
causes that, if addressed, would improve the situation. As well-
intentioned as the Team may be, it may miss valuable information by not
going closer to the source.
Each of these root cause protocols is based on the premise that adult
behavior and district processes impact student learning outcomes. If we
believe that all students can learn, and they aren't, then we need to look
at what we can do differently. While some root causes may indeed be out
of the district's hands, such as student mobility or the effects of poverty,
the District Data Team needs to look very closely at how the district
conducts the business of educating students and what aspects of this
work may or may not be contributing to the problem at hand. When done
well, engaging in a root cause activity can promote honest and
sometimes difficult conversations about how personnel in all corners of
the district conduct their work, including the members of the Team itself.
However, participants frequently disagree about root cause explanations
for the original problem, the sequence of causes and effects, or the
relative importance of various possible causes during the brainstorming
phase of this activity. The group may even come up with explanations
that are directly contradictory to one another. This has some important
implications for facilitating the process of discussing root causes.
When contradictory explanations emerge, record them all; reflection
and subsequent discussion will often reveal which is more likely to be
true. If this does not happen, the Team can consult research and
local knowledge for more insight.

Many of the ideas generated in the activities should be regarded
as biases, opinions, or conjecture until proven otherwise with data
or research. The Team must objectively look at the assumptions it
holds and check them against research, data, and expert opinion.
Just because everyone in the group happens to agree does not
mean any given potential root cause is right.

In fact, unlike cars that won't start, we often don't really know the
right answer. The Team has to pick one potential root cause that,
based on data analysis, research, and local knowledge, seems
like it may make the most impact, and then try to resolve it. By
monitoring progress, evaluating results, and continuing the inquiry
process, the Team can model the truly adaptive nature of
education, where educators learn the work by doing it and develop
the answers together along the way.
When designing the format for a discussion of root causes, a district may
want to assign a facilitator who can help the group with these key points,
as well as help the Team:
Once the Team has clearly defined the problem and everyone has agreed
to a general strategy to alleviate it, the Team might feel ready to move
straight to building an action plan with specific goals, timelines, and data
collection points. But, before moving on, it is important to begin making
connections to research and local knowledge, looking outward for
information that might be helpful in shaping the Team’s work.
Up to this point, the Team has worked on its own—or perhaps with some
input from stakeholders—to identify the underlying problem and a
proposed solution. Taking time to consult local experts, research
literature, and others outside the District Data Team who have gone down
the same path, can increase the effectiveness of the plan for action, as
well as increase its credibility and validity.
When consulting research, the District Data Team should be mindful that
the Internet makes it much easier to connect to a wide range of scholarly
research—however, not all research is good research. The District Data
Team has a responsibility to ensure that the research it uses is credible,
and as such should look for research from credible independent sources.
When identifying local knowledge and expertise that can further clarify the
problem and aid the development of an effective action plan, the Team
may want to consider:
This activity helps the Team identify assumptions that need to be checked
and questions that need to be answered about a problem or potential
intervention. Several websites are provided to help connect the Team to
related research.
The next activity is designed to give the District Data Team a systematic
way to capture problems. The Team should begin by documenting
information relevant to the inquiry process in which it is engaged. Over
time the Team can collect information from other teams, as well as from
its own subsequent inquiry processes. The way the Team elects to
capture and share this evidence and knowledge will depend on its
local systems, personnel, time, and resources. As the Team engages with
the template, it will likely want to refine the categories and format to suit
local needs and initiatives. However, beyond determining the exact
headers on a template, the Team also needs to make a long-term plan for
collecting, storing, and using this information.
MODULE SUMMARY
The District Data Team should emerge from this stage in the process with
a clearly articulated problem statement that outlines the original problem,
the suspected cause, the goal for improvement, and a proposal for
moving forward.
The next module, Action, guides the Team in articulating a logic model
and crafting or revising a plan to take action on the identified problem.
For more information on this and other district support resources, or to share feedback on
this tool, visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Stakeholders who are most affected by the problem: Who is most directly impacted by
this problem? Alternately, who would benefit the most if this problem were resolved?

Suspected cause of the problem: Based on the data analysis and/or the root cause
analysis, what does the Team think is the most significant cause(s) contributing to this
problem? What, if addressed, would make the greatest impact on resolving the
problem? (Include specific evidence.)

Goal for improvement and long-term impact: The wishes, dreams, and general vision
describing the target. The Team will write a clearer, measurable goal statement in
Module 5.

Final problem statement: Tie the above statements into 3–5 coherent sentences that
could be easily understood by a wide range of stakeholders.
1 Adapted from Sagor, R. (2000). Guiding School Improvement Through Action Research. Alexandria, VA:
Association for Supervision and Curriculum Development.
Goal for improvement and long-term impact: We want all our third graders to read at
grade level or above.
Type of problem
Directions
Why, Why, Why? is a relatively quick, informal way to identify root causes of problems. Start by
writing the problem being addressed and then ask the group, "Why might this be happening?"
Record the answer after the first "Because" and then ask the question again in reference to that
"Because." Repeat the process three to five times, asking "Why?" of the previous "Because,"
until the group feels that it has arrived at the root cause of the problem. If after three to five
questions and answers the group does not agree that it has found a root cause, consider using
another root cause protocol in the Toolkit.
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________
Why?
Because: ______________________________________________________________
______________________________________________________________________
______________________________________________________________________
Why?
Because: ______________________________________________________________
______________________________________________________________________
______________________________________________________________________
Why?
Because: ______________________________________________________________
______________________________________________________________________
______________________________________________________________________
Directions: Use a computer and projector to display the 20 Reasons worksheet on the last
page, or use chart paper to recreate the simple list.
1. Begin by writing the problem in the box at the top of the page.
2. Ask the group to give possible reasons for why the problem may be occurring. It may be
helpful to use a round-robin response order to get people started, but try to allow the
team to call out reasons as they come to mind. Record them all until you have reached a
full list of 20 reasons.
3. Allow the group to review the list silently for a few moments.
4. Ask each member to identify what s/he thinks might be the root cause of the problem.
Place a checkmark next to the statement as s/he speaks and encourage him/her to
explain his/her reasoning before moving on to the next person.
5. Continue to facilitate the discussion until the group feels that it has identified a potential
root cause.
Key Points
It should be emphasized that this is a brainstorming activity and all responses are
welcome and valid.
You may find the last several reasons are more difficult to come up with, but frequently
the effort is worth it, as the root cause will likely appear near the bottom of the list.
Many problems do in fact have more than one root cause. It is fine to identify more than
one root cause, but do push the group, through reflection and discussion, to narrow the
list to no more than three root causes.
Additional Information
Participants frequently disagree about explanations for the original problem, the sequence of
causes and effects, or the relative importance of various possible causes during the
brainstorming phase of this activity. Rather than allowing debate during the brainstorming of the
list, keep the group focused on listing possible reasons first. When the group reflects individually
to identify possible root causes, it can move past less important disagreements to focus on the
root of the problem.
The group may come up with explanations that are directly contradictory to one another. If this
occurs, record them all rather than immediately dismissing any. The reflection and subsequent
discussion will sometimes reveal which are more likely to be true.
Problem: Our ELL population struggles to meet proficiency on the ELA section of MCAS.

#  | Possible Explanation                                                                 | Root Cause?
1  | ELL students have a wide variety of needs and abilities that are difficult to meet.  |
2  | Programs we have for ELLs are not being implemented effectively in every school.     |
3  | Many teachers have not received enough PD and support to help them work with ELLs effectively. |
5  | The number of ELL students in our schools is increasing faster than we thought.      |
(Remaining rows, numbered through 20, are left blank for the group to complete.)
Problem:

# | Possible Explanation | Root Cause?
(Blank worksheet with rows numbered 1 through 20.)
Additional Information
During the brainstorming, participants may come up with possible causes that do not fit easily
into one of the previously identified categories. This can indicate a need to identify a new
category or broaden an existing category. Do not discard an idea solely because it does not fit
into a previously identified category. The purpose of the major categories is to provide a
structure to guide the brainstorming. These categories should be used to inspire, rather than
restrict, participants’ thinking.
In the early stages of the process, participants often use the activity as an opportunity to vent
frustrations and criticisms. This can be acceptable in the beginning, but be sure to steer them in
a more constructive direction as the activity progresses.
[Example fishbone diagram with cause categories Families, Students, Processes, and
Teachers. Legible sample causes include: not enough materials; we have not changed
specialists with the change in population; we treat many students the same even though
their needs are different; every school seems to be doing something different; some
teachers do not have experience with ELL students; not enough professional development.]
Problem:
2. Working independently, brainstorm factors that influence the situation and write them on
sticky notes. (Allow approximately five minutes).
3. Compare and discuss the factors suggested by different members of the team, adding,
discarding, and revising factors as needed.
Look for duplicates. A factor that was suggested by multiple members is likely to be
relevant and important. Select one and discard the others.
Look for similarities and consider combining ideas that are similar, but not exactly the
same.
Refine ideas that are imprecise.
Consider whether each factor is too specific or too general.
Debate and decide whether each factor is relevant.
4. Arrange the sticky notes on the chart paper in a pattern that indicates how the factors
are related.
6. Study the entire graphic representation with a critical eye, asking questions such as:
Are the relationships between variables shown correctly?
Are there other variables or issues that should be added?
Revise and refine the overall arrangement based on the answers.
8. If there are relationships that you believe are valid and important, but you cannot be
certain that they are, list them on another piece of chart paper and note how you could
investigate them further.
9. Study the graphic representation and identify factors on which to focus solutions by
asking such questions as:
Adapted from How To Conduct Collaborative Action Research, by Richard Sagor, 1992, ASCD,
Alexandria, VA.
[Example diagram for Mathematics constructed response (CR) questions, showing problems
flowing through "Our Rubric for Constructed Responses" to two outcomes: students solve the
problem correctly, or students solve the problem incorrectly.]
Note: This activity is an adaptation of Step Four of the Performance Improvement Mapping
(PIM) process (Identify the most significant causes of the weaknesses in students' knowledge
and skills), available at http://www.doe.mass.edu/sda/regional/pim/.
Directions:
1. Write the inference or conclusion from your data analysis (3.3.1T) where all can see,
e.g., on a flip chart or projected via LCD.
2. Brainstorm all the possible underlying causes that might have contributed to this
outcome. For each potential root cause, write a short summary on a piece of paper and
tape it on the wall where everyone can see and read it.
Note:
The group may want to give individuals silent think/work time before
brainstorming as a group.
Make sure that the written causes are specific enough to be interpreted after the
discussion is over. For example, a cause written simply as "curriculum" does not
describe what is really lacking.
3. Once the brainstorm is complete, consolidate any duplicate or very similar ideas.
However, avoid consolidating causes in ways that make them too broad and vague.
4. Review all the causes and note any that are outside of the direct control of the district,
e.g., those dealing with student behavior, families, or the community. For each of these
causes, discuss the following:
Is this potential root cause important enough for the district to focus time and
energy on as part of an action planning process?
If so, can this cause be stated in terms of something over which the district has
control?
After discussing each of these causes, the Team has two options:
Rewrite the cause in terms of actions the district could take, such as securing
resources, modifying processes, and/or shifting actions of district personnel. (See
examples below).
5. Once all causes have been written in terms that represent things over which the district
has control, sort them into one of three dimensions by moving the papers on the wall. It
may help to have a separate flip chart or wall space designated for each realm.
Core realm: Contains factors that most directly affect student outcomes. These
tend to be classroom-level factors.
Enabling realm: Contains conditions that must be in place in order to make the
core elements successful in affecting student outcomes. These tend to be a mix
of school- and district-level factors.
Supporting realm: Contains conditions that are helpful toward making the core
elements successful in affecting student outcomes. These tend to be a mix of
district- and community-level factors.
Note that the amount of control that teachers and the school have is greatest at the center.
Conversely, district control is greatest in the enabling and supporting realms. The district has
the unique perspective, responsibility, and authority to act at the enabling and supporting levels
in order to make systemic improvements that affect student learning and achievement.
Dimensions of district improvement:

Realm      | Definition                                                                 | Sphere of Implementation | Amount of School and Teacher Control             | Amount of District Control
Core       | Factors that most directly affect student outcomes                         | Classroom                | School and teachers have a great deal of control | District has responsibility, but less direct control
Enabling   | Conditions necessary in order to make activities in core realm successful  | School                   | School has some control                          | District has significant control and leverage
Supporting | Conditions that are helpful in making activities in core realm successful  | District/Community       | School has little control                        | District has some control and leverage
Are there any issues that arose in one brainstorm that are similar to those in
others, suggesting they affect multiple areas within the district?
Do these causes primarily affect a subgroup of students, teachers, or other
stakeholders, or do they affect a much wider segment of the population we
serve?
Consolidate the issues that affect multiple areas or stakeholders. Record these in
worksheet 4A: Far-Reaching Causes, and record the remainder in 4B: Problem-Specific
Causes.
It would be impractical to address all of the causes identified. Therefore, narrow the list of
causes to identify those which can be addressed most productively by the district.
8. Rate each cause based on the impact it is likely to have on student learning and
achievement, and on the amount of control the district has over it. Causes that rate high
on the amount of impact and the amount of district control should become the focus of
subsequent action planning.
Guiding Questions:
1. Which potential root causes have the greatest impact on the work of the district?
2. Which causes does the district have the most immediate control over?
3. What evidence does the Team have to verify its theories about why this problem exists?
Potential Root Cause | Impact on student achievement (1 = minimal, 2 = some, 3 = substantial) | District's control (1 = very little, 2 = some, 3 = a lot) | Evidence
Use this worksheet to record the potential root causes that apply to only one identified problem, function area, or group of
stakeholders. Be sure to indicate the target for each potential root cause.
Guiding Questions:
1. Which of these potential root causes have the greatest impact on the work of the district?
2. Which causes does the district have the most immediate control over?
3. What evidence does the Team have to verify its theories about why this problem exists?
Problem, function area, or stakeholder group | Potential Root Cause | Impact on student achievement (1 = minimal, 2 = some, 3 = substantial) | District's control (1 = very little, 2 = some, 3 = a lot) | Evidence
Directions:
1. Restate the underlying problem and proposed solution articulated in the Problem
Statement.
2. As a group, brainstorm questions about the problem or proposed solution that should be
checked before moving forward. It can be useful to note the underlying assumptions the
group has, e.g., that a certain factor is the most significant root cause, or that a certain
solution will have the greatest impact, and translate those into a question for
investigation. If the list is long, the Team may want to prioritize them.
3. For each question, complete the information below until all are captured and a clear plan
to investigate each is identified. Copy the table as many times as necessary to
document how the Team will address each question it has about the problem or solution.
It is not necessary to consult both research and local expertise for each question.
Example:
Problem or solution under investigation: Teachers don’t get sufficient training and support in
our reading program, so we are going to start a teacher mentoring program.
Question we have:                              Lead Investigator:
Research sources to consult:
Local expertise to consult:
Date for completion:

(Copy this table for each question the Team investigates.)
Purpose: To give the District Data Team a systematic way to capture problems.
Related Documents: 4–Knowledge Module
Time: Ongoing

Problem ID Number | Problem Keywords | Full Problem Statement (or summary) | Academic Content Standard(s) | Subject Area(s) | Team Investigating | Date Begun | Date Completed | Results Available (location)
Example row:
Problem ID Number: 0
Problem Keywords: K–5, ELA, Mentoring, Training, Assessment
Full Problem Statement:
• Many third grade students at our school do not read at grade level.
• We believe that this is a result of teachers not having sufficient training in our reading
program and not accurately measuring students' reading levels in grades K–3.
• We want all third graders at our school to read at grade level or above.
• We will start a teacher mentoring program focused on reading and implement more
rigorous reading assessments in the primary grades.
Academic Content Standard(s): N/A
Subject Area(s): ELA
Team Investigating: Reading Intervention
Date Begun: 13-Jan-09
Date Completed: (blank)
Results Available (location): Reading Intervention
(Rows numbered through 35 are left blank for additional problems.)
Collected Research and Practice Literature: Articles and Sources

Associated Problem ID Number(s) | Associated Problem Keywords | Title of Article, Study, or Other Item | Source: Publisher or Website | Location or Person with Copies Available? | Date Added to this List
0 | | | | |
MODULE 5: ACTION
District Data Team Toolkit
Table of Contents
Introduction
Where Are We Now?
Module Objectives
Crafting a Logic Model
Why Develop a Logic Model?
Components of a Logic Model
Building a Logic Model
Articulating Meaningful Measures
Putting It All Together
Taking Action
When Are Action Plans Necessary?
Module Summary
Action
INTRODUCTION
[Module banner: Getting Ready → Inquiry → Information → Knowledge → Action → Results]
MODULE OBJECTIVES
The Action module will help a District Data Team:
The resulting model gives the reader a clear sense of how the district
perceives the problems it identified, the strategies selected to address
those problems, the supports needed to implement those strategies
successfully, and the outputs and outcomes that should occur if the
strategies are implemented effectively.
[Logic model diagram: INPUTS → OUTPUTS → OUTCOMES → IMPACT, with components numbered 1–6.]
Adapted from: W.K. Kellogg Foundation (2004). Logic Model Development Guide. Battle Creek, MI: Author.
2. Next the Team articulates the overall goals the initiative is intended to
accomplish or, stated another way, the impact the initiative will have
on the culture of the district, its schools, or its classrooms. This
section captures the wishes, dreams, and general vision for the
target: the sustained effects or consequences the district expects
to see over a multi-year period. The Team should seek the support of
the superintendent in developing appropriate goal statements
because they should align with the district’s strategic goals. (These
can also be drawn from 4.1.1T).
3. Having identified the impact of the initiative, the district articulates the
outcomes, or measures of change, that indicate progress is being
made toward those goals. These measures provide evidence of shifts
in the skills, knowledge, and behavior of the adults and students
targeted by the strategies. They articulate in specific and measurable
terms what will change, for whom, by how much, and by when.
4. Once the Team has clearly articulated its outcomes, it then goes back
to the second column and lists its strategies—the specific means,
methods, or approaches to solving the identified problem(s). These
should represent promising practices drawn from research, local
knowledge, and local expertise. Sources should be noted if possible.
(These can also be drawn from 4.1.1T if the Team has completed it).
5. What the Team identifies for resources answers the question: What
supports are available to the district or our schools to implement our
strategies? While the most obvious resource is funding, a district's
primary resource is its personnel. The Team should consider: How is
the time of teachers, administrators, and support staff throughout the
district being used, and how might it be used differently to address the
strategy or strategies?
These tools will guide the District Data Team in crafting and refining a
logic model to guide implementation and evaluation of its strategies.
By reflecting on these questions, a Team can narrow the focus and scope
to those measures that will best document and communicate progress
toward outcomes and impact.
As mentioned earlier in this module, measures of change in particular
often describe things that can be hard to quantify easily and effectively.
However, well-defined measures can provide a powerful focus for all
involved in the effort.
1 See Module 6: Results for more information on the evaluation process.
Similarly, the Team should embed into its work the means to collect,
monitor, and evaluate the evidence generated by these measures, rather
than waiting until the end of implementation to do so. Well-defined
measures are SMART:
Specific: What are the specific criteria against which the outcome
will be judged?
Measurable: What will be the method or tool used to measure
progress?
Action-oriented: Is there consensus among stakeholders that this
is a worthy outcome?
Realistic: Is this outcome sufficiently bold, yet still achievable
given available resources, knowledge, and time?
Timed: Does the indicator specify by when the outcome will be
achieved?
Evidence for these measures, like other data, comes in a variety of forms.
The Team might consider articulating data that represent the four
domains of data outlined in Module 1: Getting Ready—demographics,
perceptions, and school or district processes, as well as student
outcomes. Another way to think of it is to consider data that can be
counted (such as assessment data), seen (such as classroom
observation data), and heard (such as stakeholder feedback).

A diverse array of measures will provide a richer picture of the effect of
the strategy than measures that capture the same or similar evidence,
and will facilitate self-monitoring and measurement of success in each
area. Sample measures include:

Achievement, assessment, improvement, and percent
completion data
Stakeholder perceptions, such as a survey of teachers,
administrators, school committee, students, and/or families
Self-assessments, such as the Common Planning Time self-
assessment or Essential Conditions rubric2
Observation data, such as Learning Walkthrough evidence
2 For more information on these and other district support resources, visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Organizational data
External review data (such as MASBO or PQA)
TAKING ACTION
In developing its logic model, the District Data Team identified a strategy
(or, more likely, a set of strategies) that will be used to address the
problems it identified. Now the Team is ready to begin taking action.
In some cases, the logic model alone, with clear strategies and
measures, will be sufficient to guide implementation and monitoring of the
work, and the Team will not need any additional detail. In other cases, an
action plan may be useful or even necessary.
In considering whether or not to craft an action plan, the Team might ask:
Do action plans already exist for the strategies the Team has
identified?
Would an action plan significantly enhance the district’s ability to
delegate steps and/or monitor their completion?
Generally speaking, action plans are only required for new strategies or
strategies that require changes to function as intended. For example, a
district initiating professional development in sheltering instruction for
English language learners may need to specify the action steps required
to implement the training to make sure it occurs. On the other hand, if
walkthroughs were selected as a strategy and the district already uses
them regularly and effectively, an additional action plan may not be needed.
If the District Data Team determines that an action plan would be a useful
tool for the situation it is addressing, it will want to reference the attached
documents that describe the components of an action plan and guide the
Team in building one.
These tools will guide a District Data Team in crafting and refining an
action plan to guide its work.
MODULE SUMMARY
Module 5: Action builds the capacity of the District Data Team to craft a
logic model that provides a rationale for how the strategies selected to
address the identified problems will work. The logic model is the district's
justification for its approach: an argument for why existing strategies,
given some adjustments, and possibly some new strategies as well, will
achieve better results than in the past. Although the Team may not be
the primary author or owner of the district's logic model or action plans, it
will certainly play an important support role in their creation.
This module also guides the Team on when and how to develop action
plans to support the implementation of these strategies.
References
1. Race to the Top Executive Summary, p. 2. U.S. Department of Education, Washington, D.C., November 2009.
2. W.K. Kellogg Foundation (2004). Logic Model Development Guide. Battle Creek, MI: Author.
3. Bernhardt, V. L. (2004). Data Analysis for Continuous School Improvement. Larchmont, NY: Eye on Education.
For more information on this and other district support resources, or to share feedback on
this tool, visit http://www.doe.mass.edu/sda/ucd/ or email districtassist@doe.mass.edu.
Purpose: Articulate to internal and external stakeholders, in plain language, the connection between the problem statements identified and the strategies identified to address them.
Related Documents: 5–Action Module; 5.1.2T: Logic Model Checklist
Description: Use the template below, the guidance in the text, and the checklist to craft a logic model that shows the links between the team's ideas about improvement, strategies, resources, outputs, outcomes, and the intended impact on district and/or school practices.
Time: Varies. 2–3 hours to start, but may take many meetings to refine.
Directions:
Follow the numbering sequence below and respond to the guiding questions in each column. For example, begin in the first
column by naming the problem and proposed solutions; then jump to the last column and name the end goal (the desired
impact) before identifying the measures of change that will demonstrate improved outcomes for adults and students.
It is not necessary to have a one-to-one relationship between items in one column and items in the next. For example, a
district may develop a logic model with 3 strategies, 12 resources, 5 key measures of implementation, and 3 key measures of
change, in order to reach 1 goal.
See the text in Module 5 for additional guidance on completing the logic model.
5.2.1T and 5.2.2R provide more detailed guidance for crafting and refining meaningful measures of implementation and change.
The template's columns, listed here in the order in which they should be completed (they appear left to right in the template as 1, 4, 5, 6, 3, 2):
1. Problem Statements & Proposed Solutions: What is our proposal for addressing the problem(s) we identified?
2. Goals (Desired Impact): What are the sustained effects or consequences we expect to see over a multi-year period?
3. Measures of Change (Outcomes): What intermediate and longer-term results do we expect to achieve as measured by changes in the skills, knowledge, and behavior of adults and students?
4. Strategies: What are the specific means, methods, or approaches we will use to solve the problem(s) we identified?
5. Resources: What supports are available to the district or our schools to implement our strategies?
6. Measures of Implementation (Outputs): How will we know whether the strategies we described were implemented by the adults?
Purpose: To help a Team refine or revise a logic model that will articulate its theory of action for addressing an identified problem.
Related Documents: 5–Action Module; 5.1.1T: Logic Model Template
Description: This checklist can guide a District Data Team in designing or refining a logic model to guide its action and evaluation plans.
Time: 1–2 hours.
Strategies Y/N
Are strategies described at the district level and at the school or classroom levels, as needed, to address the problems that were
identified?
Taken as a whole, are the strategies articulated in the logic model sufficiently bold and ambitious?
Taken as a whole, are the strategies articulated in the logic model sufficiently attainable?
Taken as a whole, can the strategies withstand scrutiny?
Taken as a whole, do the strategies represent a justifiable use of resources in light of current strategies and initiatives in the district?
Are existing strategies described in ways that indicate how they will achieve different results than in the past?
Do the descriptions of the strategies communicate to internal and external stakeholders how the strategy will address the issues
identified?
Resources Y/N
Do the resources identify the full range of supports needed by the district or its schools to implement the strategies?
Note:
This activity may be conducted with the District Data Team, or with other stakeholders
involved in the improvement effort being discussed.
This activity is most useful if the Team has drafted a logic model, action plan, or evaluation
plan prior to beginning.
Measures are also known as indicators or benchmarks, and can be short-term (formative) or
long-term (summative).
Directions:
Use this checklist to reflect on and refine the measures you have crafted to demonstrate
implementation and change.
First respond to the questions separately for each individual measure you have crafted, then
reflect on all the measures collectively in relation to the identified strategy (or strategies) or
action steps.
Examples:
Unrefined measure: Increase the percentage of students who pass the MCAS tests on the first
attempt.
Refined measure: To increase the percentage of students who pass the MCAS ELA test on the first
attempt by 4 percentage points each year from 2010 to 2014.
Directions:
2. Using this document as a guide, each Data Team member should create a well-written
measure from each of the following scenarios by:
3. As a Data Team, reach consensus on the most appropriate measure for each scenario.
Possible "answers" are included at the end of this document, which you may want to
consult AFTER your discussion!
Typical Measure
Increase the percentage of students who pass the MCAS tests on the first attempt.
A clearer and more useful measure would address the following questions:
1. What will change?
The percentage of students who pass the MCAS ELA test.
2. For whom?
3. By how much?
What percentage passed in 2005–06? (57% in ELA; 58% in mathematics)
(ELA: 100% (in 2014) – 73% (in 2006–07) = 27 percentage points difference;
2008 through 2014 = 7 test administrations remaining to show progress; 27/7 ≈ 3.9 percentage
points/year needed to reach the target if linear improvement is assumed.)
4. By when?
Annual improvement target?
By 2014?
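The linear-rate arithmetic above can be captured in a small helper. This is an illustrative sketch, not part of the Toolkit; the function name and inputs are hypothetical, and it assumes one test administration per year:

```python
def annual_target(current_pct, goal_pct, current_year, goal_year):
    """Percentage points of improvement needed per year to reach the goal,
    assuming linear improvement and one test administration per year."""
    administrations_remaining = goal_year - current_year
    return (goal_pct - current_pct) / administrations_remaining

# ELA example from the text: 73% passing in 2006-07, 100% goal by 2014
rate = annual_target(73, 100, 2007, 2014)
print(round(rate, 1))  # roughly 3.9 percentage points per year
```

The same helper applies to the scenarios that follow, such as moving a graduation rate from 80% to 95% over a set number of years.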
Elements:
What will change?
For whom?
By how much?
By when?
Statement:
Scenario 2: In response to chronically low mathematics performance across the district, the
superintendent reallocated resources from the professional development budget to the salary
account to enable hiring mathematics coaches to provide embedded professional development
for the middle school mathematics teachers in one of the two district middle schools. The
superintendent hoped to see, within three years, a significant increase (at least
10 percentage points) in the percentage of proficient students among the middle
school students whose teachers had participated in the embedded professional
development provided by the mathematics coaches.
Elements:
What will change?
For whom?
By how much?
By when?
Statement:
Scenario 3: The superintendent of the Scenic Cove School District reviewed cohort graduation
data (same students grade 9 to grade 12) for the class of 2009 and was shocked to see that
only 80% of the cohort graduated on time, while the state average was 95%. The
superintendent instructed the assistant superintendent to work with the District Data Team to
develop a measurable improvement target for the class of 2012, in order to bring the district in
line with the current state graduation rate.
Elements:
Scenario 5: During the district principals’ meeting, the principals of the four district high schools
noted that the data displays the District Data Team provided clearly showed a positive
relationship between high absence and low performance on the grade 10 MCAS ELA test. On
the 2009 MCAS Mathematics test, 30% of the grade 10 students had been absent for 10 or
more days prior to test administration. Of these, 90% scored at the failing level. The principals
worked together, with support from the District Data Team, to craft an improvement goal for
attendance in their schools that would have no student with 10 or more absences prior to the
2011 MCAS test administration date.
Elements:
What will change?
For whom?
By how much?
By when?
Statement:
Scenario 6: While investigating the cohort graduation rate, the assistant superintendent noticed
that students who were retained in grade 9 generally didn’t graduate with their class. Five
percent of the students in the class of 2009 cohort had been retained in grade 9 and only 10%
of these students graduated with their class. To develop an action plan to address this problem,
the assistant superintendent must craft a measurable improvement target for grade 9 retention
for the class of 2013.
Elements:
What will change?
For whom?
By how much?
By when?
Statement:
Scenario 2
Elements:
What will change? Percent of students proficient in MCAS Mathematics
For whom? Middle school students
By how much? 10 percentage points
By when? Within three years
Statement: To increase within three years the percentage of middle school students in the
target school who score at the proficient level or above on the MCAS Mathematics test by 10 percentage points.
Scenario 3:
Elements:
What will change? Graduation rate
For whom? Class of 2012
By how much? From 80% to 95%
By when? 2012
Statement: To increase the cohort graduation rate from 80% to 95% for the class of 2012.
Scenario 4:
Elements:
What will change? The gap between proficient male and female students
For whom? Male students at all grade levels
By how much? To equal the percentage of proficient female students
By when? 2012 MCAS test administration date
Statement: To increase the percentage of proficient male students at each grade level to equal the
percentage of proficient female students by the 2012 MCAS test administration date.
Scenario 5:
Elements:
What will change? The percentage of students with 10 or more days of absence
For whom? Grade 10 students
By how much? No student with 10 or more absences
By when? 2011 MCAS test administration date
Statement: To decrease to 0 the percentage of grade 10 students with 10 or more days of absence prior
to the 2011 MCAS test administration date.
Scenario 6:
Elements:
What will change? Grade 9 retention rate
For whom? Class of 2013
By how much? By 5 percentage points
By when? 2013
Statement: To decrease the grade 9 retention rate for the Class of 2013 by 5 percentage points.
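The Elements-to-Statement pattern used throughout the answer key above is mechanical enough to sketch as a template. The class and field names here are illustrative, not part of the Toolkit:

```python
from dataclasses import dataclass

@dataclass
class Measure:
    direction: str  # "increase" or "decrease"
    what: str       # What will change?
    whom: str       # For whom?
    how_much: str   # By how much?
    when: str       # By when?

    def statement(self) -> str:
        # Assemble the four elements into one measurable statement.
        return (f"To {self.direction} {self.what} for {self.whom} "
                f"by {self.how_much} by {self.when}.")

# Scenario 6 from the answer key
m = Measure("decrease", "the grade 9 retention rate", "the Class of 2013",
            "5 percentage points", "2013")
print(m.statement())
# To decrease the grade 9 retention rate for the Class of 2013 by 5 percentage points by 2013.
```

Filling in all four fields forces the specificity the Toolkit asks for; a measure that leaves a field blank is not yet well written.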
5.2.2R: Elements of a Well-Written Measure—Version 1.0 5/5
ACTION PLAN TEMPLATE 5.3.1T
Purpose: To document the series of steps needed to ensure that the strategies identified by the District Data Team to address the identified problem areas are implemented as intended.
Related Documents: 5–Action Module; 5.3.2T: Action Plan Checklist
Description: Develop an action plan (if needed) to implement new strategies or to implement existing strategies more effectively.
Time: 30–45 minutes for each strategy requiring an action plan.
Note: As discussed in the text of Module 5: Action, it is not always necessary to develop an action plan. In considering whether or
not to craft an action plan, a district might ask:
Do action plans already exist for the strategies we have identified?
Would an action plan significantly enhance the district’s ability to delegate steps and/or monitor their completion?
Generally speaking, action plans are only required for new strategies or strategies that require changes to function as intended.
In relation to the logic model, an action plan has the following components:
Strategy:
Begin by asking:
If the district intends to leverage an existing strategy to address the problems it identified, does the strategy need to be
refined, adjusted, or improved prior to its implementation?
If the district identified a new strategy to address the problems it identified, does the new strategy require multiple
steps to be implemented?
If the response to either of the above questions is “NO”, then the strategy does not require an action plan. The District Data
Team should focus its time and energy on developing action plans for the other strategies it identified, or proceed to Module 6:
Results to develop an evaluation plan.
If the response to either of the above questions is “YES”, use the template below to craft a plan that will guide implementation of
the activities:
Begin by articulating the overarching strategy that the action plan will support.
Respond to the guiding questions in each column, proceeding from left to right. For example, begin in the first column by
naming the specific action steps. For each action step, indicate the necessary and available resources, the measures of
implementation (which can be drawn from the logic model, if it exists), the owner, and the deadline.
Each action step should have corresponding information in each of the other columns. For every action step, there should be
resources, at least one measure of implementation, an owner, and a deadline for completion.
Note:
If the Team has crafted a logic model (5.1.1T and 5.1.2T), it should have that available for reference, as the strategies,
resources, and many of the implementation measures will have already been identified.
5.3.2T: Action Plan Checklist provides additional guidance on refining the completed plan.
1. Indicate the strategy—the specific means, method, or approach to solving the identified problem(s). If the Team has
completed a logic model, these are outlined in the second column. If the Team has not completed a logic model, it will need to
determine the best approach to addressing the identified problem. Since the strategy is the driver of the action plan, it is
essential that the Team think carefully about this component.
2. List the action steps: the major steps that must be taken to implement the strategy. List them in
order of completion, note which need to be completed before others, and describe each
sufficiently so that others can understand it and carry it out.
In articulating action steps, a district should stay focused on the big picture, naming only the most significant, far-reaching
steps that need to be monitored as evidence of progress toward the goal. However, the owner of a specific action step may
wish to add detail to the action plan to guide his or her particular work, e.g., if he or she has to manage a team of people to
get the work done, or has many details to track.
3. Indicate the resources needed to implement one or more action steps, if needed. In some cases, key resources may be
lacking or not yet allocated to the project, in which case one activity would be to secure those resources. For example, ESE
offers a wide range of technical assistance, but accessing that assistance for a particular project would require someone from
the district approaching the appropriate office to see what is available. As with the strategy, if the Team has completed a logic
model, it will have already identified this information.
4. Indicate the measures of implementation that tell when the action step or strategy is fully realized or carried out. For
example, the measure of implementation of a professional development strategy might be that a certain percentage of
teachers receive training over a specified period. Again, if the Team has completed a logic model, it will have already
determined what will serve as evidence that key action steps have been implemented, and can use those measures as a
starting point. However, since the action plan by definition provides more detail, the Team may want to add additional
measures of implementation that capture a finer grain of detail.
5. Indicate the owner—the individual most closely responsible and accountable for a given action step. It is essential that this be
a specific person and that they have the resources, capacity, authority, and support required for completing the step.
6. Give the deadline by when the action step will be completed. Completion of the last step signifies the date by which the
strategy is expected to be fully operational and by which measures will be available for analysis.
Strategy:
Action Steps: What steps must be taken to implement our strategy?
Resources: What specific supports are needed to implement this action step?
Measures of Implementation (Outputs): How will readers of the plan know the action step or strategy is fully realized or carried out?
Owner: Who is most closely responsible and accountable for taking each action step?
Deadline: By when will the step be completed?
Purpose: To craft an action plan to support the implementation of strategies identified in the previous module, Knowledge.
Related Documents: 5–Action Module; 5.3.1T: Action Plan Template
Description: This checklist can guide a District Data Team in designing or refining an action plan to guide implementation of a strategy.
Time: 1–2 hours, depending on the number of action plans to be developed.
Directions:
If the response to either of the above questions is “NO,” then the strategy does not require an action plan. The District Data
Team should focus its time and energy on developing action plans for the other strategies it identified, or proceed to Module 6:
Results to develop an evaluation plan.
If the response to either of the above questions is “YES,” use the template below to craft a plan that will guide implementation of
the activities.
Note: If the Team has crafted a logic model (5.1.1T and 5.1.2T), it should have that available for reference, as strategies, resources,
and many of the implementation measures will have already been identified.
If this is an existing strategy, is it described in a way that indicates how it will achieve different results than in the past?
Does the description of the strategy communicate to internal and external stakeholders how it will address the issues identified?
Are just the key, far-reaching action steps that need to be monitored as evidence of progress toward the goal identified?
Are action steps sufficiently described so that others can understand them and carry them out?
Resources Y/N
If necessary, are any special supports (technology, materials, funding, infrastructure, people, policies, political support, etc.) required
to implement the strategy indicated?
Do outputs document technical steps for which the actions are clear and the results easily quantified?
Are outputs specific in terms of what will take place, with/for whom, to what extent, and by when?
Owner Y/N
Is each action step ascribed to an individual person (not a group, title, or team)?
Does each person assigned an action step have the resources, capacity, authority, and support needed to complete the step?
Deadline Y/N
Table of Contents
Introduction
Where Are We Now?
Module Objectives
Monitoring Progress
Conducting an Evaluation
Analyzing the Evidence
Communicating Results
Evaluation as Continuous Improvement
Module Summary
Results

INTRODUCTION
(Module sequence: Getting Ready, Inquiry, Information, Knowledge, Action, Results)
Once a District Data Team, or any other district team, has begun
implementing a logic model or improvement plan, it will want to monitor its
progress toward the desired goal. The Results module can help a Team
build capacity to evaluate the desired outcomes, monitor its progress on a
given strategy (or strategies), and communicate those results to various
stakeholders.
MODULE OBJECTIVES
The Results module will help a District Data Team:
Since what gets measured gets done, it is wise not to wait until the end of
implementation to draft an evaluation plan. Instead, the Team should
begin thinking very early in the implementation process—if not before
implementation even begins—about the most important evidence to
measure.
An evaluation plan can guide the overall evaluation process, where the
Team reflects on, and reports publicly, the extent to which new skills,
knowledge, and expertise have been acquired by the targeted adults
and/or students, and the extent to which they are having an impact on
student achievement and organizational culture.
By considering these questions, the Team can narrow the focus and
scope of its evaluation to those measures that best document and
communicate progress toward outcomes and impact.
If the Team is unsure how the data will be used or by whom, they may not
be worth collecting. If the Team has identified a substantial amount of
data to collect, it should seek guidance from district leadership in setting
priorities appropriately. For example, while it is wise to consider
responding to the questions stakeholders ask, the Team might not want
to get distracted by answering all of their questions, but rather focus on
those that will most further the district's vision, mission, and strategic
plan.

If the District Data Team has crafted a logic model (5.1.1T and 5.1.2T),
then much of its work in deciding what to evaluate is already done, as the
logic model articulates the essential outcomes, or measures of change, in
adult and student practice that the district will look for as it implements its
strategy or strategies.

If the Team does not have a logic model to work from, it will need to think
strategically about what to evaluate, when, and why. The Team should
consult with district and school leadership in order to select the areas that
will provide the most useful information for the Team, district leadership,
and other stakeholders.
MONITORING PROGRESS
As soon as the first action step is underway, the Team (or some other
entity) can begin monitoring the progress of the district’s work. It will likely
monitor the implementation of the actions related to the district’s
strategies. Likewise, the Team may also begin monitoring those
strategies for efficacy and impact. However, the two forms of monitoring
should not be confused:
This work of monitoring both implementation and change can begin once
the strategies are in motion, and may in fact coincide with one another,
making it all the more important to distinguish between the two.
Goal:
Activity 6.1 provides a template for an evaluation plan. Again, if the Team
has crafted a logic model, it already has the majority of elements needed.
If the Team has not yet done this, the evaluation plan template will guide
it to think about the necessary information.
Once the evaluation plan is in place, the team is ready to begin collecting
and analyzing relevant data.1 Because a strategic plan or logic model is
based on a causal chain of events, each link dependent on the one
before it, the focus areas may need to be examined chronologically: one
cannot look for impact without outcomes; one cannot look for outcomes
without outputs; one cannot look for outputs without inputs.
Naturally, the Team will want to evaluate the outcomes of a strategy once
it has been fully implemented. If it has engaged in analyzing outcome
data along the way, this final evaluation will not take as much additional
effort as it would if the district had not been evaluating the early evidence
of actual change in practice.
1 The Data Analysis Protocol (3.1.1T) in Module 3 can help the Team shape its evaluation of outcome data.
adults. If this outcome data does not demonstrate any movement toward
the desired goal, the Team should first consider:
For example, if a District Data Team has evaluated all aspects of its logic
model and has deemed that, in fact, it did not identify the real problem, it
may want to visit or revisit the guidance in earlier modules to see if it was
asking the right questions in the first place. The District Data Team Self-
Assessment (0.1.1T) can help a Team understand its strengths and
weaknesses in regard to inquiry, action, and results, and can direct it to
tools and resources in other modules in this Toolkit that could be useful at
this point of the process.
2 Module 1: Getting Ready provides guidance for managing the change effort and addressing some of the concerns that stakeholders may raise during the process.
EVALUATION AS CONTINUOUS IMPROVEMENT
For example, if the District Data Team did not see the desired results, it
may need to modify its evaluation plan. It may find that it does not have
enough quality data, or that data are not presented in a way that
highlights key trends or outliers. The Team may want to reflect on
whether it identified the best action steps and resources for the strategy,
or if the strategies and action steps it decided on in the first place were
the most appropriate. Continuing to reflect back on the process, the Team
may even find that the initial focusing question used to frame the inquiry
process might not have been the best one, and it may want to re-engage
in inquiry with a different question in mind.
MODULE SUMMARY
References
W.K. Kellogg Foundation (2004). Logic Model Development Guide. Battle Creek, MI: Author.
Purpose: The evaluation plan provides structure and guidance for the Team as it considers the impact of a certain strategy (or strategies) on the identified problem.
Related Documents: 6–Results Module
Description: The District Data Team completes the evaluation plan before implementation of a strategy begins, or very soon after, to articulate the data it will collect, when, and from where, so that it can evaluate whether the strategy and action steps are having the desired effect on student and adult outcomes.
Time: 1–2 hours to create. Ongoing time to gather and analyze data.
Notes:
If the Team has crafted a logic model (5.1.1T and 5.1.2T), it should have that available for reference, as the strategy and
related measures of change (outcomes) will have already been identified.
If the Team has completed a data inventory (1.5.1T), it can reference that for information on available data, their location, and
the ease with which the Team can access the data.
Directions:
1. Begin by articulating the goal or desired impact that the Team is trying to achieve.
2. Respond to the guiding questions in each column, proceeding from left to right.
Begin in the first column by naming the specific measures that will serve as indicators that the strategy is working. It is
essential that this includes any measures of change (outcomes), but in some cases this may also include measures of
implementation (outputs) and even resources.
Then for each measure indicate the specific evidence that will be collected, its source or location, the date for analysis
by the Team, and the person responsible.
Each measure of change should have corresponding information in each of the other columns.
3. Use the plan to ensure the Team has the right data at the right time and is prepared to analyze it for results.
See the text in Module 6 for additional guidance on completing the evaluation plan.
See Module 3: Information for guidance on analyzing data.
Template
Overview
One effective strategy for communicating results is the use of data walls. A data wall is a visual
representation of data related to a specific question or problem. It is comprised primarily of
numbers, charts, and diagrams, using only enough text to annotate the data and articulate the
inferences and conclusions that the Team has agreed on. A data wall may also capture any
questions that have emerged as a result of the inquiry process. Ultimately, a data wall should be
dynamic, interactive, and evolve over time as new data are added and new conclusions drawn.
An interactive data wall contains data that will be updated and manipulated often over a period
of time, making the data wall a "living" display. For example, a district might create a wall to
track the percent of students across the district who are meeting district-wide or statewide
standards. As new assessment data come in, the Team could engage stakeholders in
updating the wall and noticing shifts in outcomes. Such a data wall might be used with
stakeholders most closely involved with implementing an action plan and monitoring its progress.
Consolidating in one display a range of information that summarizes the district's focus for
inquiry and data-driven action can serve several purposes.
Data walls can be portable (such as on a tri-fold science fair display board or rolling bulletin
board) or may be a permanent installation on a wall. It is increasingly common to find data walls
in the lobbies and offices of schools, often displaying data related to student assessment
scores. The District Data Team might consider the value of displaying data walls in central office
spaces as well, including areas devoted to functions such as finance, operations, and human
resources. The Team may also consider having the same display in all buildings district-wide.
1. Find a data set that is closely related to the focus of inquiry. Display the data in a space
where they will be easily accessible and easily viewed. After the initial data are posted, and
once action is taken and similar data are later collected, the data on the interactive data
wall can be updated. A district's effort to monitor third grade students' reading
achievement at the end of each marking period is one example of a dynamic data set that
could be updated frequently. The periodic monitoring of student achievement would
inform the district as to whether the literacy strategies employed were increasing student
reading ability and whether the district was meeting its goal for the number of students
proficient in that grade.
2. Identify the individual unit that will be monitored. In schools, this is the student. However,
a district-wide data wall might use the classroom, grade level, or school as the unit to
monitor. Use small magnets, post-its, or another material to create a marker for each
unit, e.g., each classroom, grade level, or school.
a. It may be best to label each unit in a way that maintains privacy, while also
allowing the identification of specific units, e.g., specific classrooms.
b. The Team might consider using additional coding of the markers to add a third or
fourth dimension of data. For example, green markers could represent classrooms
with teachers in their 1st or 2nd year, yellow could represent those in their 3rd–6th
year, and orange could represent those with 7 or more years of experience.
[Sample data wall grid: one column per school (School A, School B, School C, School D),
with a marker for each unit placed according to its results.*]
*Depending on district size, the unit of analysis may be by classroom or by grade level.
In this grid, an audience can view sets of classroom-level data from across the district, showing
how many students scored at benchmark in each class. The district-wide literacy assessment
could then be re-administered and the classroom markers redistributed according to the number
of students achieving benchmark. The movement of data points on the data wall gives
stakeholders an opportunity to connect with the data and, given the nature of the display, to see
a picture of both aggregated and disaggregated results.
4. As data are updated, the Team can track the movement of individual units from one
category to another.
Note:
a. Prior to manipulating or updating data, document the existing data wall in some form,
such as with a picture, so the Team can reflect back on changes over time.
b. Ideally, when engaging with the data, those closest to the data should be the ones to
manipulate and update the data wall.
This is an outline of the major components of an evaluation report. The associated activities or
resources in the Toolkit are indicated in parentheses. If the Team has completed any of the
activities listed, it will want to have those available for reference, since much of the information
in the report derives directly from them.
I. Overview: A summary that describes the problem being addressed by the action plan.
A. Original focusing question (and potentially also the related clarifying questions)
(See 2.1.1T)
B. Summary of initial findings from the original data displays and data overview
(See 2.5.1T and 2.5.2T)
C. Description of the suspected cause of the problem (See 4.1.1T)
D. Goal or desired impact (See 4.1.1T and 5.1.1T)
II. Implementation Description: What the district did to address the problem.
A. Brief narrative (1–2 paragraphs) identifying the strategy and major steps taken to
address the problem (See 4.1.1T, 5.1.1T, and 5.3.1T)
B. Description of key resources that contributed to the effort (See 5.1.1T and
5.3.1T)
III. Evaluation Results: What effect implementing the strategy had.
A. Data displays depicting the measures of implementation (outputs) and the
measures of change (outcomes), highlighting the acquisition of skills, knowledge,
and expertise, as well as shifts in habits and beliefs related to teaching and
learning. (See 2.4.1T through 2.4.4R)
B. Short narratives describing findings from the analysis of these data (See 3.3.1T)
IV. Recommendations and Next Steps: How the district will apply what it learned.
A. Identification of new focusing questions (See 2.1.1T and 3.3.1T)
B. Identification of immediate next steps to re-enter the Data-Driven Inquiry and
Action Cycle
Original Focusing Question
Summary of Initial Findings
Suspected Cause of the Problem
Description of Strategies and Major Actions Taken
Description of Key Resources
Use this section to summarize your results with data displays and written descriptions of your findings.
Attach pages as necessary.
New Focusing Questions
Next Steps
(Such as: new team formulation, creation of new data displays and data overviews,
audiences for communication…)