Section 1: Data Strategy in Education Agencies
State and local education agencies (SEAs and LEAs) regularly collect data for multiple purposes,
and data collection and reporting may be conducted by many different individuals within an
agency: teachers, administrators, analysts, or even students themselves. Data-related activities
must be managed and coordinated to focus available resources where they are most needed,
and in the most efficient and cost-effective manner. Additionally, processes must be put into
place that provide the foundation for sound management and policy decisions about which
data collection and reporting initiatives to pursue. Such decisions must be based on adequate
information and must include the timely involvement and participation of stakeholders.
Given the complexity of data collection and reporting, SEAs and LEAs increasingly are
developing and adopting data strategies to clarify and maximize the purpose of their education
data and effectively collect, manage, use, and protect those data. Section 1 of this resource
provided an overview of data strategies in education agencies. This section discusses best
practices for implementing one aspect of a data strategy—data collection and reporting.
It can be helpful to consider the information lifecycle when planning a strategy for data
collection and reporting. The information lifecycle is the series of steps needed to properly
plan for, execute, and finalize a data collection and the resulting uses and releases of data.
The six phases of the information lifecycle are listed below; an illustrative sketch of how an
agency might represent these phases follows the list.
• Phase 1: Definition, Planning, and Development
• Phase 2: Data Collection
• Phase 3: Verification and Processing
• Phase 4: Analysis and Use
• Phase 5: Dissemination
• Phase 6: Disposition
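These phases can also be represented explicitly in an agency's internal tooling so that each
collection's status is visible at a glance. The sketch below is purely illustrative rather than part
of the Forum's guidance; the class names, the advance() helper, and the attendance example are
assumptions introduced here.

    from dataclasses import dataclass
    from enum import IntEnum


    class LifecyclePhase(IntEnum):
        """The six phases of the information lifecycle listed above."""
        DEFINITION_PLANNING_DEVELOPMENT = 1
        DATA_COLLECTION = 2
        VERIFICATION_AND_PROCESSING = 3
        ANALYSIS_AND_USE = 4
        DISSEMINATION = 5
        DISPOSITION = 6


    @dataclass
    class DataCollection:
        """Minimal record of where a single collection sits in the lifecycle."""
        name: str
        phase: LifecyclePhase = LifecyclePhase.DEFINITION_PLANNING_DEVELOPMENT

        def advance(self) -> None:
            """Move the collection to the next phase; Disposition is the final phase."""
            if self.phase < LifecyclePhase.DISPOSITION:
                self.phase = LifecyclePhase(self.phase + 1)


    # Example: a hypothetical attendance collection moving from planning into collection.
    attendance = DataCollection("Daily attendance collection")
    attendance.advance()
    print(attendance.phase.name)  # DATA_COLLECTION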
Data do not exist in isolation. They are representations of the status of different parts of
an agency, and they are interrelated and influence one another. Similarly, the clarity and
effectiveness with which the data move through the phases of the lifecycle affect their quality
and usefulness throughout the process. Because a data strategy allows an organization to
consider how the collection and reporting of particular data can improve the organization’s
functioning and allow it to reach specified goals, all parts of the lifecycle should be considered
when planning and implementing that strategy. Across all phases, data need to be collected,
managed, and used in ways that maintain their integrity, quality, and intended purpose.
Agencies should avoid collecting personally identifiable information (PII) or other sensitive
data unless they are absolutely necessary to the collection. For example, PII likely is not
necessary to a collection intended to assess the school or grade as a whole. Collecting
overlapping information on different surveys or data collections can increase opportunities for
outsiders to crack security measures on more sensitive items.

The following practices, drawn and adapted from the Federal Data Strategy
(https://strategy.data.gov/), can help agencies plan a data collection; an illustrative sketch of how
these items might be recorded follows the list.

1. Document the circumstances that make the collection of information necessary, including
any legal or administrative requirements.
2. Indicate as specifically as possible how, by whom, and for what purpose the data will be used.
3. Determine whether available data
can be used to meet an emerging information need before initiating a new collection.
4. Identify required data collection activities, as well as the accuracy and specificity
necessary to achieve collection objectives.
5. Analyze the costs and benefits of the proposed data collection to the producer and
provider and, where appropriate, the costs of alternative strategies.
6. Review the terminology and data definitions to be used in the data collection to
ensure that they conform to accepted use. Any deviations from accepted use should
be explained. Definitions should conform whenever possible to nationally developed
definitions to ensure that the data produced will be comparable to data produced by
education agencies and organizations at the school, district, state, and federal levels.
7. Document data providers' concerns and data requestors' responses to those concerns.
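One hypothetical way to operationalize these practices is to capture each proposed collection
in a structured planning record. The sketch below is illustrative only; the field names, the
ready_for_review() rule, and the example proposal are assumptions rather than a prescribed
format.

    from dataclasses import dataclass, field


    @dataclass
    class CollectionProposal:
        """Hypothetical planning checklist mirroring practices 1-7 above."""
        title: str
        legal_or_admin_requirement: str          # practice 1: why the collection is necessary
        intended_uses: list[str]                 # practice 2: how, by whom, and for what purpose
        existing_data_reviewed: bool             # practice 3: can available data meet the need?
        accuracy_and_specificity_needs: str      # practice 4: required activities and precision
        estimated_costs_and_benefits: str        # practice 5: costs to producer and provider
        definitions_conform_to_standards: bool   # practice 6: e.g., nationally developed definitions
        provider_concerns: list[str] = field(default_factory=list)    # practice 7
        requestor_responses: list[str] = field(default_factory=list)  # practice 7

        def ready_for_review(self) -> bool:
            """A proposal is ready once necessity, uses, and the prior-data review are documented."""
            return bool(self.legal_or_admin_requirement
                        and self.intended_uses
                        and self.existing_data_reviewed)


    # Example: a hypothetical proposal for a new attendance-related collection.
    proposal = CollectionProposal(
        title="Chronic absenteeism survey",
        legal_or_admin_requirement="State accountability reporting",
        intended_uses=["identify schools needing attendance supports"],
        existing_data_reviewed=True,
        accuracy_and_specificity_needs="school-level counts, reported monthly",
        estimated_costs_and_benefits="low burden; reuses existing SIS fields",
        definitions_conform_to_standards=True,
    )
    print(proposal.ready_for_review())  # True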
Managing for Interoperability
Interoperability: the ability of different information systems, devices, and applications
(“systems”) to access, exchange, integrate, and cooperatively use data in a coordinated manner,
within and across organizational, regional, and national boundaries, to provide timely and
seamless portability of information.

In the past, separate data collections were managed independently, and this practice still is
common in some SEAs and LEAs. Many categories of data are required to be reported in various
ways for various purposes, such as subgroups for state and federal reporting. If the data are
collected one way for one report and a different way for another, this can lead to complications
for the agency.
However, it is increasingly important that data are collected in a manner that meets as many
needs as possible. For example, the same employee information may need to be collected by the
human resources department and the information technology (IT) department, and collecting
these data once and sharing them across departments increases both efficiency and accuracy.
This practice helps to reduce burdens on agency staff by streamlining data collections (that is,
they do not have to collect the same data twice), and it also improves data quality by ensuring
that data stored in different systems are not contradictory.
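As a concrete, hypothetical illustration of catching such contradictions, an agency might
periodically compare shared fields across systems using a common identifier. The records,
identifiers, and field names below are invented for illustration.

    # Hypothetical records keyed by a shared employee ID in two departmental systems.
    hr_records = {
        "E100": {"name": "A. Rivera", "building": "Lincoln Elementary"},
        "E101": {"name": "B. Chen", "building": "Roosevelt Middle"},
    }
    it_records = {
        "E100": {"name": "A. Rivera", "building": "Lincoln Elementary"},
        "E101": {"name": "B. Chen", "building": "Jefferson High"},  # contradicts the HR record
    }

    # Flag any employee whose shared fields disagree between the two systems.
    for employee_id in sorted(hr_records.keys() & it_records.keys()):
        for field_name in ("name", "building"):
            hr_value = hr_records[employee_id][field_name]
            it_value = it_records[employee_id][field_name]
            if hr_value != it_value:
                print(f"{employee_id}: {field_name} differs (HR: {hr_value!r}, IT: {it_value!r})")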
In many agencies, these processes still are not in perfect alignment. For example, in some
states, districts may choose their student information system (SIS), which often differs from the
state SIS. This puts the onus on districts to ensure that complete and accurate data entry occurs
in both systems.
By August 2019, because of legislative changes, the California student data collection system
(CALPADS) changed sex/gender data collection to allow students to declare themselves as
non-binary (https://www.cde.ca.gov/ds/sp/cl/calpadsupdflash158.asp). To be compliant with the
changes, student information system (SIS) vendors have updated their systems so that they can
report student gender within the guidance provided by CALPADS. These changes trickle down to
the local education agency (LEA) level, with procedural changes implemented by administration
and put into effect by registrar staff charged with collecting this information.
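A hypothetical sketch of the kind of mapping such a change can require appears below. The local
code values and the state-reportable labels are illustrative assumptions only, not CALPADS's
actual codes or business rules.

    # Hypothetical local-to-state gender code mapping; actual CALPADS values and
    # rules should be taken from the state's published documentation.
    LOCAL_TO_STATE_GENDER = {
        "M": "Male",
        "F": "Female",
        "X": "Nonbinary",  # value added after the 2019 change (hypothetical local code)
    }


    def to_state_value(local_code: str) -> str:
        """Translate a local SIS gender code into the state-reportable value,
        rejecting any code the mapping does not recognize."""
        try:
            return LOCAL_TO_STATE_GENDER[local_code.strip().upper()]
        except KeyError:
            raise ValueError(f"Unrecognized local gender code: {local_code!r}") from None


    print(to_state_value("x"))  # Nonbinary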
This chapter provides case studies from state and local education agencies (SEAs and LEAs) that
discuss the specifics of their agencies’ data strategy plans, how their overall data strategy was
envisioned and developed, and potential challenges and solutions they experienced along the
way. These case studies also include SEAs and LEAs that are still developing their data
strategies, as the details and nuances of creation and implementation are likely to be useful
to readers.
Washington State Office of Superintendent of Public Instruction: The Case for Data
Strategy Documentation
The Washington State Office of Superintendent of Public Instruction’s “why” for data strategy is
that the data are only as good as the information behind them, and what that information
represents. Policy decisions require accurate and relevant data.

Though many SEAs and LEAs have well-considered data strategies, they often have varying levels
of documentation of their overall strategy or specific processes. Agencies that have clear documentation
are in a much better position to navigate through changes or transitions. For example, during
a recent change in administration, the Washington State Office of Superintendent of Public
Instruction11 found how advantageous its existing documentation of the agency’s data strategy
was for the numerous stakeholders involved in the transition. In the data governance manual,12
which was published publicly and created using significant stakeholder input, the data
governance team had not only made a clear case for the “why” of the overall data strategy, but
also explained why particular processes were in place and where those processes originated. This
meant that the incoming administration found a concise, clear document waiting for it, one that
laid out the agency’s existing data processes as well as the carefully considered reasons for them.
With this documentation in place, the new Superintendent of Public Instruction quickly
understood the agency’s intentions for different data, how different offices functioned and
worked together in data collection and reporting, and the specific roles of individual staff
members and teams. The new administration also made a point of meeting with people across a
range of positions and departments, asking them to describe their roles and how they fit into the
larger system and contribute to the agency’s mission. This approach to the transition allowed
the new administration to understand the history and goals of the agency’s data strategy and to
build on and extend the foundation provided by the strategy documentation.
11 For more information on the Washington State Office of Superintendent of Public Instruction's K-12 Data Governance workgroup, see https://www.k12.wa.us/about-ospi/workgroups-committees/currently-meeting-workgroups/k-12-data-governance
12 Washington State Office of Superintendent of Public Instruction. (2015). Data Governance System for K-12 Data: Policies and Procedures. https://www.k12.wa.us/sites/default/files/public/cisl/pubdocs/DataGovernanceManual.pdf
Wisconsin Department of Public Instruction (DPI): Strong Data Quality Measures and
Agile Leadership Transform Strategic Data Use
The Wisconsin Information System for Education (WISE) comprises multiple interoperable tools
that support data collection to meet all state and federal reporting requirements. The complexity
of these interoperable systems drove state data leaders to establish formalized data and project
governance, as well as a structured data quality process. While the state’s foundational priority
is collecting and sharing required data, a specific focus on data quality and transparency has
allowed Wisconsin to be more strategic in its coordination, analysis, and use of data.
Agile Leadership and the Scrum Process
From a structural and process perspective, the Wisconsin Department of Public Instruction's
(DPI's) data strategy focuses on one major project management philosophy: agile development
practices that use the scrum process at the team level, which then is scaled. The agile
development methodology is an iterative approach in which large projects are broken down
into more manageable tasks that small, empowered teams tackle in short iterations, or “sprints.”
The scrum team framework is a team design with specific roles and teamwork expectations, in
which the members work together to deliver required product increments. Wisconsin uses these
concepts in tandem to direct its product development and data strategy.
DPI's product development revolves around an agile leadership mindset. The philosophy focuses
on satisfying the customer (in this case, program areas or LEAs) through early and continuous
delivery of valuable software and data solutions. In the agile approach, team members identify
what they are working toward with the customer, and the team begins by building a small
initial piece to get feedback from the customer. The team continues to develop new iterations,
rolling out small pieces every 2 weeks (the length of the agency’s “sprint” cycle). The belief is that
constant feedback allows for a better product, as the teams interact regularly with customers
throughout the sprints. The state’s data leaders find that they have been able to connect better
with customers, build relationships based on trust, and bridge gaps between program
areas and IT. Advisory groups consisting of LEA users for specific products were established
to receive continuous feedback on product developments, which ensures development is
prioritized based on the most important needs.
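A purely illustrative sketch of this iterative rhythm follows; the backlog items, dates, and
reprioritization step are invented for illustration and do not represent DPI's actual backlog
or tooling.

    from datetime import date, timedelta

    # Hypothetical backlog of customer requests, ordered by priority (highest first).
    backlog = ["submit enrollment counts", "view error reports", "export data to CSV"]
    sprint_length = timedelta(weeks=2)  # the 2-week sprint cycle described above
    sprint_start = date(2024, 1, 8)     # illustrative start date

    for sprint_number in range(1, 4):
        increment = backlog.pop(0)      # deliver the highest-priority item this sprint
        sprint_end = sprint_start + sprint_length
        print(f"Sprint {sprint_number} ({sprint_start} to {sprint_end}): delivered '{increment}'")
        # Customer feedback on each increment can add or reorder backlog items.
        backlog.append(f"refinements to '{increment}' based on customer feedback")
        sprint_start = sprint_end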
Within the agile mindset, Wisconsin also depends on the scrum team, a structure that encourages
high levels of communication among team members and an integrated working environment.
Each scrum team is empowered to deliver solutions based on an assigned vision, and each has
standard team roles. The product owner’s main responsibility is to answer the question, “What
is the team doing next?” This person prioritizes key tasks and is responsible for coordinating the
product vision and conveying it to the development team. The scrum master is considered the
process owner. This person helps remove impediments, facilitates meetings, and works with the
product owner to make sure the backlog is in good shape. Finally, the development team consists
of the business analyst, the quality assurance analyst, and the developers. Depending on the
scrum team, the development team may range from three to seven members.
Before implementing the scrum process as a core element of project and data governance,
the state conducted development efforts that were not as streamlined, leading to potential
redundancies or unidentified needs. Additionally, these projects used traditional waterfall
project management methods, which map out a project into distinct, sequential phases, with
each new phase beginning only when the prior phase has been completed. Over time, data
leaders have made changes to the entire process to increase productivity, collaboration, and
transparency. They now have the timely and accurate data they need to identify and provide
needed resources, support students and educators, and continually improve processes.
Wisconsin now uses a scaling framework for its approach to project governance. It comprises
multiple scrum teams, the WISE Leadership Team, the WISE Steering Committee, and the IT
Project Request and Prioritization Process.
• Scrum teams (application development, data warehouse, and DevOps) use the scaling
framework as an agile development methodology, which uses a strategy that allows
solutions to be delivered in usable and workable iterations. Each program area, or core
product, has an assigned scrum team. Each scrum team has one individual assigned to
the role of product owner, a scrum master, and one or more team members assigned to
the development team (analysts, developers, and quality assurance).
• The WISE Leadership Team, which meets weekly, is made up of the IT management
team, scrum team product owners, and other key team members. This team handles
the project request process, which involves a weekly review of any project requests
entered by agency staff through a form on the agency’s intranet site. The team
determines whether the project request can be assigned directly to a scrum team or if it
needs review and prioritization by the WISE Steering Committee. The leadership team
also communicates across the agency about items that may affect more than one team.
• Although the WISE Steering Committee originally was developed for the WISEdata
project, the committee now is a cross-agency group that covers the entire WISE
product suite. It includes IT directors and program area directors from any program
area that has data at the DPI, essentially every division and team in the agency. The
steering committee prioritizes project work using a decision protocol it developed
itself, which is crucial when program areas are competing for scrum team or staff time.
The committee informs the product roadmap, following the group’s central goals of
transparency and criterion-driven, consensus-based decisionmaking. The steering
committee also represents the policy tier of the data governance structure at DPI and
can make decisions and set priorities on that level.
The scaling framework allows DPI to coordinate and facilitate work between multiple scrum
teams and also to provide accountability and transparency. Each scrum team performs a daily
scrum stand-up. This meeting lasts 15 minutes or less, and all team members share information
based on three questions: What did you do yesterday? What are you doing today? What is
standing in your way?
In addition to the team-level daily scrum stand-ups, there also is a daily scrum of scrums, or scaled daily scrum
meeting. This meeting consists of one representative from each scrum team. The purpose of
the meeting is to discuss how teams can work together efficiently, provide team updates, and
identify and resolve any dependencies between teams.
Legal References
Americans With Disabilities Act of 1990, Pub. L. No. 101-336, 104 Stat. 328 (1990).
https://www.ada.gov/pubs/adastatute08.htm
Family Educational Rights and Privacy Act, 20 U.S.C. § 1232g (1974).
https://www2.ed.gov/policy/gen/guid/fpco/ferpa/
Additional Resources
California Department of Education Guidance for Changing a Student's Gender in the California Longitudinal Pupil Achievement Data System: https://www.cde.ca.gov/ds/sp/cl/calpadsupdflash158.asp
Common Education Data Standards (CEDS): https://ceds.ed.gov/
Current Measures of Sexual Orientation and Gender Identity in Federal Surveys: https://nces.ed.gov/FCSM/pdf/buda5.pdf
EDFacts Disclosure Review Board: https://www2.ed.gov/about/inits/ed/edfacts/ed-disclosure-avoidance-overview.pdf
Elementary and Secondary Information System (ElSi): https://nces.ed.gov/ccd/elsi
Fairfax County Public Schools (VA) Student Information System (SIS): https://www.fcps.edu/resources/technology/student-information-system-sis-fcps
Five Stages of Team Development: Tuckman's Group Development: https://project-management.com/stages-of-team-development/
Florida Association for Testing Administrators: https://www.floridatestadmin.com/
Kentucky Student Information System Data Standards: https://education.ky.gov/districts/tech/sis/Pages/KSIS-Data-Standards.aspx
National Center for Education Statistics (NCES) Confidentiality Procedures: https://nces.ed.gov/statprog/confproc.asp
Privacy Technical Assistance Center (PTAC): https://studentprivacy.ed.gov
PTAC Best Practices for Data Destruction: https://studentprivacy.ed.gov/resources/best-practices-data-destruction
PTAC Data Governance Checklist: https://nces.ed.gov/Forum/pdf/data_governance_checklist.pdf
Project Management Institute’s A Guide to the Project Management Body of Knowledge (PMBOK® Guide): https://www.pmi.org/pmbok-guide-standards