DDI OnTrack Reports
Instructions: Use this Task Card to assist in effectively using the DDI tabs in OnTrack. The first
section of this task card focuses on general reports functionality. The second section focuses on
effective use of DDI reports for instructional next steps. The order of the reports matters, and
they should be run in the suggested order. If issues or errors are encountered while executing any of
the performance tasks, please contact the Service Desk.
Steps
Internet Access: This site works best when viewed with Google Chrome.
Visit http://houstonisd.org/fa for more OnTrack task cards and resources.
Access: Go to www.houstonisd.org/ontrack
Click on Login with Active Directory
Reports:
- Select Reports from flip card or Top Menu Bar
- Select DDI Reports
- Select Item Analysis.
Other popular reports include Student Response,
Standard Analysis, and Student Grouping by
Standard. Toggle across the tabs to view the various reports.
Change Assessment: Note: The system defaults to the
most recently accessed assessment
- To change the assessment, open the assessment selection
filters and filter by year, subject, and assessment name. For TEA STAAR
data select: HISD + STAAR + Subject + Language + Year
(ex: HISD STAAR Math English 16-17), then confirm the selection
- If needed, select the grade level tested, course
group (High School), course, and/or period (section),
then confirm
- Optional: filter by student groups,
ethnicity, gender, and other special populations indicators,
then apply the filters
Sort Data
- Any column can be reordered by clicking on the
column header (e.g., % Correct from lowest to
highest, alphabetical sort vs. ranking students, etc.)
- Click the column header again to reorder from
highest to lowest
- An arrow on the column header means the
report is currently sorted by that column
Show additional report features:
- In the Show menu, add or remove
features including: Student ID, Level Tested, Time
Spent, Demographics, Graph, Color Scale
Note: Available selections depend on the report type and
on information availability (e.g., Time Spent is only available
for online assessments)
- Tip: when selecting Graph, hover over each bar to
view details (Standard, Description, Average percent
correct, Number of Items, and Item numbers)
- Select Mastery/Non-Mastery or By Performance
Levels to see the various reporting metrics
Surface Analysis
Get the big picture by looking at an overview of your data through surface analysis.
See strengths and growths at a glance. Use the Standard Analysis Report
(sort % Correct to rank standards).
Deep Analysis
Go deeper. Look for patterns in your students' performance to distinguish which items
require whole group support vs. smaller groups of students with common misconceptions.
Use the Item Analysis Report (sort % Correct to rank items). Items with a high failure
rate require whole group intervention; other items, missed by smaller groups of students
with common misconceptions, require small group intervention.
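This whole-group vs. small-group decision can be expressed as a simple rule of thumb. The sketch below is plain Python, not OnTrack code; the 50% failure-rate cutoff, the 25% misconception cutoff, and the sample values are assumptions for illustration only.

    # Sketch (not OnTrack code): routing an item to whole- vs. small-group support.
    def intervention_for(pct_correct, top_distractor_pct):
        """Suggest a grouping based on how an item was missed."""
        if pct_correct < 0.50:          # assumed cutoff: most of the class missed it
            return "Whole group"
        if top_distractor_pct >= 0.25:  # assumed cutoff: a cluster shares a misconception
            return "Small group"
        return "No intervention flagged"

    print(intervention_for(0.35, 0.40))  # Whole group
    print(intervention_for(0.75, 0.30))  # Small group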
DDI - Item Analysis
Purpose: Use this report to determine which test questions were difficult and note the key questions to review to
analyze causes and solutions (distractors, ambiguity, and further instruction). It helps deconstruct questions to
determine which subskills are being addressed, understand distractor rationales, and make informed decisions
about effective, corrective, and adaptive instruction.
Measured: Percent of student responses per answer choice (Choice 1-A/F, Choice 2-B/G, etc.)
Standards are labeled as (S) Supporting and (R) Readiness; Process standards have their own column
Tips: Click on the blue % Correct bar in the % Correct column to view an Item Summary, which contains an
interactive pie chart and students grouped by their responses. Select the Item Preview tab to view the
online student view, correct answers, and item properties.
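To make the Measured line concrete, here is a minimal sketch (plain Python, not OnTrack code) of how a per-choice response distribution like the one this report displays could be computed; the responses and answer choices below are hypothetical sample data.

    # Sketch: percent of student responses per answer choice for one item.
    from collections import Counter

    responses = ["A", "C", "C", "B", "C", "A", "D", "C", "C", "B"]  # sample class
    correct_answer = "C"

    counts = Counter(responses)
    total = len(responses)
    for choice in ["A", "B", "C", "D"]:
        pct = 100 * counts.get(choice, 0) / total
        marker = " (correct)" if choice == correct_answer else ""
        print(f"Choice {choice}: {pct:.0f}%{marker}")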
Guiding Questions:
1. Which standard(s) need the greatest attention? What have you noticed about instruction for that
particular standard? Select the STOP Items sub-tab under Item Analysis (STOP: Scrutinize These On
Performance). An item is a STOP item if the Item Correct Threshold was not met or if
too many students chose the same wrong answer (Item Distractor Threshold); a sketch of these rules
follows these questions. You may change the thresholds under My Preferences.
2. What misunderstandings do the students’ errors reveal? What do you think students were doing wrong? (Refer
to the test question)
3. Look within standards: On questions that measured the same standard, were students better on some questions
than on others? (Tip: click on column title Student Expectation to sort by SE) If so, how do those questions differ
in difficulty? Why did students do better on one than on another?
4. Compare similar standards: Do the results on one standard influence the other?
5. What needs to be different next week to ensure scholars achieve mastery of this specific standard?
6. Is the issue with the content standards, the process standards, or both?
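For guiding question 1, here is a minimal sketch of the two STOP-item rules described above. This is not OnTrack code; the threshold values and names are assumptions for illustration, since the real thresholds are whatever you set under My Preferences.

    # Sketch (not OnTrack code): flagging STOP items.
    ITEM_CORRECT_THRESHOLD = 0.70     # assumed: flag if under 70% correct
    ITEM_DISTRACTOR_THRESHOLD = 0.25  # assumed: flag if a wrong answer drew 25%+

    def is_stop_item(pct_correct, pct_by_choice, correct_choice):
        """Return True if the item meets either STOP rule."""
        if pct_correct < ITEM_CORRECT_THRESHOLD:
            return True  # correctness threshold not met
        # Did too many students choose the same wrong answer?
        return any(pct >= ITEM_DISTRACTOR_THRESHOLD
                   for choice, pct in pct_by_choice.items()
                   if choice != correct_choice)

    # 72% correct overall, but 26% of students all picked distractor B.
    print(is_stop_item(0.72, {"A": 0.02, "B": 0.26, "C": 0.72}, "C"))  # True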
Student Response
Purpose: This report indicates incorrect answer choices made by individual students. It allows teachers to see
individual student misconceptions. Teachers can also view student griddable and rubric responses to see
performance on open-ended questions. This report allows a teacher to surface misconceptions made by
individual students and can help group students who made similar mistakes. It also allows teachers to prioritize re-
teaching standards.
Notes: Standard Category: R – Readiness; S – Supporting
PS – Process Standard (PS currently applies only to Math, Science, and Social Studies)
Measured: Aggregate percent correct of all students' responses and individual student selection per answer choice
(Griddable: actual student response; Rubric: 0-4; a blank cell = no response)
Tip: Hover over the TEKS number to view it in detail, and click on an Item # to preview the question and its correct answer
Guiding Questions:
1. Based on the mastery threshold, which students achieved mastery? What are areas of celebration?
2. Based on the mastery threshold, which students need remediation to achieve mastery? Areas of growth?
3. Which standards were the most challenging for the students?
4. Which students have mastered the standards and may serve as peer-tutors?
5. Are there similar trends in the students’ responses?
6. How are individual students performing on readiness and process standards?
7. How can knowing this information help understand a student’s level of mastery of a standard?
8. What strengths and weaknesses can be seen for each individual student?
9. How does the mastery threshold help determine which students achieved mastery? (A sketch of this check follows these questions.)
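As a companion to questions 1, 2, and 9, here is a minimal sketch of a mastery-threshold check. The 70% threshold and the student scores are hypothetical; OnTrack's own threshold setting governs the actual report.

    # Sketch (not OnTrack code): splitting a class by a mastery threshold.
    MASTERY_THRESHOLD = 70  # assumed percent correct required for mastery

    scores = {"Student A": 85, "Student B": 64, "Student C": 92, "Student D": 58}

    mastered = [s for s, pct in scores.items() if pct >= MASTERY_THRESHOLD]
    remediate = [s for s, pct in scores.items() if pct < MASTERY_THRESHOLD]

    print("Achieved mastery:", mastered)   # celebration / possible peer tutors
    print("Need remediation:", remediate)  # areas of growth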
Student Grouping by Standard
Purpose: This report provides student performance by class by standard. It places students in one of three
bands. Teachers use this report to group students by standard performance in an effort to provide
enrichment or remediation on identified areas of need.
Tip: To change the percent correct bands for each column, select Percentage of Questions Correct,
edit the band percentages as needed, and apply the change. A sketch of this banding logic follows.
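The sketch below illustrates the banding logic the tip refers to. It is not OnTrack code; the 60% and 80% cut points are assumptions for illustration, since the actual bands are whatever percentages you configure under Percentage of Questions Correct.

    # Sketch (not OnTrack code): placing students into percent-correct bands.
    def band_for(pct_correct):
        """Map one student's percent correct on a standard to a band."""
        if pct_correct < 60:
            return "Intervention"
        if pct_correct < 80:
            return "Approaching"
        return "Mastery"

    # Hypothetical percent-correct values for one standard (TEKS).
    students = {"Student A": 45, "Student B": 72, "Student C": 88}
    for name, pct in students.items():
        print(name, "->", band_for(pct))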
Guiding Questions:
1. Which students need intervention and on what standards (TEKS)?
2. Can students be further divided within a group?
3. What sort of practice do the students need to master this standard – heavy repetition of
computation skills? Following a multi-step protocol?
4. Based on the class performance, what re-teaching do I need to do?
5. What are the standards that will be reviewed or retaught for the whole class?
6. Are the struggling students’ misunderstandings different than those of the rest of the students on these
standards?
7. What additional support or steps will the struggling students need when these standards are being
reviewed?
8. Are there any students not attaining proficiency across reporting categories?
9. How can the question numbers be leveraged to support instructional next steps?
10. What interventions and resource material could support the mastery of the standard?
11. How can this report help document an action plan and instructional next steps?
Student Feedback Cards
Purpose: This report displays individual student performance on individual questions, total percent correct, item count
and percent correct by standard for each student. Teachers can use this report to conference with individual
students to discuss performance and standard mastery. This report provides a meaningful way to provide
individual students with feedback for data conferences, and documentation for student DDI binders.
Measured: Average percent correct for each individual student, including student responses and performance by standard
Standard Percent Correct (# of items correct / # of total items); Student Responses vs. Correct Response
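To illustrate the Standard Percent Correct calculation, here is a minimal sketch using hypothetical TEKS codes and item results; it simply divides items correct by total items for each standard, which is the ratio shown on the card.

    # Sketch (not OnTrack code): percent correct by standard for one student.
    items = [  # (standard, was the student's response correct?)
        ("3.4A", True), ("3.4A", False), ("3.4A", True),
        ("3.5B", True), ("3.5B", True),
    ]

    by_standard = {}
    for standard, correct in items:
        right, total = by_standard.get(standard, (0, 0))
        by_standard[standard] = (right + int(correct), total + 1)

    for standard, (right, total) in by_standard.items():
        print(f"{standard}: {right}/{total} = {100 * right / total:.0f}%")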
Find Snapshots