Medt8480 sp19 A4-Harris


Program Evaluation Worksheet

I. Program Information

Date: March 31, 2019


Program name: Babb Middle School Technology Integration Professional Development Initiative
Applicant name: Tijuana Harris
Contact person: Tijuana Harris
Phone/e-mail address: tharri33@my.westga.edu

II. Program Summary

Program purpose: The purpose of the program is to train teachers how to use and integrate technology into their curriculum delivery.
Program goal: A minimum of 80% of all certified teachers will receive training during a professional development initiative. Of the 55 teachers, the goal is to
train a minimum of 44 certified teachers.

Program objectives:

Objective 1 – Training workshops based on proficiency levels will take place between February 12-15, 2019.

Objective 2 – Shoulder-to-shoulder training will run from April 18 to May 22 as a support for teachers.

Objective 3 – On April 15, 2019, a final professional development session will feature teachers from each proficiency level. These teachers will provide walk-through presentations of featured technology. This exposure and material will provide a learning experience and teach teachers how to use a variety of suitable software.

Objective 4 – A final measurement will be performed by May 15, 2019 to assess the overall results: did 80% of all certified teachers receive technology training? A measure of success would include Google Classroom use and the introduction of 1-3 supplemental educational software products per teacher.

Objective 5 – Another measurement would include a side-by-side comparison of student engagement and overall performance.

Short-term objectives that are direct by-products of integrating technology into the curriculum across the board:
 Decrease in behavior referrals by 10% or greater
 Increase in student engagement in the classroom based on teacher and administrators’ observations
 Teachers’ comfort level with integrating technology increases on an ongoing basis (post-survey at 6 months)
Intermediate
 Increased student performance on high stakes testing by a full level (Beginner to Developing, etc.)
 Consistent student growth of 35% or greater
 Increase in the % of students reading at or above grade level

Long-term
 Student growth of 35% or greater amongst each learner group: struggling, average, and advanced students
 Increase in students taking advanced courses
 A significant increase in Career and College Ready students upon graduation
CCRPI results will increase for the school. CCRPI = College and Career Ready Performance Index

Program description and primary activities:


Babb Middle School’s professional development will train certified teachers to integrate technology into the existing curriculum. This tiered approach will address the needs of all users, from beginner to advanced levels of experience.
III. Evaluation Plan Overview

Evaluation classification:
Role of the evaluator: The evaluator will review and assess the main components of the program, ensuring that the main goal is attainable and tied directly to the short-, intermediate-, and long-term objectives.
Evaluator qualifications: Formal training – The evaluator has formal training in a graduate-level Program Evaluation course. The evaluator is also a candidate for graduation in the Ed.S. degree program.
Evaluation timeline and completion of evaluation report:

 75 days prior to survey administration - Develop one teacher survey to assess the depth of knowledge pertaining to technology skills and comfort level of
each teacher.

 60 days prior to the survey administration – Evaluate the survey to ensure that it will properly place teachers in the appropriate learning tiers so that the needs of the teachers are met. This will involve an evaluation of each question and each answer choice to ensure that the responses logically place each respondent in the correct tier.

 50-55 days prior to the survey administration – Test the system to ensure that there are no glitches, that the survey links are functional, and that there is no timer on the survey.

 40 Days prior to the professional development -


1. Coordinate district-level and school-level personnel
2. Hold a meeting to level-set expectations and outcomes
3. Coordinate survey results and schedule appropriate training personnel per workshop
4. Publish the training calendar
5. Develop a post-training effectiveness survey

 35 days prior to the professional development


1. Develop learning targets and objectives that transcend content areas
2. Develop a professional development agenda for each workshop (Beginner, Intermediate, and Advanced)
3. Develop an agenda for the Final Forum PD – schedule presenters from each learner group
4. Preset shoulder-to-shoulder appointments for teachers who would benefit from one-to-one follow-up post-training, based on survey results

 30 days prior to the professional development


1. Create a teacher post-training effectiveness survey
2. Plan to evaluate shoulder-to-shoulder training feedback
3. Identify indicators that may predict the frequency of future technology training

 3 months to 5 years post-training


Evaluate post-training short-, intermediate-, and long-term goals

Participants who developed the evaluation plan: Tijuana Harris


This evaluation plan aims to serve the following purposes in addition to meeting funding requirements: The evaluation aims to ensure effective training for certified teachers and is designed to examine each component of the technology training. Examining the design of the survey, the tiered training, the shoulder-to-shoulder training, and the post-training survey is a critical and proactive means of diagnosing this plan.
Projected use of findings: Based on the findings, modifications and adjustments can be made and a final, modified plan will be implemented. Ensuring that the goal, the program, and the predicted results are aligned is key to this evaluation. The overall delivery and outcome depend heavily on a proper evaluation.
IV. Audiences
Primary stakeholders for this evaluation:
 Clayton County School District
 Babb Middle School Administrators
 Certified Teachers
 Students and the community
Use of evaluation results for these primary stakeholders: The evaluation results will be used to measure the effectiveness of this program and to determine its use in other schools within the district. This could possibly grow into a pilot involving select elementary, middle, and high schools within the district. This is necessary because varying schools and levels will yield broader insight across the span of schools, students, and teachers.
V. Evaluation Questions
Key evaluation questions to be answered by this evaluation:
 Will teachers have the ability to integrate technology into the curriculum after this training?
 Will teachers gain the skill set and preparation to select meaningful technology for their students?
 Is this program ready for delivery?
 Are all components of this program aligned and ready for delivery?
 Are all roles defined?
 Are the surveys perfected and designed to gather the correct data?

 Does the program align with the purpose?

VI. Evaluation Design


Summary: Based on its design and detail, the evaluation process will augment and elevate the effectiveness of the program. Modifications will streamline the delivery process and the overall effectiveness of the end product. Serving the district and stakeholders will be the central focus. Teachers who become more comfortable with technology integration will incorporate it frequently, serving our ultimate stakeholder, the student. Confidence begins with knowledge and experience, which will be the focus of the professional development.

Data types: Quantitative Data – Survey results, school performance data


Objective – Survey-based results
Subjective – Teacher observations and student observations and feedback post-training (participants, training facilitators, and shoulder-to-shoulder facilitators)

Ethical considerations for this evaluation: Administering post-surveys confidentially and providing confidentiality statements when surveying skill sets and attitudes toward technology use in the curriculum
VII. Data Collection – Methods and Instruments
Proposed data collection methods:
Surveys administered via Google Docs
Excel spreadsheet to house data
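
As a minimal sketch of this workflow (assuming the survey responses are downloaded from the Google survey as a CSV file before being housed in the spreadsheet), the Python snippet below scores each response and assigns a workshop tier. The file names, the score column header, and the tier cut-offs are hypothetical placeholders, not part of the program plan.

```python
# Hypothetical sketch: place each survey respondent into a training tier.
# The file names, column header, and cut-off scores are placeholders only.
import csv

INPUT_FILE = "teacher_survey_responses.csv"       # hypothetical Google survey export
OUTPUT_FILE = "teacher_tier_placements.csv"       # hypothetical placement list for the spreadsheet
SCORE_COLUMN = "Technology comfort score (1-10)"  # hypothetical survey question header


def assign_tier(score: int) -> str:
    """Map a self-reported comfort score to a workshop tier (example cut-offs)."""
    if score <= 3:
        return "Beginner"
    if score <= 7:
        return "Intermediate"
    return "Advanced"


with open(INPUT_FILE, newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Add a Tier column so the placement list can feed the workshop rosters.
for row in rows:
    row["Tier"] = assign_tier(int(row[SCORE_COLUMN]))

if rows:
    with open(OUTPUT_FILE, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

The resulting placement file can be opened directly in Excel to build the workshop rosters.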

Instruments to be used:
Survey links – Google Docs
Beginner Level Educational Software
Intermediate Educational Software
Advanced Educational Software
Person(s) responsible for data collection:
Tijuana Harris
Data collection timeline:
45 days prior to the professional development – Administer the teacher survey
Post Training Survey and Feedback
Post Training Teacher Observations
Post Training Short, Intermediate and Long Term Goal Assessment (itemized in Section II)

VIII. Data Management and Analysis

Data management: The evaluator will use Google Docs for surveys, converting the results into pie charts.
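
As a minimal sketch of that conversion (assuming the hypothetical tier-placement file from the earlier snippet), the code below counts teachers per tier and renders a simple pie chart; the same summary could just as easily be charted directly in Google Sheets or Excel.

```python
# Hypothetical sketch: summarize tier placements as a pie chart.
# Assumes the placeholder placement file created in the earlier sketch.
import csv
from collections import Counter

import matplotlib.pyplot as plt

with open("teacher_tier_placements.csv", newline="", encoding="utf-8") as f:
    tier_counts = Counter(row["Tier"] for row in csv.DictReader(f))

# One slice per proficiency tier, labeled with the percentage of teachers.
plt.pie(list(tier_counts.values()), labels=list(tier_counts.keys()), autopct="%1.0f%%")
plt.title("Teacher proficiency tiers (survey placement)")
plt.savefig("tier_distribution_pie.png", dpi=150)
```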
 Data analysis strategies: Evaluate feedback during the initial training and make necessary adjustments based on the formative evaluation.
Purpose – This feedback will strengthen the summative evaluation and impact the overall teacher training
 Quantitative Data – Survey results, school performance data
 Qualitative – Observations and feedback
 Objective – Survey based results
 Subjective – Teacher observations and student observations and feedback post-training (participants, training facilitators, and shoulder-to-shoulder facilitators)

Person(s) responsible for data analysis: Tijuana Harris

IX. Strategies for Using Evaluation Findings


 Reporting:
 Teacher proficiency in using Google Classroom
 Technology Integration within the LMS – Google Classroom
 Post Training Surveys
 School Data Results – Lexile, High Stakes Testing, Student Growth & Performance
 Disciplinary Data

Evaluation debriefing:
 Evaluate feedback during the initial training and make necessary adjustments based on the formative evaluation
 Purpose – This feedback will strengthen the summative evaluation and impact the overall teacher training
 Survey post-training and post one-to-one training
 Collect feedback from administration based on classroom observations
Post-evaluation action plan: Make any and all modifications to the original program as prescribed by the evaluation prior to delivery. Incorporate or revamp areas of concern. The surveys, the course agenda, and the content are the primary focus of this evaluation. Objectively, does the program meet the goal and objectives? If the answer is no or maybe, the findings will be incorporated into the final plan. For example, Professor Cisney-Booth initially recommended a modification to the goal: removing the 20% increase in teacher technology use was necessary in order to focus on the goal of delivering training to 80% of the certified teachers at Babb Middle School.
X. Attachments

Logic Model
Evaluation Question: How effective was the training, and how effective is the post-training integration by teachers?
Evaluation Process: Process Evaluation
Logic Model Element: Content area teachers will meet weekly during Collaborative Planning to level-set and integrate meaningful technology into each week's lesson planning.
Information Source: BMS Administrative team, instructional coach, teachers, technology liaison
Method for Data Collection: Focus group; performance data; comparison of performance data and pre- and post-training artifacts of students' work

Evaluation Question: How effective was the training, and how effective is the post-training integration by teachers?
Evaluation Process: Process Evaluation
Logic Model Element: The technology teacher will join the planning and collaboration meetings on a bi-weekly basis to provide help in coordinating the latest technology tools available.
Information Source: BMS Administrative team, instructional coach, teachers, technology liaison
Method for Data Collection: Focus group; survey; performance data; pre- and post-training artifacts of students' work

Evaluation Question: Did the integration of meaningful technology impact positive student behaviors? Did the training strengthen technology skills amongst teachers?
Evaluation Process: Outcome Evaluation
Logic Model Element: Short-term outcomes – decrease in behavior referrals by 10% or greater; increase in student engagement in the classroom based on teacher and administrator observations; teachers' comfort level with integrating technology increases on an ongoing basis (post-survey at 6 months)
Information Source: BMS Administrative team, teachers, students
Method for Data Collection: Observation, school data, survey

Evaluation Question: Did the integration of meaningful technology close the student achievement gap?
Evaluation Process: Outcome Evaluation
Logic Model Element: Intermediate outcomes – increased student performance on high-stakes testing by a full level (Beginner to Developing, etc.); consistent student growth of 35%; increase in the percentage of students reading at or above grade level
Information Source: BMS Administrative team, instructional coach, teachers, district personnel
Method for Data Collection: School data, high-stakes testing results, Lexile scores

Evaluation Question: Did the integration of meaningful technology result in significant student growth of 35% at a steady and consistent rate? Did this student growth create an increase in students taking advanced courses?
Evaluation Process: Outcome Evaluation
Logic Model Element: Long-term outcomes – student growth of 35% or greater; increase in students taking advanced courses; a significant increase in Career and College Ready students upon graduation; CCRPI results will increase for the school (CCRPI = College and Career Ready Performance Index)
Information Source: BMS Administrative team, instructional coach, teachers, district personnel
Method for Data Collection: State assessment (CCRPI), high-stakes testing, Georgia Milestones

