Basic Program Evaluation: NTSC Training Materials
Program Evaluation
NTSC Training Materials
Purpose/Objectives
Program Evaluation Training – Modules
• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence Needed
• Module 6 – Specifying the Design
Module 1 – Introduction
Module 1 – Introduction
• Why evaluate?
• What is evaluation?
• What does evaluation do?
• Kinds of evaluation
Why Evaluate?
Kinds of Evaluation
• Outcome
• Implementation
• Formative
• Summative
Outcome Evaluation
• Formative
– conducted while the program is underway, to inform changes as the program is being implemented
• Summative
– conducted at the end of the program, to document results
Module 2 – Overview
• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence Needed
• Module 6 – Specifying the Design
Overview – The 9-step Process
• Planning
• Development
• Implementation
• Feedback
Module 3 – Defining the Purpose
• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence Needed
• Module 6 – Specifying the Design
9-step Evaluation Process
Step 1: Define Purpose and Scope
Step 1: Scope/Purpose of Evaluation
Module 4 – Specifying the Questions
• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence Needed
• Module 6 – Specifying the Design
9-step Evaluation Process
Step 2: Specify Evaluation Questions
Evaluation Questions
• Strategic plans
• Mission statements
• Policies
• Needs assessment
• Goals and objectives
• National standards and guidelines
Broad Questions
• Broad Scope
– Do our students contribute positively to society after graduation?
– Do students in our new mentoring program have a more positive self-concept and better decision-making skills than students without access to the mentoring program?
– To what extent does the state’s career development program contribute to student readiness for further education and training and success in the workforce?
Narrow Questions
• Narrow Scope
– Can our 6th grade students identify appropriate and inappropriate social behaviors?
– How many of our 10th grade students have identified their work-related interests using an interest inventory?
– Have 100% of our 10th grade students identified at least 3 occupations to explore further based on their interests, abilities, and knowledge of education and training requirements?
Exercise 1 – Scope (p. 2 of Workbook)
• From the list of questions, identify those that might be considered broad and those that might be considered narrow
• How large will the resources need to be to answer each question?
Exercise 2 – Scope (p. 3 of Workbook)
• List one broad evaluation question and one narrow evaluation question
Module 5 – Identifying Evidence Needed
• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence Needed
• Module 6 – Specifying the Design
Identifying Evidence Needed to Answer Your Questions
9-step Evaluation Process
Step 3: Specify Evaluation Design
Types of Designs
Step 4: Create a Data Collection Action Plan
Organize Your Evaluation With a Data Collection Action Plan
• Students
• Parents
• Teachers
• Counselors
• Employers
• Friends
• Documents and other records
Exercise 5 – Data Sources (p. 6 of Workbook)
• Commercial instrument
• Survey/questionnaire
• Focus group/interviews
• Observations
• Archived information
Module 9 – Using Commercial Instruments
• Review examples of a completed Data Collection Action Plan on pages 10-12 of the Workbook
Module 11 – Collecting Data
Step 5: Collect Data
How Much Data Should You Collect?
Step 6: Analyze Data
What is Data Analysis?
• Data collected during program evaluation are compiled and analyzed (counting; number crunching)
• Inferences are drawn as to why some results occurred and others did not
• Analysis can be very complex, depending on your evaluation questions
• We will focus on simple techniques that can be applied without expert consultants
Types of Data Analysis – Simple Frequency Counts
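A minimal sketch of a simple frequency count in Python, assuming responses have already been typed into a list; the answers shown are made up for illustration and are not Workbook data.

```python
# Simple frequency count: tally how often each response appears.
# The responses below are hypothetical survey answers, not Workbook data.
from collections import Counter

responses = ["Agree", "Agree", "Disagree", "Agree", "Neutral", "Disagree", "Agree"]

counts = Counter(responses)                 # Counter({'Agree': 4, 'Disagree': 2, 'Neutral': 1})
for answer, count in counts.most_common():  # list answers from most to least frequent
    print(f"{answer}: {count}")
```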
Types of Data Analysis – Sort by Relevant Categories
Types of Data Analysis – Calculate Percentages – Exercise 8 (p. 13 of Workbook)
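One way to calculate percentages is shown in this minimal Python sketch; the counts are hypothetical and are not the data for Exercise 8 in the Workbook.

```python
# Percentages: divide each count by the total and multiply by 100.
# The counts are hypothetical, not the Exercise 8 data.
counts = {"Agree": 40, "Neutral": 25, "Disagree": 15}
total = sum(counts.values())                # 80 responses in all

for answer, count in counts.items():
    percent = count / total * 100           # percentage = count / total x 100
    print(f"{answer}: {percent:.1f}%")      # e.g. Agree: 50.0%
```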
Types of Data Analysis – Showing Change or Differences
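A minimal sketch of showing change as a pre/post difference; the before-and-after values are hypothetical.

```python
# Change or difference: subtract the "before" measure from the "after" measure.
# The percentages are hypothetical pre/post results, for illustration only.
pre_percent = 62.0    # % of students meeting the benchmark before the program
post_percent = 78.0   # % of students meeting the benchmark after the program

change = post_percent - pre_percent
print(f"Change: {change:+.1f} percentage points")   # Change: +16.0 percentage points
```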
Types of Data Analysis – Reaching an Objective or Goal
Types of Data Analysis – Observing Trends
Types of Data Analysis – Graph Results
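A minimal sketch of graphing results as a bar chart, assuming the matplotlib library is available; the grade levels and percentages are made up for illustration.

```python
# Bar chart of results using matplotlib (one common graphing library).
# The grade levels and percentages are hypothetical, for illustration only.
import matplotlib.pyplot as plt

grades = ["Grade 9", "Grade 10", "Grade 11", "Grade 12"]
percent_completed = [55, 68, 74, 81]        # hypothetical % completing an interest inventory

plt.bar(grades, percent_completed)
plt.ylabel("Percent of students")
plt.title("Interest inventory completion by grade")
plt.savefig("results_chart.png")            # or plt.show() to view on screen
```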
Types of Data Analysis – Calculate Averages – Exercise 9 (p. 14 of Workbook)
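A minimal sketch of a simple average; the ratings are hypothetical and are not the data for Exercise 9 in the Workbook.

```python
# Simple average (mean): sum of the values divided by the number of values.
# The ratings are hypothetical, not the Exercise 9 data.
ratings = [4, 5, 3, 4, 2, 5, 4]

average = sum(ratings) / len(ratings)       # 27 / 7
print(f"Average rating: {average:.2f}")     # Average rating: 3.86
```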
Types of Data Analysis – Calculate Weighted Averages
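A minimal sketch of a weighted average, assuming you have a frequency table of how many respondents chose each rating; the counts are hypothetical.

```python
# Weighted average: weight each rating value by how many respondents chose it.
# The rating counts are hypothetical, for illustration only.
rating_counts = {5: 12, 4: 20, 3: 8, 2: 6, 1: 4}    # rating -> number of respondents

total_points = sum(rating * count for rating, count in rating_counts.items())   # 180
total_responses = sum(rating_counts.values())                                   # 50

weighted_average = total_points / total_responses
print(f"Weighted average rating: {weighted_average:.2f}")   # Weighted average rating: 3.60
```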
Types of Data Analysis – Rank Order Weighted Averages
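A minimal sketch of rank ordering items by their weighted averages; the item names and scores are hypothetical.

```python
# Rank order: sort items from highest to lowest weighted average.
# The item names and weighted averages are hypothetical, for illustration only.
weighted_averages = {
    "Career counseling": 4.2,
    "Job shadowing": 3.6,
    "Interest inventory": 4.5,
}

ranked = sorted(weighted_averages.items(), key=lambda item: item[1], reverse=True)
for rank, (name, score) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {score}")       # 1. Interest inventory: 4.5 ...
```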
Types of Data Analysis – Graph Weighted Averages
Using Focus Group/Interview Information
Step 7: Drawing Conclusions and Documenting Findings
Drawing Conclusions
• Examine results carefully and objectively
• Draw conclusions based on your data
• What do the results signify about your program?
Exercise 10 – Interpreting Results (pp. 15-16 of Workbook)
• Program description
• Evaluation questions
• Methodology (how, from whom, and when)
• Response rate
• Methods of analysis
• Conclusions listed by evaluation question
• General conclusions and findings
• Action items
• Recommendations for program improvement and change
Document the Successes and Shortfalls
Step 8: Disseminate Information
Determining Dissemination Methods
Potential Audiences
• Businesses
– Partners that work with your program
– Employers
• School Level
– School administrators
– Counselors
– Teachers
– Students
– Parents
• Media
– Local newspaper
– TV station
– Radio program
– Community or school newsletter
• Education Researchers
• Members of community or faith-based organizations
– Church members
– Religious leaders
– Rotary club
– Boys or girls club
• Anyone who participated in your evaluation!
Dissemination Techniques
• Reports
• Journal articles
• Conferences
• Career Newsletter/Tabloids
• Presentations
• Brochures
• TV and newspaper interviews
• Executive summary
• Posting on Web site
Exercise 11 – Disseminating Information (p. 17 of Workbook)
Step 9: Feedback to Program Improvement
Opportunities to Fix Shortfalls
• Evaluation results may show areas where improvement is necessary