
Evaluation of Determining Instructional Purposes Training Program
A Proposal Submitted to Far West Laboratory for Educational Research and Development
By W.D. Oswald Evaluation Firm
Dessa Schurr, President



Introduction
Far West Laboratory for Educational Research and Development (FWL) has issued a request for proposals to evaluate its Determining Instructional Purposes (DIP) training program. The purpose of this evaluation is to determine whether the program is suitable for future marketing and distribution of the training package. The overall goal of this proposal is to establish whether the program is worth the monetary investment required to continue it.

Description of Program Being Evaluated
The DIP training package was developed to teach school administrators and graduate students majoring in administration how to plan school programs effectively. The program consists of a Coordinator's Handbook and three training units: Unit 1, Setting Goals; Unit 2, Analyzing Problems; and Unit 3, Deriving Objectives.
Each unit is divided into four to six modules, each of which provides training on a specified set of program objectives. A module consists of reading material associated with the skills being taught, along with both small-group and individual activities. Planning teams are established to help trainees apply the new skills to hypothetical situations. The units can be used individually or in conjunction with one another, depending on the administration's desired plan for training, and the program can be delivered either as workshops or as individual sessions. Units 1 and 3 each require about 10-15 hours of instruction, and Unit 2 requires about 12-18 hours. A coordinator is required to supervise the training and to organize, guide, and monitor activities with the trainees.
Evaluation Method
The purpose of this evaluation is to determine the effectiveness of the DIP training program so that FWL can decide whether to market it to school administrators and graduate students. The evaluation of the DIP training units will allow FWL to determine whether to continue selling the training units. The main goal is to learn whether the DIP training program warrants further investment, production, and distribution of program materials. The audience for the results of this evaluation is the FWL team, which consists of its stakeholders, marketing team, executive officers, and staff. Once the evaluation report is complete, the W.D. Oswald team will meet with FWL to share its findings and recommendations for the future of the program.
This evaluation will determine whether the DIP training program should remain in production and whether its marketing should be expanded. The intended audience for the program is school administrators and graduate students pursuing skills in planning effective school programs.
The evaluation can be broken down into three steps:




1. Evaluation of DIP Materials
A subject matter expert (SME) will review the content of the DIP training materials. The purpose of this step is to evaluate the quality of the materials as they relate to the program's objectives. The SME will provide comments and suggestions on the materials. Data sources for this step include an interview with the SME and the SME's written comments. Materials required: one full set of training materials.

2. Sample Training Groups
Three sample groups will participate in the training: one group will receive one training unit, another will receive two training units, and the third will receive all three training units. Each sample group will consist of six recruited local administrators and graduate students interested in the program. Participants will have the opportunity to offer suggestions and feedback on the training program. Pre- and post-surveys and post-training interviews will be conducted with the sample groups. Course coordinators will receive the Coordinator's Handbook before the sample training sessions so they can become familiar with the material. Training sessions will take place at the Convention Center in downtown Indianapolis and will last about one to two weeks, since each unit requires roughly 10-18 hours of instruction. Two months after the training, participants will be sent a questionnaire asking whether and how they are using their training. Data sources include interviews, pre- and post-surveys, and the follow-up questionnaire. Materials required: three sets of training materials and three Coordinator's Handbooks.

3. Review Findings and Materials
The W.D. Oswald evaluators will collect and examine the pre- and post-surveys and will interview participants to gather additional data about the training sessions. The evaluators will also review the course materials against the SME's initial comments and the survey results. W.D. Oswald will then compile all results and compose the final report to present to FWL. Data sources include the interviews and surveys. Materials required: copies of the surveys, interviews, and training session materials.

At the conclusion of this evaluation, feedback from the sample group participants will help determine the effectiveness of the training. The results will help FWL stakeholders decide whether the program has accomplished its intended purpose and has affected participants' thinking about planning school programs. Using both quantitative and qualitative methods, the evaluation will identify the impact of the DIP training program on participants' skills in planning school programs.
If the results are favorable, FWL can use them to market the program to school administrators and graduate students and to show in its campaign that the program is effective.




Task Schedule
W.D. Oswald Evaluations proposes a seven-month time frame for this evaluation. See the task schedule below for details.

Task | Participants | Estimated Date
Meetings between FWL and W.D. Oswald Evaluations to discuss the proposal and desired outcomes | FWL Stakeholders; FWL Marketing Team; W.D. Oswald Evaluators | April 1
Contact and provide training materials to the Subject Matter Expert (SME) for review | W.D. Oswald Evaluators; Subject Matter Expert | April 14
SME's comments returned and reviewed | W.D. Oswald Evaluators; Subject Matter Expert | April 28
Prepare pre- and post-surveys for the sample groups | Test Developer; W.D. Oswald Evaluators | May 12
Select and contact recruits and course coordinators to participate in each sample group (3 total) for DIP training | W.D. Oswald Evaluators | May 26
Provide the Coordinator's Handbook and materials to the selected coordinators for each sample group | W.D. Oswald Evaluators; Course Coordinators | June 9
Hold the DIP trainings for the established sample groups | W.D. Oswald Evaluators; Course Coordinators; Training Session Participants | June 16
Post-surveys completed by participants and course coordinators | Course Coordinators; Training Session Participants | July 21
Interview participants and course coordinators from the sample groups | W.D. Oswald Evaluators; Test Developer | July 21
Collect and examine pre- and post-surveys and comments; review findings from interviews | W.D. Oswald Evaluators | August 4
Send follow-up questionnaire to participants | W.D. Oswald Evaluators; Training Session Participants | August 16
Review course materials and evaluation data against the SME's initial comments | W.D. Oswald Evaluators | August 25
Prepare final evaluation report | W.D. Oswald Evaluators | September 22
Present results and final evaluation report to FWL | W.D. Oswald Evaluators; FWL Stakeholders; FWL Marketing Team | November 3





Project Personnel
Dessa Schurr, President
Dessa Schurr is the founder of W.D. Oswald Evaluation Firm. She earned her Bachelor's degree in Elementary Education from Purdue University and her Master's degree in Educational Technology from Boise State University. Prior to establishing the W.D. Oswald Evaluation Firm, Ms. Schurr taught Technology in grades K-8 and served as the IT Coordinator for St. Louis de Montfort Catholic School in Fishers, Indiana. Dessa has six years of evaluation experience. For this evaluation, she will be responsible for overseeing, planning, and executing the work.
Lena Atkinson, Subject Matter Expert
Lena Atkinson is the subject matter expert for this training package evaluation. Lena is currently in her 8th year as the Curriculum and Instruction Coordinator at Cole Elementary in Lafayette, Indiana. She received her Bachelor's degree in Elementary Education from Purdue University and her Master's degree in Curriculum and Instruction from the University of Notre Dame.
Ian Kleck, Evaluator
Ian Kleck is an evaluator for this project. Ian received his Bachelor's degree in Educational Evaluation and Research from Florida State University. He has worked for the W.D. Oswald Evaluation Firm for five years as an associate, helping to organize and conduct evaluations.
Walt Miller, Evaluator
Walt Miller is an evaluator for this project. Walt received his Bachelor's degree in Secondary Education from Ohio State University. Prior to joining W.D. Oswald, he worked as a middle school science teacher for 15 years. Walt has been with W.D. Oswald for four years, where he also works as an associate and helps organize and conduct evaluations.
Kay Robinson, Test Developer
Kay Robinson is the test developer for this evaluation. Kay has worked for the W.D. Oswald Evaluation Firm for five years. She received her Bachelor's degree in Education from Ball State University and served as a Curriculum and Technology Specialist for 15 years. Kay is responsible for creating the pre- and post-surveys, interview questions, and follow-up questionnaire to be used in the evaluation.






Budget
The proposed budget for this evaluation consists of personnel wages, travel, and supplies. The total amount is $45,875.68.
Item | Calculation | Cost
Dessa Schurr | 65 days x $200/day | $13,000.00
Lena Atkinson | 10 days x $125/day | $1,250.00
Ian Kleck | 65 days x $175/day | $11,375.00
Walt Miller | 65 days x $175/day | $11,375.00
Kay Robinson | 22 days x $125/day | $2,750.00
Travel Expenses (round trip) | 125.6 miles x $0.56/mile | $70.33
Miscellaneous Travel | 700 miles x $0.56/mile | $392.00
Lodging (3 people, $115/night per person) | 15 nights x $345/night | $5,175.00
Paper Supplies | | $400.00
Training Materials (3 sets) | 3 sets x $24.95/set | $74.85
Coordinator Handbooks (3) | 3 handbooks x $4.50 each | $13.50
Total | | $45,875.68
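
For readers who wish to verify the figures, the short sketch below (written in Python purely as an illustrative check, not as part of the deliverables) recomputes each line item from the table and confirms that the items sum to the proposed total of $45,875.68. The round-trip travel amount is taken as the listed $70.33.

    # Recompute the budget line items listed above and confirm the total.
    line_items = {
        "Dessa Schurr": 65 * 200.00,          # 65 days x $200/day
        "Lena Atkinson": 10 * 125.00,         # 10 days x $125/day
        "Ian Kleck": 65 * 175.00,             # 65 days x $175/day
        "Walt Miller": 65 * 175.00,           # 65 days x $175/day
        "Kay Robinson": 22 * 125.00,          # 22 days x $125/day
        "Travel (round trip)": 70.33,         # listed figure for 125.6 miles x $0.56/mile
        "Miscellaneous travel": 700 * 0.56,   # 700 miles x $0.56/mile
        "Lodging": 15 * 3 * 115.00,           # 15 nights x 3 people x $115/night
        "Paper supplies": 400.00,
        "Training materials": 3 * 24.95,      # 3 sets x $24.95/set
        "Coordinator handbooks": 3 * 4.50,    # 3 handbooks x $4.50 each
    }
    total = round(sum(line_items.values()), 2)
    print(f"Computed total: ${total:,.2f}")   # prints $45,875.68, matching the proposal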
