Ext 507 (How To Conduct Evaluation)

How to Conduct Evaluation
Steps to Evaluation
Step 1. Identify and describe the program to be evaluated
 Identify and describe the program you want to evaluate.
 A description should include:
 its goals and objectives.
 the geographic boundaries of the program.
 the clientele served.
 the program funders.
 the program staff.
 Identify the audience from whom you will gather information.
Step 2. Identify the program phase and the appropriate type of evaluation study
Step 3. Assess the feasibility of implementing an evaluation study
• If the answers to many of these questions are "No", this may not be an appropriate time to implement an evaluation study.
 Is there an important decision to be made on the basis of the evaluation?
 Is there a commitment to use the evaluation findings?
 Will important program decisions be made regardless of evaluation findings?
 Is there a legal requirement to carry out an evaluation?
 Does the program have enough impact or importance to warrant
formal evaluation?
o Is this a one-time program?
o Will this program continue?
o Is the cost of the program so low that an evaluation
is unnecessary?
 Is it likely that the evaluation will provide valid and reliable information?
 Is it likely that the evaluation will meet acceptable standards of propriety?
o Will the evaluation violate professional principles?
o Is the evaluation threatened by conflict of interest?
o Will the evaluation jeopardize the well-being of
program participants?
 Is the program ready to be evaluated?
o If a summative evaluation is suggested, has the program
been operating long enough to provide clearly defined
outcomes?
 Are there sufficient human and monetary resources available to carry
out an evaluation?
 Is there enough time to complete the evaluation?
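The checklist above can be turned into a simple go/no-go screen. The sketch below is illustrative only: the question keys paraphrase the list, and the 50% threshold is an assumption, not a rule from the source.

```python
# Hypothetical feasibility screen: each key paraphrases one checklist
# question above; answers are True ("Yes") or False ("No").
FEASIBILITY_QUESTIONS = [
    "important decision depends on the evaluation",
    "commitment to use the findings",
    "legal requirement to evaluate",
    "program impact warrants formal evaluation",
    "valid and reliable information is likely",
    "acceptable standards of propriety can be met",
    "program is ready to be evaluated",
    "sufficient human and monetary resources",
    "enough time to complete the evaluation",
]

def is_feasible(answers, threshold=0.5):
    """Return True when the share of 'Yes' answers exceeds the threshold.

    Per the text, many 'No' answers suggest this is not an appropriate
    time to evaluate; the 50% cutoff here is an illustrative assumption.
    """
    yes = sum(1 for q in FEASIBILITY_QUESTIONS if answers.get(q, False))
    return yes / len(FEASIBILITY_QUESTIONS) > threshold

answers = {q: True for q in FEASIBILITY_QUESTIONS}
answers["enough time to complete the evaluation"] = False
print(is_feasible(answers))  # mostly "Yes" -> True
```

In practice such a screen only structures the discussion with stakeholders; the judgment remains qualitative.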
Step 4. Identify and consult key stakeholders
Stakeholders are people who have a stake or vested interest in the evaluation findings. They can be program funders, staff, administration, clients or program participants. It is important to clarify the purpose and procedures of an evaluation with key stakeholders before beginning. This process can help determine the type of evaluation needed and point to additional reasons for evaluation that may prove even more productive than those originally suggested.
 Come to agreement with stakeholders on:
 What program will be evaluated, what it includes and excludes.
 The purpose of the evaluation.
 The goals and objectives of the program. Program goals and objectives can be written as
statements indicating what the program will achieve and what criteria will be used to judge
whether the objectives have been met.
Each objective should:
o contain one outcome.
o identify the target audience.
o specify what you expect to change as a result of program participation.
o be specific enough to be measurable.
 Clarify evaluation questions, issues, indicators and criteria:
Evaluations are conducted to answer specific questions, to address programmatic issues, to plan for future programs and/or to apply criteria to judge the value or worth of an existing program. If the questions and issues that are being used are not clearly defined and the indicators and criteria that will be used to judge merit or worth are not well thought out, the evaluation may lack focus, be irrelevant, omit important areas of interest or come to unsupported conclusions.
 Basic steps in selecting questions, issues, indicators and criteria
 List questions, issues and criteria from all sources consulted.
 Organize material into a manageable number of categories. Match level of program with
indicators appropriate for that level -- remember that it is not possible for an evaluation to
address all areas of interest.
 Come to agreement with stakeholders on the degree of incompleteness that is acceptable, given
monetary and time constraints.
 Focus the scope of the evaluation to the crucial and practical.
Step 5. Approaches to Data Collection
There are two basic types of data collection: quantitative and qualitative. Quantitative data
tend to focus on numerical data, while qualitative data are expressed in words.
Quantitative Methods measure a finite number of pre-specified outcomes and are
appropriate for judging effects, attributing cause, comparing or ranking, classifying and
generalizing results. Quantitative Methods are:
• Suitable for large-scale projects.
• Useful for judging cause and effect.
• Accepted as credible.
Qualitative Methods take many forms including rich descriptions of people, places, and
conversations and behavior. The open-ended nature of qualitative methods allows the person
being interviewed to answer questions from his or her own perspective. Qualitative Methods are
appropriate for:
• Understanding the context in which a program takes place.
• Complex problems and process issues.
• Clarifying relationships between program objectives and implementation.
• Identifying unintended consequences of a program.
• Gathering descriptive information.
• Understanding operations and effects of programs.
• In-depth analysis of program impacts.
Multiple Methods combine qualitative and quantitative methods within one evaluation
study. This combination can be used to offset biases and complement strengths of different
methods. When using multiple methods, care should be taken to ensure that the selected
methods are appropriate to the evaluation questions and that resources are not stretched
too thinly. Multiple Methods are appropriate for:
• Understanding complex social phenomena.
• Allowing for greater plurality of viewpoints and interests.
• Enhancing understanding of both the typical and unusual case.
• Generating deeper and broader insights.
Step 6. Selecting Data Collection Techniques
There is no one best method to use when collecting data for project evaluation. Selection of a method or methods should be influenced by the type of information needed, the time available, and cost. Last, but not least, you should consider whether the information collected will be viewed as credible, accurate and useful by your organization.

A large array of methods exists which can be used in evaluation.
Step 7. Sampling for Evaluation
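The slide gives no detail on sampling, but one common approach at this step, simple random sampling without replacement, can be sketched as follows (the participant list and seed are invented for illustration):

```python
import random

def simple_random_sample(population, n, seed=None):
    """Draw n members from the population without replacement.

    A fixed seed makes the draw reproducible, which helps when the
    sampling procedure must be documented for stakeholders.
    """
    rng = random.Random(seed)
    return rng.sample(population, n)

# Hypothetical sampling frame of 200 program participants.
participants = [f"participant_{i}" for i in range(1, 201)]
sample = simple_random_sample(participants, n=20, seed=42)
print(len(sample))  # 20
```

Other designs (stratified, cluster, purposive) may fit better depending on the evaluation questions and resources.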
Step 8. Collecting, Analysing & Interpreting Data
Various kinds of data analysis exist for both quantitative and qualitative data. You should consider
whether the analyses would provide the information needed to answer the questions posed by the
evaluation and the analytical skills the evaluator possesses.

Qualitative Data Analysis


Analysis and interpretation of qualitative data are not simple technical processes like the analysis of quantitative data. Analysis of qualitative data is the process of bringing order to the data and organizing what there is into patterns, categories and basic descriptive units. Interpreting qualitative data is the process of bringing meaning to the analysis, explaining patterns, and looking for relationships and linkages among descriptive dimensions. The evaluator and/or stakeholders then make judgments about assigning value or worth to what has been analyzed and interpreted.

Characteristics of qualitative data analysis:


■ It begins as soon as data collection begins.
■ It is an iterative process that continues throughout data collection.
■ Issues of validity and reliability are expressed in terms of clarity, verifiability and replicability.
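As a minimal sketch of "bringing order to the data", interview excerpts might be tagged with codes and tallied into categories. The excerpts and code names below are invented examples, not data from the source.

```python
from collections import Counter

# Hypothetical coded interview excerpts: (excerpt, assigned code).
coded_excerpts = [
    ("The workshops fit my schedule", "access"),
    ("I learned new pruning techniques", "knowledge gain"),
    ("Travel to the site was difficult", "access"),
    ("Staff answered all my questions", "staff support"),
    ("I now test my soil every season", "behavior change"),
]

# Organize the material into categories and count how often each appears.
code_counts = Counter(code for _, code in coded_excerpts)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```

Because qualitative analysis is iterative, such tallies would be revisited as new excerpts are coded and categories are merged or split.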
Quantitative Data Analysis
Simple statistical analysis

Scales of measurement:
Scales of measurement refers to the type of variable being measured and the way it is measured.
Different statistics are appropriate for different scales of measurement. Scales of measurement include:

Nominal: mutually exclusive and logically exhaustive categories.
Examples: marital status; gender; group membership; religious affiliation.
Ordinal: ranked or ordered.
Examples: letter grades; social class; attitudinal variables.
Interval: ranked and ordered in standard units of measurement.
Examples: temperature in degrees; calendar year; scores on a test; IQ.
Ratio: an interval scale with an absolute zero starting point.
Examples: years of age; years of education; time; length; weight.
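Because different statistics are appropriate for different scales, an analysis script might dispatch on the scale of each variable. The mapping below is a common rule of thumb for central tendency, an assumption added for illustration rather than guidance from the source.

```python
import statistics

def central_tendency(values, scale):
    """Pick a central-tendency measure matching the scale of measurement."""
    if scale == "nominal":
        return statistics.mode(values)      # most frequent category
    if scale == "ordinal":
        return statistics.median(values)    # middle rank
    if scale in ("interval", "ratio"):
        return statistics.mean(values)      # arithmetic average
    raise ValueError(f"unknown scale: {scale}")

print(central_tendency(["single", "married", "married"], "nominal"))  # married
print(central_tendency([1, 2, 2, 3, 5], "ordinal"))                   # 2
print(central_tendency([21.0, 25.0, 30.0], "ratio"))                  # mean of the values
```

The same idea extends to dispersion (frequency tables for nominal, ranges for ordinal, standard deviations for interval and ratio data).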
Step 9. Communicate Findings
Evaluators have a responsibility to report their findings to stakeholders and other audiences
who may have an interest in the results. Communication with stakeholders should occur
throughout the evaluation process to help ensure meaningful, acceptable and useful results.

Reporting plan: Developing a reporting plan with stakeholders can help clarify how, when
and to whom findings should be disseminated.
• Who are the intended audiences?
• What information will be needed?
• When will information be needed?
• What reporting format is preferred?

Reporting results: A variety of reporting procedures may be used.

♦ Verbal reports.
♦ Audio-visuals.
♦ Journal or newspaper articles.
♦ Graphs, tables and charts.
♦ Newsletters, bulletins, and brochures.
♦ Poster sessions.
♦ Question-answer periods.
♦ Short communications.
♦ Executive summary.
♦ Public meetings.
♦ Personal discussions.

An Evaluation report usually contains:

A description of:
• The program.
• The purpose of the evaluation.
• The procedures used.
• Recommendations for future changes.
• An explicit justification of conclusions.
Step 10. Applying and Using Findings
• An evaluation should not be considered complete until the findings of the evaluation are applied.
References
How to Conduct Evaluation of Extension Programs – Murari Suvedi, Kirk Heinze, Diane Ruonavaara
