
GUIDANCE PROGRAM EVALUATION

4-MAED-GC-B
MAEGC 2213: Organization, Administration and Supervision of Guidance Services

Presentation by Jigs D. Yambao


INTRODUCTION
• Definition of Guidance Program Evaluation
The systematic process of collecting and analyzing data to determine the effectiveness, efficiency, and impact of a guidance program in meeting its intended goals and objectives.

• Importance of Evaluating Guidance Programs
Evaluation helps to ensure accountability, inform decision-making, identify areas for improvement, and demonstrate the value and impact of guidance services.

• Benefits of Effective Program Evaluation
Improved program quality, efficient resource allocation, data-driven decision-making, and increased stakeholder buy-in and support.
THE PROCESS
1. SET GOALS
2. CHOOSE METHODS
3. COLLECT DATA
4. ANALYZE RESULTS
5. INTERPRET FINDINGS
6. PLAN ACTIONS
7. SHARE RESULTS
8. FOLLOW UP
TYPES OF EVALUATION
1. FORMATIVE EVALUATION (ONGOING ASSESSMENT):
Conducted during the implementation of a program to provide feedback and make necessary adjustments.
2. SUMMATIVE EVALUATION (FINAL ASSESSMENT):
Conducted at the conclusion of a program to judge its overall effectiveness and the extent to which it met its goals.
3. PROCESS EVALUATION (EXAMINING PROGRAM IMPLEMENTATION):
Focuses on assessing the program's delivery, implementation, and operations.
4. OUTCOME EVALUATION (ASSESSING PROGRAM OUTCOMES):
Measures the extent to which a program has achieved its intended outcomes or goals.
EVALUATION MODELS
GOAL-BASED EVALUATION:
Assesses the extent to which a program has achieved its predetermined goals and objectives.

DECISION-MAKING EVALUATION:
Provides information to guide decisions about program continuation, modification, or termination.

EXPERTISE-ORIENTED EVALUATION:
Relies on the expertise and judgment of external evaluators or subject matter experts.

PARTICIPANT-ORIENTED EVALUATION:
Focuses on the perspectives and experiences of program participants and stakeholders.
EVALUATION CRITERIA
RELEVANCE (ALIGNMENT WITH GOALS AND NEEDS):
• Assessing whether the program addresses the identified needs and aligns with the stated goals and objectives.

EFFECTIVENESS (ACHIEVING INTENDED OUTCOMES):
• Determining the extent to which the program has achieved its desired outcomes or produced the intended results.

EFFICIENCY (OPTIMAL USE OF RESOURCES):
• Assessing whether the program is using resources (e.g., time, money, personnel) efficiently and economically.

IMPACT (LONG-TERM EFFECTS):
• Measuring the program's long-term effects or influence on participants, the organization, or the community.
DATA COLLECTION METHODS
SURVEYS AND QUESTIONNAIRES:
Gathering quantitative and qualitative data through structured or open-ended questions.

INTERVIEWS AND FOCUS GROUPS:
Conducting individual or group discussions to gather in-depth insights and perspectives.

OBSERVATIONS AND DOCUMENT ANALYSIS:
Directly observing program activities and systematically reviewing existing records and documents.

STANDARDIZED ASSESSMENTS AND TESTS:
Utilizing validated instruments or tests to measure specific outcomes or constructs.
QUANTITATIVE DATA ANALYSIS
DESCRIPTIVE STATISTICS (MEAN, MEDIAN, MODE, ETC.):
Summarizing and describing the basic features of the quantitative data.

INFERENTIAL STATISTICS (T-TESTS, ANOVA, ETC.):
Using statistical tests to draw conclusions and make inferences about the population based on the sample data.

CORRELATIONAL ANALYSIS:
Examining the relationship or association between two or more variables.

REGRESSION ANALYSIS:
Modeling the relationship between a dependent variable and one or more independent variables (a worked sketch of all four analyses follows below).
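To make these four analyses concrete, here is a minimal Python sketch using pandas and SciPy. The scores, column names, and sample size are hypothetical, invented purely for illustration; a real evaluation would run these on the program's actual survey or assessment data.

```python
# Illustrative sketch only: the data and column names are hypothetical.
# Requires pandas and scipy (pip install pandas scipy).
import pandas as pd
from scipy import stats

# Hypothetical pre/post outcome scores for six program participants.
df = pd.DataFrame({
    "pre_score": [58, 62, 55, 70, 65, 60],
    "post_score": [66, 70, 60, 78, 72, 69],
    "sessions_attended": [4, 5, 3, 8, 6, 5],
})

# Descriptive statistics: mean, std, quartiles, etc. for each column.
print(df.describe())

# Inferential statistics: paired t-test comparing post vs. pre scores.
t_stat, p_value = stats.ttest_rel(df["post_score"], df["pre_score"])
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")

# Correlational analysis: Pearson r between attendance and improvement.
improvement = df["post_score"] - df["pre_score"]
r, r_p = stats.pearsonr(df["sessions_attended"], improvement)
print(f"Pearson r = {r:.2f}, p = {r_p:.3f}")

# Regression analysis: simple linear model predicting post-test score
# from sessions attended (slope and intercept of the fitted line).
reg = stats.linregress(df["sessions_attended"], df["post_score"])
print(f"post_score ~ {reg.slope:.2f} * sessions + {reg.intercept:.2f}")
```

The paired t-test stands in here for the t-test/ANOVA family; with three or more groups to compare, scipy.stats.f_oneway would be the analogous one-way ANOVA call.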
QUALITATIVE DATA ANALYSIS
• CODING AND CATEGORIZING DATA:
Organizing and labeling qualitative data into meaningful categories or themes.

• IDENTIFYING THEMES AND PATTERNS:
Recognizing recurring ideas, concepts, or experiences within the qualitative data.

• NARRATIVE ANALYSIS:
Analyzing and interpreting stories or narratives shared by participants.

• CONTENT ANALYSIS:
Systematically analyzing the content of written or visual materials (a simple sketch follows below).
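The sketch below illustrates only the mechanics of coding and tallying themes: it tags interview excerpts against a small codebook and counts theme frequencies. The codebook, keywords, and excerpts are all hypothetical, and real qualitative coding relies on human judgment (often aided by tools such as NVivo or ATLAS.ti) rather than simple keyword matching.

```python
# Minimal sketch of coding and tallying themes; the codebook, keywords,
# and excerpts are invented for illustration.
from collections import Counter

# Hypothetical codebook: theme -> keywords that signal the theme.
codebook = {
    "career_confidence": ["career", "confident", "future"],
    "stress": ["stress", "anxious", "pressure"],
    "peer_support": ["friends", "classmates", "group"],
}

# Hypothetical interview excerpts.
excerpts = [
    "The sessions made me feel more confident about my career plans.",
    "I still feel a lot of pressure before exams.",
    "Talking with classmates in the group sessions really helped.",
]

# Code each excerpt: count every theme whose keywords appear in it.
theme_counts = Counter()
for text in excerpts:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(word in lowered for word in keywords):
            theme_counts[theme] += 1

# Report theme frequencies, most common first.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```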
EVALUATION REPORTING
EXECUTIVE SUMMARY:
A concise overview of the evaluation's purpose, methods, findings, and recommendations.

DETAILED FINDINGS AND RECOMMENDATIONS:
A comprehensive report presenting the evaluation results, analysis, and actionable recommendations for program improvement or decision-making.

DATA VISUALIZATION (GRAPHS, CHARTS, TABLES):
Using visual representations to clearly communicate and illustrate the evaluation findings (see the sketch below).

DISSEMINATION STRATEGIES:
Determining effective ways to share the evaluation report and findings with relevant stakeholders.
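As one possible illustration of data visualization for a report, this sketch draws a grouped bar chart of hypothetical pre- and post-program mean scores with matplotlib; the service names and values are invented for the example.

```python
# Illustrative sketch: a pre/post bar chart for an evaluation report.
# The service names and mean scores below are hypothetical.
import matplotlib.pyplot as plt
import numpy as np

services = ["Counseling", "Career Talks", "Peer Groups"]
pre_means = [61.2, 58.4, 63.0]
post_means = [69.8, 66.1, 70.5]

x = np.arange(len(services))  # one cluster of bars per service
width = 0.35                  # width of each bar

fig, ax = plt.subplots()
ax.bar(x - width / 2, pre_means, width, label="Pre-program")
ax.bar(x + width / 2, post_means, width, label="Post-program")
ax.set_ylabel("Mean outcome score")
ax.set_title("Hypothetical pre/post means by guidance service")
ax.set_xticks(x)
ax.set_xticklabels(services)
ax.legend()
fig.savefig("evaluation_findings.png", dpi=150)
```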
ETHICAL CONSIDERATIONS
INFORMED CONSENT AND CONFIDENTIALITY:
Obtaining informed consent from participants and ensuring the confidentiality and privacy of data.

OBJECTIVITY AND BIAS:
Maintaining objectivity and minimizing potential biases or conflicts of interest throughout the evaluation process.

CULTURAL SENSITIVITY AND INCLUSIVITY:
Ensuring that the evaluation process and methods are culturally appropriate and inclusive of diverse perspectives.

RESPONSIBLE USE OF EVALUATION FINDINGS:
Utilizing the evaluation findings ethically and responsibly to inform decision-making and program improvement while avoiding misuse or misrepresentation of data.
STAKEHOLDER INVOLVEMENT
• Engaging stakeholders throughout the evaluation process:
Involving relevant stakeholders (e.g., program staff, participants, administrators) at various stages of the evaluation to gather diverse perspectives and promote buy-in.

• Incorporating stakeholder feedback and perspectives:
Actively seeking and incorporating feedback and input from stakeholders to improve the evaluation design, implementation, and interpretation of findings.

• Promoting ownership and buy-in:
Fostering a sense of ownership and investment among stakeholders by involving them in the evaluation process, which can increase the likelihood of implementing recommendations and sustaining program improvements.
PROGRAM IMPROVEMENT
USING EVALUATION FINDINGS FOR PROGRAM ENHANCEMENT:
Utilizing the evaluation results and recommendations to identify areas for improvement and make data-driven decisions to enhance the program's effectiveness, efficiency, and impact.

CONTINUOUS QUALITY IMPROVEMENT CYCLES:
Implementing a cyclical process of evaluation, program modification, and re-evaluation to continuously monitor and improve the program's quality over time.

IMPLEMENTING CHANGES AND MONITORING PROGRESS:
Putting the recommended changes or improvements into practice and establishing systems to monitor and track the progress and impact of those changes.
CHALLENGES AND LIMITATIONS
• RESOURCE CONSTRAINTS (TIME, BUDGET, PERSONNEL):
Evaluations can be time-consuming and costly and require adequate human resources, which can pose challenges for programs with limited resources.

• DATA QUALITY AND VALIDITY CONCERNS:
Ensuring the reliability and validity of data collection instruments and methods, as well as addressing potential sources of bias or measurement error.

• RESISTANCE TO CHANGE AND EVALUATION:
Overcoming potential resistance or skepticism from stakeholders who may perceive evaluation as a threat or unnecessary burden.

• STRATEGIES TO MITIGATE THIS CHALLENGE:
These may include effective communication about the value and purpose of evaluation, involving stakeholders throughout the process, and fostering a culture of continuous improvement and data-driven decision-making.
THANK YOU FOR LISTENING!

GOD BLESS US!
