Evaluation of Training
Contents
Preface
Chapter 1. Introduction
1.1 Importance of training evaluation
1.2 Current ecosystem
1.3 Challenges
Chapter 2. Training evaluation model for Civil Service Training Institutions
2.1 Standardised models for Training Evaluation
2.1.1 Will Thalheimer's Training Evaluation Model
2.1.2 The Hamblin-Kirkpatrick model
2.1.3 The CIRO model
Chapter 3. Evaluation Plan
3.1 Steps for conducting training evaluation
3.2 Parameters for long term training programs
3.3 Parameters for short term programs
Chapter 4. Sub-committee Recommendation
Annexure 1. NSCSTI assessment parameter for Training Evaluation and Quality Assurance
Annexure 2. Pre-Training Questionnaire
Annexure 3. During the programme/ training – feedback form for trainees
Annexure 4. End of the training – feedback form
Annexure 5. Post training evaluation questionnaire (long term courses)
Annexure 6. Post training evaluation questionnaire (short term trainings)
Annexure 7. Post training evaluation questionnaire (long term trainings)
Annexure 8. Post training evaluation questionnaire (short term trainings)
Preface
CBC has developed an accreditation framework known as the National Standards for Civil Service
Training Institutions (NSCSTI), to benchmark the quality of all training institutes. The framework
will introduce minimum standards as a means for continuous improvement of Civil Service
Training Institutions (CSTIs).
Training Evaluation & Quality Assurance is one of the key pillars of the framework [1]. This pillar aims
to capture the extent to which institutes conduct training evaluation and subsequently use that analysis to improve course quality. The framework is based on a process maturity scale, rating
institutions on the extent of their Training Evaluation practices. It is designed as an evaluation and
a planning tool to enhance capacities of CSTIs in delivering training programs.
See Annexure 1 for the maturity levels in
Training evaluation & Quality Assurance as defined by the Capacity Building Commission (CBC).
The First Roundtable for Central Training Institutions (CTIs) was organized by the Capacity Building Commission (CBC) on 12th October 2021 [2]. The roundtable was attended by senior management
of 25 CTIs. As an outcome of the roundtable, six dedicated sub-committees were formed to drive
transformation across six key focus areas viz. (i) identification of training needs; (ii) promoting
knowledge sharing and creating a common knowledge repository; (iii) transformation to a
phygital world of capacity building; (iv) enhancing capacities of faculty; (v) embedding effective
assessment of training; and (vi) overcoming challenges in governance.
The Committee on ‘Embedding Effective Assessment of Training’ aims to support all training
institutions towards conducting training evaluation and updating courses based on the analysis.
To this effect, the committee members have created this guidance document for all training
institutions.
Sub-committee members:
1. Mr. S Behera, National Academy of Indian Railways (NAIR), Vadodara
2. Shri Abhishek Azad, National Institute of Defence Estates Management (NIDEM), Delhi
3. Bharat Jyoti, Indira Gandhi National Forest Academy (IGNFA), Dehradun
4. Dr. Brajesh Kumar, Arun Jaitley National Institute of Financial Management, Faridabad
[1] Source: https://www.nscsti.org/assets/pdf_doc/CBC_Approach%20Paper.pdf
[2] Source: https://pib.gov.in/PressReleseDetailm.aspx?PRID=1763318
5. N. Madhusudhana Reddy, Sardar Vallabhbhai Patel National Police Academy, Hyderabad
6. Mr. Kandarp V Patel, National Academy of Audit & Accounts, Shimla
Special Invitee:
1. Ms. Poonam Bhatt, Ex-Deputy Director, National Academy of Customs, Indirect Taxes and Narcotics (NACIN), Faridabad
Chapter 1. Introduction
Civil service officials are responsible for implementing policies and programs that impact the lives
of citizens. Training is a crucial tool for equipping civil service officials with the necessary
knowledge, skills, and competencies for public service delivery. Typically, learning interventions
are dispersed at specific points in an official’s career such as foundation training, mid-career
training etc. There is limited scope for continuous learning, and assessments rarely become inputs
to capacity building needs [3].
1.1 Importance of training evaluation
Training evaluation is the process of assessing the effectiveness of a training program. It involves
gathering data and feedback to determine whether the training program achieved its objectives
and whether it provided value to the participants. The purpose of training evaluation is to identify
strengths and weaknesses of the training program and make improvements to future training
programs. A comprehensive and effective evaluation plan is a critical component of any successful
training program. Training evaluation is an essential feature of the systematic approach to
training. Apart from measuring the impact of training on the trainee (Summative Assessment), the
evaluation also provides pointers to suggest certain changes in the design of the training, to make
it more effective (Formative Assessment).
Training evaluation is an essential component of capacity building initiatives. It helps to assess the effectiveness of training programs, identify areas of improvement, and ensure that the training meets the needs of the participants and related ministries/departments/organizations (MDOs).
Training evaluation also helps in the following:
1. Making informed decisions: Evaluation provides the information needed to make informed decisions about the effectiveness of the training program and to make necessary adjustments.
2. Identifying areas for improvement: Evaluation can identify areas where the training
program can be improved. This information can be used to make adjustments to the
training program and to tailor it to the needs of civil service officials.
3. Ensuring accountability: Evaluation helps to ensure that resources invested in training
are being used effectively. It provides accountability for the training program and ensures
that policymakers are aware of its impact.
4. Enhancing the quality of training: Evaluation can help to improve the quality of training
by identifying best practices and areas for improvement. This allows training providers to
make adjustments to the training program and to incorporate new and innovative
approaches.
By following a structured approach to training evaluation, institutes can develop and deliver high-quality training programs that contribute to the professional development of participants and the achievement of institutional and relevant MDO goals.
1.2 Current ecosystem
Most public organizations assess training outcomes in terms of the number of courses carried
out, number of employees trained, extent of training budget utilization and the feedback of the
trainees on the course, faculty, and training facilities. But the impact of training on the subsequent
job behaviour of the trainees is rarely assessed in government organizations [4].
As per a study conducted by Capacity Building Commission (CBC), at least 7 out of 25 Central
Training Institutes have feedback mechanisms for the course and/or the faculty.
[4] Functional Manual for Training Managers, UNDP, 2016
1.3 Challenges
Training and evaluation of civil service officials can be a complex and challenging task due to
several factors. Some of the challenges in training evaluation of civil service officials are:
1. Diverse job responsibilities: Civil service officials work in various government agencies
and departments, and their job responsibilities can vary greatly. Therefore, designing a
training program that meets the needs of all civil service officials can be challenging.
2. Limited resources: Civil Service Training Institutes have limited resources to invest in training programs and in training evaluation. Therefore, institutes must design cost-effective training evaluation methodologies.
3. Measuring training outcomes: Measuring the effectiveness of training programs can be
challenging. It can be difficult to determine if the training has resulted in improved job
performance or other outcomes.
4. Lack of accountability: There may be a lack of accountability in the training and
evaluation of civil service officials. Some civil service officials may not take training
seriously or may not be held accountable for their performance.
5. Removing ‘Training Waste’: Trainings that are not needed are conducted, while required trainings or other solutions that would be truly beneficial are not even identified.
To overcome these challenges, training and evaluation programs for civil service officials must be
designed with careful consideration of the unique needs of different government agencies and
departments. Additionally, trainers must employ effective strategies to motivate and engage
learners and measure the effectiveness of training programs.
Chapter 2. Training evaluation model for Civil Service Training Institutions
An ideal model for evaluating impact of civil service official training must provide a well-rounded
evaluation of the training program's effectiveness and its impact on both the officials and the
citizens they serve. Steps involved in this evaluation process are:
Participant-level evaluation:
1. Use feedback forms or questionnaires to gather participants' feedback on training content, delivery, and overall experience.
2. Assess participants' perception of the training's relevance, usefulness, and applicability to their roles.
Citizen-level evaluation:
1. Conduct surveys or interviews with citizens to assess satisfaction levels regarding service quality and effectiveness.
2. Analyse data on service delivery metrics like response time, accuracy, efficiency, and citizen feedback.
By incorporating these steps into the training impact evaluation of civil service officials, institutes
can obtain a comprehensive understanding of the training program's effectiveness. It allows for a
holistic evaluation that considers both the participants' perspectives and the impact on the
citizens. This approach helps institutes make informed decisions regarding training improvements,
policy changes, and resource allocations to enhance the overall effectiveness and impact of civil
service officials' training programs.
2.1 Standardised models for Training Evaluation
2.1.1 Will Thalheimer's Training Evaluation Model
Will Thalheimer's Learning-Transfer Evaluation Model (LTEM) [5] defines the following levels:
1. Level 0: Did not participate - This level represents those who did not participate in the
learning program and serves as a baseline for comparison.
2. Level 1: Attendance - This level measures the number or proportion of learners who
attended the learning program.
[5] Ref: https://www.worklearning.com/wp-content/uploads/2018/02/Thalheimer-The-Learning-Transfer-Evaluation-Model-Report-for-LTEM-v11.pdf
3. Level 2: Completion - This level measures the number or proportion of learners who
completed the learning program.
4. Level 3: Competence - This level measures the degree to which learners have acquired
the knowledge, skills, and abilities taught in the learning program, often assessed through
tests or other evaluations.
5. Level 4: Performance - This level measures the degree to which learners are able to apply
what they learned to their job or other relevant contexts, often assessed through
observation or other performance evaluations.
6. Level 5: Transfer - This level measures the degree to which learners are able to apply what
they learned in the learning program to their job, and the extent to which the organization
supports and reinforces learning transfer.
7. Level 6: Business results - This level measures the impact of the learning program on
business outcomes, such as increased productivity or revenue, reduced errors or accidents,
or other relevant metrics.
By measuring each of these levels, institutes can gain a more comprehensive and nuanced
understanding of the effectiveness of their learning programs and make data-driven decisions
about future trainings.
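The level tallies described above lend themselves to simple cohort statistics. Below is a minimal Python sketch (the cohort data is illustrative; level names follow the list above) showing how an institute might compute the share of learners reaching each level:

```python
# Hedged sketch: tallying learners by the highest LTEM-style level they
# reached (levels 0-6 as listed above). The cohort data is invented.
from collections import Counter

LEVELS = {
    0: "Did not participate", 1: "Attendance", 2: "Completion",
    3: "Competence", 4: "Performance", 5: "Transfer", 6: "Business results",
}

def level_distribution(learner_levels):
    """Return the share of learners at each level (highest level reached)."""
    counts = Counter(learner_levels)
    total = len(learner_levels)
    return {lvl: counts.get(lvl, 0) / total for lvl in LEVELS}

# Hypothetical cohort of 10 learners:
cohort = [1, 2, 2, 3, 3, 3, 4, 4, 5, 6]
dist = level_distribution(cohort)
print(f"Reached Competence or higher: {sum(v for k, v in dist.items() if k >= 3):.0%}")
```

A distribution like this makes it easy to see where learners drop off between completion and on-the-job transfer.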
2.1.2 The Hamblin-Kirkpatrick model
The Hamblin-Kirkpatrick model [6] evaluates training at four levels:
1. Level 1 – Reaction: This level measures how participants respond to the training. It
includes collecting feedback from participants through surveys or interviews to gauge their
satisfaction, engagement, and relevance of the training.
2. Level 2 – Learning: This level measures the extent to which participants have gained new
knowledge, skills, and attitudes as a result of the training. This could include assessing
performance through tests, assignments, or other measures.
3. Level 3 – Behaviour: This level measures whether participants have applied what they
learned in the training to their work. It includes evaluating changes in job performance or
behaviour, such as increased productivity, quality of work, or changes in behaviour.
4. Level 4 – Results: This level measures the impact of the training on the MDO as a whole.
It includes evaluating the business outcomes or benefits of the training, such as increased
revenue, reduced costs, or improved customer satisfaction.
[6] The model was developed by Donald Kirkpatrick and later extended by A.C. Hamblin
By evaluating training programs at all four levels, institutes can gain a comprehensive
understanding of the effectiveness of their training initiatives and make informed decisions about
future investments in training and development.
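The four levels above can be summarised with a short script. A hedged sketch follows; the metric names, 1-5 scales and scores are illustrative assumptions, not prescribed by the model itself:

```python
# Hedged sketch: averaging scores gathered at each Kirkpatrick-style level.
# All data below is invented for illustration.
def summarise(evaluation):
    """Map each level name to the mean of its collected scores (1-5 scale)."""
    return {level: round(sum(scores) / len(scores), 2)
            for level, scores in evaluation.items() if scores}

results = summarise({
    "Reaction":  [4, 5, 4, 3],   # end-of-training feedback ratings
    "Learning":  [3, 4, 4, 5],   # post-test scores rescaled to 1-5
    "Behaviour": [3, 3, 4],      # supervisor observation ratings
    "Results":   [4],            # MDO-level outcome rating
})
print(results)
```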
2.1.3 The CIRO model
The CIRO model evaluates a training program in four steps:
1. Content: The first step in evaluating the effectiveness of a training program is to assess its
content. This involves reviewing the program's objectives, curriculum, and teaching
methods to ensure they are aligned with the needs of trainees. It is essential to ensure that
the training covers all the critical topics and skills required to perform the duties.
2. Input: The second step is to evaluate the input, which refers to the resources and effort
invested in the training program. This includes assessing the qualifications and expertise
of the trainers, the quality of training materials, and the facilities and equipment provided
for the training. It is also important to ensure that the training is delivered in a conducive
and supportive learning environment.
3. Reaction: The third step in the CIRO model is to evaluate the reaction of the civil service
officials who participate in the training. This involves gathering feedback from the trainees
to determine their satisfaction with the training program, the relevance of the content, and
the effectiveness of the teaching methods. Feedback can be collected through surveys,
interviews, or focus group discussions.
4. Output: The final step is to evaluate the output, which refers to the impact of the training
program on the performance of civil service officials. This involves assessing the extent to
which the training has improved the skills and knowledge of the officials, and whether it
has resulted in improved job performance. This can be measured through performance
metrics, such as productivity, efficiency, and effectiveness.
Overall, the CIRO model provides a comprehensive framework for evaluating the effectiveness of
short-term training programs for civil service officials. By assessing the content, input, reaction,
and output of the training, it is possible to identify areas of improvement and make necessary
adjustments to ensure that the training is effective in meeting the needs of civil service officials
and improving their job performance.
Chapter 3. Evaluation Plan
The Parameters/ Competencies which will be evaluated for assessing the impact of training may
be grouped into three broad categories viz. Behavioural, Functional and Domain related. The
weightage given to each such parameter/ competency may depend on the broad objective and
focus of the training programme conducted and being evaluated.
The levels assessed may be assigned a numerical value, as detailed below, using an appropriate methodology to assess the impact of the training. Open-ended answers may be analysed using text analytics software to index and discern the relevant information.
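As a minimal stand-in for the text analytics software mentioned above, open-ended answers can be tagged against the three competency groups. The keyword lists below are invented for illustration; real indexing software would be far richer:

```python
# Hedged sketch: tagging open-ended feedback answers with the three
# competency groups named above (Behavioural, Functional, Domain).
# The keyword lists are hypothetical examples only.
KEYWORDS = {
    "Behavioural": {"communication", "teamwork", "attitude"},
    "Functional":  {"procedure", "process", "filing"},
    "Domain":      {"taxation", "forestry", "audit"},
}

def tag(answer):
    """Return the competency groups whose keywords appear in the answer."""
    words = set(answer.lower().split())
    return sorted(group for group, kws in KEYWORDS.items() if words & kws)

print(tag("The teamwork exercises and audit case studies were useful"))
```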
3.1 Steps for conducting training evaluation
Indicative steps for conducting training evaluation of a capacity building program are:
1. Establish clear objectives: It is essential to have clear objectives for the training program.
It will provide a basis for the evaluation and help to measure the effectiveness of the
training program against the set objectives.
2. Identify evaluation criteria: Identify the evaluation criteria that will be used to assess the
effectiveness of the training program. These could include measures of knowledge gained,
changes in behaviour or attitudes, and feedback from participants and stakeholders.
3. Collect data: Collect data through a variety of methods, such as surveys, interviews, and
observation. It's essential to gather both quantitative and qualitative data to gain a
comprehensive understanding of the impact of the training program.
4. Analyse data: Analyse the data to determine the effectiveness of the training program.
This involves comparing the data against the evaluation criteria and the objectives of the
training program.
5. Report findings: Report the findings of the evaluation to stakeholders. This should include
a summary of the evaluation criteria, data collected, and the conclusions drawn from the
analysis. The report should also include recommendations for improving the training
program.
6. Take action: Based on the findings of the evaluation, take action to improve the training
program. This could include revising the training content, delivery method, or evaluation
methods to better meet the needs of participants and the organization.
3.2 Parameters for long term training programs
1. Attendance forms: This can be used for assessing the motivation of trainees to attend
each course during the conduct of the programs.
2. Activity: The activity participation of the trainees can be assessed through the trainers' evaluation reports, based on individual contributions to group work and assignment submissions.
3. Learner perception: The learners' expectations from the training can be captured as part of training needs analysis. The perception of the trainees on course contents, design, delivery and allied aspects may be assessed through an end-of-training feedback form.
4. Knowledge: The knowledge imparted by the training program may be assessed through quizzes, periodic tests, viva and multiple-choice-question-based concurrent assessments.
5. Decision-making competence: The decision-making competence of the trainees, based on the training imparted, can be captured by administering case studies and focus group discussions. The skills gained by the trainees can be assessed by the trainers in a trainer's report.
6. Task competence: The task competence gained by the trainees can be assessed by the
trainers by providing scenario-based mock drills and evaluation of the application of
skills learned by the trainees.
7. Transfer and transfer effects: The transfer and transfer effects of the training program can be assessed through a self-assessment online survey of the trainee, as well as the effect felt by the organisation and other stakeholders. The overall impact and societal benefits may be assessed by the organisation based on improvements achieved in its interface with clients and other stakeholders, comparing pre- and post-training analyses of the trainee's job-related aspects.
Levels                          Weightage (%)    Group weightage (%)
Attendance                      05               15
Activity                        10
Knowledge                       10               40
Decision-making competence      15
Task competence                 15
Transfer                        15               35
Transfer effects                20
Refer annexure 2, 3, 4, 5 and 7 for formats for data collection
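The long-term weightages above can be combined into a single composite score. A hedged sketch follows; note that the tabulated weights sum to 90, so the composite is normalised by that total, and the 0-100 level scores are illustrative:

```python
# Hedged sketch: weighted composite from the long-term weightage table
# above (Attendance 5, Activity 10, Knowledge 10, Decision-making 15,
# Task 15, Transfer 15, Transfer effects 20; total 90).
WEIGHTS = {
    "Attendance": 5, "Activity": 10, "Knowledge": 10,
    "Decision-making competence": 15, "Task competence": 15,
    "Transfer": 15, "Transfer effects": 20,
}

def composite(scores):
    """Weighted average of per-level scores, each on a 0-100 scale."""
    total = sum(WEIGHTS.values())
    return sum(WEIGHTS[level] * scores[level] for level in WEIGHTS) / total

# Hypothetical per-level scores for one trainee:
scores = {"Attendance": 90, "Activity": 80, "Knowledge": 70,
          "Decision-making competence": 60, "Task competence": 75,
          "Transfer": 65, "Transfer effects": 70}
print(f"Composite score: {composite(scores):.1f}")
```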
3.3 Parameters for short term programs
The CIRO model of training assessment, a four-step process used to evaluate the effectiveness of a training program, is useful for evaluating short-term trainings. The steps are:
1. Context: The context of the training need, in terms of the trainees' and the organisation's perceptions, may be assessed before the commencement of the training through an online survey by the CTI.
2. Inputs: The trainee’s perception of training inputs and their utility may be assessed at the
CTI level by gathering their opinions about the content, pace, methodology, tutorial
support, learning materials and the facilities made available.
3. Reactions: The reactions of the trainees in terms of knowledge, skill and other benefits of
the training can be captured through an end of the training questionnaire as well as
trainer’s report generated based on group discussion with the trainees.
4. Outputs: The transfer effects of the training in terms of output of the training program
may be assessed through a self-assessment online survey with the trainee as well as its
effect felt by the organisation.
Levels                                      Weightage (%)    Group weightage (%)
Context (Learner perception on training)    10               10
Input                                       15               55
Reactions                                   40
Output                                      35               35
Refer annexure 6 and 8 for formats for data collection
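The CIRO weightages above can be sanity-checked in a few lines: each group weightage should equal the sum of its members, and the group totals should sum to 100. The grouping follows the table above:

```python
# Hedged sketch: consistency check of the CIRO weightage table above.
LEVEL_WEIGHTS = {"Context": 10, "Input": 15, "Reactions": 40, "Output": 35}
GROUPS = {
    "Context": ["Context"],
    "Input & Reactions": ["Input", "Reactions"],
    "Output": ["Output"],
}
GROUP_WEIGHTS = {"Context": 10, "Input & Reactions": 55, "Output": 35}

# Each group weightage equals the sum of its member levels' weightages,
# and the groups together cover the full 100%.
for name, members in GROUPS.items():
    assert sum(LEVEL_WEIGHTS[m] for m in members) == GROUP_WEIGHTS[name]
assert sum(GROUP_WEIGHTS.values()) == 100
print("CIRO weightages are internally consistent")
```

The same check can be reused whenever an institute adjusts weightages for a particular programme.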
Chapter 4. Sub-committee Recommendation
Training evaluation is crucial for civil service training institutions to ensure that they are accountable, continuously improving, and meeting training goals. By conducting regular evaluations of their training programs, civil service training institutions can ensure that their training programs are effective, efficient, and contribute to the success of the trainees and the institute. Some of the sub-committee's recommendations for institutions are:
1. Conduct training evaluation for all core courses: The institutes should set procedures to regularly conduct training evaluation for all induction, mid-career trainings and core
courses of the institute for every batch of trainees. The institute should promote
continuous improvement by identifying gaps and making necessary changes and
improvements.
2. Use a variety of evaluation methods: To get a comprehensive understanding of the
effectiveness of a training program, it's important to use a variety of evaluation methods
such as pre- and post-training assessments, feedback surveys, focus groups, and
interviews. This will help in getting feedback from participants, trainers, and supervisors,
which can be used to make improvements in future training programs.
3. Use standardized evaluation tools: It's important to use standardized evaluation tools
(such as LTEM, The Hamblin-Kirkpatrick model, LTAIE3M, CIRO, etc.) that are valid and
reliable to ensure consistency in the evaluation process. This will help in comparing the
effectiveness of different training programs and making data-driven decisions about
training improvements.
4. Conduct evaluations at multiple time points: Evaluations should not be limited to the
end of the training program. It's important to conduct evaluations at multiple time points
such as during the training, after the training, and a few months after the training to assess
the long-term impact of the training.
5. Evaluate the transfer of learning: It's important to evaluate whether the participants
were able to transfer the knowledge and skills learned during the training program to their
job tasks. This can be done by conducting post-training assessments, observing their job
performance, and soliciting feedback from their supervisors.
6. Validate the outcome of impact assessment: Quantify the impact of the programme on
trainees and compare a set of outcomes between participants and non-participants of the
programme. This helps to determine the extent to which the program contributes to
positive changes, identify specific impacts attributable to the program and areas where
further enhancements are needed.
7. Citizen centric approach: A citizen-centric approach in training and impact evaluation is essential for improving service delivery, enhancing citizen trust and confidence, fostering accountability and transparency, and promoting citizen engagement. It contributes to creating public value and building a responsive and effective civil service that meets the needs of citizens.
8. Use the evaluation results to make improvements: Finally, it's important to use the
evaluation results to make improvements in future training programs. The feedback
obtained from evaluations should be carefully analysed, and action plans should be
developed to address areas for improvement.
Annexure 1. NSCSTI assessment parameter for Training Evaluation and Quality Assurance
For each task, the five maturity stages are defined as follows:

1. To what extent does the institute have well-defined procedures for updating training courses?
Stage I: The institute has no standard operating processes for updating of training courses.
Stage II: The institute has standard procedures defined for updating training courses, but these are not followed.
Stage III: The institute has well-defined procedures, followed for all training courses and training programmes on an as-needed basis.
Stage IV: The institute has well-defined procedures for updating and revising content, followed periodically.
Stage V: The institute has well-defined procedures for updating and revising training content, followed periodically. The Institute analyses the evidence/data collected and clearly defines action items to address the areas of concern.

2. To what extent does the Institute conduct cost benefit analysis of training programmes to evaluate their effectiveness?
Stage I: The Institute does not allocate separate budgets across training programmes offered.
Stage II: The Institute allocates budget across multiple training programmes; however, it does not monitor utilization or consumption of the financial resources for all allocated training programmes.
Stage III: The Institute has separate budget allocation for all training programmes and evaluates utilization of budget by monitoring financial resources across all training programmes. The Institute only partially utilizes the training budget allocated to the training programmes (utilization <90%).
Stage IV: The Institute has separate budget allocation for all training programmes and evaluates utilization of budget by monitoring financial resources across all training programmes. The Institute almost completely utilizes the training budget allocated for a financial year (utilization >90%).
Stage V: The Institute has separate budget allocation for all training programmes and evaluates the utilization of budget by monitoring financial resources across all training programmes. The Institute also defines strategy and formulates measures to enhance utilization percentage, strongly aligned with strengthening the capacity building of the institute and faculty members (either through enhancing quality or quantity of teaching & learning) on a periodic basis.

3. How does the Institute measure teaching & learning effectiveness through engagement with concerned stakeholders (officer trainees, demand-side agencies etc.)?
Stage I: The Institute does not engage with any stakeholder for evaluating the teaching & learning effectiveness of training courses.
Stage II: There are several mechanisms defined for engaging with multiple stakeholders (including collection of feedback from trainees, ministries & departments etc.) for evaluating training courses' effectiveness, but these are not followed or rarely followed by the Institute.
Stage III: The Institute only collects feedback, which is used as a mechanism for evaluating teaching & learning effectiveness of each training course.
Stage IV: The Institute has comprehensive mechanisms defined for engaging with multiple stakeholders for evaluating teaching & learning effectiveness. The Institute employs several modes of collecting feedback (verbal/non-verbal, survey-based) at various stages of training courses.
Stage V: Mechanisms are defined for soliciting feedback from trainees using multiple methods, and the Institute engages with multiple stakeholders for evaluating teaching & learning effectiveness. The Institute analyses the evidence/data collected and clearly defines action items to address the areas of concern.

4. Does the Institute have defined procedures for ensuring achievement of learning outcomes of intended stakeholders?
Stage I: The Institute does not have standard documents for articulating learning outcomes or identified impact from the training courses.
Stage II: The Institute has standard documents for articulating the learning outcomes for <49% of all the training courses offered.
Stage III: The Institute has standard documents for articulating the learning outcomes for 50-69% of all existing training courses offered.
Stage IV: The Institute has standard documents for articulating the learning outcomes for 70-89% of all existing training courses offered.
Stage V: The Institute has standard documents for articulating the learning outcomes for >90% of all training courses offered. Additionally, all training programmes undertaken by the Institute have strong linkages to functional, domain and behavioural competencies, articulated in standard documents on learning outcomes.

5. To what extent does the Institute have well-defined pre- and post-training procedures?
Stage I: The institute has no standard pre- and post-training procedures.
Stage II: The institute has standard training evaluation procedures during the training that are followed for all training courses and programmes.
Stage III: The institute has standard training evaluation procedures during and post the training that are followed for all training courses and programmes.
Stage IV: The institute has standard pre-, post- and during-training procedures that are regularly followed for all training courses and programmes.
Stage V: The institute has standard pre-, post- and during-training procedures that are regularly followed for all training courses and programmes, the findings from which feed into modifying training content and training delivery, and retraining faculty if needed.
Annexure 2. Pre-Training Questionnaire
1. To help us to identify your key learning expectations, please complete the following
statements.
2. Please rate your degree of knowledge for the following: (5 being the highest, 1 being
the lowest and 0 meaning “not applicable”)
Particulars 1 2 3 4 5 0
Understanding written containing work-
related documents
Paying attention to what people
are saying
Providing the right feedback to colleagues
Knowing how to ask the right
questions
Addressing critical issues in the right
manner
Understanding other’s point of view
Recognizing and rewarding hard work
Knowing how to facilitate constructive
communication
Working effectively in a changing
environment
Knowing how to be flexible and open to
new ideas
Getting things done in a timely
manner
Having a complete technical understanding
3. What should the training achieve for you in terms of professional development?
4. Please select the relevant topics that reflect your most important training needs
5. Is there anything else you would like to know from us?
Annexure 3. During the programme/ training – feedback form for trainees
Annexure 4. End of the training – feedback form
1 Tours/Attachments/Exercises
2 Physical Training/Yoga/Games
Annexure 5. Post training evaluation questionnaire (Long term courses)
We would like you to kindly recall aspects related to the above training while answering
the following questions.
2. How do you use the learning from the Training/Workshop? (Please tick (✓) in the relevant cell.)
B. For domain-related competencies:
(Please recall and evaluate whether you feel your knowledge/skills have improved, and grade
them on the following scale)
S. No. Topic 5 4 3 2 1
1 Knowledge level
2 Skill level
3 Attitude / Perception
5 being the highest, 1 being the lowest.
4. How far are the inputs given during the training useful/relevant for the following?
S. No. Topic 3 2 1 0
1 Immediate Job
2 Future Job
3 Professional growth
3 being the highest, 1 being the lowest and 0 meaning “not applicable”.
5. Did you utilize the learning/associations and linkages from the training after returning
to your place of work?
Yes-Y, No-N
S. No. Topic Y N
1 Training material
2 Resource persons/experts
3 Peer contacts
4 Software/apps
5 Training Institute
6 Other: (please add here)
6. Did you use the material provided during the training after returning to your place
of work?
If yes, please list specific instances when you used the training material
1)
2)
3)
1)
2)
B. Would you like to attend any advanced training in the same/similar subject area
in future?
Annexure 6. Post training evaluation questionnaire (short term trainings)
For ascertaining training effectiveness by participants.
We would like you to kindly recall aspects related to the above training while answering
the following questions.
1. Evaluation of thematic learning from the training
S. No. Topic 3 2 1 0
1 Knowledge level
2 Skill level
3 Attitude / Perception
Please evaluate how you feel your knowledge/skills have improved on the following scale: 3 being
the highest, 1 being the lowest and 0 meaning “not applicable”.
2. How far are the inputs given during the training useful/relevant for the following?
S. No. Topic 3 2 1 0
1 Immediate Job
2 Future Job
3 Professional growth
Please evaluate it on the following scale: 3 being the highest, 1 being the lowest and 0 meaning
“not applicable”.
3. Did you utilize the learning/associations and linkages from the training after returning
to your place of work?
Yes-Y, No-N
S. No. Topic Y N
1 Training material
2 Resource persons/experts
3 Peer contacts
4 Software/apps
5 Training Institute
6 Other: (please add here)
To what extent do you feel you have been able to apply the learning from the programme?
(Please circle the score number that you feel most closely represents your views)
Annexure 7. Post training evaluation questionnaire (long term trainings)
1. What do you think about the usefulness of the training to the department?
HR AR NR NA
3. (a) Did the officer share the key learnings from the training with his team
members/other stakeholders? Yes/No
(b) If yes, how was it organized?
S. No. Sharing mechanism Yes No
1 Presentation/discussion with the team
2 Written report on Training
3 Hands on transfer of learning
4 Passive learning by team members
5 Other: (please add here)
4. Given below are certain parameters with respect to the training undergone by the
Officer. Please assess the improvement in respect of each of these, duly considering
the interest shown by the Officer in his current job.
5. Some of the new ideas that the officer got at the end of the training course are
enclosed herewith. Please comment to what extent the Officer could translate these
ideas into action.
6. Which component of the training do you consider to have been most effective for his
present job?
a. Field exposure / On job Training
b. Group activity
c. Classroom session
d. Hands-on / practical exposure
e. Brainstorming / Discussion
f. Any other
7. How do you intend to utilize the expertise gained by the officer as a result of training?
8. Please indicate three interventions that need to be done by the department to enable
the Officer to improve his performance further in the current job. (Creating an enabling
environment)
9. Please indicate suggestions, if any, for further improvement of effectiveness of the
training:
Annexure 8. Post training evaluation questionnaire (short term trainings)
1. What do you think about the usefulness of the training to the department?
HR AR NR NA
2. Some of the new ideas that the officer got at the end of the training course are
enclosed herewith. Please comment to what extent the Officer could translate these
ideas into action.
3. How do you intend to utilize the expertise gained by the officer as a result of training?
4. How has training helped the officer to increase his overall understanding of the job?