
Assessment Processes and Protocols:

Guide for Short-Term Skill


Training Programs

October 2020
Table of Contents
1. Contents 1
2. List of Figures 3
3. List of Tables 3
4. Glossary 4
5. Acronyms 6
6. Executive Summary 7
7. Scope 8
8. Background 9
9. Acknowledgements 10
10. Key Stakeholders 11
11. Assessment Design 20
11.1. Context of the Assessments 20
11.2 Assessment Criteria 21
11.3 Assessment Blueprint 21
11.4 Mapping of Performance Criteria with Items 28
11.5 Marks Allocation 29
11.6 Time Allocation 29
11.7 Extrapolation of Scores 30
11.8 Types of Items 30
11.9 Difficulty Levels 31
11.10 The Role of Assessors in Assessments 33
11.11 Assessor Guide 33
11.12 Target Group & Language 34
11.13 Assessment Blueprint and Item Bank Approval & Review 35
11.14 Generating the Test Form 35
12. Assessment Planning 36
12.1 Annual Assessment Planning 36
12.2 Management of Assessment Agencies 36
12.3 Mode of Assessment Delivery 38
12.4 Digital Assessment Interface 40
12.5 Types of Proctoring Solutions 40
12.6 Assessment Personnel within the SSC 41
12.7 Orientation Required by Personnel for Conducting Assessments 42
12.8 Certification of Assessors 44
12.9 Recognition of Proctors 44

13. Assessment Administration 45
13.1 Assessment Activities 45
13.2 Assessment Timelines 47
13.3 Dropouts and Absentees 47
13.4 Code of Conduct 47
13.5 Evidence Gathering 50
13.6 Results 51
13.7 Re-evaluation 52
13.8 Re-assessment 52
14. Quality Assurance 53
14.1 Revision of Blueprint 53
14.2 Item Bank Review 53
14.3 Item Health Monitoring 54
14.4 Malpractice & Grievance Redressal 57
14.5 Accessibility in Digital Assessments 59
14.6 Data Security 60
14.7 Reports and Analytics 60
15. Conclusion 65
16. Annexures 66
16.1 Annexure 1: Tips for Constructing Effective Items 66
16.2 Annexure 2: Instructions for Candidates Taking Digital Assessments 67
16.3 Annexure 3: Sample Assessment Logs for Online Assessments 69
16.4 Annexure 4: Sample Scorecard 70
16.5 Annexure 5: Proctor’s Handbook 71
16.6 Annexure 6: Revised AA Empanelment Matrix 74
16.7 Annexure 7: Sample Assessment Schedule 79
16.8 Annexure 8: Training of Assessor Program and Eligibility Criteria 80
16.9 Annexure 9: Augmented Reality & Virtual Reality in Assessments 80
16.10 Annexure 10: Sample Item Analysis Reports 85

Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs

List of Figures

Figure 1: Summary of Key Activities for developing a Blueprint 22
Figure 2: Sample Cover of Assessment Blueprint 23
Figure 3: Detailed Blueprint Spreadsheet 25
Figure 4: Blueprint Conclusion 26
Figure 5: Bloom’s Taxonomy 27
Figure 6: Example of Assessment Team Structure at SSC/AA 42
Figure 7: Sample Item Analysis Report 61
Figure 8: Sample Analytics Report 62
Figure 9: Sample NOS-wise Analytics 63
Figure 10: Sample Analytics Dashboard 63
Figure 11: Sample Analytics Dashboard 2 64
Figure 12: Sample Analytics Dashboard 3 64

List of Tables

Table 1: Parameters for selection of SMEs with AAs/ SSCs 18
Table 2: Ownership Matrix of Assessment Activities 19
Table 3: Key Elements of a Blueprint 22
Table 4: Description of Sample Cover of Assessment Blueprint 24
Table 5: Description of Blueprint Spreadsheet 26
Table 6: Description of Blueprint Conclusion 27
Table 7: Sample Marks Allocation as per Difficulty Level 29
Table 8: Sample Marks Allocation based on PC Distribution 29
Table 9: Sample Matrix - Difficulty level and Time Duration 29
Table 10: Approximate Batch Size for Assessment 29
Table 11: Question Types 30
Table 12: Sample Difficulty Matrix across NSQF levels 32
Table 13: Sample Grading Matrix for Scenario-based Questions 33
Table 14: Sample Grading Matrix for Viva Voce Questions 33
Table 15: Sample Template for Assessor Guide 34
Table 16: Suggested Parameters for Performance Evaluation of AAs 36
Table 17: Suggested Matrix for Performance Evaluation of AAs 37
Table 18: Recommended Revenue Sharing between SSC and AA 37
Table 19: Parameters for Reviewing an Item Bank 53
Table 20: Difficulty Index and Correlation with Items 55
Table 21: Description of Sample Item Analysis Report 61

Glossary
• Artificial Intelligence (AI): AI refers to system intelligence that enables machines to learn from
experience, adjust to the environment, and thereby perform human-like activities. Using deep learning
algorithms, machines can be trained to accomplish specific tasks by processing large amounts of data
and recognizing patterns. Some benefits of using AI in assessments are flagging of anomalies, reduction
in cheating, reduced human intervention, precision in testing and auditability.

• Artificial Intelligence (AI) based Proctoring: AI-based proctoring solutions use machine intelligence
for proctoring. Techniques such as object detection, face detection, and anomalous-behavior detection
rely on smart algorithms to prevent, flag, and record malpractice.

• Assessment: An assessment is a test, examination, or observation in which a sample of an examinee’s
behavior in a specified domain is obtained and subsequently evaluated and scored using a standardized
process. TVET assessments focus on gathering cumulative information on a candidate’s knowledge,
skills/competencies, abilities, traits, dispositions, values, and other characteristics, in a specific domain,
at or for a specific span of time.

• Augmented Reality (AR): AR is an interactive experience of a real-world environment where the objects
that reside in the real world are enhanced by computer-generated perceptual information, sometimes
across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. The
computer-generated virtual elements are projected over real-life physical surfaces using devices such as
smartphone screens.

• Assessment Blueprint: The Assessment Blueprint is a document that is attached to the qualification
that is to be assessed. The Assessment Blueprint gives a detailed outline of how the assessment shall be
designed, allowing stakeholders to define the complex relationship between performance outcomes/
assessment criteria, theory and practical items, difficulty levels, time and marks allocated to each
question, assessment methodology, and the evaluation thereof.

• Assessment Platform: An Assessment Platform is the software that facilitates the assessment. It refers
to the platform where questions are hosted, proctoring features are configured and the interface that
the candidate/ assessor/ proctor interacts with for conduct of the assessment.

• Assessor Guide: The Assessor Guide is a manual used by an assessor for conducting and evaluating
assessments. An Assessor Guide is based on a QP/NOS and gives a clear standard operating procedure
on how one must assess a specific qualification.

• Certification: Certification (or award, credential, license, diploma, degree, etc.) follows the process of
assessment and is a record that the individual’s competency has been validated. The certificate is usually
issued by an Awarding Body, which has public trust and competence; conferring official recognition of
an individual’s value in education, the labour market and training.

• Competency: Competency describes a cluster of related knowledge, skills and attitudes that are
observable and measurable, necessary to perform a work activity independently at a prescribed
proficiency level.

• Competency-based Assessment: Process of collecting evidence to show that candidates are able to
perform to the required standards of a particular job or a specific task in the workplace and making a
judgement on the competency of the candidate.

• Digital Assessment: Any assessment that can be administered using a technology platform is termed
as Digital Assessment. Digital Assessments use designated, secure, testing devices such as computers,
tablets or smartphones. Digital Assessments can be conducted online (using internet) or offline (without
internet - with Local Area Network (LAN)/pre-loaded application on the device).


• Distractors: The incorrect alternatives or choices in a selected response item.

• IP-based Camera: IP-based cameras are fixed hardware installed at the Assessment Center to record
and monitor the complete assessment process, making the video recording available to stakeholders.

• Item: An individual question in a test that requires the test-taker to produce an answer.

• Item Bank: An item bank is a repository of test items. A selection of items from the item bank is used for
creating a test form that assesses competency in a certain domain.

• Local Area Network (LAN): A LAN is a computer network that interconnects computers/tablets within
a limited area such as a residence, school, laboratory, university campus or office building.

• Non-Digital Assessments: Non-Digital Assessments are administered completely without the usage
of technology, using pen-paper mode for theory assessment. The assessments are administered by
the Assessor on-ground at the assessment location and may be supported by a proctor. The assessor
manually processes the assessment score and shares it with the Assessment Agency.

• Offline Digital Assessments: Offline Digital Assessments are a subset of digital assessments and are
conducted on assessment servers created locally (e.g. Local area network (LAN) or on an application
within which the assessment is preloaded). This mode of assessment is preferred for locations where
internet bandwidth and speed are not as per the requirement of the assessment platform.

• Online Assessments: Online Assessments are a subset of digital assessments and are conducted using
internet over a secure server on a technology platform. Online assessments can be conducted on-
ground or remotely using a combination of online proctoring tools like remote human proctoring and
auto proctoring tools.

• Remote Assessments: Remote Assessments are a subset of online assessments and allow the test-taker
to undergo an assessment from a remote location. Remote assessments are monitored through live-
streaming (via a human proctor) and/or automated proctoring tools.

• Simulations: Simulations are technology-driven manifestations of real-life scenarios/situations on a
digital device. Simulation-based assessments are used to assess how one would react to situations one
may encounter while working and how one thinks critically to solve problems.

• Stem: A question or statement followed by a number of choices or alternatives that answer or complete
the question or statement. (Stems are most commonly found in multiple-choice questions.)

• Test Form: A Test Form is a question paper/task set that contains a selection of items from an item bank,
such that it aligns with the requirements and mapping in the Assessment Blueprint.

• Virtual Reality (VR): VR is the use of computer technology to create a simulated environment. Unlike
traditional user interfaces, VR places the user inside an experience. Instead of viewing a screen in front
of them, users are immersed and able to interact with three-dimensional worlds in a seemingly real or
physical way using special electronic equipment, such as a helmet with a screen inside or gloves fitted
with sensors.
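The relationship among several of the terms above — item bank, assessment blueprint, and test form — can be illustrated with a short sketch. All structures and field names below are hypothetical, chosen only to show blueprint-driven item selection; the Guide does not prescribe any particular implementation:

```python
import random
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    nos_code: str      # National Occupational Standard the item maps to
    difficulty: str    # e.g. "easy" | "medium" | "hard"
    marks: int

def generate_test_form(item_bank, blueprint, seed=None):
    """Select items from the bank so that the test form matches the
    blueprint's per-NOS, per-difficulty item counts."""
    rng = random.Random(seed)
    form = []
    for (nos_code, difficulty), count in blueprint.items():
        pool = [i for i in item_bank
                if i.nos_code == nos_code and i.difficulty == difficulty]
        if len(pool) < count:
            raise ValueError(f"Item bank too small for {nos_code}/{difficulty}")
        form.extend(rng.sample(pool, count))
    return form

# Hypothetical bank: three 1-mark easy items and two 3-mark hard items on one NOS
bank = [Item(f"Q{n}", "NOS1", d, m)
        for n, (d, m) in enumerate([("easy", 1), ("easy", 1), ("easy", 1),
                                    ("hard", 3), ("hard", 3)])]
blueprint = {("NOS1", "easy"): 2, ("NOS1", "hard"): 1}
form = generate_test_form(bank, blueprint, seed=42)
print(len(form), sum(i.marks for i in form))  # → 3 5
```

A real blueprint would also carry theory/practical splits, time allocations, and PC mappings; the point here is only that a test form is a constrained sample from the bank, not an arbitrary one.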

Acronyms
AA Assessment Agency
AC Assessment Criteria
AEBAS Aadhaar Enabled Biometric Attendance System
DGT Directorate General of Training
ITI Industrial Training Institute
LAN Local Area Network
MC Model Curriculum
MCQ Multiple Choice Question
MSDE Ministry of Skill Development & Entrepreneurship
NCVET National Council for Vocational Education and Training
NOS National Occupational Standard
NSDC National Skill Development Corporation
NSQC National Skills Qualification Committee
NSQF National Skills Qualification Framework
PC Performance Criteria
PMKVY Pradhan Mantri Kaushal Vikas Yojana
QP Qualification Pack
RFP Request for Proposal
RPL Recognition of Prior Learning
SME Subject Matter Expert
SP Special Project
SSC Sector Skill Council
SSDM State Skill Development Mission
STT Short Term Training
TC Training Center
ToA Training of Assessors
ToT Training of Trainers
TP Training Partner
TVET Technical and Vocational Education and Training


Executive Summary
The purpose of the Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs is to
address the key subject of skill assessments, which are pivotal in assuring that the outcomes envisaged in the
‘Skill India Mission’, a flagship initiative of the Government of India, are of benchmark quality, and moreover
contribute to enhancing the employability and productivity of the Indian workforce. To that end, NSDC,
under the aegis of the Ministry of Skill Development & Entrepreneurship, worked with leaders in the skill
ecosystem to frame recommendations and frameworks for assessments. There are a range of stakeholders
involved in the process of designing and conducting assessments, all of whom must work in tandem to
make assessments possible. SSCs, AAs, Training Centers, Assessment Centers, Assessors and Proctors are
some of the key stakeholders – each performing a specific role in the assessment process. Assessments have
a cycle that is linear yet iterative, proceeding from design to planning, to conducting the assessment, all the
while being underpinned by a quality assurance framework.

The Assessment Design section covers the design and development of the assessment blueprint, a unique
document attached to every QP/NOS, essential for defining the rules and structure of the assessment
for the said qualification. Blueprints indicate a mapping of the performance criteria with the items in the
assessments, such that the competency of the candidate can be effectively and meaningfully addressed.
Blueprints also help define variables such as time duration for the assessment, allocation of scores, difficulty
levels of items, suitable types of items, the role of the assessor in the assessment, target group and language
considerations etc.

The Assessment Planning section details the activities that must be completed in anticipation of
assessments. This includes Annual Assessment planning by SSCs, parameters to be used for empanelment
and performance measurement of Assessment Agencies, various digital solutions to assessments, including
proctoring considerations and assessment interfaces. The document also clearly defines three primary
modes of assessments, i.e., center-based digital assessments, center-based non-digital assessments, and
remote online assessments, and the suitability of all these modes, dependent on factors such as availability
of internet, use of domain equipment, role of assessor/proctor, etc. Finally, this section covers the orientation
or training programs required for personnel involved in order to ensure preparedness for the assessment.

The Assessment Administration section lists activities pre, during, and post an assessment, starting from
ensuring infrastructure and technical readiness on the day of assessment to finally approving results and
generating certificates. Each activity is performed by a specific stakeholder, and each step is crucial in
ensuring the assessment proceeds smoothly.

Finally, the cycle is closed with a robust Quality Assurance framework that includes monitoring of the
overall assessment process including collection of evidence under strict data security norms, meticulous
reviews of the blueprint and item banks, grievance redressal for cases of maladministration and malpractice,
and assessment reports and analytics.

Scope
In India, the skill training and assessment ecosystem has historically existed in two parallel paradigms,
consisting of long-term training programs at Industrial Training Institutes (ITIs) under the Directorate
General of Training, and short-term training programs under Sector Skill Councils (SSCs) led by National
Skill Development Corporation (NSDC). Over the years, the skill training and assessment practices of
both these skilling systems have evolved independently to bring in quality assurance and frameworks for
standardization. With the constitution of the National Council for Vocational Education and Training (NCVET)
by the Government of India in 2018, the independent functioning of these parallel skilling systems has
come under a unified regulation. The scope of this Guide pertains to the processes and protocols to be
followed in the short-term skill training and assessment ecosystem wherein the following stakeholders
perform major functions:
1. Awarding function, performed by Sector Skill Councils
2. Assessment function, performed by Assessment Agencies on behalf of Sector Skill Councils
3. Governance function, performed by NSDC over the SSCs
Henceforth, all references to skill training and assessment in the document shall be limited to the short-term
skilling system operating with the above-mentioned stakeholders.
The focus on skilled manpower by the Government of India has led to a rapid development in the need
for swift and targeted skill training. The success of a training program is adjudged through an assessment,
and the same is true for the recognition of previously acquired skills of the existing skilled labour
force. The short-term skilling ecosystem conducts standards-referenced assessments based on National
Occupational Standards (NOS) primarily falling in the following two categories:
• Short Term Training programs (STT), consisting of training followed by assessment and certification
• Recognition of Prior Learning (RPL), which refers to an assessment process used to evaluate a person’s
existing skill sets, knowledge and experience gained either by formal or informal learning
In the government’s flagship PMKVY alone, 36,31,707 and 52,04,544 candidates have been assessed till
March 2020 in STT and RPL programs respectively. Besides PMKVY (2016-2020), there is a high-volume of
assessments being conducted in programs such as schemes under State Skill Missions and Central line
ministries, apprenticeships, fee-based programs, CSR-funded programs, vocationalization of education in
schools and colleges, and more. Given the immense scale of assessment activity that has emerged to fulfil
the needs of the ecosystem, there has been a need to develop quality assurance and standard frameworks
for administering assessments. This Guide aims to interpret the assessment value chain starting from
designing to planning & conduct to review of assessments, deconstruct the roles and responsibilities of
stakeholders involved, and define frameworks and procedures to ensure quality and standardization.


Background
An assessment is a test, examination, or observation in which a sample of an examinee’s behavior in a
specified domain is obtained and subsequently evaluated and scored using a standardized process. TVET
assessments focus on gathering cumulative information on a candidate’s
knowledge, competencies, abilities, traits, dispositions, values, and other characteristics, in a specific domain,
and categorically serve the purpose of signaling employability, competence to work, and specialization.
In the short-term skill training ecosystem in India, assessments are administered in a variety of contexts,
for a variety of purposes, ranging from assessments of learning after a short training program, assessments
of apprentices who have undergone on-the-job training at workplaces, to recognition of already skilled
personnel in the ecosystem. With the high volumes of assessments being administered in short-term skill
training programs, NSDC, under the aegis of Ministry of Skill Development and Entrepreneurship, identified
a need to delve further into the quality assurance and standardization of the assessment systems in skills.
Over the years, NSDC has made a sustained effort towards strengthening assessment processes and
protocols with SSCs. Some of the initiatives include the release of Assessment Reforms in July 2018, the
issuing of the ‘Criteria for the Empanelment of Assessment Agencies by SSCs’ in August 2018, followed by a
series of consultations in January 2020, towards understanding existing assessment practices and developing
recommendations with key stakeholders involved in skill assessments. The stakeholders consulted were:
a) Assessors, for their direct involvement in administering assessments and scoring candidates;
b) Training partners/centers, for providing the venue and tools for assessment and also being impacted by
assessment results under Government schemes;
c) Assessment agencies, the empaneled bodies through which SSCs/Awarding Bodies design, facilitate,
and manage the complete assessment cycle; and
d) SSCs/Awarding Bodies, responsible for issuing certificates and ensuring quality assurance in assessments.
The consultations identified the primary roles and responsibilities of various stakeholders during the
assessment process, and delineated areas of improvement and recommendations for optimizing
operational efficiency and quality.
With a vision to strengthen assessments for short-term skill training programs, NSDC constituted a working
group of experts on assessments in April 2020 to come together and outline robust frameworks and
processes that could be instituted through a Guide on Assessments for Short Term skill training programs.
The working group, consisting of 25 representatives from SSCs, Assessment Agencies and NSDC, deliberated
on the following key intervention points concerning assessments:
a) Assessment Design and Blueprint for a Qualification Pack
b) Digital Assessments and Proctoring Solutions
c) Assessment Administration and Processes
d) Assessor Guides for the conduct of Assessments
The expertise of the working group was formalized after multiple discussions through chapters drafted on
each intervention point. Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs
is a thorough document that aims to summarize the recommendations of the working group in a lucid and
exhaustive manner, to be taken up for implementation by the concerned stakeholders in the short-term skill
development system.

Acknowledgements
In the pages to follow, the Guide undertakes the gargantuan task of framing protocols and processes
governing skill assessments within the NSDC-led ecosystem, in a landscape as vast, varied, and complex as
India. The contents of the document are a collaborative work created by leaders, experts and innovators at
the forefront of skill assessment in India, constituted into a working group led by NSDC. The chapters in this
document have emerged after several rigorous rounds of deliberations spanning three months, thorough
chapter writing and exacting reviews and re-works. This document owes its thanks to the working group
members who worked tirelessly with a determined commitment to bring forth an output of supreme
quality that would serve as a guide for Sector Skill Councils, Assessment Agencies and other stakeholders
in assessments. For their contribution to the making of this Guide, National Skill Development Corporation
acknowledges the working group members, divided into four sub-groups, as follows:

Assessment Blueprint
• Chairperson: Dr. Sandhya Chintala, IT-ITeS Sector Skills Council NASSCOM
• Anand Patil, City & Guilds
• Arindam Lahiri, Automotive Skills Development Council
• Basu Bansal, Aon’s Assessment Solutions
• Dhruv Mathur, Aon’s Assessment Solutions
• Dr Himakshi Khushwaha, Trendsetters Skill Assessors
• Shiv Kumar Pandey, Telecom Sector Skill Council
• Siddharth S, HireMee
• Tarun Girdhar, Mercer-Mettl
• Umang Kaur, IT-ITeS Sector Skills Council NASSCOM

Digital Tools and Proctoring Solutions
• Chairperson: Tarun Girdhar, Mercer-Mettl
• Amit Singh, Aon’s Assessment Solutions
• Arvind Srivastava, SP Institute of Workforce Development
• Hifaz Ashroff, City & Guilds
• Meenu Sarawgi, Automotive Skills Development Council
• Mrinal Kumar, Navriti Technologies
• Sameer Naraspur, Retailers Association’s Skill Council of India
• Dr. Sandhya Chintala, IT-ITeS Sector Skills Council NASSCOM
• Varun Nagpal, SHL India
• Vasundhara Singh, Mercer-Mettl

Assessment Administration
• Chairperson: Arindam Lahiri, Automotive Skills Development Council
• Ajit Padhi, IT-ITeS Sector Skills Council NASSCOM
• Anand Kumar Singh, Construction Skill Development Council of India
• Anand Patil, City & Guilds
• Arun Ujjwal, Tourism & Hospitality Skill Council
• Arvind Srivastava, SP Institute of Workforce Development
• Ashish Srivastava, Apparel, Made-Ups & Home Furnishing Sector Skill Council
• Mrinal Kumar, Navriti Technologies
• Vasundhara Singh, Mercer-Mettl

Assessor Guide
• Chairperson: Mrinal Kumar, Navriti Technologies
• Anand Kumar Singh, Construction Skill Development Council of India
• Ashish Srivastava, Apparel, Made-Ups & Home Furnishing Sector Skill Council
• Hifaz Ashroff, City & Guilds
• Dr Himakshi Khushwaha, Trendsetters Skill Assessors
• Lokesh Mehra, IT-ITeS Sector Skills Council NASSCOM
• Sameer Naraspur, Retailers Association’s Skill Council of India
• Shiv Kumar Pandey, Telecom Sector Skill Council
• Varun Nagpal, SHL India

The representatives from NSDC involved in the consultations of the working group and drafting the Guide were:

Bhumika Malhotra
Deepti Saxena
Mehr Pasricha
Rekha Menon
Dr. Sabeena Mathayas


Key Stakeholders
Through the assessment process, various stakeholders perform different functions to ensure that an assessment
proceeds smoothly. The following section defines and details the role of each stakeholder specifically with respect to
assessments in the short-term skill system.

Sector Skill Council

Sector Skill Councils are industry-led autonomous awarding bodies responsible for assessment and
certification on Qualification Packs for short-term training programs. SSCs empanel/select Assessment
Agencies, administer assessments on-ground and are responsible for ensuring quality in all aspects of
the assessment process, including design and operations.

Responsibilities:
1. Creation of Qualification Pack and Assessment Criteria: SSCs are responsible for creating qualifications and
strengthening them through various levels of review, and clearly defining assessment criteria across theory,
practical and other components, with Occupational Standards weighted appropriately for the assessment.
2. Creation of Assessment Blueprint: SSCs are responsible for creating Assessment Blueprint for each
Qualification Pack to ensure standardization of the assessment. They may do so in conjunction with Assessment
Agencies or Subject Matter Experts. The blueprint must be validated by two independent SMEs.
3. Ensuring Quality Question Bank: SSCs hold the responsibility of ensuring quality assessments by creating
and/or validating questions used in assessments. Question sets should be aligned to the Assessment Blueprint.
SSCs may create item banks through in-house expertise or align experts/agencies for the task. A sample question
set or question paper should be placed in the public domain for the reference of the candidates.
4. Empanelment and Performance Evaluation of AAs: AAs shall be empaneled by NCVET and can be further
selected by SSCs on standardized norms. SSCs conduct continuous performance evaluations of AAs.
5. Training of Assessor Programs: SSCs certify assessors to the requirement of the assessment process, thereby
making them eligible to conduct the assessments in the skill ecosystem.
6. Coordination: SSCs have visibility on upcoming assessments and align assessment agencies for administration
of assessment in a timely manner. SSCs also coordinate with the AA and TP/TC to guide and assist the
stakeholders, if required.
7. Assignment of Assessment Center: SSCs assign Assessment Centers for the assessment depending on
geographical and domain coverage, in alignment with the scheme requirements, wherever applicable.
8. Result Approval: SSCs validate the scores submitted by the Assessment Agency and publish results in the
required formats.
9. Certification: SSCs, in their capacity as the Awarding Body of the qualification, give the final sign off on the
results and generate certificates for successful candidates.
10. Monitoring & Compliance: SSCs are accountable for regulating the assessment process by defining and
following a monitoring framework. They must also ensure the requisite scheme compliances are met during
the assessment process.
11. Grievance Redressal: SSCs are responsible for addressing grievances and taking swift action in case an issue
is detected in the assessment process.
12. Review of Question Set & Blueprint: SSCs are responsible for conducting periodic item analysis on assessment
data collected by AAs. Once a review is conducted, corrective action must be taken to ensure the validity,
reliability and fairness of the assessment.
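The periodic item analysis mentioned in point 12 typically rests on two basic per-item statistics: a difficulty index (the proportion of candidates answering the item correctly) and a discrimination index (how well the item separates high- and low-scoring candidates). A minimal sketch with made-up response data follows; the upper/lower-third grouping rule and the data are illustrative assumptions, not prescriptions from this Guide:

```python
def item_analysis(responses):
    """responses: list of lists, responses[c][i] = 1 if candidate c
    answered item i correctly, else 0. Returns per-item stats."""
    n_candidates = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    ranked = sorted(range(n_candidates), key=lambda c: totals[c])
    k = max(1, n_candidates // 3)          # upper/lower third groups
    low, high = ranked[:k], ranked[-k:]
    stats = []
    for i in range(n_items):
        # Difficulty index: proportion of all candidates answering correctly
        p = sum(responses[c][i] for c in range(n_candidates)) / n_candidates
        # Discrimination index: high-group success minus low-group success
        d = (sum(responses[c][i] for c in high)
             - sum(responses[c][i] for c in low)) / k
        stats.append({"difficulty": round(p, 2), "discrimination": round(d, 2)})
    return stats

# Hypothetical responses: 6 candidates x 2 items
data = [[1, 1], [1, 1], [1, 0], [1, 0], [0, 0], [1, 0]]
print(item_analysis(data))
```

Items with a very high or very low difficulty index, or a low/negative discrimination index, are candidates for review or retirement from the item bank, which is the corrective action the blueprint and item-bank review steps above call for.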

Assessment Agency
Assessment Agencies are bodies that facilitate the entire assessment process for SSCs. Assessment
Agencies often design and detail assessments in conjunction with Awarding Bodies, use their
software and technology in delivery of assessments, empanel assessors, and administer assessments
through assessors and proctors.

Responsibilities:
1. Assessment Blueprint and Item Bank: AAs are responsible for ensuring that the question sets and assessment
process are aligned to the Assessment Blueprint of the QP and vetted by the SSC. The item bank must be
regularly updated, and items not performing as expected should be retired.
2. Trained Personnel: AAs must ensure that the personnel (assessor, proctor) meet the eligibility criteria specified
by the Awarding Bodies and are trained/ oriented and certified as per the requirements of the assessments.
3. Coordination: AAs are expected to coordinate with the TP, SSC and Assessment Center in a timely manner,
and address communication points such as the assessment schedule (Date, Time and Place), IT infrastructure,
requirement for equipment and consumables, language preferences, documentation requirements and
details of assessment procedure and personnel, if required.
4. Candidate verification: It is the responsibility of the AAs to verify the identity of the candidate before
permitting them to appear for assessment. In some cases, the AA may need to check the eligibility of candidates
through biometric attendance or other means, as per the direction in the scheme guidelines.
5. Candidate orientation: AAs must orient the candidates with the process to be followed for assessment,
including the functionalities of assessment interface and requirements from the candidate. They may do so
through the assessor/proctor/demo-login/tutorial or other methods.
6. Assessment Administration: AAs are responsible for overseeing the complete assessment process and
ensuring assessment is facilitated smoothly.
7. Documentation & Compliances: AAs are responsible for ensuring all documentation and compliance
requirements are met on the day of assessment.
8. Proctoring & Invigilation: AAs must depute proctoring solutions (human proctor/AI-enabled proctor/remote
proctor) to ensure that cheating and malpractice of any sort is deterred.
9. Evidence Gathering & Audit: AAs must collect and store evidence as per requirement, for a minimum duration of 5 years, or as mandated by the scheme. The AA should also maintain a detailed audit log of each assessment administered on their technology platform, including time logs, response logs and image captures. The evidence and audit log should be maintained and remain retrievable for scrutiny purposes. It is recommended that physical documents also be stored in soft copy.
10. Upload of Scores: It is the responsibility of the AAs to upload scores on the Skill India Portal in the specified
format.
11. 360-degree feedback collection: AAs should enable a mechanism to capture feedback of candidates and
training partner digitally after the completion of assessments.
12. Score Analysis: AAs are responsible for compiling, preparing and sharing score analyses with SSCs, and
incorporating any corrections/feedback in the item bank.
13. Reports & Analyses: AAs are expected to generate assessment reports and analyses indicating batch
performance statistics, item performance, etc.

Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs

Assessor
Assessors are stakeholders who are directly involved in conducting and scoring assessments. They
interface with trainees, training centers and assessment agencies. They also undergo a ToA program
conducted by the SSC that aligns them with the assessment requirements in the short-term skill
ecosystem.

Responsibilities:
1. Candidate Orientation: Assessors must orient candidates with the process to be followed for the assessment,
including the functionalities of the assessment interface and requirements from the candidate.
2. Availability of Domain Infrastructure, Tools, Equipment and Consumables: Ensuring that the domain
requirements are available in sufficient quantity for the conduct of the assessment.
3. Assessment Conduct and Scoring: Assessors are responsible for assessing the competency of each candidate
on the assessment. Assessors may do this through conducting viva voce, assessing practical competency or
evaluating theory papers, by following the instructions present in the Assessor Guide, limiting themselves to
the designated question set. Assessors are required for all assessments where evaluation is not undertaken
by system intelligence.
4. Invigilation: Assessors must uphold the integrity of the assessment by ensuring that no malpractice takes
place during assessment.
Note:
1. Assessments with no accompanying Proctor: In assessments where there is no accompanying proctor on the
day of assessment, assessors are also expected to perform the following functions:
• Candidate verification: Original government photo ID of candidates must be checked for identity verification before starting the assessment to ensure participation of only genuine candidates. In some cases, the assessor may need to check the eligibility of candidates through biometric attendance or other means, as per the direction in the scheme guidelines.
• Attendance capture: Attendance of all candidates appearing for assessment is required to be marked correctly on the Assessor/Proctor App, unless indicated otherwise in the scheme guidelines.
• Geo-tagging: To meet compliance requirements, the location of the assessment must be geo-tagged in the Assessor/Proctor Application.
• Documentation: Documentation should be completed as per the requirement of the scheme/SSC.
• Evidence collection: Photos/videos or any other media should be captured as evidence prior to assessment (infrastructure check, documentation) and during the progression of the assessment (videos of candidates), as per the requirements of the scheme or SSC.

Proctor
Proctors are technical support executives who participate in the assessment by ensuring readiness
and assistance on technology and technology-enabled infrastructure requirements, fulfilling
documentation requirements, invigilating the assessment and alerting authorities in case of any
anomalies.

Responsibilities:
1. Technical readiness: Proctors are responsible for ensuring readiness of technology systems for assessments:
• If the assessment is being administered on computer systems, ensuring that the computer systems are compatible and configured for the assessment platform
• If the assessment is being administered on tablets, ensuring that the tablets are fully charged and configured for the assessment platform
2. Technical assistance: Proctors are responsible for debugging and troubleshooting any technical issues or
queries arising in the assessment platform during assessment.
3. Candidate verification: Proctors must check original government photo ID of candidates for identity
verification before starting the assessment to ensure participation of only genuine candidates. In some cases,
the proctor may need to check the eligibility of candidates through biometric attendance or other means, as
per the direction in the scheme guidelines.
4. Invigilation: Proctors are responsible for invigilating assessments and ensuring that there is no malpractice/
cheating during assessment.
5. Attendance capture: Proctors are required to mark the attendance of all candidates appearing in the
assessment correctly on the Assessor/Proctor App, unless indicated otherwise in the scheme guidelines.
6. Geo-tagging: Proctors should geo-tag the location of the assessment on the Assessor/Proctor App as a
compliance norm.
7. Documentation: Proctors are required to complete documentation and compliance requirements as per the
scheme/ SSC.
8. Evidence collection: Proctors should capture photographs, videos or any other media as evidence, as
per requirement. This could include images of infrastructure availability, video recording or practical skills
assessment, audio recording of viva, intermittent images during theory examination, etc.
Note:
1. Assessments with no assessor: In assessments wherein there is no assessor, the proctor should brief
candidates about the assessment procedure and share the instructions familiarizing the candidates with the
assessment platform.
2. In Assessments where there is a remote assessor: In such cases, proctors are expected to video-record the
assessment as per directions given by the AA and/or enable the Assessor to interact with candidates through
a video call.
3. Remote online assessments: In remote online assessments, proctors must perform their functions remotely,
including candidate verification and attendance, troubleshooting, invigilation with AI-enabled support, etc.,
during the assessment.


Training Partner/ Training Centre


Training Partners are bodies that conduct skill trainings in various sectors and job roles across the
country through subsidiary Training Centers. Trainees at the training center undergo assessments
by the SSC at the close of training in order to earn a certificate.

Responsibilities:
1. Coordination: The TC needs to coordinate with and respond to the AA and SSC in a timely manner to confirm
the schedule of assessment, language preferences, assessment center location or any other requirement. The
TC also needs to coordinate with the Assessment Center to clarify expectations and formalities for the day of
assessments, if the assessments are not conducted at the training center location.
2. Ensure availability of genuine candidates: The TC should ensure that genuine candidates are present on
the day of assessment, carrying original government photo IDs.
3. Attendance: The TC should ensure that either of the following is made available to the AA 5-7 days prior to assessment, to enable the AA to make necessary preparations:
• Biometric attendance logs
• Manual attendance
4. Candidate orientation: The TC should ensure that all candidates participate in the orientation program conducted by the AA.
5. Documentation: The TC is required to complete any documentation and compliance requirements for
assessment.

Note:
1. A training center may also serve as an assessment center for batches trained at their center, or for batches
from neighboring TCs. In such circumstances, a TC must ensure that it performs the role of an Assessment
Center during the assessment (covered in the following page).
2. In case of online remote assessments, it is the TC’s responsibility to ensure that candidates have access to the
required infrastructure:
• Minimum required internet bandwidth (2-4 Mbps), or internet bandwidth as indicated by the AA
• A working and sufficiently charged computer/laptop/tablet/mobile phone with a webcam or front camera
• A device that is compatible and meets the browser requirements needed for the assessment

Assessment Center
An Assessment Center is a location fully equipped with domain infrastructure, tools and
consumables, where assessment of candidates (STT, RPL, walk-ins) may be undertaken. Assessment
Centers can be independent facilities, a candidate’s training location, or any other location fit for
the conduct of assessment in the specified domain.

Responsibilities:
1. Assessor and/or proctor verification: The Assessment Center should check original government photo ID
and verify the identity of the deputed assessor(s) and/or proctor(s).
2. Coordination: The Assessment Center needs to coordinate and respond in a timely manner to the AA and
SSC to confirm schedule of assessment, availability of infrastructure, equipment and consumables, language
preferences and any other requirement. The Assessment Center also needs to coordinate with the TC to ensure candidates are present for assessments.
3. Infrastructure Requirement at Assessment Center:
• Minimum required internet bandwidth (2-4 Mbps), or internet bandwidth as indicated by the AA
• Sufficient computer systems/laptops/tablets/mobile phones with webcams/front cameras to cater to the assessment of the candidates in a batch
• Compatible and up-to-date browsers, as per the requirement shared by the AA, on the devices to be used for assessment
• Camera or appropriate equipment to capture a live feed of the assessment
• A cheat-proof environment for the assessment
4. Domain Infrastructure, Equipment & Consumables: The Assessment Center should ensure that the domain
infrastructure, tools, equipment and consumables are available as per the assessment requirements for the
sector and job role.
5. Documentation: The Assessment Center is required to complete the documentation and compliance
requirements for assessment.
6. Proctoring: The Assessment Center needs to ensure staff is available on ground for technical support and
invigilation on the day of assessment.


Other Stakeholders
I. Assessment Designers are Subject Matter Experts (SMEs) who are responsible for ensuring the fairness, reliability, content validity and relevance of an assessment by creating items (theory/practical/viva voce) that measure specific skills/NOS and assess comprehension of concepts at different cognitive levels. They align the items to the Assessment Blueprint, follow standard assessment methodologies, and utilize the expertise of assessment reviewers, language experts and translators for item review and correction. The key responsibilities of an Assessment Designer are:
• Adhere to item-writing principles and techniques when developing test items
• Recruit SMEs, moderators, instructional designers, language experts, and translators for review
• Conduct training and workshops for SMEs, moderators, IDs, language experts, and translators on the relevant processes and principles
• Prepare the test strategy as per the Assessment Blueprint, taking into account target groups
• Develop and maintain different assessment materials
• Perform editing and proofreading of assessments
• Conduct editing meetings with other designers, SSC members, etc.
• Perform post-moderation checks
• Perform the entire process within the agreed budget, timeline and quality standards
• Understand the applicability of different types of items, such as simulations, gamification, AR/VR-based questions, etc.
II. Assessment Reviewers are Subject Matter Experts (SMEs) who have distinct skills or specialized knowledge and expertise on a specific job or topic. These SMEs are engaged by instructional designers for designing course material and learning programs, and by assessment designers for developing assessments. Assessment reviewers could be highly qualified instructional designers or industry experts. In the assessment process, the role of the reviewer is to validate the blueprint and the item bank developed by the assessment designer. Sometimes they also partake in item writing or serve as technical consultants. An SME is often required to sign off on the assessment developed by evaluating the following:
• The structure and suitability of the assessment blueprint developed for the specified job role
• The relevance, suitability and clarity of the items prepared and their alignment to the assessment blueprint
• The appropriateness of the language used in the items to assess competence vis-à-vis the target group
• The appropriateness of the difficulty categorization of the questions
• The sufficiency of the items to adequately assess and judge competence in the QP/NOS
• The alignment of the items with the NSQF level of the job role
The table below details the criteria that may be used by SSCs and AAs to empanel SMEs as Assessment Designers and Assessment Reviewers:

Assessment Designer (SME)
• Experience: 5 years in the relevant job role / a higher job role
• Criteria: Must not be associated with any Training Partner (conflict of interest); must possess good writing skills
• Other Criteria: Must meet the minimum eligibility criteria suggested for an assessor for the domain job role
• Responsibility (Ensure):
  - Language & terminologies of the questions are as per the target group
  - Items are aligned to the blueprint
  - The difficulty level tagging of the questions is correct
  - The Bloom's taxonomy levels are followed
  - Questions are mapped to the appropriate NOS & PC
  - Questions are unambiguous and understandable
  - Questions are technically correct
  - Answers contain appropriate distractors
  - Terminology is used consistently
  - Equipment list, steps and suggested solutions of practical and viva voce questions are mentioned
  - Content is grammatically correct, without spelling or punctuation errors
  - Content is not plagiarized
  - Specific assessment guidelines shared by the respective SSC are adhered to

Assessment Reviewer (SME)
• Experience: 7 years in the relevant job role / a higher job role
• Criteria: Must not be associated with any Training Partner (conflict of interest); must possess good writing skills
• Other Criteria: Must meet the minimum eligibility criteria suggested for a lead assessor for the domain job role
• Responsibility (Ensure and verify):
  - Language & terminologies of the questions are as per the target group
  - Items and distractors are unambiguous
  - Questions are technically correct and relevant for the job role
  - Answers contain appropriate distractors
  - Correct answers, equipment lists, steps and suggested solutions are mentioned
  - Specific assessment guidelines shared by the respective SSC are adhered to

Table 1: Parameters for selection of SMEs with AAs/SSCs

III. Translators are persons who decipher items in one language and render them in another such that the intent and sanctity of the test item remain intact. Translators are required for the delivery of assessments in the language of the target group. Once a test form has been translated, it should be reviewed by local experts for decipherability. The translator's task is to:
• Work with assessment coordinators to create translated copies of given test papers
• Understand the content and its context in one language and convert it into the second language
• Ensure the translated content conveys the original meaning
• Absorb and act upon the feedback given by assessment coordinators
• Edit and proofread the created content to maintain quality
• Review translation work for the given subject
• Work with regional fonts to present digital copies of the translation


Ownership Matrix of Assessment Activities

Stage | Activity | Ownership (Sector Skill Council / Assessment Agency / Assessment Center / Training Center)
Pre-Design | Developing QPs/ NOS and PCs | ✓
Design | Creation of Assessment Blueprint | ✓
Design | Development & Validation of Item Bank | ✓ ✓
Plan | Selection & Performance Evaluation of AAs | ✓
Plan | Empaneling Assessors/ Proctors | ✓
Plan | Training & Certification of Assessors | ✓
Plan | Training of Proctors | ✓
Execution | Coordination | ✓ ✓ ✓ ✓
Execution | Availability of Domain Infrastructure, Equipment & Tools | ✓
Execution | Providing Assessment Platform | ✓
Execution | Ensuring genuine candidates | ✓
Execution | Candidate Verification | ✓
Execution | Candidate Orientation | ✓
Execution | Execution of the Assessment Process | ✓ ✓
Execution | Invigilation | ✓ ✓
Execution | Documentation and Compliance | ✓ ✓
Execution | Evidence Collection | ✓
Post-Assessment | Feedback Collection | ✓
Post-Assessment | Result Upload | ✓
Post-Assessment | Result Approval | ✓
Post-Assessment | Certification | ✓
Quality Assurance | Monitoring of the Assessment Process | ✓
Quality Assurance | Item Review & Analysis | ✓
Quality Assurance | Grievance Redressal | ✓ ✓
Table 2: Ownership Matrix of Assessment Activities

Assessment Design
Assessments are a performance technique used to evaluate the knowledge and skills acquired, including the competencies exhibited by an applicant, as a measure of the learning and experience imbibed. To capture the relationship between knowledge, skills, competencies and performance outcomes, it is essential to articulate a framework built on assessment design principles, applied through appropriate tools and technological platforms for the conduct of assessments. These principles are Reliability, Validity, Fairness and Flexibility.
Reliability: The principle of reliability refers to the degree to which there is a consistency of the assessment scores for
a given candidate by different scorers in different environments.
Validity: The principle of validity is the degree to which the assessment measures what it purports to measure.
Fairness: The principle of fairness refers to the degree to which the assessment does not differentiate between different
socio-political-economic groups, based on race, gender, caste, religion, etc.
Flexibility: The principle of flexibility refers to the extent to which an assessment allows for flexibility in delivery and method, without compromising on the reliability, validity and fairness of the assessment.
The foremost objective of this section is to articulate how the design of skill assessments conducted by Sector Skill Councils in the short-term skill ecosystem can be strengthened. It is envisaged that, for each qualification to be assessed, systematization is brought about through an Assessment Blueprint document that covers all modes of assessment, whether digital or non-digital, and all project types, such as STT, RPL, fee-based, government-funded and CSR-funded projects, Trainer & Assessor programs, programs in schools and colleges, etc. This section aims to cover the complete gamut of considerations that should go into designing an assessment.

Context of the Assessments

Assessments can be administered in different contexts in the ecosystem. Some of the types of assessments often conducted in the skill space are:
a. Diagnostic Assessment: Diagnostic assessments, also known as pre-assessments, typically precede the actual learning program. They are used to identify a learner's strengths and weaknesses, prior knowledge and skill levels, profile learner interests and reveal learning style preferences, with a view to providing an appropriate learning program. Typically, a diagnostic captures a range of input data points:
• Contextual information, such as the current profile of the user and key learning outcomes for the user
• Knowledge, skills and abilities on the subject, level of hands-on experience, and whether the user has the ability to take up a particular learning journey
It is suggested that a profiling assessment be included in the process wherever feasible. Candidates should undergo a psychometric/aptitude/profiling test that helps them identify suitable career choices based on their aptitude and interest.
b. Formative Assessments: Formative assessments are conducted throughout a unit or course of study to monitor student progress so that teachers can adjust their instructional practices to meet the needs of their students. Going forward, it is recommended that formative assessments be integrated into trainings. The SSC may define the stage(s) at which these assessments should be conducted and assign a defined weightage to their outcomes.
c. Benchmark Assessments: Evaluations of student learning progress used to determine whether the students are
performing as expected at a certain point in time.
d. Summative Assessments: Formal assessments used to measure what students have learned at the end of a
defined period of instruction.
e. Learning Agility Assessment: Learning Agility Assessments are those that determine the ability of a candidate to learn new skills and knowledge. All individuals have different learning curves. Two essential aspects of Learning Agility Assessments are the swiftness and accuracy the candidate shows in learning new things, and the candidate's behavioral inclination towards learning.


Assessment Criteria

All QPs/NOS go through a rigorous review and approval process before they are ready to be utilized for skill training and assessment. An important part of creating a QP is assigning assessment criteria to the various performance outcomes expected of the candidate. It is important that the assessment criteria are thoroughly examined and approved by independent industry experts, academic SMEs and regulatory bodies during QP/NOS creation and approval.
Under the Assessment Criteria in a QP/NOS, PCs are allotted against the following four methodologies:
a. Theory: A theory assessment is a written/digital question paper that aims at assessing the knowledge of the
candidate
b. Practical: A practical assessment assesses the practical application/hands-on ability demonstrated by the
candidate
c. Viva: A viva voce is an oral questioning method where the assessor and candidate usually sit one-on-one. In addition to evaluating theoretical knowledge, this method also allows for assessment of soft skills, body language, etc.
d. Project: A project could include a variety of methods to assess the candidate such as an evaluation of the candidate’s
portfolio, inclusion of formative assessment marks, evaluations from OJT logs, or any other initiatives undertaken

Assessment Blueprint

The ‘Assessment Blueprint’ is a detailed outline of the assessment plan of action. As a document, it aims to enable stakeholders implementing skill assessments (Sector Skill Councils, Assessment Agencies and other assessment regulation bodies) to define the complex relationship between performance outcomes/assessment criteria, theory and practical items, difficulty levels, the time and marks allocated to each question, the assessment methodology and the evaluation thereof. In the education and skilling ecosystems, blueprints are used to design assessments that measure mastery of the standard(s), improve consistency across test forms, set goals and monitoring matrices for test forms, and more. Figure 1 below describes four stages for developing an effective test blueprint.

Blueprint Stage | Key Activities and Questions to Answer

1. Identify major knowledge and skill domains
• Scan relevant materials (e.g. QP, Model Curriculum, Courseware etc.) for specific competencies
• Decide on a framework for organizing test content:
  - Is it a traditional content outline?
  - A list of procedural skills?
  - A content-by-process matrix?

2. Delineate the assessment objectives
• Assessment objectives can include behavioural objectives, competency or skill domains
• Document the specific learning outcomes and behaviours (PCs) to be assessed
• Determine the level of specificity desired in the blueprint (difficulty level, time, question types, etc.)

3. Decide on the assessment format
• Determine the optimal assessment methodology for the knowledge and skills to be assessed
• Practical considerations include:
  - Location of the assessment objective within Bloom's Taxonomy and Miller's Pyramid
  - Reliability of scores produced by a method
  - Validity of the score interpretations (e.g. can an MCQ assess communication skills?)
  - Practical constraints (e.g. testing time, budget, logistics)

4. Specify the category weights
• Determine how many assessment tasks (e.g. MCQs, fill-in-the-blanks, case studies etc.) candidates can complete in the allotted time
• Assign weights to each major category or domain in the test blueprint according to its overall importance

Figure 1: Summary of Key Activities for developing a Blueprint

A blueprint contains pivotal points on how an assessment should be structured. The table that follows indicates the
key elements of a blueprint.
Item Type | Description
QP/NOS Description | Name of QP, QP code, revised version no. and revision date of the QP (the blueprint should be revised immediately after the QP is revised)
NSQF Level | As mentioned in the QP
Time | Total duration of assessment; time allocated for each question; time allocated for each candidate to perform the assessment
Number of Items | Total items in a test; no. of items per NOS; no. of items mapped to a Performance Criterion
Difficulty Level | Overall difficulty level of the assessment; NOS-specific difficulty level; item-specific difficulty level
Scaled/Unscaled Marks | Marks per item; scaling factor/extrapolation
Type of Items | It is recommended that SSCs indicate the preferred types of items to be used in an assessment
Language Options | Languages into which the question bank can be translated, including the default
Passing Criteria | Detailed as per SSC (moderation as defined by the SSC)
Technical/Practical Environment/Tools Needed | Specific details to be listed, if available
Specific Guidelines for the Assessment Agencies | Addressed in the cover page; analytics for blueprint revision in an accompanying deck approved by the SSC
Details on Types of Questions Used in the Blueprint | Mentioned in the blueprint summary page; a section of the blueprint covers this requirement
Details on Difficulty Level Description | Mentioned in the blueprint summary page; a section of the blueprint covers this requirement
Table 3: Key Elements of a Blueprint
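The key blueprint elements above can be pictured as a simple structured record. The sketch below is illustrative only: the field names and the scaled_marks helper are assumptions made for this guide, not a prescribed NSDC/SSC schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BlueprintSummary:
    """Illustrative record of the key blueprint elements (Table 3)."""
    qp_name: str
    qp_code: str
    nsqf_level: int
    total_items: int
    duration_minutes: int
    difficulty_ratio: str          # e.g. "40:40:20" for D1:D2:D3
    scaling_factor: float          # multiplier applied to raw marks
    languages: List[str] = field(default_factory=list)
    passing_criteria: str = ""

    def scaled_marks(self, raw_marks: float) -> float:
        """Apply the blueprint's scaling factor to a raw mark total."""
        return raw_marks * self.scaling_factor

# Example with a fictitious QP, purely for illustration
bp = BlueprintSummary(
    qp_name="Sample Job Role", qp_code="XYZ/Q0101", nsqf_level=4,
    total_items=40, duration_minutes=120, difficulty_ratio="40:40:20",
    scaling_factor=2.5, languages=["English", "Hindi"],
    passing_criteria="70% aggregate",
)
print(bp.scaled_marks(40))  # 40 raw marks scaled to 100.0
```

Keeping these elements in one record makes it straightforward to check a question bank against the blueprint programmatically.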


Blueprint Summary Page

It is recommended that the blueprint begin with a summary page that sets the foundation of the assessment
requirements for the qualification. A sample summary with a detailed explanation is below.

Figure 2: Sample Summary Page of an Assessment Blueprint (labels 1-10 are explained in the table below)

Label Description
1 Name of the QP and the QP number

2 Suggested Duration—this is the sum of the durations of all questions included in the blueprint. It is recommended that, for the actual exam, the duration be rounded up to the nearest 10 min
3 Suggested number of questions—sum of all the questions in the blueprint

4 Difficulty Level: D1:D2:D3 (the ratio in which easy, moderate and difficult questions are given in the assessment). The SSC can decide whether to tag the difficulty level based on the number of questions or on marks distribution; both methods are advisable
Overall—this is an average of difficulty levels across NOS (Domain and Generic combined)
Domain—average difficulty level across domain NOS. It is recommended that this be standardized across NSQF Levels (D1:D2:D3)
Generic—average difficulty level across generic NOS (D1:D2:D3)

5 Minimum number of questions per NOS

6 NOS Level Details—the advantage of mentioning NOS-level details is ease of comparison and ensuring that the standards are met
7 Acceptable level of Variance—this is to provide flexibility to the AA to develop a question-bank. The range
can be anywhere between 5-10% depending on the SSC’s requirement
8 Pass Criteria—detailed criteria in terms of aggregate to be mentioned here

9 Language Choice—the languages into which the question bank can be translated, and the time taken by the AA to translate the question bank, are mentioned here; the default language, if required, is also mentioned
10 Details of specific equipment/software tools/technical environment needed for assessment

Table 4: Description of Sample Cover of Assessment Blueprint
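Two computations implied by the summary page — splitting the question count in the D1:D2:D3 ratio (label 4) and rounding the total duration up to the nearest 10 minutes (label 2) — can be sketched as follows. The function names and example numbers are illustrative assumptions, not an official formula.

```python
import math

def split_by_ratio(total_questions: int, ratio: str) -> list:
    """Split a question count in a 'D1:D2:D3' ratio, giving any
    rounding remainder to the last (hardest) band."""
    parts = [int(p) for p in ratio.split(":")]
    counts = [total_questions * p // sum(parts) for p in parts]
    counts[-1] += total_questions - sum(counts)  # absorb integer-division remainder
    return counts

def round_up_duration(total_minutes: int) -> int:
    """Round the summed exam duration up to the nearest 10 minutes,
    as the summary page recommends."""
    return math.ceil(total_minutes / 10) * 10

print(split_by_ratio(40, "40:40:20"))  # [16, 16, 8]
print(round_up_duration(113))          # 120
```

A blueprint author could run such a check to confirm that the per-difficulty question counts and the rounded duration on the summary page are internally consistent.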


Blueprint Detailing

After the summary comes the detailing of the blueprint. The detailed blueprint spreadsheet contains the complete
mapping of performance criteria over every NOS against items, difficulty levels, marks allocated, and more. A good
blueprint should ensure that each PC in a NOS is adequately covered by item(s). A sample of the detailed Blueprint
spreadsheet is below, substantiated with a comprehensive table explaining each aspect.

Figure 3: Detailed Blueprint Spreadsheet

Label Description
1 NOS – the NOS number is mentioned
2 Performance Criteria – PC description, taken directly from the QP and outcomes are qualified and quantified
3 Marks Allocation against each PC – divided into theory, practical or any other methodology in line with the QP
4 Suggested Question Allocation – these columns cover the number of questions recommended against each assessment criterion (or clubbed criteria) under different difficulty levels. For example, in the illustration, the first 6 PCs are clubbed together to form 2 questions (one each for theory and practical)
5 Reference to Core Key Understanding and Generic Skills areas
6 Number of Questions for Theory, Practical, any other and total segregated into respective difficulty levels
7 Time – under this column, the suggested time for answering each question is mentioned. This helps in aggregating
the overall time
8 Type of Questions – each PC can be best mapped to a certain kind of question; however, the question setter must be given the flexibility to suggest better questions. (Here, DC, ScB and COM are codes for question types.) The selection of question types should be at the discretion of each SSC. It is recommended that the blueprint allow for flexibility in this regard, so that the types of questions best suited to the sector, the related occupation and the target group are selected
Table 5: Description of Blueprint Spreadsheet

Blueprint Conclusion

To close a blueprint, a short conclusion is recommended that offers a brief, high-level summary of the document. A sample Blueprint Conclusion is depicted below, substantiated by an explanation of each aspect.


Figure 4: Blueprint Conclusion

Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs

Label Description

NOS Wise Question Distribution


1 Number of Theory Questions in each NOS

2 Number of Practical Questions in each NOS


3 Question count of each type in individual NOS

Difficulty Level Wise Distribution


4 Marks per Item – Marks for each D1, D2 and D3 question

5 Total number of Questions for each difficulty level


6 Total Marks – Total no. of Questions * Marks for each question
7 Weightage – Percentage marks distribution for each difficulty level
8 Scaled Marks – Marks calculated after multiplying with the scaling factor
9 Minutes – Time per question for each difficulty level
10 Total Time in Mins – Total No. of Questions * Time for each question

Table 6: Description of Blueprint Conclusion

Correlating Assessment Criteria with Bloom’s Taxonomy

Bloom’s Taxonomy is a classification of the different learning objectives and skills that educators set for their students.
This taxonomy is hierarchical, which means that learning at the higher levels is dependent on having attained the
knowledge and skills at lower levels of the taxonomy. The taxonomy has a strong correlation to the NSQF benchmarking
of a QP/NOS and to the difficulty matrices in an assessment.
Bloom’s Taxonomy is a critical tool to be used by those developing the assessment blueprint and designing assessment
items, and at present, details six levels of learning that can be used to structure the learning objectives, lessons and
assessments of the qualification. A diagram depicting this is below, moving from bottom to top in order of complexity
of learning outcomes:

n Before understanding a concept, one must remember it
n To apply a concept, one must first understand it
n To analyze a concept, one must know how to apply it
n To evaluate a process, one must have analyzed it
n To create an accurate conclusion, one must have
completed a thorough evaluation
It is also advisable to consider the instructional material of the associated training program when designing the
assessment and selecting the appropriate assessment methodology.

Figure 5: Bloom’s Taxonomy

Mapping of Performance Criteria with Items

The National Occupational Standards (NOS) specify the standard of performance an individual must achieve when
carrying out a function in the workplace, together with the knowledge and understanding they need to meet a standard
consistently. Each NOS defines one key function in a job role, listed into competencies called Performance Criteria
(PC). In the context of assessments, PCs are the expression of what is to be measured and why. A good blueprint should
ensure that each PC in a NOS is adequately covered by item(s). Below are a few ways in which the mapping of NOS/PC
to items can be carried out:
a) Case 1: A PC is tested using a single item

1 Question : 1 PC

b) Case 2: Appropriate PCs are clubbed together and assessed using a single item

Multiple PCs with common elements : 1 Question

The grouping of PCs with common elements into any one question may be carried out because:
n There are many PCs in the NOS that are correlated and can be collectively demonstrated.
n The time duration of the assessment can be appropriately curtailed.
n Distribution of marks with respect to PCs in the NOS/ QP makes it unsuitable for a 1:1 mapping.

Important considerations while grouping PCs:


n Each grouping should have PCs conforming to a common underlying element.
n Clubbed PCs should flow into each other. For example, in a group of PCs where two pertain to the ‘use of agricultural
instruments’ to sow seeds, the third PC should not be about the ‘importance of seeds.’ Instead, the third PC could
pertain to ‘types of seeds,’ since the type of seed being used can determine the agricultural instrument that must be chosen.
n The PCs should, as much as possible, belong to the same NOS.
n There should be no cross-combination of marks i.e. practical marks should be combined with practical marks of
another PC and similarly for theory marks.
c) Case 3: More than one question is used to test one PC

1 important PC : more than 1 Question

This scenario arises when a single PC has a high marks weightage assigned to it, because it may have a wide scope or
be crucial in assessing competence for the job role. Such a PC could also be assessed using multiple sub-questions.
For cases wherein NOS have a large number of PCs and it is not possible to tag all the PCs to items, it is recommended
that:

n All important and key PCs should be covered. The count of PCs can be decided while preparing the Question
Matrix with respect to the Blueprint.
n It is recommended that no more than 3-4 PCs be tagged to an item in the stipulated structure. Sub-items can be
created where there is a need to assess a large number of PCs together.
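The mapping rules above lend themselves to a simple automated check. The sketch below is illustrative Python only; the function name and data shapes are assumptions, not part of any prescribed platform. It validates that every PC is covered by at least one item and that no item tags more than the recommended number of PCs:

```python
# Hypothetical blueprint-coverage check (illustrative; not a prescribed tool).
def validate_blueprint(pc_codes, item_pc_map, max_pcs_per_item=4):
    """Return a list of human-readable issues found in the PC-to-item mapping.

    pc_codes: list of PC codes expected to be covered, e.g. ["PC1", "PC2"].
    item_pc_map: dict mapping item IDs to the list of PC codes they test.
    """
    issues = []
    # Every PC should be covered by at least one item.
    covered = {pc for pcs in item_pc_map.values() for pc in pcs}
    for pc in pc_codes:
        if pc not in covered:
            issues.append(f"{pc} is not covered by any item")
    # No item should tag more than the recommended 3-4 PCs.
    for item, pcs in item_pc_map.items():
        if len(pcs) > max_pcs_per_item:
            issues.append(f"{item} tags {len(pcs)} PCs (max {max_pcs_per_item})")
    return issues
```

For example, `validate_blueprint(["PC1", "PC2", "PC3"], {"Q1": ["PC1", "PC2"]})` would flag that PC3 is uncovered.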


Marks Allocation

The marks allocation for each PC is present in the Qualification document.


Marks can be further allocated against questions based on PCs or based on difficulty levels. Some samples are below:

Difficulty | No. of Questions | Marks per Question | Max. Marks | Nature of questions
Easy | X | A | X*A | Multiple choice
Medium | Y | B | Y*B | Multiple choice
Difficult | Z | C | Z*C | Constructed response
Total | X+Y+Z | - | X*A + Y*B + Z*C | Sum of marks as per QP
Table 7: Sample Marks Allocation based on Difficulty Levels
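The totals in the difficulty-based allocation can be computed mechanically: the maximum marks are the sum of question count times marks per question at each level. A minimal Python sketch (all names and the sample numbers are illustrative):

```python
# Illustrative arithmetic for a difficulty-based marks allocation:
# maximum marks = sum over levels of (question count * marks per question).
def max_marks(levels):
    """levels: list of (question_count, marks_per_question) tuples."""
    return sum(count * marks for count, marks in levels)

# e.g. 10 easy x 1 mark, 8 medium x 2 marks, 2 difficult x 5 marks
print(max_marks([(10, 1), (8, 2), (2, 5)]))  # 36
```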

NOS | PC details | Allotted Marks
1 | PC1. Check for ground compactness and levelling | a
1 | PC2. Check for all required scaffolding material, hand tools and consumables | b
2 | PC3. Wear and use required safety gadgets following trade safety | c
2 | PC4. Place and position sole boards as per marking | d
2 | PC5. Erect and dismantle scaffold of 3.6 meter height within tolerance limit | e
2 | PC6. Carry out proper housekeeping | f
Total Marks: a+b+c+d+e+f
Table 8: Sample Marks Allocation based on PC Distribution
Time Allocation

The time duration of the assessment should take into account time requirements for various sections (theory/ practical/
viva) and can be calculated in following ways:
n Item wise: The time required for answering each item is assigned based on its difficulty level. Taking the suggested
time duration for each difficulty level, the duration of the entire test is extrapolated by summing across the question
counts at each difficulty level.

Difficulty level | Time Duration
D1 | ~1 Minute
D2 | ~2 Minutes
D3 | ~3-5 Minutes
Table 9: Sample Matrix - Difficulty level and Time Duration

n NSQF level: The time duration for a complete test can also be pre-determined based on the NSQF level. The
recommended test duration for the Theory exam ranges between 60 and 180 minutes across all NSQF levels. The
actual durations should be determined by the SSCs depending on sectoral needs.
n Simultaneous vs. Individual Allocation for Candidates: For assessments in which candidates cannot sit
simultaneously for the theory, practical, viva, etc., this needs to be taken into consideration while determining the
total duration of the assessment. The duration of the assessment is the sum of the time during which candidates
are assessed together (usually the theory examination) and the time allocated for the assessment of individual
candidates (usually the practical and viva).
Based on the above, SSCs should identify ideal batch sizes for the job roles under their purview, based on the time
required for assessment. This shall help indicate whether an additional assessor may need to be sent, or whether the
assessment needs to span multiple days. In general, the average batch size is approximately as follows:

Job role type | Approximate batch size | Remarks
Manufacturing | 15 to 20 | Additional time or an additional assessor required for a batch size bigger than 20
Service | 15 to 30 | Additional time or an additional assessor required for a batch size bigger than 30
Table 10: Approximate Batch Size for Assessment
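The duration logic described above, where theory is taken simultaneously while practical and viva are assessed per candidate, can be sketched as follows (illustrative Python; the sample minutes are placeholders, not recommended values):

```python
# Rough sketch of total assessment duration, assuming theory is administered
# simultaneously to the batch while practical/viva happen per candidate.
def total_duration_minutes(theory_mins, per_candidate_mins, batch_size):
    """Common theory time plus per-candidate practical/viva time."""
    return theory_mins + per_candidate_mins * batch_size

# 60 min common theory + 10 min practical/viva each, for a batch of 20
print(total_duration_minutes(60, 10, 20))  # 260
```

A result like this can then inform whether the batch exceeds the sizes in Table 10 and needs an additional assessor or an extra day.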

Extrapolation of Scores

n Passing Criteria: The passing criteria for each job role should be considered as per industry standards. It can be
based on scheme guidelines or according to NSQF level. Candidates should obtain passing marks in each NOS
along with aggregated passing marks in the assessment.
n Scaling: Scaling is the extrapolation of scores by multiplying the assigned marks by a factor (the scaling factor) to
meet the desired score. For example, if an item tagged to a PC carries 15 marks, one may choose to:
a) Assign 15 marks directly to the item
b) Assign 3 marks to the item and use a scaling factor of 5 while calculating the total
In both cases, a candidate who answers the item correctly earns 15 marks.
n Partial Marking: Partial marking is awarded when the answer of an item received from a candidate is incomplete
or partially correct. There should be strict guidelines for partial marking on an item. Is partial marking permitted? If
yes, then what are the rules for partial marking? In assessor-evaluated assessments, assessors need to be provided
rubrics that clearly delineate the rules for marking.
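The scaling example and the NOS-wise passing rule can be illustrated with a short sketch (Python; the pass thresholds here are placeholders, since the actual criteria come from scheme guidelines or the NSQF level):

```python
# Illustrative scaling and passing-criteria sketch; thresholds are assumptions.
def scale(raw_marks, factor):
    """Extrapolate a raw score by the scaling factor."""
    return raw_marks * factor

def passed(nos_scores, nos_max, nos_pass_pct=0.7, aggregate_pct=0.5):
    """Candidate must obtain passing marks in each NOS AND in the aggregate."""
    each_nos = all(s >= nos_pass_pct * m for s, m in zip(nos_scores, nos_max))
    aggregate = sum(nos_scores) >= aggregate_pct * sum(nos_max)
    return each_nos and aggregate

print(scale(3, 5))  # 15 -- matches the worked example above
```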

Types of Items

A typical assessment can be divided into four parts:

1. Theory (MCQs, constructed-response questions, etc.)
2. Practical (gamification, simulation, demonstration, coding, etc.)
3. Viva Voce (oral questioning)
4. Project (formative assessment, work logs, etc.)
The following table illustrates a few item types that can be utilized while creating the item bank and Assessment
Blueprint. Please note that this list is not comprehensive.

S.No. Question Type Code Description


Section Multiple Choice MCQ In MCQs, Candidates are to choose a correct option from a range of probable
A Questions/ / CR options with reference to a stem/question.
Constructed In CRs, Candidates are to construct responses to address the questions asked
Response
1 Fill in the blanks FiB A part of a sentence construct is left blank for the candidates to fit in the
most appropriate/correct response.
2 Scenario-based ScB A situation is provided, and a question based on probable outcome of the
Question scenario/description is provided to the candidate.
3 Media-based MI Either a graphical situation is to be identified, or options are provided in the
(Images/audio form of images. This makes it easier for candidates to grasp the question
clips/ video clips) where language may not suffice.
4 Comprehension COM A passage is presented to the candidates (much like a scenario but in greater
detail). The candidates are then subjected to a series of questions solely
based on the passage. This tests their grasping and comprehension ability.
5 Logical reasoning LR Logical deduction-based questions that are constructed on logic for
measuring mental ability.
6 Chronological CS A sequence of events/outcomes/steps to be stated or arranged in a correct
sequencing flow/sequence.


7 Matching the MCo Two columns containing multiple elements related to each other are given
columns in a mixed order. The candidates need to match the elements correctly.
8 Factual Inferential FI Factual questions require fact-based answers. For example, a candidate
may be asked to look at a passage, then answer a series of factual questions
based on what they just read.
9 Direct Concept DC Fundamental MCQs which assess whether the candidates are aware of a
singular concept or fact. Mostly used to assess the theory knowledge of the
candidate. The questions are directly picked from the courseware.
Section Simulation-Based SiB Candidates are provided with real-life situations on a platform and assessed
B on their ability to solve the problem
1 Practical PP Direct evidence collection - Real work/ real time activities at the workplace.
Performance Suitable for predictable and routine works especially at operational levels.
This is also suitable for works that have a beginning and ending.
2 Role-Play RP Indirect evidence collection - Demonstration/Show-and-tell. Suitable for staff
dealing with a variety of work situations.
3 Case Study CS Indirect evidence collection - Questioning. Suitable for assessing analytical
skills and higher-order cognitive abilities.
4 Typing Test TT Some job roles require candidates to have typing skills. This question type
tests the candidate's ability to type quickly with the fewest mistakes.
5 Verbal VB Voice-based tests assess the listening/speaking abilities of the candidates.
Communication Pronunciation, comprehension, speech rate and other voice-related factors
Test are tested here. This test is useful for job roles where candidates have to
interact orally with clients.
Table 11: Question Types

Items need to be constructed well in order to ensure that the question being asked is interpreted correctly and allows
a fair chance to candidates to demonstrate their competency. Annexure 1 contains detailed instructions and tips for
the construction of Items.
Apart from existing digital solutions, various new technologies are being explored in the domain of assessments,
specifically with the development of augmented reality and virtual reality solutions. A detailed note on these
technologies is annexed (Annexure 9).

Difficulty Levels

The difficulty level of an item is the level of complexity/hardness of an item for the target group. Bloom’s Taxonomy
can serve as a structure for identifying the difficulty level of an item. Item difficulty is usually categorized on 3 levels,
namely: D1 – Easy, D2 – Moderate, D3 – Difficult, but this can be increased or decreased, as per the requirement. The
following table lists the parameters that can be taken into account when creating a difficulty matrix for an assessment.

Parameter | D1 (Easy) | D2 (Moderate) | D3 (Difficult)
Expected Correct Responses | 70-100% | 40-70% | Less than 40%
Level/Type of Concept | Basic knowledge/application; must-know/commonly known/routine task | Intermediate-level theory/application; moderately known/requires some experience + surrounding knowledge | Advanced theory/application/decision making; lesser known, requires prior knowledge of other concepts
Length of Stem | Up to 2 sentences (8-10 words each) | Up to 3 sentences (8-10 words each) or up to 2 sentences (12-15 words each) | At least 4 sentences (8-10 words each) or at least 3 sentences (12-15 words each)
Stem Complexity (Language) | Basic vocabulary (commonly used words) | Basic/intermediate vocabulary (few moderately used words) | Intermediate vocabulary (few moderately used words)
Number of Concepts | Only 1 | Up to 2 for basic; only 1 for intermediate | 3 or more for basic; 2 or more for intermediate; only 1 for advanced
Complexity of Distractors | Basic (vocabulary, concept, length) | Basic/intermediate (vocabulary, concept, length) | Intermediate (vocabulary, concept, length)
Length/Type of Concept/Vocabulary | 1-4 words | 4-6 words | More than 6 words
Question Type | Basic concept/fact based (e.g. fill in the blanks); basic single-correct-answer items; scenario (narrative or simulation) | Intermediate theory/application based (e.g. match options); complex multiple-choice items; scenario (narrative or simulation) | Advanced theory/application based (e.g. sequence/match options); complex multiple-choice items; scenario (narrative or simulation)
Cognitive Complexity (Bloom's Taxonomy) | Recall of basic routine and direct concepts/facts; skills based on understanding of direct concepts/facts | Recall of basic (less routine) and intermediate facts; skills based on understanding of intermediate concepts | Skills based on advanced concepts; critical thinking and reasoning based on evidence or application of intermediate and advanced concepts
Calculation Based | Single-digit operands | Up to double-digit operands | Calculations involving more than 2 digits

Other factors to take into account while determining the difficulty level of a test item:
n Difficulty distribution of test questions in an assessment can be decided based on the NSQF level of the job role.
The ratio can be maintained for all job roles at a particular NSQF level. As the NSQF level increases, the difficulty
mix of an assessment tends towards higher difficulty levels.
n All difficulty levels are assigned to enable discrimination between candidates based on their performance and the
type of test.
A sample difficulty matrix is suggested below. Each SSC should create its own matrix depending on sector-specific
requirements.

NSQF Level | D1 (Easy)* | D2 (Moderate)* | D3 (Difficult)*
10 | 0 | 30 | 70
9 | 5 | 40 | 55
8 | 10 | 40 | 50
7 | 20 | 45 | 35
6 | 30 | 45 | 25
5 | 40 | 45 | 15
4 | 50 | 40 | 10
3 | 55 | 35 | 10
2 | 55 | 40 | 5
1 | 65 | 35 | 0
Table 12: Sample Difficulty Matrix across NSQF levels
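As a worked example of applying the sample matrix, the sketch below derives approximate question counts per difficulty level from a total question count and an NSQF level (illustrative Python; only two sample rows from Table 12 are copied, and absorbing rounding drift into D2 is an arbitrary choice, not a rule):

```python
# Illustrative difficulty-mix calculation using two rows of the sample matrix.
SAMPLE_MIX = {4: (50, 40, 10), 7: (20, 45, 35)}  # % of D1, D2, D3 questions

def question_split(total, nsqf_level, mix=SAMPLE_MIX):
    """Approximate (D1, D2, D3) question counts for a given total."""
    percentages = mix[nsqf_level]
    counts = [round(total * p / 100) for p in percentages]
    counts[1] += total - sum(counts)  # absorb any rounding drift into D2
    return tuple(counts)

print(question_split(40, 4))  # (20, 16, 4)
```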


The Role of Assessors in Assessments

Assessors are stakeholders who are directly involved in evaluating candidates on various parameters defined in a
standard rubric for QP/ NOS assessment. Assessors are required to perform the act of evaluation for all assessments
that are not scored by system intelligence. They undergo a Training of Assessor (ToA) program that aligns them with
the assessment requirements in the short-term skill ecosystem.
For all assessor-based assessments (whether viva, practical skills assessment, or responses on a theory paper), a
rubric and scoring structure is to be provided to ensure standardization in scoring. The rubrics and checklists need
to be further coupled with Training of Assessors (ToA) and a comprehensive Assessor Guide that benchmarks how
evaluation and scoring need to be conducted, attempting to minimize subjectivity. The tables below are sample
rubrics/grading matrices that can be referenced when creating assessments.

Parameter Description
Excellent If a candidate can perform a particular step/rubric with perfection and is able to display the skills
required for completing that particular task, then he/she must be allotted Excellent i.e. 100% of
the assigned marks.
Good If a candidate can perform a particular step/rubric with perfection but he/she is unable to display
the skills required for completing that particular task, then he/she must be allotted Good i.e. 75%
of the assigned marks.
Satisfactory If a candidate performs a step/rubric but is unable to showcase the perfection and skills required
for completing that task, then he/she must be allotted Satisfactory i.e. 50% of the assigned marks.
Poor If a candidate is not able to perform a step/rubric and he/she has no knowledge of skills required
for completing that particular task, then he/she must be allotted Poor i.e. 0% of the assigned
marks.
Table 13: Sample Grading Matrix for Scenario-based Questions

Parameter Description
Correct If a candidate answers all the points as mentioned in the description and can showcase the
complete knowledge related to the topic being asked, then he/she must be allotted Correct
i.e. 100% of the assigned marks.
Partially Correct If a candidate answers few points as mentioned in the description and is not able to showcase
the complete knowledge related to the topic being asked, then he/she must be allotted
Partially Correct i.e. 50% of the assigned marks.
Incorrect If a candidate is unable to answer any points as mentioned in the description and has no
knowledge of the topic being asked, then he/she must be allotted Incorrect i.e. 0% of the
assigned marks.
Table 14: Sample Grading Matrix for Viva Voce Questions
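The two sample grading matrices reduce to a simple mapping from rubric grade to a percentage of the assigned marks, as sketched below (illustrative Python mirroring Tables 13 and 14; actual rubrics are defined per SSC):

```python
# Grade-to-marks mapping mirroring the sample grading matrices above.
SCENARIO_GRADES = {"Excellent": 1.0, "Good": 0.75, "Satisfactory": 0.5, "Poor": 0.0}
VIVA_GRADES = {"Correct": 1.0, "Partially Correct": 0.5, "Incorrect": 0.0}

def award(grade, assigned_marks, matrix):
    """Marks awarded = grade percentage * marks assigned to the step/question."""
    return matrix[grade] * assigned_marks

print(award("Good", 8, SCENARIO_GRADES))            # 6.0
print(award("Partially Correct", 10, VIVA_GRADES))  # 5.0
```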

Assessor Guide

The Assessor Guide is a manual used by an assessor for conducting and evaluating assessments. An Assessor Guide is
based on a QP/NOS and gives a clear standard operating procedure on how one must assess a specific qualification.
The purpose of the Assessor Guide is as follows:
n It clearly states how the qualification will be assessed and provides details on the assessment events, processes
and instruments that will be used. It also details the possible outcomes of the assessment
n It details how the relevant occupational national standards (NOS) and its respective assessment criteria shall be
assessed using mapped items in a test form
n It serves as a guide for the trainer in their course delivery as they prepare the candidates for the assessment. It
also assists candidates in undertaking preparation for the assessments

n It clarifies for assessors, trainers, candidates, auditors, training providers and employers what a competent person
can do and how these criteria have been met
A sample template for the Assessor Guide is below.

S.No Key Points in Assessor Guide


Overview 1. QP/NOS details and codes
2. Purpose of Assessment
3. Competency Assessment Level
4. Description of Candidates
5. Eligibility criteria of Assessor for the domain Job role
Assessment Specifications 1. Domain Infrastructure Equipment, Tools, Consumables
2. Assessment Criteria
3. Preferred mode of Assessment
4. Assessment Methods and Instruments
5. Assessment Venue
6. Duration of Assessment
7. Sequence of Activities in the Assessment
8. Turnaround times for the Assessment
9. Evidence Gathering Plan for practical competencies
10. Evidence Gathering Plan for Knowledge

Conduct of Assessment 1. Instructions for Assessors


a) Before conduct of assessment
b) During conduct of assessment
c) Post conduct of assessment
2. Instructions for candidates
Records 1. Performance Criteria Assessment Checklist
2. Knowledge Assessment Checklist
3. Assessment Results
Code of Conduct and 1. Grievance Redressal mechanism and contact details of AA and SSC
Grievance Mechanism 2. Code of Conduct
Appendices 1. Code of Practice for Assessor
2. List of Tools, Equipment and Supplies
3. Marks Allocation/ Marking Scheme
4. Candidate’s Appeal Form for re-evaluation/ re-assessment
Table 15: Sample Template for Assessor Guide

Target Group & Language

Language is an important factor to consider for assessments. Assessments should be translated into local/regional
languages as per the requirements of the target group. Test items must be designed to be culturally appropriate and
to use the on-field terminology of the specified regions. Translators should be engaged for this purpose, and all
translated assessments should be cross-checked by local vernacular experts.


Assessment Blueprint and Item Bank Approval & Review

All Blueprints & Item Banks developed should be approved by at least 2 independent SMEs who may be from the
industry or academia, empaneled on the criteria detailed in Table 1.

Generating the Test Form

The test form is a question paper that is generated for the assessment, drawing items from the item bank in a
combination that complies with the requirement of the Assessment Blueprint for the QP/NOS.
n The assessment platform should be able to generate a test form with a set of questions representing the PCs from
all the NOS based on the difficulty level of each item, such that it is in perfect alignment with the Assessment
Blueprint. For non-digital assessments, test forms should be created beforehand drawing items from the approved
item bank in a similar manner.
n The scoring matrix of the test form should be aligned to the Assessment Blueprint
n The questions should be systematically arranged within the test form
n The answers and distractors should also be systematically arranged within the items
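A simplified sketch of test form generation is below (illustrative Python; it enforces only the per-difficulty counts, whereas a real platform would also enforce PC coverage, marks and timing per the Blueprint; all names are assumptions):

```python
import random

# Hypothetical test-form generator: draw items from an item bank so that the
# counts per difficulty level match the blueprint requirement.
def generate_test_form(item_bank, blueprint, seed=None):
    """item_bank: list of dicts, each with a 'difficulty' key (e.g. "D1").
    blueprint: dict mapping difficulty level to required question count."""
    rng = random.Random(seed)
    form = []
    for level, needed in blueprint.items():
        pool = [item for item in item_bank if item["difficulty"] == level]
        if len(pool) < needed:
            raise ValueError(f"Item bank has too few {level} items")
        form.extend(rng.sample(pool, needed))
    # A real platform would arrange items systematically; a shuffle stands in.
    rng.shuffle(form)
    return form
```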

Assessment Planning
Assessment planning involves a host of activities such as aligning and training personnel for management of the
assessment process, empaneling assessment agencies, defining appropriate modes of assessment, exploring
suitable proctoring solutions, training and certifying assessors, and ordering the entire assessment process. This
section elucidates the work that must be done in order to bring systematization and operational efficiency into the
assessment process.

Annual Assessment Planning

At the start of the financial year, SSCs are expected to prepare an annual assessment plan, based on their visibility of
training and assessment targets under various schemes, whether government-funded or non-government-funded.
SSCs should share an indicative plan with all their empaneled AAs, enabling greater preparation and a shift away
from ad-hoc assessment assignments.

Management of Assessment Agencies



Empanelment of Assessment Agencies
Any AA duly empanelled with NCVET shall be permitted to operate with the concerned SSCs, after an operational and
financial Service Level Agreement (SLA) has been drawn up between the SSC and the AA. In the event that there are no
relevant AAs (based on sector, job-role, geography coverage) empanelled with NCVET, the SSC may directly work with
AAs using the SOP – Criteria for the Empanelment of Assessment Agency by SSCs. The SOP is annexed (Annexure 6).
Grading of Assessment Agencies
It is recommended that SSCs prepare a transparent performance measurement and feedback system for rewarding
and recognizing well-performing AAs and thereby driving quality assessment administration. The following could be
some of the parameters assessed in a grading matrix:

S. No Suggested Parameters for Performance Evaluation Matrix


1 Percentage of Assigned Assessments Accepted & Assessed by AA
2 Accuracy in calculation of results in adherence to assessment framework
3 Percentage of batches for which result has been uploaded within the defined TAT requirements
4 Percentage of batches where the Assessment Agency has captured all the evidence as required by the
respective awarding body/Sector Skill Council
5 Number of batches, if any, where Assessment Agency has been found involved in malpractice
6 Number of Assessors who have been blacklisted pertaining to Assessment Agency
7 Number of Assessors who have been suspended pertaining to the Assessment Agency
8 Percentage of batches which were delayed because of delay at the end of Assessment Agency
9 Percentage of batches for which reports and analytics were provided by the AA
10 Technological capability of the AA with regard to SSC-specific assessment
11 Reporting and analytics capability of the AA with regard to SSC-specific assessment
Table 16: Suggested Matrix for Performance Evaluation of AAs

While developing a performance evaluation grading matrix, SSCs should note that:
n Each parameter may be designated a weightage out of 100, based on its importance in sector assessments
n The matrix should be made available to all empaneled AAs, listing expectations, reward structure, and
consequences, if any. The volume of assessments assigned to an AA may be commensurate with the performance
of the AA.


n The review should happen on a quarterly basis, during which outcomes are made available to the concerned
stakeholders with points of action, if any
n Based on the scores generated, AAs may be bucketed into grades. A sample grading matrix is provided in Table 17
n A minimum score may be benchmarked which shall serve as the minimum requirement for AAs to remain
empaneled with an SSC

Grade | Criterion
Grade 1 | AAs scoring > 90%
Grade 2 | AAs scoring between 80-90%
Grade 3 | AAs scoring between 60-80%
Table 17: Sample Grading Matrix for AAs
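The weighted scoring and grade bucketing described above can be sketched as follows (illustrative Python; the parameter weights in the example are assumptions, since each SSC assigns its own weightages out of 100):

```python
# Illustrative AA performance grading: weighted score out of 100, then a grade
# band per the sample matrix (>90, 80-90, 60-80, else below the benchmark).
def aa_grade(scores_and_weights):
    """scores_and_weights: list of (score out of 100, weight) pairs,
    with weights summing to 100."""
    total = sum(score * weight for score, weight in scores_and_weights) / 100
    if total > 90:
        return "Grade 1"
    if total >= 80:
        return "Grade 2"
    if total >= 60:
        return "Grade 3"
    return "Below minimum benchmark"

print(aa_grade([(95, 50), (90, 30), (85, 20)]))  # Grade 1
```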

Assessment Fee
An assessment involves both the SSC and the AA, and the assessment fee is shared between the two stakeholders,
contingent on the functions performed by each. To advance standardization and ensure transparency in the
functioning of assessments, it is recommended that a uniform revenue-sharing matrix based on responsibility
undertaken be introduced as follows:

Actor Responsibility Recommended Share


AA n Assessment design support to SSC
n Provision of assessment interface with desired functionalities 60%
n Assessment execution
n Assessor (if any)
n Proctor (if any)
n Online proctoring (if applicable)
n Evidence Collection and Documentation
n Uploading scores
SSC n Development of Blueprint
n Development of Item Bank 40%
n Monitoring and Audits
n Quality Assurance
n Grievance Redressal
n Item and Blueprint Review
n Performance Management of AAs and Assessors
n Reports & Analytics
n Certification
Table 18: Recommended Revenue Sharing between SSC and AA

Further, it is recommended that:

1 A defined fee should be levied on TPs for postponement or cancellation of an assessment within 24 hours of the
designated time of the assessment.
2 A defined fee should be levied on AAs for cancellation of an assessment within 24 hours of the designated time
of the assessment.
3 The revenue sharing for the AA may be increased for less accessible geographies, such as North Eastern regions,
Jammu & Kashmir, Left-Wing Extremist locations, Andaman & Nicobar Islands, etc.
4 The payment of the assessment fee should be made monthly to SSCs for all Government-aided projects and
schemes. AAs should also receive payments for all assessments conducted on a monthly basis.
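The recommended 60/40 split, with an increased AA share for less accessible geographies, can be illustrated as below (illustrative Python; the 10-point uplift is a placeholder, since the guide recommends an increased share but does not fix a number):

```python
# Illustrative fee split per the recommended 60/40 AA/SSC revenue sharing.
def fee_split(total_fee, remote_location=False, aa_share_pct=60, remote_uplift=10):
    """Return (AA share, SSC share). The uplift for less accessible
    geographies is an assumed placeholder value."""
    share = aa_share_pct + (remote_uplift if remote_location else 0)
    aa = total_fee * share / 100
    return aa, total_fee - aa

print(fee_split(1000))        # (600.0, 400.0)
print(fee_split(1000, True))  # (700.0, 300.0)
```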

Mode of Assessment Delivery

Depending on the differentiated needs across sectors and job roles, coupled with technological advancements in the
field of assessments, a range of assessment delivery modes have emerged. The delivery modes are derived taking into
consideration factors such as:
n Requirement of an Assessor (physically present, remotely present, or not present)
n Requirement of a Proctor (physically present, remotely present, or not present)
n Internet availability at assessment location
n Requirement of domain infrastructure, equipment & tools for practical skills assessment
n Digital literacy level of candidates to be assessed
n Availability of technological solution (infrastructure or software) for administration of assessment
Depending on a combination of the above factors, the prevalent modes of administering assessments fall broadly
within the following three categories:
I. Remote online assessments
II. Center-based digital assessments
III. Center-based non-digital assessments
The summary table below details these broad categorizations into various modes of assessment. SSCs must examine
the above-mentioned factors influencing assessments and determine the most appropriate mode of assessment
delivery for every assessment.

Category | Mode No. | Description | Assessor | Proctor | Internet

Remote Assessment (online)
Mode 1 | Applicable for job roles for which competency can be assessed online and auto-scored (preferred for cases where there is no requirement of domain infrastructure for assessment) | Assessor: No | Proctor: Yes (remote) | Internet: Online
Mode 2 | Applicable for job roles for which competency can be assessed online and scored remotely by an assessor | Assessor: Yes (remote) | Proctor: Yes (remote) | Internet: Online

Center-Based Digital Assessment (online or offline)
Mode 3 | Applicable for job roles for which domain infrastructure may be required for assessment; assessed by an Assessor on-ground supported by a Proctor | Assessor: Yes (on-ground) | Proctor: Optional | Internet: Online or Offline
Mode 4 | Applicable for job roles for which competency can be assessed online and auto-scored | Assessor: No | Proctor: Yes (on-ground) | Internet: Online or Offline
Mode 5 | Applicable for job roles where competency can be assessed remotely, through video recordings captured by the Proctor (on-ground) | Assessor: Yes (remote) | Proctor: Yes (on-ground) | Internet: Online or Offline

Center-Based Non-Digital Assessment
Mode 6 | Cases wherein theoretical knowledge is tested using pen-and-paper and practical skills are assessed by the Assessor | Assessor: Yes (on-ground) | Proctor: Optional | Internet: Offline

Note:
n Mode 6 or Center-Based Non-Digital Assessments are not considered a favorable mode of assessment. The
theory section of all assessments is expected to be administered digitally, and this expectation should only be
relaxed in extreme circumstances wherein digital administration is a challenge.


n All digital assessments must have a minimum proctoring requirement of a secure browser. Wherever there is
no proctor on-ground, AI-enabled auto-proctoring should be used.
n A minimum internet connectivity of 2 Mbps is required to support digital online assessments.
n Internet usage during the assessment enables real-time upload of information. In case of non-availability of
internet, information and data logs are recorded locally and synced with the server when internet becomes available.
n Wherever no internet is available for conduct of a digital assessment, a proctor must necessarily be present
on-ground.
n All modes of assessments where there is no requirement of an assessor (Mode 1 and Mode 4) refer to those
assessments that are auto-scored by system intelligence.
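The offline store-and-forward behaviour noted above (record data logs locally when internet is unavailable, then sync with the server once connectivity returns) can be sketched as follows. This is a minimal illustration; the class and method names are assumptions, not part of any mandated assessment platform:

```python
import json
import time


class AssessmentLogQueue:
    """Store-and-forward sketch: buffer assessment events locally and
    flush them to the server whenever connectivity is available."""

    def __init__(self, uploader):
        self._uploader = uploader  # callable that sends one record; raises on failure
        self._pending = []         # local buffer used while offline

    def record(self, candidate_id, event, payload=None):
        """Always record locally first, then attempt an immediate sync."""
        self._pending.append({
            "candidate_id": candidate_id,
            "event": event,
            "payload": payload or {},
            "timestamp": time.time(),
        })
        self.sync()

    def sync(self):
        """Upload buffered records in order; stop at the first failure
        so nothing is lost if the connection drops mid-sync."""
        while self._pending:
            try:
                self._uploader(json.dumps(self._pending[0]))
            except ConnectionError:
                return False  # still offline; keep records buffered
            self._pending.pop(0)
        return True

    @property
    def pending_count(self):
        return len(self._pending)
```

The key design point, reflecting the note above, is that nothing is ever written only to the network: every event lands in the local buffer first, so a dropped connection never loses assessment data.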
The categorizations and their use-cases are detailed below:
I. Remote Assessments (Online) are assessments that can be undertaken from any location (home, TC location, assessment center) and assessed effectively on a technology device, using proctoring solutions such as AI-enabled tools and invigilation through a live video stream by a remotely situated proctor. Depending on the evaluation requirements, remote assessments (online) may or may not involve a remote assessor. This delivery method is preferred when:
n The assessment does not require domain infrastructure, equipment or tools for assessing competency
n The assessment can be administered on a technology device that is available with the candidates to be
assessed
n The technology device specifications should be provided well in advance
n Uninterrupted internet is available at the assessment location at the speed and bandwidth requirement of
assessment platform
n AI-enabled auto-proctoring tools are available for assessment
n Candidates are comfortable with technology
n The scheme under which the assessment falls permits the conduct of remote online assessment
II. Center-Based Digital Assessments (Online or offline) are assessments undertaken at a center (TC or assessment center), with at least one person (Assessor and/or Proctor) on-ground to invigilate and administer the assessment. There are various modes of assessment within this category, largely characterized by the involvement of the personnel on-ground. Proctors are recommended for this type of assessment. Such assessments can be conducted online (using the internet) or offline (using a LAN or pre-loaded software), depending on the availability of internet. Where non-availability of internet hinders the real-time upload of information, the assessment data logs are recorded and synced with the server at a later time. AI-based auto-proctoring is recommended for all cases wherein internet is available. This delivery method is preferred when:

n The assessment requires domain infrastructure, equipment or tools for assessing competency
n The IT infrastructure is to be provided to the candidate by the Center/Assessment Agency
n Internet is available at the assessment location at the bandwidth and speed required by the technology platform (online), or internet is unavailable and the assessment is conducted offline
n An assessor is required to evaluate the assessment
n There is a mandatory requirement to conduct center-based assessment
III. Center-Based Non-Digital Assessments are administered entirely without the use of technology, using pen-and-paper mode for the theory assessment. The assessments are administered by the Assessor on-ground at the assessment location and may be supported by a proctor. The assessor manually processes the assessment scores and shares them with the Assessment Agency.

This delivery method is not preferred and should only be implemented in extreme circumstances wherein digital
administration is not possible due to limitations at the candidate’s end. When utilized, it should be ensured that:
n The question paper complies with the requirements of the Assessment Blueprint
n Hard copies of the question paper are generated in advance by the AA and sealed in an envelope for the
Assessor to carry on the day of the assessment. The envelope should only be opened on the day of assessment in
front of candidates and proctor

Digital Assessment Interface

Based on infrastructure and equipment availability, as well as availability of internet at the assessment location, the
following interfaces can be leveraged for digital assessments:
I. Browser-based: Browser-based assessments can be administered on any device (computer, tablet or smartphone) through commonly available browsers such as Google Chrome, Safari, Mozilla Firefox, Internet Explorer, Opera, etc. These assessments are conducted online using candidate-specific log-in credentials and do not require an additional application to be downloaded or installed.
II. Secure Browser-based: Secure browsers are specific to the Assessment Agency and have the advantage of
incorporating features such as control over browser minimizations and switches, limiting background applications,
single log-ins and more. The following list of features should be present as part of a secure browser for assessments:
n Locked-down browser
n Disables screen recording
n Disables screen-sharing
n Disables screen projection
n Disables access to any other tool
n Disables access to Google or any other external site
n Disables Cut/Copy-Paste
n IP Whitelisting (recommended)
n Internet Scanning (recommended)
III. Application-based: Application-based assessments are administered through an Assessment Agency application
installed on a device such as a tablet or smartphone. These assessments have the advantage of including checks
such as disabling copy-paste functionality, sharing, flagging violations, etc.
Sample instructions for candidates appearing for digital assessments are annexed (Annexure 2).

Types of Proctoring Solutions

Assessments can be delivered to candidates under various proctoring formats. Proctoring is used to troubleshoot technical concerns and to prevent any malpractice that may manipulate scores during the assessment. Apart from placing a proctor physically on-ground at the assessment location, technology can also be leveraged for the proctoring of assessments, whether conducted remotely or on-ground. The three main types of proctoring modes are:
1. Manual Offline Proctoring
2. Live Remote Proctoring
3. AI-Enabled Auto-Proctoring
These are detailed in the section below:

Manual Offline Proctoring


Manual Offline Proctoring is the traditional form of proctoring, wherein proctors are physically present on-ground at the assessment center as technical experts and invigilators. The proctors are the eyes on the ground, responsible for monitoring the assessment, completing documentation, and raising complaints in case of any anomalies.
Live Remote Proctoring
Live Proctoring refers to the proctoring of an assessment by a remotely situated qualified human proctor through real-
time image/video streaming and screen-sharing feeds. The salient features of Live Proctoring include:
n The tool allows a proctor to remotely invigilate candidates sitting for online remote assessments
n The proctoring software (auto-proctoring) provides a list of red flags whenever it detects suspicious behavior,
which can be reviewed in real-time or post the assessment to assess the integrity of the exam and take appropriate
action
n The proctor can pause the test when they notice anything unusual and chat with the test-taker or terminate
the exam, if needed
n The proctor can provide instructions or troubleshoot basic technical issues during the assessment, in case any
arise
AI-enabled Auto-proctoring
Auto-proctoring tools use system-driven logic and artificial intelligence to restrict undesirable behaviour during the assessment process and raise flags wherever anomalies are detected in the web-camera feed. The AI-based solutions are trained on large amounts of data to raise flags accurately. Some of the key flags that the system should raise are:
n Window switch count: In the absence of a secure browser for conducting assessments, the system can log each time the candidate switches away from the assessment window. This tool can notify invigilators and assessors immediately, and the system can terminate the assessment after x (e.g. 5) browser switches.
n Face detection: AI-enabled technology can be utilized to identify the number of faces in front of the screen
during the assessment. For example, it can detect multiple faces or identify if the candidate moves away from the
test window. Such detections can be recorded and highlighted for review by a proctor. The proctor can also send
warnings in case of a detected violation.
n Object detection: AI-enabled technology can detect the presence of suspicious objects like headset, mobile
phone, etc, in front of the screen during the assessment.
n Simultaneous login attempts: The system can also detect, and in some cases prevent, multiple log-ins for a single candidate. The candidate should be automatically logged out and the assessment terminated after x (e.g. 3) attempts at a multiple log-in.
n Turning away from test window: The AI also analyses the candidate’s facial features, including their ears,
enabling the system to detect if the candidate turns away from the test window.
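The window-switch and simultaneous-login rules above reduce to simple threshold counters. The sketch below is a minimal illustration under stated assumptions: the limits (5 window switches, 3 login attempts) mirror the examples in the text, and the class and method names are hypothetical, not part of any specific proctoring product:

```python
class ProctoringFlags:
    """Sketch of threshold-based auto-proctoring flags: terminate the
    assessment once a violation count reaches its configured limit."""

    def __init__(self, max_window_switches=5, max_login_attempts=3):
        self.limits = {
            "window_switch": max_window_switches,
            "login_attempt": max_login_attempts,
        }
        self.counts = {"window_switch": 0, "login_attempt": 0}
        self.terminated = False
        self.flags = []  # audit trail for later review by the proctor/AA

    def report(self, violation):
        """Log one violation; return True if the assessment should terminate."""
        self.counts[violation] += 1
        self.flags.append((violation, self.counts[violation]))
        if self.counts[violation] >= self.limits[violation]:
            self.terminated = True
        return self.terminated
```

Keeping every flag in an audit trail, rather than only the terminating one, is what later allows the proctor and AA to review the full sequence of anomalies.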
It is mandated that AAs use the data generated through flags for providing a Credibility Index to the Assessment. The
Candidate Credibility Index helps in flagging cases where anomalies are spotted during the assessment and enables
the proctor and AA to identify cases of malpractice and cheating. The Credibility Index tool should be customizable
with respect to the framework for flagging and weightages assigned to each flag.
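A customizable Credibility Index of the kind mandated above could, for instance, be computed as a weighted deduction from a perfect score, with the penalty per flag type configurable by the AA. The flag names and penalty values below are illustrative assumptions, not values prescribed by this guide:

```python
def credibility_index(flag_counts, weights=None, floor=0):
    """Start from a perfect score of 100 and deduct a configurable
    penalty per flag occurrence; the AA tunes weights per flag type.

    flag_counts: e.g. {"multiple_faces": 2, "window_switch": 1}
    weights:     penalty deducted per occurrence of each flag type
    """
    weights = weights or {
        "multiple_faces": 15,   # another person visible in frame
        "object_detected": 10,  # phone/headset detected on camera
        "window_switch": 5,
        "turned_away": 3,
    }
    penalty = sum(weights.get(flag, 0) * count
                  for flag, count in flag_counts.items())
    return max(floor, 100 - penalty)
```

For example, a batch with two window switches would score 90, while repeated multiple-face detections would drive the score toward the floor, prompting a closer malpractice review.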

Assessment Personnel within the SSC

An assessment team at the SSC/ AA works closely with affiliated SMEs to create and validate the test items. The
benefit of having an assessment team within the organization is the reduction in cost and time saved in assessment
development, as well as tighter control over assessment. The assessment team manages the ongoing production and maintenance of assessments, awarding, results release, and standard setting. The figure below is an example of what an assessment team at the SSC/AA could look like:

Figure 6: Example of Assessment Team Structure at SSC/AA


Assessment teams within each SSC play a crucial role in ensuring quality control and adherence to relevant assessment standards. These teams can further the SSC's efforts to:
n Enhance the efficacy of assessment methodology, leading to adequate recognition of current skills, knowledge
and experience of trainees
n Increase quality assurance of the assessment design, development and delivery and instill standardization in
processes
n Enhance industry relevance of assessments to augment the value of the certification
n Ensure continuous improvement in quality of operations through performance assessment of AAs
n Instill a system of competition amongst training providers, trainers and assessment agencies by setting a quality standard
n Integrate assessment development more closely with the training program and curriculum
SSCs can achieve the aforementioned objectives by having a team comprising members of varied backgrounds,
including statistical analysis, assessment design, pedagogy experts, researchers, and SMEs.

Orientation Required by Personnel for Conducting Assessments

The assessment process involves various actors with different roles to play during the assessment. To ensure that each stakeholder is aligned to the requirements of assessments in the skill ecosystem, each must be familiarized with key resources so that they perform their role effectively. The section below details a non-exhaustive list of subjects that personnel must be oriented on before partaking in the assessment process. Orientations over and above those stated may also be conducted:

A. Assessment Designers must have a grasp on:


i. The QP and its corresponding Assessment Blueprint
ii. The different modes of assessment available (digital & non-digital) and their suitability for the QP
iii. The assessment platform and its functionalities for configuring the assessment
B. Assessors must be apprised on:
i. The appropriate means to score candidates on competencies demonstrated
ii. The conduct of assessments in alignment with the Assessor Guide
iii. The assessment platform/technology on which the assessment is administered
iv. The questions and environments created for the assessment


v. Specific requirements depending on the type of assessment (using technology such as AR/VR, conducting
assessments remotely, etc)
vi. Scheme-specific documentation and compliance requirements
vii. The standard operating procedure to raise complaints or malpractices
C. Proctors must be oriented with:
i. Technological know-how, understanding of software and hardware to troubleshoot technical issues that arise
during assessments, including regulating issues with the internet, LAN, and browser
ii. AA’s platform and application for conduct of the assessment
iii. Evidence capturing requirements
iv. Compliance requirements with respect to candidate attendance & verification, geo-tagging
v. Raising flags and invigilation responsibility
vi. Scheme-specific documentation and compliance requirements
vii. The standard operating procedure to raise complaints or malpractices

D. Assessment Agencies must be oriented with:


i. The different modes of assessment available (digital & non-digital) and their suitability for the QP
ii. The QPs and their corresponding Assessment Blueprints
iii. Requirements for processing results
iv. Compliance requirements with respect to candidate attendance & verification, geotagging, and Skill India
Portal
v. Application of technology to assess competencies
vi. Data analysis, storing and verification requirements
vii. Review and analysis process for items in question papers and performance in assessment batches, and escalate
cases for review
viii. Available proctoring solutions and interpretation of flags raised
ix. Evidence capturing requirements
x. Scheme-specific documentation and compliance requirements
xi. The standard operating procedure to raise complaints or malpractices

E. Sector Skill Council staff must be oriented with:


i. The different modes of assessment available (digital & non-digital) and their suitability for the QP
ii. Application of technology to assess competencies
iii. Result processing and approval process
iv. Data analysis for ensuring quality in assessments
v. Monitoring requirements for ensuring quality in assessments
vi. AA’s platform and application for conduct of the assessment, and types of proctoring flags raised during the
course of assessment

F. Subject Matter Experts must be oriented with:


i. NSQF level and a general understanding of the skill ecosystem

ii. QPs/ NOS, and their components, including the Assessment Criteria
iii. Guidelines on Assessment and requirements for a blueprint
iv. The different modes of assessment available (digital & non-digital) and their suitability for the QP
v. Application of technology to assess competencies

Certification of Assessors

SSCs schedule and conduct Training of Assessor (ToA) programs to orient and certify assessors on the requirement
of assessing on QPs/NOS aligned to the NSQF framework. Assessors undergo a program that has the following
components:
n Domain Orientation, during which the assessor is familiarized with the domain QP, its components, and an understanding of the Indian skill system
n Assessment Skills, during which the assessor is briefed on the principles of assessment and apprised of the processes and protocols that must be followed while conducting assessments
The ToA program is closed with a rigorous assessment to evaluate both the Domain and the Assessment Skills of the
assessor. All those who successfully clear both assessments are deemed as qualified assessors and receive a certificate
on the Skill India Portal. The ToA program is conducted in accordance with the Guidelines for Training of Trainers and
Assessors, publicly accessible on the NSDC and MSDE websites. Assessors who may appear for this program must meet
a minimum eligibility criterion for the specific job role. Details on the Guidelines and Eligibility Criteria for Assessors are
annexed (Annexure 8).
AAs are encouraged to undertake independent performance evaluations of all their empanelled assessors through a defined performance evaluation matrix. The key anticipated outcomes of this evaluation are three-pronged:
n Comprehensive rating mechanism depicting strengths and highlighting areas of improvements
n Improving quality by recognizing and rewarding good behaviour
n Improving the professional and career aspirations of assessors
The approaches adopted could include self-rating by the assessor, rating by the AA, and evaluation by an independent third party. The parameters taken into account could include feedback collected from stakeholders during assessments, quality outcomes during assessments and adherence to requirements, academic qualification and work experience of the assessor, cognitive and leadership skills, ToA scores, and adherence to TATs, among others.

Recognition of Proctors

Proctors are individuals who support the assessment by ensuring readiness and assistance on technology and
infrastructure requirements, fulfilling documentation requirements, invigilating the assessment, and alerting
authorities in case of any anomalies.
The eligibility criteria for a proctor are:
n Minimum education qualification: Class X Pass
n Skill set: Technical knowledge for troubleshooting fixes in technology systems, internet, hardware, software,
etc
Detailed instructions for proctors are available in the Proctor’s Handbook (Annexure 5).


Assessment Administration
This section details the process of administering an assessment in chronological order, starting with pre-assessment activities, moving on to the assessment process on the day of assessment, and closing with post-assessment activities.

Assessment Activities

The section below details the various tasks that must be undertaken for administering an assessment and the
stakeholder primarily responsible for the task.
Pre-Assessment
1. Initiate a request for assessment to the SSC on the Skill India Portal, specifying the following (Responsibility: Training Center/Employer):
i. Batch details
ii. Language preference
iii. Any special assistance requests (reader/writer/time consideration)
iv. Preferred mode of assessment (indicating internet and power availability)
2. Review the request and allocate the following actors for the assessment (Responsibility: Sector Skill Council):
i. Assessment Center
ii. Assessment Agency
3. Accept the assessment request on the Skill India Portal and ensure the following (Responsibility: Assessment Agency):
i. Allocation and acceptance of the batch by Assessor and Proctor
ii. Necessary travel arrangements for Assessor/Proctor, wherever applicable
iii. Readiness of the assessment interface
iv. Readiness of devices, if any
v. Readiness of the test form, in the language of choice
vi. Coordination with stakeholders to enable smooth administration of the assessment; communication of the Assessor/Proctor with the Assessment Center 24 hours before the start of the assessment
vii. Arrival of the Assessor/Proctor at the stipulated time (if applicable)
4. Ensure the following are in place as per the requirements of the SSC and AA (Responsibility: Assessment Center):
i. IT infrastructure, including network connectivity
ii. Domain and general infrastructure
iii. Domain tools and equipment in adequate quantity
iv. Consumables in adequate quantity
v. IP camera equipment to record the assessment
5. In case of remote online assessment, ensure the following are available for the end user (Responsibility: Training Center):
i. Hardware compliant with the specifications required for the online remote assessment
ii. Internet connection with a minimum speed of 2 Mbps
iii. Internet and hardware that can support live streaming through a front-facing camera
iv. Adequate power backup
v. Basic tools and consumables, if any
6. Ensure all candidates are apprised of all the conditions underpinning the assessment and all expectations from their end (Responsibility: Assessment Agency with Training Center)

During Assessment
1. Candidate registration and post-registration support (Responsibility: AA/Assessment Center)
2. Candidate-level compliances (Responsibility: TC):
i. Travel logistics (if applicable)
ii. Reporting of candidates at the reporting time
iii. Availability of genuine candidates
3. Ensure the Assessor/Proctor arrives 30 minutes prior to the scheduled time (Responsibility: AA)
4. Presence of center personnel on the day of assessment (Responsibility: Assessment Center)
5. Verify the identity of the Assessor/Proctor by scrutinizing government ID proof and company ID (Responsibility: Assessment Center)
6. Ensure the following (Responsibility: Assessor/Proctor):
i. Candidate verification
ii. Candidate briefing for the assessment
7. Ensure the assessment is video-recorded and stored, preferably through an IP-based camera (Responsibility: AA/Assessment Center)
8. Evaluation of the candidate's competencies (practical, viva, theory, any other) (Responsibility: Assessor)
9. Ensure all assessment data and evidence are collected and stored as per the requirements (Responsibility: AA/Assessment Center)
10. Managing observers on the day of assessment (representatives from the District Skill Committee, State Skill Development Mission, NSDC, SSC, etc.) (Responsibility: Assessment Center)
11. Ensure a conducive environment for the assessment, including availability of drinking water, clean toilets and adherence to health and safety guidelines (Responsibility: Assessment Center)
12. Ensure the assessment is conducted in a compliant manner and report any flags to the SSC immediately (Responsibility: AA and Assessment Center)
13. Documentation requirements and feedback collection (Responsibility: AA/Assessment Center)
14. Surprise audits and monitoring activities (Responsibility: SSC)
15. Observation visits and surprise checks (Responsibility: NSDC/SSDM/DSM)
16. Report any non-compliance to the concerned stakeholder (Responsibility: All)

Post Assessment
1. Review documentation from the day of assessment (feedback forms, Code of Conduct, etc.) (Responsibility: SSC)
2. Uploading of scores (Responsibility: AA)
3. Result validation (Responsibility: SSC)
4. Sharing of results with the TC/TP and candidates, and certificate generation (Responsibility: SSC)
5. Provide assessment reports and analysis to the SSC after each assessment (Responsibility: AA)
6. Monitoring and analysis of assessment data and TAT compliance of the AA (Responsibility: SSC)
7. Distribution of results and certificates to candidates (Responsibility: Training Center)
8. Report any non-compliance within the stipulated timeframe (Responsibility: All)
9. Address grievances and/or queries (Responsibility: All)

A sample schedule for the day of assessment is annexed (Annexure 7).


Assessment Timelines

All assessments should be scheduled using the centralized technology platform (Skill India Portal). The scheduling process involves multiple stakeholders such as the SSC, Assessment Center, AA, Assessors and Proctors. The recommended timelines for each stakeholder to take the requisite actions on the portal are covered in the table below:

S. No. | Stage of Assessment | Actor | Recommended Timeline
1 | Initiate assessment request on SIP | TP/TC | At least 30 days before the assessment date
2 | Assignment of Assessment Center | SSC | At least 25 days before the assessment date
3 | Acceptance/rejection of batch for assessment | Assessment Center | At least 22 days before the assessment date
4 | Assignment of Assessment Agency | SSC | At least 20 days before the assessment date
5 | Accept/reject batch for assessment allocated by SSC | AA | At least 15 days before the assessment date
6 | Assignment of batch to Assessor | AA | At least 10 days before the assessment date
7 | Accept/reject batch allocated by AA on SIP | Assessor | At least 7 days before the assessment date
8 | Arrival of Assessor/Proctor at the assessment center | Assessor/Proctor | At least 30 minutes before the scheduled time of assessment
9 | Result upload by Assessment Agency | Assessment Agency | Within 3 days of conduct of assessment
10 | Result approval and certificate generation | SSC | Within 6 days of result upload by the Assessment Agency

Dropouts and Absentees

Candidates who do not appear for assessments despite their names appearing in the batch list are marked as either absentees or dropouts. The two cases are distinguished as follows:
i. Dropout: Candidates are considered dropouts if they have not completed the training and hence are not appearing for the assessment. TCs are aware of which candidates are dropouts and should indicate all such candidates on the Skill India Portal at least 7 days prior to the assessment date. A candidate who has been marked as a dropout shall not be considered part of the list of candidates to be assessed.
ii. Absent: Any candidate listed in the batch who does not appear for the assessment on the assessment date, for any reason, is marked as 'Absent'. Any candidate arriving more than one hour late from the scheduled time shall not be permitted to sit for the assessment and shall be marked as absent.

Code of Conduct

Assessment Agencies
It is the duty of the AA to ensure that:
n Potential forms of conflict of interest in the assessment process and/or outcomes are identified, and appropriate
referrals are made, wherever necessary
n Assessments are conducted within the boundaries of the assessment system policies and procedures
n Candidates are informed of all known potential consequences of assessment decisions, prior to the assessment
n All forms of harassment are avoided throughout the assessment process and in the review and reporting of assessment outcomes
n Personal or interpersonal factors that are irrelevant to the assessment of competence do not influence the
assessment outcomes
n Evidence is verified against the rules of evidence
n Assessment decisions are based on available evidence that can be produced and verified
n Confidentiality is maintained regarding assessment decisions/outcomes and records of individual assessment
outcomes are only released on the explicit instructions of the SSC

Proctors
The following is the code of conduct for a proctor:
n Dress in suitable attire, and present and behave in a professional manner with all stakeholders
n Smoking, arriving at work intoxicated, and chewing tobacco, gutka, gum or betel nuts on the premises of the assessment center are strictly prohibited
n Any information that is confidential must not be disclosed to unauthorized personnel
n Any offers made to extend unauthorized favors for the benefit of the TP/candidates must be rejected and reported to the authorities immediately
n Any proctor found breaching contracts of ethics and non-bribery shall face strict action as per the defined penalty matrix, which may lead to suspension/blacklisting
n Proctors must not share passwords or assign their work to any other unauthorized personnel
n There must be no bias against anyone on the basis of color, caste, language, religion, social status, gender, disability, political or other affiliations, etc.
n Official materials, including test resources, results, equipment, etc., should be safeguarded against theft, spoilage and misplacement
Assessors
The following is the code of conduct for an assessor:
n Be fair and unbiased to all candidates at all times, regardless of race, religion, gender, background, cultural beliefs or age
n Be respectful to all candidates. Do not belittle or degrade candidates. Be culturally sensitive and adapt to the
context of the assessment
n Display trust and integrity. Integrity means to carry out duties as an Assessor in a morally correct manner
n The assessment should be conducted only on pre-defined criteria; measuring only aspects defined in qualification
and as per the understanding with the SSC and AA
n It is recommended that any competency or dimension may be measured more than once if required to clearly
establish its presence or absence in the participants
n Seek consent from candidates before undertaking any recordings. The need for any such evidence must be
explained to the candidates
n The reliability, validity and fairness of the assessment tools have to be established prior to use in an assessment
n Potential forms of conflict of interest in the assessment process and/or outcomes should be identified, and
appropriate referrals should be made, if necessary
n All forms of harassment should be prohibited throughout the assessment process and in the review and reporting of assessment outcomes
n Candidates should be made aware of their rights including processes for grievance redressal and re-evaluation
n Assessments should be scored objectively against defined rubrics and scoring matrices. Personal or interpersonal factors that are irrelevant to the assessment of competence must not influence the assessment outcomes. Assessment decisions should be based on available evidence that can be produced and verified by another assessor if required
n Assessments should be conducted within the boundaries of the assessment system policies and procedures
n Assessment systems and tools should be consistent with equal opportunity rights, indicating the assessment
process should be uniform across environments
n Confidentiality should be maintained regarding assessment decisions/outcomes and records of individual
assessment outcomes
n The assessor shall not indulge in malpractice in any form. The assessor shall not accept gifts, bribes or hospitality for any reason or purpose, nor show favour or disfavour to anyone. The assessor shall not use his/her official position to secure unwarranted privileges for him/herself, family, business associates, or any other person who would thereby benefit directly or indirectly
n In case any malpractice is observed, the assessor should report this to the concerned authorities
n Smoking, arriving at work intoxicated, and chewing tobacco, gutka, gum or betel nuts on the premises of the assessment center are strictly prohibited


Evidence Gathering

Evidence gathering is a means to monitor an assessment, whether for improving quality, meeting standard guidelines, or retrieval at a later date. Many schemes list the specific evidence needed to meet compliance requirements. Evidence in an assessment can be captured in a number of ways, including through devices, tablets, external cameras, etc. Evidence is generally collected by proctors, assessors, digital systems, auto-proctoring functionalities, etc. Evidence in an assessment should be gathered on the following fronts:

1. Candidate Validation:
n Photograph of a valid government photo ID proof held by the candidate, with both the candidate's face and the ID proof in frame. A proctor or AI-enabled tools may verify that it is the same candidate.
2. Theory Assessment:
n 3 photographs and a video, preferably from various angles of the classroom, with clear and visible image/footage of all trainees taking the assessment, time-stamped and geo-tagged
n Photograph of the assessor conducting the assessment
n For digital assessments, intermittent images of the candidate, with system flags highlighted wherever raised
n Response transcript of the candidates
n Answer logs of the candidates
3. Viva Voce:
n Video/audio snippets of at least 5 minutes for each candidate
4. Practical:
n Photograph of the outcome/final product of the practical for each candidate
n Video recording (min. 5 minutes) of the candidate performing the practical, using an IP-based camera/handheld camera/front camera
n Practical evaluation checklist
5. Group Photograph:
n A group photograph with the assessor, assessment center personnel, proctor, and candidates, with faces clearly visible
6. Infrastructure Validation:
n A photograph of a classroom able to accommodate 30 candidates as per requirement
n Photographs of tools, equipment and consumables available as per the requirements
7. Attendance Sheet:
n Copy of the attendance sheet, with date and location, clearly marking absent and dropout candidates, duly signed by the authorized signatory of the TP, TC Head, or Trainer assigned to the batch
8. Assessor Feedback Form:
n Detailed form filled in by the assessor capturing feedback on candidates, the experience of the assessment, availability of infrastructure, tools, equipment, and consumables, etc. Standard feedback templates shall be made publicly available
9. TP Feedback Form:
n Detailed form filled in by the Assessment Center capturing feedback on the assessor and the assessment process. Standard feedback templates shall be made publicly available
10. Candidate Feedback Form:
n Detailed feedback filled in by candidates on the experience of the assessment and training (wherever applicable). Standard feedback templates shall be made publicly available
11. Code of Conduct:
n Code of conduct document signed by the assessor, and signed and stamped by the Assessment Center

It is recommended that all evidence be stored digitally for at least 5 years, clearly segmented and easily retrievable.
Further, as we move towards digital assessments, certain checks earlier performed manually must now be transitioned to digital means. It is recommended that the following be adopted for all assessments:
1. Assessment attendance: Candidate attendance should be captured through the bio-authentication device
already put in practice for PMKVY STT batches, or through a geo-tagged and time-stamped image of the candidate
along with an image of the ID card.
2. Assessment logs: For all digital assessments, assessment logs should be stored. A sample assessment log is
annexed (Annexure 3).

Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs

3. Image/video logs: It is suggested that image and video logs be captured wherever feasible.
4. Web-based or app-based monitoring of assessment: It is recommended that AAs should facilitate the
monitoring of the assessments through web-based or app-based access, to view the following key activities on
ongoing assessments:
n List of candidates taking assessment with their details and images
n Live assessment logs for ongoing assessments

Results

Uploading Scores
After the assessment is completed, the scores for each batch are uploaded on the portal by AA. With the use of digital
tools for assessments, uploading of scores should be system driven with minimal human intervention in the process.
Assessments with automated scoring should be marked by the system based on pre-fed inputs. Where assessors input marks, the system should allocate the marks entered against ACs through the mapping defined in the blueprint. With respect to result generation protocols, it is recommended that
AAs should:
n Build capability to generate results centrally
n Ensure result generation team has no contact with the TP/Assessor/Proctor
n Keep the assessment data encrypted till the time of result generation
n Ensure any access to this data should be logged
n Ensure that any attempt to modify this data raises a flag or leaves a trail
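The logging and tamper-evidence recommendations above can be sketched with a hash-chained audit log, where each entry embeds the hash of the previous one, so that any later modification of a stored entry breaks the chain. This is a minimal illustrative sketch; the class and field names are our own assumptions, not part of any prescribed system.

```python
import hashlib
import json
from datetime import datetime, timezone


class AuditLog:
    """Append-only log where each entry chains the previous entry's hash,
    so modifying any stored entry invalidates the chain on verification."""

    GENESIS = "0" * 64  # placeholder hash before the first entry

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def append(self, actor: str, action: str) -> dict:
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
            "prev_hash": self._last_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash and check the chain links."""
        prev = self.GENESIS
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev_hash"] != prev:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

In such a scheme, any attempt to alter a logged access or result entry after the fact causes `verify()` to fail, leaving the required trail.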
Furthermore, it is the AA’s responsibility to ensure the security of results through encryption or other techniques. AAs should also have a robust mechanism for saving result sheets efficiently, for future reference. Ad-hoc changes and moderation of results should be discouraged.
Scores for all batches across schemes are uploaded on the Skill India Portal in the formats desired as per the scheme
and portal requirement. The Skill India Portal acts as a central MIS for all training and assessment data across all SSCs
and projects. It is recommended that manual upload of marks should be avoided to the highest extent possible,
and preferably be picked up through APIs. The cycle of result generation and upload is time-sensitive and crucial
since it indicates certification of candidates and may even be linked to payouts under certain schemes. Hence, it is
recommended that all AAs and SSCs stick to the defined timelines for score upload and declaration. The following
timelines are recommended to be followed:
n AAs should be given a minimum of 3 working days to accept the batch assigned by SSCs
n AAs should upload the scores and send them to the SSC within 6 working days from the date of completion of the assessment
n SSCs should validate and approve the result within 4 working days
Scores shared should be accompanied by supporting documents, organized systematically and easily retrievable, such as:
n Time and date-stamped videos and photos
n Practical & theory marks evidences
n Documentation & compliance requirements
n Feedback forms
n Proctoring data logs, credibility index score for online assessments
n Signed and stamped attendance sheet
Grace Marks
The passing criteria are indicated in the QP/NOS or determined by the scheme under which the QP/NOS is being assessed. The SSC and AA must follow the prescribed passing criteria for all assessments. The scoring should be defined within the assessment blueprint for the QP/NOS, and therefore grace marks should not be allocated. As far as possible, the scoring should be free from manual intervention.
No grace marks are to be allocated during QP/NOS-based assessments. To avoid arbitrary moderation of marks, clear guidelines on the extent of rounding off should be detailed in the Assessment Blueprint and calculated by the system.
It is recommended that marks be rounded to the nearest whole number (for example, rounding 69.5 to 70.0).
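The rounding rule above needs care in code: Python’s built-in round() uses banker’s rounding (round-half-to-even), so round(68.5) gives 68, not 69. A half-up rule matching the 69.5 to 70 example can be sketched as follows; the function name is our own.

```python
from decimal import Decimal, ROUND_HALF_UP


def round_half_up(score: float) -> int:
    """Round to the nearest whole number, with halves always rounding up
    (69.5 -> 70, 68.5 -> 69). Converting via str() avoids binary
    floating-point artifacts before Decimal quantization."""
    return int(Decimal(str(score)).quantize(Decimal("1"), rounding=ROUND_HALF_UP))
```

For contrast, the built-in round(68.5) returns 68, which would surprise candidates expecting conventional half-up rounding.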
Publishing Formats
It is recommended that a standard marksheet design be followed for all schemes on Skill India Portal, specifying the
scheme name under which it is published. Scores should preferably be published in an easy-to-understand format and
detail out the NOS-wise, and wherever possible, PC-wise performance. This will enable candidates and TP to review their
test performance, identify areas of strength and development, and be better prepared for the next attempt. Reports
should be automated. The following information should be published on a scorecard, which should be downloadable
by the candidates:
n Candidate details – Name, Email ID, registration ID, etc.
n Test time and duration details
n Marks/percentage/percentile scored – across various sections of the test
n Candidate response for all questions in the assessment
A sample scorecard is annexed (Annexure 4).
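The scorecard fields listed above could be represented as a simple data structure feeding an automated report; this is an illustrative sketch, and the class and field names are assumptions rather than a prescribed schema.

```python
from dataclasses import dataclass, field


@dataclass
class SectionScore:
    section_name: str      # e.g. a NOS or PC identifier
    marks_obtained: float
    marks_total: float


@dataclass
class Scorecard:
    """Illustrative container for the scorecard fields listed above."""
    candidate_name: str
    email_id: str
    registration_id: str
    test_date: str
    duration_minutes: int
    sections: list = field(default_factory=list)   # SectionScore entries
    responses: dict = field(default_factory=dict)  # question id -> chosen option

    def overall_percentage(self) -> float:
        total = sum(s.marks_total for s in self.sections)
        obtained = sum(s.marks_obtained for s in self.sections)
        return round(100.0 * obtained / total, 2) if total else 0.0
```

Section-wise entries make the NOS-wise and PC-wise breakup straightforward to render on the downloadable scorecard.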

Re-evaluation

A re-evaluation request is an appeal made by a candidate for a review of the scores obtained on an assessment.
The assessment is re-evaluated by a subject matter expert on the evidence collected (including theory, practical, viva
or any other component) and the summary of scores obtained. The outcome of a re-evaluation may lead to no change
in scores reported, an increase in scores or a deduction in scores.
Timeline:

Deadline for Application: 10 working days after the release of results
Acknowledgement: 3 working days after the receipt of the completed application
Written Outcome: 30 working days from the date of acknowledgement
How to Apply: A candidate may apply directly for re-evaluation. A re-evaluation request can be lodged a maximum of 1 time.
Fees: As prescribed by the SSC; suggested to be 50% of the assessment fee.

Re-assessment

A re-assessment, or repeated assessment, is available for all those candidates who may have been absent or
may have failed the initial assessment. The re-assessment occurs at a date after the assessment and is conducted by
an AA who did not conduct the initial assessment.
Timeline:
Deadline for Application: 15 working days after the release of results
Acknowledgement: 3 working days after the receipt of the completed application
Re-assessment: Within 90 working days from the date of acknowledgement
How to Apply: A candidate may apply directly for re-assessment. Centers may apply on behalf of the candidate only with the consent of the candidate in question.
Fees: Complete assessment fee


Quality Assurance
An assessment undergoes an entire cycle and must be brought back to the starting point through an evaluation or review. The complete assessment process is reviewed by running data analytics, checking the quality of output, and collecting feedback. The cycle may move forward once the feedback loop is complete and corrective action is undertaken.
This series of processes aims at confirming whether the outcomes of an assessment of an individual’s learning meet
predetermined criteria (standards) and that a valid assessment procedure was followed. This means that the outcomes
have been quality assured and can be trusted.

Revision of Blueprint

With continuous advancement in technology and skills within and across sectors and occupations, there is a need to regularly review and update Assessment Blueprints along with their respective QP/NOS and Assessment Criteria. The Blueprint for a particular job role might undergo revision due to any of the following reasons:
n Modifications in the QP/NOS – updates in the NOS/PCs
n Modifications in the Assessment Criteria – change in marks distribution of the overall QP/NOS/PC, or even between the theory and skill sections of a NOS
n Learner performance analytics generating suggestions for alteration of the blueprint w.r.t. question type/language/relevance/difficulty level

Item Bank Review

SSC/Awarding body shall be responsible for the development and periodic review of the question bank developed for
a specific job role. SSC/Awarding body shall be required to publish a sample question paper on their website for all
stakeholders.
The quality of the item bank created by the assessment designer needs to be validated by a minimum of 2 reviewers
on the following parameters:
Parameter: Check the appropriateness of the question and its options
n Context should be clear
n Construct of the question should be sound
n Facts, data, and information must be clear and correct
n Answer options should be clear and correct
n The item bank must be regularly reviewed for fairness
Parameter: Language check
n Check for grammatical errors
n Spelling check – both using ‘Spell Check’ and manual checking
n Capitalization and punctuation marks
n Language should be simple, avoiding any irrelevant information
n Sentence structure should not be complex
Parameter: Ambiguity
n The question should have the complete information required to arrive at the requisite answer
n The information provided should be specific enough to remove any ambiguity in answers/solutions to the question
Parameter: Relevance – assessing the topic well w.r.t. the job role
n Is the test item related to the QP/job role being assessed?
n Does the item correspond to the PC and the difficulty level?
Parameter: Scripting/Formatting Error
n Consistency in font type and font size
n Subscript and superscript markings and brackets used, if any
Parameter: Difficulty Level
n Check if the difficulty level of each question is as per the matrix
n The overall difficulty level mix is as defined in the blueprint
n Check the difficulty level assigned to items based on data analyses
Parameter: Images/Media used
n Check if the images used in the question are clear and relevant
n Check if the candidate will be aware of the item/message conveyed in the image; check for clarity of audio-video questions
Parameter: Declaration
n All variables, symbols, and abbreviations used must be declared
n Variables should be declared/explained even if obvious, unless this knowledge is being assessed as part of the item
n Abbreviations/technical terminology should be elaborated clearly
Parameter: Duplicity of answer options
n The correct answer option should be unique
n Options should not be overlapping
Table 19: Parameters for Reviewing an Item Bank

One method of validating whether scores from the assessment are as expected is through desk reviews of the Question
Bank by industry experts. Industry experts review the assessment, verify and certify that the test correctly assesses
what it purports to assess. A desk review is performed on all items in the bank by SMEs. Each SME rates each item as PASS (1)/FAIL (0) based on the parameters mentioned in Table 19. In addition to the above-mentioned parameters, the SMEs also indicate the suitability of the item types selected for optimally measuring the linked PCs, thereby indicating construct validity. All feedback is consolidated and corrective action on the items is undertaken.
The benchmarking of items and subsequent improvements/revisions to question banks is driven by post-assessment analytics.

Item Health Monitoring

It is recommended that statistical information on an item’s performance be maintained over time. The analysis of item statistics will enable the assessing body to assure the quality of the item bank and identify poorly performing items. In terms of item performance, attention should be given to the item difficulty index, the discrimination index, and the exposure level. In order to conduct item health monitoring, information on an item’s usage over a time period, including the current period, is required.
The statistics that constitute item analysis are computed from candidates’ responses to items on the assessment.
Particular attention should be paid to:
n The number of candidates who have attempted each item
n The number of candidates who have answered the item correctly
n The number of candidates who have selected each distractor
n The number of candidates who have been included in the high scoring (upper) group and low scoring (lower)
group
n The discrimination index (also known as the point biserial correlation)
A sample item analysis sheet is annexed (Annexure 10).
Difficulty Index
The facility value is measured either on a scale of 0 to 1 or as a percentage. Thus, if all candidates answer an item correctly, its facility value will be 1.00 (or 100%), while if none of the candidates make the correct response, the facility value will be 0.00 (or 0%). The nearer the facility value is to 1.00, the easier the item is for candidates.
The ideal value of an item should be between 0.2 and 0.8. It is recommended that items with a facility value below 0.40 (40%) or above 0.90 (90%) should be reviewed. A relaxation of ±0.05 can be given to the above range.


Difficulty Index    Difficulty Level of the Item/Assessment
Less than 0.2       Very difficult
0.2 – 0.4           Difficult
0.4 – 0.6           Medium
0.6 – 0.8           Easy
More than 0.8       Very easy
Table 20: Difficulty Index and Correlation with Items
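As an illustration, the facility value and the Table 20 banding can be computed from item-level response data. This is a minimal sketch with function names of our own choosing; the treatment of exact boundary values is an assumption, since the table does not specify which band a cut point belongs to.

```python
def facility_value(responses: list) -> float:
    """Facility (difficulty) index: the share of candidates answering the
    item correctly. `responses` is a list of 0/1 item scores."""
    return sum(responses) / len(responses) if responses else 0.0


def difficulty_band(p: float) -> str:
    """Classify a facility value into the Table 20 bands. Boundary values
    are assigned to the harder band by assumption."""
    if p < 0.2:
        return "Very difficult"
    if p < 0.4:
        return "Difficult"
    if p < 0.6:
        return "Medium"
    if p <= 0.8:
        return "Easy"
    return "Very easy"
```

For example, an item answered correctly by 3 of 4 candidates has a facility value of 0.75 and falls in the "Easy" band.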

If an item performs outside the parameters, it may be due to a host of factors, for example:
n The item may have been poorly constructed/edited
n The item has become over exposed
n The cohort has changed
n The examiner’s evaluation is not aligned to the expectation
Such a scenario involves a detailed question level analysis to replace unsuitable items.
Discrimination Index
The discrimination index shows how well the item discriminates between candidates of different ability. It is a
measurement of the relationship between a candidate’s score on an item and their scores on the test as a whole. The
upper group (also known as the high-scoring group) is defined as the 27% of test takers with the highest test scores.
The low-scoring group (also known as the lower group) is defined as the 27% of test takers with the lowest test scores.
Ideally, an item should be answered correctly by candidates in the upper group and incorrectly by candidates in
the lower group. By correlating item performance with performance on the test as a whole, a discrimination index for each item is calculated. The discrimination index varies between -1 and +1.
n A positive index (a value between 0 and +1) shows that candidates who made the correct response to the item tended to score well on the test. A positive discrimination index implies that the question is performing well.
n A negative index (a value between -1 and 0) is unusual and shows that candidates who made the correct response to the item tended to score poorly on the test. Particular attention should be paid to items with indices between -1.0 and 0.2 during the review.
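The upper/lower 27% method described above can be sketched as follows. This is an illustrative implementation, not a prescribed one; the text also mentions the point-biserial correlation as an alternative formulation of the same idea.

```python
def discrimination_index(candidates: list) -> float:
    """Upper-lower (27%) discrimination index for a single item.

    `candidates` is a list of (item_score, total_test_score) pairs, with
    item_score in {0, 1}. Returns a value in [-1, +1]: the proportion
    correct in the top 27% of test takers minus the proportion correct
    in the bottom 27%."""
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    n = max(1, round(len(ranked) * 0.27))  # group size, at least 1
    upper, lower = ranked[:n], ranked[-n:]
    p_upper = sum(score for score, _ in upper) / n
    p_lower = sum(score for score, _ in lower) / n
    return p_upper - p_lower
```

An item answered correctly by every candidate in the upper group and by none in the lower group yields the maximum index of +1.0.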
Reliability Index
The reliability of a test is the consistency, or dependability, with which an assessment measures what it is supposed to
measure. It can be thought of as the extent to which the scores achieved in the test can be relied on as being equivalent
to the scores achieved in another version of the test. There are various levels of reliability that can be measured in an
assessment. The Kuder-Richardson (KR-20) formula is a measurement of reliability based on inter-item consistency. This
coefficient can vary between 0 and +1. The nearer it is to +1, the more likely it is that candidates would obtain the same
score if they took another version of the same test. Ideally, coefficients should be above 0.80. It is recommended that
tests with coefficients below 0.75 should not be re-used without replacing weak items.
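The KR-20 coefficient mentioned above can be computed from a matrix of dichotomous item scores. A minimal sketch follows; the use of population variance (dividing by n) is an assumption, as the text does not specify the variant.

```python
def kr20(score_matrix: list) -> float:
    """Kuder-Richardson 20 reliability for dichotomous (0/1) items.

    `score_matrix` is a list of candidate rows, each a list of 0/1 item
    scores. Formula: (k/(k-1)) * (1 - sum(p*q) / var(total scores))."""
    k = len(score_matrix[0])   # number of items
    n = len(score_matrix)      # number of candidates
    totals = [sum(row) for row in score_matrix]
    mean = sum(totals) / n
    variance = sum((t - mean) ** 2 for t in totals) / n  # population variance
    if variance == 0:
        return 0.0
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in score_matrix) / n  # proportion correct on item i
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / variance)
```

Coefficients computed this way can then be checked against the 0.80 target and the 0.75 reuse threshold recommended above.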
Exposure Index
Monitoring how many times an item has been used in a test, and when it was last used, will help reduce the risk of excessive item exposure. Item exposure refers to the number of times that test items are displayed to candidates.
Exposure index should be used in sync with all other indices like discrimination index, reliability index and difficulty
index. All items being used for assessments should be in the acceptable zone for all indices so that the exposure rate can
be counted as legitimate for the results. If all other indices are in acceptable limit for an item, then the recommendation
for the item exposure is:
n The exposure of item in a single working day should not be over 250
n The exposure of an item should not exceed 3000 in a single calendar month
n If these limits are reached, the items should be provided a cool-off period, of about 15 days, before being used again for active assessments
n Provide a cool-off period of 1-2 months once the exposure exceeds 10,000
This limit can be defined in the selection algorithm and should also be taken into consideration while deciding the item
pool required for conducting assessments for a selected audience.
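The daily and monthly exposure limits above lend themselves to a simple check in the item-selection algorithm. This is a sketch under stated assumptions: the function name is ours, and the usage log is assumed to be a list of dates, one per exposure of the item.

```python
from datetime import date

DAILY_LIMIT = 250     # recommended exposures per working day
MONTHLY_LIMIT = 3000  # recommended exposures per calendar month


def needs_cool_off(usage_log: list, today: date) -> bool:
    """Return True if the item has hit the daily or monthly exposure limit
    and should rest for the recommended cool-off period.

    `usage_log` is a list of `date` objects, one entry per exposure."""
    daily = sum(1 for d in usage_log if d == today)
    monthly = sum(1 for d in usage_log
                  if d.year == today.year and d.month == today.month)
    return daily >= DAILY_LIMIT or monthly >= MONTHLY_LIMIT
```

A selection algorithm would skip any item for which this check returns True, drawing a replacement from the remaining pool.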
Question Retiring Policy
All the above-mentioned indices should be monitored for every QP/NOS assessment every quarter, or after a certain number of sets. Based on the reports and the item’s health, questions are revamped or discarded. It is advisable that such questions be revised first, and retired only if, after reaching a threshold of 300 exposures, they are still not performing as desired.
Trend Monitoring for Control Groups
Since the assessment batches are a discrete population set (assessment from different TPs and at different places), the
following control groups must be monitored:
n Same TP (irrespective of location): If seven consecutive pass rates of the same TP, irrespective of location, are above 85%, then the AA must change the set combination for the next assessment, or revamp the existing questions or add question set(s), and ensure items are not overexposed. The AA may also change the proctor and/or assessor for these locations and perform real-time monitoring.
n All TPs: If 10 consecutive pass rates for any job role are above 85%, then the AA must change the set combination for the next assessment, or revamp the existing questions or add question set(s), ensure the items are not overexposed, and perform real-time monitoring.
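The control-group rules above reduce to a check on the most recent run of batch pass rates. A sketch, with hypothetical function and parameter names:

```python
def flag_control_group(pass_rates: list, streak: int, threshold: float = 0.85) -> bool:
    """Return True if the last `streak` batch pass rates are all strictly
    above `threshold`.

    Per the rules above: streak=7 for a single TP (any location),
    streak=10 across all TPs for a job role."""
    if len(pass_rates) < streak:
        return False
    return all(rate > threshold for rate in pass_rates[-streak:])
```

A True result would trigger the corrective actions listed: changing set combinations, revamping or adding question sets, and tightening monitoring.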
Randomization Norms
Randomization refers to the interchanging of questions and question sets for an assessment. The randomization of
Assessments is done in the following ways:
n Introduction of new sets
n Mixing of new sets with older sets
n Revamping of older sets
Criterion Validity
Criterion validity measures how well the assessment predicts the on-the-job performance of the candidates. The
process of measuring criterion-validity is as follows:
1. The process of validation starts with the identification of a sample pool i.e. a group of existing professionals of the
job role, that will serve as the reference group for validation.
2. This sample pool is assessed using a standard assessment set and their scores are recorded.
3. Performance data of the sample pool is gathered from their respective organizations, either on their overall
performance level or on the competency level (which are related to the NOS of the QP).
4. An analysis is performed on the performance and assessment scores. The correlation is mapped, and
recommendations are then suggested for the assessments.
5. Finally, the assessments are updated based on these recommendations.
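Step 4 above amounts to computing the correlation between assessment scores and on-the-job performance ratings for the reference group. A minimal Pearson correlation sketch is shown below; a real validation study would likely use richer statistical tooling, and the function name is our own.

```python
def pearson(xs: list, ys: list) -> float:
    """Pearson correlation between two equal-length numeric sequences,
    e.g. assessment scores vs. performance ratings of the sample pool.
    Returns a value in [-1, +1]; 0.0 if either sequence has no spread."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0
```

A strong positive correlation supports the criterion validity of the assessment; a weak or negative one would drive the recommendations in step 4.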


Malpractice & Grievance Redressal

Malpractice can be defined as an act of improper practice, including maladministration. It is an activity, practice or omission which is either willfully negligent or contravenes regulations and requirements. Some examples of malpractice at the assessment location, by staff, candidates, assessors, etc., could be:
Training Center Staff
1. Influencing the assessment or certification process, including:
n The unauthorized obtaining, disseminating, or facilitating of access to secure examination/assessment materials
n Amending learners’ answers for any examination
n Assisting or prompting learners in the production of answers during an examination
n Any action or deliberate inaction that allows learners to have an unfair advantage
n Falsification of learners’ marks, signatures, assessment evidence, or results documentation
n Offering a bribe of any kind to an invigilator, assessor, or any staff of the assessment body
n Submission of misleading reports, which may lead to incorrect conclusions
2. Failure to meet the requirements for the conduct of examinations, including:
n Unauthorized changes to examination timetable
n Non-adherence to invigilation requirements
n Amendment of examination materials without permission
n Failure to supervise examinations effectively and continuously (especially where formative assessments are
concerned)
Candidates
1. Breach of examination or assessment rules, regulations, and requirements, including:
n Plagiarism of any nature
n False declaration of authenticity in relation to the contents of a portfolio or course work
n Deliberate destruction or tampering with a learner’s work or assessment records
n Obtaining or attempting to obtain secure exam materials from examination room
n Forging another learner or staff signatures
2. Inappropriate conduct during an examination or assessment session, including:
n Introduction of unauthorized material or devices into the examination room
n Misuse or attempted misuse of examination/assessment material
n Disruptive, violent, and offensive behavior
n Any form (verbal, written, gesture, pointing, etc.) of communication with other learners
n Failure to abide by the instructions of the invigilator or assessors
Assessor/ Proctor/Invigilator
An assessor/proctor/invigilator may be suspected of involvement in malpractice in the instances below, and can be reported by a candidate or by the center:
1. Assessor not adhering to the examination guidelines and rules.
2. Assessor demanding a bribe from the learner or center to favour an outcome.

3. Assessor showing indifference or bias in center activities.
Assessment Agency
The instances below are considered malpractice by an AA:
1. Allocating untrained, inexperienced, or uncertified assessors/proctors/invigilators for assessment work.
2. Allocating assessors/proctors/invigilators whose contracts have expired for practical work.
3. Modification by the assessment agency of the scores shared by the assessor/proctor/invigilator.
4. Allegations of the assessment body’s involvement in bribery or corruption.
Grievance Redressal
The AA, SSC and NSDC shall be responsible for setting up independent cells to address grievances and cases of
malpractice. Each AA is expected to host an AA Investigation and Compliance Team within the body, that is separate
from the teams conducting operations. All SSCs are expected to set up a Grievance Redressal Cell. Contact information
and all details relevant to the process of grievance redressal should be made publicly available on the AA’s and SSC’s
websites. A stakeholder may report grievances or any suspect case of misconduct or malpractice at these cells within 2
working days from the day of completion of assessment.

Assessment Agency
The following stakeholders may report cases to the AA Investigation & Compliance teams:
n Proctor or any other AA staff
n Assessor
Note:
n The AA Investigation & Compliance cell must document all reported cases through a monthly report to the SSC. Cases of severe magnitude should be reported to the SSC within 3 working days.
n The AA Investigation & Compliance Cell may escalate cases to the SSC if required

Sector Skill Council
The following stakeholders may report cases directly to the SSC Grievance Cell:
n Assessment Center
n Training Center
n Candidate
n Any stakeholder not satisfied with the resolution offered by the AA Investigation & Compliance team
Note:
n The SSC Grievance Cell must document all cases reported directly to the SSC or to the AA through a quarterly report to NSDC.
n The SSC must respond to the grievant within 15 working days of receiving the complaint

NSDC
The following stakeholders may report cases directly to the NSDC grievance redressal and monitoring team:
n Members of the public
n Any stakeholder not satisfied by the resolution offered by the SSC Grievance Cell
Note:
n The NSDC Grievance Cell may choose to forward the complaint received to the SSC or AA for redressal


The following table reflects the penalties that may be imposed when malpractice is established, and their applicability to different stakeholders. A range of penalties is listed; the penalty imposed depends on the extent of the wrongdoing, and must be approved by the SSC.
Offender: Learner
- Written warning
- Results cancelled
- Disqualification from the qualification for a set period
- Barred from entering examinations for a set period
Offender: Assessment Center
- Written warning
- Withdrawal of approval for specific qualification(s)
- Suspension
- Blacklisting (removal of A&A approval)
Offender: Proctor/Assessor/AA Staff
- Termination of contract with AA
- Suspension
- Blacklisting
Offender: Assessment Agency
- Written warning
- Financial penalty
- Termination of contract with SSC
- Suspension
- Blacklisting

Accessibility in Digital Assessments

Accessibility must be considered from the outset when designing assessments; otherwise, differently-abled candidates could be disadvantaged.
Accessibility in digital assessments refers to making web content more accessible to differently-abled people. Disability ranges across visual, auditory, physical, speech, cognitive, language, learning, and neurological disabilities, and each candidate’s digital needs could be different. It is recommended that assessments align with the Web Content Accessibility Guidelines (WCAG), an internationally recognized set of accessibility requirements. WCAG compliance ensures that a website is accessible to everyone, irrespective of disability and age. AAs and SSCs are recommended to work towards reaching Level A in WCAG compliance.
The digital assessment should allow learners to use their own assistive technologies to access questions and evidence
their answers. The basic principles of accessibility state that web content must be:
1. Perceivable: Information and user interface components must be presentable to users in ways they can perceive.
This means that users must be able to perceive the information being presented, i.e., it should not be invisible to all
of their senses.
2. Operable: User interface components and navigation must be operable. This means that users must be able to
operate the interface, and not necessitate actions that the user cannot perform.
3. Understandable: Information and the operation of the user interface must be understandable to the user.
4. Robust: Content must be robust enough that it can be interpreted reliably by a wide variety of user agents,
including assistive technologies. This means that users must be able to access the content as technologies advance.

Data Security

AAs may provide a declaration clearly indicating adherence to data governance policies and encryption guidelines for maintaining information security and data privacy, as defined under global standards such as ISO 27001 (data security, privacy, and audit requirements) and ISO 9001 (quality management systems).
It is the AA’s responsibility to ensure confidentiality with respect to the item bank, results, candidates’ details, and other sensitive information. In this regard, SSCs should carefully examine and validate data security and privacy practices at empaneled AAs in the following key areas:
i. Database Management: Security and robustness of the database used by an organization as a method of storing, managing, and retrieving information, automated and with minimal manual intervention.
ii. Data Access Controls: Details of access and authentication with the following classifications:
n Restricted – to be shared with pre-defined stakeholders only
n Confidential – can be shared with designated stakeholders but not for circulation
n Internal – to be shared with stakeholders within the skill ecosystem
n Public – can be made available in the public domain
iii. Database Credential Management: Control over flow of data including approval, monitoring and access.
iv. Data Encryption: A robust mechanism to ensure enhanced security of sensitive data through encryption
mechanisms.
v. Threat Detection: Procedures for raising flags, categorizing threat levels, and determining mitigation techniques.
vi. Database Backup & Recovery: Protocols for ensuring data back-up and recovery in case of data loss.
vii. Data Portability: Mobility of data between different application programs, computing environments or cloud
services.
viii. AA’s IT Assets Policy: IT management and security policies on IT equipment provided to employees, covering misplaced devices, limits on access, etc.
ix. Audit Process: Protocols on quality and utility audit of assessment data for Quality Assurance.
x. Digitization of Data: Collection, storage and retention in soft formats.

Reports and Analytics

The assessment analytics should be used as the final link to complete the feedback loop. The following reports should
be made available:
1. Batch-wise Analysis: Each assessed batch should be provided a report outlining AC-wise and NOS-wise performance of the complete batch and individual candidates. The report should highlight weak and strong performing PCs as per the results of the candidates in the batch. This report may serve as feedback for trainers to identify points of intervention and points of strength for upcoming batches, and show failed candidates which areas to focus on in a possible re-assessment.
2. Overall TP Analysis Annual Report: Each TP should be provided a report for all QPs for which they conduct training. This report should highlight PCs in which most candidates of the TP performed below the passing criteria, and those where candidates performed well. There should also be a TC-specific analysis. Candidates who have reappeared for a QP assessment during the year should be highlighted, and a comparison could be drawn between subsequent attempts to indicate any improvements. The report should be published each quarter to enable improvements in training with every batch.
3. Item Analysis Report: For a specific item bank of a QP, the AA must submit a quarterly report to the SSC on
item-wise responses across attempts. This report will highlight the overall performance of the item bank
and pinpoint items that need revision. A sample item analysis report is depicted in Figure 7.

60
Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs

Figure 7: Sample Item Analysis Report

Label in figure Description


1 Proposed Difficulty level of question as mentioned in blueprint
2 Actual Question
3 The Answer Options given in the test
4 Correct Answer Option
5 Percent of responses for each answer option
6 Actual Difficulty level as per the responses received for the Question
7 Deviation – the percentage by which responses deviated from the intended difficulty level
8 Comments – actions to be taken based on the performance of the question
Table 21: Description of Sample Item Analysis Report
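As an illustrative aid (not part of the Guide), the difficulty-deviation logic behind such an item analysis report could be sketched as follows. The band thresholds and field names here are assumptions, not NSDC-prescribed values:

```python
# Sketch: derive actual difficulty and a revision flag for one item.
# Thresholds and field names are illustrative assumptions only.

def actual_difficulty(pct_correct):
    """Classify an item by the share of correct responses."""
    if pct_correct >= 0.70:
        return "Easy"
    if pct_correct >= 0.40:
        return "Medium"
    return "Hard"

def analyse_item(item):
    responses = item["responses"]            # e.g. {"A": 10, "B": 75, ...}
    total = sum(responses.values())
    correct = responses[item["correct_option"]]
    pct_correct = correct / total
    actual = actual_difficulty(pct_correct)
    return {
        "question": item["question"],
        # Percent of responses for each answer option
        "option_share": {k: round(v / total, 2) for k, v in responses.items()},
        "pct_correct": round(pct_correct, 2),
        "actual_difficulty": actual,
        # Flag when actual difficulty deviates from the blueprint's intent
        "flag_for_revision": actual != item["intended_difficulty"],
    }

report = analyse_item({
    "question": "Q1",
    "intended_difficulty": "Hard",
    "correct_option": "B",
    "responses": {"A": 10, "B": 75, "C": 10, "D": 5},
})
```

Here the item was intended to be Hard, but 75% of candidates answered it correctly, so it is classified Easy and flagged for revision.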

Reports should be automated and sent to the relevant stakeholders, such as TPs, SSCs, NSDC, etc., depending on the
type of report. The SSC may request custom reports and analytics for quality assurance or to manage escalations.
Such reports should be made retrievable by the SSC.
General assessment analytics reports should typically contain the following details:
1. Details of every NOS/PC/module/topic in the assessment
2. All questions appearing in the assessment
3. Difficulty level against each question
4. Number of times question appeared in assessment
5. Number of times question attempted by candidate

6. Number of times question correctly attempted by candidate
7. Number of times each option been chosen
8. Percentage of correct attempts
9. Average time spent by candidates on a question – for correct and incorrect attempts
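As an illustrative sketch (not part of the Guide), the question-level fields listed above could be aggregated from raw attempt records along these lines; the record shape is an assumption:

```python
from collections import defaultdict

# Sketch: aggregate per-question analytics from attempt records.
# Each record is assumed to be:
#   (question_id, option_chosen, is_correct, time_spent_seconds)

def question_analytics(records):
    stats = defaultdict(lambda: {
        "attempts": 0, "correct": 0,
        "option_counts": defaultdict(int),
        "time_correct": [], "time_incorrect": [],
    })
    for qid, option, is_correct, secs in records:
        s = stats[qid]
        s["attempts"] += 1
        s["option_counts"][option] += 1           # times each option was chosen
        if is_correct:
            s["correct"] += 1
            s["time_correct"].append(secs)
        else:
            s["time_incorrect"].append(secs)
    out = {}
    for qid, s in stats.items():
        out[qid] = {
            "attempts": s["attempts"],
            "pct_correct": s["correct"] / s["attempts"],
            "option_counts": dict(s["option_counts"]),
            # Average time split by correct vs incorrect attempts
            "avg_time_correct": (sum(s["time_correct"]) / len(s["time_correct"])
                                 if s["time_correct"] else None),
            "avg_time_incorrect": (sum(s["time_incorrect"]) / len(s["time_incorrect"])
                                   if s["time_incorrect"] else None),
        }
    return out

summary = question_analytics([
    ("Q02", "B", True, 41),
    ("Q02", "C", False, 30),
    ("Q03", "A", True, 4),
])
```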

Descriptive Analysis for the Batch


Batch Summary

Batch Id: 0
Scheme of Assessment: Non-PMKVY
Pass Criteria: 70% Overall
Assessment Date: Oct 16 2019
QP / Job role: SSC/QXXXX XXXX Level 4

QP Result (Gender-wise Analysis)

            Total   Male   Female
Enrolled    30      23     7
Appeared    30      23     7
Passed      29      23     6

NOS SUMMARY

NOS          % Students Passed   Mean   Minimum        Maximum        Median   SD    Mode
             (On Appeared)       (%)    Obtained (%)   Obtained (%)   (%)
SSC/NXXXX    97%                 79%    62%            90%            79%      5%
SSC/NXXXX    100%                91%    74%            100%           81%      7%
SSC/NXXXX    93%                 82%    63%            100%           81%      8%
QP Code-
SSC/QXXXX    97%                 82%    65%            88%            83%      5%

POOR PERFORMANCE IN PCs (<50% OF CANDIDATES ANSWERED THE PC CORRECTLY)

NOS 1 (Code): PC7, PC16, …
NOS 2 (Code): PC17, PC2, …
NOS 3 (Code): …

MODERATE PERFORMANCE IN PCs (50% TO <70% OF CANDIDATES ANSWERED THE PC CORRECTLY)

NOS 1 (Code): PC12, PC18, …
NOS 2 (Code): PC8, …
NOS 3 (Code): PC5, …

STRONG PERFORMANCE IN PCs (>70% OF CANDIDATES ANSWERED THE PC CORRECTLY)

NOS 1 (Code): PC1, PC2, PC3, PC4, PC5, PC6, …
NOS 2 (Code): PC1, PC2, PC3, PC4, PC5, PC6, …
NOS 3 (Code): PC1, PC3, PC4, PC6, PC7, …

Figure 8: Sample Analytics Report
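A minimal sketch of how the NOS summary statistics and the PC performance bands in such a batch report might be derived. The 50%/70% cut-offs follow the bands shown in the sample; the data shapes and the 70% pass mark per NOS are assumptions:

```python
import statistics

# Sketch: NOS-wise summary and PC performance banding for a batch report.
# Bands follow the sample report: <50% poor, 50-<70% moderate, >=70% strong.

def nos_summary(scores, pass_mark=70):
    """scores: list of candidate percentages for one NOS."""
    return {
        "pct_passed": sum(s >= pass_mark for s in scores) / len(scores),
        "mean": statistics.mean(scores),
        "min": min(scores),
        "max": max(scores),
        "median": statistics.median(scores),
        "sd": statistics.pstdev(scores),
    }

def band_pcs(pc_correct_rates):
    """pc_correct_rates: {pc_id: fraction of candidates answering correctly}."""
    bands = {"poor": [], "moderate": [], "strong": []}
    for pc, rate in pc_correct_rates.items():
        if rate < 0.50:
            bands["poor"].append(pc)
        elif rate < 0.70:
            bands["moderate"].append(pc)
        else:
            bands["strong"].append(pc)
    return bands

bands = band_pcs({"PC1": 0.92, "PC7": 0.40, "PC12": 0.60})
summary = nos_summary([70, 80, 90])
```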


Figure 9: Sample NOS-wise Analytics

Figure 10: Sample Analytics Dashboard

Figure 11: Sample Analytics Dashboard 2

Figure 12: Sample Analytics Dashboard 3


Conclusion
This Guide attempts to lay the base for setting processes and protocols in place for skill assessments in the short-term
skill system, regardless of the scheme under which they are administered, whether Government funded, fee-based,
CSR-funded, or any other. It is envisaged that the frameworks in this Guide shall bring uniformity and standardization
in how assessment design, planning, administration and quality assurance are approached, all the while leaving enough
flexibility for variations across domains, sectors, geographies, target groups, and other factors.
With this initial step towards strengthening skill assessments in India, the Guide concludes with the conviction that the
design and delivery of skill assessments shall continue to evolve and improve with the committed efforts of Regulators,
Sector Skill Councils, Awarding Bodies, Assessment Agencies, Assessors and Experts.

Annexures
Annexure 1: Tips for Constructing Effective Items

1. Stem: A question stem is a sentence, phrase, or word that asks for information and requests a response to test
someone’s knowledge. The following should be kept in mind while constructing effective stems:
n The stem should be meaningful by itself and should present a definite problem. A stem that presents a
definite problem allows a focus on the learning outcome.
n The stem should not contain irrelevant material, which can decrease the reliability and the validity of the test
scores.
n The stem should be negatively stated only when significant learning outcomes require it. Students often
have difficulty understanding items with negative phrasing. If a significant learning outcome requires negative
phrasing, such as identification of dangerous laboratory or clinical practices, the negative element should be
emphasized with italics or capitalization.
n The stem should be a question or a partial sentence. A question stem is preferable because it allows the
student to focus on answering the question rather than holding the partial sentence in working memory and
sequentially completing it with each alternative. The cognitive load is increased when the stem is constructed
with an initial or interior blank, so this construction should be avoided.
2. Options: In multiple choice questions, the stem needs to be answered by selecting an option, which could be the
correct answer or a distractor. A distractor is a plausible option but an incorrect answer.
n All options should be plausible. The function of the incorrect options is to serve as distractors, which should
be selected by students who did not achieve the learning outcome but ignored by students who did achieve
the learning outcome. Options that are implausible don’t serve as functional distractors and thus should not be
used. Common student errors provide the best source of distractors.
n Options should be stated clearly and concisely. Items that are excessively wordy assess students’ reading
ability rather than their attainment of the learning objective.
n Options should be mutually exclusive. Options with overlapping content may be considered “trick” items by
test-takers, excessive use of which can cause unnecessary confusion.
n Options should be homogenous in content. Options that are heterogeneous (varied and dissimilar) in content
can provide cues to a student about the correct answer.
n Options should be free from clues about which response is correct. Sophisticated test-takers are alert to
inadvertent clues to the correct answer, such as differences in grammar, length, formatting, and language choice
in the options. It is therefore important that options
- have grammar consistent with the stem
- are parallel in form
- are similar in length
- use similar language
n The number of options can vary among items if all options are plausible. Plausible options serve as
functional distractors – those chosen by students who have not achieved the objective but ignored by
students who have. There is little difference in difficulty, discrimination, and test score
reliability among items containing two, three, and four distractors.
n Avoid including an alternative that is significantly longer or shorter than the rest. In line with keeping
options homogenous, it is also advisable that the response options are similar in length to ensure that they are
effective.
n Avoid complex multiple-choice items, in which some or all the alternatives consist of different combinations
of options. As with ‘all of the above’ answers, a sophisticated test-taker can use partial knowledge to achieve a
correct answer.
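Some of the option-writing guidance above – keeping alternatives similar in length so no outlier gives the answer away – can be screened mechanically. The sketch below is illustrative only; the 1.5x median-length threshold is an arbitrary assumption, not a prescribed rule:

```python
import statistics

# Sketch: flag MCQ options noticeably longer/shorter than the rest.
# The 1.5x median-length ratio is an illustrative assumption.

def length_outliers(options, ratio=1.5):
    """Return options whose character length deviates from the median by `ratio`."""
    med = statistics.median(len(o) for o in options)
    return [o for o in options
            if len(o) > med * ratio or len(o) < med / ratio]

# The fourth option is conspicuously longer than the others and gets flagged.
flags = length_outliers([
    "Kolkata",
    "Mumbai",
    "Chennai",
    "New Delhi, the capital city of India since 1911",
])
```

An item writer could run such a check over a draft item bank and manually review any flagged options before finalization.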


3. MCQs to Assess Higher Order Thinking: When drafting multiple choice items to test higher-order thinking, it is
essential to ask questions aligned to Bloom’s Taxonomy. A stem that presents a problem that requires application
of course principles, analysis of a problem, or evaluation of alternatives is better suited to assess higher order
thinking. It can also be helpful to design problems that require multilogical thinking, defined as “thinking
that requires knowledge of more than one fact to logically and systematically apply concepts
to a …problem”. Finally, designing alternatives that require a high level of discrimination can also contribute to
multiple choice items that test higher-order thinking.
4. Viva voce: A viva voce item is an oral questioning method used to assess certain parameters, based on which an
Assessor evaluates a candidate. Some such parameters are indicated below:
n Knowledge: Conceptual grasp on the knowledge of the subject
n Correctness: The accuracy of the responses
n Communication skills: The ability to convey thoughts and ideas effectively
n Presentation skills: The ability of a candidate to deliver the ideas effectively
A viva item will have a question stem and a correct answer(s). The complexity of the viva questions varies based on
the job role and the NSQF level.
5. Practicals: A practical is usually evaluated using two assessment methods: Demonstration and Role Play. The
candidate must clearly follow certain steps to earn marks in an assessment. The assessor can evaluate the candidate,
taking into account the following:
n Accuracy: The quality of being precise
n Ability to perform: The competency of a candidate to display their skills
n Knowledge: Conceptual grasp on the knowledge of the subject
n Decision-making: The candidate’s ability to take decisions
n Problem-solving: The candidate’s ability to find an effective solution to complex problems
n Presentation skills: The candidate’s ability to deliver the ideas effectively
n Time management skills: The candidate’s ability to manage the time efficiently and productively
n Process adherence: The candidate’s ability to follow the process and abide by the structure
n Safety measures: The precautions undertaken to ensure safety throughout the assessment

Annexure 2: Instructions for Candidates Taking Digital Assessments

Before the Test


1. The assessments can be conducted on a computer/tablet/mobile phone. The candidate should review the system
(hardware and software) compatibility requirements for the assessment before starting the test.
2. The candidate should ensure their device is fully charged and/ or connected to power supply for the duration of
the test.
3. The candidate must also close all other programs, chat windows, screen savers etc. on their computer/laptop
before starting the exam.
4. The candidate must keep handy the ‘Aadhar Card’ or any other Government-issued ID proof to enable identity
verification by the System/Proctor. The candidate may not be allowed to take the assessment without this
verification.
5. The candidate will not be allowed to use any other electronic device during the test.
6. If taking the test from a location other than a test center, the candidate should determine a comfortable location
from where to attempt the test. They should select a location that is airy and has ample light, and one where they
are likely to not be disturbed during the duration of the test.

7. Sample tests/walk-throughs will be provided to the candidate by the AA to help them understand the test
platform and check their system’s compatibility with it.
During the Test
8. Candidate must access the assessment link at the stipulated time (as mentioned in the invitation email).
9. Candidate must enter correct details, such as Name, ID, etc., in the correct fields. Any mismatch might lead to
cancellation of their attempt.
10. Candidate must read the instructions carefully, take note of the time of exam, navigation through the questions,
important tabs to use while submitting, reviewing the questions etc.
11. The assessment is timed and will automatically shut down after the time is over. Candidate should keep an eye on
the clock and keep working through the questions.
12. The candidate will be provided the opportunity to review their submissions/responses before finishing the
assessment. It is advised that candidates avail this opportunity before completing their test.
13. In case of a power failure while attempting the test, candidate would be able to log back in and resume the test from
the point where they got disconnected. All the previous answers would automatically get saved. The candidate will
not be penalized for the time lost.
14. During the exam, sophisticated proctoring software will monitor the candidate’s activities throughout the test.
The candidate must ensure their webcam/mobile camera is switched on for the entire duration of the test.
15. This software will record the feed for any red flags that indicate malpractice using advanced audio & video analytics.
Multiple red-flags or warnings during the test might lead to candidate being logged out of the test.
Key Don’ts that candidates must avoid
16. Non-adherence to these instructions, rules and regulations might lead to the candidature being cancelled. As
such, candidates are strongly advised not to resort to any unfair practices during the exam.
17. In case of any issue during the exam, the candidate can connect with the Proctor or use the phone/email
helpline details provided in the test invitation.

Χ Search for solutions to the question(s) on the Internet
Χ Seek help from others present in the room
Χ Copy the questions / take screenshots / take pictures with a mobile phone
Χ Browse other websites
Χ Use documents/tools/tutorials/e-books available on the computer
Χ Copy-paste text
Χ Use any device apart from the device on which the assessment is being administered
Χ Plug another monitor/keyboard/mouse into the existing system
Χ Look away from the test screen
Χ Not be present in front of the screen
Χ Change IP by logging in from another system during the test
Χ Use screen-share or AnyDesk-type tools to provide remote access


Annexure 3: Sample Assessment Logs for Online Assessments

Sample log file: Please note that these logs are indicative and should not be considered comprehensive representations
of assessment logs.
Raw format

S. No.  Person-     Timestamp  Event name        Question  Answer  Time difference  State before  State after
        identifier                                                 (in seconds)
1       000001      09:20:06   ITEM_START        –         –       –                Starting      Stem_Q02
2       000001      09:20:47   ANSWER_SELECTION  Q02       0       41               Stem_Q02      Q03
3       000001      09:20:51   ANSWER_SELECTION  Q03       1       4                Q03           Q01
4       000001      09:20:56   ANSWER_SELECTION  Q01       3       5                Q01           Q02
5       000001      09:21:02   ANSWER_SELECTION  Q02       2       6                Q02           Q04

Readable format

S. No.  Candidate ID  Timestamp  Event name          Question  Answer  Time difference (in seconds)
1       000001        09:20:06   Assessment started  –         –       –
2       000001        09:20:47   Question attempted  Q02       0       41
3       000001        09:20:51   Question attempted  Q03       1       4
4       000001        09:20:56   Question attempted  Q01       3       5
5       000001        09:21:02   Question attempted  Q02       2       6
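The mapping from the raw log format to the readable format shown above could be scripted as follows. This is a sketch; the field names and the event-label mapping are assumptions drawn from the sample tables:

```python
# Sketch: convert raw assessment log events into the readable report format.
# Event-name mapping mirrors the sample tables; it is illustrative only.

EVENT_LABELS = {
    "ITEM_START": "Assessment started",
    "ANSWER_SELECTION": "Question attempted",
}

def to_readable(raw_rows):
    """raw_rows: list of dicts keyed by the raw-format column names."""
    readable = []
    for i, row in enumerate(raw_rows, start=1):
        readable.append({
            "s_no": i,
            "candidate_id": row["person_identifier"],
            "timestamp": row["timestamp"],
            # Unknown event names pass through unchanged
            "event": EVENT_LABELS.get(row["event_name"], row["event_name"]),
            "question": row.get("question"),
            "answer": row.get("answer"),
            "time_diff_s": row.get("time_difference"),
        })
    return readable

rows = to_readable([
    {"person_identifier": "000001", "timestamp": "09:20:06",
     "event_name": "ITEM_START"},
    {"person_identifier": "000001", "timestamp": "09:20:47",
     "event_name": "ANSWER_SELECTION", "question": "Q02",
     "answer": 0, "time_difference": 41},
])
```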

Annexure 4: Sample Scorecard


Annexure 5: Proctor’s Handbook

Code of Conduct
The following is the code of conduct for a proctor:
n Dress in suitable attire; present and behave in a professional manner with all stakeholders
n Smoking, arriving at work intoxicated, and chewing tobacco, gutka, gum or betel nuts on the premises of the
assessment center are strictly prohibited
n Any information that is confidential must not be disclosed to unauthorized personnel
n Any offers made to extend unauthorized favors in the benefit of TP /candidates must be rejected and reported
to authorities immediately
n Any proctor found breaching contracts of ethics and non-bribery shall face strict action as per the defined
penalty matrix, which may lead to suspension/blacklisting
n Proctor may not share passwords or assign their work to any other unauthorized personnel
n There must be no bias against anyone on the basis of color, caste, language, religion, social status, gender,
disability, political and other affiliations, etc
n Official materials including test resources, results, equipment, etc. should be safeguarded from loss by theft,
spoilage and misplacement
Standard Operating Procedures for Proctors for On-Ground Assessments
1. Procedures & communication
n Be aware of all the requirements of the proctor
n Ensure that all assessment materials are as per the checklist provided by the AA
n Communicate assessment instructions and information of proctor’s arrival/ remote proctoring to the
assessment center
n Escalate any unanswered queries to the AA and seek clarifications to communicate them to the Assessment
Center, TC or the candidates
n Inform the AA immediately in case you are unable to proctor the assessment
n Do not hand over proctoring assignments to any person without the authorization of the AA
n Schedule the initiation and closure of the assessment as per the instructions of the AA and communicate the
same to the Assessment Center
n Ensure arrival/ readiness a minimum of 30 minutes before the scheduled time of assessment or as per the
specifications provided by the respective SSC / AA
2. Documentation and pre-requisite checks
n Set-up the application and/or device wherever required for conduct of assessment. Diagnose and troubleshoot
issues wherever necessary. Ensure compatibility of assessment software on devices
n Verify if the assessment location/candidate’s device has all the facilities as per the checklist provided by the AA
n Carry out the pre-assessment documentation activities as per the instructions of the AA
n Ensure that the candidates are ready at least 20 minutes before the commencement of the assessment as per
the instructions provided by the AA
n Instruct candidates to keep their belongings outside their immediate vicinity, except the equipment/items that
may be essential during the assessment
n Provide appropriate IT equipment such as tablets/smartphones/computers/laptops, etc, to candidates,
wherever applicable

n Provide instructions to candidates to register/sign in to the assessment software
n Keep assessment materials and/or access password secure until candidate is present and ready to begin the
assessment
n Before the commencement of assessment, verify student identity using one of the following forms of official
photo identification:
- Original Aadhar card
- Any other photo identity card authorized by the scheme guidelines/SSC/AA
n The attendance of the candidates has to be marked on the assessor app/pro forma and/or electronic pro
forma provided by the AA
n Review guidelines for the assessment provided by the AA and communicate these guidelines to the student
prior to the start of the assessment
n Examine whether the TC/other location/home location is conducive to assessments. Administer the assessment
in an area with few distractions and, where needed, appropriate testing equipment (e.g., a computer and
high-speed Internet access)
n Examine whether CC/IP cameras are installed at the key locations and functioning as required, such that the
candidates and the exam hall/TC/home location can be monitored. If CC/IP-based cameras are not present in
the exam hall/TC/home location, communicate the inability to monitor the assessment hall through CC/IP
cameras and seek directions from the AA on the way forward
n In assessor-led assessments, assist assessor in verifying and calibrating the domain tools & equipment
n In assessor-led assessments, assist assessor in setting up the tools/instruments/equipment as per the
specifications provided by the assessor
n In assessor-led assessments, commence the assessment only after obtaining an approval from the assessor
3. Monitoring procedures
n In case of physically proctored assessments, collect any unauthorized objects from the candidate and store
them in a secure location for the duration of the assessment
n In case of physically proctored assessments, remain in the room with the students throughout the completion
of the assessment
n Ensure video and audio recording of the assessment/practical task/activity is carried out in alignment with the
specifications provided by the AA. Candidate should be made aware of the recordings that shall be undertaken
n Do not provide any assistance on the subject matter to the student in completing the assessment
4. Recording/collection of evidences
n Record the AV of the evidences as per the guidelines prescribed by the AA and ensure that the evidences are
stored uniquely for every candidate
n Ensure that the AV/Physical evidence is the candidate’s own work, else report any concerns to the AA
n Collect physical evidences, if any, as and when they are completed, and mark them with the candidate’s
identification. The marking done on the evidences must be permanent and not susceptible to tampering
n In assessor-led assessments, proctors may assist the assessor in recording and documenting observations
during the course of practical assessments
5. Handling situations of indiscretions
n Make note of any indiscretions by the candidate with respect to the assessment and halt the assessment for
such a candidate. Report the incident to the AA. Be sure to include as many details as possible, including, but
not limited to, the time of the indiscretion, screenshots of any web pages visited, a description of the behavior, etc.
Examples of concerns include:


a. Noting down the questions/attempting to copy the screen


b. Using unauthorized materials during the assessment
c. Seeking assistance from another person during the assessment without approval
d. Leaving the testing area without approval during the assessment
e. Providing false identification or substituting for another person to take the assessment
f. Referencing notes or books not authorized during the assessment
g. Using an unauthorized electronic device
h. Visiting unauthorized websites or using unauthorized computer programs
6. Documentation
n Ensure that candidates have completed the feedback form provided
n Collect and return any equipment belonging to Center, wherever required
n Complete the post assessment documentation with the Center SPOC as per the scheme guidelines or
instructions of AA, if any
n If the assessment or documentation task involves use of paper, collect the same and any allowed materials
(such as scratch paper or formula sheets, ready reckoners, forms etc.) immediately at the end of the testing
time frame
n Make copies of physical papers (preferably digitally) and return the originals to the AA when possible. Retain
copies of such documents/papers until receipt confirmation is received from the AA. After confirmation is
received, shred any physical copies
7. Submission of digital/online assessments
n Ensure that candidates log out of the assessment immediately after the completion of the assessment time
n Collect any electronic assessment device such as tabs/smartphones that belong to AA, if applicable
n Make sure that the work of the candidates is saved on the server of the AA
n In case of any discrepancy in the data transfer, report immediately to the authorities of the AA
8. Evidence handling
n Check the correctness of the AV evidences and ensure the evidences are named appropriately with the
candidate identification
n Upload the AV evidences on the server of the AA wherever applicable
n Handover any physical evidences to the AA as per the guidelines provided by them

Annexure 6: Revised AA Empanelment Matrix

1. SOP: Criteria for the Empanelment of Assessment Agencies by SSCs


Revised: August 2020
Objective: The objective of this SOP is to define standard criteria for accrediting Assessment Agencies that have
the capabilities and experience to assess trainees trained in outcome-oriented training for job roles in line with
QPs/NOSs, and the potential to undertake assessments as per the structured procedures.
I. Essential Requirements
a) Affiliation procedure for AAs must be transparent, demonstrative (with evidence) and in line with
international best practices
b) As a pre-qualifier, all AAs must possess capabilities to conduct online and digital assessments
c) SSCs will select AAs from the pool of AAs empaneled with NCVET. SSCs should duly communicate/ publicize
when they undertake affiliation with AAs, clearly defining timelines for submission of applications and when
the process is expected to be completed. As much as possible, the application should be collected and
evaluated digitally.
d) This process will apply to the potential as well as all existing AAs
II. Conflict of Interest
a) A Training Provider cannot ordinarily be appointed as an Assessment Agency
b) Monopoly or cartelization in assessment must not be allowed
c) No sub-contracting or franchising would be permissible for AAs
III. Minimum Number of AAs and Cap on Target Allocation
a) The number of affiliated AAs with each SSC must not fall below five (05) at any given time.
b) Target allocation under government-funded schemes to a single AA must not exceed 25% of the total
assessments undertaken through that SSC in any FY. Target allocation of assessments to individual AAs should
be communicated to AAs quarterly.
IV. Periodic Audit
a) It is mandatory for all SSCs to carry out a minimum of one operations audit of each AA in every financial year.
The cost of conducting the audit will be borne by the respective SSC.
b) In case shortcomings are found during the audit, the AA will be intimated to take corrective actions within 3
months, failing which the AA may be temporarily or permanently de-affiliated.
c) In the event of complaints/reports of poor performance/unfair practices against an AA, the respective
SSC reserves the right to conduct a special audit. The cost of such an audit will be borne by the AA at actuals,
not exceeding Rs 20,000 per audit.
2. Prerequisites for Selection of AAs
I. Legal Existence: AA should be a legal entity (a company or society, but not a firm, proprietorship or individual;
limited liability partnerships (LLPs) can be allowed)
II. Assessors Quality:
a) AA should have a roll/panel of assessors for the domain sector – approximately 2 assessors for every 1,000
assessments
b) Details of assessors should also be available on its website with state-wise details (assessors’ name,
qualifications, experience and photograph along with the details of assessor affiliations with multiple
SSCs)


III. Assessment Process:


a) AA should have an expertise to carry out online assessments with state-of-the-art technology
deployment
b) The AAs should have the ability to develop the assessment process and tools for different training
courses with ability for continuous improvement
c) AA should have the ability to maintain assessment process records and details pertaining to candidates
registered, tested, and passed, as well as centres, assessors, etc. It shall preserve all records for at least
5 years or till the validity of any scheme (whichever is later) and provide online access to them for
SSCs
IV. Geographic Reach:
a) The agency applying for Pan India or for Specific State operations must empanel assessors relevant to
the concerned job roles who should be able to reach the assessment venue within 24 hours of travel
time
b) The agency applying for Pan India or Specific State operations must empanel assessors who have the
ability to conduct assessment in regional languages
V. Organization Structure:
a) AA should have a structured mechanism for Governance including a well-defined process for affiliation of
assessors either on its payroll or on long-term contracts
b) AA should have assessment coordination team on its payroll with required capacity and experience to
mentor, supervise, plan the assessment strategy and to guide the team of assessors
VI. Conflict of Interest: It should declare its linkages with other stakeholders in skill ecosystem to ensure
independence and to avoid any conflict of interest
VII. Assessment Design: AA should have the capability of designing assessments and creating items. AA should
have at least one assessment designer on the payroll
VIII. Data Security: AA should provide a declaration to clearly indicate the adherence to data governance policies
and encryption guidelines for maintaining the information security and data privacy as defined under
global standards like ISO27001 (data security, privacy, and audit requirements) and ISO9001 (quality data
managements systems). SSCs should validate the data retention, security and privacy practices by carefully
examining documentary proofs at empaneled AAs in the following key areas:
a) Database Management: Security and robustness of the database used by an organization as a method of
storing, managing, and retrieving information, automated and with minimal manual intervention
b) Data Access Controls: Details of access and authentication with the following classifications:
- Restricted: to be shared with pre-defined stakeholders only
- Confidential: can be shared with designated stakeholders but not for circulation
- Internal: to be shared with stakeholders within the skill ecosystem
- Public: can be made available in the public domain
c) Database Credential Management: Control over flow of data including approval, monitoring and access
d) Data Encryption: A robust mechanism to ensure enhanced security of sensitive data through encryption
mechanisms
e) Threat Detection: Procedure for raising flags, categorization of threat level and determining mitigative
techniques
f) Database Backup & Recovery: Protocols for ensuring data back-up and recovery in case of data loss
g) Data Portability: Mobility of data between different application programs, computing environments
or cloud services
h) AA's IT Assets Policy: IT management and security policies on IT equipment provided to employees, such
as misplaced devices, limits on access, etc
i) Audit Process: Protocols on quality and utility audit of assessment data for Quality Assurance
j) Digitization of Data: Collection, storage and retention in soft formats
IX. Training of Proctors: AA must hold training programs for proctors (both on role and contractual) who are
going for on-field assessments or are proctoring remote assessments. Proctors must be made aware of the
key processes and compliance requirements before conducting an assessment. They should also be familiarized
with the platform functionalities used to oversee the assessment, and be aware of possible areas of malpractice
and the steps to be taken if any malpractice is observed during the assessment.
3. Prerequisites for Selection of Assessors
I. Qualifications & Experience:
a) Assessor should possess relevant academic, occupational qualifications and work experience as defined in
the QP
b) Knowledge of assessment process and tools with ability to capture the assessment observations correctly
on the prescribed electronic or paper forms
c) Understanding of the Occupational Standards for the relevant QP is a must. Assessor must have the
ability to plan each task and allocate necessary resources
d) Understanding of competencies required in the job role for which assessment is being done with a
high level of integrity, reliability and fairness
e) Good observation skills with ability to communicate in writing and orally in the local language in
addition to English
f) Ability to use technology, viz. computers, tablets, spreadsheets, and video communication tools
II. Certified Assessor: Assessor must undergo necessary induction / orientation and certification under the TOA
programme through respective SSCs
III. Assessors in Multiple Sector: Assessors undertaking multiple sector assessments must meet the qualifications
for the relevant job roles and would need to furnish a self-declaration duly acknowledged by the AA
4. Affiliation Process for AAs – Stage 1
I. Application & Desktop Evaluation
a) Prospective AA will submit the application in prescribed formats along with the payable application fee.
b) SSC will carry out Desktop Evaluation of the prospective AA. It may invite them for deliberation and clarity,
if necessary.
c) SSC reserves the right to select/reject the AA on merit. However, in case of rejection same must be
communicated to the applicant in writing.
d) Decision of SSC in this regard would be final, and grievances should be addressed to the Affiliation
Committee (a team constituted by the Board/Governing Council of the SSC)
5. Affiliation Process for AAs – Stage 2
I. Field Visit and Final Evaluation Stage
a) The SSC team will visit the prospective AA, meet their key staff and carry out verification of records and
processes. The SSC team must visit at least two of the existing top 3 customers of the prospective AA and
get their feedback in writing
b) The composition of the team will be two members from SSC and one Board/GC member or industry expert
c) Team will submit its Assessment Report for consideration to Affiliation Committee
d) The evaluation of the AAs would be carried out under the Evaluation Framework suggested by NSDC
e) A minimum score of 60% would be mandatory for affiliation. Those scoring between 40% and 59% could be
given an opportunity to improve their score within 3 months, based on which affiliation or rejection
would happen.

Assessment Processes and Protocols: Guide for Short-Term Skill Training Programs

II. Evaluation Matrix for AAs

(Maximum points and weightage points per parameter may be modified based on the specific requirements of the SSC.)

1. Overall Experience (max 15) – Number of years in assessment: more than 10 years = 15; 5 to 10 years = 10; up to 5 years = 7
2. Experience in Sector (max 10) – Number of years in the sector: more than 5 years = 10; 3 to 5 years = 8; up to 3 years = 6
3. Experience in Job Roles (max 10) – Number of job roles assessed: 20 or more job roles = 10; 10 or more job roles = 8; 1-9 job roles = 6
4. Count of Candidates Assessed (max 10) – In the last 3 years: more than 25,000 = 10; 10,000 to 25,000 = 7; up to 10,000 = 5
5. Assessment Methodology (max 10) – Subjective score determined by the evaluation committee based on rigour and innovation of approach and methodology
6. Affiliation with Govt. Organization, GoI or State Skill Missions (max 10) – Affiliated with a minimum of 1 organization: more than 5 organisations = 10; 3 to 5 organisations = 7; 1-2 organisation(s) = 5
7. Geographic and Vernacular Reach (max 10) – Minimum presence in 3 States/UTs: more than 10 States = 10; 4 to 10 States = 7; 3 States = 5
8. Affiliation with other SSCs (max 5) – Minimum affiliation with 3 SSCs: more than 5 SSCs = 5; 4 or 5 SSCs = 3; 3 SSCs = 2
9. Mode of Assessment – Tablets / Pen and Paper (max 20) – Assessment modes: remote online = 20; center-based digital (online/offline) = 20; non-digital = 5
10. Monitoring Mechanism of Assessments (max 10) – Continuous monitoring of the assessment: real-time online video-audio monitoring and recording = 10; standalone video-audio records = 5; visits by proctors = 3
11. Number of Subject Matter Experts (designers and reviewers) on company's payroll (max 20) – Based on evaluation of CVs: more than 5 experts = 20; 3 to 5 experts = 12; up to 2 experts = 8
12. Number of Assessors on payroll (max 10) – Number of assessors: more than 25 assessors = 10; 10 to 25 assessors = 7; up to 9 assessors = 5
13. Number of full-time employees (max 10) – Number of full-time employees: more than 25 employees = 10; 11 to 25 employees = 7; up to 10 employees = 3
14. Valid ISO Certification (max 10) – Continuous years in service with ISO certification: more than 5 years = 10; 3 to 5 years = 7; up to 3 years = 5
15. Platform Capability – Language Support (max 10) – Multiple language support is required for delivering assessments: over 7 languages = 10; 3-6 languages = 7; 1-2 languages = 3
16. Platform Capability – Compatibility across Devices (max 10) – Flexibility to conduct assessments across different devices: matrix to be created by SSC
17. Platform Capability – Features and Functionalities Offered (max 20) – Features such as IP tagging, navigation control, geo-tagging, image-based monitoring, test resume facilities, LAN-based assessments: matrix to be created by SSC
18. Design Capability – Blueprint Design, if required (max 10) – Expertise to create an appropriate blueprint: matrix to be created by SSC
19. Design Capability – Types of Items (max 10) – Range of item types including simulations, MCQs, scenario-based: matrix to be created by SSC
20. Design Capability – Creation of Item Bank (max 15) – Expertise to create and maintain an item bank with review, pilot testing, discarding or retiring of items: matrix to be created by SSC
21. Data Analysis and Reporting – Reporting Parameters (max 15) – Reporting and analysis prowess of the AA: real-time dashboard = 15; complex post-assessment report and monthly/quarterly analyses = 10; basic post-assessment report = 5

Grand Total: 250 points

Note: The above matrix would be evaluated by a committee nominated by the SSC GC, and it would be authenticated
and countersigned by the SSC CEO for record.
III. Evaluation Scorecard for Assessing Agencies

Grade A (60% to 100%)
  Prospective AA: To affiliate
  Existing AA: To continue in business for 1 year
Grade B (40% to 59%)
  Prospective AA: To reject, but could be given an opportunity for further improvement of score to Grade 'A' within 3 months, based on which affiliation could happen
  Existing AA: To be suspended with a 3-month notice to improve further to Grade 'A', else face termination with eligibility to apply in the next selection process for AA affiliation
Grade C (less than 40%)
  Prospective AA: To reject, with eligibility to apply in the next selection process for AA affiliation after 1 year
  Existing AA: To terminate from business for 1 year, with eligibility to apply in the next selection process for AA affiliation (post completion of the 1-year termination period)
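The score-to-grade mapping in the scorecard can be sketched as a small function. This is an illustrative sketch only; the function name and return shape are assumptions, not part of the SOP. It converts a raw Evaluation Matrix score (out of the 250-point maximum) into a percentage and the grade band it falls in.

```python
# Illustrative sketch of the AA scorecard bands (names assumed, not from the SOP).
MAX_POINTS = 250  # grand total of the Evaluation Matrix

def affiliation_grade(raw_score):
    """Map a raw Evaluation Matrix score to (percentage, grade band)."""
    pct = 100 * raw_score / MAX_POINTS
    if pct >= 60:
        return pct, "A"   # affiliate / continue in business for 1 year
    if pct >= 40:
        return pct, "B"   # 3-month window to improve to Grade 'A'
    return pct, "C"       # reject / terminate for 1 year

print(affiliation_grade(162))  # → (64.8, 'A')
```

A committee would still record the per-parameter scores; this only illustrates how the final percentage determines the action to be taken.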


6. Affiliation Process for AAs – Stage 3
I. Affiliation & Orientation
a) All eligible AAs will be formally intimated by SSC after receiving the prescribed Affiliation Fee
b) The Affiliation will be valid for one year, after which the AA will apply for renewal based on fresh evaluation
against the Evaluation Matrix
7. Due diligence - Forms and Documentation
Some of the essential forms for documentation of AAs are given at annexure. SSCs are free to develop additional
forms if required.
8. Conclusion
The underlying responsibility of the Assessment Agency would be to operationalize the process of onsite assessment
of the trainees, taking overall care of the process of evaluation. The evaluation will focus on whether the
candidate knows how to perform the required tasks as part of his or her job role. This SOP does not restrict the SSC
from adopting additional methodology to strengthen the overall assessment process within the laid-down parameters.
In addition, the SSCs would be free to evolve the required parameters for capturing domain knowledge among
the assessors.
The annexure forms to this document, including the self-declaration of the AA, details of candidates assessed, etc.,
can be accessed in the public domain.

Annexure 7: Sample Assessment Schedule

Task | Time Period | Duration | Activity
Training Centre Validation | 9:00 AM to 9:30 AM | 30 Mins | Verification of Tools & Infrastructure; Verification of Training Attendance Sheet
Candidate Orientation | 9:30 AM to 10:00 AM | 30 Mins | Briefing of Assessment Process; Candidate ID Validation; Mark Attendance (Biometric/Manual)
Knowledge Assessment (Theory)* | 10:00 AM to 11:00 AM | 60 Mins | Digital / Non-Digital (Knowledge / Outcome-based Questions)
Skill Assessment & Viva (Hands-on Skill)** | 11:15 AM to 12:45 PM | 90 Mins | Demonstration of practical skills (outcome-based questions)
Group Picture | 12:45 PM to 1:00 PM | 15 Mins | Group Picture
Lunch Break | 1:00 PM to 1:30 PM | 30 Mins | –
Skill Assessment & Viva (Hands-on Skill) | 1:30 PM to 4:30 PM | 180 Mins | Role Play, Demonstration
Documentation | 4:30 PM to 5:00 PM | 30 Mins | Various evidences: VTP Feedback / Code of Conduct / Additional Documents (SSC-wise); Training Attendance Sheet; Centre Picture / Infra Picture / Tools; Candidate Feedback Form

* Minimum time for theory is 60 minutes per candidate
** Minimum time for skill assessment and viva is 20 minutes per candidate (can be conducted in small sub-groups)

Annexure 8: Training of Assessor Program and Eligibility Criteria

1. TOT-TOA Guidelines for short-term skill development programmes – Version 2.0 (issued in August 2019): https://nsdcindia.org/sites/default/files/files/Revised%20TOT-TOA%20Guidelines_V2_Aug%202019_with_cover.pdf
2. TOT-TOA Standard Operating Procedure for short-term skill development programmes – Version 1.0 (September 2019): https://nsdcindia.org/sites/default/files/files/SOP_V1_Sept%202019_with_cover.pdf
3. Job-role specific Trainer and Assessor Eligibility Criteria: https://nsdcindia.org/guidelines-0

Annexure 9: Augmented Reality & Virtual Reality in Assessments

Artificial Intelligence (AI): AI refers to system intelligence that enables machines to learn from experience, adjust to
the environment, and thereby perform human-like activities. Using deep learning algorithms, machines can be trained
to accomplish specific tasks by processing large amounts of data and recognizing patterns. Some benefits of using AI
in assessments are flagging of anomalies, reduction in cheating, reduced human intervention, precision in testing and
auditability.
Virtual Reality (VR): VR is the use of computer technology to create a simulated environment. Unlike traditional user
interfaces, VR places the user inside an experience. Instead of viewing a screen in front of them, users are immersed and
able to interact with three-dimensional worlds in a seemingly real or physical way using special electronic equipment,
such as a helmet with a screen inside or gloves fitted with sensors.
In the context of vocational assessments, both VR and AR can be explored to conduct assessments of different types.
AR and VR can be used to conduct vocational skill assessments that are assessor-less (with certain limitations)
and require no lab, tool or equipment set-up, by framing questions around processes, tools or techniques for
hands-on skills. Scenarios can be virtualized through simulations using VR technology. Such an assessment can be
delivered to a candidate through a URL accessible over web browsers, regardless of the OS powering the device,
and the experience is enhanced with a VR-based headset. This technology is best applied to scenarios where it is
very difficult to simulate or replicate the physical workplace setup for practicals, such as the furnace operator
job role under the Iron & Steel SSC, or similar job role(s) involving hazardous material at the workplace. It can
also be used easily for service-oriented job roles. Lastly, it can even be explored for assessments involving persons
with special abilities, or assessments targeted at soft skills. The limitation of this technology remains that it
involves a high capex investment for developing the AR/VR experience for different modules/concepts in a job role.
Some sample assessment questions are illustrated below:


Examples of Assessment Questions with Explanations

In the Retail space, we can assess employees based on their understanding of the SOPs by giving them questions
where they need to choose the correct answers. Some of the assessment techniques used in this case are described
below.

Learning by Doing - We can create assessments based on procedures that need to be performed by the learner.
These are step-by-step tasks that need to be completed by the user to complete a process.

Customer Interaction Assessment - These assessments judge a user's understanding of how to speak to a customer
who approaches him/her. The user can either interact directly through voice recognition, which should match the
actual pre-defined responses, or by choosing from the correct responses in text format.

Process Assessment - These assessments involve doing a task as per the SOPs listed down. Here an employee is
asked to prepare an order by choosing the items in the correct chronology. Points are awarded based on the user's
correct flow of work.

Personality and Cognitive Ability Test - We can develop questions that assess the employees' personality, so that
they can be categorized into different personality groups. Here there is no wrong answer; every answer categorizes
a person into a different personality group.

Scoring based on Quiz and Questionnaire - We can develop quizzes, questionnaires and trivia at every stage of the
learning journey, which enables an organization to track the understanding and knowledge retention of users. Based
on their answers, points and badges are earned, and leaderboards can be created to make the assessment exciting
and competitive.

Gamified Assessment - We can also create gamification around case studies or any training modules, based on the
understanding of which the user needs to play a game. An example is an escape room game, where a user is placed
inside a room and has to find the correct items that were part of the training module in order to get out of the
escape room and earn a badge for completion.

Economics
The development of VR content for different job roles requires higher investment compared to conventional types
of digital/non-digital assessments. VR content is developed and customized as per the QP/NOS. The various cost
components involved in the development of a VR experience are:
• One-time development fee of an app – the unit rate is generally governed by the duration of the experience; cost
ranges between Rs 20,000 and Rs 50,000 per minute of VR experience
• Recurring cost of the VR experience player – a nominal amount ranging between Rs 5 and Rs 25 per user per month
• VR headset – headsets come in different forms and quality levels, with costs ranging from Rs 250 to Rs 15,000
and above
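The cost line items above can be combined into a rough budgeting calculation. The sketch below is illustrative only: the function name and the default rates are assumptions picked from within the quoted ranges, and actual vendor pricing will vary.

```python
# Illustrative VR assessment cost estimate using the ranges quoted above.
# All default rates are assumed midpoints/examples, not official figures.

def vr_cost_estimate(minutes, users, months,
                     dev_per_min=35_000,     # Rs 20,000-50,000 per minute (midpoint assumed)
                     player_per_user_pm=15,  # Rs 5-25 per user per month (assumed)
                     headsets=0, headset_unit=2_500):  # Rs 250-15,000+ per headset (assumed)
    one_time = minutes * dev_per_min + headsets * headset_unit
    recurring = users * player_per_user_pm * months
    return one_time + recurring

# e.g. a 5-minute experience, 1,000 candidates over 12 months, 20 headsets
total = vr_cost_estimate(5, 1_000, 12, headsets=20)
print(f"Estimated total: Rs {total:,}")  # → Estimated total: Rs 405,000
```

The split between one-time development cost and recurring per-user cost is what makes VR assessments cost-effective mainly at scale, as noted under Key Benefits.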

Key benefits
1. Immersive experience for a candidate during assessment
2. Longevity of the VR experience unlike traditional question bank
3. No involvement of assessor or proctor at all, completely self-administered
4. No physical setup or premise is required to conduct the assessment in case of VR content-based assessments

Gamification (Simulation) Type Questions - Detail

Gamification of assessments is the new era in simulated assessments, and there have been some successful trials
and use cases in the industry. Gamification is an emerging phenomenon that derives its popularity from its
intrinsic capacity to motivate action, solve problems and enhance learning in the most diverse areas of knowledge.
Gamification plays an important role in formative evaluation, performed throughout the course, to check whether
students/learners are able to achieve the objectives laid down for the training outcomes. It is through this
process of evaluation that one can evaluate learner competency in relation to the job role concerned.
Gamification has the potential to improve learner performance and retention, as it is very interactive and immersive.

Simulation Based Assessment - Detail

Simulations are technology-driven manifestations of real-life scenarios or situations on a digital device. These
tools are used globally to train and evaluate people on their capabilities and competencies. Simulation-based
assessments are used to assess how one would react to situations encountered while working and how one would
solve problems reflecting real-time situations. In these tests/assessments, one can also be assessed on critical
thinking and problem-solving ability, attention to detail, and learning ability with respect to the domain function
of the job role concerned.
Key Benefits:
• Better evaluation of a candidate's capabilities
• Easier for the candidate to learn/use – wider acceptance
• Objective and standardized evaluation
• Strong validation with real-life skills
• Cost-effective (at scale)
• Auto-evaluated for better turnaround time
Some of these tools are currently being used in the skills ecosystem for auto-evaluation of candidates' typing
skills, programming skills, etc.


Annexure 10: Sample Item Analysis Reports

RAW DATA
The responses inputted by 5 students in an assessment of 5 items

Item No     1  2  3  4  5
Key         C  B  D  C  A
Student01   A  C  D  C  A
Student02   C  B  D  C  A
Student03   C  B  D  C  C
Student04   C  B  B  C  A
Student05   B  B  B  C  A

SUMMARY OF RESPONSES SELECTED (WITH CORRECT ANSWER KEY)
The summary of responses received on 5 items from a batch of 48 students, with the correct answer key

Item No     1   2   3   4   5
A           1   3   2   2   41
B           2   29  7   8   2
C           44  13  5   35  3
D           1   3   34  3   2
Total       48  48  48  48  48
Key         C   B   D   C   A

CONVERTED TO NUMERICAL VALUE
The responses inputted by 5 students in an assessment of 5 items, receiving 1 mark for a correct response and
0 marks for an incorrect one

Item No     1  2  3  4  5
Key         C  B  D  C  A
Student01   0  0  1  1  1
Student02   1  1  1  1  1
Student03   1  1  1  1  0
Student04   1  1  0  1  1
Student05   0  1  0  1  1

CALCULATED DISCRIMINATION INDEX
Discrimination Index calculated for 5 items on an assessment as per the responses inputted by candidates

Item No                1     2     3     4     5
Total                  44    29    34    35    41
p total                0.92  0.60  0.71  0.73  0.85
Upper                  15    13    13    12    15
p upper                1.07  0.93  0.93  0.86  1.07
Lower                  13    5     7     7     10
p lower                0.93  0.36  0.50  0.50  0.71
Discrimination Index   0.14  0.57  0.43  0.36  0.36
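The item analysis steps above can be sketched in code: score each response 0/1 against the key, compute the facility value (p) per item, and compute the Discrimination Index as the difference between the correct-response proportions of the top- and bottom-scoring groups. This is an illustrative sketch, not the AA's production pipeline; the function name and the top/bottom 27% group-size convention are assumptions.

```python
# Illustrative item analysis: facility value (p) and upper/lower-group
# Discrimination Index from a 0/1 response matrix (rows = candidates).

def item_analysis(scores):
    """scores: list of per-candidate lists of 0/1 item marks."""
    n = len(scores)
    n_items = len(scores[0])
    totals = [sum(row) for row in scores]  # candidate total scores
    # Rank candidates by total score; take top and bottom groups
    # (27% of the batch is a common convention, assumed here).
    order = sorted(range(n), key=lambda i: totals[i], reverse=True)
    k = max(1, round(0.27 * n))
    upper, lower = order[:k], order[-k:]
    report = []
    for j in range(n_items):
        p = sum(scores[i][j] for i in range(n)) / n        # facility value
        p_u = sum(scores[i][j] for i in upper) / k         # p in upper group
        p_l = sum(scores[i][j] for i in lower) / k         # p in lower group
        report.append({"item": j + 1, "p": round(p, 2),
                       "DI": round(p_u - p_l, 2)})         # discrimination index
    return report

# Usage with the 0/1 matrix for the five students shown above
scores = [
    [0, 0, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0],
    [1, 1, 0, 1, 1],
    [0, 1, 0, 1, 1],
]
for row in item_analysis(scores):
    print(row)
```

Items answered correctly by everyone (such as item 4 in this small sample) get a high p and a Discrimination Index of 0, flagging them as candidates for review or retirement from the item bank.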

301, 3rd Floor, West Wing, World Mark 1, Asset 11, Aerocity, New Delhi – 110037
Tel: +91-11-47451600-10 | Fax: +91-11-46560417
Website: www.nsdcindia.org

October 2020

About National Skill Development Corporation (NSDC): National Skill Development Corporation, working
under the aegis of the Ministry of Skill Development & Entrepreneurship, is a unique public-private partnership
which aims to catalyze the creation of a quality vocational training ecosystem in India. The organisation provides
funding to build scalable and profitable vocational training initiatives. Its mandate is also to enable a support
system which focuses on quality assurance, information systems and train-the-trainer academies, either directly
or through partnerships. Since its establishment in 2009, NSDC has trained more than 2 crore people through
its partnership with 600+ training partners and a robust network of 11,000+ training centers spread over
600 districts across the country. NSDC has institutionalized 37 Sector Skill Councils and is also implementing the
Government's flagship skill development schemes such as Pradhan Mantri Kaushal Vikas Yojana (PMKVY),
Pradhan Mantri Kaushal Kendra (PMKK), National Apprenticeship Promotion Scheme (NAPS), among others.

CONTACT US: NSDC’s Skills Intelligence Platform at: [email protected]
