An Overview of Workplace-Based Assessment: Cees Van Der Vleuten Maastricht University, The Netherlands

This document provides an overview of workplace-based assessment and discusses various methods used to evaluate competencies, including direct observation methods like mini-CEX and multi-source feedback, as well as the importance of reflection and getting feedback from multiple assessors across different contexts. Effective workplace-based assessment relies on direct observation of clinical performance, sampling across different cases and assessors, and using evaluation forms to provide structured feedback in addition to reflective discussions.



An Overview of

Workplace-based Assessment

Cees van der Vleuten


Maastricht University, The Netherlands
www.ceesvandervleuten.com

UCSF, 3 April 2019


Outcome systems

CanMEDS:
 Medical expert
 Communicator
 Collaborator
 Manager
 Health advocate
 Scholar
 Professional

ACGME:
 Medical knowledge
 Patient care
 Practice-based learning & improvement
 Interpersonal and communication skills
 Professionalism
 Systems-based practice

GMC:
 Good clinical care
 Relationships with patients and families
 Working with colleagues
 Managing the workplace
 Social responsibility and accountability
 Professionalism
Typical for outcomes
 Emphasis on competences
 Emphasis on behaviours/performance
 Emphasis on non-discipline specific
competences
OSCE
 What are strengths?
 What are threats?
Reliability of a number of measures

Reliability by testing time, in hours:

Method (source)                                           1h     2h     4h     8h
MCQ (Norcini et al., 1985)                                0.62   0.76   0.93   0.93
Case-Based Short Essay (Stalenhoef-Halling et al., 1990)  0.68   0.73   0.84   0.82
PMP (Norcini et al., 1985)                                0.36   0.53   0.69   0.82
Oral Exam (Swanson, 1987)                                 0.50   0.69   0.82   0.90
Long Case (Wass et al., 2001)                             0.60   0.75   0.86   0.90
OSCE (Petrusa, 2002)                                      0.47   0.64   0.78   0.88
Mini-CEX (Norcini et al., 1999)                           0.73   0.84   0.92   0.96
Practice Video Assessment (Ram et al., 1999)              0.62   0.76   0.93   0.93
Incognito SPs (Gorter, 2002)                              0.61   0.76   0.92   0.93
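The rise of reliability with testing time in this table follows the Spearman-Brown prophecy formula. As a minimal sketch (the function name is mine, not from the slides), extrapolating the Mini-CEX column from its one-hour value of 0.73 reproduces the tabulated values:

```python
def spearman_brown(rho1, n):
    """Projected reliability when testing time is increased n-fold
    (Spearman-Brown prophecy formula)."""
    return n * rho1 / (1 + (n - 1) * rho1)

# Mini-CEX: extrapolate the one-hour reliability of 0.73
for hours in (1, 2, 4, 8):
    print(hours, round(spearman_brown(0.73, hours), 2))
# prints 0.73, 0.84, 0.92, 0.96 -- matching the Mini-CEX column above
```

The same extrapolation underlies the other columns: what differs between methods is the reliability of a single hour of sampling, not the shape of the growth curve.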
Reliability as a function of sample
size (Moonen et al., 2013)

[Figure: reliability as a function of sample size for the Mini-CEX, OSATS, and MSF]
Effect of aggregation across methods
(Moonen et al., 2013)

Method      Sample needed as stand-alone      Sample needed as part of a composite
Mini-CEX    8                                 5
OSATS       9                                 6
MSF         9                                 2
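The "sample needed" figures can be read as the point where projected reliability first reaches a threshold such as 0.80. A sketch that inverts the Spearman-Brown relation (the single-encounter reliability below is illustrative, not Moonen et al.'s estimate):

```python
import math

def encounters_needed(rho1, target=0.80):
    """Smallest number of encounters whose aggregated score reaches the
    target reliability, by inverting the Spearman-Brown formula."""
    n = target * (1 - rho1) / (rho1 * (1 - target))
    # round before ceiling to guard against floating-point noise
    # at exact integer boundaries
    return math.ceil(round(n, 6))

# e.g. a hypothetical single-encounter reliability of 0.33
print(encounters_needed(0.33))  # prints 9
```

Aggregating across methods helps because each additional method contributes fresh, partly independent information, so the composite reaches the threshold with fewer observations per method.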
Reliability of an oral examination (Swanson, 1987)

Testing    Number     Same Examiner    New Examiner     Two New Examiners
Time (h)   of Cases   for All Cases    for Each Case    for Each Case
1          2          0.31             0.50             0.61
2          4          0.47             0.69             0.76
4          8          0.47             0.82             0.86
8          12         0.48             0.90             0.93
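The pattern in this table, where the same-examiner design plateaus near 0.48 while designs with fresh examiners keep improving, reflects how examiner variance averages out. A generalizability-style sketch with made-up variance components (not Swanson's actual estimates):

```python
def oral_reliability(k_cases, distinct_examiners, var_person=1.0,
                     var_examiner=1.0, var_case_error=2.0):
    """Illustrative reliability of an oral exam averaged over k cases.

    Examiner variance shrinks with the number of *distinct* examiner
    judgements; case/error variance shrinks with the number of cases.
    Variance components are invented for illustration.
    """
    error = var_examiner / distinct_examiners + var_case_error / k_cases
    return var_person / (var_person + error)

# Same examiner for all cases: examiner variance never averages out.
same = [oral_reliability(k, 1) for k in (2, 4, 8, 12)]
# New examiner for each case: examiner variance shrinks with k.
new = [oral_reliability(k, k) for k in (2, 4, 8, 12)]
# Two new examiners for each case: it shrinks twice as fast.
two = [oral_reliability(k, 2 * k) for k in (2, 4, 8, 12)]
```

With one fixed examiner, the examiner's leniency or severity stays in the score no matter how many cases are added, so reliability hits a ceiling; rotating examiners turns that bias into noise that sampling washes out.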
Checklist/rating reliability

Test length   Examiners using   Examiners using
(hours)       checklists        rating scales
1             0.44              0.45
2             0.61              0.62
3             0.71              0.71
4             0.76              0.76
5             0.80              0.80

Van Luijk & van der Vleuten, 1990


Miller’s competency pyramid

[Figure: Miller’s pyramid, from base to apex: Knows, Knows how, Shows how (assessed with e.g. the OSCE), and Does (performance in practice, the level targeted by outcome frameworks)]

Miller GE. The assessment of clinical skills/competence/performance. Academic
Medicine (Supplement) 1990; 65: S63-S67.
Assessing does
 We need measures that sample widely
 Across content
 Across examiners
 When this is done, subjectivity is less of a
threat
Classes of WBA methods
 Direct observation: Single encounter methods
 Mini-CEX
 DOPS, OSATS
 P-MEX
 Case-based discussion
 Global performance measures
 Multi-Source Feedback (MSF or 360)
 In-training Evaluation Reports (ITER)
 Aggregation and reflection measures
 Logbook
 Portfolio
Single encounter methods
 Repeated direct observations of clinical performance in practice, using (generic) evaluation forms completed by any significant observer (clinician, nurse, peer, …)
Mini Clinical Evaluation Exercise (Norcini, 1995)

 Short observation during clinical


patient contact (5-15 minutes)
 Oral evaluation
 Generic evaluation forms completed
 Repeated at least 4 times by
different examiners
 (cf. http://www.abim.org/minicex/)
Norcini JJ, Blank LL, Arnold GK, Kimball HR. 1995. The mini-CEX (Clinical Evaluation
Exercise): A preliminary investigation. Annals of Internal Medicine 123:795-799.
Mini-CEX: Competencies Assessed and Descriptors

 Medical Interviewing Skills


Facilitates patient’s telling of story; effectively uses questions/directions to obtain accurate, adequate
information needed; responds appropriately to affect, non-verbal cues.
 Physical Examination Skills
Follows efficient, logical sequence; balances screening/diagnostic steps for problem; informs patient;
sensitive to patient’s comfort, modesty.
 Humanistic Qualities/Professionalism
Shows respect, compassion, empathy, establishes trust; attends to patient’s needs of comfort, modesty,
confidentiality, information.
 Clinical Judgment
Selectively orders/performs appropriate diagnostic studies, considers risks, benefits.
 Counseling Skills
Explains rationale for test/treatment, obtains patient’s consent, educates/counsels regarding
management.
 Organization/Efficiency
Prioritizes; is timely; succinct.
 Overall Clinical Competence
Demonstrates judgment, synthesis, caring, effectiveness, efficiency.
Take home messages
 People are more important than
measurement instruments
 Reflective dialogues are essential
 Words are more important than scores
 Rely on your clinical skills to provide feedback
 Use your patient communication skills for
communicating with a learner
 Sample across cases and assessors.
Mini-CEX Exercise

Start exercise
Mini-CEX
 What are strengths?
 What are threats?
Multi-source feedback
 Multiple raters (8-10)
 Different rater groups, including self-
rating
 Questionnaires
 Specifically on observable behaviour
 Impression over a longer period of time
Professionalism Mini-Evaluation Exercise
PROFESSIONALISM MINI-EVALUATION EXERCISE
Evaluator:_________________________________________________________
Student/Resident:___________________________________________________

Level: (please check) ☐ 3rd yr ☐ 4th yr ☐ res 1 ☐ res 2 ☐ res 3 ☐ res 4 ☐ res 5
Setting: ☐ Ward ☐ Clinic ☐ OR ☐ ER ☐ Classroom ☐ Other______________________________
N/A UN BEL MET EXC
Listened actively to patient
Showed interest in patient as a person
Recognized and met patient needs
Extended him/herself to meet patient needs
Ensured continuity of patient care
Advocated on behalf of a patient
Demonstrated awareness of own limitations
Admitted errors/omissions
Solicited feedback
Accepted feedback
Maintained appropriate boundaries
Maintained composure in a difficult situation
Maintained appropriate appearance
Was on time
Completed tasks in a reliable fashion
Addressed own gaps in knowledge and/or skills
Was available to colleagues
Demonstrated respect for colleagues
Avoided derogatory language
Maintained patient confidentiality
Used health resources appropriately

Please rate this student’s/resident’s overall professional performance during THIS encounter:
☐ UNacceptable  ☐ BELow expectations  ☐ MET expectations  ☐ EXCeeded expectations
Did you observe a critical event? ☐ no ☐ yes (comment required)
Comments:
Multi-source feedback

[Diagram: feedback sources surrounding the trainee: patients, peers, nursing staff, clinical supervisor(s), and self]
Illustration MSF feedback

SPRAT (Sheffield peer review assessment tool; Archer JC, Norcini J, Davies HA. 2005. Use
of SPRAT for peer review of paediatricians in training. BMJ 330:1251-1253.)
Multi-source feedback procedure
 Step 1: select raters
 Proposal by assessee in conjunction with
supervisor
 Step 2: complete questionnaires
 Raters remain anonymous
 Assign responsibility to someone (e.g. a secretary)
 Require qualitative feedback
 Step 3: discuss the information
 Mid-term review, end of rotation
 Plan of action, reflection
 Step 4: reporting
 e.g. in the portfolio
Multi-source feedback
 What are strengths?
 What are threats?
Multi-source feedback
 Rich source of information on
professional performance
 On different competency domains
 Different groups of raters provide
unique and different perspectives
 Self-assessment versus assessment by
others stimulates self-awareness and
reflection
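One way to make the self-versus-others comparison concrete is to aggregate all non-self ratings and report the gap. A sketch with hypothetical ratings and a data layout of my own devising:

```python
from statistics import mean

# Hypothetical MSF ratings (1-5 scale) for one trainee, keyed by rater group.
ratings = {
    "self":       [4.5, 4.0, 4.5],
    "peers":      [3.8, 4.1, 3.9, 4.0],
    "nursing":    [3.5, 3.6, 3.8],
    "supervisor": [4.0, 3.9],
}

# Pool every rating that did not come from the trainee themselves.
others = [r for group, rs in ratings.items() if group != "self" for r in rs]
gap = mean(ratings["self"]) - mean(others)
print(f"self-other gap: {gap:+.2f}")  # prints "self-other gap: +0.49"
```

A positive gap (self-rating above the pooled other-ratings) is exactly the kind of discrepancy that can seed the reflective discussion with a mentor.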
Self assessment

Eva KW, Regehr G. 2005. Self-assessment in the health professions: a reformulation and research agenda. Acad Med 80:S46-54.
Self-direction
Multi-source feedback
 Assessment and learning: concrete,
descriptive, qualitative feedback is
extremely useful
 Learning: feedback is central; Plan of
action is part of feedback; follow-up!
 Assessment: proper documentation is
essential for defensible decisions
Multi-source feedback
 Dilemmas:
 Dual role of supervisor (helper & judge)
 Anonymity of raters
 Discrepancies between rater groups
 Under time pressure, rich feedback is often absent
Multi-source feedback
“The most important goal of
multirater feedback is to
inform and motivate
feedback recipients to
engage in self directed
action planning for
improvement. It is the
feedback process, not the
measurement process
that generates the real
payoffs.” (Fleenor and Prince,
1997)
Portfolio
 A collection of results and/or evidence that
demonstrates competence
 Usually paired with reflections, plans of
actions, discussed with peers, mentors,
coaches, supervisors
 Aggregation of information (very comparable
to patient chart)
 Active role of the person assessed
 Reversal of the burden of evidence
 But it’s a container term
Classifying portfolios by functions
[Diagram: portfolio types on a continuum. The assessment portfolio collects assessment materials and overviews (e.g. a logbook, planning/monitoring); the learning portfolio centres on reflections and discussing/mentoring; the ideal portfolio combines both functions.]
What exactly
 Purpose:
 Coaching
 Assessment
 Monitoring
 Structure
 Professional outcomes
 Competences
 Tasks, professional activities
 Evidence
 Open (self-directed, unstructured)
 Structure (how much is prescribed)
 Interaction
 Coach, mentor, peers
 Assessment
 Holistic vs analytic
Portfolio
Maastricht Electronic portfolio (ePass)

Comparison
between the score
of the student and
the average score
of his/her peers.
Maastricht Electronic portfolio (ePass)

Every blue dot


corresponds to
an assessment
form included in
the portfolio.
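The aggregation ePass performs can be sketched as grouping assessment-form scores by competency and comparing a student's mean with the peer average. The data layout below is hypothetical, not the actual ePass schema:

```python
from collections import defaultdict
from statistics import mean

# Each tuple is one assessment form entry: (student_id, competency, score).
forms = [
    ("s1", "communicator", 7.5), ("s1", "communicator", 8.0),
    ("s1", "medical expert", 6.5),
    ("s2", "communicator", 7.0), ("s2", "medical expert", 7.5),
    ("s3", "communicator", 6.5), ("s3", "medical expert", 7.0),
]

def compare_to_peers(student, forms):
    """Mean score per competency for one student vs. the peer average."""
    own, peers = defaultdict(list), defaultdict(list)
    for sid, comp, score in forms:
        (own if sid == student else peers)[comp].append(score)
    # Report only competencies where both the student and peers have data.
    return {c: (mean(own[c]), mean(peers[c])) for c in own if c in peers}

print(compare_to_peers("s1", forms))
```

This mirrors the slide's point: each form is one dot, and meaning emerges only from the aggregate comparison, much like a trend line in a patient chart.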
What can go wrong?
 “Reflection sucks”
 Too much structure
 Too little structure
 Portfolio as a goal not as a means
 Ritualization
 Ignorance by portfolio stakeholders
 Paper tiger
Portfolio recommendations
 Portfolio is not just an assessment method;
rather, it is an educational concept
 Outcome-based education
 Framework of defined competences
 Professional tasks need to be translated into
assessable moments or artefacts
 Self-direction is required (and made possible)
 Portfolio should have immediate learning
value for the student/resident
 Direct use for directing learning activities
 Beware of too much reflection
 Portfolios need to be ‘lean and mean’
(Driessen, E., Van Tartwijk, J., Van der Vleuten, C. Wass, V. Portfolios in medical education: why do
they meet with mixed success? A systematic review. Medical Education, 2007, 41, 1224-1233.)
Portfolio recommendations
 Social interaction around portfolios is
imperative
 Build a system of progress and review meetings
around portfolios
 Peers may potentially be involved
 Purpose of the portfolio should be very clear
 Portfolio as an aggregation instrument is
useful (compare with patient chart)
 Use holistic criteria for assessment;
subjectivity can be dealt with
(Driessen EW, Van der Vleuten CPM, Schuwirth LWT, Van Tartwijk J, Vermunt JD. 2005. The use of
qualitative research criteria for portfolio assessment as an alternative to reliability evaluation: a case
study. Medical Education 39:214-220.)
“It may not be a perfect wheel, but
it’s a state-of-the-art wheel.”

Evaluation: http://tiny.ucsf.edu/WorkBasedAssess
