MEDICAL TEACHER
2021, VOL. 43, NO. 3, 300–306
https://doi.org/10.1080/0142159X.2020.1789081

TWELVE TIPS

Twelve tips for embedding assessment for and as learning practices in a programmatic assessment system

Aubrie Swan Sein (a), Hanin Rashid (b), Jennifer Meka (c), Jonathan Amiel (a) and William Pluta (d)

(a) Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA; (b) Rutgers Robert Wood Johnson Medical School, Piscataway, NJ, USA; (c) Jacobs School of Medicine and Biomedical Sciences, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA; (d) Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA

ABSTRACT
Programmatic assessment supports the evolution from assessment of learning to fostering assessment for learning and as learning practices. A well-designed programmatic assessment system aligns educational objectives, learning opportunities, and assessments with the goals of supporting student learning, making decisions about student competence and promotion, and supporting curriculum evaluation. We present evidence-based guidance for implementing assessment for and as learning practices in the pre-clinical knowledge assessment system to help students learn, synthesize, master, and retain content for the long term so that they can apply knowledge to patient care. Practical tips fall in the domains of the culture and motivation of assessment, including how an honour code and a competency-based grading system can support an assessment system that develops student self-regulated learning and professional identity; curricular assessment structure, such as how and when to utilize low-stakes and cumulative assessment to drive learning; exam and question structure, including which authentic question and exam types can best facilitate learning; and assessment follow-up and review considerations, such as exam retake processes that support learning, and academic success structures. A culture change is likely necessary for administrators, faculty members, and students to embrace assessment as, most importantly, a learning tool for students and programs.

KEYWORDS
Feedback; progress testing; self-assessment; written; planning

Introduction

Assessments can measure student performance, competence, learning deficits, readiness to progress, and entrustment of professional activities. Assessments can also drive or shape learning behaviours to encourage students to direct their own learning (Epstein 2007; Schuwirth and van der Vleuten 2011; van der Vleuten 2016). Assessment systems can also help prepare students for the practice of medicine and to develop important professional values such as trustworthiness. As pre-clerkship students take on responsibility for patient care, pre-clerkship assessment systems that motivate students to manage their long-term learning and to apply knowledge to practice can enhance learning and professional identity formation. Medical schools can shift from emphasizing the importance of assessment as a measure of learning to using assessment intentionally for facilitating learning and as learning to foster student self-regulation (Heeneman et al. 2015; Eva et al. 2016).

A programmatic assessment system is one in which each individual assessment should provide learners with feedback, and each assessment should be viewed as part of a larger system in which the assessments are viewed holistically, to make high-stakes competence/promotion decisions (Schuwirth and van der Vleuten 2011; Heeneman et al. 2015). According to the 2018 Consensus Framework for Good Assessment, the purposes of programmatic assessment are to:

1. Optimize the impact of assessments on learning, decisions regarding individual students, and curriculum quality.
2. Identify and provide feedback on individual student's areas of strength and weakness.
3. Provide students with a good basis for making self-assessments and judging learning needs.
4. Motivate students to remediate areas of weakness.
5. Provide information on instructional effectiveness to guide improvement. (Norcini et al. 2018, p. 1107)

Developing a learning-focused programmatic assessment system that optimizes a school's culture, student population, resources, and mission is a substantial project, and investment in faculty and staff development is needed (Schuwirth and Ash 2013; Schuwirth and van der Vleuten 2019).

We conducted a literature review of programmatic assessment and assessment for learning strategies related to cognitive knowledge-focused preclinical exams and reflected on our experiences working across eight medical schools to inform this work. In this article, we present evidence-based assessment tips that schools can use to evolve from an assessment of learning structure toward an assessment for and as learning system and to develop a programmatic assessment structure. While programmatic assessment structures also focus on non-medical-knowledge competencies, such as communication and systems-based practice, we focus on presenting preclinical medical knowledge assessment strategies in this paper. These tips can also be used to enhance traditional systems to promote learning through assessment.

Tip 1
Use assessment for learning principles to build a curricular and assessment structure to enhance learning

Curriculum design processes begin with planning the curricular structure that aligns with the intended educational program learning outcomes. Designing an assessment system to enhance learning comes next. Assessment blueprinting can be utilized to develop and communicate how each assessment fits into a broader program of assessment and how progress decisions will be made based on performance across aggregate assessments (Wilkinson and Tweed 2018).

In making an institutional commitment to using assessment for learning and improvement, there is a need for clear communication of the purposes of assessment and the use of assessment outcomes. In one study of programmatic assessment implementation, students' perceptions of this structure depended on whether they truly saw exams as learning opportunities, but the programmatic assessment structure did help to spur and direct learning (Heeneman et al. 2015). Intermediate-stakes assessments should provide learners with diagnostic, therapeutic, and prognostic information (van der Vleuten et al. 2015). Ideally designed assessments for learning are those that

1) help the learners define where they are in meeting the objectives of a course; 2) identify what they need to do further; 3) prepare them to transfer their knowledge and skills to novel situations; 4) enable them to gain a deeper understanding of the material, and 5) provide them an opportunity to personalize their learning. (Kulasegaram and Rangachari 2018, p. 9)

Once the learning-focused curricular and assessment systems are designed, appropriate teaching and learning opportunities can be developed.

Tip 2
Promote learning by adopting a competency-based education and evaluation structure and a culture of valuing learning over performance

An ideal learning-focused curriculum and assessment system would promote a growth mindset orientation (Dweck 2006) among learners and teachers. A competency-based education and evaluation structure defines criteria for progression by a learner's attainment of pre-defined thresholds rather than their performance relative to peers. Within the medical education literature, competency-based grading has often been framed in terms of moving to a pass/fail, or pass/not yet, grading system. The adoption of pass/fail grading can reduce competition and foster collaboration among members of the class – core professional values (Reed et al. 2011; Spring et al. 2011) – and can promote intrinsic motivation (White and Fantone 2010). It can also help mitigate bias conferred by rewarding learners whose enriched academic backgrounds have taught them to be quick studiers and punishing those learners who need more time to reach the same degree of competence, potentially improving the cohesion of diverse classes.

An assessment system that promotes self-regulation should intentionally avoid rewarding short-term learning or rote factual memorization achieved through inefficient learning strategies (e.g. cramming). Cramming, as opposed to studying consistently across time and focusing on understanding, leads to poorer academic performance (Bickerdike et al. 2016). Reducing incentives to memorize details to answer every test question correctly in order to earn an 'honours' grade can give students more time to prioritize learning the most important concepts and to recognize errors made. Preparing for long-term board exams and high-stakes assessments can also be framed as a long-term and clinical learning opportunity (Swan Sein et al.).

Tip 3
Create a culture of assessment that develops student professional identity as physicians

We believe the pre-clerkship assessment system and overall culture of assessment should be congruent with the professional norms of medicine by supporting learners' professional identity formation – their core values, moral principles, and self-awareness. However, conducting low- or intermediate-stakes examinations under high-security settings can convey a sense of distrust of learners. This is incongruent with the fact that students will soon have (or already have) access to sensitive patient information and responsibility for patient care. An honour code can further establish expectations and entrust medical students with core professional values related to honesty and integrity (Irby and Hamstra 2016). By utilizing an honour code to set clear and specific standards, together with expectation-setting discussions, students learn to behave professionally and to hold peers accountable to these standards (Ross et al. 2019).

New exam administration technologies allow for a shift from high-security, proctored examinations to un-proctored, off-site, and time-flexible examinations. The application of professional standards to assessment also opens up the possibility of adopting these exam administration strategies. Un-proctored and time-flexible exams can help to foster professional identity formation and professionalism (Ross et al. 2019). Time-flexible exams can induce students to make decisions about when they are prepared to demonstrate competence within a knowledge domain.
Tip 4
Utilise coaches to help students progress and develop academic success systems to promote learning and growth

Providing coaches or mentors for students to review performance, reflections, learning opportunities, and next steps is also important to facilitating student learning (van der Vleuten et al. 2015). Coaches can monitor student assessment performance over time to identify students who may benefit from academic support or enrichment. Viewing remediation as an academic support opportunity and establishing proactive and preventative support systems for students can guide learning and improvement, prevent future exam failures, and orient a school's culture toward learning (Guerrassio 2013; Chou et al. 2014; Kalet and Chou 2014).

Certain learning strategies can improve student learning, cognition, and affect, as well as performance on assessments; however, many students do not engage in these strategies instinctually (Hattie et al. 1996). Focusing on errors can be a powerful driver of learning (Metcalfe 2017). Learning specialists can help all students to use long-term learning strategies, including how to learn from assessments and practice questions (Swan Sein et al. 2020a), process feedback and improve on weaknesses, and model study strategies that utilize self-assessment to foster deeper learning, build time management skills, and promote self-testing (Swan Sein et al. 2020b).

Tip 5
Promote assessment as learning to drive student self-regulated learning

Assessment as learning intertwines learning and assessment and places ownership over learning in the role of the student (Dann 2014). The intention is for students to use metacognitive strategies to plan, monitor, and self-regulate learning. Successful medical students are self-regulated learners who plan, evaluate, and monitor their learning strategies (Artino et al. 2011). Assessment as learning tools, such as portfolios or reflection exercises, can help students to become metacognitive and self-regulate their learning. Using student portfolios for programmatic assessment can help compile multiple assessments, collect assessment evidence over time, and amass feedback from multiple evaluators. In combination with faculty coaching and reflective writing, portfolios can facilitate assessment as learning and can promote critical thinking, problem solving, and self-assessment to drive life-long learning behaviours (Gadbury-Amyot and Overman 2018). Students can then engage in deliberate practice to improve performance with immediate and useful feedback (Ericsson 2004).

Tip 6
Promote assessment for learning via frequent low-stakes assessment and feedback

Ensuring that an assessment structure provides students with frequent low-stakes assessment for learning and feedback is important. Learning occurs throughout all phases of the assessment cycle, which includes a pre-assessment, true assessment, and post-assessment phase. In the pre-assessment phase, learning about tests via exam blueprints or practice tests directs students about what to study and learn. In the true assessment phase, engaging in retrieval practice enhances the storage of information and the ability to retrieve it at a future time. In the post-assessment phase, students can learn from their mistakes when they receive feedback about their performance and review incorrect answers (Gielen et al. 2003; Schuwirth and Ash 2013).

Research on testing and assessment for learning has shown that repeated testing slows down the forgetting process (Larsen et al. 2008; Rowland 2014). Frequent low-stakes assessment is more likely than infrequent high-stakes assessment to promote learning (Vansteenkiste et al. 2004), reduce the achievement gaps among students from different socioeconomic classes (Pennebaker et al. 2013), promote long-term retention by motivating students to engage frequently with content, and discourage the practice of 'gorge and purge' (Kulasegaram and Rangachari 2018).

Weekly problem sets can be designed to promote learning and provide feedback to students. Low-stakes implicit-confidence testing can be used with multiple choice questions to help students recognize misconceptions (Klymkowsky et al. 2006) and learn from errors made with high confidence (Metcalfe 2017). Problem sets can also interleave content from across topics, to help students study and test knowledge on a mixture of topics they have learned, instead of focusing on just one topic for a long period, promoting the ability to apply knowledge in novel settings (Dunlosky et al. 2013). Problem sets of multiple choice and open-ended questions can be assigned and worked on early in a week, and students could then complete the problem set from memory at the end of the week to benefit from the testing effect. Providing students with high quality feedback on their performance, even if not frequently, is an important component of a programmatic assessment structure; narrative feedback can be provided on answers to open-ended questions, for example.
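For schools that assemble such problem sets programmatically, the interleaving step is straightforward to automate. The sketch below is illustrative only: the topic pools, question strings, and per-topic quota are hypothetical, not drawn from the article or any particular school's item bank. It samples a few questions from each recent topic and shuffles them so that no single topic appears as an uninterrupted block:

import random

# Hypothetical question bank keyed by topic tag; a real system would draw
# question objects from the school's item bank rather than plain strings.
QUESTION_BANK = {
    "cardiology": ["Why does aortic stenosis raise afterload?", "What if preload falls?"],
    "renal": ["How does ADH alter urine osmolality?", "Why does GFR fall in shock?"],
    "biochemistry": ["Trace glycolysis under anaerobic conditions.", "Why does B12 deficiency raise MMA?"],
}

def build_weekly_set(bank, topics, per_topic=2, seed=None):
    """Sample a few questions per topic, then shuffle so the set
    interleaves topics rather than presenting them in blocks."""
    rng = random.Random(seed)
    drawn = []
    for topic in topics:
        pool = bank[topic]
        drawn.extend((topic, q) for q in rng.sample(pool, min(per_topic, len(pool))))
    rng.shuffle(drawn)  # the interleaving step
    return drawn

# Mix the current week's topic with two earlier ones for spaced retrieval.
for topic, question in build_weekly_set(QUESTION_BANK, ["biochemistry", "cardiology", "renal"], seed=7):
    print(f"[{topic}] {question}")

Re-administering the same drawn set from memory at the end of the week, as described above, only requires storing the list and withholding the answer key on second delivery.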
Tip 7
Use cumulative assessment to encourage student retrieval and application of information over time

On the basis of the principle of spaced practice, an assessment system should directly or indirectly expose students to content cumulatively and repeatedly. Cumulative assessments, which include recent content and content learned throughout a program, promote student learning and long-term retention, in part because students work to keep content available over time if they expect a cumulative exam (Szpunar et al. 2007; Larsen et al. 2008; Wrigley et al. 2012). Notably, 'it is often only when durability of knowledge is tested in the longer run that the student may perceive a problem with their study performance' (Kalet and Chou 2014, p. 42). Non-cumulative exam feedback can give students an 'illusion of competence' because it is not known how well this understanding will be retained for the long term (Koriat and Bjork 2005). Therefore, it is important for content to appear on multiple exams over time in a variety of forms.

Progress testing, which provides students with longitudinal assessment opportunities through repeated comprehensive exams over time, such as giving multiple practice board exams during the preclinical years, is associated with improvements in knowledge retention (Johnson et al. 2014). Students study more continuously for progress exams and build a better knowledge base, leading to minimized use of test-driven learning strategies (Norman et al. 2010; Schuwirth and van der Vleuten 2012). Schools could consider implementing regular cumulative examinations, or integrating questions from previous topics into exams throughout the curriculum, to enhance learning and promote self-regulation of learning.
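To make the cumulative principle concrete, an exam blueprint can reserve a fixed share of each exam's items for earlier blocks. This is a minimal sketch under assumed parameters: the block names and the 30% cumulative share are our invention for illustration, not a recommendation from the literature cited above.

def cumulative_blueprint(current_block, past_blocks, n_items, past_share=0.3):
    """Split an exam's item count between the current block and all
    earlier blocks, spreading the cumulative share evenly across the past."""
    n_past = round(n_items * past_share) if past_blocks else 0
    counts = {current_block: n_items - n_past}
    base, extra = divmod(n_past, len(past_blocks)) if past_blocks else (0, 0)
    for i, block in enumerate(past_blocks):
        counts[block] = base + (1 if i < extra else 0)  # distribute the remainder
    return counts

# A 50-item exam during a hypothetical renal block that revisits earlier blocks:
print(cumulative_blueprint("renal", ["cardiology", "pulmonology"], n_items=50))
# prints {'renal': 35, 'cardiology': 8, 'pulmonology': 7}

The same weighting could instead be applied per learning objective rather than per block, depending on how a school's blueprint is organized.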
Tip 8
Use a mix of assessment question types to challenge students to employ a variety of studying strategies

It is well established that retrieval practice causes a testing effect and can lead to high quality learning (Larsen et al. 2008, 2013; Larsen and Butler 2013; Green et al. 2018). Engaging in answering questions can serve as a 'desirable difficulty', as questions challenge students to engage in retrieval practice and apply their learning (Marsh and Butler 2013). Multiple choice questions can also promote learning and conceptual understanding. Students who carefully evaluate and generate reasons for why each answer choice is correct or incorrect will engage in the type of deep processing that promotes retention of both tested and untested information (Little and Bjork 2015).

Students learn well when an exam helps them to engage in retrieval practice via answering deep questions and causes them not only to study and learn facts but also to engage in deeper reasoning and elaboration. Questions that promote elaboration, including generation and explanation questions, enhance understanding and long-term retention (Larsen and Butler 2013; van der Vleuten and Driessen 2014). Questions that challenge students to answer why, how, and what-if promote deeper comprehension and levels of learning (Craig et al. 2006). Non-MCQ questions are not trivial to grade, but can be developed and incorporated over time into exams (Bierer et al. 2018). New exam delivery software can facilitate rapid grading of open-ended questions on graded assessments. Students can be provided with worked examples of question answers for learning purposes.

Tip 9
Use complex questions to integrate across domains of knowledge

Students who view basic science as important for their current and future goals will demonstrate greater persistence, employ strategies for deeper learning, and may show increased achievement (Artino et al. 2010). Assessment questions can aid in developing students' perceived value of different domains, such as basic science, social and behavioural sciences, health systems sciences, and humanities, and should be relevant and motivate students to answer them thoughtfully. Growing education evidence highlights the utility of helping learners to integrate basic and clinical science domains of knowledge in their minds (Baghdady et al. 2013; Bandiera et al. 2018), by helping them make conceptual connections across domains. Providing students with clinically oriented practice questions in question banks can help anchor learning and motivate students to learn and apply their knowledge through clinical reasoning.

A number of assessment question types can be used to assess integrated learning. These include reflection questions that elicit connections between related information. Multiple choice self-assessment questions and concept appraisal questions can ask for the mechanisms behind a clinical scenario's findings. Clinical reasoning exercises where students write a paragraph describing a patient's problem and associated mechanisms can also help integrate learning. A diagnostic justification exercise where students suggest a differential diagnosis and the rationale for a patient's problems can also be used (Brauer and Ferguson 2015). Educators can begin by creating low-stakes assessments with these types of questions to guide student learning and comfort with this type of question/assessment activity.

Tip 10
Leverage authentic learning practices within the cognitive knowledge assessment system

Physicians need to be able and willing to work and learn as a team, develop communication and peer-assessment skills, and reference and apply new knowledge (Hodges 2013; ten Cate and Chen 2016). Incorporating authentic social and cognitive practices in the assessment system promotes these skills, such as via the social practice of collaborative examinations. Collaborative learning can lead to improved academic performance, interpersonal interactions, perceptions of social support, self-esteem, and retention in academic programs, compared to individual work, as well as compared to competitive learning paradigms (Hmelo-Silver et al. 2013). Low-stakes group assessments, such as Team-Based Learning 'Group Readiness Assurance Tests' or anatomy group-based lab exams, can introduce collaborative practice and stimulate an active learning process (Pluta et al. 2013).

An authentic cognitive practice in assessment can be taking open-book examinations. Research shows benefits from both open-book and closed-book exams (Durning et al. 2016). Completing closed-book quizzes on conceptual questions can lead to less forgetting over time than open-book quizzes because students practice recalling information and use the quiz as a retrieval practice exercise (Agarwal et al. 2008). Preparing for an open-book test may encourage students to focus on understanding the important concepts and content, rather than memorizing details. Creating a 'crib-sheet' to prepare for an exam may help students organize knowledge, identify which knowledge is most important, which should be memorized, and which can or should be looked up. In the future, students may be provided with additional resources during board exams, such as biological or metabolic pathways or immunization schedules, to assess their ability to apply or synthesize learning, and not just memorize details.
Tip 11
Create non-punitive exam retake processes that promote learning

In some programmatic assessment structures, exam retakes are not utilised because multiple assessments over time are used to make high-stakes decisions about student progression, making retakes unnecessary from a student evaluation standpoint. Schools may choose to use exam retakes so that students identify and learn from errors. By having retakes be a normal and non-punitive part of the assessment system, institutions can begin to change the learning culture so students are accountable for mastering the knowledge, skills, and attitudes necessary to achieve competence. A fair re-assessment should provide students with appropriate remediation and the opportunity to demonstrate successful learning in previously identified areas of deficiency.

Re-assessment can be accomplished via a number of different formats, including an assessment customized to a student's particular areas of difficulty (Hauer et al. 2009; Hawthorne et al. 2014). One practical retake approach is to have students retake the same exam that they have not mastered, but to also require them to provide explanations of the answers to each question, to demonstrate that they have moved beyond memorizing the correct answer, and to help them learn important concepts. During re-grading, students can be given credit only if the question explanation is also correct.

The best retake timing likely depends on the quantity and types of errors made. Retakes can happen close to a failure, after engagement in a subsequent learning activity, and/or when a student feels prepared. Advantages of quick retakes include learning material needed for the next block of the curriculum and not falling behind. Remediation during another course might distract from learning new content. Students with major conceptual or fund-of-knowledge deficits may benefit from dedicated study time and additional support for learning.

Tip 12
Utilise exam feedback data for student learning and ultimately for high-stakes decisions

Giving students exam feedback can have more learning impact than many instructional strategies (Hattie and Timperley 2007). Learning and long-term retention are also enhanced by appropriately timed feedback (Black and Wiliam 2006; Harks et al. 2014; van de Ridder et al. 2015). Rapid feedback on high-stakes assessments may have a greater impact on learning since students are often not able to take the test again (Hattie and Timperley 2007); wrong answers may be learned and reinforced if immediate feedback is not given and utilized by students (Butler and Roediger 2008). With the advent of electronic testing systems, exams can easily be reviewed by students, with a variety of security options available, such as during post-exam review sessions. Feedback can be facilitated by scheduling post-exam review sessions either immediately or shortly following exams.

Delayed feedback on low-stakes assessments can help students spend time working through a problem until they find the correct answer (Butler et al. 2007; Mullet et al. 2014). There is also evidence that narrative and explanation feedback is superior to simple correct-answer feedback (Black and Wiliam 2006). Students can also be provided with electronic question explanations, or with worked exemplar responses once they work to answer questions independently. Feedback that reveals patterns in students' weaknesses, that focuses on specific areas of improvement, and that details how good performance is achieved will promote self-regulated learning (Nicol and Macfarlane-Dick 2006).

Leveraging emerging assessment technologies, such as tagging questions from intermediate-stakes assessments with topic keywords, can allow schools to develop dashboards that track student progress towards specific benchmarks; students can identify potential content gaps and use them to inform their self-directed learning. Student coaches or learning specialists can meet with students to review and make use of these data.
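As a rough sketch of the aggregation such a dashboard performs (everything here is hypothetical: the item-level records, topic tags, and the 80% benchmark are our illustration, and a real implementation would read exports from a school's assessment platform):

from collections import defaultdict

# Hypothetical item-level results: (student, topic tag, answered correctly?)
RESULTS = [
    ("student_a", "endocrine", True),
    ("student_a", "endocrine", False),
    ("student_a", "renal", True),
    ("student_b", "endocrine", True),
]

def topic_dashboard(rows, benchmark=0.80):
    """Roll item-level results up to per-student, per-topic proportions
    and flag topics below the benchmark as potential content gaps."""
    tally = defaultdict(lambda: [0, 0])  # (student, topic) -> [n_correct, n_items]
    for student, topic, correct in rows:
        cell = tally[(student, topic)]
        cell[0] += int(correct)
        cell[1] += 1
    report = defaultdict(dict)
    for (student, topic), (right, total) in tally.items():
        share = right / total
        report[student][topic] = (share, share >= benchmark)
    return report

for student, topics in sorted(topic_dashboard(RESULTS).items()):
    for topic, (share, on_track) in sorted(topics.items()):
        print(student, topic, f"{share:.0%}", "on track" if on_track else "potential gap")

A coach-facing view would simply filter for the flagged rows; the same roll-up, aggregated across students, can feed the curriculum evaluation uses discussed next.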
As programmatic assessment systems rely on using multiple data points from assessments collected over time for high-stakes decisions, using assessment technologies to compile data for student progression committee reviews is also important. Assessment data can and should also be used for curricular and program evaluation and improvement purposes (van der Vleuten et al. 2015).

Conclusion

A well-designed assessment program can have many benefits for learning (Roediger et al. 2011), but developing and implementing evidence-based assessment systems is an area in need of further consideration and study (Norcini et al. 2018; Schuwirth and van der Vleuten 2019). These tips demonstrate that assessment systems can optimize learning when: (1) learners enter into and promote an environment of trust, (2) schools set thresholds for competency and allow learners the opportunity to retake exams until they have reached competency, (3) schools promote self-regulated learning with ample feedback and resources, (4) learners and schools collaboratively focus on learning and reducing the stigma of 'failure' – some content requires more time or effort, depending on background and learning strategies, (5) assessments focus on clinical reasoning, not only recall of facts, and (6) schools offer cumulative assessments so students do not forget what they learned before.

We have provided tips in order to 'disrupt thinking' and help schools to develop a program of assessment that generates positive learning behaviours so students are better equipped to transfer and apply knowledge during patient care activities. A culture change is likely necessary for all administrators, faculty members, and students to embrace assessment as, most importantly, a learning tool for students and programs. Educational settings are complex; ultimately, schools need to balance what is best for long-term student learning, what is best for student well-being, what is feasible, and what works with a school's learning culture.

Acknowledgements

The authors would like to thank the members of the Society of the Directors of Research in Medical Education, the International Association of Medical Science Educators, the Northeast Group on Education Affairs, the Medical Education Learning Specialists, the American Dental Education Association groups, and the article reviewers for guidance in the development of these tips and advice.

Disclosure statement

The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

Notes on contributors

Aubrie Swan Sein, PhD, EdM, is Director of the Center for Education Research and Evaluation and Assistant Professor of Education Assessment in Pediatrics and Dental Medicine at Columbia Vagelos College of Physicians and Surgeons, New York, NY.

Hanin Rashid, PhD, EdM, is Associate Director for the Office for Advancing Learning, Teaching, and Assessment, a Learning Specialist in the Cognitive Skills Program, and Assistant Professor of Psychiatry at Rutgers Robert Wood Johnson Medical School, Piscataway, NJ.

Jennifer Meka, PhD, EdM, is Director of the Medical Education and Educational Research Institute, Assistant Dean for Medical Education, and Assistant Professor of Medicine at The Jacobs School of Medicine and Biomedical Sciences at The State University of New York (SUNY) at Buffalo, Buffalo, NY.

Jonathan Amiel, MD, is co-Interim Vice Dean for Education and Senior Associate Dean for Curricular Affairs, and Associate Professor of Psychiatry, Columbia Vagelos College of Physicians and Surgeons, New York, NY.

William Pluta, PhD, EdM, is Director of UME Evaluation at the Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA.

References

Agarwal PK, Karpicke JD, Kang SHK, Roediger HL, McDermott KB. 2008. Examining the testing effect with open- and closed-book tests. Appl Cognit Psychol. 22(7):861–876.
Artino A, Hemmer P, Durning S. 2011. Using self-regulated learning theory to understand the beliefs, emotions, and behaviors of struggling medical students. Acad Med. 86(10 Suppl):S35–S38.
Artino A, La Rochelle J, Durning S. 2010. Second-year medical students' motivational beliefs, emotions, and achievement. Med Educ. 44(12):1203–1212.
Baghdady MT, Carnahan H, Lam EW, Woods NN. 2013. Integration of basic sciences and clinical sciences in oral radiology education for dental students. J Dent Educ. 77(6):757–763.
Bandiera G, Kuper A, Mylopoulos M, Whitehead C, Ruetalo M, Kulasegaram K, Woods NN. 2018. Back from basics: integration of science and practice in medical education. Med Educ. 52(1):78–85.
Bickerdike A, O'Deasmhunaigh C, O'Flynn S, O'Tuathaigh C. 2016. Learning strategies, study habits and social networking activity of undergraduate medical students. Int J Med Educ. 7:230–236.
Bierer SB, Colbert CY, Foshee CM, French JC, Pien LC. 2018. Tool for diagnosing gaps within a competency-based assessment system. Acad Med. 93(3):512.
Black P, Wiliam D. 2006. Assessment and classroom learning. Assess Educ Princ Policy Pract. 5(1):7–74.
Brauer DG, Ferguson KJ. 2015. The integrated curriculum in medical education: AMEE Guide No. 96. Med Teach. 37(4):312–322.
Butler A, Karpicke J, Roediger H. 2007. The effect of type and timing of feedback on learning from multiple-choice tests. J Exp Psychol Appl. 13(4):273–281.
Butler A, Roediger H. 2008. Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing. Mem Cognit. 36(3):604–616.
Chou CL, Kalet A, Hauer KE. 2014. A research agenda for remediation in medical education. In: Remediation in medical education. New York (NY): Springer Books; p. 339–348.
Craig SD, Sullins J, Witherspoon A, Gholson B. 2006. The deep-level-reasoning-question effect: the role of dialogue and deep-level-reasoning questions during vicarious learning. Cogn Instr. 24(4):565–591.
Dann R. 2014. Assessment as learning: blurring the boundaries of assessment and learning for theory, policy and practice. Assess Educ Princ Policy Pract. 21(2):149–166.
Dunlosky J, Rawson KA, Marsh EJ, Nathan MJ, Willingham DT. 2013. Improving students' learning with effective learning techniques: promising directions from cognitive and educational psychology. Psychol Sci Public Interest. 14(1):4–58.
Durning SJ, Dong T, Ratcliffe T, Schuwirth L, Artino AR Jr., Boulet JR, Eva K. 2016. Comparing open-book and closed-book examinations: a systematic review. Acad Med. 91(4):583–599.
Dweck C. 2006. Mindset: the new psychology of success. New York (NY): Random House.
Epstein RM. 2007. Assessment in medical education. N Engl J Med. 356(4):387–396.
Ericsson KA. 2004. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 79(10 Suppl):S70–S81.
Eva KW, Bordage G, Campbell C, Galbraith R, Ginsburg S, Holmboe E, Regehr G. 2016. Towards a program of assessment for health professionals: from training into practice. Adv Health Sci Educ Theory Pract. 21(4):897–913.
Gadbury-Amyot CC, Overman PR. 2018. Implementation of portfolios as a programmatic global assessment measure in dental education. J Dent Educ. 82(6):557–564.
Gielen S, Dochy F, Dierick S. 2003. Evaluating the consequential validity of new modes of assessment: the influence of assessment on learning, including pre-, post-, and true assessment effects. In: Segers M, Dochy F, Cascallar E, editors. Optimising new modes of assessment: in search of qualities and standards. New York (NY): Kluwer Academic Publishers.
Green ML, Moeller JJ, Spak JM. 2018. Test-enhanced learning in health professions education: a systematic review: BEME Guide No. 48. Med Teach. 40(4):337–350.
Guerrassio J. 2013. Remediation of the struggling medical learner. Irwin (PA): Association for Hospital Medical Education.
Harks B, Rakoczy K, Hattie J, Besser M, Klieme E. 2014. The effects of feedback on achievement, interest and self-evaluation: the role of feedback's perceived usefulness. Educ Psychol. 34(3):269–290.
Hattie J, Biggs J, Purdie N. 1996. Effects of learning skills interventions on student learning: a meta-analysis. Rev Educ Res. 66(2):99–136.
Hattie J, Timperley H. 2007. The power of feedback. Rev Educ Res. 77(1):81–112.
Hauer KE, Ciccone A, Henzel TR, Katsufrakis P, Miller SH, Norcross WA, Papadakis MA, Irby DM. 2009. Remediation of the deficiencies of physicians across the continuum from medical school to practice: a thematic review of the literature. Acad Med. 84(12):1822–1832.
Hawthorne MR, Chretien KC, Torre D, Chheda SG. 2014. Re-demonstration without remediation – a missed opportunity? A national survey of internal medicine clerkship directors. Med Educ Online. 19(1):25991.
Heeneman S, Oudkerk Pool A, Schuwirth LW, van der Vleuten CP, Driessen EW. 2015. The impact of programmatic assessment on student learning: theory versus practice. Med Educ. 49(5):487–498.
Hmelo-Silver CE, Chinn CA, O'Donnell AM, Chan C. 2013. The international handbook of collaborative learning. New York (NY): Routledge.
Hodges B. 2013. Assessment in the post-psychometric era: learning to love the subjective and collective. Med Teach. 35(7):564–568.
Irby DM, Hamstra SJ. 2016. Parting the clouds: three professionalism frameworks in medical education. Acad Med. 91(12):1606–1611.
Johnson TR, Khalil MK, Peppler RD, Davey DD, Kibble JD. 2014. Use of the NBME Comprehensive Basic Science Examination as a progress test in the preclerkship curriculum of a new medical school. Adv Physiol Educ. 38(4):315–320.
Kalet A, Chou C. 2014. Remediation in medical education: a mid-course correction. New York (NY): Springer Books.
Klymkowsky MW, Taylor LB, Spindler SR, Garvin-Doxas RK. 2006. Two-dimensional, implicit confidence tests as a tool for recognizing student misconceptions. J Coll Sci Teach. 36(3):44–48.
Koriat A, Bjork RA. 2005. Illusions of competence in monitoring one's knowledge during study. J Exp Psychol Learn Mem Cogn. 31(2):187–194.
Kulasegaram K, Rangachari PK. 2018. Beyond "formative": assessments to enrich student learning. Adv Physiol Educ. 42(1):5–14.
Larsen D, Butler A. 2013. Chapter 38: test-enhanced learning. In: Walsh K, editor. Oxford textbook of medical education. Oxford (UK): Oxford University Press; p. 443–452.
Larsen DP, Butler AC, Roediger HL. 2008. Test-enhanced learning in medical education. Med Educ. 42(10):959–966.
Larsen DP, Butler AC, Roediger HL. 2013. Comparative effects of test-enhanced learning and self-explanation on long-term retention. Med Educ. 47(7):674–682.
Little J, Bjork EL. 2015. Optimizing multiple-choice tests as tools for learning. Mem Cognit. 43(1):14–26.
Marsh EJ, Butler AC. 2013. Memory in educational settings. In: Reisberg D, editor. The Oxford handbook of cognitive psychology. New York (NY): Oxford University Press; p. 299–317.
Metcalfe J. 2017. Learning from errors. Annu Rev Psychol. 68:465–489.
Mullet HG, Butler AC, Verdin B, von Borries R, Marsh EJ. 2014. Delaying feedback promotes transfer of knowledge despite student preferences to receive feedback immediately. J Appl Res Mem Cogn. 3(3):222–229.
Nicol DJ, Macfarlane-Dick D. 2006. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud Higher Educ. 31(2):199–218.
Norcini J, Anderson MB, Bollela V, Burch V, Costa MJ, Duvivier R, Hays R, Palacios Mackay MF, Roberts T, Swanson D. 2018. 2018 Consensus framework for good assessment. Med Teach. 40(11):1102–1109.
Norman G, Neville A, Blake JM, Mueller B. 2010. Assessment steers learning down the right road: impact of progress testing on licensing examination performance. Med Teach. 32(6):496–499.
Pennebaker JW, Gosling SD, Ferrell JD. 2013. Daily online testing in large classes: boosting college performance while reducing achievement gaps. PLoS One. 8(11):e79774.
Pluta WJ, Richards BF, Mutnick A. 2013. PBL and beyond: trends in collaborative learning. Teach Learn Med. 25(Suppl 1):S9–S16.
Reed DA, Shanafelt TD, Satele DW, Power DV, Eacker A, Harper W, Moutier C, Durning S, Massie FS Jr., Thomas MR. 2011. Relationship of pass/fail grading and curriculum structure with well-being among preclinical medical students: a multi-institutional study. Acad Med. 86(11):1367–1373.
Roediger HL, Putnam AL, Smith MA. 2011. Ten benefits of testing and their applications to educational practice. Psychol Learn Motiv Cogn Educ. 55:1–36.
Ross PT, Keeley MG, Mangrulkar RS, Karani R, Gliatto P, Santen SA. 2019. Developing professionalism and professional identity through unproctored, flexible testing. Acad Med. 94(4):490–495.
Rowland CA. 2014. The effect of testing versus restudy on retention: a meta-analytic review of the testing effect. Psychol Bull. 140(6):1432–1463.
Schuwirth L, Ash J. 2013. Chapter 35: principles of assessment. In: Walsh K, editor. Oxford textbook of medical education. Oxford (UK): Oxford University Press; p. 409–420.
Schuwirth L, van der Vleuten C. 2011. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach. 33(6):478–485.
Schuwirth L, van der Vleuten C. 2012. The use of progress testing. Perspect Med Educ. 1(1):24–30.
Schuwirth L, van der Vleuten C. 2019. How 'testing' has become 'programmatic assessment for learning'. Health Prof Educ. 5(3):177–184.
Spring L, Robillard D, Gehlbach L, Simas TA. 2011. Impact of pass/fail grading on medical students' well-being and academic outcomes. Med Educ. 45(9):867–877.
Swan Sein A, Cuffney F, Clinchot D. 2020a. How to help students strategically prepare for the MCAT exam and learn foundational knowledge needed for medical school. Acad Med. 95(3):484.
Swan Sein A, Daniel M, Fleming A, Morrison G, Christner J, Esposito K, Pock A, O'Connor Grochowski C, Dalrymple J, Santen S. 2020b. Identifying and supporting struggling students to prevent failures when the United States Medical Licensing Examination (USMLE) Step 1 is moved to after clerkships. Acad Med. [accessed 2020 March 3]. https://doi.org/10.1097/ACM.0000000000003272
Szpunar KK, McDermott KB, Roediger HL 3rd. 2007. Expectation of a final cumulative test enhances long-term retention. Mem Cognit. 35(5):1007–1013.
ten Cate O, Chen H. 2016. The parts, the sum and the whole – evaluating students in teams. Med Teach. 38(7):639–641.
van de Ridder JM, McGaghie WC, Stokking KM, ten Cate OT. 2015. Variables that affect the process and outcome of feedback, relevant for medical training: a meta-review. Med Educ. 49(7):658–673.
van der Vleuten C. 2016. Revisiting 'Assessing professional competence: from methods to programmes'. Med Educ. 50(9):885–888.
van der Vleuten CP, Driessen EW. 2014. What would happen to education if we take education evidence seriously? Perspect Med Educ. 3(3):222–232.
van der Vleuten C, Schuwirth L, Driessen E, Govaerts M, Heeneman S. 2015. Twelve tips for programmatic assessment. Med Teach. 37(7):641–646.
Vansteenkiste M, Simons J, Lens W, Sheldon KM, Deci EL. 2004. Motivating learning, performance, and persistence: the synergistic effects of intrinsic goal contents and autonomy-supportive contexts. J Pers Soc Psychol. 87(2):246–260.
White CB, Fantone JC. 2010. Pass-fail grading: laying the foundation for self-regulated learning. Adv Health Sci Educ Theory Pract. 15(4):469–477.
Wilkinson TJ, Tweed MJ. 2018. Deconstructing programmatic assessment. Adv Med Educ Pract. 9:191–197.
Wrigley W, van der Vleuten CP, Freeman A, Muijtjens A. 2012. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 34(9):683–697.
