Modernizing Learning - Building The Future Learning Ecosystem
The findings, interpretations, and conclusions expressed in this work do not necessarily re-
flect the views of the Department of Defense, U.S. Government, or other governmental enti-
ties. This work is available under the Creative Commons Attribution 4.0 license (CC BY 4.0
IGO) https://creativecommons.org/licenses/by/4.0. Under this license, you are free to copy,
distribute, transmit, and adapt this work, including for commercial purposes, under the fol-
lowing conditions:
Attribution—Please cite the work as follows: Walcutt, J.J. & Schatz, S. (Eds.) (2019).
Modernizing Learning: Building the Future Learning Ecosystem. Washington, DC:
Government Publishing Office. License: Creative Commons Attribution CC BY 4.0 IGO
Adaptations—If you create an adaptation of this work, please add the following disclaimer
along with the attribution: This is an adaptation of an original work by the Advanced Dis-
tributed Learning (ADL) Initiative, part of the Office of the Deputy Assistant Secretary of
Defense for Force Education and Training. Views and opinions expressed in the adaptation
are the sole responsibility of the author or authors of the adaptation and are not endorsed by
the U.S. Government.
ePub format
• eBook GPO Stock Number: 008-300-00197-2
• eBook ISBN: 978-0-16-095091-9
PDF format
• PDF GPO Stock Number: 008-300-00198-1
• PDF ISBN: 978-0-16-095092-6
Print format
• Print GPO Stock Number: 008-000-01329-2
• Print ISBN: 978-0-16-095088-9
Acknowledgments
The research for and publication of this book were sponsored by the Advanced Distributed
Learning (ADL) Initiative, a research and development program reporting to the Office of
the Deputy Assistant Secretary of Defense for Force Education and Training, part of the U.S.
Department of Defense.
Editorial Board
Xiangen Hu, Ph.D., Professor, The University of Memphis
Van Brewer, Ph.D., External R&D Principal (Contractor), ADL Initiative
Jody Cockroft, Research Specialist, The University of Memphis
Katie Flinn, Project Analyst (Contractor), ADL Initiative
Education
Ken Wagner, Ph.D., Education Commissioner, Rhode Island Department of Education
Daniel French, Secretary of Education, Vermont Agency of Education
Nathan Oakley, Ph.D., Chief Academic Officer, Mississippi Department of Education
Keith Osburn, Ed.D., Assoc. Superintendent, Georgia Virtual Learning, Georgia DoEd
Kimberly Eckert, Teacher, Brusly High, Louisiana State Teacher of the Year
Michelle Cottrell-Williams, Teacher, Wakefield High, Virginia State Teacher of the Year
Sandra Maldonado-Ross, President, Seminole Education Association, Florida
Sue Carson, Former President, Seminole Education Association, Florida
Government
Heidi Schweingruber, Ph.D., Director, Board on Science Ed., National Research Council
Suzanne Logan, Ed.D., Director, Center for Leadership Dev., Federal Executive Institute
Reese Madsen, SES, Senior Advisor to the U.S. Chief Human Capital Officers Council
Edward Metz, Ph.D., Research Scientist, U.S. Dept. of Education; Projects that Work
Erin Higgins, Ph.D., Research Analyst, U.S. Department of Education
Government (Cont.)
Pam Frugoli, Work Analyst, O*NET Competency Model, U.S. Department of Labor
Doug Tharp, Senior Learning Project Manager, Nuclear Regulatory Commission
Andrew Brooks, Ph.D., Chief Data Scientist, National Geospatial-Intelligence Agency
Military
Fred Drummond, SES, Deputy Asst. Secretary of Defense for Force Education & Training
VADM Alfred Harms, Jr., USN (Ret.), Lake Highland Prep School; UCF
Gladys Brignoni, Ph.D., SES, Deputy Commander, FORCECOM, U.S. Coast Guard
Lt. Gen. Thomas Baptiste, USAF (Ret.), President, National Center for Simulation
Maj. Gen. Thomas Deale, USAF (Ret.), Former Vice Director, Joint Force Development
RADM James Robb, USN (Ret.), President, National Training and Simulation Assoc.
Morgan Plummer, Director, MD5, U.S. Department of Defense
Ralucca Gera, Ph.D., Associate Provost and Professor, Naval Postgraduate School
LTC Michelle Isenhour, Ph.D., Assistant Professor, Naval Postgraduate School
Dennis Mills, Program Analyst, Naval Education and Training Command
Kendy Vierling, Ph.D., Director, Future Learning Group, USMC Training & Edu Command
Larry Smith, Technical Director, USMC College of Distance Education and Training
Non-Profit Organizations
Bror Saxberg, Ph.D., M.D., VP, Learning Scientist, Chan Zuckerberg Initiative
Russel Shilling, Ph.D., Chief Scientific Officer, American Psychological Association
Jason Tyszko, VP, Center for Education and Workforce, U.S. Chamber of Commerce
Elliott Masie, Founder, The MASIE Center
Amber Garrison Duncan, Ph.D., Strategy Director, Lumina Foundation
Emily Musil Church, Ph.D., Executive Director of the Global Learning XPRIZE
Betty Lou Leaver, Ph.D., Director, The Literary Center
Jeffrey Borden, Ed.D., Executive Director, Inter-Connected Education
Jeanne Kitchens, Credential Engine and Southern Illinois University
Industry
John Landwehr, Vice President and Public Sector Chief Technical Officer, Adobe
Phill Miller, Chief Learning and Innovation Officer, Blackboard
Shantanu Sinha, Director, Product Management, Google
Michelle Barrett, Ph.D., VP of Research Technology, Data Science, and Analytics, ACT
Anne Little, Ph.D., Vice President, Training Solutions Development
Stacey Poll, U.S. Public Sector Business Development Manager, Questionmark
Michael Freeman, Consultant, Training and Learning Technologies
Michael Smith, Senior Technical Specialist, ICF
CONTENTS
FOUNDATIONS
01 Modernizing Learning..................................................................... 3
J.J. Walcutt, Ph.D., Director of Innovation, ADL Initiative
Sae Schatz, Ph.D., Director, ADL Initiative, Office of the Deputy Assistant
Secretary of Defense for Force Education and Training
04 Lifelong Learning........................................................................... 61
J.J. Walcutt, Ph.D., Director of Innovation, ADL Initiative
Naomi Malone, Ph.D., Research Scientist (Contractor), ADL Initiative
TECHNOLOGY
06 Interoperability............................................................................ 107
Brent Smith, R&D Principal (Contractor), ADL Initiative
Prasad Ram, Ph.D., Founder and CEO, Gooru
10 Personalization.............................................................................181
Jeremiah Folsom-Kovarik, Ph.D., Lead Scientist, Soar Technology, Inc.
Dar-Wei Chen, Ph.D., Research Scientist, Soar Technology, Inc.
Behrooz Mostafavi, Ph.D., Research Scientist, Soar Technology, Inc.
Michael Freed, Ph.D., Consultant, Reperio
LEARNING SCIENCE
11 Assessment and Feedback......................................................... 203
Debra Abbott, Ph.D., Metis Solutions contractor, Joint Special Operations
University, U.S. Special Operations Command
ORGANIZATION
16 Instructional Designers and Learning Engineers..................... 301
Dina Kurzweil, Ph.D., Director, Education and Technology Innovation Support Office,
Uniformed Services University of the Health Sciences, U.S. Department of Defense
Karen Marcellas, Ph.D., Instructional Design Team Lead, Education and Technology
Innovation Support Office, Uniformed Services University
Endnotes................................................................................................. 388
ACRONYMS
ADDIE Analyze, Design, Develop, Implement, and Evaluate
ADKAR Awareness, Desire, Knowledge, Ability, Reinforcement
ADL Advanced Distributed Learning
AI Artificial Intelligence
API Application Programming Interface
AR Augmented Reality
ASVAB Armed Services Vocational Aptitude Battery
BYOD Bring Your Own Device
cMOOC Connectivist Massive Open Online Course
CORDRA Content Object Repository Discovery and Registration/Resolution Architecture
DARPA Defense Advanced Research Projects Agency
DHS Department of Homeland Security
DIS Distributed Interactive Simulation
DoD Department of Defense
EEG Electroencephalogram
EMT Emergency Medical Technician
ESSA Every Student Succeeds Act
FAA Federal Aviation Administration
FATE Fairness, Accountability, Transparency, and Ethics
FERPA Family Educational Rights and Privacy Act
FM Field Manual
fMRI Functional Magnetic Resonance Imaging
FYI For Your Information
GIFT Generalized Intelligent Framework for Tutoring
HLA High-Level Architecture
HR Human Resources
HSI Human–Systems Integration
HTML Hypertext Markup Language
HTTP Hypertext Transfer Protocol
I/ITSEC Interservice/Industry Training, Simulation and Education Conference
ICAP Interactive, Constructive, Active, and Passive
ICICLE Industry Connections Industry Consortium on Learning Engineering
IDS Intrusion Detection System
IEC International Electrotechnical Commission
IEEE Institute of Electrical and Electronics Engineers
IEEE-SA IEEE Standards Association
InKD Industrial Knowledge Design
IoT Internet of Things
IPS Intrusion Prevention System
ISD Instructional Systems Design
ISO International Organization for Standardization
IT Information Technology
K-12 Kindergarten through 12th Grade
KPI Key Performance Indicator
LMS Learning Management System
LOM Learning Object Metadata
LRMI Learning Resource Metadata Initiative
LRS Learning Record Store
LX Learning Experience
LXD Learning Experience Design
MERLOT Multimedia Education Resource for Learning and Online Teaching
MOOC Massive Open Online Course
MSSP Managed Security Services Provider
NASA National Aeronautics and Space Administration
NGO Non-Governmental Organization
NYCRR New York Codes, Rules and Regulations
OECD Organisation for Economic Co-operation and Development
OER Open Educational Resources
OPM Office of Personnel Management
PERLS PERvasive Learning System
PII Personally Identifiable Information
PLATO Programmed Logic for Automatic Teaching Operations
R&D Research and Development
RFID Radio Frequency Identification
ROI Return on Investment
SaaS Software as a Service
SAKI Self-Adaptive Keyboard Instructor
SAMR Substitution Augmentation Modification Redefinition
SAT Scholastic Assessment Test
SCORM Sharable Content Object Reference Model
SIEM Security Incident and Event Management
SOC Security Operations Center
STEM Science, Technology, Engineering, and Mathematics
TAPAS Tailored Adaptive Personality Assessment System
TECOM Training and Education Command (part of the U.S. Marine Corps)
TED Technology, Entertainment, and Design
UI/UX User Interface / User Experience
VR Virtual Reality
xAPI Experience Application Programming Interface
XML Extensible Markup Language
xMOOC Extended Massive Open Online Course
Foundations
CHAPTER 1
MODERNIZING LEARNING
J.J. Walcutt, Ph.D. and Sae Schatz, Ph.D.
Emerging technologies are not only changing the formal education and train-
ing landscape, they’re also changing our access to—and relationship with—
information and, by extension, affecting the soul of how we think, interact,
develop, and work. Our expectations for educational institutions, how and
where learning occurs, and what personal development looks like have
changed—and will continue to evolve into the future. The preK–12 system,
higher education, federal and state governments, employers, and military
must similarly adapt to accommodate these changes.
The landscape of learning has broadened, now encompassing the full spec-
trum of formal, informal, and experiential training, education, and develop-
ment. The traditional concept of education is changing.
Employers are placing
less value on formal degrees. Instead, experience matters. Life skills, such as
grit and teamwork, matter. Performance-based credentials, including com-
petency badges and micro-certificates, are taking the place of transcripts to
document individuals’ traits, talents, skills, knowledge, preferences, and experiences.
We use the phrase “future learning ecosystem” to describe this new tapes-
try of learning. At the highest level, the future learning ecosystem reflects a
transformation—away from disconnected, episodic experiences and towards
a curated continuum of lifelong learning, tailored to individuals, and delivered
across diverse locations, media, and periods of time. Improved measures and
analyses help optimize this system-of-systems and drive continuous adapta-
tion and optimization across it. Its technological foundation is an “internet for
learning” that not only allows ubiquitous access to learning, it also provides
pathways for optimizing individual and workforce development at an unprec-
edented pace.
This book focuses on the human and organizational aspects of the future
learning ecosystem. It provides key terms and models, and it helps identify
the diverse professional sectors involved in the realization of this vision.
The United States Government has recognized a need for coordination among
the communities of learning scientists, organizational psychologists, software
and hardware engineers, teachers, talent managers, administrators, and other
innovators contributing to this concept. Simply organizing the multiple, in-
terdependent layers of the future learning ecosystem represents an enormous
undertaking, more so because its many facets must evolve in concert. Improv-
ing school classrooms, for instance, means little unless we also transform how
those experiences translate to collegiate, trade, business, and public-sector
settings. Similarly, developing systems for earning and communicating cre-
dentials creates scant value, unless we also understand how to authentically
measure the skills and attributes they accredit. And finally, even if we suc-
cessfully reshape every aspect of our learning and development systems, we
must simultaneously consider the larger cultural and societal shifts affected
by this new approach. How will the reconceptualization of learning affect
jobs, self-worth, loyalty to businesses, power dynamics, access to education,
governmental processes, and our nation overall? When the paradigm of learn-
ing (something so fundamental to each of our lives) evolves it will have ex-
pansive and exciting, but difficult to fully forecast, effects.
WHAT IS LEARNING?
At its most foundational level, learning is any change in long-term memory
that affects downstream thoughts or behaviors. The process of learning starts
with awareness of stimuli,1 cognitive encoding of that information,2 and its
retention in memory. Later, the knowledge must be retrievable (that is, not for-
gotten) and transferable to novel situations.3 Throughout our lives, every per-
son learns constantly—all the time, every day. What we each learn, however,
its veracity, applicability, intelligibility, and whether it aids or limits perfor-
mance all vary significantly. Each day, we must reconcile among the complex,
competing information vying for our attention—all vying to “teach” us.
Contributions from diverse fields—including
IT, data science, psychology, and learning
science—form a repository of complementary
recommendations; together, these define the
framework of the future learning ecosystem
The concept of learning applies across performance domains, not only to cog-
nitive development. It necessarily includes physical and emotional aspects as
well as inter- and intrapersonal, social, and cultural components. Certainly,
learning occurs in formal settings, in grade school classrooms or professional
workshops, but it also happens in self-directed, just-in-time, social, experien-
tial, and other informal ways.4 These varied experiences accumulate in long-
term memory and, fused together, affect how we respond to the world.5 In
other words, formal learning in combination with other life experiences col-
lectively determines someone’s readiness for work, public service, and other
life challenges.
To date, our education and training systems have generally focused on the
delivery and documentation of formal learning. As a result, we’ve fostered a
society that values the accreditation of formal training and education (think
college degrees) and proxy measures of aptitude (time-based promotions)
rather than life experiences and direct measures of competence. Of course,
this is based largely on our inability to measure, analyze, and share data about
the latter. With advances in technology, however, we’re surfacing informal
learning.
The growing visibility of, and access to, informal learning is reshaping our
conceptualization of learning: Increasingly away from a separate, fenced-off
and time-based activity and towards an integrated, diverse lifelong learning
continuum where all experiences and development add to an interdependent
set of holistic competencies. This paradigm shift means education is no longer
viewed as a linear and finite pathway, starting in grade school and culminat-
ing with a high school or university degree. Books and teachers, and other
hierarchical authorities, are no longer the primary gatekeepers of knowledge.
Vocational schools and formal apprenticeships no longer serve as the primary
pathways to develop trade skills. Individuals can even cultivate their athletic
abilities through self-developed and informal learning channels.
Informal learning means more than just self-directed study. Consider, for in-
stance, when a young person travels overseas for the first time. Perhaps with-
out intention, she learns about other cultures, people, history, and food, as
well as other, more subtle lessons about social dynamics, cosmopolitanism,
and even self-awareness. Undoubtedly, such experiences are learning, that
is, they impact long-term memory and change us. But how might society,
teachers, or employers value such learning? How do we record or account for
such experiences? How can we define and measure such seemingly intangible
qualities, such as worldliness, emotional maturity, or empathy?
FUTURE LEARNING
ECOSYSTEM
The future learning ecosystem is a substantive reimagination of learning and
development. This concept recognizes the increasing need for cognitive agili-
ty, meaning learning is no longer viewed as a single event—nor even a series
of events—but rather as a lifelong experience of continual growth.
Second,
the pathways through which learners progress must be personalized to their
unique attributes, skills, interests, and needs in order to achieve necessary
effectiveness and efficiency in learning. Finally, instruction and information
presentation methods must more strongly emphasize deep learning and expe-
dite the transfer of learning from practice to real-world settings.7
TECHNOLOGICAL INFRASTRUCTURE: Flexible, interoperable technologies for pervasive learning
DESIGN: Intentional methods applied to optimize learning
COMMITMENT: Contributions to a shared vision across communities
GOVERNANCE: Negotiation of standards, conventions, and ethics
POLICY: Regulations and recommendations for behavior
HUMAN INFRASTRUCTURE: Diversely skilled individuals and organizational structures
Commitment
Governance
need to negotiate the conventions for sharing and protecting individuals’ data,
for designing and updating shared application programing interfaces, and for
balancing the competing interests of educational, commercial, and govern-
mental organizations. Accreditation bodies will need to evolve to accommo-
date new types of assessments and credentials. These governance bodies will
also have a responsibility to consider the social and societal impacts of this
new learning system. They will need to navigate a spate of new social and
ethical considerations, envision new legal and regulatory rules, and attempt to
envision the emergent risks and opportunities as the system matures. While
government will undoubtedly play a role, we—the stakeholders across highly
Policy
Governance bodies, along with the actual government and key performers
within the ecosystem, will inform policies for the future learning ecosystem.
Policy is the blueprint of recommendations and regulations that define guide-
lines for behavior within the system. Recommendations might include best
practices for collecting and personalizing learning in response to data. Regu-
lations, or rules put in place to protect the public, might include guidance on
the privacy, ownership, and commercialization of learners’ data. Nearly all
innovation carries a double-edged sword: Creative foresight, social account-
ability, and ethical principles will need to guide employment of the future
learning ecosystem for our public sector as well as personal and business-re-
lated interests.
Human Infrastructure
This book examines the future learning ecosystem concept, our collective
progress towards its realization, and the pivot our systems and society need to
make away from formal, detached education and training towards experien-
tial, personalized, interconnected learning journeys. The U.S. Government’s
ADL Initiative has taken the lead in designing this book and is helping to
coordinate across the broad stakeholder community, both conceptually and
practically. The following chapters in this publication provide a snapshot of
the achievements the ADL Initiative and other contributors have made to date,
what we need to build for tomorrow, and what this near-future system will
enable our children, workforce, society, and military personnel to achieve.
CHAPTER 2
HISTORY OF
DISTRIBUTED LEARNING
Art Graesser, Ph.D., Xiangen Hu, Ph.D.,
and Steve Ritter, Ph.D.
Certainly, others have written more robust historical accounts, for those in-
terested in more detail. For instance, in a now-classic article, Søren Nipper
outlines the three historic generations of distance education, starting with cor-
respondence teaching, followed by multimedia offerings (e.g., cassettes and
television broadcasts), and finally the third-generation, involving information
and communication technologies.1 Building upon Nipper’s framework, Mary
Simpson and Bill Anderson wrote a brief and accessible overview of the “His-
tory and Heritage in Distance Education.” 2
1980s
In all historical accounts of distributed learning, authors seem compelled
to highlight its analog foundations—hand-painted slides illuminated by oil
lamps in the 17th century, correspondence learning by mail in the 18th centu-
ry, or silent films in the early 20th.6 However, for our purposes, the history of
distributed learning meaningfully begins in the 1980s. This decade witnessed
the rise of personal computers, with widespread adoption in most schools
beginning around 1983.7 Their proliferation ushered in Nipper’s so-called
learners through problem steps, give hints, and provide teacher-like feedback.
The more advanced intelligent tutoring systems showed even higher learn-
ing gains, an effect size of .76 standard deviations, according to more recent
meta-analyses conducted by James Kulik, Phil Dodds, and Dexter Fletcher.11
Many of the early instructional technologies weren’t yet distributed, but that
was changing. Throughout the 1980s, U.S. federal agencies, including the De-
partment of Defense, National Science Foundation, and Department of Edu-
cation sponsored significant research on computer-based instruction, includ-
ing distributed learning.12 In 1989, the U.S. Office of Technology Assessment
delivered a Congressional report, called Linking for Learning, summarizing
the progress such investments had made over the decade.
The report also called for increased research on distributed learning, partic-
ularly regarding its effectiveness, methodology, and design. “The quality and
effectiveness of distance learning are determined,” it explained, “by instruc-
tional design and technique, the selection of appropriate technologies, and the
quality of interaction afforded to learners.” This was a job for instructional
designers.
The origins of Instructional Systems Design (ISD) trace back to the 1960s,
but the 1980s saw a proliferation of ISD models appear in the literature.
Roughly around this time, the ADDIE concept also materialized, apparent-
ly spontaneously,14 as a generic framework underpinning the various mod-
els. Traditional ISD approaches grew out of the behaviorist paradigm, and
similarly, most early computer-based learning used drill-and-practice tactics
grounded in behaviorism.15 As Kulik observed at the time, “Most programs of
computer tutoring derive their basic form from Skinner’s work in programmed
instruction. Skinner’s model

ADDIE: Analyze, Design, Develop, Implement, and Evaluate
…an evergreen model, general enough to suit pretty much any process
Some educators in this decade also advanced an industrialized model for dis-
tributed learning, as best expressed by Otto Peters. He favorably compared
distance education to industrial production, citing the division of labor, mass
production, realization of economies of scale, and reduced unit costs. His
model wasn’t intended as an instructional theory, but rather as an organiza-
tional concept that, in his own words, described the industrial “objectification
of the teaching process.” 17
Benjamin Bloom was also exploring the impacts of cognitive science on ed-
ucation. His influential research on the “two-sigma problem” attracted the
attention of many learning researchers. Bloom found that students who re-
ceive instruction via one-on-one (human) tutoring using mastery learning
techniques outperform those who receive group-based instruction in class-
rooms.19 This foundational study has become a rallying point for proponents
of computer-based adaptive learning.
significant impacts on this field. For instance, they experimented with com-
puter-supported intentional learning environments that enabled collaborative
meaning-making by helping students share ideas, pictures, and notes via net-
work computers.24 Projects like this influenced the wider field of educational
technology, encouraging a fundamental shift towards social learning.
The digital collaborations spawned in the 1980s led to contextually rich envi-
ronments in the ensuing decades. While Hiltz and her colleagues developed
virtual classrooms, others built entire worlds. Virtual worlds, or “synchronous,
persistent network[s] of people, represented as avatars, facilitated by
networked computers” 26 and synthetic environments, or realistic simulated
environments, similarly emerged during this era. One example of this is Mi-
chael Naimark’s concept of “surrogate travel,” virtual recreations of real en-
vironments navigable via a LaserDisc.27 Another instance is the NASA Ames
Laboratory’s virtual reality system, which used stereoscopic head-mounted
displays and a fiber-optic data glove. Finally, Habitat, developed by Lucasfilm
Games in association with Quantum Computer Services, Inc., is often-cited
as one of the first attempts to develop a large-scale, multiplayer, commercial
virtual world.28 Such systems would require several intervening decades to
reach fruition, but the contributions of these forerunners can’t be overstated.
viable learning modality until the 1990s and the rise of the global internet.
1990s
Computer-based learning continued to expand throughout the 1990s, in con-
junction with the increasing prevalence of personal computers, improvements
in their multimedia capabilities, and advances in computer networking. Most
notably, the 1990s were profoundly marked by the growth of the world wide
web (invented in 1989), and with it, broad access to networked communications.
The power of the web to change society via education could not be ignored.
Marking its impact, the U.S. Congress established the bipartisan Web-based
Education Commission in 1998, part of the reauthorization of the Higher Ed-
ucation Act. In the Commission’s subsequent—and evidence-rich—capstone
report, titled The Power of the Internet for Learning, it urged Congress to
make e-learning a centerpiece of the nation’s education policy, saying “The
Internet is perhaps the most transformative technology in history, reshaping
business, media, entertainment, and society in astonishing ways. But for all
its power, it is just now being tapped to transform education. …It is now time
to move from promise to practice.” 44
The six promising trends cited by the Commission’s report included greater
broadband access; pervasive computing, “in which computing, connectivity
and communications technologies connect small, multipurpose devices,
linking them by wireless technologies;” 45 digital convergence, or the merging
of telecommunications, radio, television and other interactive devices into a
ubiquitous infrastructure; education technology standards; emerging adaptive
technologies that combine speech and gesture recognition, text-to-speech,
Distributed simulation also saw marked progress during this decade. The
developments of SIMNET, the decade prior, had given birth to the era of
networked real-time simulations. Now, those same proponents that drove the
creation of SIMNET sought to develop synthetic environments capable of
seamlessly integrating live, virtual, and constructive simulations within a
common environment.47 Towards that end, engineers were developing new
interoperability standards to support synchronous instructional scenarios, in-
cluding the Distributed Interactive Simulation (DIS) and the High-Level Architecture (HLA).
The U.S. Government was also looking at better ways to leverage web-based
learning, particularly for military and workforce development. These require-
ments led to the creation of the Advanced Distributed Learning (ADL) Initia-
tive. The ADL Initiative traces its antecedents to the early 1990s, when Con-
gress authorized the National Guard to build prototype electronic classrooms
and learning networks for their personnel. By the mid-1990s, DoD realized
the need for a more coordinated approach, and the 1996 Quadrennial Defense
Review formalized this by directing development of a Department-wide strat-
egy for modernizing technology-based education and training. This strategy
became the original ADL Initiative. In 1998, the Deputy Secretary of Defense
directed the Undersecretary of Defense for Personnel and Readiness, in col-
laboration with the Services, Joint Staff, Undersecretary for Acquisition and
Technology and the Comptroller, to lead the burgeoning program. He also
directed the creation of a department-wide policy for distributed learning,
development of a corresponding “master plan” to carry out the policy, and
resources for the associated implementation. Shortly thereafter, aspects of the
ADL Initiative grew into a federal-wide program, with a mandate to help uni-
fy e-learning systems through coordination, shared technology standards, and
the application of modern learning theory.
Part of the ADL Initiative’s mission involves technology standards for dis-
tributed learning. In the 1990s, standards such as Hypertext Transfer Protocol
(HTTP) and Hypertext Markup Language (HTML) were just appearing. Sim-
Whole books could be (and most certainly have been) written about the techno-
logical advancements seen in the last decade of the 20th century. For our pur-
poses, a few other notable ones included the growing prominence of AI and
data mining, availability of natural language interfaces, commercialization of
personal digital assistants and associated cellular communications, and cre-
ation of DVDs. Unprecedented demand for computational models also devel-
oped, encouraging researchers to craft extensive model sets for all manners of
industries including airport facilities, call centers, businesses, health centers,
and even fast-food restaurants.51 Cognitive modeling approaches,
initially ex-
plored in earlier decades, started to be realized in applied systems. DARPA’s
Pilot’s Associate, for instance, incorporated artificial intelligence and cogni-
tive modeling to infer an aircraft pilot’s intentions and support her decision
making. These sorts of cognitive and neuroscience advances also marked this
era, and later led President George H. W. Bush to designate it “the Decade
of the Brain.”
2000s
The 2000s continued to see acceleration in learning technologies, aided by ex-
panding broadband access, consumer smartphones, streaming video services,
e-book readers, and the rise of social media. As mobile phones proliferated
across the globe, practitioners embraced mobile learning (or m-learning). In
developing nations, m-learning became a lifeline, delivering education to mil-
lions of otherwise disconnected or underserved people.52 Even in industrial-
ized countries, m-learning opened new doors, offering an innovative platform
for context-aware, pervasive learning.53
The growing demand for e-learning software reinforced the need for asso-
ciated technology standards, such as the Learning Object Metadata (LOM)
and Dublin Core for defining content metadata, and the Sharable Content Ob-
ject Reference Model (more commonly known as SCORM) specifications for
making e-learning content interoperable across systems.56 Dovetailing with
these specifications, researchers promoted the concept of “instructional ob-
jects,” or encapsulated learning materials that could be remixed and reused.
Fletcher predicted as much in 2005.
With such goals in mind, proponents began creating learning registries and
content repositories—federated systems intended to support seamless discov-
ery and access to content, such as the Content Object Repository Discovery
and Registration/Resolution Architecture (CORDRA)58 and the Multimedia
Education Resource for Learning and Online Teaching (MERLOT) project.
Although the idea of object registries has floundered somewhat in the inter-
vening years,59 the promise of ready access to learning continues to gain
ground.
The campaign for open education also drove development of massive open
online courses or MOOCs. Although MOOCs wouldn’t become widely pop-
ular until 2012, they first appeared in 2008. Platforms, such as Udemy and
Peer 2 Peer University, were founded soon after, offering free online cours-
es to thousands of students. MOOCs also introduced a new learning para-
digm. The first MOOCs grew out of connectivist learning theory,
developed
by George Siemens and Stephen Downes. Dubbed “a learning theory for the
digital age,” 62 connectivism suggests that knowledge is distributed across
networks of connections—particularly in our complex modern world. Con-
sequently, it emphasizes continuous learning, the ability to see connections
among information sources and across different fields, and the importance
of current, diverse knowledge. The original, connectivist MOOCs are some-
times called cMOOCs, to accentuate their emphasis on social learning, coop-
eration, and the use of collaborative learning tools.
Also known as distributed practice, this principle highlights that learning occurs best (that
is, is best encoded in and made retrievable from long-term memory) when its
presentation happens over time rather than massed into shorter, less frequent
intervals. Paul Kelley, headteacher at a British high school, helped popularize
spaced learning in his 2008 book Making Minds, which drew notably from
neuroscience principles. In it, he wrote, “As of this moment, scientific analysis
of learning has hardly made any impact on education. In contrast, knowledge
in areas of technology and science generally is growing rapidly. As we will
see, this knowledge is often quite at odds with the conventional wisdom of ed-
ucation. The scientific understanding of the human brain, and how it works, is
beginning to show that learning is not an abstract transmission of knowledge
to an infinitely plastic intelligence but a biochemical process with physical
limitations.” 73
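To make the spacing principle concrete, here is a minimal sketch in Python of how a system might schedule expanding review sessions instead of massing practice into a single sitting. The function name and the doubling interval are invented for illustration; neither comes from the book.

    from datetime import date, timedelta

    def spaced_review_schedule(start, reviews=5, first_gap_days=1, factor=2.0):
        """Return review dates with expanding gaps (e.g., 1, 2, 4, 8, 16 days).
        The doubling factor is a hypothetical choice; real spacing systems
        tune intervals to the learner and the material."""
        schedule, gap, when = [], first_gap_days, start
        for _ in range(reviews):
            when = when + timedelta(days=round(gap))
            schedule.append(when)
            gap *= factor
        return schedule

    # Five reviews starting today, spaced 1, 2, 4, 8, and 16 days apart.
    for review_date in spaced_review_schedule(date.today()):
        print(review_date.isoformat())

Under such a schedule, the same total practice time yields five exposures spread across roughly a month rather than one massed session.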
Desire for increased, evidence-based rigor was also seen among assessments
of learning.77 Although the approach wasn’t new, learning scientists strongly pro-
moted the use of tests for learning,78 and urged teachers to move away from
multiple-choice items in favor of more active techniques, such as writing es-
says, which most teachers didn’t know could also be automatically graded
with high reliability.79 Relatedly, by the end of this decade, increasing com-
puting power and the expanding amounts of learning data encouraged the de-
velopment of learning analytics, led by George Siemens and his colleagues,80
and educational data mining, led by Ryan Baker and his colleagues.81 These
2010-PRESENT
From a learning science and technology lens, the 2010s blend into the prior
decade, but there are technological advances that have changed the landscape
dramatically. This decade ushered in accurate spoken language understanding,
smartphones across all segments of society, ubiquitous gaming and social media,
fine-grained tracking of performance in log files, sensing algorithms that
detect people’s emotions and identities, MOOCs on thousands of topics,
hyper-realistic animated agents, collaborative problem solving, and disruptive
AI that will replace many jobs. It is impossible to forecast the most impactful
inventions of our current era; however, a few trends already stand out, though
whether they will stand the test of time remains to be seen.
MOOCs have continued to develop, although not without their critics and con-
cerns. More commonly, today, MOOCs follow the so-called Extended MOOC
model. These xMOOCs share some features with cMOOCs, including open
access and large scale.

We just finished up a manuscript for the Journal of
Cognition and Development describing where we’ve
come from in the learning sciences and where we’re going.
We traced the funding investments from the 1970s until now
and noted that the funding is coming from different places,
including multiple federal agencies and private foundations.
For example, the Office of Naval Research has a long track
record of funding in this space, as does the Department of
Education in many capacities—not just through the Institute
of Education Sciences but also through predecessors, like
the National Institute of Education.

However, where cMOOCs stress connectivist learn-
ing, xMOOCs generally use more traditional, instructivist methods, focusing
instead on scalability. Spanning both industry and academia, the most popu-
lar xMOOCs launched in 2012, including Coursera, edX, and Udacity. These
platforms, which attempt to provide learning at scale,
have been significantly
aided by the development of cloud computing in the 2000s and by the con-
sumer release of Amazon Web Services and Microsoft Azure. Cloud systems
made the “service” model of computing viable, freeing software applications
to become device and location independent, allowing for more frequent appli-
cation updates, and creating a near-infinite capacity to scale on-demand.
Cloud computing also helped realize the Internet of Things (IoT), the network
of smart devices that can connect to networks and share data. Cisco’s Chief
Futurist, Dave Evans, estimates the IoT was “born” around 2008 or 2009, but
researchers have only begun exploring its applications for learning.83 In the
context of education and training, IoT helps bridge real and virtual contexts,
allowing learners to interact with networked physical objects that also have
digital footprints.84 These objects might include embedded RFID sensors,
spatial beacons, or wearable technologies,
such as Fitbits or Google Glass.85
T3 INNOVATION NETWORK
In early 2018, the U.S. Chamber of Commerce and Lumina Foundation launched
the T3 Innovation Network to bring businesses, postsecondary institutions, techni-
cal standards organizations, human resource professionals, and technology vendors
together to explore Web 3.0 technologies for an increasingly open and decentralized
public–private data ecosystem. Since its kickoff, the Network has grown to more
than 128 member organizations, which are addressing three key challenges: (1)
The need for harmonization among technical data standards groups to ensure data is
interoperable and shareable across systems and stakeholders; (2) The need to apply AI
solutions to improve how learning objectives, competencies, and skills are authored,
translated, and distributed; and (3) The need to empower learners and the American
worker with data to improve their agency and ability to manage and connect to oppor-
tunities in the talent marketplace.
still lacks a systematic, widely accepted approach to estimating costs and de-
velopment time for building and testing these complex learning environments.
With the increasing automation in education and training, there’s been a cor-
responding push to create semantically rich data, that is, to give meaning
to the underlying data elements—in ways computers (and humans)
can understand. The developers of xAPI, for instance, are attempting to build
semantically rich usage profiles as well as published, shared vocabularies.
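As a concrete illustration, the sketch below assembles a minimal xAPI statement in Python: the actor-verb-object triple at the core of the specification. The verb identifier is from ADL’s published vocabulary, but the learner and activity identifiers are hypothetical placeholders invented for this example.

    import json

    # Minimal xAPI statement: who (actor) did what (verb) to what (object).
    statement = {
        "actor": {
            "objectType": "Agent",
            "name": "Example Learner",             # hypothetical learner
            "mbox": "mailto:learner@example.com",  # hypothetical address
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "objectType": "Activity",
            "id": "http://example.com/activities/safety-course",  # hypothetical
            "definition": {"name": {"en-US": "Workplace Safety Course"}},
        },
    }

    # In practice, this statement would be POSTed to a Learning Record Store (LRS).
    print(json.dumps(statement, indent=2))

Because the verb and activity identifiers resolve to shared, published definitions, any system reading the record can interpret it the same way, which is the semantic richness described above.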
Proponents of competency-based learning are attempting a similar feat, but
in their case, to define the data elements that make up a human competency.
Volunteers supporting the IEEE established a working group in 2018 to revise
the decade-old Reusable Competency Definition (1484.20.1), expanding its
utility and harmonizing it with other standards for competencies and compe-
tency frameworks.90
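To suggest what such competency data elements might look like, here is a deliberately simplified Python sketch. It is not the IEEE 1484.20.1 schema, only a hypothetical stand-in showing the kinds of fields (a reusable identifier, a human-readable statement, and links to related competencies) that such definitions typically carry.

    from dataclasses import dataclass, field

    @dataclass
    class CompetencyDefinition:
        """Illustrative, non-standards-conformant competency record."""
        identifier: str  # globally unique IRI, reusable across systems
        statement: str   # human-readable description of the competency
        broader: list = field(default_factory=list)   # parent competencies (IRIs)
        evidence: list = field(default_factory=list)  # assessments attesting to it

    # Hypothetical example: a teamwork competency nested in a larger framework.
    teamwork = CompetencyDefinition(
        identifier="https://example.org/competencies/teamwork",  # invented IRI
        statement="Coordinates effectively with others toward shared goals.",
        broader=["https://example.org/competencies/interpersonal-skills"],
    )
    print(teamwork.statement)

Linking records like these through shared identifiers is what would let a competency asserted in one system be understood and compared in another.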
The working group’s efforts are timely, as more formal education programs
are embracing competency-based degrees, i.e., postsecondary programs
where students earn diplomas by demonstrating mastery through real-world
projects—rather than through time-based credit hours. In competency-based
programs, students are typically assigned learning coaches, rather than di-
dactic instructors, and they have access to an array of open-source resources,
including videos, textbooks, and online communities.91 As of 2014, there were
already an estimated 200+ competency-based learning postsecondary degree
programs in the U.S., but policy regulations are lagging.92 It’s not clear how
this trend will resolve, but we fully expect the core concept to expand in the
coming years.
by Bob Sottilare, Avron Barr, Robby Robson, Shelly Blake-Plock, and others.
In 2018, Chris Dede, John Richards, and Bror Saxberg released their guide to
Learning Engineering for Online Education.94 Saxberg, who also serves as
a Consortium advisor and as vice president of learning science at the Chan
Zuckerberg Initiative, described the emerging discipline:
There will come a time when we look back at how we “used to do learn-
ing,” and, just as we now look at medicine in the 19th century, wonder
how we ever made progress without using the science and evidence that
we can now generate. We’re not there yet—but we may be on our way.96
Saxberg’s words ring true, not just for learning engineers but for the wider
learning and development sector. Much has changed as technology advanced
and learning science evolved. The concept of “distributed learning” has pro-
gressed, from its simple roots as a pragmatic tool to bridge the transactional
distance, to today’s cacophony of ubiquitous, adaptive, on-demand instruc-
tion. A central goal of the ADL Initiative and its larger community has always
been to bring clarity and coordination to this discipline. Today, more than
ever, the distributed learning community needs organizational, theoretical,
technological, and policy structures to bring unity. We are, perhaps, in the
middling ugly-duckling years of the field’s maturation. The promise of re-
sponsive and evidence-driven ubiquitous learning is there, crafted by con-
tributors for over 40 years. It’s now our challenge to resolve the complexity,
to bridge across its numerous facets as our connectivist peers have taught us,
infuse deliberate learning theory into our work as learning science scholars
advise, and, as the learning engineers promote, to embrace a comprehensive
approach to enhancing the full continuum of learning.
The first hurdle is to move past the “recorded slides and
talking head” form of online learning. The instructors need
to be trained on advances in digital learning technology and
methodology. The second hurdle is ensuring that the organization
has a modern, experience-driven learning environment that supports
these more interactive and personalized experiences. The third is to
communicate expectations between the instructor and learners that
this isn’t a lecture, rather, it’s a facilitated dialogue, not limited to a
particular place and time—but available for continuous reference and
enhancement.
John Landwehr
Vice President and Public Sector Chief Technical Officer, Adobe
CHAPTER 3
DISTRIBUTED LEARNING
INSTRUCTIONAL THEORIES
Scotty D. Craig, Ph.D. and Ian Douglas, Ph.D.
Learning has moved beyond the classroom. It’s happening everywhere, all the
time, formally and informally, incidentally and intentionally—and increas-
ingly supported by digital technologies. For more than a decade, online educa-
tion has consistently expanded.1 The U.S. Department of Education estimates
that 5.8 million students enrolled in distance education courses in 2015, the
most recent year for which statistics exist, accounting for 28% of the total
student population.2 The Association for Talent Development reported that 88%
of corporations offered e-learning as part of their workforce development in
2017, and 27% of high-performance organizations used e-learning for a majority
of their training.3 MOOC clearinghouse Class Central reported that MOOCs also
grew, serving over 80 million students in 2017.4

Distributed learning should employ evidence-based practice, built on the
science of learning
But the world isn’t so grim. The scholarly pursuit of learning science is in-
creasing. The National Academies recently released a sequel to their excellent
compendium, How People Learn. This new volume, How People Learn II,
published near the end of 2018,6 included new research on educational tech-
nologies, including findings on neurological processes, lifelong learning, and
the impact of social and cultural factors. There’s also growing awareness among
policymakers and administrators of the importance of learning science and
greater numbers of research programs at institutions such as the aforemen-
tioned Department of Education and World Bank.
In this chapter, we mix optimism with some healthy caution. In the next sec-
tions, we overview research that provides some guidance on designing for
technology-supported learning and practical best practices for establishing
associated design teams. We’ve omitted many quality theories, for the sake
of brevity, but will summarize a few of the most relevant to the design of
distributed learning. Our main goal is for readers to take away the ideas that
distributed learning theories exist, authors have taken steps to make them
accessible to practitioners, and new distributed learning systems—whether
INSTRUCTIONAL THEORIES
As outlined in the preceding chapter by Art Graesser and colleagues
(Chapter 2), learning science theories have generally evolved with the zeit-
geist of cognitive science. Early educational theories followed the behaviorist
model, emphasizing drill-and-practice tactics, reward and punishment, feed-
back, and repetition. Cognitivist theories came next. In contrast to the be-
haviorists, cognitivists sought to understand the mind and apply principles of
cognitive processing to the design of learning content. A third prominent par-
adigm, constructivism, followed. Constructivists argued that humans create
rather than acquire information; it’s therefore impossible for some “correct”
understanding of the world to be transferred from one person’s memories to
another. Individuals must learn through engagement.7
FIRST PRINCIPLES OF INSTRUCTION (DAVE MERRILL)
Both Merrill’s and Ambrose et al.’s frameworks recommend that practitioners create
active learning environments. However, in practice, this suggestion is often
watered down, distilled to superficial criteria like measures of classroom at-
tendance or homework completion, or it’s otherwise simplified to proxy in-
dicators, such as attitude or interest. None of these truly meet the mark, as
Michelene Chi and her collaborators have observed.
Chi and colleagues developed the Interactive, Constructive, Active, and Pas-
sive (ICAP) framework to provide guidelines for fostering active learning
environments. The ICAP categories describe hierarchical levels of cognitive
engagement, with “passive” learning typically producing the weakest learn-
ing outcomes and “interactive” learning often promoting the strongest. Inter-
active learning encourages learners to actively integrate new and prior knowl-
edge, draw inferences to fill knowledge gaps and confusions, and otherwise
enact strategies that build rather than merely rehearse knowledge, ultimately
supporting deeper learning and increased transfer to new domains. Notably,
this research highlights that it’s the way learners engage in different activities
that makes them more or less passive; learners’ engagement levels aren’t nec-
essarily “cooked in” to the instructional interventions themselves.
what’s happening in learners’ minds, but some theories give guidance on how
to encourage better learner processes.
Louise Yarnall and her colleagues describe self-regulated learning in more de-
tail later in this book (Chapter 15). In short, one way to envision it is as a cycle,
involving different phases that someone undertakes to strategically and inten-
tionally improve performance.11 These phases start with task definition, where
someone works to understand the problem at hand along with any available
resources. This is followed by a goal setting and planning phase, where learn-
ers establish objectives and select tools and strategies to meet them. Next, an
enactment or engagement phase occurs, where learners implement their cho-
sen strategies and attempt to perform the task. Finally, there’s an evaluation
or adaptation phase, where learners assess their actions and outcomes, and
revise their goals, plans, and strategies, accordingly. Although these actions
are, by definition, learner driven, the underlying metacognitive skills can be
taught to individuals who lack them. For instance, teachers and trainers can provide scaffolds
to help guide learners through these self-directed learning processes.
allocate study time.12 Art Graesser built on prior work to define 25 principles
of learning (clearly an overachiever in learning frameworks!).13 These roughly
group into recommendations for reducing processing load, facilitating learning
by implementing strategies within (e.g., feedback and deep questions) and
around the learning content (e.g., testing effects and spaced learning), and
suggestions for helping learners understand the process of learning (e.g., self-
regulated learning and desirable difficulties). Finally, for a truly comprehensive
historic treatment, Peter Jarvis authored a three-volume set, beginning with
the book, Towards a Comprehensive Theory of Human Learning.14
INSTRUCTIONAL
TECHNOLOGY
THEORIES
Classic instructional theories emphasize learner-content, learner-teacher, or
learner-learner interactions. Starting around the 1960s, researchers be-
gan to also examine learner-interface dynamics, leading to unique pedagogies
for educational technology. Early work on instructional media involved com-
parison studies, often looking at technology-mediated versus traditional set-
tings. These studies found “no significant differences,” but this was the behaviorist era
and (as described below) instructors tended to employ the instructional media
in the same way they might deliver traditional teaching. In the 1980s, with
growing interest in the cognitive perspective, researchers began to look more
closely at media attributes and their interactions with individual differences.15
The SAMR model defines levels of technology use in teaching and learning.
The most basic, and most often implemented, level is substitution, where the
technology is used to perform the same task as was done before. For example,
an instructor uses PowerPoint to replace acetate slides or students use laptops
to replace paper notebooks. Alternatively, the highest level is redefinition,
where technology supports new learning tasks that were previously incon-
ceivable. This level represents the future of learning and is a foundational
reason for reimagining instructional design.
REDEFINITION: Technology enables new tasks, previously inconceivable
MODIFICATION: Technology enables significant task redesign
AUGMENTATION: Technology acts as a direct substitute, with functional improvement
SUBSTITUTION: Technology acts as a direct substitute, with no functional improvement
Certainly, Walcutt and Schatz aren’t the first to suggest a wider aperture.
Badrul Khan,18 for instance, proposed an eight-dimension framework for
e-learning, comprising institutional, management, technological, peda-
gogical, ethical, interface design, resource support, and evaluation factors.
Shahid Farid and colleagues built on Khan’s work.19 They used empirical data
from stakeholders about roadblocks to e-learning in postsecondary environ-
ments. Farid et al.’s model includes software, technical, institutional, person-
al, and cultural dimensions. Beatrice Aguti and her colleagues also devel-
oped a broader model for higher-education contexts, but this time for blended
learning. Their framework has four dimensions, including e-learning course
delivery strategies, e-learning readiness, quality e-learning systems, and ef-
fective blended e-learning.20 For our purposes, we’re less concerned with the
potential similarities and differences of these various frameworks. Our point
is simply that learning—and particularly technology-enabled learning—hap-
pens within a broader context.
This new team structure will also require strong leadership.23 Leaders respon-
sible for learning will need awareness of the expertise available to them and
know how to integrate different kinds of expertise into learning development
processes. They’ll need to understand evaluation, at multiple levels (such as
within the content, to assess learners, but also at an institutional level to evalu-
ate the learning experience, itself), and they’ll need to consider broader impli-
cations, such as privacy, ethics, and social factors. During learning design and
development phases, leaders will need to look for efficiencies. For instance,
they’ll need to embrace the reuse of learning materials, looking for ways to re-
duce the cost of development efforts by reusing already developed content
elements, technologies, or tools.
Thus, learning leaders should continually ask themselves questions, such as:
• Do we have all the specific expertise on our team to meet our goals?
• Is the team working effectively as a community with shared purpose?
• Are we making good use of existing reusable resources and tech?
• Are our evaluation processes (at all levels) the best we can achieve?
• Are we aware of the evidence in support of each instructional resource,
method or technology we use?
• Do we have someone capable of interacting with the output of the
learning science community to identify relevant knowledge that can be
adapted into our process?
Projects That Work is an ongoing research study with the goal of providing teachers
with data-driven information for deciding how to use service learning flexibly,
efficiently, and effectively. The premise is that if schools and teachers have
continuously updated lists of projects that were highly rated by 20 or 25 previous
classes around the country, these projects would (a) be known to teachers and (b) could
be replicated, providing all students the opportunity to realize the potential of
what service learning has to offer.…Preliminary findings revealed that about 90%
of students were highly engaged by service learning and produced positive results
from many types of service learning projects. Many of the findings to date echo pri-
or research demonstrating the role of well-designed programs that include specific
activities to prepare students with a clear and compelling rationale for the project
and with specific roles and responsibilities. The key to replication in schools with
less expertise in service learning may focus on teachers having information on key
components of projects. It’s important to ensure that the projects are feasible for
teachers and students to do, and that they lead to students’ belief that they’re
making a difference and perceive that they’re learning.
We used AutoTutor for the Office of Naval Research and put it into ALEKS,
a commercial adaptive learning system. It went OK, but then we tried to do
a scale-up in a school district. We were able to get a big teacher preparation
session. They were reasonably optimistic. The strategy was to let them use
ALEKS on their own before getting AutoTutor. We found that initially a lot
of people liked it, but then they had school vacation and then after that they
had a huge snowstorm and were out of school for about 8 days. Then they
had a very short time for standardized testing for the state (about 5 weeks)
that resulted in universal attrition. In talking with the teachers, they had to
teach to the test, but ALEKS is based on mastery learning. It won’t allow you
to do topics you’re not ready for.…From a learning perspective, it makes
sense, long-term, but teachers have many logistical needs that aren’t directly
represented in adaptive systems. They have to have the kids know information /
knowledge at a certain time whether or not the student is technically
ready for it—even if they aren’t going to remember it. Their knowledge
repository might collapse later because they didn’t get the foundational
information when they needed it but it’s what they needed for the test.
Benjamin Nye, Ph.D.
Director of Learning, Institute for Creative Technologies,
University of Southern California
Newer repositories are now integrating evidence in support of the assets provided to the community. The What Works Clearinghouse from the Institute of Education Sciences is one such example.
CONCLUSION
In summary, extensive research has been conducted to inform instructional
theory, but there continues to be a gap between scholarly findings and their
practical application. However, there are many excellent resources for teach-
ers, trainers, instructional designers, policymakers, and administrators. Un-
fortunately, many of these resources still assume that learning will occur un-
der traditional (Industrial Age) conditions; so, consider them with caution.
Some theories have been developed specifically with instructional technol-
ogies in mind. Seek these out, but also remember it takes years to properly
validate a theory; so, watch out for hype, particularly when commercial profits
or someone’s reputation is on the line. Also, when designing for technology-supported learning, use a measure of creativity to avoid succumbing to the “just substitution” mindset. Similarly, be willing to rethink the design, delivery, and coordination of learning processes. Emerging technologies are radically changing the ways we train, educate, learn, and develop, and they’re similarly changing the ways learning professionals operate—embrace teams, seek out shared materials, and foster a culture of reuse.
Education in the future will be more of an iterative process.
Currently, people pursue their education at the beginning of
their lives and then they go to work. Education beyond that initial
period typically only happens because of some disruption in their
lives—they lose their job or other changes in circumstances. It’s
difficult to access at that stage in life, but in the future, while you’ll still
have early-life education, it might look a bit different—with more of
an emphasis on work-ready skills and learning to continuously learn.
There will also be many more opportunities for dipping in and getting
back out of the workforce throughout someone’s life. Education will be
more just-in-time and based on the needs of the moment. Technology
will support that, but it requires significant change in the way
education institutions operate and the way employers do things.
CHAPTER 4
LIFELONG LEARNING
J.J. Walcutt, Ph.D. and Naomi Malone, Ph.D.
The world has progressed in so many ways over the past 100 years, yet our ed-
ucational structures have stayed relatively unchanged. Incremental progress
has certainly occurred, including improvements in classroom organization
and information delivery, but the developmental models, progression of for-
mal educational offerings, and recognition of learning via grades and degrees
have proven resistant to change. As a society, we still focus on controlled
settings for learning and group-based information delivery. The sequence is
linear, the instruction is split into finite end-points, and the whole process is
assessment-oriented.
However, learning isn’t confined to the classroom. The world outside the
schoolhouse is filled with limitless sources of potential learning. We’re in-
creasingly exposed to torrents of data, questionable “facts,” and diverse un-
connected information. It’s incumbent upon the individual—the learner—to
determine the value of that information and how it connects to other data or
experiences. The speed and diversity of information in our modern world impact our abilities to synthesize useful knowledge, effectively retrieve it, and
translate or apply it in practice.
That preparation doesn’t end at 18 or 25 (or even 100!) years of age. With in-
creasing average lifespans 4 and the worldwide pace of change, continuous lifelong
learning has become a necessity. New inventions create or destroy whole in-
dustries each year, and AI is altering the nature of work in fundamental ways;
add to that increasing lifespans and the evolving view of employee-company
permanency. All this means that many people will change careers—not just
jobs—multiple times within their lives.5 Thus, we need to expand the time-
frame of learning beyond K–12 and even beyond traditional higher education
and vocational schools. While these forms of formal, developmental educa-
tion are likely to persist for some time, we can expect more learning to occur
later in life—in the 30 to 65 age range.
[Figure: dimensions of the shift toward lifelong learning, including Focus, Educator, Experience, Timing, Access, and Technology. For Technology, the shift is from dedicated systems in silos, often focused on formal learning, to distributed systems-of-systems, an interconnected ecosystem.]
• New solutions for a rapidly changing world with diverse global challenges
• New transformative competencies for innovation, responsibility, and awareness
• Learner agency—the responsibility for one’s own education throughout life
• A new, broad set of desired knowledge, skills, attitudes, and values
• Individual and collective educational goals for wellbeing
• Design principles for eco-systemic change
1. Learning is lifelong
life and is shaped by individuals’ behaviors.9 What and how much individuals
learn depend on a variety of micro- and macro-level factors. Micro-level fac-
tors include individual choices, motivations, and the ability to self-regulate,
particularly outside of formal education settings. Macro-level factors include
learners’ neighborhoods, societies, and cultures.
Some of these factors make adults particularly well-suited for learning. Clar-
ity of interests and goals, and greater self-awareness make this time-frame
conducive to personal growth and often encourage a greater motivation to
learn. Adults also have a greater wealth of experiences to draw upon, which
can help them synthesize new information more deeply and efficiently.10 How-
ever, placing the control of learning into adults’ own hands may encourage
them to focus too narrowly on limited, task-specific forms of learning. We’ll
need structures that protect and support a comprehensive view of learning.
Otherwise, we risk having deep experts embedded within stovepiped knowl-
edge communities who lack a general understanding of how the pieces fit
together to work within a holistic, efficient system.
It’s just a subset of the larger territory that we’re looking at; it’s an under-
appreciated subset but important for our economy and civic health. We
need to recognize that the world is changing and that we don’t leave people
out to dry because their first career fizzled out and dried, and we didn’t have
a mechanism to help them out. Under the spotlight, we have K–12, higher
education, and retirement, but when you have a career change and the world isn’t
helping you, it’s murky. We held a conference recently focused on the concept
of education ages 15–75. We asked, “How do we make that a different span of
life during which people feel supported? Do we need unemployment insurance?”
We’re interested in figuring this out. For example, what if I’m really struggling and
I don’t know if I want to be a researcher or a designer? The real question now is
what do you want to be first? We didn’t have those dialogs in the past; it’s totally
different now.
Christopher Dede, Ed.D.
Wirth Professor in Learning Technologies
Harvard Graduate School of Education
COGNITIVE DEVELOPMENT
Although mature theories of cognition and learning already exist, these will
need to be expanded and potentially reevaluated within the future lifelong
learning model. Discussions of cognitive development usually point back to
the foundations built by Jean Piaget (1936) and Lev Vygotsky (1978).12 Piag-
et’s theory of cognitive development defined four critical periods in which a
young child develops sensorimotor intelligence, preoperational thought, con-
crete operations, and, finally, formal operations. Interestingly, the final stage
spans ages 11 to adulthood. People who reach this final stage (and not all do,
according to Piaget) are able to think abstractly. Since we now know that
learning occurs throughout an entire lifetime, what happens after reaching
this stage? Vygotsky’s sociocultural theory of cognitive development offers
some answers; it focuses on a person’s journey to individualized thinking
through a co-constructed process of social and cultural interaction. Therefore,
the individual learns either by using self-regulatory tools (e.g., self-speech) or
by observing and/or taking direction from others. Though both Piaget’s and
Vygotsky’s theories recognize the interplay between self-development and di-
rected learning, they take some opposing views; neither accounts for development across the lifetime, and neither considers how a person can achieve a set of meta-skills across disciplines, experiences, and formal and nonformal learning.
SOCIAL DEVELOPMENT
EMOTIONAL DEVELOPMENT
Research suggests that early emotional regulation skills have a significant im-
pact on development and outcomes in later life.19 For example, emotional reg-
ulation is part of the spectrum of skills needed to be successful in the class-
room. The emotional regulation and interpersonal strategies children develop
in early years allow them to navigate the school system, and more than that,
these skills become key tools for success in life—arguably more than the ac-
ademic knowledge itself. But can these skills be taught? Substantial evidence
exists 20 that suggests: yes. Explicit teaching of social and emotional skills
PHYSICAL DEVELOPMENT
Formal investigation into motor and physical development traces its founda-
tions to the 1920s, when doctors began weighing infants to determine if they
met appropriate growth benchmarks.22 More significant research began in earnest in the 1970s and 1980s, spurring major advancements in the under-
standing of average motor development, constraints both within and external
to a person, and the benefits of aiding, enhancing, and improving motor skills.
However, like other developmental domains, much of the research in physical
development has been limited to early childhood and disorders, with some
unique focus areas for special populations such as sports and military per-
sonnel. Yet, beyond the scope of these specific groups, general physical mat-
uration and the impacts of motor skills and practice have been less studied,
although that is changing.
3. Learning is ubiquitous
Lifelong learning comprises all phases of learning and stages of life, and it
occurs across diverse contexts, from school to the workplace, at home and
within the community.27 Lifelong learning activities can happen in formal
settings (e.g., courses offered by a university), nonformal contexts outside of
fully structured institutions (e.g., meet-up workshops), and in informal and
spontaneous ways (e.g., while chatting with a co-worker or reading a post on
social media).28
Learning already occurs in all of these ways, all the time, and everywhere. To
date, however, we’ve largely documented (and, subsequently, largely valued)
only formal learning experiences. Informal and experiential learning can have
as much, or even more, impact on individuals’ abilities to acquire, assimilate,
and apply knowledge. With the development of data science, machine learn-
ing, and interoperable data standards that allow us to measure and classify
experiences, we’re unlocking the ability to better capture and communicate
a person’s true skill level as well as his or her ability to perform in a variety
of settings and across communities. It’s irrelevant where a person “learned”
something—the transfer of that learning into practice is what matters.
The idea that learning happens everywhere and all the time isn’t new. Rather,
it’s our ability to measure it and communicate about it (e.g., through compe-
tency badging and credentialing) that’s novel. This also ties to the whole-per-
son principle described in the preceding subsection. That is, various skills
contribute to someone’s success in the world. In military contexts, for exam-
ple, there’s much talk of grit and resilience, and in higher education, we often
reference executive functioning and well-roundedness; however, such capaci-
ties are rarely measured or reported in transcripts and personnel records. As-
sessing their applications in real-world contexts and giving “credit” for other
lived experiences will also enable us to create personalized learning
trajectories, improve talent management into the future, and create equitable
opportunities for more people.
Finally, an asset model can better support a focus on continual, lifelong learn-
ing. The structure of this type of model naturally defines success at every
level, with every addition, and yet has an infinite number of nodes, skills, and competencies that one can attain. Reframing both the learner and the educational system can aid in reimagining and refocusing how we improve the system and work toward optimizing each individual, rather than focusing on creating able workers ready for an industrialized nation.
IMPLEMENTATION
The previous section outlined a vision for lifelong learning in the future. This
section outlines specific steps we can take towards that vision.
these changes, it would be wiser to help cultivate the ecosystem more holis-
tically. We need to collect evidence and recommend best practices about the
elements within it and their collective impact as well as incentivize those ele-
ments that bring out its best features—for individuals and society, writ large.
Learning science, both its extant research and its inquiry principles, can aid
this endeavor, but we must commit to using it for this larger vision.
CHAPTER 5
LEARNING
EXPERIENCE DESIGN
Sae Schatz, Ph.D.
more likely to monitor the most superficial data and defer to familiar con-
cepts while ignoring conflicting evidence. Attention-deficit disorder specialist
Thomas E. Brown has even found that most people, i.e., those without the
syndrome, report symptoms similar to it multiple times a day, including the
inability to concentrate and to pay attention to what needs to be done.4 In
decision-making contexts, overload depletes mental resources, driving indi-
viduals to expedient (rather than optimal) choices, encouraging them to avoid
decisions or defer to negative or default options, and allowing unrelated emo-
tions to play an undue role.
Michelle Cottrell-Williams
Teacher, Wakefield High School
2018 Virginia State Teacher of the Year
gration of these practices, along with the strategic design of learning systems and careful attention to their practical interaction details, must be considered.
Hence, this chapter focuses on the design of learning experiences as a neces-
sary complement to the other critical elements informing the future learning
ecosystem.
Given LXD’s roots in UX, it’s unsurprising that educational technologists were among the first to embrace it, or that much of the discussion around it has con-
centrated on design thinking, usability, and interaction design methods for
technology-aided learning. LXD practitioners also frequently emphasize the
application of user-centered design, sometimes drawing a distinction with con-
ventional instructional design by contrasting LXD’s learner-centered meth-
ods.7 Increasingly, though, LXD proponents are widening its scope beyond
(learning) product design, focusing more on broad learning outcomes with
an extensive toolkit to apply towards this end. For instance, Margaret Weigel
and her colleagues with Six Red Marbles have begun emphasizing LXD’s ho-
listic approach to design and its synthesis of instructional design, educational
pedagogy, neuroscience, social sciences, and UI/UX principles.8 There’s also
Like LXD, InKD considers interaction design and usability principles, and in
many practical ways the two concepts overlap. InKD, however, grew out of
different foundations and, as such, contributes some unique perspectives. It
adds to LXD by identifying a set of (1) foundational scholarly fields to draw
upon for theories and concepts as well as (2) practical applied fields from
which to derive actionable tools and processes. Specifically, InKD draws
from information science fields concerned with the analysis, collection, clas-
sification, manipulation, storage, retrieval, movement, dissemination, and
protection of information. These include, for instance, instructional design,
knowledge management, informatics, semiotics, and media design. It synthe-
sizes these with neurocognitive fields concerned with how individuals interact
with data, process information, and form knowledge; these include, for exam-
ple, learning science, cognitive science, human factors psychology, cognitive
ergonomics, and marketing.
Emily Musil Church, Ph.D.
Executive Director of Global Learning, Prize Development and Execution, XPRIZE

For example, marketing and related disciplines such as consumer behavior, public relations, and advertising offer ample
guidance applicable for learning. While
that may sound surprising, in practice,
marketing and learning professionals
share many similar goals: Both try to
understand their audiences, generate mo-
tivation, capture attention, make their
messages memorable, and affect their
audiences’ downstream behaviors. Of
course, marketers generally want to sell
products or services, while learning pro-
fessionals may seek to foster an accurate and robust understanding. Still, the
techniques are often the same.
Behavioral economists Cass Sunstein and Richard Thaler (who also received
a Nobel Prize for his work) have expanded the field, widening it to explore
ways to “nudge” decisions at large scales. Their canonical book, Nudge,20 out-
lines principles for subtly coaxing people towards better choices. Proponents
have used these to great effect. For instance, Collin Payne and colleagues used
small cues at a grocery store to increase shoppers’ likelihood to buy fresh
fruits and vegetables (e.g., designated sections for produce in shopping carts
and big green arrows on the floor). These yielded a 102% increase in pur-
chasing for fruits and veggies, with 9 out of 10 shoppers following the green
arrows to the produce section when first arriving at the store.21
Under each tenet, HSI practitioners have developed systematic processes, de-
sign tools, and documentation methods. While many of these are designed
for projects involving highly complex sociotechnical systems (e.g., building
a new aircraft carrier), they can provide LXD designers, at any level, with
inspiration and an extensive toolkit to draw from, and HSI’s core tenets serve
as valuable touchstones for LXD, as well.
Summary
Commercial fields also offer useful methods. For instance, experience de-
sign has concepts, methods, and use-cases for constructing memorable and
motivating holistic experiences, often at scale through mass customization
techniques. Similarly, behavioral economics helps us understand more about
individuals’ real-world (“predictably irrational”) decisions, and it teaches us
ways to “nudge” behaviors, whether to persuade individuals or shift whole
communities.
Finally, LXD designers can leverage the four HSI principles as well as its
robust collection of established processes and developer tools. Notably, HSI
uniquely contributes methods for integrating human-centered design princi-
ples with systems engineering, balancing local outcomes against global con-
siderations, and facilitating these designs at scales within production teams
and formal organizations.
RECOMMENDATIONS
Each of the fields of study discussed so far offers a wealth of insights for
learning design. Below is a list of recommendations drawn from across them,
although it surely only scratches the surface.
Logically, then, the program manager may select the most economical ap-
proaches for creating that exposure. Meanwhile, the instructional designer is
likely given a stack of materials and told to “train” employees on them—al-
beit with limited resources. Now, his apparent goal becomes communicating
as much information as possible under challenging constraints. Subsequently,
supervisors’ goals become checking off each employee from a completion
list, and employees’ goals become completing the training as quickly as pos-
sible.…and so on until, ultimately,
everyone’s best intentions yield lim-
ited actual utility.

UX and user-centered design have proven processes for uncovering strategic goals and designing solutions for them; so, LXD already excels in this area. Jesse James Garrett’s Elements of User Experience 23 is an oft-cited resource for learning designers, even though his work focuses on digital product design, more generally. His five-layer model starts with Strategy (defining goals and user needs), and then progresses through Scope (requirements and specifications), Structure (interaction models and architectural design), Skeleton (interface, navigation, and information designs), and Surface (sensory elements and aesthetics).

We did a fairly broad study with 47 large, well-known companies from around the world, and we synthesized the attributes of their learning organizations. In all cases, what we found was that they are mission-focused. They created an architecture clarifying how data-driven decisions about training connect to the mission. Their organizational structures focused on growing internal people and were really helpful to outcomes and buy-in.

Michael Smith
Senior Technical Specialist, ICF
Results published by the National Academies Press show that only 34% of
technology development projects in the U.S. are successful, and projects most
frequently fail because “(1) an inadequate understanding of the intended users
and the context of use, and (2) vague usability requirements, such as ‘the sys-
tem must be intuitive to use.’” 24 As education and training increasingly rely
upon technology, it’s important to incorporate UX, interaction design, human
factors, ergonomics, and other closely related human-centered disciplines into
learning design processes.
Cognitive science and behavioral economics teach that humans are predict-
ably irrational. We’re prone to making expedient (rather than optimal) deci-
sions, substantially more motivated to avoid loss than seek gain, and vulnera-
ble to a slew of other biases. Recognize that learners have these “flaws.” That
doesn’t imply you should deceive or condescend—none of us is a rational
actor! Rather, acknowledge and design for the messiness of humanity. This
may mean, for instance, designing for emotional effect or carefully avoiding
information overload during a learning experience.
Similarly, don’t forget about the power of aesthetics when designing for hu-
mans. Psychological research actually shows that “pretty things work bet-
ter”—that is, individuals’ perception of aesthetics directly impacts their per-
formance outcomes.27 Such aesthetic principles have been well codified for
most media by applied creative types; however, practitioners of more “se-
rious” disciplines are often more hesitant to invest in them. In fact, some
subcultures, such as certain academic disciplines or military sectors, wholly
reject the application of aesthetics (under the assumption, presumably, that too
much polish will detract from the “seriousness” of the message—even though
scholarly research supports the positive impact of quality aesthetic design).
tioners will likely need to modify this model. What’s more important than
its specifics, however, is the broad, system-wide perspective it encourages.
When designing a learning experience, it’s useful to not only consider its
delivery but also, for instance, how many learning professionals are needed
to implement it (manpower), what skills those professionals need (personnel),
how they’ll be prepared for their roles (training), and the context in which
they’ll deliver the intervention (habitability).
Explicit in the “learning ecosystem” concept are the notions of diversity and
interconnectivity—across an entire lifetime (or, at least, career). This con-
nectivity creates new opportunities for us to consider learning experiences
in concert rather than as isolated events. Other chapters in this book discuss
instructional strategies for connecting learning events (Chapters 4 and 12, in
particular). This chapter, however, adds practical considerations that LXD is
uniquely positioned to address.
LXD, like the future learning ecosystem concept writ large, represents a syn-
thesis of varied and emerging disciplines. Learning design teams in the fu-
ture will likely involve instructional designers, learning scientists, learning
engineers, technologists, data scientists, and other professionals. LXD fills a
unique void, helping to integrate the diverse perspectives across these team
members, giving voice to learners’ (and other stakeholders’) needs, and en-
couraging the use of disciplined human-centered design practices.
It’s impossible for any one person to thoroughly know all of the disciplines
that inform LXD, but it’s important for LXD designers to avoid “reinventing
the wheel” with their work. As this chapter has shown, many existing domains
offer useful theories, processes, use-cases, and tools. Seek out these prior
solutions; curate and remix them for your own purposes. Look in creative
places, such as the advertising literature or systems engineering manuals, and
look to conventional principles of instructional design, learning science, and
cognitive psychology, too. This discussion on LXD isn’t meant to supplant
those important fields but rather to supplement them by integrating design
principles that consider human-system interactions, applied cognition, orga-
nizational dynamics, and user experiences. Together, in synthesis, these vari-
ous methods can help learning designers to not only create quality instruction
but to better achieve learning outcomes for real people, in real-world contexts.
Everyone comes through the same
education system, and we get locked
into believing that’s the way we
learn—when we really don’t.
Doug Tharp
Senior Learning Project Manager
Nuclear Regulatory Commission
Technology
Interoperability allows data to easily flow, even among applications
developed for different purposes, using a standardized vocabulary,
structure, and cadence. Interoperability implies common standards
that promote system-to-system communications, potentially across
organizational boundaries and institutional firewalls, using specified
data formats and communication protocols. These standards form
the fundamental building blocks for technology-enabled lifelong
learning by establishing consistent protocols that can be universally
understood and adopted by related systems to enable data exchange
about learners, activities, and experiences.
CHAPTER 6
INTEROPERABILITY
Brent Smith and Prasad Ram, Ph.D.
system can’t always integrate with data from another, which means learning
records aren’t easily transferable between institutional systems and across or-
ganizations. Training and education institutions don’t even record the same
learner activities or capture learner achievement information in the same for-
mats, which further complicates our ability to aggregate data.
FORMAL LEARNING
PROGRESSION
Beginning with K–12 education, most state educational systems use products
from multiple vendors, and each district deploys its systems independent-
ly. Historically, these applications have used limited (or no) underlying data
standards. Instead, most employ their own internal data models, and inte-
gration across systems requires a patchwork of connections at the state and/
or local levels.1 Consequently, there are gaps in the integration among dis-
parate applications, and many systems are simply not interoperable. Ideally,
data from multiple products, such as learning management systems, student
information systems, and learning object repositories, would be aligned to the
same common data standards, enabling seamless coordination across these
applications.2
The existing higher education system is also its own stovepipe. With their focus on credit hours, semester-long courses, and formal credentialing, these institutions often fail to account for new practices available in a digital, and
globally connected, world—such as emerging global online learning envi-
ronments that increasingly blur formal and informal practices. Students are
now much more interested in interactive and self-guided approaches, and with
so much information online (and often available for free), universities are no
longer the only places to find higher-level learning. Consequently, the value
Within military education and training there are many different schools and
training programs designed to foster technical, professional, and leadership
skills in service members. Many of these programs, their instructional tech-
nologies and personnel information systems, exist in stovepipes. Further, his-
torically there’s been a separation between the education and training com-
munities across the U.S. Defense Department. Education traditionally occurs
incrementally and involves grappling with ambiguity while thinking and re-
flecting about the concepts being learned.3 Training is linked to readiness and
offers opportunities to apply knowledge, skills, and abilities in a manner that
provides immediate feedback and progress measurement.4 Within the current
context, training and education have different reporting structures, motiva-
tions, and logistical requirements such as fuel, personnel, and access to appropriate environments or equipment. Combined, this leads to data being
acquired from many different sources but with little-to-none of it standard-
ized or connected.
Types of Interoperability
Rapid technological change has become the norm in the modern landscape of
training and education. Within learning contexts, the pressure of such chang-
es is felt acutely by educators, trainers, administrators, and learners alike.
Table 6-1 shows a different view of the various learning technologies, en-
vironments, organizations, and outcomes a given learner might encounter
throughout his or her career. This matrix highlights the numerous types of
interoperability required to facilitate a future learning economy. This is large-
ly due to the organizational design of the current learning landscape as well
as the different reporting structures and responsibilities for when and where
training and education occur.
VISION
Common standards and shared technical specifications create the underpin-
nings needed for the future learning ecosystem, from a technology interopera-
bility perspective. These standards consist of published documents that estab-
lish key interface specifications, communication protocols, and data structures
designed to facilitate interoperability among connected components. In this
context, interoperability specifications form the fundamental building blocks
Competencies
Interoperable frameworks that form the
“common currency” of the future learning ecosystem
and these various elements within a competency framework can have many
nonexclusive relationships with one another.
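One way to picture these nonexclusive relationships is as a labeled graph. The sketch below is a minimal illustration in Python; the competency names and relationship types are hypothetical, not drawn from any published framework.

```python
# A minimal sketch of nonexclusive relationships among elements of a
# competency framework, represented as a labeled, directed graph.
# All names and relationship types here are hypothetical examples.
from collections import defaultdict

relationships = defaultdict(list)

def relate(source: str, relation: str, target: str) -> None:
    """Record a directed, labeled relationship between two elements."""
    relationships[source].append((relation, target))

# One element can participate in many relationships at once.
relate("data literacy", "broader than", "chart reading")
relate("data literacy", "broader than", "basic statistics")
relate("basic statistics", "prerequisite of", "regression analysis")
relate("chart reading", "related to", "visual communication")

for source, edges in relationships.items():
    for relation, target in edges:
        print(f"{source} [{relation}] {target}")
```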
Activity Tracking
Data about learners’ performance and behaviors
a subject (the person doing the activity), a verb (what the person is doing), and
a direct object (what the activity is being done to or with); optionally, other
elements that describe the performance context can also be incorporated. The
resulting dataset tells the story of a person performing an activity. Examples
include “Mike posted a photo to his album” or “Emily shared a video.” In
most cases, these components will be explicit, but they may also be implied.
Within the future learning ecosystem, activity streams need to capture what
individuals do, which learning activities they perform, and how they perform.
Each entry in the stream should be timestamped, meaning that a learner can
have progress measured as a function of time, not simply a function of state.
The goal of activity streams is to provide data (and metadata) about activities
in rich, human-friendly formats that are also machine-processable and exten-
sible. This interaction data will need to be published by any activity a learner
engages with. In some instances, data might be generated by a learner’s per-
formance, and, in other cases, a system might generate data based on system
events or key milestones achieved by a learner. Alternatively, data may be
generated to establish the context of the learner, the application, or other com-
ponents within the learning ecosystem.
The subject of an activity is nearly always the learner but could, foreseeably,
be an instructor, cohort, or other human or machine agent. The direct object of
an activity depends on its context, as do the verbs (although to a lesser extent).
Universal terms, particularly verbs, will need to use a common vocabulary
across systems, otherwise the data will lack semantic interoperability and lose
much of its utility. By formalizing a common vocabulary, activities can ref-
erence an established set of attributes along with rules for how the dataset is
stored and retrieved by components in the learning ecosystem.
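To make the pattern concrete, the sketch below assembles a single activity statement in Python, loosely modeled on xAPI. The learner, activity ID, and verb choice are hypothetical examples.

```python
# A minimal sketch of an activity statement following the subject-verb-
# object pattern described above, loosely modeled on xAPI. The learner
# and activity ID are hypothetical examples.
import json
from datetime import datetime, timezone

statement = {
    "actor": {                       # subject: who performed the activity
        "name": "Emily",
        "mbox": "mailto:emily@example.edu",
    },
    "verb": {                        # drawn from a common vocabulary
        "id": "http://adlnet.gov/expapi/verbs/shared",
        "display": {"en-US": "shared"},
    },
    "object": {                      # direct object: what was acted upon
        "id": "http://example.edu/activities/video-42",
        "definition": {"name": {"en-US": "a video"}},
    },
    # Timestamped, so progress can be measured as a function of time.
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

print(json.dumps(statement, indent=2))  # tells the story "Emily shared a video"
```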
The current way learner records are managed is insufficient for the evolving
needs of instructors, learners, and organizations. Today, a transcript is typically
used to record learners’ permanent academic records. Transcripts usually list
the courses taken, grades received, honors achieved, and degrees conferred
from a formal academic institution. Only this most basic of information
follows individuals across their different learning episodes. Teachers and
trainers have little visibility into individuals’ past performance, such as what
other instructors have noted about them, the informal or nonformal learning
they’ve experienced, or their strengths, weaknesses, and individual needs.
Activity Registry
Arrays of diverse learning activities
To effectively enable activity registries, the resources they point to will need
to be described in some manner. Such descriptions are encoded as metadata.
In training and education, many different metadata formats have already been
explored, including Learning Object Metadata (LOM; IEEE 1484.12.1), which
is commonly used with SCORM managed content, the Dublin Core Metadata
Initiative, and the Learning Resource Metadata Initiative (LRMI).5
one or more educational events and/or other types of CreativeWork that aims
to build knowledge, competence or ability of learners.
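As a rough illustration of such metadata, the sketch below describes one learning resource using schema.org-style JSON-LD, the vocabulary that LRMI properties were folded into. All of the values are hypothetical, and a production registry would carry much richer descriptions.

```python
# A minimal sketch of activity-registry metadata as schema.org JSON-LD
# (the vocabulary LRMI aligns with). All values are hypothetical.
import json

course = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "Introduction to Data Literacy",
    "description": "A short course that builds foundational data skills.",
    "provider": {"@type": "Organization", "name": "Example University"},
    "url": "http://example.edu/courses/data-literacy",
}

print(json.dumps(course, indent=2))
```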
Talent Management
Bridging education, training, and workforce silos
The systems around talent management need to work seamlessly. Within the
future learning ecosystem, an employee’s digital records will include data
from various stages of their careers related to recruitment, training and de-
velopment, and performance management. Many new standards are actively
being developed through the International Organization for Standardization (ISO), relevant to business-crucial areas such as compliance and ethics, workforce costs,
diversity, leadership, occupational health and safety, organizational culture,
productivity, recruitment, mobility and turnover, skills and capabilities, suc-
cession planning, and workforce availability. All these areas contain specific
metrics and reporting recommendations. Creating systems that combine these
workforce data with other training and education information will enable the
IMPLEMENTATION STRATEGIES
The future learning ecosystem promotes an increasingly complex world of
interconnected information systems and devices. The promise of these new
applications stems from their ability to create, collect, transmit, process, and
archive information on a massive scale. However, the vast increase in the
quantity of personal information being collected and retained, combined with
our increased ability to analyze it and combine it with other information, creates valid concerns about managing these volumes of data responsibly. There
is an urgent need to strengthen the underlying systems, component products,
and services that make learning data meaningful. The following subsections
outline a foundation for an enterprise-wide learning ecosystem that can adapt
and grow with the needs of the organization.
The current landscape of disparate learning and personnel systems will con-
tinue to evolve for the foreseeable future. A cohesive data strategy needs to be
implemented to help identify all of the relevant data types required to support
the human capital supply chain, to define the relevance of different data types
over time, to identify an approach for capturing the decay of importance be-
tween different data types, and to identify authoritative sources for generating
each data type. An effective data labeling strategy will enable automation,
increased analytics, and an associated lifecycle for how long the different
data elements remain relevant. Data labeling attaches meaning to the differ-
ent types of data, correlated to the different systems that generate the data
across the lifelong learning continuum. This allows all systems in the learning
ecosystem to use the data as needed, such as to adaptively tailor learning to
individuals. Patterns of data should also be explored to derive additional in-
sights at institutional levels. Consider both structured and unstructured data
that may be generated in different areas, and develop clustering strategies for
how to organize the different data types so that all components have access to
the data they need.
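As one way to picture such labeling, the sketch below tags each record with a type, an authoritative source, and a relevance lifecycle. The exponential half-life model is an illustrative assumption; the chapter doesn’t prescribe any particular decay function.

```python
# A minimal sketch of data labeling with relevance decay. The label
# fields and the exponential half-life model are illustrative
# assumptions, not a prescribed scheme.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LearningRecord:
    data_type: str         # e.g., "course_completion", "informal_activity"
    source_system: str     # the authoritative source that generated it
    created: datetime
    half_life_days: float  # how quickly this data type loses relevance

    def relevance(self, now: datetime) -> float:
        """Current relevance in (0, 1], decaying exponentially with age."""
        age_days = (now - self.created).total_seconds() / 86400
        return 0.5 ** (age_days / self.half_life_days)

record = LearningRecord(
    data_type="course_completion",
    source_system="university_sis",
    created=datetime(2015, 6, 1, tzinfo=timezone.utc),
    half_life_days=3650,  # formal credentials decay slowly
)
print(round(record.relevance(datetime.now(timezone.utc)), 3))
```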
These four steps provide a strategic framework from which a learning eco-
system can be built. These aren’t trivial tasks and will be implemented dif-
ferently in each organization, depending on its size, complexity, and goals.
Collectively, these steps allow organizations to embrace the future learning
ecosystem concept and to benefit from the rich data it will produce, allowing
businesses to maximize their workforces and learning-delivery organizations
to optimize and manage the quality of training and education experiences
they offer.
In thinking about risks associated with the learner data needed
to power personalized adaptive learning, privacy and security
are clearly at the top of my mind. But we need to expand the set of
values beyond those two in determining if the use of student data is
responsible /ethical. There’s value in advancing knowledge, ensuring
students are successful, and promoting the development of practices
that have the potential to affect a lot of people. This multi-faceted
approach isn’t new: A lot of these values are considered in the context
of human-subjects research reviews. It’s important for the academic
community to have a similar process for considering such a range of
values in evaluating our practice, in addition to our research.
CHAPTER 7
DATA SECURITY
J.M. Pelletier, Ph.D.
Data breaches, like traffic accidents, are inevitable. Yet, we must nonetheless progress, as a nation, toward a digitized learner ecosystem. Accordingly, this chapter describes the ways we can be proactive in simultaneously managing the likelihood of occurrence, the damage of impact, and the potential for contagion
of breaches across learner data systems. An effective learning architecture re-
quires security to preserve privacy, to prevent cheating by the individual, and
to prevent intrusion by external threat actors. Accordingly, balanced effort
is required across the three pillars of security: confidentiality, integrity, and
accessibility. While most security investigations focus on confidentiality and integrity, accessibility matters as well: access to data enables timely and well-informed decisions.
Further, users are highly likely to circumvent security controls if accessibility
is inadequate. All of these concerns can be addressed by hardening devices
and networks in a way that places users at the center of each improvement.
To do this efficiently, data security design should enable individuals and or-
ganizations to limit the spread of breaches within current and future learning
architectures. Thus, this chapter describes principles and strategies that will
allow distributed learning environments to keep pace with developments in
cybersecurity.
AMATEUR THREATS
FOREIGN THREATS
SOCIAL ENGINEERING
INVESTMENT MODELS
SUMMARY OF CURRENT
BEST PRACTICES
Subsequently, cyberspace has become an asymmetric battlefield, upon which
attackers operate at a disproportionate cost advantage and seek to win through
attrition. While these problems may seem intractable, there are specific best
practices that can preserve the confidentiality, integrity, and accessibility of
distributed learning architectures without exorbitant expense. The single most critical security practice is the regular examination of standards, re-
quirements, protocols, and implementations. Effective cybersecurity requires
extensive review of technology specifics, which is far beyond the scope of
this document. Instead of an exhaustive review, we consider here a few extant
vulnerabilities within the current distributed learning protocols. The goal is
two-fold: first, to support immediate improvement and, second, to support
ongoing sustainment in security that will result in cost-efficient reliability
across distributed learning architectures. The implementation plan at the con-
clusion of this chapter recommends a process for further review and hands-on validation, which will yield a rank-ordered task list following a structured risk management process.
Security within and across each of these layers must allow for consistently re-
liable application without exposing organizations to unnecessary information
risk. This is especially important as data become increasingly standardized
across the wide range of learning interactions tracked by xAPI. Any security
evaluation starts with an assessment of each of the controls that are currently
in place. A preliminary analysis reveals that there are several vulnerabilities
that require immediate consideration.
KAFKA
Larry Smith
Technical Director
U.S. Marine Corps College of Distance Education and Training
Other examples of message-based middleware that can work for learning pro-
cessing are based on the Advanced Message Queuing Protocol 1.0, which is
an international standard (ISO/IEC 19464) with several implementation op-
tions that are optimized for smaller systems. Some of these options include
ActiveMQ, Apache Qpid, and RabbitMQ.6
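For illustration, publishing a learner-activity message over AMQP with pika, the Python client for RabbitMQ (one of the options named above), might look like the following sketch; the broker address and queue name are hypothetical.

```python
# A minimal sketch of publishing a learner-activity message over AMQP,
# using the pika client for RabbitMQ (one of the options named above).
# The broker address and queue name are hypothetical.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Durable queue, so messages survive a broker restart.
channel.queue_declare(queue="learner-activity", durable=True)

message = {"actor": "emily@example.edu", "verb": "completed",
           "object": "safety-module-3"}
channel.basic_publish(
    exchange="",                     # default exchange routes by queue name
    routing_key="learner-activity",
    body=json.dumps(message),
    properties=pika.BasicProperties(delivery_mode=2),  # persistent message
)
connection.close()
```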
Thus, we must consider the potential issues that can result from failures with-
in interdependencies among complex systems involving identification, access
control, authorization, auditing, network segmentation, and boundary enforcement.
Hardening Networks
The highly technical nature of a firm’s information storage and retrieval sys-
tem makes the Intrusion Detection System (IDS) and Intrusion Prevention
System (IPS) useful components for breach identification. While most in-
trusion detection and intrusion prevention systems monitor network traffic,
host-based anomaly detection can reveal and report unauthorized attempts
to access examination answers or to manipulate grades. There are also sev-
eral commercially available Security Incident and Event Management tools
(SIEM), which explicitly monitor network logs and data flows for indicators
of compromise. The inclusion of these tools is likely to significantly increase
awareness of security compromises, reduce detection timelines, and inform
organizational needs for response. For distributed learning, data streams
should be designed as one-way valves. Data lakes should be tightly patrolled
with a SIEM and organizational Security Operations Centers (SOC), which
monitor the SIEM data and conduct live response around the clock. Several
Managed Security Services Providers (MSSPs) provide SOC capabilities for
organizations that are too small to maintain their own defenses.
A more extensive review of the xAPI and Kafka standards, in light of the
Kerberos protocol, is likely to yield an elegant alternative to the current secu-
rity schema. Furthermore, the integration of a robust security layer within the
API can provide abstraction that simplifies the instantiation of authentication
mechanisms across content providers and distributed learning hosts.
HARDENING DEVICES
Implementation Recommendations
Broadly speaking, the plan for implementing security within the future learn-
ing ecosystem should include four phases. Some form of this plan is likely the
most rapid and cost-effective way to improve cybersecurity capabilities in an
extensible and forward-leaning manner.
CHAPTER 8
LEARNER PRIVACY
Bart P. Knijnenburg, Ph.D. and Elaine M. Raybourn, Ph.D.1
While this may nominally lengthen the development cycle, it prevents a sit-
uation where the system has numerous complex privacy settings and a com-
plicated privacy policy that learners are unable to navigate—or worse yet: no
privacy protections at all.
Data Collection
Many types of data might be available through a digital learning system, in-
cluding learner runtime activity, competencies, and context. Such data can
be collected anonymously or identifiably connected to a learner’s profile. The
data collection practices of a digital learning application can have unique pri-
vacy implications depending on the type of data collected, its source, and its
potential identifiability. This section discusses how to consider those aspects
when defining and developing the data collection practices of a digital learn-
ing application.
PRIVACY DECISION-MAKING
Research has also demonstrated that user trust has a significant influence on
disclosure behavior in digital systems.6 Therefore, building trust is an im-
portant strategy for increasing acceptance of the data collection and tracking
practices employed by modern digital learning systems. Trust can be built
by ensuring learning applications originate from trustworthy sources, and by
employing sensible, transparent data collection practices from the outset.
COMMUNICATION STYLE
Internet users also choose their social network based on their preferred com-
munication style. Research 9 suggests services that broadcast implicit social
signals (e.g., location-sharing social networks) are predominantly used by
“FYI (For Your Information) Communicators,” who prefer to keep in touch
with others through posting and reading status updates. They tend to benefit
from the implicit social interaction mechanisms provided by broadcast-based
social network systems. People who are not FYI Communicators, on the other
hand, would rather call others, or otherwise interact with them in a more di-
rect manner. They tend to benefit more from systems that promote more direct
interaction. In order to tailor to both types of communicators, digital learning
systems should employ both automatic social-network style sharing (for FYI
Communicators) and direct, chat-style interaction (for non-FYI Communica-
tors). Further, since the communication styles of FYI and non-FYI Communi-
cators are at odds, developers should also pay attention to effects of integrat-
ing different communication styles within a single application.
LEVELS OF IDENTIFIABILITY
The use and sharing of learners’ personally identifiable information (PII) de-
serves special attention, because it presents the risk of revealing the identity
of learners to other parties. PII can be defined as any information that could
be used on its own or with a combination of other details to identify, contact
or locate a person, or to identify a person in context. The privacy concerns
associated with PII can be mitigated by allowing users of a digital learning
system to remain fully anonymous.
[Figure: privacy-management profiles and strategies, including friend list management, withholding contact info, withholding basic info, selective sharing, and timeline and wall moderation. The Selective Sharer leverages more advanced privacy settings, while the Privacy Balancer applies moderate levels of privacy management.]
Continuous tracking may create a digital panopticon that restricts user free-
dom. Therefore, users should be given easy-to-use notice and control mech-
anisms to manage the boundary between leisure and learning. Additionally,
users’ runtime activity should be carefully protected through a combination
of strict access control, de-identification, obfuscation, encryption, and/or cli-
ent-side personalization (see later sections).
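As a small example of the de-identification tactics just mentioned, runtime activity can be keyed to a pseudonym rather than a raw identity. The sketch below uses a keyed hash; real deployments would need managed keys and a broader de-identification strategy.

```python
# A minimal sketch of one de-identification tactic: replacing a learner's
# identifier with a keyed hash (a pseudonym) so runtime activity can be
# analyzed without directly exposing who performed it. The secret key is
# a placeholder; real systems need managed keys and more than hashing.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(learner_id: str) -> str:
    """Return a stable pseudonym for a learner identifier."""
    digest = hmac.new(SECRET_KEY, learner_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# The same learner always maps to the same pseudonym, enabling analysis
# of activity streams without raw identities.
print(pseudonymize("mary@example.edu"))
```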
INFERENCES
Instead, one could build a learning platform where all users, departments, and
organizations share the same centralized components. However, a single en-
tity that collects the data of all users creates an attractive target for hackers.12
A good trade-off is therefore to put these components at a level that is “low”
enough for learners to trust but high enough to allow efficient mobility and
user-modeling synergies. In other words, data/insight mobility problems can
be reduced through portability requirements and standardized APIs.
Another question is how each learning application on the platform can access
learners’ data. Since users are likely to trust different applications to differ-
ent extents, an access control mechanism is needed to allow applications to
optimally utilize the learners’ data while at the same time respecting each
learner’s privacy preferences. A recent development in adaptive systems is to
perform the calculations required to compute adaptations “client-side” rather
than on a centralized server. Research shows that such client-side methods al-
The end-user license agreements of most modern online services claim full ownership over the personal information they collect about their users. The
legality of this claim is questionable though: The legal concept of “owning
information” is still new, and laws are still being written about this topic.
Moreover, preliminary investigations among users show that there are merits
in granting end-users ownership of their personal information, and it may ex-
pedite the movement of data among different digital learning systems. How-
ever, data ownership is not exclusive, and it may be desirable to give other
entities (e.g., applications, employers, researchers) partial co-ownership over
an individual’s data. These co-owners should request minimal amounts of
data, avoid duplicate storage, and de-identify data where feasible.
In the 401(k) model, learners formally own the data, but they can partially
delegate the responsibility of making decisions regarding their data to a fidu-
ciary, such as a teacher or administrator. As a “data steward,” this fiduciary
would then be allowed to make decisions on the learner’s behalf; although,
there should be a strict policy that outlines the limits of these powers. This policy can outline several practices that are always allowed, never allowed, or require the explicit consent of the user. In the latter case, such consent should not just be a notice with an option to “opt out.” Rather, it should ask the user to formally opt in to the proposed practice—this makes it more likely that learners will make an informed consent decision.

Structure data ownership like a 401(k)
Finally, when more than one party has a say over the disclosure and use of cer-
tain data, Private Equality Testing can be used to create a Two-Person Con-
cept solution (a concept proposed by U.S. Air Force Instruction 91-104 [16])
that prevents any single person from intentionally or unintentionally leaking
data or becoming victimized by extortion or social engineering attacks.
Data Sharing
Data collected in digital learning systems can be used for purposes outside the
system. One such purpose is to make the data available to the learner them-
selves, which allows quantified self–like innovations. Beyond this, learning
systems can allow learning materials, activities, and outcomes to be shared
with fellow learners (enabling social learning experiences), researchers (cat-
alyzing learning innovation), and employers (informing organizational de-
cision-making). This section covers the privacy-related consequences of the
social, academic, and organizational use of data collected and generated by
digital learning systems.
QUANTIFIED SELF
By sharing learner data with the learners, themselves, digital learning sys-
tems can create a “quantified self” experience that allows them to gain in-
sights into their own data. For example, carefully constructed personalized
infographics can allow individuals to explore the common and unique sides
of their identities.14 Such insights are an important reason for many people to
accept the potential privacy intrusions that come with wearable technologies
and constant tracking. As such, the quantified self can be a motivating factor
behind the data collection efforts of a digital learning system. Also, the quan-
tified self can be a catalyst for learning. Translating self-tracked parameters
into a game-like structure can create new motivational and heutagogical sup-
port structures that encourage and enable users to push themselves further.
Learning data can also be used for research and organizational decision-mak-
ing. Privacy experts argue that secondary use of information should be ex-
plicitly communicated to users, otherwise they may be surprised to find out
about it and feel that their privacy is violated.16 Moreover, there are laws and
regulations surrounding research and employment-related practices that need
to be adhered to. For example, whereas employment discrimination is ille-
PRIVACY NOTICES
CONTROL MECHANISMS
Simple privacy controls can help users take control over their privacy settings.
For example, in social sharing settings, recipients can be grouped to simplify the decision landscape, and graphical representations of the control matrix can help users understand and manage their sharing patterns. Selective infor-
mation sharing is just one of many strategies users may employ to alleviate
privacy tensions. Likewise, privacy control can be provided in more diverse
and intuitive ways than a traditional “sharing matrix” in which users specify
who gets to see what. Research has found that it’s important to give users the
privacy features they want, lest they experience reduced connectedness and
miss out on social capital.20
PRIVACY NUDGING
Nudges are subtle yet persuasive cues that make people more likely to decide
in one direction or the other. An example of a privacy nudge is a justification
that makes it easier to rationalize a privacy decision. Justifications include
providing a reason for requesting the information, highlighting the benefits of
disclosure, appealing to social norms, or providing a symbolic character
to represent the trustworthiness of a recipient (e.g. a “privacy seal”). Another
approach to nudging users’ privacy decisions is to provide sensible default
settings, which tend to nudge users in the direction of that default.
The privacy nudges evaluated to date usually only work for some users,
however, and they leave others unaffected or even dissatisfied. Some
researchers argue that this is because nudges take a “one-size-fits-all”
approach to privacy.22 Since such nudges are rarely good for everyone, they
may actually threaten consumer autonomy. It’s therefore best to only use
nudges if there’s good reason to believe they match the preferences of the
users they target.
[Figure: elements of user-tailored privacy, including data, users, recipients, user characteristics, user behaviors, other factors, organizational constraints and practices, defaults, recommendations, and justifications.]
1. A digital learning system normally tracks users’ location (Data) in order to give context-relevant training exercises (Organizational practice). However, user-tailored privacy knows that, like many young mothers (User characteristic), Mary (User) does not want her location (Data) tracked outside work hours (Other factor). It therefore turns the location tracker off by default when Mary is not on the clock (Default).

2. David needs to decide how to share his recent milestones—two certificates he’s just earned (Data)—within his organization (Recipient). Due to the rules of his employer (Organizational constraint), user-tailored privacy requires him to share these milestones with his direct supervisor (Recipient). Moreover, from his previous interactions (User behaviors), user-tailored privacy knows that David keeps close ties to several other divisions. User-tailored privacy therefore suggests (Recommendation) that he share his new certifications with the heads of these divisions (Recipient) as well, explaining that they’re likely to be interested in exploiting his newly gained skills (Justification).
USER-TAILORED PRIVACY
The next step is to model privacy. This can be done in a way that matches the
learners’ current privacy practices; however, in some cases, it may be better
to suggest privacy practices that are complementary to their current practices,
and in still other cases, it may be best to completely move beyond learners’
current practices. The model can also take the practices and constraints of
users’ organizations into account. Finally, using this user model, user-tailored
privacy can personalize the privacy settings of a digital learning application
as well as the justifications it gives for requesting certain information, its pri-
vacy-setting interface, and its learning recommendation practices.
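As a rough illustration of that last point, the sketch below hard-codes a single context-aware default in the spirit of the Mary example above. The attribute names are hypothetical; a real user-tailored privacy model would infer such preferences from a user model rather than assume them.

```python
# Hypothetical sketch of a user-tailored privacy default with a justification.
def location_tracking_default(user, on_the_clock):
    """Return (tracker_enabled, justification) for one user in one context."""
    if user.get("prefers_off_duty_privacy") and not on_the_clock:
        return False, "Tracking is off by default outside work hours."
    return True, "Tracking enables context-relevant training exercises."

mary = {"name": "Mary", "prefers_off_duty_privacy": True}  # illustrative model
enabled, why = location_tracking_default(mary, on_the_clock=False)
print(enabled, "-", why)
```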
Implementation Recommendations
We recommend several steps in the development process that will both build
intuitive privacy controls into the design of the learning ecosystem as well as
create privacy-sensitive recommender agents to guide learners.
1. DECISION-MAKING
Build trust: Ensure that the learning applications originate from trustworthy
sources. Employ sensible data collection practices and a privacy-by-design
philosophy from the outset. Finally, provide contextualized privacy control
mechanisms and easy-to-understand privacy information.
2. COMMUNICATION STYLE
“What we found when we studied the FAA is that the lines between training and
operations are blurring. …Aircraft have sensors with analytics; so, they can make
profiles and tell if pilots do something unsafe. It allows the FAA to look into a pro-
gram to provide information back to pilots. But the pilots, being union-driven and
structured, said “No, you can’t watch us!” So, they made a union the go-between
guardian for that data. This way, if there is an issue, there are a series of approvals
and guardians of the data, so that the pilot can’t be punitively damaged but can be
informed.” – Michael Smith, Senior Technical Specialist, ICF
3. LEVELS OF IDENTIFIABILITY
6. MANAGE ADAPTATIONS
Let learners know about secondary data use: Communicate secondary data use
practices to learners, and indicate exactly which data are used and for what
purpose.
CHAPTER 9
ANALYTICS AND
VISUALIZATION
Shelly Blake-Plock
Both fields differ slightly, for instance, based on their origins, primary
application areas, and preferred AI algorithms.1 Learning analytics grew out
of efforts in the semantic web, and its practitioners tend to emphasize
big-picture analyses and decision-support for teachers and learners.
Educational data mining developed out of the adaptive instructional
technologies tradition, and it tends to focus on automated adaptation and
reductionist modeling.2 For our purposes in this chapter, we’re less concerned
with the finer details distinguishing the two disciplines. Instead, we’re
focused on their shared purpose: understanding and applying data-intensive
approaches to education and training, particularly for large-scale learning
data—so-called big learning data.3
As the phrase “big data” implies, training and education analytics often (but
not exclusively) employ machine learning techniques. Machine learning is a
subset of AI that uses algorithms to automatically uncover patterns in data to,
for instance, assign classifications, estimate the influence of different variables
on downstream outcomes, or make predictions based upon historical data. In
the training and education domain, these applications have notably matured
over the last 20 years, coalescing into the two communities mentioned above.
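As a minimal sketch of what such machine learning looks like in code—using a toy data set and hypothetical features, not any particular system described here—a classifier can be trained on historical outcomes and then used to estimate a current learner’s risk:

```python
# Toy example: predict whether a learner will fall behind, from
# (logins per week, quiz average, days since last activity).
from sklearn.linear_model import LogisticRegression

X_train = [
    [12, 0.91, 1],    # frequent logins, strong quizzes, recently active
    [3, 0.54, 14],    # sparse activity, weak quizzes, long absence
    [8, 0.78, 2],
    [1, 0.40, 21],
]
y_train = [0, 1, 0, 1]  # 1 = fell behind in past cohorts

model = LogisticRegression().fit(X_train, y_train)

# Estimate risk for a current learner and alert an instructor if warranted.
risk = model.predict_proba([[4, 0.60, 9]])[0][1]
if risk > 0.5:
    print(f"Alert instructor: estimated risk of falling behind = {risk:.2f}")
```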
But what can you do with these tools? People have applied analytics to a
variety of learning systems. For instance, some applications use analytics to
predict engagement and then recommend personalized resources to encour-
age students’ participation.4 Others can analyze students’ interactions and
proactively alert instructors as to which students may need help.5 One well-known
example, Purdue University’s Course Signals, used current data from an LMS
combined with historical data (such as course attendance and prior grades)
to forecast which students would fall behind in a course and then alert both
learners and their teachers about their risk levels.6 Other tools apply similar
retention management approaches across an entire student body, identifying
those at highest risk of dropping out—in time for the administration to in-
tervene.7 Basically, any of the analytics applications we’ve come to expect of
For instance, in the domain of sales and marketing, event-based data has in-
creased our capacity to understand the market and prospective customers. It
provides a window (for example, via analysis of social media streams) into
the story of the prospect’s journey, both as it relates directly and indirectly to
a product or service offering. In the entertainment industry, streaming data
informs recommendations of content, such as movies and television shows on
Netflix. In politics, streaming data helps analysts identify and capitalize on
public sentiment and social trends.
We often see a desire to digitize the analog world. We wear digital watches
that resemble their windable cousins. We create “offices” in our computers,
mirroring the components of the physical workplace. In education, we digitize
chalkboards, loose leaf, and books. But the inclination to recreate the analog
world within the digital domain eventually confronts both the limits of ana-
log practice as well as the more esoteric surprises of what, when it works in
our favor, we call innovation. When we move from tangible “things,” such
as chalkboards and books, to conceptual practices and processes, such as as-
sessment, the situation gets particularly dicey. Esoteric and nuanced concepts
become oversimplified to the point of caricature. This leads to notions being
thrown around such as, AI will replace educators! or Automation could never
substitute for teachers!—arguments that tend to betray a misunderstanding of
both AI and teachers. However, in the world where access to learning is dis-
tributed across the internet, expansive in breadth and always available, there
are practical limits to the analog approach of teaching. While there’s little
danger that AI will “replace” human teachers, their role—and the way we im-
plement training and education, writ large—needs to evolve in collaboration
with evolving technologies.
Data at Scale
Contrast the analog “data set” with the contemporary “data assets” created
by social media newsfeeds. These data assets support the creation of time
series–based behavioral profiles that hold the activity records, built up over
time, from users’ behaviors on social media platforms, including likes, com-
ments, shares, photo posts, video watches—all user actions. These become
part of the user’s behavioral profile, and, in turn, become nodes on a vast
social graph. Each node owns a narrative. That data asset is key to the social
media industry’s business model. It’s the aggregate of these profiles that cre-
ates the opportunity for more targeted advertising, and, at scale, it’s a most
impressive record of formative experiences—of individuals, yes, but more so
of vast aggregate populations.
For social media data assets, value isn’t encapsulated in a single pinpoint
score. It’s not even found in the ability to estimate a single user’s likelihood of
accepting a given advertisement (although this certainly brings some benefit).
Rather—or, at least, more importantly—value derives from the cumulative
amalgamation of all these behavioral profiles. The power is in the aggregate. Only
the scale of the aggregate provides the rich raw data necessary to uncover the
array of patterns, categories of human interest, and shared narratives of hu-
man experience. It’s a matter of scale. Similarly, the challenge streaming data
poses to the traditional view of assessment comes down to a matter of scale.
A gradebook at scale will never offer the insights into learning experiences
that an activity feed at scale can provide. This isn’t to denigrate gradebooks;
rather it’s a reminder to recognize their functions and where their value lies.
Consider a typical gradebook full of letter grades and percentages. In
one sense, this table of letters and numbers offers a substantial bit of
information about how one student may have progressed over time or
how she compares to her peer group’s scores. But in another sense—in
the sense informed by a world of streaming data, where data convey a
narrative about students’ digital experiences—the gradebook tells us little
about what actually happened, how it was done, and what it suggests
about the learner. The gradebook, and the modes of assessments
that inform it, are analog technologies. They’re no worse than digital
technologies merely because they’re not computerized, but they are
technologies reflecting an earlier paradigm—a paradigm ill-equipped to
support learning at scale in a digitized, interconnected world.
Learning practitioners have long sought to increase their insights into forma-
tive development. For instance, teachers may subconsciously wonder, How
far along is each student in his or her learning journey? Unfortunately, dif-
ficulty in gathering the data points needed to make confident and continuous
formative appraisals makes the alternative—a big summative assessment—
seem like the only option. This can be understood as a scale problem. Yet, by
leveraging activity and event-based data in a manner similar to what social
media employs, we can create formative profiles of learners. These, in turn,
can empower (human) educators and trainers to make better decisions about
instruction and help them tailor guidance in ways that would otherwise be im-
possible. We can similarly empower learners, administrators, systems teams,
content and experience providers, and a whole host of constituents across the
learning ecosystem with information relevant to improving, and making more
meaningful, their own pieces of the puzzle.
The result of this merging of activity and event-based streaming data, along
with the subsequent human applications of the knowledge derived from it,
could offer a path towards something of a Golden Age for formative assess-
ment—but this Golden Age doesn’t stand a chance if either the technologies
or instructional strategies employed fail to attend to the matter of scale.
The time is ripe to investigate new models of assessment that take advan-
tage of advancements in cloud services, streaming data architectures, APIs,
and a new generation of web-based applications. By applying these tools to
learning, we can surface meaningful patterns previously too obscure, if not
overly complex, to act upon.

“One key topic of focus for future learning is data analytics. We currently
use very fanaticized or ritualized measures, like time on task or changes in
knowledge in a single area. How do we get that mind reset to the galactic view
of learning?” – Elliot Masie, Founder, The MASIE Center

This prompts us to consider a wholly new human-machine model of assessment
for the digital age, not simply a digitized version of analog assessment
at scale. For example, it’s routinely noted
that automation can maximize the effi-
ciency and timeliness of tactical learning
interventions (e.g., micro- and macro-ad-
aptations). However, automation can also
help identify those interventions best
addressed by a human—who, in a web-
scale context, needn’t be a single preas-
signed instructor. Rather, learners could
be served by a distributed network of po-
tential teachers and mentors, and based upon various automated analyses, the
system could recommend the optimum (human) learning facilitators for dif-
ferent situations (including, potentially, the individual learners, themselves).
In this way, we enable widespread distribution, not just of individual instruc-
tion, but of the entire ecosystem—including its human capital.
This suggests a new paradigm for learning and assessment, one where
machines and humans complement one another—a symbiotic system.
It’s almost cliché to say, “learning is a journey.” But when most people use
this platitude, it’s possible they really mean, “Sure, you’re going to find
out new things in the future, but this class ends in three weeks and you’d
better finish this learning by that time.” An assumption of the learning
ecosystem concept, and the closely related philosophy of personalized
lifelong learning, is a shift away from output-focused, time-based
learning—characterized by high-stakes summative tests—and instead
towards a more process-focused outlook on learning—supported by a
steady stream of formative assessments. This represents a fundamental
shift for learning and assessment—away from discrete mathematics and
towards continuous equations.
IMPLEMENTATION
RECOMMENDATIONS
Because the field of streaming data and the capabilities it supports are still
emerging, we expect future innovations to eclipse the suggestions made in
this chapter. But in terms of a starting point, the section below outlines practi-
cal implementation steps to consider when looking to bring this new wave of
digital transformation to bear.
data, including the shape of the data model and where, when, and how it was
delivered and stored. Also document the status of the current data architecture
and system design, and information about its previous incarnations (if any),
including its historical levels of use and expectations for the scale to be
served by the new system. Finally, as appropriate for any project, catalog the
known risks and protocols (such as privacy, data governance, and security);
the objectives and goals of digital transformation, so as to provide guidance
on what new data sources will need to be integrated into the system to provide
desired metrics and insights; and the timeline, scope, and budget, in order to
best enable (what will most often be) a phased approach to implementation of
the complete system.
Practitioners often make mistakes during the data design phase that only sur-
face later in the process. To limit exposure to errors, poor design, and the ac-
cumulation of technical debt, it’s useful to work backwards. Begin by laying
out key questions; simultaneously, it’s helpful to draw prospective
visualizations for these questions, particularly in collaboration with their respective
end-users. Next, identify performance indicators that provide insights to those
questions, and determine what data sources may best inform these perfor-
mance indicators (whether or not those data sources currently exist). Then
design the “ideal” data model, incorporating the hypothetical data sources
previously identified; take care to deliberately consider how different data
sources may react to one another and how data from multiple sources may be
needed to inform recommended actions—possibly including actions taken by
other providers within the larger ecosystem. Once this optimum data model is
developed, look for available data sources to fill, or at least partially address,
its proposed components; also, consider potential limitations or access issues
with these data. Finally, revisit and tailor the visualization mock-ups to the
final data model.
There are a variety of ways to visualize data. Key factors to consider include
the velocity of data streaming through the system, the shape of the data, se-
mantic features including both human- and machine-readable attributes, po-
tential correlations or potential false flags among the data, and the metrics
necessary to demonstrate progress towards key performance indicators. Ad-
ditionally, strive to design visualizations to be as transparent as possible, to
help end-users build appropriate levels of trust in the algorithms and make
informed decisions based upon the analyses they depict.
3. Architecture Development
Once the conceptual data model is designed, the next step is to develop it.
When applying the xAPI specification to capture and store data, an xAPI
Profile should be used, either an off-the-shelf Profile or, if none suffice, then
a new one created for this system. xAPI Profiles define the accepted terms (or
Next, choices will have to be made regarding the integration of other data
sources. Some learning data sources already may be delivered natively in
xAPI formats. These data will usually be validated and made available by a
learning record store, a particular kind of datastore defined by the xAPI spec-
ification. Standardized data and APIs, such as those offered by xAPI, make
data aggregation relatively easy. However, there may be other learning data or
non-learning activity (such as on-the-job workflows across web services) that
aren’t natively structured as xAPI statements. One option is to instrument the
external source to deliver xAPI data, but this can be difficult when working
with proprietary third-party software. An alternative is to coerce the data into
an xAPI format using API methods. However, it won’t make sense to force
all data into an xAPI-based data model. There’s no reason to transform data
into xAPI formats if it’s not a good fit. Instead, this heterogeneous data either
may be modeled to another specification or just passed directly through the
Kafka Streams processor (described below), where it can be subscribed to by
different applications and joined with disparate data in downstream analyses.
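For readers unfamiliar with the format, a bare-bones xAPI statement follows an actor–verb–object pattern. The learner and activity identifiers below are placeholders; the verb identifier comes from the ADL vocabulary:

```python
# Minimal sketch of an xAPI statement; identifiers are illustrative.
import json

statement = {
    "actor": {
        "name": "Mary Example",                       # hypothetical learner
        "mbox": "mailto:[email protected]",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/activities/triage-module",  # placeholder
        "objectType": "Activity",
    },
}

# A learning record store (LRS) would receive this as JSON, validate it,
# and make it available for downstream analytics.
print(json.dumps(statement, indent=2))
```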
Once the native data format and external data streams have been defined,
they’ll need to be implemented within a streaming data architecture. These
can follow several models, but we would usually recommend the Kappa
Architecture11 as the software architecture pattern for a real-time learning
ecosystem. This paradigm treats everything as though it were streaming
data and processes these data into a stream that may be leveraged by various
microservices. This approach generally makes it easier and more efficient to
deal with various forms of data, as opposed to creating polyglot solutions and
maintaining a separate code base for batched and non-streaming data or—in
the case of xAPI—each non-conformant data source or data type that may pass
through the system (e.g., from student information systems, HR technologies,
and legacy databases). In this architectural paradigm, regardless of the nature
of the source, the data comes into the stream as logged events. This is a huge
benefit to real-time analytics because from an operational perspective, the
subscriber to the data stream never has to request that the data producer batch
the data. Instead, the subscriber always has access to the log and can replay
the events in the log as necessary to perform operations.
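The sketch below illustrates that log-replay property in plain Python. A production deployment would use a streaming platform such as Apache Kafka; this in-memory stand-in only shows why an append-only log frees subscribers from requesting batches:

```python
# Minimal sketch of an append-only event log with independent subscribers.
class EventLog:
    def __init__(self):
        self._events = []              # append-only list of logged events

    def append(self, event):
        self._events.append(event)     # producers only ever append

    def replay(self, from_offset=0):
        # Subscribers read from any offset, as often as their analyses need,
        # without asking the producer to re-batch anything.
        return list(self._events[from_offset:])

log = EventLog()
log.append({"source": "lms", "verb": "completed", "actor": "mary"})
log.append({"source": "hr_system", "event": "role_change", "actor": "mary"})

dashboard_events = log.replay()              # full history for a dashboard
alert_events = log.replay(from_offset=1)     # only newer events for alerts
print(len(dashboard_events), len(alert_events))
```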
When considering the integration of data from different sources, it’s
important to carefully consider how users’ identities will be handled.
Identity management should be organized so that everything is kept orthogonal.
When designing a streaming data architecture, it’s also best to keep identity
management and administrative provisioning matters close to the point of
ingress, so that no data elements slip through unaccounted for.
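One common way to honor this guidance—offered here only as an assumption-laden illustration, since the chapter doesn’t prescribe a mechanism—is to pseudonymize identities at the point of ingress, so raw identifiers never travel with the event stream:

```python
# Hypothetical sketch: map raw identities to opaque IDs at ingress.
import hashlib

ID_SALT = "rotate-me-regularly"    # illustrative secret, managed separately

def pseudonymize(raw_identity):
    return hashlib.sha256((ID_SALT + raw_identity).encode()).hexdigest()[:16]

def ingress(event):
    event = dict(event)
    event["actor"] = pseudonymize(event["actor"])   # account for every record
    return event

print(ingress({"actor": "[email protected]", "verb": "completed"}))
```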
4. Deployment
5. Production Implementation
it is supposed to. Because other services may be depending on data from that
vendor in order to process jobs, breaks such as this can cause bottlenecks that
affect the larger system. For that reason, it’s crucial that stream-processing
systems be attended to by services teams, either locally or via managed
services. Luckily, making fixes is usually a relatively painless process so
long as you’ve done your due diligence into the quality of the data sources
feeding into your system. Further, because most breaks will be caused by
things like changes to endpoints or reconfigurations of APIs, they’re usually
well-documented and part of the product plan shared with the team—meaning
most breaking changes will be telegraphed well in advance and can be
planned for.

Some practitioners use the acronym FATE when discussing Fairness,
Accountability, Transparency, and Ethics in AI.
Just as important to the success of the analytics and data visualizations ser-
vices within the future learning ecosystem will be scalability and extensibil-
ity. Advances in learning tools, web technologies, and AI are likely to alter
future learning analytics and data visualizations. Likewise, social changes in
behavior, expectations, methods of instruction, access to learning, and pref-
erences among both formal and informal learners will influence the nature
of the events captured in activity data streams. The technologies deployed to
serve learning analytics and data visualization objectives, therefore, should
be as flexible, extensible, and open as possible. The systems must be built to
withstand whatever is thrown at them. Dedication to open source standards
and specifications will aid in meeting this need.
Conclusion
In the end, the quality of insights gleaned from analytics and visualizations
will be tied to the quality of their data models, the velocity and variety of the
data they employ, and the accuracy of the data’s representations. As the truism
goes, there are lies, damned lies, and statistics.13 Statistics, and even more so
infographics and visualizations, when misapplied can obfuscate the “truth” of
data. It’s far too easy to make bogus claims, given any data set—particularly
one as complex, personal, and socially and culturally situated as learning.
Consequently, the design of the data, application of algorithms, and layout of
visualizations are of great consequence. Small decisions during these design
and development phases can lead to significant downstream effects—hopefully
positive ones—for learners and other learning stakeholders.

“In a learning management system, you can get a gradebook, much like analog
systems today but available online. But with the advances in assessment
analytics, you can delve much deeper to gain insight into how reliably your
questions and tests are measuring what they’re supposed to measure. You can
determine if your question bank is fair, valid, and reliable. You can see in
multiple views in a dashboard, and you can even see it within, and eventually
across, education, defense, commercial, and healthcare.” – Stacy Poll, U.S.
Public Sector Business Development Manager; Senior Account Manager, Questionmark
CHAPTER 10
PERSONALIZATION
Jeremiah Folsom-Kovarik, Ph.D., Dar-Wei Chen, Ph.D.,
Behrooz Mostafavi, Ph.D., and Michael Freed, Ph.D.
Customized experiences, like those a skillful tutor might craft, are the gold
standard for learning, but these don’t scale well, given the costs and limit-
ed availability of expert teachers and trainers. Computer-assisted instruction
can mitigate scalability issues, and personalized learning technologies can (at
least partially) unlock the benefits of one-on-one learning, similar to working
with a personal mentor.3
Overall, this field is fast growing, and new technologies are improving the
sensitivity, impact, efficiency, and cost-effectiveness of personalized systems
every day. The following sections outline a general approach to designing and
deploying personalized learning, with a particular focus on how new adaptive
learning capabilities will inform the future learning ecosystem.
DESIGNING PERSONALIZED
LEARNING
When preparing to implement a personalized learning approach, it’s useful
to consider which aspects of a learning experience are most impacted by per-
sonal differences as well as how instructional elements might be varied in
response to those differences. The availability of historical, real-time, and
external data sources will also influence the adaptive system. The next three
subsections step through high-level considerations for data collection, data
analysis, and what and how to personalize learning.
Data Sources
Learner performance data can include both static data, such as historical
test results and portfolio scores, and more timely data from quizzes,
exercises, simulations, and other activities within the given instructional ex-
perience. Learner performance can be used to inform complex inferences,
through methods such as item-response theory or Bayesian knowledge-trac-
ing; simpler approaches, such as comparisons to threshold metrics and pop-
ulation norms, also provide some utility. However, even basic learner-perfor-
mance data isn’t always easy to collect; sometimes, for instance, individuals
or organizations may feel threatened by the measurement and recording of
their scores. Despite this, learner performance data makes a big difference to
personalization; it’s worth the effort to devise quality measures, collect the
data, and analyze them carefully.
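To ground the Bayesian knowledge-tracing mention above, here is a minimal single-skill update loop. The parameter values are illustrative rather than calibrated:

```python
# Minimal sketch of Bayesian knowledge tracing for one skill.
P_TRANSIT = 0.2   # chance of learning the skill during a practice attempt
P_SLIP = 0.1      # chance of answering wrong despite knowing the skill
P_GUESS = 0.25    # chance of answering right without knowing the skill

def bkt_update(p_known, correct):
    """Update the mastery estimate after one observed response."""
    if correct:
        evidence = p_known * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_known) * P_GUESS)
    else:
        evidence = p_known * P_SLIP
        posterior = evidence / (evidence + (1 - p_known) * (1 - P_GUESS))
    # Account for learning that may occur during the attempt itself.
    return posterior + (1 - posterior) * P_TRANSIT

p = 0.3                                       # prior mastery estimate
for response in [True, False, True, True]:    # hypothetical quiz results
    p = bkt_update(p, response)
    print(f"estimated mastery: {p:.2f}")
```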
Related to both learner performance and sensor data, learner experience data
refers to event-based data that describe what learners see and do. Compared to
learner performance data, learner experience captures not just the outcomes
but all the steps that explain each outcome—the fine-grained, step-by-step
activities a learner (or other relevant human or machine agent in the setting)
performs. These could include pausing a video, selecting (and then changing)
a quiz answer before submitting, or requesting help from an automated tutor.
siderations; these could include the digital devices available to that learner
(e.g., smartphone versus laptop), the number of seats available in a particular
course, or cost and time constraints. Organizational factors may also inform
personalization in various ways. As one example, consider how the design
and delivery of learning might change depending on whether someone is
completing a training course for workforce compliance reasons, because of
professional development goals, or out of personal curiosity.
Another form of external data comes from human observations and inputs,
including from learners themselves, their peers, instructors, and supervisors.
For instance, an instructor might input a critique about a student’s persuasive
writing, or an observer/trainer might score exercise trainees against a perfor-
mance rubric. A student may even self-report data, or it might come from peer
evaluations or 360° surveys. (The point is, it’s not necessary for all aspects of
the future learning ecosystem to be digitized and automated! In fact, this is
an important area for ongoing research, i.e., how to best integrate technology
with learning facilitators in a symbiotic—rather than substitutional—way.)
Finally, it’s important to note that learner data is often more useful when it’s
more robust, more personal, and more contextualized—but these same char-
acteristics also increase privacy concerns. A balance must be carefully struck.
(Refer to Chapter 8 for a more detailed discussion.)
Data Analyses
Collected data need to be analyzed in some meaningful way, and then the sys-
tem should use those analyses to make diagnoses, predictions, and adaptation
decisions. What kinds of decisions can personalized-learning technologies
make? The most obvious answer is they can estimate learners’ content mas-
tery and then take actions to fill capability gaps and remedy misconceptions.
People learn at different rates, and some of the most impactful interventions
a system can make are simply to ensure each learner progresses at his or her
optimal pace so that all learners reach mastery, without skipping over import-
ant subcomponents or suffering through already-known materials.
In addition to mastery, many individual states and traits impact learning and,
thus, can be useful targets of analysis. Learner states are malleable features
that change from moment to moment, while learner traits are more fixed and
change only over longer periods of time, if at all. Affective states, such as
frustration or boredom, can reduce individuals’ motivation to learn; physio-
logical states, such as hunger or lack of sleep, can also affect learning, both
by impacting emotions and by moderating cognitive functions. As mentioned
earlier, personality traits (e.g., goal orientation and general self-efficacy) can
also provide some insights; additionally, personal characteristics, such as so-
cial identity traits or learning goals, may be useful.
Finally, aggregations of data from many learners over time can inform trend
analyses or, at sufficient scale, be used to train machine-learning algorithms
that uncover hidden patterns. At a minimum, collective data can provide some
Adaptations
The next important consideration concerns the kinds of adaptation the sys-
tem will make. This could involve modifications to many factors, including
display elements, what and when content is presented, the task sequence, the
contents of instructional materials, embedded content features (e.g., selection
of relevant examples), extrinsic content features (e.g., feedback and hints), in-
structional strategies and tactics, delivery methods, delivery devices, perfor-
mance standards, learner goals, and various other interactions. These forms
of adaptation can be expressed, to a greater or lesser extent, at the micro-,
macro-, and meta-levels.
team. As this example highlights, different learning systems use distinct, and
often complementary, approaches.12 Intuitively, each experience might work
better (or worse) for each learner. Consider, for instance, how professional
development goals, workshop scheduling logistics, available technologies, ur-
gency of earning the licensure, and risk tolerance of the organization might
affect the way the hypothetical medic is trained.
Technological Considerations
extensive and highly secured digital storage for massive amounts of learn-
er data, flexible servers capable of processing online AI algorithms at scale,
or federated systems that share data across APIs. Similarly, depending upon
the selected data sources, unique hardware devices may be required, such as
wearable sensors, environmental beacons, or instructor input tablets.
BANDWIDTH
DATA
Data models can be informed by extant data, whether collected through large-
scale validation and norming studies, from other applications in a learning
ecosystem, or from centralized data repositories. A word of caution, however:
More isn’t always better. It’s important to judge the extent to which previously
collected data accurately reflects the current population. In precision settings,
for example, bias has been detected from differences as subtle as the order of
questions within a test.14 As this highlights, data quality is a key concern—
whether data come from external sources or from system-collected inputs.
Resilience to error, completeness, objectivity, fairness, timeliness, and consis-
tency (to name a few) are all critical factors for personalization.15
MACHINE LEARNING
CONTROL
Transparent and explainable systems let users see why and how an application
works, but what if those stakeholders want to control some of its functions?
Systems can allow learners, instructors, and other human stakeholders to in-
fluence their estimations and/or actions. This sort of human-machine teaming
is an ongoing area of research.20 Ideally, learning stakeholders should be able
to retain the kinds of control they want while they offload tasks to comple-
mentary technology that augments them with faster processing of large or
detailed data.21
USABILITY
BUILDING EFFECTIVE
PERSONALIZED LEARNING
Ultimately, the purpose of personalization is to help individuals achieve learn-
ing objectives more effectively and efficiently. But how do we determine how
well a particular system—its data, analyses, and adaptive interventions—performs?
The first question to ask is whether a system is functional, i.e., does
it give different learners experiences that fit their needs? Can we verify that
it performs as designed and expected? It’s useful to break these evaluation
factors down into several categories. For instance, how does the system—as
a software application—perform? Consider elements such as: the amount of
work done by a user without help from the system, time-related information
about the work processes, information related to the accuracy of underlying
models, and the behavior of users in interacting with the system. It’s also
useful to evaluate the content within the application, for instance the extent to
which a system produces recommendations for every possible target learning
outcome, quality of the instructional “catalog” the system draws from, and
quality of instructional interventions made.
CONCLUSION
Personalization is among the most important ways to achieve effective learn-
ing outcomes, and computer-assisted personalization can bring this benefit to
more learners. The field of learning science has advanced our understanding
of what and how to adapt learning (through decades of research in educational
theory and cognitive science), and innovations in technology are improving
our ability to implement these methods, efficiently and effectively at scale.
CHAPTER 11
ASSESSMENT AND
FEEDBACK
Debra Abbott, Ph.D.1
The future learning ecosystem will change the management and processing
of learners’ data across systems, communities, and time. As new analytics
capabilities evolve, they will catalyze change in several ways: by increasing
the level of insight into how learners develop over longer periods of time, by
enhancing the ability of instructors to make teaching more responsive and
adaptive, and by recommending experiences and learning pathways designed
to meet the needs of individuals. However, new technologies won’t enhance
learning if they’re applied without purpose. The current system too often elic-
its an abundance of learner performance data without making effective use
of it. And, too often, other factors essential to learning—such as motivation
and long-term goals—are ignored, or learners receive feedback that’s neither
useful nor actionable and, hence, quickly forgotten. This chapter lays out an up-
dated framework for assessment and simultaneously emphasizes the impor-
tance of analyzing the intent behind assessment activities, reforms available
through improvement of formative feedback, and affordances required in a
technology-enabled system of assessment.
PRECONDITIONS FOR
ASSESSMENT: THE ESSENTIALS
In Visible Learning, John Hattie names two elements as “essential to learning”:
(1) a challenge for the learner and (2) feedback.3 Similarly, both factors serve
as a foundation, or as the minimum requirements for, assessment. If challenge
is insufficient, neural connections are neither strengthened nor altered in a
learner’s brain, and if useful feedback isn’t present, the learner is acting blind-
ly, unable to relate her performance to either current or future learning goals.
New-age learning analytics have moved the needle considerably as they allow
for continuous, real-time monitoring of performance and can present up-
to-date dashboards to stakeholders. This is a far cry from assessment in the
age of our grandparents. For most of the 20th century, a “factory model” of
training and education prevailed and, with it, an assumption that teaching is a
transmission process, with learners on the receiving end. The goal was to fill
everyone’s head with knowledge and deliver a uniform product, the graduated
student, to society. Instructors were told that a period of teaching needed to be
followed by an assessment, followed by another period of teaching and another
assessment, ad infinitum until a program of instruction ended.
So, in this new age must we always be assessing? What’s best for learners?
For now as well as in the foreseeable future, some forms of student work
and performance will be prioritized above others as the significance of any
given assessment is socially constructed. For example, in adult education, as-
sessments that mirror authentic types of workplace tasks may be more great-
ly valued and better serve to articulate learning objectives. It’s important to
recognize that not all actions or learning artifacts individuals produce will
have equal value relative to learning goals, program objectives, or learning
outcomes. Part of the challenge, therefore, lies not only in designing and de-
livering effective assessments but also in prioritizing their applications and in
considering their broader roles within the learning ecosystem.
Building upon the progress made to date, assessment in the future must con-
tinue to empower education and training stakeholders. Understanding assess-
ment is no small feat, but to start, it’s useful to clarify the true purpose for
systems of assessment, including for singular high-stakes assessments, and to
encourage a mindset shift away from 20th century preconceptions that cou-
ple valid measurement almost exclusively with summative measures such as
tests, papers, quizzes, and the like. It’s also useful to become versed in devel-
opments arising from research in formative assessment, as well as its close
cousin, feedback—which has a symbiotic relationship with learning. Finally,
as we embrace a more technology-centric approach to learning, it’s useful
Purpose of Assessment
The best way to determine the reason for doing the assessment is by
examining the focus of the plan. Is the focus simply on collecting data?
Or is the focus on using data to improve student learning? Assessment
plans designed to appease others generally involve a lot of data collec-
tion but are rarely put to meaningful use. Plans that focus on student
learning connect collected data to potential courses of action.4
Kimberly Eckert
Teacher, Brusly High School
Louisiana State Teacher of the Year 2018
As evaluation enters the picture, it widens the aperture about the purpose and
utility of assessments. Evaluations and other macro-level assessments should
emphasize measures of effectiveness, that is, meaningful outcomes in terms
of the impact of learning, such as college admittance rates or improvement in
job performance. Measures of effectiveness are contrasted against measures
of performance, or process-focused measures such as a student’s grade-point
average or how many people completed a training workshop.
This distinction gets to the heart of training and education. Whether individ-
uals are enrolled in a high school composition course, corporate training pro-
gram, or professional military education seminar, the aim of most formal and
informal learning is to engender practical competence—competence that’s
necessarily instantiated in a particular context or environment. For exam-
ple, if you tell students to achieve a set of general communication outcomes,
they’re likely to shrug and disengage. However, if you focus those students on
writing their college entrance essays, corporate work plans, or five-paragraph
field orders, they’re not only likely to show greater motivation but assessments
of their abilities are apt to be more authentic, meaningful, and reliable.
One of the most persistent problems in (adult) training and education stems
from inadequate understanding of how applied performance—real people
1. Serviceable Feedback
used to iteratively close the gap between the actual level and desired level of a
particular capacity. Assessment results that don’t meaningfully inform some
aspect of teaching and learning, or that fail to help this progression, are con-
sidered “dangling data.”
The term “feedback” is not only vague but itself a misnomer. Assessment
expert Dylan Wiliam is fond of saying that it more aptly refers to the view
from the front windshield rather than the rearview. It can refer to performance
observations or advice, reflective prompts and questions, or other information
relevant to an individual or group; and it may refer to past, present, or future
performance.
So, as long as teachers and trainers deliver accurate and relevant feedback,
what’s the difficulty? It was Sadler 6 who again uncovered the key: There are
several reasons a learner may have trouble implementing feedback—even if
it’s of exemplary quality and delivered early enough in a period of instruction
to be useful. First, the line may be blurry for the learner between the work
as realized and what was intended; individuals may see potential where in-
structors may see flawed work. Second, terminology or criteria related to the
instructional task may not be understood. Third, students may fail to grasp
tacit knowledge. For example, statements such as “this doesn’t follow logical-
ly from what goes before” make no sense to students who don’t recognize
the hallmarks of subpar writing structure: It looks fine to them. Last, learners
often cannot consolidate or apply advice fast enough for learning to stick. To
be effective providers of feedback, then, teachers and trainers need to better
understand learners’ own visions of their work, their challenges, and any gaps
in their learning. Also, learning facilitators would be wise to implement learn-
er self-assessments and peer assessment, since both can go a long way toward
meeting these needs.
Another model for the creation of more comprehensive and appropriate feed-
back comes from the work of John Hattie and Helen Timperley.7 They believe
that learners need three questions to be answered concerning their perfor-
mance. First, they need information about the performance goal, which an-
swers the question, “Where am I going?” This includes specific and compre-
hensible success criteria and is referred to as the “feed up” stage. It’s followed
by the “feedback” stage, which answers the question, “How am I going?”
Lastly, the question is, “Where to next?” This final stage is called “feed for-
ward,” and it’s probably the most critical juncture for applied learning and
development. Hattie and Timperley also identify four targets for feedback:
feedback about the task, about the processing of the task, about self-regu-
lation, and about the self as a person. Their three questions apply to each of
these categories, and together these twelve targets become a useful, heuristic
catalog for learner feedback.
2. Evidence-Based Systems
Shute and her colleagues advise against hiding assessments or evaluating in-
dividuals without their awareness; rather the term “stealth” refers to the fric-
tionless integration of the measurement, where it’s inherently situated within
a task rather than exogenous to it. Two other characteristics of
stealth assessment are that it’s continuous (in contrast to single-point summa-
tive testing) and probabilistic (in contrast to the predefined criteria frequently
used by standardized exams with well-defined correct and incorrect answers).
“The big power of this technology is that we can construct these interactions,
collect this data on students’ interactions, and use it to drive very powerful
feedback loops in the learning system.” – Candace Thille 10
With the growing use of automation, we run the risk of disempowering learn-
ers, teachers, and trainers. Despite their enormous potential, automated sys-
tems are only as strong as their weakest link—which is very often the user
interface and user experience. Even today, in arguably simpler times, comput-
er-assisted instruction is fraught with UI/UX design challenges, delivery tool
mismatches, and assessments that learners perceive as irrelevant. While new
technologies can enable more frequent and better attuned assessments, these
may be relatively meaningless if they fail to offer learners and instructors
sufficient interaction affordances, such as for understanding and making use
of the assessments, feedback, and subsequent intervention recommendations.
3. Learner Autonomy
Dron extended transactional distance theory to highlight that control, or
the extent to which choices are made by teachers and learners, is its
fundamental dynamic. The central idea is that flexibility, negotiation
of control (or “dialogue”), and autonomy all matter a great deal in learning
contexts.11 The solution isn’t as simple as giving learners (or instructors) full
autonomy; rather, a thoughtful approach, considerate of control, is needed. As
Dron explains:
Most learning transactions tend towards control by either the learner or,
more often, the teacher. From a learner perspective, being given control
without the power to utilize it effectively is bad: learners are by defini-
tion not sufficiently knowledgeable to be able to make effective deci-
sions about at least some aspects of their learning trajectory. On the oth-
er hand, too much teacher control will lead to poorly tailored learning
experiences and the learner may experience boredom, demotivation, or
confusion. Dialogue is usually the best solution to the problem, enabling
a constant negotiation of control so that a learner’s needs are satisfied...
The ideal would be to allow the learner to choose whether and when to
delegate control at any point in a learning transaction.12
RECOMMENDATIONS
Given the principles of assessment and feedback, as well as the opportunities
(and challenges) afforded by new technologies, there are several precepts to
consider regarding assessment and feedback for the future.
stored in persistent learner profiles, and examined in aggregate. This will start
to shed more light on competencies in situ as well as the interplay among di-
verse knowledge, skills, attitudes, and other characteristics.
sign ensures that learners receive useful information that’s timely, actionable,
and customized to their needs.
Conclusion
It’s strange that we don’t hear more frequent comparisons made between
the practice of teaching and the practice of medicine. Both require intense
amounts of skill, professional development, and consistent practice. As as-
sessment expert Dylan Wiliam says: Teachers need professional development
because the job of teaching is so difficult, so complex, that one lifetime is not
enough to master it.20 Mastering assessment in teaching is a bit like mastering
triage skills in the emergency room, in that successful intervention depends
on successful evaluation of the unique situation of each individual. And, yes,
because so much of our survival and future success depends on acquiring
effective training and education, one’s learning needs often are (at least in a
theoretical sense) as urgent as many health needs. Perhaps because nearly all
of us have been coaches, trainers to workplace apprentices, or teachers to our
own children, the instructional process may have lost its mystique somewhere
along the way. Hopefully, a clearer vision may help us appreciate the mystery,
regain some enthusiasm, and redefine as well as reimagine assessments to
work more effectively and purposefully to uplift and motivate our students.
CHAPTER 12
INSTRUCTIONAL STRATEGIES
FOR THE FUTURE
Brenda Bannan, Ph.D., Nada Dabbagh,
Ph.D., and J.J. Walcutt, Ph.D.
Background
Fostering more cohesive, coherent learning will likely involve designing some
manner of “macro-level instructional arcs” that span a mosaic of individual
and collaborative learning experiences—meaningfully intersecting different
events across a lifetime. It will also require us to make better use of multimod-
al communication tools to help individuals curate information and generate
knowledge across experiences. This position reflects the connectivist view of
learning, which perceives knowledge as a network, influenced and aided by
socialization and technology.1 From this standpoint, knowledge isn’t only con-
tained within an individual or information artifact; it’s also distributed exter-
nally through networks of internet technologies and communities, accessible
via social-communication tools. Learning takes place in these autonomous, di-
verse, open, interactive, collaborative, and global knowledge systems. Hence,
recognizing relevant information patterns, constructing new connections,
and nurturing and maintaining connections become critical skills for achieve-
ment. Individual learning opportunities can be (and have been) designed
with this paradigm in mind; 2 the full solution, however, requires even more.
global level, and “instructional tactics” at a more granular one.5 Exactly where
the lines are drawn between these levels is a bit fuzzy—and largely irrelevant
to our discussion. What’s more applicable is the general idea that there are
instructional design distinctions at different conceptual and granular levels.
One final distinction for the future learning ecosystem is belied by its name.
Why is it an ecosystem; why not just a regular, old system? An ecosystem, by
definition, is composed of interconnected parts, with the behaviors of many
individual agents affecting one another as well as the environment’s over-
all holistic pattern. It’s a dynamic system, in the engineering sense, involv-
THE EXPANDING
CONTEXT OF
FUTURE LEARNING
To advance instructional theory, it’s necessary to expand its design towards a
modern, longitudinal view of learning, one that facilitates connectivist prin-
ciples and seeks to amplify outcomes throughout an array of teaching and
learning situations, across multiple contexts, diverse learning objectives, and
disparate learning modalities. This section outlines eight principles likely to
shape the purpose and application of instructional strategies in this complex
future context.
Explicit in the “ecosystem” concept are the notions of diversity and intercon-
nectivity. Most relevant, here, are the diversity of learning experiences and
their complex interconnectivity with one other. As humans, all of our experi-
ences naturally affect one another. The question is not simply “how to ensure
learning episodes are somehow additive,” but rather how to intentionally build
meaningful and effective connections among learning episodes that advance
overall learning goals. Even within a relatively constrained setting, like a sin-
gle course, instructors and instructional designers need to broadly consider
multiple and varied learning modes and,
importantly, how to help connect learners’
experiences across them. As a simple example, consider a semester-long class
that incorporates face-to-face seminars, online courseware, an additional
smartphone app used to remediate some students, and informal resources, such
as videos or blogs, that students find online. Courses that blend these sorts
of resources are already common. Part of the challenge, however, is gracefully
navigating the available set of learning-resource options and intentionally
integrating them so that they not only coexist but also correlate.

“The transitions for learners from K–12 to postsecondary education are
significant, and if we really want to learn about accumulated learning, we
have to have data systems that talk to each other. In the science standards,
we’re thinking about the progression of learning over time. Learners need
time to digest what they’re learning in a deep way.” – Heidi Schweingruber,
Ph.D., Director, Board on Science Education, National Research Council, U.S.
National Academies of Sciences, Engineering, and Medicine

This mosaic of learning components, of course, is often more complex than
this example describes. In reality, learning experiences span multiple formal and in-
When a child learns to read, we first start by teaching sounds and letters; once
these are learned, we teach words, sentences, punctuation, grammar rules,
comprehension, and eventually one day maybe professional investigative
journalism or creative screenwriting. The point is that different capabilities
emerge from the integration of competencies at a given level of analysis. The
3. Connect across different levels of abstraction
professionals to consider how new concepts fit into their jobs? Can we build
something more than the sum of the learning parts?
As discussed in Chapter 4, cognitive overload poses a serious problem for
individuals, who can readily become overwhelmed by the sheer amount and
velocity of information. Learners need new supports that help them filter out
“noise” and meaningfully integrate the relevant “signals.” If not addressed,
we run the risk of increasing information acquisition to the detriment of deep
comprehension and robust knowledge construction. The multilayer, intercon-
nected model we’ve discussed in this section emphasizes this complexity. The
challenge for learning professionals is to help learners navigate through infor-
mation overload and to develop the internal cognitive, social, and emotion-
al capabilities needed to self-regulate against it. Some strategies to support
this have been discussed in prior chapters, including social and emotional
competencies (Chapter 4), self-regulated learning skills (Chapter 15), and
social learning supports (Chapter 14). Mentoring learners in these areas can
help, as can specifically teaching techniques for managing overload including
connectivist skills, curation, and metacognition.
systems help learners and their teachers manage learning resources. Looking
ahead, learning professionals will need additional tools and mentorship strate-
gies to continue to support such curation activities across increasingly “noisy”
and diverse settings.
This section has outlined guidance for instructional strategies as well as pos-
sible interventions to help develop and activate learners’ own internal learn-
ing strategies. This final item highlights that both internal expert-directed
learning controls and learner-directed self-regulatory interventions are
critical. Over time, individuals should develop the desire and ability to exert
more independent control. However, many learners need help cultivating their
self-directed learning abilities; hence, a negotiated mix of instructor-controlled
and learner-controlled approaches is needed. The role of the instructor in
these new multidimensional contexts, therefore, needs to expand and grow
in flexibility, shifting to encompass the roles of activator, facilitator, coach,
mentor, and advisor.8
STRATEGIES FOR MEANINGFUL FUTURE LEARNING
The prior section outlined eight principles for the application of instruction-
al strategies in the future learning ecosystem context; however, it didn’t de-
scribe the strategies themselves. Hundreds of instructional strategies and,
likely, thousands of corresponding tactics have been tried and tested. Rather
than provide a litany of these, we’ve identified five generalizable principles
of meaningful learning well-suited for instructional strategies in this context.
[Figure: Attributes of meaningful learning — Intentional (goal-directed and regulatory), Constructive (articulative and reflective), Authentic (complex and contextualized), and Cooperative (collaborative and conversational).]
For example, the instructors might engage the EMT trainee in intentional, goal-di-
rected, and regulatory behaviors to prompt a connection between what she
learned in the EMT training course and how she can extend the physical and
cognitive dimensions of EMT training into future paramedic training.
In the EMT example, this means engaging the EMT trainee, who (let’s say)
is now a paramedic, in authentic (complex, contextualized) and cooperative
(collaborative, conversational) activities to help her think about how to extend…
STRATEGIES FOR MEANINGFUL LEARNING
Instructional strategies such as scaffolding, modeling and explaining, and coaching and
mentoring can support meaningful learning within and across different levels:12
Macro-level instructional strategies can inform larger and larger units of in-
structional and professional development, and adding meta-level structures
also helps support a lifetime of growth across multiple careers, experienc-
es, and interests. This supports continual expansion of knowledge, multiple
learning itineraries based on learners’ competencies and interests, and multi-
ple tools for manipulating resources. This includes not only formal learning
experiences but also informal and life experiences, all intimately connected.
SUMMARY
Instructional strategies can incorporate interventions, such as scaffolding,
modeling and explaining, and coaching and mentoring, to provide the glue
that meaningfully supports connected and cohesive experiences across a
learner’s lifetime. Thinking about the continuum of future learning, we need
to consider these strategies at multiple levels—not only within a particular
instructional event or course of study, but across learners’ longitudinal trajec-
tories. Accordingly, a significant challenge for the future is the differentiated
application of instructional interventions across conceptual areas, learners’
developmental phases, content modalities, and levels of abstraction—while
also considering the impact of composite learning experiences.
CHAPTER 13
COMPETENCY-BASED
LEARNING
Matthew Stafford, Ph.D.
Competency-based learning isn’t new. It evolved from the following four in-
novations: the parsing of learning into specific chunks of skills and knowledge; the creation of learning outcomes to clearly establish levels of mastery; assessments that allow learners to demonstrate their mastery; and, most recently, a focus on the learner and the learning (outputs) versus a focus on the teacher, the curriculum, and the time invested (inputs).
The first of these advances traces back centuries to the age of guilds and
apprenticeships. Master craftsmen parsed their specialties into a variety of
discrete tasks and then trained their apprentices to perform those activities to
appropriate levels of mastery. Another remnant of the age of guilds is the concept of varying levels of mastery. Aspiring craftsmen started as apprentices and advanced through the assorted levels. Only after demonstrating mastery of every aspect of the craft would the tradesman graduate from the apprenticeship at full craftsman status.
It may sound shocking to some; however, this is how most informal learning
occurs. Someone buys a lawnmower and turns to YouTube to figure out its
assembly and how to get it running. Someone else goes online to figure out
how to change the oil filter in an antique automobile. Gamers have special
websites to share tips on how to win in their favorite video games. Even those
who sit and practice the lost art of “reading the manual” are benefiting from
informal, self-directed learning. In each case, there are no formal classes.
In competency-based learning, performance is key; performance standards are held constant while time may vary.
Suppose a coach goes into the team assembly room and explains:
Next Friday, I’m going to put this 48” stick into the ground
vertically, like this. I’ll expect each of you to jump over it without
touching it. Those who do so will accompany me to the track-
and-field competition the next day.
…the ‘how’ of performing job tasks, or what the person needs to do the job
successfully,” per the U.S. Office of Personnel Management.5
In their 1999 work, The Art and Science of Competency Models, Anntoinette
Lucia and Richard Lepsinger offered a slightly different conceptualization.9
Readers can see in the figure below how Lucia and Lepsinger's approach correlates with Spencer and Spencer's; however, the pyramid provides better insight into the ways in which some characteristics support others and how, when combined, they all manifest in behaviors—i.e., in performance.
[Figure: The Competency Pyramid per Lucia & Lepsinger, with definitions from Spencer & Spencer. At its base are aptitude and personal characteristics, including motives, which drive, direct, and select behavior toward certain actions/goals.]
Lucia and Lepsinger argued that aptitude and personal characteristics are
foundational, and while such characteristics may be innate, they can be influ-
enced. Skills and knowledge, of course, are more easily affected; they can be
imparted through training and education—through development. At the top of
the pyramid, all of the characteristics manifest in behaviors—in performance.
In the scholarly literature, however, there are many examples of purely cognitive
competencies, such as analytical thinking, critical thinking, conceptual think-
ing, diagnostic skill, and commitment to learning, to name a few. Like their
vocational counterparts, these cognitive competencies are transferable—ap-
plicable to a wide variety of educational pursuits.
CONCERNS
Minimizing Learning
Achieve reached out to states inviting them to be lead state partners in developing
the standards to an overwhelmingly positive response, resulting in 26 state partners.
This was the start of the tagline, "For States, By States." The collaborative approach continues today with the launch of Achieve's Science Peer Review Panel to enhance the implementation and spread of high-quality lessons aligned with the Next Generation Science Standards, creating a sense of ownership and providing the tools to implement. To date, 19 states and the District of Columbia have adopted
the Standards, and 21 additional states have developed their own standards based
on the Framework.
Quality
As we begin the 21st century, evidence abounds that executive and lead-
ership development has failed to meet expectations. Unless we change
our assumptions and think differently about executives and the develop-
ment process, we will continue to find too few executives to carry out
corporate strategies, and the competence of those executives available
will be too often open to question. The “competency model” of the ex-
ecutive, proposing as it does a single set of competencies that account
for success, must be supplemented with a development model based on
leadership challenges rather than executive traits and competencies. Ex-
ecutive performance must focus on “what gets done” rather than on one
way of doing it or on what competencies executives have.12
VISION
A national competency-based system will enable a great deal of flexibility.
So much of our education system is based on where you live and how
much money you have. We’re lacking national equity. But if you learned
it, it should count. I don’t care where you learned it. Lots of people
aren’t being served by the current system, but they should be. By 2025,
60% of Americans will need a postsecondary credential. We currently
don’t have a system that can produce those results unless we leverage
every postsecondary learning opportunity and everyone together.
Translation
…for every currency interaction (pesos to dollars, dollars to rubles, and fenings
to pesos, for instance), particularly as their values fluctuate. There would be
so many different currencies to track! Why not leverage the same approach
used for different currencies? Evaluate the relative value of the currency in
relation to commodities. If one knows how many of a given currency it takes
to purchase a commodity, such as a loaf of bread or barrel of oil, currency
conversion is easy.
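To make the analogy concrete, here is a minimal sketch in Python of converting between currencies through a shared commodity. The prices are invented for illustration; in the competency context, the "commodity" would be a shared competency framework against which each institution's credentials can be valued.

    # Hypothetical price of one reference commodity (a loaf of bread),
    # expressed in each local currency.
    bread_price = {"peso": 40.0, "dollar": 2.0, "ruble": 180.0}

    def convert(amount, from_currency, to_currency):
        """Convert by way of the shared commodity: money -> loaves -> money."""
        loaves = amount / bread_price[from_currency]
        return loaves * bread_price[to_currency]

    # 100 pesos buys 2.5 loaves of bread, which is worth 5 dollars.
    print(convert(100, "peso", "dollar"))  # 5.0

The point of the sketch is that no pairwise conversion table is needed: each currency (or credential) is valued once, against the common reference.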
IMPLEMENTATION
RECOMMENDATIONS
1. Decide if competency-based learning
is right for the organization
…organizations similar to your own, with similar missions and challenges. Look at what they did to embrace competency-based learning and how they've employed it. And, as much as possible, learn from others' mistakes!
Next, construct and validate a competency model. There are several approach-
es one can take. Many institutions simply select from existing models where
the competencies, performance levels, and other accoutrements seem to fit
their needs, modifying their model as needed during validation. A second
method is job analysis. With this approach, researchers dissect the various jobs performed within an organization, figuring out what core and/or occupational competencies are required and at what proficiency levels. Typically, the
researchers will interact with workers to ensure the analysis is thorough and
that all competencies have been properly identified. Another method involves
leveraging panels of experts, surveys, and interviews to create a competency
model. This is a fairly common approach and benefits from the fact that most
organizations fail to capture the full breadth of tasks and knowledge within
the human capital management documentation. A final method, and one rated
most effective by experts, is a criterion sampling method. With this approach, researchers work with organizational members to establish criteria to identify the most outstanding performers. Applying these criteria, the researchers
then interview these performers to determine “what makes them tick” and
what competencies make them so successful in their jobs. The resulting mod-
el helps drive workforce development by focusing on the competencies most
closely aligned with success—outstanding performance—thus benefiting
both the employer and employees.
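As a rough illustration of what such a model might look like in data terms, here is a sketch in Python. The role, competency name, and five-point proficiency scale are invented; organizations would substitute their own structures.

    from dataclasses import dataclass, field

    @dataclass
    class Competency:
        """One competency, with observable behaviors per proficiency level."""
        name: str
        # Maps a proficiency level (1 = novice .. 5 = expert) to the
        # behaviors a learner must demonstrate to certify at that level.
        behaviors_by_level: dict = field(default_factory=dict)

    @dataclass
    class CompetencyModel:
        role: str
        competencies: list

        def gap_analysis(self, demonstrated: dict, required: dict) -> dict:
            """Return how far each required competency falls short."""
            return {name: required[name] - demonstrated.get(name, 0)
                    for name in required
                    if required[name] > demonstrated.get(name, 0)}

    # Hypothetical usage: a worker at level 2 against a level-4 requirement.
    model = CompetencyModel("EMT", [Competency("Patient assessment")])
    print(model.gap_analysis({"Patient assessment": 2},
                             {"Patient assessment": 4}))
    # -> {'Patient assessment': 2}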
If employees who reach the desired levels of mastery prove to be outstanding performers, the model has a high degree of predictability and validity. If, however, those employees who reach all of the desired levels of mastery are still found wanting, then the model probably needs more work. With predictability—the "gold standard" for competency models—it's easy to see why criterion sampling is a preferred
method for creation and validation. Perhaps not surprisingly, starting with top
personnel offers a shortcut to creating a model capable of predicting outstand-
ing performers!
Once a model has been successfully created and validated, the next step is to
develop authentic assessments through which learners can demonstrate levels
of mastery. For industrial and vocational organizations, assessments can be
based on actual job performance. For most technical skills, workers need only
demonstrate their ability to perform their work-specific tasks correctly to earn
certification for a given level of mastery. For “soft skills” and cognitive com-
petencies, the assessments are usually more difficult. As noted previously, ed-
ucational programs usually rely on samples of behavior and faculty judgment
to assess competency mastery. A student required to demonstrate mastery in
multiplication, for instance, with levels of mastery determined by the number
of digits in the numbers being multiplied, would never be asked to multiply
every possible combination of appropriate-length numbers. That would be
ridiculous. Similarly, a student required to construct and deliver persuasive
arguments would only have to perform this task a limited number of times be-
fore a faculty member felt confident in certifying a level of mastery in the task.
There are standardized tests for soft skills, for example, the California Critical Thinking Skills Test and a number of leadership and communication
assessments. The key to building or selecting assessments is to ensure they’re
valid (i.e., assess what they are supposed to assess), reliable (i.e., consistently
produce similar results), and authentic (i.e., match similar challenges learners
will encounter outside of the classroom—in the workplace, for instance).
This is where creativity and ingenuity can pay big dividends. If program
leaders pursued a criterion-sampling approach to creating and validating a
competency model, they may be able to ask those outstanding performers,
“How did you learn that?” The same is true of others able to demonstrate mas-
tery of competencies without taking any institutional classes or courses. The
answers can be fascinating. It may turn out, for instance, that an employee…
This isn’t a simple task. Those responsible for this will have to consider the
broad array of users who need access to the information. Certainly, learners
need to know how they’re progressing—where they’re strong, where they’re
weak, and what they need to do to achieve their learning goals. For education-
al institutions, faculty and staff will need access to the information. There’s
also a need for transcripting learning progress for sharing with learners and
other institutions. Industrial entities will have a variety of data-users as well.
Like students, workers will want to know where they stand. Supervisors will want to know how their individual workers are progressing in their development, and also where their teams are strong or weak in terms of needed competencies. Similarly, progressive levels of supervision will want insight into this aspect of workforce development. (See: www.uschamberfoundation.org/workforce-development/JDX)
Within the military, the term force readiness describes how ready a military
force is to execute its warfighting mission. Competency-based learning pro-
vides a granular look into force readiness, providing senior leaders insight
into where they need to invest their developmental resources. Prior to World
War II, the U.S. Marine Corps correctly anticipated the nation would face a
war in the Pacific. The Corps purchased equipment to effect beach landings;
however, there was also a corresponding need to teach Marines to fight in this
extraordinarily challenging, sea-to-shore environment. In essence, the Corps
determined a new competency was required, assessed the developmental need
this new competency created (gap analysis), then began training Marines to
execute the new mission.
Summary
As noted at the beginning of this chapter, competency-based learning isn’t
new. It is, however, an exciting way to approach learning. The power it gives
to learners—the control they have over their own learning journeys—creates
an excitement both for the learners and those guiding them to their eventu-
al goals. Competency-based learning also fosters creativity as both learners
and leaders seek new ways to attain and demonstrate mastery. Lastly, com-
petency-based learning offers that “common currency” that permits learners,
workers, and their institutions both to understand developmental needs and to share achievements across institutional barriers.
CHAPTER 14
SOCIAL LEARNING
Julian Stodd and Emilie Reitz
Technology is the most visible manifestation of change we see around us: the
rise of social collaborative technologies, leading to the proliferation of con-
nectivity, and the democratization of organization at scale. Put simply, we’re
now connected in many different ways, almost all of which are outside of the
oversight or control of any formal organization or entity.2 In network terms,
there’s high resilience and great redundancy in our connections—which is significant. Historically, mechanisms of connection were local and tribal, or
large-scale and formal. We connected within formal hierarchies and formal
organizations, and within those spaces, we were expected to conform, to wear
the “uniform,” use the appropriate “language,” and accept the imposition of…
In turn, this leads to a shift in power across individual and collective and for-
mal and informal dynamics. There’s a broad rebalancing taking place around
the world, slowly draining power away from formal systems (hierarchy) and
into social ones (community). An important part of shifting power dynamics
is the fracturing of the social contract between individuals and organizations.
The notion of “career” is evolving; it no longer emphasizes lifelong loyalty
between an employee and a company. Instead, our public reputations, our
personal networks, and the broader communities that surround us become our
“job security.” 3 This has broad implications for learning and development.
The type of learning these new entities offer is different. No longer hindered
by decades of organizational stagnation and “known knowledge,” it’s typi-
cally more dynamic, co-created, contextual, adaptive, and free. This speaks
to the challenge of how organizations need to adapt to the new ecosystem:
Clinging to old models of organizational design (nested power structures),
formal learning (learning as a form of control), formal hierarchies of power
(systems of consequence), and known knowledge (unchallenged, static orga-
nizational dogma) is a surefire way to be disrupted, from the level of organi-
zations up to the scale of nations, themselves.6 And hence, the old structures
of formal power are ceding some of their relevance—unless they can adapt.7
We’re used to seeing training and education as discrete parts of a stable sys-
tem, but today, in the context of the Social Age, learning and development
are dynamic parts of a dynamic system—and we must adapt them to fit the
changing times, not just the new modes of delivery available. In other words,
our adaptations must fundamentally readdress the design, facilitation, assess-
ment, and support of learning. We must develop new methodologies for learn-
ing, and invest heavily in the communities and social leaders who will deliver
these new capabilities so that we don’t simply survive—but thrive, and avoid
disruption and failure, in the Social Age.
Delving into semantics may kill us, but let’s briefly consider the nature of
knowledge, not at the deepest philosophical level but at the rather mundane
and practical one: Our ways of knowing are changing. We’ve moved from
“concentration” to “distribution.” Where once we memorized and codified
knowledge, and held it in libraries, books, vaults, and experts (in concentrat-
ed “centers of learning”), today it’s dispersed, distributed, and free—yet, not
without its problems (validity, bias).
If we worry about validity to the point where we take no action, then we can’t
benefit from social learning. Conversely, if we liberate social learning with no
account of the risks, we’ll be overtaken by it. We must learn to balance both,
in a persistent dynamic tension.
[Figure 1: Learning is changing. Against the backdrop of the Social Age, the type of knowledge we engage with every day has changed; it's often co-created, geolocated, adaptive, and hidden within our social communities, which span hierarchical, hybrid, and social systems.]
The formal system is everything an organization can see, own, and control.
Formal systems are where we create formal learning, and they’re extreme-
ly good at certain things: collectivism, consistency, and achieving effects
at scale. Flowing around and through the formal system are social systems.
These aren’t held in contractual relationships but in trust-based ones. The
social system is multilayered, contextual, often internally conflicted, and ever
changing. Social systems are also good at certain things that formal ones ar-
en’t: They’re good at creative dissent, gentle subversion of outdated process-
es, questioning of systems, radical creativity, social amplification, movement,
momentum, curiosity, and innovation.
FACILITATING A SOCIAL
LEARNING CULTURE
1. Create the conditions for effective social learning
First, consider that in social learning, individuals will engage with formal
assets (stories written by the organization, codified and accepted knowledge),
social assets (tribal, tacit knowledge, held within the community), and individ-
ual knowledge (worldview, preconceptions, biases, and existing knowledge).
From a design perspective, one can, for example, vary the amounts of for-
mal knowledge provided, create conditions for sharing tribal knowledge, and
schedule reflective opportunities for individuals to explore their own experi-
ences. The “scaffold” in Scaffolded Social Learning represents these struc-
tures. In other words, this scaffolding supports specific activities designed
to facilitate and integrate formal, social, and individual learning, and to help
people “make sense” of it all, both individually and collectively as a group.
One person may bring an example that seems terrible to others, and another person might offer one that seems off-track. Hence, another step is to encourage the co-creative behavior of interpretation. This is where someone writes a narrative, shares a story of precisely why they see the case study as relevant or how it relates to their personal journey. In other words, this involves interpreting the thing they curated and exchanging stories across the community.
Will we agree? Well, that doesn’t matter: Social learning isn’t about conformi-
ty and agreement; it’s about broadened understanding, context, and perspec-
tive. We don’t get to deny the validity of others’ examples, but we’re absolutely
allowed to challenge and engage in debate about them. Indeed, challenge can
be another co-creative behavior: I tell a story, you respond, I try to paraphrase
your story, you respond, we both collaborate and respond to a third story, and
we come together to co-create an overall narrative.
While you can technically measure anything, the pertinent question may be,
How will that information be used? The collaborative technologies often used…
At the heart of social learning are the learning spaces—the places people
come together to carry out collective sensemaking activities. To be very clear,
space means something very different than community. Consider the analogy
of building a new town: You can build houses, landscape gardens, construct
a mall, and pave a town square. You can even move people into those houses.
But none of this creates the community. It only begins to emerge when two
of those people come together, on a street corner, let’s say, and have a conver-
sation about what a terrible job you’ve done on the brickwork. The buildings
form the space; the conversation forms the foundations of the community.
Spaces for social learning might be a classroom, a chatroom, or some kind of
learning management system—however, none of those are the community.
SENSEMAKING ENTITIES
Coherent communities are sensemaking entities; they help figure out infor-
mation, identify misinformation, determine value, and recommend responses.
Our social communities help us to filter the signal from the noise, and then to
understand those signals. In the context of social learning, where much of the
sensemaking is done in the community, this helps provide a diversified view,
and the more diverse in worldview, experience, cultural profile, and capability
the community is, the more effective its sensemaking can become.11
MECHANISMS OF ENGAGEMENT
Within formal systems, we’re assigned roles by the organization, but in social
systems, our roles are more nebulous and change more often. Sometimes we
bring specific expertise, resources, or capability; sometimes we bring chal-
lenge, sometimes support, and other times we’re cross-connectors, linking
different communities. Sometimes we simply come to learn. When consider-
ing social learning communities, it’s worth remembering that we don’t need
everyone to engage in a certain way; we just need broad engagement. It’s fine
for people to take diversified roles.
There’s a role for ritual; in our own research, people described the “rituals of
welcome and engagement” as the single most important factor for their future
success within a community. Such rituals are something within our control;
when designing the scaffolding for social learning, we can actively design rit-
uals or consciously adopt existing ones. We can work with community mem-
bers on their rituals of engagement for new members, for example, and can
work with their formal managers on the rituals they’ll use to share stories of
their learning back to the rest of their teams.12 It’s all part of the choreography
of learning. This means we pay equal attention to every part of the learning
experience, from the email that invites someone to join to the registration
instructions they receive, the way we thank them for sharing stories, and the
ways we graduate them at the end. It’s important to script and craft each part
as an element in the overall running order. Pay attention to them all. Together,
rituals and choreography form a powerful tool of community-building and,
ultimately, of learner engagement.
HIDDEN COMMUNITIES
SANCTIONED SUBVERSION
What happens to all this organizational detritus? Typically, it's subverted; people work around re-
dundant systems and suboptimal process. And they do so not only individu-
ally, but collectively too; indeed, when people join a new organization, much
of what they learn, at the local or tribal level in the early days of a new job,
comes exactly from this type of crowdsourced subversion, usually under the
generic banner of “this is how we get things done around here.”
Conclusion
It’s a champagne bottle to uncork with care. The balance between formal sys-
tems of control and socially moderated ones creates an important dynam-
ic tension. When managed effectively, a socially dynamic organization can
emerge, one that integrates the very best of the formal (system, process, hi-
erarchy, and control) with the very best of the social (creativity, subversion,
innovation, amplification). That’s our challenge: to craft more collaborative
models of learning, and to learn how to build an organizational culture in
which learning can thrive both for today and through our emerging future
learning ecosystem.
Americans should have self-sovereign management of their
lives. Right now, medical records are yours but not so much
your educational records; you don’t really control any of that info right
now. We’re working on envisioning what the future looks like following
these guiding principles: to give each person their own destiny,
balance on the supply and demand side…and put it into the hands of
the ones who want to earn the competencies and credentials. It gives
them the power to drive the marketplace. Currently, the providers
squarely have the advantage, but we need to make it a new space
where learners are empowered.
Jeanne Kitchens
Chair of the Technical Advisory Committee for Credential Engine; Associate
Director of the Center for Workforce Development, Southern Illinois University
CHAPTER 15
SELF-REGULATED
LEARNING
Louise Yarnall, Ph.D., Michael Freed, Ph.D.,
and Naomi Malone, Ph.D.
[Figure 15-1: Zimmerman's cyclical model of self-regulated learning. Forethought phase: task analysis (goal setting, strategic planning) and self-motivation beliefs (self-efficacy, outcome expectations, intrinsic interest/value, goal orientation). Performance phase: self-control (imagery, self-instruction, attention focusing, task strategies) and self-observation (self-recording, self-experimentation). Self-reflection phase: self-judgment (self-evaluation, causal attribution).]
Learners need to become skillful at regulating their learning over time and
across different settings, especially to acquire thinking, writing, and analysis
skills.3 However, individuals often struggle to manage their learning without
effective and perceptive external support, such as what a teacher, mentor, or
well-structured piece of courseware might provide.4 Consequently, developing
effective self-regulated learning skills requires educators and trainers to help
learners notice knowledge gaps, try new strategies, and adopt more proactive
mindsets. Incorporating support for this approach into new technologies can
also help learners acquire the meta-level skills needed to manage their own
learning across their lifetimes.
By the 1990s, researchers agreed that learners self-regulate during three iter-
ative phases: the forethought phase, where a learner plans and initiates action;
the performance phase, during which learning actions occur; and the self-re-
flection phase, in which a learner reflects on and evaluates performance, ad-
justing as necessary. Barry Zimmerman, one of the preeminent scholars in
the self-regulated learning field, developed a model of these three phases,
grounded in social cognitive theory (see Figure 15-1).6
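To ground the three phases, here is a minimal sketch in Python of how a learning tool might step a learner through one pass of Zimmerman's cycle. The function names and the strategy-rotation rule are invented for illustration, not taken from the chapter.

    def self_regulated_cycle(goal, strategies, attempt, evaluate):
        """One pass through forethought, performance, and self-reflection."""
        # Forethought: set the goal and pick a strategy (task analysis).
        plan = {"goal": goal, "strategy": strategies[0]}

        # Performance: carry out the learning actions and self-observe.
        score = attempt(plan)            # e.g., work a set of practice problems
        observations = {"score": score}

        # Self-reflection: self-evaluate and adjust for the next cycle.
        if not evaluate(observations):
            # Causal attribution (simplified): blame the strategy, and
            # rotate to the next one for the following cycle.
            strategies.append(strategies.pop(0))
        return observations

    # Hypothetical usage.
    strategies = ["flashcards", "worked examples", "self-explanation"]
    self_regulated_cycle("master fractions", strategies,
                         attempt=lambda plan: 0.6,
                         evaluate=lambda obs: obs["score"] >= 0.8)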
RECOMMENDATIONS
Helping learners develop better self-regulated learning skills will require new
supports, added into the many contexts where people engage in learning. To
cultivate awareness of Zimmerman’s three phases of self-regulated learning
and to develop effective habits at the cognitive, metacognitive, emotional, and
behavioral levels, we propose three conceptual levels of self-regulated learn-
ing support: micro-, macro-, and meta-interventions. The micro-level focuses
on individuals and the tools they use to better navigate a personalized trajec-
tory. The macro-level focuses on how to navigate the selection and progres-
sion across learning experiences. At the meta-level, there's a recognition that
building appropriate learning habits requires focused practice in the cogni-
tive, social, emotional, and physical capabilities that contribute to resilience,
effective decision-making, and lifelong personal growth. We describe appli-
cations of these three levels in the suggested interventions below.
SELF-REPORT INSTRUMENTS
…International studies indicate wide variation in how well both early childhood education and family upbringing set the stage for lifelong learning;17 however, it generally begins with establishing confidence and independence as learners. Over the past 35 years, K–12 education researchers have
found evidence that open-ended instructional practices, such as guided inqui-
ry activities, foster confidence and independence in learning more than other
practices, such as traditional close-ended question-and-answer routines.18 In-
troducing open-ended practices in childhood can help set the conditions for
lifelong learning, but continued support for self-regulation is needed even in
adulthood. For example, some research indicates that those countries with the
highest levels of lifelong learning among adults have robust adult education
systems.19
Past education and experience represent both a potentially rich learning re-
source and a possible threat, since old habits and misunderstandings can block
the grasp of new ideas and procedures. For this reason, educators, trainers,
and instructional designers should incorporate activities and tools to elicit
learners’ prior knowledge and help them reflect on which elements of it are
potential building blocks and which are possible barriers.
More research is needed in this area, however, to uncover new methods for
estimating learners’ prior content knowledge and self-assessed self-regula-
tion skill levels. Since traditional testing can negatively impact learners’ mo-
tivation, finding new assessment methods is a critical step to enhancing per-
sonalization models beyond their current level. Currently, traditional testing
approaches and curriculum sequences favor comprehensiveness and certifi-
cation. Work is needed to understand how adjusting the frequency and forms
of assessment can inspire rather than hinder self-regulated learning. Methods
worth exploring include integration of self-reflective assessments of content
knowledge and self-regulated learning skills with validated measures of tra-
ditional content knowledge and skills.
As learning platforms and media proliferate, the community will need a wider
range of ways to gather trace data on how and under what conditions learners
use self-regulated learning supports. This line of research is likely to inno-
vate around new approaches to using xAPI to collect student data, usefully
aggregate datasets across experiences, and apply learning analytic models to
analyze them. Such work need not focus only on individual learners’ patterns,
but should also consider patterns within content pathways from multiple us-
ers. Such data traces can support more personalized and optimal recommen-
dations of what content to review next and can strengthen systems to covertly…
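To illustrate the shape of such trace data, here is a minimal xAPI statement assembled in Python. The learner, activity, and score are hypothetical; the actor-verb-object structure and the "completed" verb URI come from the xAPI specification and ADL's published verb vocabulary.

    import json

    # A minimal xAPI statement: who (actor) did what (verb) to which activity.
    statement = {
        "actor": {
            "mbox": "mailto:learner@example.com",  # hypothetical learner
            "name": "Example Learner",
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "id": "http://example.com/activities/self-check-quiz",  # hypothetical
            "definition": {"name": {"en-US": "Self-check quiz"}},
        },
        "result": {"score": {"scaled": 0.85}, "completion": True},
    }

    # A Learning Record Store (LRS) would receive this as JSON over HTTP;
    # analytics models can then aggregate such statements across experiences.
    print(json.dumps(statement, indent=2))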
[Figure: A cycle of self-directed learning activities spanning Explore (Discover, Dabble, Familiarize, Bridge) and Sharpen (Refresh, Use, Extend) modes, connected by Assess and Next steps.]
To return once again to the three levels of abstraction, ways to foster post-learning reflection might include: providing lessons to individual learners and learning facilitators about the kinds of useful questions to pose (micro-level); scheduling and building on reflection activities across an extended lesson or project (macro-level); and rewarding learners for engaging in reflection activities, such as offering them the chance to unlock a range of new learning opportunities based on their reflective participation (meta-level).
Summary
Successful self-learners do more than just study and memorize. They stay alert
and are curious to discover new, valuable learning. They skim a lot of content
to find the important points. They search informally to nurture motivation for
intensive study and periodically review afterwards to fight forgetfulness. And
they find the time to do it all.
CHAPTER 16
INSTRUCTIONAL DESIGNERS
AND LEARNING ENGINEERS
Dina Kurzweil, Ph.D., and Karen Marcellas, Ph.D.
For over 60 years, instructional designers have supported teaching and learn-
ing, primarily by identifying effective ways to present material in formal edu-
cational and training environments. Given advances in technology, increasing
access to data, and the explosion of formats and venues for learning, designers
in the future will have to gain more knowledge and expertise than ever be-
fore as they develop their professional craft. Consequently, a new concept is
entering this complex field: the learning engineer. Who are these individuals?
What are their areas of expertise? How do their knowledge and skills relate
to, expand upon, or differ from those of instructional designers? This chapter
describes the history of instructional design and explores how the field of
learning engineering will need to develop and expand upon instructional de-
sign methodologies to support teaching and learning in the future.
…educational knowledge, but usually this isn't the norm. In contrast, learning
scientists are educational researchers who are deeply knowledgeable about
how humans develop and learn, particularly from a cognitive perspective.
Both of these roles can act in support of instructional designers, who apply a
systematic methodology based on theory, research, and/or data to plan ways
to teach content effectively. Instructional designers work in both education-
al and training environments. They’re problem solvers who use different in-
structional models to promote learning. In other words, they’re responsible
for “the theory and practice of design, development, utilization, management,
and evaluation of processes and resources for learning.” 1
Through the 1960s and 1970s, the growth of digital computers influenced
learning theories, and many new instructional models adopted an “informa-
tion-processing” approach. The 1970s also heralded the systems approach to
instructional design, including one of its best-known models, the Systems Ap-
proach Model, published by Walter Dick and Lou Carey.3 The Dick and Carey
approach offered a practical methodology for instructional designers, and it
emphasized how each component of the model works together. Dick and Car-
ey also highlighted how technology, media, and research were all impacting
the field at that time and, consequently, how “modern” instructional designers
differed greatly from their counterparts in the 1960s in terms of academic
background, training, research, and tools.4
Throughout the 1970s and 1980s, the instructional design field continued to
evolve; a later survey of instructional design models found they had differen-
tiated into having a classroom orientation (focused on development of instruc-
tional materials for a single lesson or set of lessons by teachers), a product
orientation (focused on development of specific products by teams), or a sys-
tem orientation (focused on development of curricula by teams).5 Present-day
instructional design continues to have different application specialties, and it
continues to be influenced by technology. However, rather than model instruc-
tional design theories on technology, as in the 1960s and 1970s, contemporary
instructional designers explore ways to incorporate technology into their work.
…uses for learning—but it's still just a tool. While technology can provide
many benefits, its effective use in training and education requires carefully
defining its role and ensuring it remains subordinate to the learning goals. In
recent history, we’ve seen a push for instructional designers to focus more on
technology, shifting emphasis away from instructional theory. However, the
systematic design, development, implementation, and assessment of teaching
and learning requires that instructional designers keep instructional methods
central to their work and examine all technology with an eye towards promot-
ing more effective learning.
…and the Kemp instructional design process. Their approach usually involves
identifying desired outcomes and determining the skills, knowledge, and atti-
tude gaps of a targeted audience. They apply theory and best practices to plan,
create, assess, evaluate, select, and suggest learning experiences to close those
gaps.10 Instructional designers may be involved with the entire instructional
process or with portions of it. For example, early in a project, they’re often in-
volved with the systematic review and critical appraisal of existing materials.
Using research and theory, instructional designers may also conduct analyses
before the actual instructional design and development occur. Later in the
process, instructional designers may emphasize the importance of assessment
and evaluation, to ensure learning experiences have met their intended goals.
A common theoretical and practical understanding of innovation also con-
tributes to instructional designers’ work, and the best instructional designers
ensure their clients, fellow educators and trainers, and leadership recognize
how the different tools, processes, materials, and innovations that make up
learning systems can enhance their learning offerings. Hence, instruction-
al designers need to additionally have a creative spirit of design,11 including
an imaginative, creation-oriented, and interdisciplinary character as well as
the creative spirit to remain flexible and perceptive in their practice. That is,
despite the proliferation of formal processes, such as instructional systems
design, instructional design remains an art—albeit one firmly grounded in
science and theory.
CHANGING CHARACTER
OF LEARNING
The growth of technology and access to learner data has led to advances in
learning science and made the learning environment more complex. This, in
turn, affects the roles of instructional designers, who must now interact with
a variety of formal and informal modes of learning, social and experiential
learning theories, as well as new tools, processes, and people. This complex
infrastructure has been called the “learning ecosystem.” It encompasses the
physical and mechanical elements of educational and training environments;
the theories, processes, and procedures that drive their use; and learners’
(complex) relations to and interactions within that environment. This includes
all elements that make up learning, from the formal classroom and those tra-
ditional instructional activities, to the technologies used to support informal
learning. The complexity of the future learning ecosystem is turning instruc-
tional design into an even more dynamic activity, where designers must be
aware of how all these elements come together, how each works, and how to
best orchestrate learning across time, space, and media.
Learning Engineers
AI: In many ways it’s solving similar problems as before but doing it more
effectively with data. For example, we can search and find content with a
much deeper understanding of its meaning. We can get better at questions
such as: “What’s the student really trying to learn? Can we find the part of
a video that would be most helpful? How else can we make this experience
easier for students?”
Shantanu Sinha
Director, Product Management, Google
Former Founding President and Chief Operations Officer, Khan Academy
…scientific and analytical methods. Learning engineers use data and knowledge
of enterprise structures to help promote good decision-making in the use of
learning ecosystem components. With its focus on data, and in using validat-
ed methods that put learning data to work in the service of improved learn-
ing outcomes and institutional effectiveness, this emerging field takes a step
beyond traditional instructional design. Learning engineers do this, in part,
by combining big data with design-based research to improve the design of
learning experiences.14 Additionally, learning engineers use theoretical and
practical understanding to scale innovations across the learning ecosystem.
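As a toy example of putting learning data to work in a design-based research cycle, here is a sketch in Python. The scores and variant names are invented; a real analysis would use validated measures and appropriate inferential statistics.

    from statistics import mean

    # Hypothetical post-test scores from two versions of the same lesson.
    scores = {
        "variant_a": [0.62, 0.71, 0.58, 0.66, 0.74],  # original design
        "variant_b": [0.70, 0.79, 0.68, 0.81, 0.75],  # redesign with worked examples
    }

    # Compare average outcomes to decide which design to iterate on.
    for variant, values in scores.items():
        print(variant, round(mean(values), 3))

    lift = mean(scores["variant_b"]) - mean(scores["variant_a"])
    print(f"Estimated improvement from the redesign: {lift:.3f}")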
Learning engineers can help with the complexities of integrating various tech-
nologies, workflows, interactions, and data-driven processes to enable learn-
ing. They may engage with widely varying technologies, including learning
management and learning content management systems, mobile learning…
The growing and dynamic learning ecosystem means learning engineers are
likely to play much larger roles in the planning, design, development, and
analysis of diverse and complex instruction. Learning engineers, like instruc-
tional designers, will be expected to anticipate changes or new developments
in applicable technologies or in the instructional fields affecting their spe-
cialty areas and programs. They’ll also need to continually improve their
instructional strategies to reliably identify best practices and opportunities
for change. Accordingly, learning engineers need to possess a wide scope of
competencies, including a foundation in learning science as well as the use of
data to improve learning practice. They need to know good learning design
principles, be conversant in learning analytics and enterprise learning tech-
nologies, and have some unique areas of relevant expertise, such as cognitive
science, computer science, or human-computer interaction.
I don’t think that from a military perspective that we’ve completely taken
advantage of large data management. Here’s a great analogy: We have
hundreds, if not thousands, of hours of full motion video, but how much do
we actually analyze based on the current tools…? Eighty-plus percent isn’t
reviewed in detail. Until recently, we were working on automating that and
that’s one element I look at data management for—turning those mountains
of data into decision-quality information.
Thomas Deale
Major General, U.S. Air Force (Ret.)
Former Vice Director for Joint Force Development on the Joint Staff
…service by helping to link research and teaching, both promoting current re-
search into effective teaching and encouraging faculty members to conduct
such research. Learning engineers can also work in many different industries,
perform many different tasks at various organizational levels, and, indeed,
work side-by-side with instructional designers and other learning profession-
als—but with a different focus.
IMPLEMENTATION
Define the Roles
While they can work together and have some overlapping skill sets, there are
important distinctions between learning engineers and instructional design-
ers. Notably, while learning engineers’ skills are grounded in applied learn-
ing sciences, they additionally emphasize data science, analytics, user expe-
rience, and applied research. Learning engineers also have a greater depth
and range of experience, including some expertise in the implementation and
improvement of learning ecosystems—that is, in working with diverse, tech-
nology-enabled, data-driven learning systems.
Before becoming a learning engineer, someone must acquire the highest lev-
els of knowledge in learning theories, models of learning, data about learning,
research into learning, and the management of learning. They’re also likely
to need higher levels of technical experience than instructional designers. As
such, unlike an instructional designer who can start at the entry level and
develop skills over time, learning engineers must have more extensive edu-
cational backgrounds and prior experience. The mix of knowledge and ex-
perience, or, more specifically, the ability to filter expert knowledge through
the lens of practical experience, helps characterize the learning engineering
approach to instructional solutions.
…and experience. Entrants into a program could have various areas of relevant
expertise, and the purpose of the program would be to engage them in devel-
oping a common vocabulary, breadth of awareness, and solid ability to exam-
ine data to identify learning evidence.
In the end, the graduate from such a program should be able to design and im-
plement innovative and effective learning solutions in complex systems, po-
tentially at scale, and aided by advanced technologies when appropriate. They
should be able to use data and a solid, theory-based evaluation framework to
improve learning and assessment in practice. Whether applied to industry,
government, military, or academic settings, these graduates should bring val-
ue above-and-beyond that provided by traditional instructional designers.
The path to the job of the instructional designer or learning engineer may begin
with teaching in K-12 or higher education; working in technology in corporate,
government, or military environments; holding an academic research position;
or filling some other responsibility related to educating or training people.
Because the U.S. Federal Government has a strict classification system for
employment, and because it employs so many education professionals, it
serves as a useful lens through which to view the learning engineer role. The
Office of Personnel Management classifies jobs in the Federal Government,
and its General Schedule outlines the occupational groups, series codes, and
classifications of positions including their duties and responsibilities, descrip-
tions, and standards.17 Each occupational group (such as the 1700 “Education
Group”) is indicated by the first two numbers of a four-digit sequence, and the
subspecialties in that group fall within the specified range, for instance between 1700 and 1799. The 1700–1799 occupational series covers education and
training–related professions, such as “training instruction” (1712) and “public
health educator” (1725). The requirements and description for learning engi-
neering should be included within this general series.
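A small sketch of that numbering scheme in Python. The series titles for 1712, 1725, and 1750 are the ones named in this chapter; the lookup logic itself is just illustrative.

    # Occupational series codes within the OPM 1700 "Education Group."
    education_series = {
        1712: "Training Instruction",
        1725: "Public Health Educator",
        1750: "Instructional Systems",
    }

    def occupational_group(series_code: int) -> int:
        """The group is indicated by the first two digits: 1712 -> 1700."""
        return (series_code // 100) * 100

    assert occupational_group(1712) == 1700
    assert 1700 <= 1750 <= 1799  # subspecialties fall within the group's range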
Currently, instructional design falls within the 1750 sub-series (i.e., the “in-
structional systems series”). It seems like a clear solution to expand this sub-se-
ries to incorporate the competencies necessary for learning engineers and
related future learning professionals. For instance, the title could change from
“instructional systems series” to “teaching/learning support and instructional
systems series.” This would follow a trend in the industry acknowledging the
importance of supporting teaching and learning, broadly. Also, more detailed
language about the work performed by learning engineers, their education
qualifications, and experience requirements could be added to the description.
Correspondingly, the upper end of this job series should be reviewed to ensure
that pay and benefits are appropriately aligned with the necessary experience
and education. If we don’t reframe this series (or take similar actions), it’s
more likely key learning engineering components will become lost within an
organization or devalued in career planning or performance appraisals; we
also risk learning engineers being conflated with instructional designers.
The success of the instructional designer or learning engineer of the future will
ultimately rest on how institutions and their leaders connect, communicate,
support, and value those specialties. Learning engineers shouldn’t be seen
as “one-time stops” or clearinghouse consultants for educational products.
60YC: THE 60 YEAR CURRICULUM
The dean of DCE [Harvard’s Division of Continuing Education], Hunt Lambert, is leading this
effort to transform lifelong learning, which is now a necessity in our dynamic, chaotic world.
The 60YC initiative is focused on developing new educational models that enable each person
to re-skill as their occupational and personal context shifts. The average lifespan of the next
generation is projected to be 80-90 years, and most people will need to work past age 65 to
have enough savings for retirement. Teenagers need to prepare for a future of multiple careers
spanning six decades, plus retirement. Educators are faced with the challenge of preparing
young people for unceasing reinvention to take on many roles in the workplace, as well as for
careers that do not yet exist.
On-the-job learning is familiar to most adults; many of us take on tasks that fall outside of our academic training…but our children and students face a future of multiple careers, not just
evolving jobs. I tell my students to prepare for their first two careers, thinking about which is a
better foundation as an initial job—but also building skills for adopting future roles neither they
nor I can imagine now…Given this rate of change, education’s role must be long-term capacity
building—enhancing students’ interpersonal and intrapersonal skills for a lifetime of flexible
adaptation and creative innovation—as well as short-term preparation so that they are college-
or career-ready. Education must also advance two other goals beyond preparation for work:
to prepare students to think deeply in an informed way and to prepare them to be thoughtful
citizens and decent human beings…
The 60YC initiative centers on the least understood aspect of this challenge: What are the
organizational and societal mechanisms by which people can re-skill later in their lives, when
they do not have the time or resources for a full-time academic experience that results in a
degree or certificate? Thus far, attempts to address this issue have centered on what individual
institutions might do. For example, in 2015 Stanford developed an aspirational vision called
Open Loop University. Georgia Tech followed in 2018 with its model for Lifetime Education.
The hallmarks of these and similar models center on providing a lifelong commitment to alumni
that includes periodic opportunities to re-skill through services offered by the institution;
microcredentials, minimester classes, and credit for accomplishments in life; personalized
advising and coaching as new challenges and opportunities emerge; and blended learning
experiences with distributed worldwide availability. I believe a possible third approach is to
reinvent unemployment insurance as “employability insurance,” funding and delivering this
through mechanisms parallel to health insurance…
Much remains to be understood about how 60YC might become the future of higher education.
In my opinion, the biggest barrier we face in this process of reinventing our models for higher
education is unlearning. We have to let go of deeply held, emotionally valued identities in
service of transformational change to a different, more effective set of behaviors. I hope higher
education will increase its focus on the aspirational vision of 60YC as an important step towards
providing a pathway to a secure and satisfying future for our students.
Instead, they should be leading the way to optimize experiences and systems
of learning (which may or may not involve technology), and helping organi-
zations meet their missions through the growth and evolution of their training
and education programs. This will require learning engineers to work both
together with other experts and on their own to navigate client expectations,
integrate emerging capabilities, choreograph complex interactions, and help
learners achieve more efficient and effective results.
As the learning ecosystem becomes more complex, those who teach others,
whether they are facilitators, faculty members, or other professionals, may well
find it difficult to keep up with the changes. Instructional designers and learn-
ing engineers are specialists in education and training; they can help teachers,
trainers, and organizations transform teaching and learning environments for
the modern age, and they can also help fellow learning professionals expand
their own knowledge and skills in the use of best practices for education and
training.
CHAPTER 17
GOVERNANCE FOR
LEARNING ECOSYSTEMS
Thomas Giattino and Matthew Stafford, Ph.D.
Heraclitus of Ephesus noted, “Life is flux; the only thing that is constant is
change.” Learning professionals will certainly agree; their field has changed—
and continues to change so rapidly that it’s difficult to keep abreast of develop-
ments. The proliferation of content, the myriad delivery modalities, and even
the collective understanding of how the human mind actually learns have…
For the most part, too, education and training vendors have been less con-
cerned about governance and more concerned about sales. Governance is a
customer concern. So then, the question for customers—for those organiza-
tions who design and deliver learning—is: How do we create a governance
structure that centralizes general oversight of the ecosystem while simultaneously maintaining the necessary flexibility that allows for content ownership
by communities, data ownership by users, and tool creation by developers?
E PLURIBUS UNUM
(OUT OF MANY, ONE)
The move toward independence from England, which precipitated the arrival
of what was then the world’s most capable military force, drove a loose alli-
ance among colonies. At first, the colonies attempted to keep their indepen-
dent identities, with primarily decentralized control; however, this first gov-
ernance structure, the 1777 Articles of Confederation, proved a failure. The
Articles failed to create a sufficiently strong, centralized government capable
of guiding the fledgling nation. This resulted in infighting and made the cen-
tral government unable to overcome challenges or capitalize on opportunities
collectively.
Advocates of a stronger union responded, arguing that only a strong, centralized government would be able to quell the
bickering that had made the Articles-based government so ineffective.
Since humankind first saw the need to join together to satisfy common needs,
there has been some form of governance. Learning ecosystem governance is
no different. An effective governance structure is born out of a small group of
professionals who decide to combine their individual needs, capabilities, and
resources to provide better support for, and service to, their organizational
constituencies. These professionals come together to discover the breadth of
the organization’s stakeholders and the key issues to be addressed. They then
work across the organization to select representatives—the framers—who
discuss the issues, create an ecosystem charter, and manage its governance
over time. It’s a labor-intensive and emotional process, but when successful,
it’s an extraordinarily fulfilling undertaking.
IMPLEMENTATION
The process through which ecosystem administrators can design and implement a governance structure necessarily includes several steps, beginning with the selection of framers.
The framers should also include members who can think locally, addressing
individual requirements and concerns, while also thinking globally to under-
stand an enterprise perspective. It’s not always possible to find people who
can do both; so, organizers should try to find a balance among the selected
members to ensure all the constituencies are heard. The result shouldn’t be a patchwork of individual interests; rather, the collective perspectives should
inform an overarching strategy for addressing the broadest array of require-
ments and desires.
Once the constituencies are determined and framers selected, organizers will
need to consider the breadth of topics to discuss. The selected framers will
undoubtedly expand the discussion when they meet, but it’s necessary to have
“an entering argument”—a list of key questions to answer. These will vary
with each organization’s unique situation; however, the following brief list
might prove helpful in building a governance conference, as they are some-
what common to most organizations:
Members
• Who determines who joins the ecosystem?
• How will constituencies be represented?
• How will the governance structure be organized?

Policy
• Who is responsible for establishing centralized policy?
• Who will enforce policy?
• How can enterprise-level functions be supported?

Resources
• Who will provide the resources and how?
• Who will provide support and how?

Processes
• How can the ecosystem address change?
• How can the ecosystem remain relevant and responsive?
• How does the ecosystem interact with partner/other organizations?
• How will users experiment and adapt?
Every constituency should be represented, but the governance structure must ensure no single constituency takes control
of the ecosystem to the detriment of others. In addition to rules for expected
behavior, mechanisms are needed to censure misbehaving representatives or
shed inactive ones.
3. How will the governance structure be organized? There are multiple ap-
proaches; however, an approach needs to be selected, coordinated, approved,
and promulgated so all constituents understand where their representation
lies, where decision-making authority lies, and where they can go to request
reconsideration of their proposals should they be denied. The model adopted
by the Framers of the U.S. Constitution (the federalized approach) is worthy
of consideration: The centralized (“national”) government oversees enter-
prise-level concerns while subordinate organizations (“states”) have the capa-
bility to make certain changes to keep their operations moving.
POLICY
2. How will the ecosystem address change? Change is difficult. Framers will
need to consider a variety of potential scenarios to devise a system responsive
to change. The following scenarios present examples framers might consider:
RESOURCES
1. Who will provide support and how? Support is a complex topic and one
often overlooked in the rush to bring aboard new capabilities. Systems tend
to come with “a maintenance tail” to keep them functioning effectively and
current to industry and security standards. More importantly, users—be they
teachers, learners, data analysts, or records keepers—need support too. The
governance framers, in their desire to balance enterprise-level and individu-
al constituency-level concerns, may opt for the federalized approach, where
some level of support is provided locally and other support nodes are cen-
tralized for the entire ecosystem. Support is often a major hurdle for fram-
ers as new ecosystems come on line: Users will want to retain their existing
support capabilities while ecosystem administrators tend to favor centralized
approaches. This is a critical resource consideration.
2. Who will provide resources and how? This question should drive framers
to discuss the sources, types, and quantities of resources required, and who
can provide them. It’s a broad category, encompassing money, manpower,
machines, infrastructure (facilities, electricity, internet capability, etc.), and
much more. Some resource considerations follow:
For example, organizations that combine systems under one governance structure will have to find efficiencies and ensure that systems work together.
Recognizing potential duplications and overlaps, and dealing with these
fairly, is an important part of technological governance. Those constitu-
encies forced to adapt must receive sufficient assistance to ensure their
operations are not adversely affected.
PROCESSES
1. How will users experiment and adapt? The educational technology market
is changing constantly. Users will want to explore new capabilities to meet
their organizational needs. Restraining creativity will frustrate users and
drive them out of a centralized-governance approach. The best way to counter
this is to provide space for experimentation—an “innovation sandbox.” This
approach supports the insatiable appetite of some users for tinkering; howev-
er, it also drives these users to follow system protocols that govern the entire
ecosystem. This approach benefits all: The ecosystem is not corrupted by ex-
perimentation, and those experiments that prove worthy of pursuing have al-
ready demonstrated an ability to function within the ecosystem successfully.
An additional benefit is the way this approach aids in “policing.” The innova-
tors who leverage the sandbox are much less likely to try to sneak capabilities
onto the ecosystem (with potentially dire consequences for the enterprise) if
they have an approved place and method for experimentation as well as a way
to advance their successful innovations to the central governance structure for
adoption into the ecosystem.
leaders want the same for employees who train in their programs.
Ecosystem administrators will have to craft these reciprocal
agreements and develop the data-transfer capabilities to make these
arrangements successful.
• A local community college would like to partner with the
organizational training unit to offer associate degrees. The college
wants access to the employees’ training records as well as the ability
to report courseware completions back to the ecosystem. Senior
leadership agrees; they want this too.
4. How can the ecosystem remain relevant and responsive? There must be
mechanisms in place to ensure stakeholders have awareness of what’s trans-
piring within the ecosystem and have a voice in its evolution. There also has
to be some level of senior-leader oversight to adjudicate disagreements that
arise between stakeholders and ecosystem administrators. Lastly, like the
U.S. Constitution, there must be a mechanism for updating the governance
structures and policies. How will the organization drive change in the ecosys-
tem to ensure it continues to meet the needs of the future?
Once all issues have been debated and preliminary decisions made, the fram-
ers should produce a charter. This, in effect, is the ecosystem’s constitution,
describing the manner in which it will function and prescribing the processes
by which it will remain responsive to organizational and user needs. A pub-
lished charter ensures common understanding of authorities, decision-making,
and resource-allocation processes, and it outlines steps stakeholders can take
to resolve disagreements or seek change. The charter should be coordinated
through stakeholders’ organizations and concerns adjudicated by the framers
before final, senior-level approval and implementation. Once approved, eco-
system administrators must adhere to the charter precisely. Doing so ensures the charter retains its power and the stakeholders’ trust.
Returning to the Heraclitus quote that began this chapter, “Life is flux; the
only thing that is constant is change.” Ecosystem administrators will face
change. Charters are created for specific needs at specific moments in time.
Those needs can change. The U.S. Constitution, for instance, was ratified
in 1788. In the course of its existence, 33 amendments have been proposed
by Congress and sent to the states for ratification. Of these, only 27 have
been ratified and have become part of the Constitution. Arguably, each of
these proposed amendments represented a disagreement between contempo-
rary Americans and the Framers that had to be addressed and
resolved. Through the ratification process, the nation keeps its governance
aligned with the nation’s evolving needs. Ecosystem charters need to be simi-
larly responsive. Change should be possible, but the change process should be
sufficiently difficult, so the charter isn’t in constant flux. Should that happen,
the charter will lose its power and meaning. All stakeholders should have a
voice in changes to the charter, so they can weigh the advantages and disad-
vantages and respond appropriately.
Much in the same way that organizers cast a wide net in establishing their
conventions, ecosystem administrators should ensure all stakeholder com-
munities remain aware of and involved in the evolution of the system. This
requires managers to identify and continually refine the requirements of the
supported population. “Responsiveness” is the watchword, requiring manag-
ers to receive and respond to needs quickly and accurately.
Although the governance structure has already been addressed, with the need
for each stakeholder to have a voice in system administration, ecosystem ad-
ministrators must ensure there’s transparency in this process.
School districts are a traditional, sole-service delivery model, and districts have
exclusive rights over the learning needs of the students assigned based on residence.
They have to be all things, to all kids, all of the time. This is impossible, and it’s arbitrary
because it’s based on just where you live. If a kid wants something but the school
doesn’t have it, we assume the kid is wrong but the system is right. For example, if a
kid in a rural community loves art but the district doesn’t offer much art, we ask the kid
to put the passion on hold and instead get excited about history or some other offering
the district is good at. We say that the district is right and the student is wrong—in a
deep profound way. But the kid and family are right and the system needs to adjust
and adapt to provide those pathway options. Of course, districts can’t do it alone; they
have to form partnerships.
“Frequently asked questions,” chat rooms for stakeholder feedback, meeting minutes for
governance meetings, and regular, open communications between adminis-
trators and stakeholders are critical to building trust across the organization.
Administrators should also alert users to current and upcoming problems,
maintenance schedules, and actions taken to resolve problems. Too much in-
formation is better than too little in building trust. Resources are limited, and
administrators invariably have to deny stakeholders’ requests. Trust and trans-
parency aid considerably in how a negative response is received and perceived.
SUMMARY
This chapter described the evolution large organizations must make as they
move from functionally isolated information-technology schemes toward
enterprise solutions. The example of the American colonies’ transition from
relatively independent polities, to loosely affiliated states, and later to inter-
dependent states governed by a constitutionally ordained centralized gov-
ernment, provided a foundational metaphor to help readers orient to this
evolutionary process. In our case, today’s functional, formerly independent
learning institutions will need to come together. Although they’ll still require
some level of autonomy to address local events and requirements, events with
a broader impact must be handled at the enterprise level, across the learning
ecosystem, to capitalize on opportunities, reduce costs, and avoid unintended
consequences that can occur within this complex system of systems.
The most advanced learning ecosystem efforts will ensure all system com-
ponents work together, that learning is captured and reported across organi-
zational and temporal boundaries, and that the entire construct is learner-fo-
cused, giving users control over their learning and, to the extent possible, their
learning environment. Yet, for such an overarching system to be successful,
it must start locally, with well-developed processes and mature governance
methods within individual enterprises. Over time, then, we can extend those
approaches out, building the complex, lifelong learning ecosystem across our
societies—albeit, one step at a time.
Everything people want is obtainable, but there’s a great deal
of myopic thinking, especially within the government. We hear
too often, “We’ve never done it that way, so why should we change
now?” The focus is too often less about the mission and more about
the change. Having a good change agent is key. We need the executive
branch pushing from the back and Congress pulling from the front. It
needs to be a comprehensive system to be effective.
Reese Madsen
Senior Advisor for Talent Development, U.S. Office of Personnel Management;
Chief Learning Officer, Office of the Secretary of Defense (Intelligence and Security)
CHAPTER 18
CULTURE CHANGE
Scott Erb and Rizwan Shah
FEAR OF CHANGE
Undertaking significant organizational changes can create feelings of uncer-
tainty, anxiety, and being threatened.2 When the area of change is something
as fundamental as learning, such fears can multiply.3 If not adequately ad-
dressed, these feelings can manifest as either passive or active opposition,
resulting in immediate failures and resistance to future attempts.4
There are various human factors that make change difficult,5 some of which
are particularly relevant for the future learning ecosystem. For instance, con-
sider the fear of automation. The potential for AI to replace workers in the
economy, including teachers, doctors, and lawyers, has been widely publi-
cized in the popular press over the last few years.6 This has had the effect of
amplifying the natural fear of one’s skills becoming obsolete in a changing
economy.
Another related example involves the fear of losing control. The impo-
sition of change may make individuals feel their self-determination is being
undermined, particularly when that change involves increased automation,
complexity, and difficult-to-understand data analytics. Individuals might feel
uncertain about their roles, the direction of the organization, or their abilities
to contribute and maintain relevance.7 Team members who were instrumental
in creating the current way of doing business may worry about the perception
that the need for change means their way had failed. Similarly, those who help
administer the current system (such as the present-day teachers, trainers, and
instructional designers) may wonder whether they’ll be able to translate their
current skills into the new environment—will they still be competent and be
viewed as competent by others?
Add to these underlying insecurities the fears of increased scrutiny. Data ana-
lytics, increasingly important in all aspects of learning and absolutely critical
to measuring the effectiveness of changes in a learning environment, may
cause concerns that instructors or program managers will be held adversely
accountable if the data don’t demonstrate near-perfect performance. Learners
may feel exposed and uncomfortable, as well, as the data we can collect and
analyze gets richer and more actively informs a growing range of actions—
not just within a given learning episode but potentially affecting jobs, careers,
and lives.
Another reason some resist change is that it looks like additional work.
In general, production must often continue in the existing system while new
systems are established;8 this is certainly the case for the future learning eco-
system. Add to that the new processes and requirements of the future system,
the looming prospect of ongoing lifelong learning, and the complexity of it all.
It seems like a daunting task.
Those with a long history within an organization, who may have seen several generations of
leadership, may become passive resistors—intent on waiting out the latest fad
while continuing to execute the status-quo processes they’re responsible for.
CHANGE MODELS
There are several change management models that are useful across a variety
of settings, and which can inform options for creating acceptance to advance
learning (see adjacent figure).9 The models vary in complexity. Some have
only a few steps, but these fail to target all the necessary areas; others have
more detail but risk draining resources and time. As such, no pre-packaged
model is sufficient. Rather, a composite of these models, combined with les-
sons learned from working within government, military, and current educa-
tion structures, must be utilized.
[Figure: change management models; legible labels include “staff,” the “4 soft elements,” and “transition.”]
Similarly, it’s important to incorporate a vision for the future, one that stimu-
lates unity of effort and inspires individuals to take initiative to move forward.
A compelling vision of what the organization looks like in the future helps
generate the buy-in and initiative needed to implement change. Sinek stresses
the importance of communicating why change is needed and reinforcing the
message frequently. Communicating the vision once and expecting it to take
hold throughout the organization is a recipe for failure.
Finally, helping the entire organization (not just the leadership!) contribute to
this vision creates ownership, builds a common compelling story, and inspires
initiative. It’s also likely to generate ideas that leadership didn’t consider and
to reveal easy early wins to help build momentum. Open-ended questions
can help drive creativity here, for instance: What does the new normal look
like, feel like, and sound like? How do our students or employees say that
they imagine the future? What feedback might instructors give to leadership,
if the new system is working? What feedback would indicate that an experi-
ment isn’t working? What new problems does success create? Are we ready
to recognize and take on the new challenges? What are the characteristics of success?
It’s also useful to identify key influencers at this phase in the change manage-
ment process; they can help carry messaging throughout an organization and
across different stakeholder groups. The influencers may not necessarily be
the most senior people (those with the formal authority); rather, they should
be those with the social leadership to influence the rest of the organization.
Once adequate levels of initial awareness and buy-in have been achieved, the
organization can begin experimenting with process or technology changes.
Innovative organizations rarely fail from lack of vision. Often, ideas are plen-
tiful, while implementation is lackluster. Innovating, especially within large,
established, and successful bureaucratic organizations depends not only on
having a sound vision but also on the ability to manage the organizational
disruption that change entails. However, it’s not the leader’s responsibility to
design and manage the implementation plan; rather, it’s critical that the entire
organization participate. The leader’s job then becomes to do only a few dif-
ficult things: (1) inspiring the team to pursue the “why” by doing things that
generally move the organization in the right direction, at generally the right
speed, (2) ensuring the team has the resources to make progress, frequently
by removing resistance, and (3) creating safety for the team by putting the
innovation and culture change risk on her own shoulders.
The leader must resist, at all costs, the temptation to answer specific ques-
tions in any form of “just tell us what you want us to do.” Providing detailed
instructions for HOW to achieve the WHY is all but guaranteed to derail the
innovation and accompanying culture change efforts. The leader must give
ownership to each team member (at the appropriate levels) to decide what
to build and HOW to build it. There are a variety of ways for the leader to
communicate this transfer of ownership; perhaps the simplest is to ask the
questioner for his intent, followed by asking whether that intent enhances the
organization’s WHY.
The team should then craft a process for how they will implement the innova-
tion and change ideas of the team. While the team should craft the process
themselves to ensure that the right domain expertise is incorporated and to
provide ownership of the outcomes, some general principles should be fol-
lowed to address common sources of resistance and their underlying causes.
Notably, the design of the system itself also matters. Too often, early proto-
types are designed for minimum functionality but lack corresponding reli-
ability, usability, and user experience considerations—which distracts from
the experiment and can turn stakeholders against the entire change process.
For instance, the user interface is important. If the new tool takes more than
cursory training to begin using, the experiment is not yet ready for the audi-
ence. A new technology tool should be easy to understand, easy to use, and
make the end-users feel like they’re more effective with it than without it, all
within a few minutes. A good rule might be “as easy to use as an iPad for
a 10-year-old.” Failure to fully appreciate this will strengthen fears about competence, skill obsolescence, or added work. Human-centered design and user interface programming are complex and time-consuming, but users are so accustomed to well-designed technology that neglecting design early on may have severe consequences.
The author of the book, The Checklist Manifesto, Atul Gawande, notes
that he’s never seen the “Big Bang” approach to change succeed.11 That
is, dictating a change from the top of the leadership structure to happen
at a specific place and time hasn’t been seen to work. Clearly an approach
that respects the intent of leadership while also preserving ownership of
outcomes and processes, and inspiring innovation at the point of contact
between provider and customer (in the old model, between student
and teacher), is needed. This approach should be common enough to be
replicable, but flexible enough to be rapidly tailored to specific cases, and
to grow as an organization’s experience grows. Furthermore, it should
be maintained deliberately to ensure that lessons learned in the change
process are collected, understood, and disseminated. If critics see mistakes
repeated, they’ll become more effective critics! We suggest creating a guide
for introducing new projects within the organization. This guide should be
owned by the innovation leader (who may also be the organizational leader
or a senior member who reports to the leader), and used and updated by the
project managers.
Another unique area of concern involves the use of learning data. Clarify
upfront what data will be collected and how it will be used. Understanding
learning outcomes and modernizing learning will require handling big data
and advanced analytics. In learning environments, it’s tempting to focus most
of our attention on students. Teachers, staff, and program managers will also
want to understand that they and their data are safe.
Astro Teller, director of Alphabet’s moonshot factory, Google [x], has a meth-
od for doing so that may serve as a best practice for innovators in the learning
domain. Teller explains that giving lip service to the idea of “failing fast” isn’t
enough. Employees need to be free of the fear of punishment—and in fact
truly believe they’ll be rewarded—for failing fast, that is, for learning and for
rapidly sifting through possible avenues for innovation and change. As Teller
recently explained in a podcast: 12
When one of our projects that actually has, like, a nontrivial number of
people, at least a few people full time on it, ends their project…we bring
them up on stage, and we say, “This team is ending their project today;
they have done more in ending their project in this quarter than any of
you did to further innovation at [x] in the quarter.” …then I say, “And
we’re giving them bonuses…You know what guys? Take a vacation, and
when you come back the world’s your oyster. You’ll find some new proj-
ect to start or you can pick which project to jump into, depending on
which one’s going best.…The word failure, and trying to get people to
fail is a bit of a misnomer.…Failure when it’s actually just “you got a
negative result for no reason and it’s meaningless” is a bad thing. I’m not
pro-failure; I’m pro-learning.
IMPLEMENTATION PLAN
Connecting the general theories, methods, and models of change manage-
ment, we recommend a hybrid approach that capitalizes on the key points of
each. Six areas of focus are recommended when starting the culture change
process for modernizing learning systems.
Educate
The first step towards preparing an organization to embrace the future learn-
ing ecosystem concept involves communication and foundational (re)educa-
tion. Resetting the WHY of the organization is critical to not having to re-
peat the culture change process with ever-increasing frequency. The future
learning ecosystem idea isn’t a defined end-state but, rather, a commitment
to ever-evolving, learner-focused support via interoperable technology and
other emerging capabilities. Our goal, therefore, is to foster an organizational
culture that embraces change as a way of life rather than an organization that
has successfully navigated from one static state to another.
There’s nearly always a fear of change. The goal is to reduce this fear by
increasing education about the change. Extra time needs to be spent helping
people understand what they need to achieve and, of course, why. It’s not just
about garnering their buy-in, it’s about reducing their fear. Step one, then, is
to ensure everyone is educated on the goals for the future learning ecosystem.
For example, explaining the value of interoperability at the technological level
and envisioning the new methods learners and teachers will use to operate in
a human-computer shared space will be important. However, the next step is
to listen: To carefully consider stakeholders’ fears and give them an oppor-
tunity to work through their concerns, contribute to the larger vision, and
become ambassadors for the idea in their own ways.
Support
Everyone needs to know where and how to get support, not just philosophi-
cally, but also from a management perspective. Within the U.S. Government,
the Office of Personnel Management’s USALearning program provides an
immediate go-to for the development of this system, and the ADL Initiative
offers support for research associated with new aspects of it. Within higher
education and K-12, other support systems are being developed; for example,
the Lumina Foundation and U.S. Chamber of Commerce are working together
to support employers and employees making this transition. Further, other
standards organizations and professional societies, such as the IEEE, can also
offer guidance and recommendations to government, academic, and industry
constituents.
Support also has to come from leadership, and more than that, it requires a demonstration of “skin in the game” via the
allocation of resources.
Buy-In
What’s the return on investment? That’s the question at the highest level,
but at personal levels, individuals need motivation and will ask themselves,
what’s in it for me? (or WIIFM, usually pronounced as “wiff-um,” a com-
mon acronym in the military). So, both quantitative, logical messages and more
personal, evocative messages need to be crafted. That is, we need to consider
both the ROI and WIIFM for teachers, instructors, managers, leaders, senior
leaders, and learners, as well as for businesses, schools, universities, and gov-
ernment agencies. They need to understand why these changes need to occur
and the pathway through the transition. They need to understand why it will
help them personally and how it will be implemented and/or integrated with
existing systems.
Making the transition feel easy is one of the most important challenges to tackle and among the most important to get right. This book is meant to serve as
an initial step in that process. It’s intended to help paint a picture of the “art
of the possible” and take the first steps towards clarifying why these changes
will improve the system; however, the specific buy-in rationale will be unique
to each organization and stakeholder group.
Multi-Messaging
It’s one thing to make changes within a small system or even within a depart-
ment where like-minded or similarly oriented individuals reside. However,
once change is nationwide and includes systems of systems as well as multiple
communities, it requires multiple but complementary messages to be culti-
vated and disseminated. In this case, it’s necessary to achieve two primary
goals: (1) ensure that the messages to the individual communities (e.g., K-12, higher education, employers, military, and government) are in line with their singular goals and (2) that there’s a meaningful message that transcends and unites these communities. In particular, we need to be clear that the benefit to both human development as well as to our national development lies in the coordination across these communities, that is, in collectively optimizing learning and development. The future learning ecosystem requires that we have a shared, single goal but with an unlimited set of pathways for attaining it.

[Figure: “Organizational Innovation Use-Case,” from Kendy Vierling, Ph.D., Director, Future Learning Group, USMC TECOM. A continuous cycle: new observations and exploration; ask questions; assess, evaluate, test, or experiment to gather data; analyze data and form conclusions; communicate results and recommendations; inform organizational learning methods, policies, procedures, systems, and processes; distribute new knowledge, insights, and practical applications to stakeholders; and integrate and implement new learning insights where relevant at the individual, group, and organization levels.]

The U.S. Marine Corps Training and Education Command (TECOM) Future Learning Group demonstrates how an organization can implement evidence-informed organizational learning processes to support innovation. Established in 2017, the TECOM Future Learning Group is a special staff unit that advises the Commanding General of TECOM. Its mission is to seek and assess innovative methods and technologies to improve Marine Corps training and education. The figure above shows their process.

Beginning with “new observations and exploration,” the group contributes to organizational learning by identifying current and future Marine Corps learning needs, competencies, gaps, and goals—and how they relate to the individual, group, training and education units, and the overall Marine Corps. Next, the group scans the horizon for emerging science and technology, such as augmented and virtual reality–based training simulations, adaptive mobile learning applications, and new methodologies for enhancing instructor development. They ask questions to explore the prototypes, test new methods and technologies, gather data and analyze it to form conclusions, and ultimately provide recommendations to TECOM leadership. These results and recommendations go on to inform organizational learning methods, policies, procedures, systems, and processes.

The TECOM Future Learning Group also shares the knowledge and practical applications they uncover with stakeholders both within and (as appropriate) beyond their Command. Findings are also integrated into current and future Marine Corps programs at the individual, group, and organization levels, and the results feed back into their organizational learning process, driving the ongoing improvement cycle to enhance Marine Corps learning. The TECOM Future Learning Group’s work helps overcome the research–practice gap and more rapidly integrate new capabilities into Marine Corps programs. It also facilitates organizational culture change, encouraging more innovation in Marine Corps training and education—helping the Service move from an Industrial Age model of learning to an Information Age paradigm.
Individuals in compliance and policy roles need motivation for accepting the
future learning ecosystem concept. The stated goal for compliance and pol-
icy is often to ensure no problems occur—that is, to mitigate risk. This is
especially true in the context of information technology and associated cyber-
security and data handling. But to evolve and optimize, risk must be taken.
Consequently, we need to work with compliance and policy stakeholders to
find the acceptable amount of risk. Who decides that? Who’s responsible if
a breach occurs? These individuals have experience and knowledge, but are
often engaged later in a change process, which creates obstacles to obtaining
their buy-in or integrating their ideas into the fledgling system. We need them
to give their direct input, be part of the conversations for planning, and help
us move smartly towards this new vision of learning.
Implement
Average projects (those not tied to cultural change) usually involve linear
planning and straightforward management, with efficiency among their
performance goals. However, in an innovation context, where culture change
is a necessary criterion, different metrics need to apply. There’s a temptation
to revert to traditional managerial methods, to emphasize speed, to reward
only successful trials, and to backslide into comfortable processes. That will
spell disaster for the future learning ecosystem—it cannot function without
the genuine buy-in of stakeholders or the radical change of participating
organizations.
Each organization will need its own experiments, incentives, and implemen-
tation plans, and these must be devised through collective participation. Sim-
ilarly, the larger community—possibly at a nation-wide level—needs to co-
ordinate. This may require extensive cross-cutting communities of practice
and will certainly mean negotiation of experiments and incentives across do-
mains. Just how this implementation plan is designed and what it will contain
isn’t yet clear; however, it’s apparent that it must serve multiple levels—for the
individual stakeholders, their local organizations, and the collective multi-or-
ganizational community. And it’s also clear that each organization will need
to devise its own messages, measures of commitment, and ways of contribut-
ing to the larger vision. We are just beginning down this pathway. We have the
opportunity to do so “the right way,” in concert and with thoughtful coordi-
nation; it’s important that we resist the urge to speed ahead with shortsighted
implementation plans that sacrifice longevity for temporary achievements. “If you want to go fast, go alone; but if you want to go far, go together.” 13
Summary
It’s easy to avoid change, to play the cynic, wait out new ideas until the orga-
nization returns to the status quo, or find excuses to avoid uncomfortable ac-
tions (e.g., remaining in the “analysis paralysis” process). Individuals and bu-
reaucratic organizations, in particular, are often remarkably clever at finding
ways to avoid change. It’s also tempting to view the future learning ecosystem
as simply another technology—as a thing that can be installed and activated,
and then fueled with educational materials that instructional designers mer-
rily create using more-or-less conventional methods. But this won’t suffice. If
effective, the future learning ecosystem concept will extensively affect how
we each live, work, and learn. It will affect organizational dynamics, societal
systems, and maybe even the overall zeitgeist of our time. Such impacts can’t
be achieved through technology alone. They require coordination, a shared
vision, and commitment to it. They require a culture change.
…you’ve got to be opportunistic in fixing problems so that
you’re not just fixing one but rather fixing multiples. At the
same time, you have to try to build and control the narrative; use it as
a barometer and take some of the danger out of change. You realize
you’ve reached where you need to when people start to give the story
back to you. It helps to know that you’ve got a narrative from the
beginning that does something that will keep people engaged but will
also allow you to implement it later.
CHAPTER 19
STRATEGIC PLANNING
William Peratino, Ph.D., Mitchell Bonnett, Ph.D.,
Dale Carpenter, Yasir Saleem, and Van Brewer, Ph.D.
In this chapter, we explore some of the most immediate steps required to re-
alize the future learning ecosystem across educational, academic, business,
government, and military sectors. We discuss the larger system, including
people, processes, and technologies, and recommend considerations related to
its design, development, and implementation.
Currently, in the U.S., most children begin formal learning in the conven-
tional education system. Primary and secondary programs follow a fairly lin-
ear, time-based model that creates a conservative, general trajectory where
children progress through academic milestones more or less as an age-based
cohort. Students are largely taught as groups in classrooms and provided with
similar lessons and homework. Usually, these curricula focus on key areas
of knowledge acquisition to include mathematics, reading and writing, sci-
ence, and history, often with a few additional areas included such as art, mu-
sic, physical education, and health. Frequently, development of self-regulated
learning capacities as well as social, emotional, and physical competencies
aren’t formally included, although some students may encounter outstanding
teachers or participate in extracurricular activities that foster these abilities.
Alternatives to this conventional pathway are expanding: in some districts, school choice programs offer more diverse options such
as magnet, charter, virtual, home, and private schools. Increasingly, students
can even opt for fully online high schools, including relatively low-cost na-
tional and international programs.1 Enterprising students, as well as their
teachers and mentors, also have access to an increasing wealth of education-
al resources, which they’re exposed to at younger and younger ages, from
sources including the National Academies, Khan Academy, TED, and various
MOOCs, as well as associated resource repositories such as MERLOT, OER
Commons, and Connexions. There’s also an unprecedented wealth of informal (and sometimes questionable) online resources, from YouTube, Wikipedia, and Reddit to countless other blogs, websites, and apps.
Once students graduate from high school, they can enter the public or pri-
vate-sector workforce, seek additional vocational training, or matriculate to
higher-education institutions. Postsecondary education traditionally involves
two- and four-year degree options as well as trade and certificate programs.
Colleges and universities also frequently offer advanced degrees in the form
of graduate certificates, master’s degrees, and doctorates. While many schools
still follow traditional methods, the higher-education sector is rapidly evolv-
ing with various new choices including competency-based degrees, fully on-
line options, and hybrid programs.
It’ll change the resumé, too, putting less emphasis on the jobs someone has
held or the degrees earned, and more on his or her demonstrated capabilities.
After individuals enter the workforce, their learning journeys continue. They
can seek vocational training and additional credentials, attend workshops and
seminars, or pursue any number of informal and self-directed learning oppor-
tunities. Some companies also offer continuing education or professional de-
velopment programs for their employees. In the U.S. alone, businesses spend
roughly $90 billion annually on corporate training (as of 2018).2 These offer-
ings range in formality. On the more formal side, there are programs such as
McDonald’s Hamburger University, the “Harvard of the fast food industry,” 3
which trains more than 7,500 students a year,4 and Starbucks helps its em-
ployees earn first-time bachelor’s degrees online through their partnership
with Arizona State University.5 Less formal programs come in many shapes
and sizes, including corporate coaching and mentorship, developmental sem-
inars, official and informal feedback, corporate e-learning and webinars, and
numerous informal learning approaches. There are abundant resources avail-
able, and individuals and organizations have a whole slew of learning and
development opportunities to choose from.
Like the private sector, the public sector and military workforce face similar
opportunities and challenges. In general, the same informal learning opportu-
nities exist for these special populations. Agencies throughout the U.S. Gov-
ernment offer wide-ranging learning and development programs, covering
the full gamut of formality. For example, the Office of Personnel Management
hosts the Federal Executive Institute that provides training in strategic devel-
opment for senior executives. The National Park Service provides access to a
wide range of personal learning opportunities through its internal Common
Learning Portal, and the Department of State uses its Virtual Student Feder-
al Service program to provide on-the-job experiential learning opportunities
to students around the country. But the U.S. Department of Defense is most
notable among these agencies. It’s been considered the “greatest training or-
ganization of all time” 8 and invests more funds in innovating education and
training for its workforce than any organization in history, with the bulk of
these efforts focused on programs for its military personnel.
The DoD conducts formal individual, collective, and staff programs, and it
actively encourages mentorship, peer-to-peer learning, and self-development.
It employs the spectrum of learning modalities including in-resident and com-
puter-aided instruction, simulation-based and embedded training, m-learning,
augmented and virtual reality, and hands-on experiential learning. The DoD
also has strict education and training requirements tied to assignment and
promotion, and particularly for key accession points, it employs several stan-
dardized tests, such as the Armed Services Vocational Aptitude Battery and the
Tailored Adaptive Personality Assessment System.
Unlike the private sector, service members generally have fairly constrained
entry- and exit-points into the military workforce, and almost always, individ-
uals separate from active duty military service before they fully retire from all
work. Once service members separate from the military, they may return to
Strategic Planning | 361
BUILDING TOMORROW’S
LEARNING JOURNEY
Never before have so many high-quality opportunities for learning existed.
Yet, tomorrow’s learning environment will be even more advanced as infor-
mation and communication technologies, automation, and innovation contin-
ue to change how we interact, behave, and learn. We have great momentum,
but how do we optimize this future system? Towards that end, we’ve inte-
grated a set of 10 near-term strategic recommendations for the wider future
learning ecosystem—drawn from across this book.
Public and private school enrollments in the U.S. have steadily risen over
the preceding decades.10 The education, training, and talent development in-
dustries are similarly expanding along with corresponding increases in both
not-for-profit and open-access resources. However, many of these expansions
are happening in isolation. For example, learner records are typically housed
in stovepiped data silos. Someone might spend 13 years in school as a child
and then graduate with a high school diploma and a transcript with letter
grades. Any additional specialties, sub-competencies, extracurricular activi-
ties, or other insights are usually absent from this documentation. The same is
true of the university or vocational school outputs, and, typically, for previous
work experiences, which may be documented (say, on a resumé) but are rarely
assimilated as meaningful data. A similar story happens throughout service members’ careers.
The future learning ecosystem will enable an environment where the different
tools, technologies, and systems a person encounters can communicate data
about his or her performance and the contributions of different activities to it.
Key to this vision, the various systems will need to interoperate, collect and
share meaningful data, and use that compiled information to promote tailored
instruction. In other words, we’ll need greater interoperability across learn-
ing systems and, correspondingly, greater portability of learning-related data.
Part of the change will also likely involve creating systems of learner-owned
and managed data that use metadata to ensure authenticity, respect learners’
privacy needs, and broker across different systems. This will require a unique
set of capabilities to accommodate
security, privacy, architecture, and content, and it will place demands on development, deployment, employment, and assessment of learning systems. This technological architecture for learning forms the essential backbone of the future learning ecosystem—connectivity across time and space makes the entire vision possible—hence why interoperability, data specifications, and learner-centric universal profiles top our recommendations list.

A universal learning profile will act as an external repository where individuals can hold their data and share it as desired to drive educational choices, personalization, employment eligibility, and personal growth.
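To make interoperable, portable learning records concrete, the following minimal sketch expresses two learning events from different systems in the shape of the ADL Initiative’s Experience API (xAPI) statements, one plausible shared format. The learner address, activity URLs, and the simple profile container are illustrative assumptions rather than a prescribed design.

```python
# Minimal, illustrative sketch (not a full xAPI implementation): two learning
# events from different systems expressed in one shared statement format, so a
# learner-owned profile can aggregate them. Identifiers/URLs are placeholders.
from datetime import datetime, timezone

def make_statement(actor_email, verb_id, verb_name, activity_id, activity_name,
                   scaled_score=None):
    """Build a small subset of an xAPI statement: actor, verb, object, result."""
    statement = {
        "actor": {"mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {"id": activity_id,
                   "definition": {"name": {"en-US": activity_name}}},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if scaled_score is not None:
        statement["result"] = {"score": {"scaled": scaled_score}}
    return statement

# A school system and a workplace LMS emit records in the same format...
classroom = make_statement(
    "learner@example.org",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://school.example.org/units/algebra-1", "Algebra I",
    scaled_score=0.92)
workplace = make_statement(
    "learner@example.org",
    "http://adlnet.gov/expapi/verbs/attended", "attended",
    "https://employer.example.org/seminars/data-ethics", "Data Ethics Seminar")

# ...so a learner-owned profile can hold both, wherever the learning occurred.
profile = {"learner": "mailto:learner@example.org",
           "records": [classroom, workplace]}
```

Because both records share one schema and globally unique identifiers, a learner-owned profile service could aggregate evidence from school, work, and informal learning without bespoke translation for each source.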
4 Improve assessment
• Limit high-stakes summative assessments, particularly in K–12
• Integrate more formative, portfolio-based, and experiential assessments
• Make assessment data and feedback visible to learners
It’s necessary to begin with the foundational years (kindergarten through 8th grade) and widen the curriculum focus to include social, emotional, metacog-
nitive, and physical development as part of formal education. Creating ob-
jectives for teachers in these areas provides the policy justification they need
to spend time in the classroom explicitly focused on developing the whole
student. Inclusion of these competencies, however, necessitates a shift to an
asset model of growth, which places more emphasis on what students can do
currently and what they need to learn next—as opposed to focusing on areas
where improvement is needed to meet norm-referenced or “typical develop-
ment” milestones. This shift from achievement-orientation to growth-orien-
tation can also improve motivation to learn and promote lifelong interest in
self-driven learning.
When you take away the performance aspect, people act differently.
In “practice” settings they’re freer to make mistakes without
someone reviewing and judging them. When they’re being assessed,
though, people bring to bear a different mindset and focus. If we
replace more traditional assessment with “stealth” assessment,
will we introduce a paradigm that’s counter to a growth mindset and to how learning happens best? If they have to be always “on,” that could be a really challenging dynamic for our learners.
Lifelong learners will need to continually seek out new information, determine its accuracy and relevance, and assimilate it
in a manner accessible and translatable to the real world. We will need to edu-
cate and empower people to distinguish accurate from falsified data, manage
data saturation and information overload, and cultivate persistent energy for
lifelong learning. However, individuals’ abilities to engage in effective infor-
mal learning vary. Hence, it’s important to foster individuals’ self-regulation
capabilities and facilitate their active engagement in self-directed learning,
for instance, by providing access to resources, making learning content more
easily “findable” (such as via metadata), or encouraging learning through per-
sonalized prompts.
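As a small illustration of metadata-driven “findability,” consider the following sketch; the catalog fields and competency tags are invented for the example, loosely echoing common learning-resource metadata schemes rather than any particular standard.

```python
# Minimal sketch of metadata-driven findability. The fields below (audience,
# format, competencies, duration) are invented for illustration.

catalog = [
    {"title": "Intro to Fractions", "audience": "K-12", "format": "video",
     "competencies": ["math.fractions"], "duration_minutes": 12},
    {"title": "Budgeting Basics", "audience": "adult", "format": "module",
     "competencies": ["finance.budgeting", "math.fractions"],
     "duration_minutes": 45},
]

def find_resources(catalog, competency, max_minutes=None):
    """Return resources tagged with a competency, optionally under a time cap."""
    hits = [r for r in catalog if competency in r["competencies"]]
    if max_minutes is not None:
        hits = [r for r in hits if r["duration_minutes"] <= max_minutes]
    return hits

# A personalized prompt might surface only short refreshers, for instance:
for resource in find_resources(catalog, "math.fractions", max_minutes=15):
    print(resource["title"])
```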
4. Improve assessment
Assessments, along with the learning data they generate and evaluations they
enable, play foundational roles in training and education. In the future, with
the increased emphasis on personalization and data-driven systems, assess-
ments will only grow in importance. However, the nature of the assessments
will change.
At the K–12 level, the number and use of standardized assessments currently
pose several challenges to learning. Today, in the U.S. system, students are
required to take numerous standardized tests; the results from these are used
to identify struggling students or, in aggregate, to uncover underperforming
school systems. In both cases, the assessments serve as accountability devic-
es. Once a child or school is shown wanting, more assessments are used to
focus and monitor their remediation. While this sounds logical, in practice,
time spent on such detailed work can be emotionally and cognitively taxing
as well as a drain on overall learning time. Emphasis on such high-stakes
summative testing has been shown to shift the focus away from true learning and instead to encourage superficial “teaching to the test”—tests that typically emphasize cognitive abilities to the exclusion of the “full-spectrum” competencies described above.12

The problem is, with all the assessments we have to bombard students with, we don’t have time to do the project-based work. That, in and of itself, is defeating. Teachers have the lessons, but they don’t have the time to develop them with the students because of all the testing.
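To illustrate, in its simplest form, what making assessment data and feedback visible to learners might look like, here is a minimal sketch; the competency labels, score scale, and feedback wording are illustrative assumptions, not a recommended design.

```python
# Minimal sketch: summarizing a series of low-stakes formative checks for the
# learner, instead of reducing them to one high-stakes score. Competency
# labels, the 0-1 score scale, and the wording are illustrative assumptions.
from statistics import mean

formative_checks = [
    {"competency": "writing.argumentation", "scaled_score": 0.55},
    {"competency": "writing.argumentation", "scaled_score": 0.70},
    {"competency": "writing.argumentation", "scaled_score": 0.82},
    {"competency": "collaboration", "scaled_score": 0.60},
]

def learner_feedback(checks, competency):
    """Describe growth on one competency in plain language for the learner."""
    scores = [c["scaled_score"] for c in checks if c["competency"] == competency]
    if len(scores) < 2:
        return f"{competency}: not enough evidence yet. Keep practicing."
    trend = "improving" if scores[-1] > scores[0] else "holding steady or dipping"
    return (f"{competency}: {len(scores)} checks, latest {scores[-1]:.0%}, "
            f"average {mean(scores):.0%}, trend: {trend}.")

print(learner_feedback(formative_checks, "writing.argumentation"))
```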
As learning contexts evolve, so too do the roles and requirements for learning
professionals within them, notably teachers, trainers, educational technolo-
gists, and instructional designers. The speed of progress in this sector means
they’ll need to learn continuously—keeping abreast of the latest research,
technologies, and regulations. Ongoing professional development to re-skill
and up-skill learning professionals, using formal and informal methods across
diverse media, will be critical.
The future learning ecosystem will likely be a highly technical and collab-
orative environment supporting both micro- and macro-level instructional
strategies, maybe even leveraging the “in-between” learning experiences and
events—between classes, courses, and life events—to adapt to learners’ inter-
ests, needs, prior knowledge, and resources. As we begin to look at learning
across the lifetime, leveraging big learning data and new learning strategies,
learning professionals will need new knowledge and skills. This has driven
efforts to define the concept of a learning engineer (see Chapter 16), to close
the gaps between technology and instructional design, and between isolated
instructional events and larger-scale learning systems. We’ll need new con-
ceptual models that define learning engineering, its professional practices,
certification and skills, professional development processes, and integration
into teams and organizations.
The future learning ecosystem vision views learning as an integral and on-
going aspect of life, woven throughout work and personal contexts. This has
unique implications for employers, who will no doubt leverage it as a learning
and performance ecosystem that “enhances individual and organizational ef-
fectiveness by connecting people and supporting them with a broad range of content, processes, and technologies to drive performance.”
I just did a survey on issues for teachers asking them what their biggest
issues in the classrooms are. The survey found four major issues.
First, academic freedom doesn’t exist anymore: “Learn as you live is gone.”
Some of the other big issues were about the evaluations. They pressure
teachers and administrators. In any other job, you’re evaluated on what’s
seen, but people don’t go into surgeries and second guess everything
the surgeon does. That doesn’t happen in a regular job; they don’t get an
evaluation that nitpicks every possible thing they do to ensure it fits into the
rules of what they’re told is important. The administrators, many of them
don’t like how strenuous and stressful the process is.
The third was stress in the classroom and stress in the working conditions.
If there isn’t strong contract language, then when there’s an issue it’s tough
to fix.
Sue Carson
President, Seminole Education Association (Florida)
In the future, we expect to see greater churn across roles, companies, and ca-
reers. As workers increasingly value flexibility, fluid work/life structures, and
personal experiences, we may also see more “gig economy” careers, where in-
dividuals or teams are available for project-based work or consulting services
but don’t work directly for a single company. Correspondingly, there may need
to be greater permeability across the workforce, encouraging people to move
into and out of formal learning, full-time jobs, and personal developmental experiences.
The success of the future learning ecosystem concept is, in large part, predi-
cated on culture change. A significant mindset shift will need to accompany
any advancement from the Industrial Age of learning towards the future learning ecosystem.
Closely related, we’ll need to embrace mastery learning and nonlinear, tailored
learning. While such concepts have been touted for decades, most systems—
whether formal schools or workplace development programs—still tend to
emphasize time factors and minimum achievement standards. To move ahead,
we’ll need to let go of the idea of “minimally acceptable” as an advancement
criterion. Similarly, we’ll need to allow more flexibility in systems, moving
away from massed education and training approaches, with predefined lin-
ear curricula, and instead towards more nonlinear, personalized trajectories.
Once realized, the technological architecture doesn’t just allow for improved
access to status-quo learning opportunities—it creates an entirely new capa-
bility. Metaphorically, consider the components of a car (the steering wheel,
tires, pistons, and so on); separately, they’re functional objects, but when con-
nected together, they can produce an entirely new capability—transportation.
Similarly, the future learning ecosystem, by the aggregate nature of the systems
that comprise it, will create unimaginable new capacities, more than merely
the sum of its parts or the incremental expansion of today’s learning paradigm.
We created the Common Learning Portal. It’s a web portal—a marketplace for
training. It opens April 2019. The government’s cybersecurity processes (FedRAMP)
kept us from opening the doors sooner; it’s been in pilot-project mode, but
operational, for two years now. It provides a comprehensive learning performance
ecosystem, a holistic view of learning. The system enables us to put information,
people, and other learning resources in places where people can find them, even on
a mobile device. We hope our personnel and volunteers who have been out in the
field for work can go back to their offices and do their training, which they have to
do at the beginning of every cycle. Already, we have over 500,000 page views and
4,000 registered users—without even formally launching. It’s caught on by word of
mouth. Some trainers got excited. We had support from leadership and people. It
was a grassroots effort.
So that’s where we are going tomorrow. We’re not throwing away formal learning,
but we’re trying to pull in performance support, microlearning, and things that allow
us to better do our jobs.
If stakeholders across different contexts can’t use it, it won’t achieve its
goals. At this most obvious level, system usability—across its various user
interfaces and user experiences—plays a major role in its success. We’ll need
to focus on UI/UX, making all aspects of the system as intuitive, modern,
and effective as possible, both to increase adoption and to ease its
customization to the unique requirements of the broad stakeholder community.
Issues of network connectivity and technical access are equally important,
and they extend beyond technology into social and societal considerations.
Access issues already limit educational opportunities for children in many
rural or underserved areas. As more of our learning becomes digitized and
networked, we must carefully ensure equitable access to it—not only for
ethical reasons but also to maximize the diverse capabilities of society and
to enable everyone to realize their unique potentials. Otherwise, we risk
widening the education gap, creating greater disparity in access to quality
education and training, and potentially creating a bifurcation between the
“haves” and the “have-nots.” In other words, we could inadvertently build a
divide between those with access only to open, unstructured junk information
and those with access to higher-fidelity, semi-automated methods of
transmitting quality knowledge within and across communities.
Holistic solutions will demand holistic governance, as well as new laws and
policies. These span broad areas, from technical frameworks and
interoperability standards, to content and data exchange processes, to
equity, ethics, and fairness of use. Below are a few considerations, though
fully outlining this discussion will require much more extensive treatment,
as well as interorganizational coordination.
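To make the data-exchange piece concrete, the sketch below shows the shape of one widely used interoperability standard, ADL’s Experience API (xAPI), in which a learning event is captured as an actor–verb–object “statement” and sent to a Learner Record Store (LRS). This is a minimal illustration only; the endpoint, credentials, learner, and course names are hypothetical, not a prescribed implementation.

    import json
    import urllib.request

    # A minimal xAPI statement: who (actor) did what (verb) to which
    # activity (object). All names and URLs below are illustrative.
    statement = {
        "actor": {"objectType": "Agent",
                  "name": "Example Learner",
                  "mbox": "mailto:learner@example.org"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                 "display": {"en-US": "completed"}},
        "object": {"id": "http://example.org/courses/strategic-planning-101",
                   "definition": {"name": {"en-US": "Strategic Planning 101"}}},
    }

    # Post the statement to a hypothetical Learner Record Store endpoint.
    request = urllib.request.Request(
        "https://lrs.example.org/xAPI/statements",
        data=json.dumps(statement).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "X-Experience-API-Version": "1.0.3",
                 "Authorization": "Basic aHlwb3RoZXRpY2FsOmNyZWRz"},  # placeholder credentials
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(response.status)  # 200, with the stored statement ID(s) in the body

Because every system emits the same statement shape, records from classrooms, workplaces, and simulators can flow into a shared learner profile, which is precisely the kind of cross-organizational exchange these governance questions must address.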
Starting with K–12 education, new policies and processes are needed in sev-
eral areas. For example, moving to competency-based learning methods will
be key for fostering a learner-centered, development-oriented system. In the
U.S., the Common Core Standards, in theory, allow for a similar kind of co-
ordination. In practice, however, these standards have become inflexible re-
quirements added to already overloaded schedules. Transitioning to a compe-
tency-based model would better allow teachers to personalize learning and let
students earn credit for knowledge acquired outside the classroom. Teachers
will need policies that support the academic freedom this model requires, so
they can adjust content and methods to meet each student’s unique
developmental needs. Additionally, as indicated above, competency goals will
need to be augmented to incorporate social, emotional, metacognitive, and
physical elements for “whole person” development. Further, the Every Student
Succeeds Act ties funding to schools’ use of evidence-based practices, yet
teachers and administrators rarely receive formal training in research design
and statistics. Expanding existing governmental programs that help close this
research-practice gap (e.g., the Education Innovation Programs at the U.S.
Department of Education) will help optimize learning and can also support the
up-skilling and re-skilling of learning professionals (as described in
Recommendation 5, above).
The big thing here, the thing everyone’s dealing with, is this challenge:
How do you transition into a performance organization? How do you support
an organization trying to become performance-based, and what are the
other things that wrap around that structure? For example, talent
management systems are important, but the way HR people manage now is
mainly through “box-checking,” like the things you’re required to do for
mandatory training. We’ll have to rework so many things: HR, the
compliance stuff, and assignment decisions… Conceptually, though, it’s
always been the same thing: How do you choose the right people?
Michael Freeman
Consultant, Training and Learning Technologies
usage rights. The diversity of learning venues precludes any unitary solution,
but certain characteristics will be common, such as privacy, continual assess-
ment, and security.
to be reconsidered for the future context and designed in a way that balances
privacy with functionality.
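As one illustrative pattern for striking that balance (a sketch under assumptions, not a mandated approach), learner records can be pseudonymized before they leave a local system: a keyed hash preserves the ability to follow the same learner over time for continual assessment, while only the holder of the key can relink tokens to people. The key, identifiers, and record fields below are hypothetical.

    import hashlib
    import hmac

    # Hypothetical, locally held secret; in practice it must be
    # protected and rotated like any other credential.
    SECRET_KEY = b"local-secret-do-not-share"

    def pseudonymize(learner_id: str) -> str:
        """Return a stable keyed hash (HMAC-SHA256) standing in for the real ID."""
        return hmac.new(SECRET_KEY, learner_id.encode("utf-8"),
                        hashlib.sha256).hexdigest()

    # The same learner always maps to the same token, so longitudinal
    # analytics still work; the raw identity never leaves the local system.
    record = {"learner": pseudonymize("jane.doe@example.org"),
              "competency": "radio-communication",
              "score": 0.92}
    print(record["learner"][:16])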
ments (as mentioned in Recommendation 4), who will validate them, update
them, and accredit their use across learning and workforce systems? Further,
how will schools that provide degrees based on mixed methods for compe-
tency attainment be formally accredited or ranked? To continue our example,
laws involving medical insurance, malpractice determinations, and formal li-
censure might be impacted.
CONCLUSION
In this chapter, we’ve offered several recommendations for the advancement
of learning. Throughout this process, we’ve assumed that technologies, nota-
bly automation and data analytics, will continue to advance. In other words,
we felt it safe to assume that such capabilities are (or would be) technological-
ly feasible. The challenge lies not in developing the technologies but in
validating them, integrating them effectively into learning systems, and
accounting for the corresponding social, organizational, and societal changes
they’ll produce.
It’s neither feasible nor, frankly, advisable to plan out every piece of this
future learning ecosystem; the rapid pace of change and its complexity neces-
sarily require its design to remain dynamic, flexible, and collaborative.
However, we’ve attempted to apply systems-thinking approaches to the planning
process.
The immediate and enduring relevance of this discussion is clear: the basic
research we’re conducting now will provide the knowledge to reframe our
future paradigms, bounding the unknowable to both enable and constrain future
choices. We recognize that whatever choices we make will have consequences,
but learning itself is essential for making those future choices. The con-
fluence of learning and technology—the evolution from traditional schools,
to distributed learning, and now to “ubiquitous learning”—is driving the need
for learning across time, space, and function. That kind of learning draws on
tools and techniques from diverse venues to enable seamless lifelong learning,
whether through training, education, or experience, as part of a holistic
approach to empowering human potential. Interdisciplinary stewardship will be
essential to extend and connect learning science, policy, and technology,
both to address today’s challenges and to prepare for the unknowable future.