Modernizing Learning: Building the Future Learning Ecosystem

Production and Distribution Notes

This is a publication by U.S. Government programs with external contributions.

The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of the Department of Defense, U.S. Government, or other governmental entities. This work is available under the Creative Commons Attribution 4.0 license (CC BY 4.0 IGO), https://creativecommons.org/licenses/by/4.0. Under this license, you are free to copy, distribute, transmit, and adapt this work, including for commercial purposes, under the following conditions:

Attribution—Please cite the work as follows: Walcutt, J.J. & Schatz, S. (Eds.) (2019).
Modernizing Learning: Building the Future Learning Ecosystem. Washington, DC:
Government Publishing Office. License: Creative Commons Attribution CC BY 4.0 IGO

Adaptations—If you create an adaptation of this work, please add the following disclaimer along with the attribution: This is an adaptation of an original work by the Advanced Distributed Learning (ADL) Initiative, part of the Office of the Deputy Assistant Secretary of Defense for Force Education and Training. Views and opinions expressed in the adaptation are the sole responsibility of the author or authors of the adaptation and are not endorsed by the U.S. Government.

ePub format
• eBook GPO Stock Number: 008-300-00197-2
• eBook ISBN: 978-0-16-095091-9
PDF format
• PDF GPO Stock Number: 008-300-00198-1
• PDF ISBN: 978-0-16-095092-6
Print format
• Print GPO Stock Number: 008-000-01329-2
• Print ISBN: 978-0-16-095088-9

Acknowledgments

The research for and publication of this book were sponsored by the Advanced Distributed
Learning (ADL) Initiative, a research and development program reporting to the Office of
the Deputy Assistant Secretary of Defense for Force Education and Training, part of the U.S.
Department of Defense.

Visual design: Sae Schatz and Elizabeth A. Bradley


Education is the answer to
everything. It’s the door opener;
it opens your mind to the possible.

Alfred Harms, Jr.


Vice Admiral, U.S. Navy (Ret.)
President, Lake Highland Preparatory School
Special Assistant to the President, University of Central Florida
VP for Strategy, Marketing, Communications and Admissions, UCF
Senior Editors
J.J. Walcutt, Ph.D., Director of Innovation, ADL Initiative
Sae Schatz, Ph.D., Director, ADL Initiative

Editorial Board
Xiangen Hu, Ph.D., Professor, The University of Memphis
Van Brewer, Ph.D., External R&D Principal (Contractor), ADL Initiative
Jody Cockroft, Research Specialist, The University of Memphis
Katie Flinn, Project Analyst (Contractor), ADL Initiative

Contributors (in addition to the authors listed in the table of contents)


Academia
David Munson, Ph.D., President, Rochester Institute of Technology
Christopher Guymon, Ph.D., Interim Dean, Graham School, University of Chicago
Christopher Dede, Ed.D., Wirth Professor in Learning Technologies, Harvard University
Susan Singer, Ph.D., VP for Academic Affairs and Provost, Rollins College
Martin Kurzweil, J.D., Director, Educational Transformation Program, Ithaka S+R
Melina Uncapher, Ph.D., Director of Education Program, Neuroscape, UCSF
Benjamin Nye, Ph.D., Director of Learning, ICT, University of Southern California
Kurt VanLehn, Ph.D., Professor, Arizona State University

Education
Ken Wagner, Ph.D., Education Commissioner, Rhode Island Department of Education
Daniel French, Secretary of Education, Vermont Agency of Education
Nathan Oakley, Ph.D., Chief Academic Officer, Mississippi Department of Education
Keith Osburn, Ed.D., Assoc. Superintendent, Georgia Virtual Learning, Georgia DoEd
Kimberly Eckert, Teacher, Brusly High, Louisiana State Teacher of the Year
Michelle Cottrell-Williams, Teacher, Wakefield High, Virginia State Teacher of the Year
Sandra Maldonado-Ross, President, Seminole Education Association, Florida
Sue Carson, Former President, Seminole Education Association, Florida

Government
Heidi Schweingruber, Ph.D., Director, Board on Science Ed., National Research Council
Suzanne Logan, Ed.D., Director, Center for Leadership Dev., Federal Executive Institute
Reese Madsen, SES, Senior Advisor to the U.S. Chief Human Capital Officers Council
Edward Metz, Ph.D., Research Scientist, U.S. Dept. of Education; Projects that Work
Erin Higgins, Ph.D., Research Analyst, U.S. Department of Education
Government (Cont.)
Pam Frugoli, Work Analyst, O*NET Competency Model, U.S. Department of Labor
Doug Tharp, Senior Learning Project Manager, Nuclear Regulatory Commission
Andrew Brooks, Ph.D., Chief Data Scientist, National Geospatial-Intelligence Agency

Military
Fred Drummond, SES, Deputy Asst. Secretary of Defense for Force Education & Training
VADM Alfred Harms, Jr., USN (Ret.), Lake Highland Prep School; UCF
Gladys Brignoni, Ph.D., SES, Deputy Commander, FORCECOM, U.S. Coast Guard
Lt. Gen. Thomas Baptiste, USAF (Ret.), President, National Center for Simulation
Maj. Gen. Thomas Deale, USAF (Ret.), Former Vice Director, Joint Force Development
RADM James Robb, USN (Ret.), President, National Training and Simulation Assoc.
Morgan Plummer, Director, MD5, U.S. Department of Defense
Ralucca Gera, Ph.D., Associate Provost and Professor, Naval Postgraduate School
LTC Michelle Isenhour, Ph.D., Assistant Professor, Naval Postgraduate School
Dennis Mills, Program Analyst, Naval Education and Training Command
Kendy Vierling, Ph.D., Director, Future Learning Group, USMC Training & Edu Command
Larry Smith, Technical Director, USMC College of Distance Education and Training

Non-Profit Organizations
Bror Saxberg, Ph.D., M.D., VP, Learning Scientist, Chan Zuckerberg Initiative
Russel Shilling, Ph.D., Chief Scientific Officer, American Psychological Association
Jason Tyszko, VP, Center for Education and Workforce, U.S. Chamber of Commerce
Elliott Masie, Founder, The MASIE Center
Amber Garrison Duncan, Ph.D., Strategy Director, Lumina Foundation
Emily Musil Church, Ph.D., Executive Director of the Global Learning XPRIZE
Betty Lou Leaver, Ph.D., Director, The Literary Center
Jeffrey Borden, Ed.D., Executive Director, Inter-Connected Education
Jeanne Kitchens, Credential Engine and Southern Illinois University

Industry
John Landwehr, Vice President and Public Sector Chief Technical Officer, Adobe
Phill Miller, Chief Learning and Innovation Officer, Blackboard
Shantanu Sinha, Director, Product Management, Google
Michelle Barrett, Ph.D., VP of Research Technology, Data Science, and Analytics, ACT
Anne Little, Ph.D., Vice President, Training Solutions Development
Stacey Poll, U.S. Public Sector Business Development Manager, Questionmark
Michael Freeman, Consultant, Training and Learning Technologies
Michael Smith, Senior Technical Specialist, ICF
CONTENTS
FOUNDATIONS
01 Modernizing Learning..................................................................... 3
J.J. Walcutt, Ph.D., Director of Innovation, ADL Initiative
Sae Schatz, Ph.D., Director, ADL Initiative, Office of the Deputy Assistant
Secretary of Defense for Force Education and Training

02 History of Distributed Learning................................................... 17


Arthur Graesser, Ph.D., Professor, The University of Memphis
Xiangen Hu, Ph.D., Professor, The University of Memphis
Steve Ritter, Ph.D., Co-Founder/Chief Product Architect, Carnegie Learning

03 Distributed Learning Instructional Theories.............................. 43


Scotty D. Craig, Ph.D., Associate Professor, Arizona State University
Ian Douglas, Ph.D., Executive Director, Institute for the Science of
Teaching and Learning, Arizona State University

04 Lifelong Learning........................................................................... 61
J.J. Walcutt, Ph.D., Director of Innovation, ADL Initiative
Naomi Malone, Ph.D., Research Scientist (Contractor), ADL Initiative

05 Learning Experience Design......................................................... 83


Sae Schatz, Ph.D., Director, ADL Initiative

TECHNOLOGY
06 Interoperability............................................................................ 107
Brent Smith, R&D Principal (Contractor), ADL Initiative
Prasad Ram, Ph.D., Founder and CEO, Gooru

07 Data Security............................................................................... 129


Justin M. Pelletier, Ph.D., Business Director, Eaton Cybersecurity
SAFE Lab, Rochester Institute of Technology

08 Learner Privacy............................................................................ 143


Bart P. Knijnenburg, Ph.D., Assistant Professor, Clemson University
Elaine M. Raybourn, Ph.D., Scientist, Sandia National Laboratories

09 Analytics and Visualization........................................................ 163


Shelly Blake-Plock, President and CEO, Yet Analytics, Inc.

10 Personalization.............................................................................181
Jeremiah Folsom-Kovarik, Ph.D., Lead Scientist, Soar Technology, Inc.
Dar-Wei Chen, Ph.D., Research Scientist, Soar Technology, Inc.
Behrooz Mostafavi, Ph.D., Research Scientist, Soar Technology, Inc.
Michael Freed, Ph.D., Consultant, Reperio
LEARNING SCIENCE
11 Assessment and Feedback......................................................... 203
Debra Abbott, Ph.D., Metis Solutions contractor, Joint Special Operations
University, U.S. Special Operations Command

12 Instructional Strategies for the Future..................................... 223


Brenda Bannan, Ph.D., Associate Professor, George Mason University
Nada Dabbagh, Ph.D., Professor and Director, Division of Learning Tech, George Mason University
J.J. Walcutt, Ph.D., Director of Innovation, ADL Initiative

13 Competency-Based Learning .................................................... 243


Matthew C. Stafford, Ph.D., Chief Learning Officer, Air Education
and Training Command, U.S. Air Force

14 Social Learning............................................................................ 269


Julian Stodd, Founder, Sea Salt Learning
Emilie Reitz, Bold Quest Analytical Working Group Lead, Joint Staff J6
(Joint Fires Division), U.S. Department of Defense

15 Self-Regulated Learning ............................................................ 285


Louise Yarnall, Ph.D., Sr. Research Social Scientist, Center for Technology in Learning
Michael Freed, Ph.D., Consultant, Reperio
Naomi Malone, Ph.D., Research Scientist (Contractor), ADL Initiative

ORGANIZATION
16 Instructional Designers and Learning Engineers..................... 301
Dina Kurzweil, Ph.D., Director, Education and Technology Innovation Support Office,
Uniformed Services University of the Health Sciences, U.S. Department of Defense
Karen Marcellas, Ph.D., Instructional Design Team Lead, Education and Technology
Innovation Support Office, Uniformed Services University

17 Governance for Learning Ecosystems........................................317


Thomas Giattino, Chief, Technology Integration Division, Air Education and Training Command
Matthew Stafford, Ph.D., Chief Learning Officer, Air Education and Training Command

18 Culture Change............................................................................ 339


Scott Erb, Captain, U.S. Navy (Ret.), Former Commanding Officer, Center for Security Forces
Rizwan Shah, Organizational Culture Advisor, Department of Energy

19 Strategic Planning ...................................................................... 357


William Peratino, Ph.D., Deputy Director, USALearning, U.S. Office of Personnel Management
Mitchell Bonnett, Ph.D., Distributed Learning Research, Standards & Specifications,
Directorate of Distributed Learning, Army University, U.S. Army
Dale Carpenter, Superintendent (Acting), U.S. National Park Service
Yasir Saleem, Senior Solutions Consultant, Adobe
Van Brewer, Ph.D., External R&D Principal (Contractor), ADL Initiative

Endnotes................................................................................................. 388
ACRONYMS
ADDIE  Analyze, Design, Develop, Implement, and Evaluate 
ADKAR  Awareness, Desire, Knowledge, Ability, Reinforcement  
ADL  Advanced Distributed Learning 
AI  Artificial Intelligence 
API Application Programming Interface
AR  Augmented Reality 
ASVAB  Armed Services Vocational Aptitude Battery  
BYOD  Bring Your Own Device  
cMOOC  Connectivist Massive Open Online Course 
CORDRA  Content Object Repository Registration/Resolution Architecture 
DARPA  Defense Advanced Research Projects Agency 
DHS Department of Homeland Security
DIS  Distributed Interactive Simulation 
DoD  Department of Defense 
EEG  Electroencephalogram 
EMT  Emergency Medical Technician 
ESSA Every Student Succeeds Act
FAA  Federal Aviation Administration 
FATE  Fairness, Accountability, Transparency, and Ethics 
FERPA Family Educational Rights and Privacy Act
FM  Field Manual 
fMRI  Functional Magnetic Resonance Imaging 
FYI  For Your Information 
GIFT  Generalized Intelligent Framework for Tutoring 
HLA  High-Level Architecture 
HR  Human Resources 
HSI  Human–Systems Integration 
HTML  Hypertext Markup Language 
HTTP  Hypertext Transfer Protocol 
I/ITSEC Interservice/Industry Training, Simulation and Education Conference
ICAP  Interactive, Collaborative, Active, and Passive 
ICICLE  Industry Connections Industry Consortium on Learning Engineering 
IDS  Intrusion Detection System 
IEC  International Electrotechnical Commission 
IEEE  Institute of Electrical and Electronics Engineers 
IEEE-SA  IEEE Standards Association 
InKD  Industrial Knowledge Design 
IoT  Internet of Things 
IPS  Intrusion Prevention System 
ISD  Instructional Systems Design 
ISO  International Organization for Standardization 
IT  Information Technology 
K-12  Kindergarten through 12th Grade 
KPI Key Performance Indicator
LMS  Learning Management System 
LOM  Learning Object Metadata 
LRMI  Learning Resource Metadata Initiative 
LRS  Learning Record Store 
LX  Learning Experience 
LXD  Learning Experience Design 
MERLOT  Multimedia Education Resource for Learning and Online Teaching 
MOOC  Massive Open Online Course 
MSSP  Managed Security Services Provider 
NASA  National Aeronautics and Space Administration 
NGO  Non-Governmental Organization 
NYCRR  New York Codes, Rules and Regulations 
OECD Organisation for Economic Co-operation and Development
OER Open Educational Resources
OPM Office of Personnel Management
PERLS  PERvasive Learning System 
PII  Personally Identifiable Information 
PLATO  Programmed Logic for Automatic Teaching Operations 
R&D  Research and Development 
RFID  Radio Frequency Identification 
ROI Return on Investment
SaaS Software as a Service
SAKI  Self-Adaptive Keyboard Instructor 
SAMR  Substitution Augmentation Modification Redefinition 
SAT  Scholastic Assessment Test 
SCORM   Shareable Content Object Reference Model 
SIEM  Security Incident and Event Management 
SOC  Security Operations Center 
STEM  Science, Technology, Engineering, and Mathematics 
TAPAS Tailored Adaptive Personality Assessment System
TECOM  Training and Education Command (part of the U.S. Marine Corps)
TED  Technology, Entertainment, and Design 
UI/UX  User Interface / User Experience 
VR  Virtual Reality 
xAPI  Experience Application Programming Interface 
XML  Extensible Markup Language 
xMOOC  Extended Massive Open Online Course
Foundations

CHAPTER 1

MODERNIZING LEARNING
J.J. Walcutt, Ph.D. and Sae Schatz, Ph.D.

The 21st century is marked by significant technological progress in every field.


For learning and development, these advancements have helped us realize the
promise of “anytime, anywhere” learning as well as learning personalized to
individual needs. More than that, emerging capabilities have thrown open the
door to transformative possibilities, facilitating learning at scale, optimizing
learning in response to large and diverse data sets, and developing fully in-
tegrated talent management systems for managing and enhancing the future
workforce.

Emerging technologies are not only changing the formal education and train-
ing landscape, they’re also changing our access to—and relationship with—
information and, by extension, affecting the soul of how we think, interact,
develop, and work. Our expectations for educational institutions, how and
where learning occurs, and what personal development looks like have
changed—and will continue to evolve into the future. The preK–12 system,
higher education, federal and state governments, employers, and military
must similarly adapt to accommodate.

The landscape of learning has broadened, now encompassing the full spec-
trum of formal, informal, and experiential training, education, and develop-
ment. The traditional concept of education is changing. Employers are placing
less value on formal degrees. Instead, experience matters. Life skills, such as
grit and teamwork, matter. Performance-based credentials, including com-
petency badges and micro-certificates, are taking the place of transcripts to
document individuals’ traits, talents, skills, knowledge, preferences, and ex-

perience. Similarly, age is becoming less of a marker of knowledge, skill, and


capabilities. These shifts, in turn, are disrupting conventional career trajec-
tories, as age correlates less and less with income and leadership potential,
and even changing the way we perceive employment and define our value as
contributors to our society.

We use the phrase “future learning ecosystem” to describe this new tapes-
try of learning. At the highest level, the future learning ecosystem reflects a
transformation—away from disconnected, episodic experiences and towards
a curated continuum of lifelong learning, tailored to individuals, and delivered
across diverse locations, media, and periods of time. Improved measures and
analyses help optimize this system-of-systems and drive continuous adapta-
tion and optimization across it. Its technological foundation is an “internet for
learning” that not only allows ubiquitous access to learning, it also provides
pathways for optimizing individual and workforce development at an unprec-
edented pace.

This book focuses on the human and organizational aspects of the future
learning ecosystem. It provides key terms and models, and it helps identify
the diverse professional sectors involved in the realization of this vision.

The United States Government has recognized a need for coordination among
the communities of learning scientists, organizational psychologists, software
and hardware engineers, teachers, talent managers, administrators, and other
innovators contributing to this concept. Simply organizing the multiple, in-
terdependent layers of the future learning ecosystem represents an enormous
undertaking, more so because its many facets must evolve in concert. Improv-
ing school classrooms, for instance, means little unless we also transform how
those experiences translate to collegiate, trade, business, and public-sector
settings. Similarly, developing systems for earning and communicating cre-
dentials creates scant value, unless we also understand how to authentically
measure the skills and attributes they accredit. And finally, even if we suc-
cessfully reshape every aspect of our learning and development systems, we

The future learning ecosystem—a holistic, lifelong, personalized learning paradigm—represents a contrast to the Industrial Age model of time-focused, one-size-fits-all learning.

must simultaneously consider the larger cultural and societal shifts affected
by this new approach. How will the reconceptualization of learning affect
jobs, self-worth, loyalty to businesses, power dynamics, access to education,
governmental processes, and our nation overall? When the paradigm of learn-
ing (something so fundamental to each of our lives) evolves it will have ex-
pansive and exciting, but difficult to fully forecast, effects.

WHAT IS LEARNING?
At its most foundational level, learning is any change in long-term memory
that affects downstream thoughts or behaviors. The process of learning starts
with awareness of stimuli,1 cognitive encoding of that information,2 and its
retention in memory. Later, the knowledge must be retrievable (that is, not for-
gotten) and transferable to novel situations.3 Throughout our lives, every per-
son learns constantly—all the time, every day. What we each learn, however,
its veracity, applicability, intelligibility, and whether it aids or limits perfor-
mance all vary significantly. Each day, we must reconcile among the complex,
competing information vying for our attention—all vying to “teach” us.
Contributions from diverse fields—including IT, data science, psychology, and learning science—form a repository of complementary recommendations; together, these define the framework of the future learning ecosystem.

These myriad science and technology advancements form the alloy needed to develop optimized learning solutions that maximize efficiency while expanding effectiveness.

The concept of learning applies across performance domains, not only to cog-
nitive development. It necessarily includes physical and emotional aspects as
well as inter- and intrapersonal, social, and cultural components. Certainly,
learning occurs in formal settings, in grade school classrooms or professional
workshops, but it also happens in self-directed, just-in-time, social, experien-
tial, and other informal ways.4 These varied experiences accumulate in long-
term memory and, fused together, affect how we respond to the world.5 In
other words, formal learning in combination with other life experiences col-
lectively determines someone’s readiness for work, public service, and other
life challenges.

Surfacing the Iceberg

To date, our education and training systems have generally focused on the
delivery and documentation of formal learning. As a result, we’ve fostered a
society that values the accreditation of formal training and education (think
college degrees) and proxy measures of aptitude (time-based promotions)
rather than life experiences and direct measures of competence. Of course,
this is based largely on our inability to measure, analyze, and share data about
the latter. With advances in technology, however, we’re surfacing informal
learning.

In talking about learning, enough with the barriers. We’re interested in outcomes. I want effective learning. I want measurable learning. I want learning that results in combat capability. That’s what we’re looking at, in terms of learning science, from our perspective inside the Pentagon. That’s where I’m pushing our folks.
Fred Drummond, Deputy Assistant Secretary of Defense for Force
Education and Training, U.S. Department of Defense

The growing visibility of, and access to, informal learning is reshaping our
conceptualization of learning: Increasingly away from a separate, fenced-off
and time-based activity and towards an integrated, diverse lifelong learning
continuum where all experiences and development add to an interdependent
set of holistic competencies. This paradigm shift means education is no longer
viewed as a linear and finite pathway, starting in grade school and culminat-
ing with a high school or university degree. Books and teachers, and other
hierarchical authorities, are no longer the primary gatekeepers of knowledge.
Vocational schools and formal apprenticeships no longer serve as the primary
pathways to develop trade skills. Individuals can even cultivate their athletic
abilities through self-developed and informal learning channels.

Informal learning means more than just self-directed study. Consider, for in-
stance, when a young person travels overseas for the first time. Perhaps with-
out intention, she learns about other cultures, people, history, and food, as
well as other, more subtle lessons about social dynamics, cosmopolitanism,
and even self-awareness. Undoubtedly, such experiences are learning, that
is, they impact long-term memory and change us. But how might society,
teachers, or employers value such learning? How do we record or account for
such experiences? How can we define and measure such seemingly intangible
qualities, such as worldliness, emotional maturity, or empathy?

21st Century Competencies

Elusive personal characteristics, such as good judgment and social aware-


ness, have always mattered. Increasingly, however, pundits are emphasizing
new capabilities that reflect the changing demands of the world. Automation
driven by artificial intelligence, ever-increasing computing power, big data,
advanced robotics, and the proliferation of low-cost advanced technologies
are shifting the nature of work, along with the organizational dynamics of
business, government, and society.
Technology is replacing the physical—and intellectual—tasks of many professions, from bus drivers and construction workers to medics and lawyers. Jobs involving manual labor, memorizing procedures, calculating solutions, and even synthesizing diverse information into novel forms are fast becoming the purview of computers. Meanwhile human work increasingly focuses on social and cultural factors, creativity and creative problem solving, digital literacy and technology partnership, and rapid adaptability. Modern core competencies tend to emphasize higher-order, more nuanced and sophisticated capabilities in lieu of fact-based knowledge or procedural skills. Similarly, where in the more recent past, highly skilled professionals typically advanced by focusing on narrow disciplines, today’s savants are often “expert generalists” able to synthesize across disciplines, learn new concepts and contexts rapidly, and adapt to changing conditions.

There’s a foundational set of cognitive, intrapersonal, and interpersonal skills that provide the flexibility, adaptivity, and capability people need to navigate through the kind of constant change, discontinuous, and sometimes irrational situations that pervade the 21st century. Education should focus on that, much more than it has in recent years, because if we don’t make that shift, we’ll develop a very brittle set of people at a time when adaptability will be core for their survival.

Christopher Dede, Ed.D.
Timothy E. Wirth Professor in Learning Technologies in the Technology, Innovation, and Education Program, Harvard University

In contrast to prior decades, there’s a greater expectation for individuals to


learn continuously and develop new capabilities across their entire careers.
In large part, this is spurred by the rapidly changing world around us. Pulit-
zer Prize winning author Thomas Friedman has dubbed this time-frame the
“Age of Acceleration,” reflecting the exponential growth in technology and
unbridled transformation across the globe.6 To excel in this age, we must learn
to thrive in volatility and complexity. We need deep understanding, across a
range of cognitive, affective, interpersonal, and physical competences, and

refresh those capabilities as situations evolve. We need to think in terms of


system dynamics, applying a strategic understanding of complex systems and
the far-reaching effects of actions taken within them. Organizations, too, must
learn to shift and grow with evolving needs, rapidly capturing and integrating
lessons learned and enabling the painless dissemination of new ideas across
their enterprises.

In short, to develop and maintain 21st century competencies, individuals re-


quire a greater breadth of interdependent knowledge and skills, at an increased
depth, that is, more advanced levels of nuanced capabilities, and these compe-
tencies must be acquired at a more rapid velocity. To meet such demands, we
must embrace continuous learning, find more efficient ways to develop and
maintain relevant knowledge and skills, and develop reliable feedback loops
that ensure our systems remain relevant in our ever-changing environment. In
other words, we must profoundly redesign the integrated continuum of formal
and informal training, education, and experience.

FUTURE LEARNING
ECOSYSTEM
The future learning ecosystem is a substantive reimagination of learning and
development. This concept recognizes the increasing need for cognitive agili-
ty, meaning learning is no longer viewed as a single event—nor even a series
of events—but rather as a lifelong experience of continual growth. Second,
the pathways through which learners progress must be personalized to their
unique attributes, skills, interests, and needs in order to achieve necessary
effectiveness and efficiency in learning. Finally, instruction and information
presentation methods must more strongly emphasize deep learning and expe-
dite the transfer of learning from practice to real-world settings.7

Extensive research, across myriad disciplines, has already examined many


aspects of the future learning ecosystem. However, to achieve its full imple-
mentation and maximal benefits, it’s necessary to harmonize the advance-
ments in learning science, technology, data science, organizational dynamics,
and public policy.

Technological Infrastructure

Information technology forms the enabling foundation of the future learn-


ing ecosystem. Instructional systems, interoperability standards, cross-plat-
form data integration, and centralized software services form the sinews and
nerves that transform today’s stovepiped, staccato learning episodes into a
holistic lifelong experience. Data schemata, technical standards, and gover-
nance conventions enable the recording, aggregation, and analysis of diverse
learning events—opening the possibility for substantial personalization and
data-driven enterprise adaptations. In other words, an integrated, technologically
enabled learning architecture unlocks the anticipated transformation in
learning. It means that learning can become pervasive—truly accessible any-
time, anywhere, in many forms, and for many functions; and accordingly,
learning can be tailored for optimal effect.
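
To make this concrete, consider how a single learning event might be recorded in such an architecture. The sketch below is illustrative only: it uses the actor–verb–object statement structure of the Experience API (xAPI), which appears in this book’s acronym list, but the LRS endpoint, credentials, learner, and activity ID are all hypothetical.

```python
import requests  # assumes the 'requests' package is installed

# A minimal xAPI statement: "learner completed a field-exercise module."
# The actor/verb/object structure follows the xAPI specification; the
# verb ID below is a standard ADL verb. Everything else is hypothetical.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Jane Learner",
        "mbox": "mailto:jane.learner@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/activities/field-exercise-101",
        "definition": {"name": {"en-US": "Field Exercise 101"}},
    },
}

# Post the statement to a Learning Record Store (LRS).
response = requests.post(
    "https://lrs.example.org/xapi/statements",  # hypothetical LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),          # hypothetical credentials
)
response.raise_for_status()
```

Because every tool in the ecosystem can emit records in a shared format like this, a learner’s formal courses, simulations, and informal experiences can all flow into the same store for later aggregation and analysis.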

Design

Where technology will open a new world of learning possibilities, learning


science and learning engineering—the thoughtful design of learning com-
ponents and systems—will allow us to capitalize on it. The future learning
ecosystem opens the aperture of learning and changes its core characteristics.
The classic instructional systems design model no longer suffices. The design
of learning, at both the local and enterprise levels, will need new theories and
practices. Learning designers will need to understand how to differentially
apply diverse technologies, blend disparate delivery modalities into holistic
To realize the future learning ecosystem vision, six critical areas must align.

TECHNOLOGICAL INFRASTRUCTURE
Flexible, interoperable technologies for pervasive learning

DESIGN
Intentional methods applied to optimize learning

COMMITMENT
Contributions to a shared vision across communities

GOVERNANCE
Negotiation of standards, conventions, and ethics

POLICY
Regulations and recommendations for behavior

HUMAN INFRASTRUCTURE
Diversely skilled individuals and organizational structures

experiences, build in and apply learning analytics, balance practical logis-


tics against learning outcome criteria, incorporate learning and development
into personnel and workforce systems, and perform all these actions within a
heterogeneous system-of-systems, which they only partially control.

Commitment

The term “ecosystem” refers to complex, interconnected systems. In stark


contrast to today’s more hierarchical training and education events, where the
teacher reigns within his classroom or the trainer dictates the design of her
curriculum, achievement of the future learning ecosystem requires collective
coordination across diverse communities. The benefits of the future learn-
ing ecosystem can only be realized through their gestalt. Learning design-
ers must embed ways to capture learning data, ideally using shared semantic
vocabularies. Technology vendors must eschew proprietary, closed systems
and embrace open architectures and interoperability standards. Early child-
hood educators must plan their curricula with postsecondary, workforce, and
community intersections in mind. Parents, learners, teachers, administrators,
human resource planners, and organizational leaders will need to buy in to
this concept—and actively contribute to its realization. While interoperable
technologies may form the foundations of the future learning ecosystem, the
social contracts followed by the ecosystem will give it breadth
and traction.

Governance

The future learning ecosystem grows from organizational coordination,


technological interoperability, and the aggregation of learning data across di-
verse technological and administrative boundaries. Even without (especially
without) a hierarchical leadership structure, such a complex system requires
sophisticated governance processes. Cross-sectional governance bodies will

The crisis across the nation is that there is so much disparity between what each child can access. The system has to be pervasive. The dream of America is that all Americans should have a free education through 12th grade.

Alfred Harms, Jr.
Vice Admiral, U.S. Navy (Ret.); President, Lake Highland Preparatory School; Special Assistant to the President and VP for Strategy, Marketing, Communications and Admissions, University of Central Florida

need to negotiate the conventions for sharing and protecting individuals’ data,
for designing and updating shared application programming interfaces, and for
balancing the competing interests of educational, commercial, and govern-
mental organizations. Accreditation bodies will need to evolve to accommo-
date new types of assessments and credentials. These governance bodies will
also have a responsibility to consider the social and societal impacts of this
new learning system. They will need to navigate a spate of new social and
ethical considerations, envision new legal and regulatory rules, and attempt to
anticipate the emergent risks and opportunities as the system matures. While
government will undoubtedly play a role, we—the stakeholders across highly

diverse communities—have a responsibility to actively participate in these


governance processes. Unlike a walled garden, where appointed caretakers
can curate the design, the future learning ecosystem requires the community
to take an active role in steering its ecology.

Policy

Governance bodies, along with the actual government and key performers
within the ecosystem, will inform policies for the future learning ecosystem.
Policy is the blueprint of recommendations and regulations that define guide-
lines for behavior within the system. Recommendations might include best
practices for collecting learning data and personalizing learning in response. Regu-
lations, or rules put in place to protect the public, might include guidance on
the privacy, ownership, and commercialization of learners’ data. Nearly all
innovation carries a double-edged sword: Creative foresight, social account-
ability, and ethical principles will need to guide employment of the future
learning ecosystem for our public sector as well as personal and business-re-
lated interests.

Human Infrastructure

Although technological advances make the future learning ecosystem pos-


sible, its implementation requires a multitude of differently skilled (human)
contributors. Hence, as we develop its technology infrastructure, learning the-
ories, and organization processes, we must also cultivate the future learning
ecosystem’s critical human infrastructure. A new subdomain of technologists
and learning-focused data scientists is clearly needed. The system will also
require numerous insightful talent managers, learning engineers, and course-
ware designers. Teachers, trainers, coaches, and mentors will need to be em-
powered and trained to take full advantage of this new milieu of learning.
Even individual learners will play a key role—not only in the “consumption”

It’s about the dignity of work. How do we create in our country a sense of work pride? We have an obligation and opportunity to create an environment where everyone has skin in the game.

U.S. Congressman Jack Bergman
Lieutenant General, U.S. Marine Corps (Ret.);
from a presentation at the 2018 I/ITSEC Conference

of learning but also in crowdsourced, peer-to-peer, and collaborative learning.


The future learning ecosystem will affect us all, and in turn, we can each
shape and contribute to it.

Blueprint for Implementation

This book examines the future learning ecosystem concept, our collective
progress towards its realization, and the pivot our systems and society need to
make away from formal, detached education and training towards experien-
tial, personalized, interconnected learning journeys. The U.S. Government’s
ADL Initiative has taken the lead in designing this book and is helping to
coordinate across the broad stakeholder community, both conceptually and
practically. The following chapters in this publication provide a snapshot of
the achievements the ADL Initiative and other contributors have made to date,
what we need to build for tomorrow, and what this near-future system will
enable our children, workforce, society, and military personnel to achieve.

Learning is a journey, not a destination.



CHAPTER 2

HISTORY OF
DISTRIBUTED LEARNING
Art Graesser, Ph.D., Xiangen Hu, Ph.D.,
and Steve Ritter, Ph.D.

Learning science and associated technologies have advanced dramatically,


and disruptively, over the last 30 years, and they will no doubt continue to
evolve through the foreseeable future. To proceed with wisdom, it’s prudent
to review the past and to examine how we came to our current state, what
achievements and pitfalls we encountered, and what lessons might translate
into the future learning ecosystem.

This chapter specifically examines the evolution of distributed learning. Un-


der this moniker, we’ve included related terms, often used synonymously,
such as distance learning, distributed or distance education, web-based and
web-enabled instruction, online learning, and e-learning—just to name a few!
More recently, “distributed learning” has come to reference an even wider
perspective, sometimes incorporating concepts such as distributed simula-
tion, mobile learning, augmented and virtual reality, computer-assisted in-
struction, and web-based self-directed learning. We touch on those, too. Even
certain generic terms, such as technology-enhanced learning or educational
technology, are sometimes used to reference distributed learning, and where
applicable, we’ve included those concepts as well.

Although we recognize distinctions among these terms, this isn’t an academ-
ic chapter on the nuances of vocabulary. Instead, we attempt to take readers
on a brief journey, starting with the foundations of distributed learning and

considering its evolutionary progress, across many different fields, towards a


unified, technology-enabled interconnected learning paradigm.

Certainly, others have written more robust historical accounts, for those in-
terested in more detail. For instance, in a now classic article, Soren Niper
outlines the three historic generations of distance education, starting with cor-
respondence teaching, followed by multimedia offerings (e.g., cassettes and
television broadcasts), and finally the third-generation, involving information
and communication technologies.1 Building upon Niper’s framework, Mary
Simpson and Bill Anderson wrote a brief and accessible overview of the “His-
tory and Heritage in Distance Education.” 2

For truly comprehensive treatments, refer to Michael Grahame Moore and


William Anderson’s Handbook of Distance Education originally published in
2003 (or Moore’s update of that classic in 2013).3 Also review Paul Saettler’s
thorough examination on The Evolution of American Educational Technolo-
gy 4 and J. Michael Spector and colleagues’ Handbook of Research on Edu-
cational Communications and Technology.5 In the latter, Michael Molenda’s
“Historical Foundations” chapter offers a particularly readable treatment of
the field’s development.

1980s
In all historical accounts of distributed learning, authors seem compelled
to highlight its analog foundations—hand-painted slides illuminated by oil
lamps in the 17th century, correspondence learning by mail in the 18th centu-
ry, or silent films in the early 20th.6 However, for our purposes, the history of
distributed learning meaningfully begins in the 1980s. This decade witnessed
the rise of personal computers, with widespread adoption in most schools
beginning around 1983.7 Their proliferation ushered in Niper’s so-called

third-generation of distance education, shifting away from “boxes of books”


and towards computer-based learning experiences.

Computer-based learning generically refers to the use of computers to access training and education. It can involve synchronous and/or asynchronous activities, delivered via networked or standalone stations. Early experiments in computer-based learning began in the late 1950s and early 1960s, with the University of Illinois’s PLATO project often cited as the first computer-based system and Gordon Pask and Robin McKinnon-Wood’s SAKI as the first adaptive trainer. SAKI, which stood for Self-Adaptive Keyboard Instructor, used a mechanical device to modify typing exercises in response to learners’ performance, typically shortening training time by one-half to two-thirds as compared to conventional instructional methods.8

Student using PLATO III, 1970; courtesy of the University of Illinois at Urbana-Champaign Archives

These experiments gave rise to the first-generation of computer adaptive tu-
tors, often called “computer-assisted instruction tutors.” In his meta-analytic
review of computer-assisted instruction from this time-frame, James Kulik
found students typically performed better (with an average effect size of .35
standard deviations), completed learning activities more efficiently (about a
quarter to a third more quickly), and tended to have more positive outlooks
on learning with computer-assisted instruction.9 Groundbreaking systems
emerged around this time-frame, including intelligent tutoring systems, which
were a substantial advance over computer-assisted instruction tutors with their
very simple assessment, feedback, and lesson-branching rules. Landmark
early intelligent tutors included Alan Lesgold’s SHERLOCK, John Ander-
son and colleagues’ LISP tutor, and John Seely Brown and Richard Burton’s
SOPHIE.10 These systems used automated computational procedures to guide

learners through problem steps, give hints, and provide teacher-like feedback.
The more advanced intelligent tutoring systems showed even higher learn-
ing gains, an effect size of .76 standard deviations, according to more recent
meta-analyses conducted by James Kulik, Phil Dodds, and Dexter Fletcher.11
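
For readers unfamiliar with the metric, these meta-analytic effect sizes are standardized mean differences (Cohen’s d): the gap between the treatment and control group means, expressed in pooled standard deviation units. In standard notation:

```latex
d = \frac{\bar{x}_{\text{treatment}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\, s_1^2 + (n_2 - 1)\, s_2^2}{n_1 + n_2 - 2}}
```

On this scale, the .76 reported here means the average student using an intelligent tutor outperformed the average conventionally taught student by roughly three-quarters of a standard deviation.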

Many of the early instructional technologies weren’t yet distributed, but that
was changing. Throughout the 1980s, U.S. federal agencies, including the De-
partment of Defense, National Science Foundation, and Department of Edu-
cation sponsored significant research on computer-based instruction, includ-
ing distributed learning.12 In 1989, the U.S. Office of Technology Assessment
delivered a Congressional report, called Linking for Learning, summarizing
the progress such investments had made over the decade:

Distance learning is expanding. …a national survey of representative


school districts indicated that an estimated 22 percent of school districts
now use distance learning, some 33 percent expect to be using these
resources by 1990. The second trend is more subtle. Distance learning is
changing educational boundaries—boundaries traditionally defined by
location and by institution. In the pooling of students and teachers, dis-
tance learning efforts reconfigure the ‘classroom.’ No longer bound by
the physical space, classrooms extend to other students in the same dis-
trict, to other districts, to other States, or even across national borders.13

The report also called for increased research on distributed learning, partic-
ularly regarding its effectiveness, methodology, and design. “The quality and
effectiveness of distance learning are determined,” it explained, “by instruc-
tional design and technique, the selection of appropriate technologies, and the
quality of interaction afforded to learners.” This was a job for instructional
designers.

The origins of Instructional Systems Design (ISD) trace back to the 1960s,
but the 1980s saw a proliferation of ISD models appear in the literature.
Roughly around this time, the ADDIE concept also materialized, apparent-
ly spontaneously,14 as a generic framework underpinning the various mod-

els.

ADDIE
Analyze, Design, Develop, Implement, and Evaluate
…an evergreen model, general enough to suit pretty much any process

Traditional ISD approaches grew out of the behaviorist paradigm, and similarly, most early computer-based learning used drill-and-practice tactics grounded in behaviorism.15 As Kulik observed at the time, “Most programs of computer tutoring derive their basic form from Skinner’s work in programmed instruction. Skinner’s model emphasized (a) division of instructional materials into a sequence of small steps, or instructional frames; (b) learner responses at each step; and (c) immediate feedback after each response.” 16
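
Skinner’s three elements map naturally onto a simple interactive loop. Purely as a hypothetical sketch—the frames, answers, and feedback strings below are invented for illustration—a programmed-instruction drill might look like this:

```python
# A minimal programmed-instruction loop illustrating Skinner's model:
# (a) small sequential frames, (b) a learner response at each step,
# (c) immediate feedback after each response. Content is hypothetical.
frames = [
    {"prompt": "2 + 2 = ?", "answer": "4"},
    {"prompt": "3 x 3 = ?", "answer": "9"},
]

for frame in frames:                             # (a) one small step at a time
    response = input(frame["prompt"] + " ")      # (b) learner responds
    if response.strip() == frame["answer"]:      # (c) immediate feedback
        print("Correct!")
    else:
        print(f"Not quite; the answer is {frame['answer']}.")
```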

Some educators in this decade also advanced an industrialized model for dis-
tributed learning, as best expressed by Otto Peters. He positively compared
distance education to industrial production, citing the division of labor, mass
production, realization of economies of scale, and reduced unit costs. His
model wasn’t intended as an instructional theory, but rather as an organiza-
tional concept that, in his own words, described the industrial “objectification
of the teaching process.” 17

Nonetheless, the state of learning science in educational technology was pro-


gressing. The 1980s saw a growing influence from the cognitivist school, for
instance, with the development of concepts such as cognitive-load theory. Al-
though this theory’s antecedents began in the 1950s, it wasn’t until the 1980s
that John Sweller connected those earlier cognitive principles to practical edu-
cational tactics. Based on observations of students studying, Sweller proposed
that inherent bottlenecks in our cognitive processes create barriers to learn-
ing that teachers can mitigate through careful instructional design. In other
words, Sweller’s theory posits that certain factors can increase our cognitive
load and distract us from learning the relevant information; more importantly,
his theory offered actionable recommendations to teachers and designers for
mitigating those distractions, including implications for educational technol-
ogy designers.18

Benjamin Bloom was also exploring the impacts of cognitive science on ed-
ucation. His influential research on the “two-sigma problem” attracted the
attention of many learning researchers. Bloom found that students who re-
ceive instruction via one-on-one (human) tutoring using mastery learning
techniques outperform those who receive group-based instruction in class-
rooms.19 This foundational study has become a rallying point for proponents
of computer-based adaptive learning.

Although Bloom’s classic study, as well as most of the computer-based learn-


ing so far, emphasized individual instruction, by the mid-1980s learning sci-
entists had begun exploring more constructivist and collaborative techniques,
building upon the constructivist educational theories of Jean Piaget, for ex-
ample, and of collaborative constructivist Lev Vygotsky.20 The most radical
constructivist educational theories begin with the premise that objective “re-
ality” is unknowable, and, instead, individuals construct a subjective, con-
textualized reality within their own minds. Less radical constructivists still
emphasize the active construction of knowledge that tends to settle into the
constraints of the objective physical and social world. For educational envi-
ronments, this implies that students learn best by engaging with instructional
material, actively generating learning experiences rather than passively inter-
preting information. Constructivism catalyzed a change in educational theo-
ry, moving it away from instructor- and content-centric views and towards a
learner-centric one.21 Social constructivism takes this premise a step further,
emphasizing collaboration and the impacts of social interactions on learning
and the construction of knowledge by groups.22

Social constructivist educational theories spurred the development of com-


puter-supported collaborative learning, software designed to support interac-
tive learning and computer-mediated communications. Businesses and uni-
versities began to develop communicative and educational technologies, such
as Xerox’s NoteCards and Carnegie-Mellon University’s Andrew.23 Marlene
Scardamalia and her colleagues from the University of Toronto also made

significant impacts on this field. For instance, they experimented with com-
puter-supported intentional learning environments that enabled collaborative
meaning-making by helping students share ideas, pictures, and notes via net-
work computers.24 Projects like this influenced the wider field of educational
technology, encouraging a fundamental shift towards social learning.

Such interest helped foster the idea of a “virtual classroom,” a multi-person
anytime, anywhere learning environment facilitated by networked comput-
er-mediated communications. “Suddenly it came to me,” Starr Roxanne Hiltz,
from the New Jersey Institute of Technology, explained. “A teaching and
learning environment did not have to be built of bricks and boards. It could
be constructed in software. It could be Virtual! In an era when many teachers
and students have their own microcomputers, it was no longer necessary for
them to travel to a classroom…the classroom could come to them, over their
telephone lines and through their computer.” 25

The digital collaborations spawned in the 1980s led to contextually rich envi-
ronments in the ensuing decades. While Hiltz and her colleagues developed
virtual classrooms, others built entire worlds. Virtual worlds, or “synchro-
nous, persistent network[s] of people, represented as avatars, facilitated by
networked computers,” 26 and synthetic environments, or realistic simulated
environments, similarly emerged during this era. One example of this is Mi-
chael Naimark’s concept of “surrogate travel,” virtual recreations of real en-
vironments navigable via a LaserDisc.27 Another instance is the NASA Ames
Laboratory’s virtual reality system, which used stereoscopic head-mounted
displays and a fiber-optic data glove. Finally, Habitat, developed by Lucasfilm
Games in association with Quantum Computer Services, Inc., is often-cited
as one of the first attempts to develop a large-scale, multiplayer, commercial
virtual world.28 Such systems would require several intervening decades to
reach fruition, but the contributions of these forerunners can’t be overstated.

While the education community developed virtual worlds and collaborative


virtual classrooms, the training industry similarly explored collective-learn-
ing capabilities, in their case, for multi-person training simulations. Promoted by organizations such as NASA and the U.S. military, computer-supported trainers first emerged in the 1940s. Initially, these instructional simulations were used as substitutions for live training that was too costly, unsafe, or otherwise inconvenient. However, during the 1970s, the training community began to value instructional simulation beyond mere substitution, seeing it as a unique instructional tool and a potential platform for team-based practice. Encouraged in part by the demand for collective and improved training, researchers started developing collective, distributed simulation-based training technology. The Defense Advanced Research Projects Agency’s (DARPA) Simulation Network (SIMNET), fielded in 1987, serves as a notable example.29 However, distributed simulation wouldn’t become a truly viable learning modality until the 1990s and the rise of the global internet.

By the end of the 1980s, “virtual” exploration was demonstrated routinely at NASA Ames and elsewhere. The picture above, taken in 1990, shows an operator using NASA’s Virtual Interface Environment Workstation, developed by NASA and VPL Research, Inc.; photo courtesy of NASA.

1990s
Computer-based learning continued to expand throughout the 1990s, in con-
junction with the increasing prevalence of personal computers, improvements
in their multimedia capabilities, and advances in computer networking. Most
notably, the 1990s were profoundly marked by the growth of the world wide
web (invented in 1989), and with it, broad access to networked communications.

The first operational web-based courses appeared in the mid-1990s, and by


the end of the decade, around 60% of all U.S.-based universities had web-
based offerings.30 Simultaneously, the e-learning industry emerged. Through-
out the ‘90s, vendors developed tools to help teachers and institutions manage
their e-learning resources. The associated software was released under a di-
versity of titles, including course management systems, virtual learning en-
vironments, learning platforms, and managed learning environments, as well
as learning management systems and learning content management systems,
which remain popular today.

In addition to traditional e-learning, some researchers began to promote adap-


tive hypermedia. In contrast to typical websites, which provide the same text,
links, and multimedia to all viewers, adaptive hypermedia systems create user
models of each visitor and then adapt the information and links presented.
Peter Brusilovsky and colleagues developed and tested adaptive hypermedia
systems that integrated web communication and intelligent tutoring concepts.31
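
As a concrete illustration, adaptive navigation support in this tradition often annotated or hid links based on a per-user overlay model of what the visitor already knew. The sketch below is hypothetical—the topic names, prerequisites, and traffic-light-style annotation scheme are invented for illustration, in the spirit of (not copied from) Brusilovsky’s systems:

```python
# Hypothetical sketch of adaptive link annotation: an overlay user model
# tracks estimated mastery per topic, and each link is annotated as
# "ready" or "not ready" depending on the visitor's prerequisites.
user_model = {"html_basics": 0.9, "css_basics": 0.2}  # estimated mastery, 0-1

pages = {
    "Intro to HTML": {"prereqs": []},
    "Styling with CSS": {"prereqs": ["html_basics"]},
    "Responsive Layouts": {"prereqs": ["html_basics", "css_basics"]},
}

def annotate(page: str, threshold: float = 0.6) -> str:
    """Return a traffic-light annotation for a link to `page`."""
    prereqs = pages[page]["prereqs"]
    if all(user_model.get(topic, 0.0) >= threshold for topic in prereqs):
        return f"[ready]     {page}"
    return f"[not ready] {page}"

for page in pages:
    print(annotate(page))
# [ready]     Intro to HTML
# [ready]     Styling with CSS
# [not ready] Responsive Layouts
```

The same user model that drives link annotation can also reorder or hide content, which is what distinguished these systems from static websites serving identical pages to every visitor.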

Along with adaptive hypermedia, the so-called “second-generation” of adap-


tive tutors—formally called intelligent tutoring systems—also matured. As
one notable example, the cognitive tutors developed by Ken Koedinger and
his colleagues trained middle school students in mathematics at thousands of
schools throughout the United States and showed impressive learning gains in
rigorous evaluations.32 In their meta-analysis on the topic, Kulik and Fletcher
show that intelligent tutors in the ‘90s reportedly average effect sizes of nearly
one standard deviation—gains nearly twice as high as the first-generation of
computer-assisted instruction tutors.33 The learning gains of these intelligent
tutors are approximately equivalent to human tutors.34

Affective computing originated as a branch of computer science around the


middle of this decade, notably by Rosalind Picard.35 Those researchers exam-
ined how to simulate emotions in AI, and they developed ways for machines
to detect emotions in humans. Both goals would prove relevant for education.
The former helped inform research on pedagogical agents, or animated char-

acters that serve as tutors or peers in instructional technologies.36 The latter


would help inform the adaptive responses of personalized learning systems,
such as by responding to students exhibiting boredom or frustration.37 Later,
as this discipline matured through the 21st century, researchers such as Rafael
Calvo and Sidney D’Mello would develop ways to sense these states more
reliably and less invasively, using tools such as eye-trackers, facial and gesture
recognition, mouse movements, and posture sensors.38

With all of these emerging technologies, it was becoming increasingly clear


that new evidence-based principles of learning were needed. One such ad-
vancement came from Richard Mayer and his multimedia learning theory.
Building on Sweller’s cognitive-load theory as well as other cognitive princi-
ples, Mayer carefully described learners’ mental processes when interacting
with multimedia instruction and then offered guidance on optimizing it, such
as: Present an explanation in words and pictures rather than solely in words,
and present corresponding words and pictures contiguously rather than sepa-
rately.39 Mayer’s work had significant impacts on the field; it made cognitive
science more accessible to educators and gave instructional designers clear
advice they could implement.

Instructional theories related to computer-mediated communication also


gained traction.40 Although these concepts emerged in the 1980s, it wasn’t un-
til this decade, with its ready access to web-based communication, that they
blossomed. Randy Garrison, a prolific scholar in this area, wrote of the time
“…we are entering a postindustrial era of distance education characterized
by the ability to personalize and share control of the educational transaction
through frequent two-way communication in the context of a community of
learners.” 41 Where the previous decade tended to emphasize the industrial
value of distributed learning tools, in the 1990s, theorists such as Garrison
began to place greater emphasis on the facilitation of teaching and learning at
a distance. Even Otto Peters, who first proposed the industrial model of dis-
tance education, asked in the 1990s whether there were “early signs of a ‘new
era’ which might be called ‘postindustrial’?” 42

While instructional theorists cheered the pedagogical opportunities offered


by the world wide web, some universities had even grander designs. In his
book, Mega-universities and Knowledge Media, John Daniel examined the
transformative power of large-scale, open distance learning in postsecondary
education, highlighting its promise to decrease costs, create flexibility, and
provide greater access to higher education (particularly in underprivileged ar-
eas). Daniel specifically examined the solutions offered by mega-universities
mega-universities,
such as the British Open University. By definition, these institutions remove
barriers to enrollment and serve a minimum of 100,000 students. “Providing
education and training for the burgeoning population of the developing world
is not only a challenge for the countries concerned,” Daniel wrote. “The secu-
rity of humankind may well depend on it.” 43

The power of the web to change society via education could not be ignored.
Marking its impact, the U.S. Congress established the bipartisan Web-based
Education Commission in 1998, part of the reauthorization of the Higher Ed-
ucation Act. In the Commission’s subsequent—and evidence-rich—capstone
report, titled The Power of the Internet for Learning, it urged Congress to
make e-learning a centerpiece of the nation’s education policy, saying “The
Internet is perhaps the most transformative technology in history, reshaping
business, media, entertainment, and society in astonishing ways. But for all
its power, it is just now being tapped to transform education. …It is now time
to move from promise to practice.” 44

The six promising trends cited by the Commission’s report included greater broadband access; pervasive computing, “in which computing, connectivity and communications technologies connect small, multipurpose devices, linking them by wireless technologies;” 45 digital convergence, or the merging of telecommunications, radio, television and other interactive devices into a ubiquitous infrastructure; education technology standards; emerging adaptive technologies that combine speech and gesture recognition, text-to-speech, language translation, and sensory immersion; and finally, the dramatically decreasing cost of internet bandwidth.

With the benefit of hindsight, we can append several additional trends to this list. One example is mixed reality, a continuum including virtual reality (VR) and augmented reality (AR). Although pioneered throughout the 1950s through 1980s, their first practical applications for education and training came in the mid-1990s. VR offerings at that time typically used either head-mounted displays or cave-like projection rooms to create immersive experiences.46 In contrast to VR, which attempts to wholly replace reality with virtual sights and sounds, AR systems inject virtual stimuli into actual situations, such as overlaying graphics onto a real-time, real-world video. However, in both cases, the technology was still expensive and generally cumbersome—but it has been advancing rapidly. Still, empirical evaluation of the effectiveness of these technologies for improving learning or motivation remains surprisingly minimal, even to this day.

[Image caption: Virtual Fixtures, considered the first immersive augmented reality system, was built by Louis Rosenberg while at the U.S. Air Force Research Laboratory. Pictured, Rosenberg using the system in 1992; photo courtesy of AR Trends.]

Distributed simulation also saw marked progress during this decade. The
development of SIMNET the decade prior had given birth to the era of
networked real-time simulations. Now, the same proponents who drove the
creation of SIMNET sought to develop synthetic environments capable of
seamlessly integrating live, virtual, and constructive simulations within a
common environment.47 Towards that end, engineers were developing new
interoperability standards to support synchronous instructional scenarios, in-
cluding the Distributed Interactive Simulation (DIS) and the High-Level Ar-
chitecture (HLA) protocols,48 and researchers were examining the viability of


using the world wide web for distributed simulation.49

The U.S. Government was also looking at better ways to leverage web-based
learning, particularly for military and workforce development. These require-
ments led to the creation of the Advanced Distributed Learning (ADL) Initia-
tive. The ADL Initiative traces its antecedents to the early 1990s, when Con-
gress authorized the National Guard to build prototype electronic classrooms
and learning networks for their personnel. By the mid-1990s, DoD realized
the need for a more coordinated approach, and the 1996 Quadrennial Defense
Review formalized this by directing development of a Department-wide strat-
egy for modernizing technology-based education and training. This strategy
became the original ADL Initiative. In 1998, the Deputy Secretary of Defense
directed the Undersecretary of Defense for Personnel and Readiness, in col-
laboration with the Services, Joint Staff, Undersecretary for Acquisition and
Technology and the Comptroller, to lead the burgeoning program. He also
directed the creation of a department-wide policy for distributed learning,
development of a corresponding “master plan” to carry out the policy, and
resources for the associated implementation. Shortly thereafter, aspects of the
ADL Initiative grew into a federal-wide program, with a mandate to help uni-
fy e-learning systems through coordination, shared technology standards, and
the application of modern learning theory.

The advanced distributed learning strategy requires re-engineering the


learning paradigm from a “classroom-centric” model to an increasingly
“learner-centric” model, and re-engineering the learning business
process from a “factory model” (involving mainly large education
and training institutions) to a more network-centric “information-age
model” which incorporates anytime-anywhere learning.50

Part of the ADL Initiative’s mission involves technology standards for dis-
tributed learning. In the 1990s, standards such as Hypertext Transfer Protocol
(HTTP) and Hypertext Markup Language (HTML) were just appearing. Sim-
ilarly, Extensible Markup Language (XML) was released in the mid-1990s,


helping to turn the web from a presentation medium to a data-rich platform
and, notably, opening the door to the semantic web.

Whole books could be (and most certainly have been) written about the techno-
logical advancements seen in the last decade of the 20th century. For our pur-
poses, a few other notable ones included the growing prominence of AI and
data mining, availability of natural language interfaces, commercialization of
personal digital assistants and associated cellular communications, and cre-
ation of DVDs. Unprecedented demand for computational models also devel-
oped, encouraging researchers to craft extensive model sets for all manners of
industries including airport facilities, call centers, businesses, health centers,
and even fast-food restaurants.51 Cognitive modeling approaches, initially ex-
plored in earlier decades, started to be realized in applied systems. DARPA’s
Pilot’s Associate, for instance, incorporated artificial intelligence and cogni-
tive modeling to infer an aircraft pilot’s intentions and support her decision
making. These sorts of cognitive and neuroscience advances also marked this
era, and later led President George H. W. Bush to designate it “the Decade
of the Brain.”

2000s
The 2000s continued to see acceleration in learning technologies, aided by ex-
panding broadband access, consumer smartphones, streaming video services,
e-book readers, and the rise of social media. As mobile phones permeated
across the globe, practitioners embraced mobile learning (or m-learning). In
developing nations, m-learning became a lifeline, delivering education to mil-
lions of otherwise disconnected or underserved people.52 Even in industrial-
ized countries, m-learning opened new doors, offering an innovative platform
for context-aware, pervasive learning.53

Content designed for m-learning often took the form of bite-sized,


microlearning chunks. Although microlearning and mobile learning are dis-
tinct concepts, the two overlap and intersect considerably, with both empha-
sizing flexible, self-paced content and the contextualization of learning. Smart-
phone-based microlearning helped realize the original promise of anytime,
anywhere—truly ubiquitous learning, delivered at the point of need.

While m-learning developed, conventional online learning continued to grow.


By the end of the decade, 80% of U.S. school districts offered online cours-
es.54 Nearly all universities included some form of e-learning, and many cor-
porations, such as Cisco and AT&T, had migrated substantial portions of their
corporate training online.55 Commercial learning management systems, such
as Blackboard and WebCT, held prevalent market share, and open-source
competitors, such as Moodle and Sakai, were gaining popularity.

The growing demand for e-learning software reinforced the need for asso-
ciated technology standards, such as the Learning Object Metadata (LOM)
and Dublin Core for defining content metadata, and the Sharable Content Ob-
ject Reference Model (more commonly known as SCORM) specifications for
making e-learning content interoperable across systems.56 Dovetailing with
these specifications, researchers promoted the concept of “instructional ob-
jects,” or encapsulated learning materials that could be remixed and reused.
As Fletcher predicted in 2005:

…the emphasis in preparing materials for technology-based instruction


(or performance aiding) will shift from the current concern with de-
veloping instructional objects themselves to one of integrating already
available objects into meaningful, relevant, and effective interactions. 57

With such goals in mind, proponents began creating learning registries and
content repositories—federated systems intended to support seamless discov-
ery and access to content, such as the Content Object Repository Discovery
and Registration/Resolution Architecture (CORDRA)58 and the Multimedia
Education Resource for Learning and Online Teaching (MERLOT) project.

Although the idea of object registries has floundered somewhat in the inter-
vening years,59 the promise of ready access to learning continues to gain
ground.

Interest in making education broadly accessible spurred the open educational


resources movement, committed to making learning resources free and wide-
ly available to teachers, trainers, and learners.60 Creative Commons, with its
open licensing model, formed in 2001, the same year Wikipedia launched.
Wired magazine also coined the term “crowdsourcing” in the
mid-2000s, defining it as “…taking a function once performed by employees
and outsourcing it to an undefined (and generally large) network of people in
the form of an open call”—a concept the open educational community quick-
ly embraced.61

The campaign for open education also drove development of massive open
online courses, or MOOCs. Although MOOCs wouldn’t become widely pop-
ular until 2012, they first appeared in 2008. Platforms, such as Udemy and
Peer 2 Peer University, were founded soon after, offering free online cours-
es to thousands of students. MOOCs also introduced a new learning para-
digm. The first MOOCs grew out of connectivist learning theory, developed
by George Siemens and Stephen Downes. Dubbed “a learning theory for the
digital age,” 62 connectivism suggests that knowledge is distributed across
networks of connections—particularly in our complex modern world. Con-
sequently, it emphasizes continuous learning, the ability to see connections
among information sources and across different fields, and the importance
of current, diverse knowledge. The original, connectivist MOOCs are some-
times called cMOOCs, to accentuate their emphasis on social learning, coop-
eration, and the use of collaborative learning tools.

In addition to connectivism, several other learning theories developed


throughout the 2000s. For example, the National Research Council published
How People Learn, 63 an influential book encapsulating far-reaching insights
on classroom teaching and learning. Lorin Anderson and David Krathwohl

released their two-dimensional revision of Bloom’s famous taxonomy.64 Da-


vid Merrill published his First Principles of Instruction,65 which helped to
integrate competing behaviorist, cognitivist, and constructivist learning the-
ories. Steve Fiore and Eduardo Salas published a compendium dedicated to
applying collaboration dimensions of learning science to online learning,66
and the Institute of Education Sciences released its seven cognitive princi-
ples of learning, backed by solid empirical data and readily applicable in the
classroom.67

The research and practice of personalized learning environments matured,


growing out of the fields of constructivism and adaptive hypermedia 68 as
well as intelligent tutoring systems and artificial intelligence in education.69
The flipped classroom concept, originally developed in the 1990s,70 gained
widespread popularity. This instructional technique reverses the classic
schoolhouse model by delivering didactic instructional content outside of the
classroom and using face-to-face time for
interactive learning, notably those activities
traditionally reserved for homework. The
growth of online learning tools and stream-
ing technologies made flipped classrooms
more accessible to teachers. Salman Khan,
who founded the Khan Academy in 2004,
also significantly contributed to their popu-
larity, helping to broadly familiarize teach-
ers and the public with the concept.71

[Image caption: The influential How People Learn, and its sequel How People Learn II, are openly available from the National Academies at www.nap.edu]

Likewise, the application of spaced learning tactics gained widespread acceptance during this decade (one of the seven cognitive principles of learning from the Institute of Education Sciences72), although its roots date back to the 19th century. Also called

distributed practice, this principle highlights that learning occurs best (that
is, is best encoded in and made retrievable from long-term memory) when its
presentation is spread over time rather than massed into fewer, more concentrated
sessions. Paul Kelley, headteacher at a British high school, helped popularize
spaced learning in his 2008 book Making Minds, which drew notably from
neuroscience principles. In it, he wrote, “As of this moment, scientific analysis
of learning has hardly made any impact on education. In contrast, knowledge
in areas of technology and science generally is growing rapidly. As we will
see, this knowledge is often quite at odds with the conventional wisdom of ed-
ucation. The scientific understanding of the human brain, and how it works, is
beginning to show that learning is not an abstract transmission of knowledge
to an infinitely plastic intelligence but a biochemical process with physical
limitations.” 73
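
As a concrete software illustration of distributed practice (our own sketch, not Kelley’s classroom protocol), a simple expanding-interval scheduler might space reviews of a single item as follows; the interval values are arbitrary examples:

```python
# Illustrative sketch of distributed practice scheduling (our example,
# not Kelley's classroom protocol). Each successful review pushes the
# next presentation further into the future.
from datetime import date, timedelta

# Arbitrary example intervals, in days, between reviews of one item.
INTERVALS = [1, 3, 7, 14, 30]

def next_review(successes, today):
    """Schedule the next review based on how many consecutive
    successful recalls the learner has had for this item."""
    idx = min(successes, len(INTERVALS) - 1)
    return today + timedelta(days=INTERVALS[idx])

# A learner who has recalled an item twice waits a week before seeing it again.
print(next_review(successes=2, today=date(2019, 1, 1)))  # 2019-01-08
```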

Conversation-based learning environments with pedagogical agents and


avatars on the web flourished during this decade—and into the future. Stu-
dents could learn by holding conversations in natural language, such as in the
AutoTutor system developed by Art Graesser and colleagues 74 and in virtu-
al reality environments, such as Crystal Island, developed by James Lester
and colleagues 75 and the Tactical Language and Culture System developed by
Lewis Johnson.76 These systems promoted constructivism and collaboration,
with engaging, socially and emotionally sensitive interaction.

Desire for increased, evidence-based rigor was also seen in assessments
of learning.77 Although not a new concept, learning scientists strongly pro-
moted the use of tests for learning,78 and urged teachers to move away from
multiple-choice items in favor of more active techniques, such as writing es-
says, which most teachers didn’t know could also be automatically graded
with high reliability.79 Relatedly, by the end of this decade, increasing com-
puting power and the expanding amounts of learning data encouraged the de-
velopment of learning analytics, led by George Siemens and his colleagues,80
and educational data mining, led by Ryan Baker and his colleagues.81 These
closely related fields, each of which evolved to have


professional societies and journals of their
own, apply principles of data science to
learning data, often collected from inter-
action logs or assessments built into edu-
cational technologies. Although research-
ers continue to debate the finer points of
these definitions, both fields emphasize
the use of measurement, collection, and analysis of data relevant to learning and development, along with the application of those analyses for enhancing some aspects of the learning system.82

[Image caption: An early AutoTutor interface from the 1990s, courtesy of Graesser et al.]

2010-PRESENT
From a learning science and technology lens, the 2010s blend into the prior decade, but several technological advances have changed the landscape dramatically. This decade ushered into our world accurate spoken-language understanding, smartphones across all segments of society, ubiquitous gaming and social media, fine-grained tracking of performance in log files, sensing algorithms that detect people’s emotions and identities, MOOCs on thousands of topics, hyper-realistic animated agents, collaborative problem solving, and disruptive AI that will replace many jobs. It is impossible to forecast the most impactful inventions of our current era; however, a few trends already stand out, though whether they will stand the test of time remains to be seen.

MOOCs have continued to develop, although not without their critics and con-
cerns. More commonly, today, MOOCs follow the so-called Extended MOOC
model. These xMOOCs share some features with cMOOCs, including open
We just finished up a manuscript for the Journal of
Cognition and Development describing where we’ve
come from in the learning sciences and where we’re going.
We traced the funding investments from the 1970s until now
and noted that the funding is coming from different places,
including multiple federal agencies and private foundations.
For example, the Office of Naval Research has a long track
record of funding in this space, as does the Department of
Education in many capacities—not just through the Institute
of Education Sciences but also through predecessors, like
the National Institute of Education.

Federal agencies take different approaches to funding this


research, in part due to the differences in agencies’ mis-
sions, but the goal of understanding how people learn is
shared. We observed that these investments either took
a content-agnostic approach—studying learning principles
typically studied in the laboratory that may have wide-ranging
benefits for learning, such as retrieval practice—
or, they took a content-dependent approach. For instance,
investments in reading were a focus in the ‘70s and ‘80s and
then again in 2010 with the Institute of Education Sciences’
Reading for Understanding Initiative… This content-depen-
dent approach is very different from the con-
tent-agnostic one; it’s about identifying nuances and chal-
lenges within a content area from a cognitive science angle.

Both the content-agnostic and content-dependent ap-


proaches have been funded in parallel over the years, and
both have made important contributions to our understand-
ing of how people learn. You need the content-agnostic approach
to identify promising learning principles, but the content-dependent
one is also necessary because each content area
has unique needs. Ultimately, we need to combine these
two approaches; however, they’re taken by different types
of cognitive scientists. It would be beneficial if those groups
started working together.

Erin Higgins, Ph.D.


Program officer within the Institute of Education
Sciences, U.S. Department of Education
Look for Higgins, Dettmer, and Albro, currently in press

access and large scale. However, where cMOOCs stress connectivist learn-
ing, xMOOCs generally use more traditional, instructivist methods, focusing
instead on scalability. Spanning both industry and academia, the most popu-
lar xMOOCs launched in 2012, including Coursera, edX, and Udacity. These
platforms, which attempt to provide learning at scale, have been significantly
aided by the development of cloud computing in the 2000s and by the con-
sumer release of Amazon Web Services and Microsoft Azure. Cloud systems
made the “service” model of computing viable, freeing software applications
to become device and location independent, allowing for more frequent appli-
cation updates, and creating a near-infinite capacity to scale on-demand.

Cloud computing also helped realize the Internet of Things (IoT), the network
of smart devices that can connect to networks and share data. Cisco’s Chief
Futurist, Dave Evans, estimates the IoT was “born” around 2008 or 2009, but
researchers have only begun exploring its applications for learning.83 In the
context of education and training, IoT helps bridge real and virtual contexts,
allowing learners to interact with networked physical objects that also have
digital footprints.84 These objects might include embedded RFID sensors,
spatial beacons, or wearable technologies, such as FitBits or Google Glass.85

Some wearable technologies also incorporate neurophysiological sensors,


such as heart-rate monitors or eye-trackers. The commercial versions of these
still usually suffer from noisy data, and are only starting to be meaningfully
integrated into applied learning systems. Applications of psychophysiological
tools (e.g., eye-tracking, skin conductance), brain imaging tools (e.g., fMRI,
EEG), and affective computing are rapidly advancing in laboratory contexts,
and researchers are already having success detecting students’ emotions from
low-cost video feeds, pulled from the stock cameras on phones and laptops.86
Further, several new DARPA programs are teasing science fiction–like results
as they explore neural interfaces; these have already been shown to enhance human
cognition and learning in clinical experiments, and they could one day enable
complex human-machine teaming.87

Each of these applications produces an overwhelming amount of digital


by-products—a smog of data. The explosion of learning data, and corre-
sponding growth and diversity of learning platforms, has once again created a
need for new technology standards. The ADL Initiative began developing the
Experience API (xAPI) in 2011, with its first public release in 2013. xAPI lets
software applications share (potentially big) data about human performance,
along with associated instructional or performance context information. xAPI
helps analysts aggregate and collectively analyze learner data from different
systems—from traditional LMSs to mobile devices, simulations, wearables,
and physical beacons. xAPI also represents one piece of the developing Total
Learning Architecture, a set of specifications that promises to connect the
many dissimilar and stovepiped learning technologies into a more cohesive
system-of-systems.
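
To give a flavor of the specification, the sketch below builds a minimal xAPI statement (an actor, a verb, and an object) and sends it to a Learning Record Store. The statement structure and version header follow the published specification, but the learner, course, endpoint URL, and credentials are hypothetical placeholders.

```python
# Illustrative sketch only: the LRS endpoint and credentials below are
# hypothetical placeholders, not part of any real deployment.
import requests

# An xAPI statement is a JSON document with three required parts:
# an actor (who), a verb (did what), and an object (to what).
statement = {
    "actor": {
        "name": "Pat Learner",
        "mbox": "mailto:pat.learner@example.com"
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"}
    },
    "object": {
        "id": "http://example.com/courses/intro-to-xapi",
        "definition": {"name": {"en-US": "Intro to xAPI"}}
    },
    # Optional context, e.g., which platform generated the data
    "context": {"platform": "Example mobile app"}
}

# Statements are POSTed to a Learning Record Store (LRS) over HTTP.
response = requests.post(
    "https://lrs.example.com/xapi/statements",   # hypothetical LRS
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("username", "password"),               # hypothetical credentials
)
print(response.status_code)
```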

The sophistication of the 21st century learning environments and complexity


of data within them have the unfortunate consequence of driving up costs.
An expensive system, say costing $50 million, is economically plausible if it
delivers training to 10 million learners ($5 per learner)—but not if only 100 people benefit ($500,000 per learner).
There have been a number of efforts to reduce costs in addition to improving
learning and motivation. For example, intelligent tutoring systems have been
expensive to develop in the past, so the Army Research Laboratory, in an effort
led by Bob Sottilare, organized a community of over 200 researchers and developers
to articulate adaptive instructional system guidelines in a 7-volume book se-
ries that covers learner modeling, instructional management, authoring tools,
domain models, assessment, team tutoring, and self-improving systems.88
This Generalized Intelligent Framework for Tutoring (GIFT) initiative also
includes a functional computational architecture that can be used to develop
and test systems.

Another emerging approach to reducing costs is to use crowdsourcing in con-


tent creation and modification, with machine learning to automatically tune
quantitative parameters in self-improving systems.89 Unfortunately, the field

T3 INNOVATION NETWORK
In early 2018, the U.S. Chamber of Commerce and Lumina Foundation launched
the T3 Innovation Network to bring businesses, postsecondary institutions, techni-
cal standards organizations, human resource professionals, and technology vendors
together to explore Web 3.0 technologies for an increasingly open and decentralized
public–private data ecosystem. Since its kickoff, the Network has grown into a thriv-
ing network of over 128 organizations who are addressing three key challenges: (1)
The need for harmonization among technical data standards groups to ensure data is
interoperable and shareable across systems and stakeholders; (2) The need to apply AI
solutions to improve how learning objectives, competencies, and skills are authored,
translated, and distributed; and (3) The need to empower learners and the American
worker with data to improve their agency and ability to manage and connect to oppor-
tunities in the talent marketplace.

still lacks a systematic, widely accepted approach to estimating costs and de-
velopment time for building and testing these complex learning environments.

With the increasing automation in education and training, there’s been a cor-
responding push to create semantically rich data, that is, to give meaning
to the underlying data elements—in ways computers (and other humans)
can understand. The developers of xAPI, for instance, are attempting to build
semantically rich usage profiles as well as published, shared vocabularies.
Proponents of competency-based learning are attempting a similar feat, but
in their case, to define the data elements that make up a human competency.
Volunteers supporting the IEEE established a working group in 2018 to revise
the decade-old Reusable Competency Definition (1484.20.1), expanding its
utility and harmonizing it with other standards for competencies and compe-
tency frameworks.90

The working group’s efforts are timely, as more formal education programs
are embracing competency-based degrees, i.e., postsecondary programs
where students earn diplomas by demonstrating mastery through real-world
projects—rather than through time-based credit hours. In competency-based

programs, students are typically assigned learning coaches, rather than di-
dactic instructors, and they have access to an array of open-source resources,
including videos, textbooks, and online communities.91 As of 2014, there were
already an estimated 200+ competency-based learning postsecondary degree
programs in the U.S., but policy regulations are lagging.92 It’s not clear how
this trend will resolve, but we fully expect the core concept to expand in the
coming years.

Like competency-based degrees, micro-credentials and the associated tech-
nology standards for digital badges have garnered growing attention. Train-
ing and education credentials, such as licenses and diplomas, have existed
for centuries as a way to verify someone’s educational pedigree. Like their
more robust cousins, micro-credentials assert that a person has demonstrat-
ed a particular competency. Unlike more formal credentials, however, learn-
ers can receive micro-credentials for smaller learning segments, and (at least
hypothetically) micro-credentials reflect the performance-based approach of
competency-based learning. Whether micro-credentials catch on remains to
be seen. Practical and policy challenges still face the field; although, organi-
zations such as the Lumina Foundation, Digital Promise, and BloomBoard
are working to overcome them. Meanwhile, some commercial organizations
are charging ahead with their tiny certs, including Udacity’s nanodegrees and
edX’s MicroMasters.93

Given these many technological inventions, the rise of learning analytics,


surge in neuroscience research, and developing maturity of learning science,
educators and instructional designers are forced to rethink their discipline as
well as their own capabilities. If done correctly, the future of learning will
look noticeably different from its Industrial Age ancestor. Corresponding-
ly, some have embraced the concept of learning engineers—a new (and still
forming) paradigm that describes the “instructional designer” of the future.
In 2017, the IEEE created a working group, named the Industry Connections
Industry Consortium on Learning Engineering, to help mature the idea, led

by Bob Sottilare, Avron Barr, Robby Robson, Shelly Blake-Plock, and others.
In 2018, Chris Dede, John Richards, and Bror Saxberg released their guide to
Learning Engineering for Online Education.94 Saxberg, who also serves as
a Consortium advisor and as vice president of learning science at the Chan
Zuckerberg Initiative, described the emerging discipline:

A Learning Engineer is someone who draws from evidence-based in-


formation about human development—including learning—and seeks
to apply these results at scale, within contexts, to create affordable, reli-
able, data-rich learning environments.95

He added, in another quotation:

There will come a time when we look back at how we “used to do learn-
ing,” and, just as we now look at medicine in the 19th century, wonder
how we ever made progress without using the science and evidence that
we can now generate. We’re not there yet—but we may be on our way.96

Saxberg’s words ring true, not just for learning engineers but for the wider
learning and development sector. Much has changed as technology advanced
and learning science evolved. The concept of “distributed learning” has pro-
gressed, from its simple roots as a pragmatic tool to bridge the transactional
distance, to today’s cacophony of ubiquitous, adaptive, on-demand instruc-
tion. A central goal of the ADL Initiative and its larger community has always
been to bring clarity and coordination to this discipline. Today, more than
ever, the distributed learning community needs organizational, theoretical,
technological, and policy structures to bring unity. We are, perhaps, in the
middling ugly-duckling years of the field’s maturation. The promise of re-
sponsive and evidence-driven ubiquitous learning is there, crafted by con-
tributors for over 40 years. It’s now our challenge to resolve the complexity,
to bridge across its numerous facets as our connectivist peers have taught us,
to infuse deliberate learning theory into our work as learning science scholars
advise, and, as the learning engineers promote, to embrace a comprehensive
approach to enhancing the full continuum of learning.
The first hurdle is to move past the “recorded slides and
talking head” form of online learning. The instructors need
to be trained on advances in digital learning technology and
methodology. The second hurdle is ensuring that the organization
has a modern, experience-driven learning environment that supports
these more interactive and personalized experiences. The third is to
communicate expectations between the instructor and learners that
this isn’t a lecture, rather, it’s a facilitated dialogue, not limited to a
particular place and time—but available for continuous reference and
enhancement.

John Landwehr
Vice President and Public Sector Chief Technical Officer, Adobe

CHAPTER 3

DISTRIBUTED LEARNING
INSTRUCTIONAL THEORIES
Scotty D. Craig, Ph.D. and Ian Douglas, Ph.D.

Learning has moved beyond the classroom. It’s happening everywhere, all the
time, formally and informally, incidentally and intentionally—and increas-
ingly supported by digital technologies. For more than a decade, online educa-
tion has consistently expanded.1 The U.S. Department of Education estimates
that 5.8 million students enrolled in distance education courses in 2015, the most recent year for which statistics exist, accounting for 28% of the total student population.2 The Association for Talent Development reported that 88% of corporations offered e-learning as part of their workforce development in 2017, and 27% of high-performance organizations used e-learning for a majority of their training.3 MOOC clearinghouse Class Central reported that MOOCs also grew, serving over 80 million students in 2017.4

[Pull quote: Distributed learning should employ evidence-based practice, built on the science of learning]

No doubt the impact of distributed learning will continue to grow; hence,


educational decision makers, instructional designers and learning engineers,
teachers, and trainers should understand the best practices for technology-en-
abled learning—and implement these to their best abilities and resources.
This isn’t just our opinion. For instance, the Every Student Succeeds Act,
signed into law by President Barack Obama in 2015, requires that students in
America be taught to the highest academic standards and asks schools to employ
evidence-based approaches to learning, supported by a scientific process that

provides evidence of effectiveness. Similarly, the World Bank cited “acting
on evidence to make schools work for learning” among its three priorities
for 2018, writing “Act on evidence—to make schools work for all learners.
Use evidence to guide innovation and practice.” 5

Building evidence and properly validating a theory within a scientific disci-


pline, however, can take decades. It can then take more years to communicate its
premise to the wider community—notwithstanding those pockets who will inevitably
resist the idea of evolution. Meanwhile, as this process plods forward,
practitioners are anxious for improvement. So, they embrace theories that, on
their face, seem to make sense, even if there’s little proof to accompany them.
Commercial interests further complicate matters, as companies are often
quick to adopt popular theories, promote their unique value propositions, and
build technology around them—all before adequate research has concluded.

But the world isn’t so grim. The scholarly pursuit of learning science is in-
creasing. The National Academies recently released a sequel to their excellent
compendium, How People Learn. This new volume, How People Learn II,
published near the end of 2018,6 included new research on educational tech-
nologies, including findings on neurological processes, lifelong learning, and
the impact of social and cultural factors. There’s also growing awareness from
policymakers and administrators of the importance of learning science and
greater numbers of research programs at institutions such as the aforemen-
tioned Department of Education and World Bank.

In this chapter, we mix optimism with some healthy caution. In the next sec-
tions, we overview research that provides some guidance on designing for
technology-supported learning and practical best practices for establishing
associated design teams. We’ve omitted many quality theories, for the sake
of brevity, but will summarize a few of the most relevant to the design of
distributed learning. Our main goal is for readers to take away the ideas that
distributed learning theories exist, authors have taken steps to make them
accessible to practitioners, and new distributed learning systems—whether

concerned with the content development level or the enterprise infrastructure


level—should be informed by this work.

INSTRUCTIONAL THEORIES
As outlined in the preceding chapter by Art Graesser and colleagues
(Chapter 2), learning science theories have generally evolved with the zeit-
geist of cognitive science. Early educational theories followed the behaviorist
model, emphasizing drill-and-practice tactics, reward and punishment, feed-
back, and repetition. Cognitivist theories came next. In contrast to the be-
haviorists, cognitivists sought to understand the mind and apply principles of
cognitive processing to the design of learning content. A third prominent par-
adigm, constructivism, followed. Constructivists argued that humans create
rather than acquire information; it’s therefore impossible for some “correct”
understanding of the world to be transferred from one person’s memories to
another. Individuals must learn through engagement.7

As one might expect, each of these paradigms encouraged the development


of various instructional theories. Seeing the proliferation of competing the-
ories, Dave Merrill set out to evaluate and eventually harmonize the field.
His resulting work, First Principles of Instruction, had wide impact.8 For the
first time, a framework incorporated the breadth of theories—and all within a
concise set of principles. The inset below summarizes them, but we encourage
readers to read Merrill’s original article where he includes crisp guidance for
instructional designers on each.

In How Learning Works, Susan Ambrose and colleagues followed in Merrill’s


footsteps.9 They built on his First Principles and added to them new synthe-
sized research on teaching. Their subsequent framework includes seven cate-
gories, each with several underlying recommendations written specifically for
teachers and instructional designers.

1st PRINCIPLES OF INSTRUCTION (DAVE MERRILL)

Problem Centered – Engage learners in solving real-world problems
Activation – Activate learners’ relevant previous experience
Demonstration – Demonstrate what’s to be learned (don’t merely talk about it)
Application – Have learners use their new knowledge or skill to solve problems
Integration – Encourage learners to transfer new learning into their everyday lives

Both Merrill’s and Ambrose et al.’s work recommend that practitioners create
active learning environments. However, in practice, this suggestion is often
watered down, distilled to superficial criteria like measures of classroom at-
tendance or homework completion, or it’s otherwise simplified to proxy in-
dicators, such as attitude or interest. None of these truly meet the mark. As
Michelene Chi and her collaborators have observed:

In short, although “active learning” is a great idea for overcoming “pas-


sive learning,” we have identified three concrete practical challenges
that teachers may face when developing lessons that promote “active
learning.” First, broad recommendations such as engage students cog-
nitively, encourage meaningful learning, and get students to think about
it do not tell teachers how to create activities that overcome “passive
learning.” Second, teachers have few criteria to use in deciding which
are the best “active learning” activities to design and implement. Third,
there are no guidelines for teachers regarding how to best modify their
favorite existing assignments in order to optimize “active learning.” 10

Chi and colleagues developed the Interactive, Collaborative, Active, and Pas-
sive (ICAP) framework to provide guidelines for fostering active learning
environments. The ICAP categories describe hierarchical levels of cognitive
engagement, with “passive” learning typically producing the weakest learn-
Seven principles for smart teaching from Ambrose et al.

1 Learners’ prior knowledge can help or hinder learning


Teachers should talk with other instructors and use diagnostic tests of prior knowledge
to learn about their students. Be explicit to students about the connection between new
material and their prior knowledge; this aids long-term retention.

2 How individuals organize knowledge influences how they learn


It also affects how they apply what they know. So, make use of techniques that make
knowledge organization schemes explicit, such as concept maps. Look for patterns of
mistakes and misconceptions in learners’ conceptions.

3 Learners’ motivation determines, directs, and sustains learning


Help learners see the value in what’s being taught and how it helps their future
development. Provide authentic tasks with an appropriate level of challenge (simulations
and games are useful). Get learners to understand the reasons for success and failure.

4 Learners must acquire and integrate component skills


To develop mastery, learners need to practice integrating component skills and know when
to apply what they’ve learned. Be aware of expert “blind spots”—steps they perform
unconsciously and are, therefore, not well-articulated in instruction. Provide isolated
practice of component skills in diverse contexts and then facilitate the integration of
component skills in more challenging tasks.

5 Goal-directed practice with targeted feedback enhances learning


Phrase instructional goals in terms of capabilities rather than knowledge (refer to
Chapter 13,, in this volume, on competency-based learning). Provide time for deliberate
Chapter 13
practice, and pair this with feedback that focuses on specific items that need improvement.

6 The social, emotional, and intellectual context impacts learning


Learners’ current development is influenced by context. A positive and constructive tone
of communications within the learning community often improves learners’ motivation and
behavior.

7 Students must learn to monitor and adjust their own learning


Help learners develop metacognitive skills, such as self-monitoring. A malleable, rather
than fixed, perspective of intelligence can also be promoted and has been found to
influence performance.

ing outcomes and “interactive” learning often promoting the strongest. Inter-
active learning encourages learners to actively integrate new and prior knowl-
edge, draw inferences to fill knowledge gaps and confusions, and otherwise
enact strategies that build rather than merely rehearse knowledge, ultimately
supporting deeper learning and increased transfer to new domains. Notably,
this research highlights that it’s the way learners engage in different activities
that makes them more or less passive; learners’ engagement levels aren’t nec-
essarily “cooked in” to the instructional interventions, themselves.

Example of “watching a video” at various levels of engagement:

PASSIVE (Receiving) – Watching the video, without doing anything else
ACTIVE (Manipulating) – Actively engaging with the playback, such as rewinding and pausing; taking verbatim notes
CONSTRUCTIVE (Generating) – Explaining concepts from the video; taking paraphrased notes; contrasting the video to other materials
INTERACTIVE (Dialoguing) – Debating with a peer about the message in the video; actively analyzing the position of the video in a small group discussion

Example application of Chi and colleagues’ ICAP framework

Much of our preceding discussion has emphasized the science of teaching or


the practice of instructional design. However, as Ambrose and her coauthors
highlighted in their own work, recommendations on designing and delivering
instruction miss more than half the equation. Though tightly bound, learn-
ing and development are wholly different phenomena from education and
training. With this understanding, Ambrose et al. highlighted three critical
components of learning:

1. Learning is a process, not a product.


2. Learning involves changes in knowledge, beliefs, behaviors, or
attitudes, which must unfold over time.
3. Learning is not something done to others, but rather something
learners must do themselves.

Teachers, trainers, and instructional designers can’t directly manipulate



what’s happening in learners’ minds, but some theories give guidance on how
to encourage better learner processes.

Self-regulated learning theory, for example, describes learning processes


guided by the learners themselves and which are, at least partially, intrinsi-
cally motivated. At its most basic, self-regulated learning involves planning,
executing, and then reflecting on some activity. Hence, it involves the ap-
plication of metacognitive knowledge and monitoring skills, such as under-
standing different cognitive tactics and correctly identifying the difficulty of
different tasks.

Louise Yarnall and her colleagues describe self-regulated learning in more de-
tail later in this book (Chapter 15). In short, one way to envision it is as a cycle,
involving different phases that someone undertakes to strategically and inten-
tionally improve performance.11 These phases start with task definition, where
someone works to understand the problem at hand along with any available
resources. This is followed by a goal setting and planning phase, where learn-
ers establish objectives and select tools and strategies to meet them. Next, an
enactment or engagement phase occurs, where learners implement their cho-
sen strategies and attempt to perform the task. Finally, there’s an evaluation
or adaptation phase, where learners assess their actions and outcomes, and
revise their goals, plans, and strategies accordingly. Although these actions
are, by definition, learner driven, individuals without strong metacognitive
skills can still be taught them. For instance, teachers and trainers can provide scaffolds
to help guide learners through these self-directed learning processes.

In closing, this section has offered the barest summary of instructional


theories. Some other sources serve as useful supplements. Harold Pashler and
colleagues published seven principles for instructional strategies, including
recommendations for spaced learning, using worked examples in combination
with problem solving, combining graphic and verbal descriptions, integrating
abstract and concrete concepts, using quizzing and questions to eliminate
misconceptions, and supporting self-regulated learning by helping learners

allocate study time.12 Art Graesser built on prior work to define 25 principles
of learning (clearly an overachiever in learning frameworks!).13 These roughly
group into recommendations for reducing processing load, facilitating learning
by implementing strategies within (e.g., feedback and deep questions) and
around the learning content (e.g., testing effects and spaced learning), and
suggestions for helping learners understand the process of learning (e.g., self-
regulated learning and desirable difficulties). Finally, for a truly comprehensive
historic treatment, Peter Jarvis authored a three-volume set, beginning with
the book, Towards a Comprehensive Theory of Human Learning.14

INSTRUCTIONAL TECHNOLOGY THEORIES
Classic instructional theories emphasize learner-content, learner-teacher, or
learner-learner interactions. Starting around the 1960s, researchers be-
gan to also examine learner-interface dynamics, leading to unique pedagogies
for educational technology. Early work on instructional media involved com-
parison studies, often looking at technology-mediated versus traditional set-
tings. These found “no significant differences,” but this was the behaviorist era
and (as described below) instructors tended to employ the instructional media
in the same way they might deliver traditional teaching. In the 1980s, with
growing interest in the cognitive perspective, researchers began to look more
closely at media attributes and their interactions with individual differences.15

Coming out of this growing appreciation of instructional technologies, Rich-


ard Mayer published his highly influential Cognitive Theory of Multimedia
Learning. Multimedia learning combines more than one mode of information
presentation, such as visual images with narration, within a learning environment.
MAYER’S 12 PRINCIPLES OF MULTIMEDIA LEARNING

COHERENCE – eliminate extraneous information
SIGNALING – highlight essential information
REDUNDANCY – use graphics and narration (not on-screen text)
SPATIAL CONTIGUITY – put words and related pictures near each other
TEMPORAL CONTIGUITY – show words and related pictures simultaneously
SEGMENTING – present lessons in user-paced segments
MULTIMEDIA – words + pictures are better than words alone
MODALITY – use graphics and narration versus animations and text
PRE-TRAINING – start lessons with a quick refresher and an overview
PERSONALIZATION – use a conversational style, not a formal one
VOICE – narrate in a friendly human (not machine) voice
IMAGE – the narrator’s image isn’t needed on-screen
Mayer’s theory builds on core cognitive mechanisms. For instance, it acknowledges the limited capacity of working memory, assumes that learners have two cognitive processors that handle new information differently (an auditory and a visual processor), and that learners must be cognitively engaged to produce new knowledge structures.16

[Pull quote: …education doesn’t educate you unless it changes you. — Betty Lou Leaver, Ph.D., Director, The Literacy Center; Manager, MSI Press; Former Provost, Defense Language Institute Foreign Language Center]

Recommendations for technology-enabled instruction naturally followed from


these tenets. For example, given the limits of working memory, multimedia
learning materials need to moderate the amount of essential processing re-
quired by the learners depending upon their prior knowledge, experience, and
competencies. Furthermore, given our brains’ two processing channels, com-
plementary information should be delivered simultaneously to both to more
efficiently support learning. Many other design principles can also be derived;
these cluster under 12 principles, as summarized in the accompanying graphic.

Another uniquely technology-centric theory is described by the Substitution


Augmentation Modification Redefinition (SAMR) model, popularized by
Ruben Puentedura.17 It emphasizes a unique challenge with learning technol-
ogies: people often use them in ways that mimic traditional, formal education
settings—with classrooms, instructors, a fixed body of content to be learned,
and a fixed amount of time. This model
helps explain why, for instance, the first web-course designers attempted to
recreate printed texts online or why the original virtual classrooms took so
many cues from physical ones.

The SAMR model defines levels of technology use in teaching and learning.
The most basic, and most often implemented, level is substitution, where the

technology is used to perform the same task as was done before. For example,
an instructor uses PowerPoint to replace acetate slides or students use laptops
to replace paper notebooks. Alternatively, the highest level is redefinition,
where technology supports new learning tasks that were previously incon-
ceivable. This level represents the future of learning and is a foundational
reason for reimagining instructional design.

Technology is changing the way we live, and future instructional technol-


ogy theories should reflect new approaches to learning, including in indi-
vidual, social, and lifelong learning contexts. However, many of our current
best practices were developed before this digital explosion, leading us to ask,
“How will we transform our current models for learning and not just consider
how we make incremental improvements to the traditional approach?”

SAMR
The SAMR model highlights our tendency to use new technologies in old-school ways.

REDEFINITION
Technology enables new tasks, previously inconceivable

MODIFICATION
Technology enables significant task redesign

AUGMENTATION
Technology acts as direct substitute, with functional improvement

SUBSTITUTION
Technology acts as direct substitute, with no functional improvement

VISION FOR THE FUTURE OF LEARNING THEORY
One challenge with learning theories is that they’re prone to focus just on the
design, delivery, and evaluation of instruction. Even with additional consider-
ation for technologies used for learning, we’re still omitting part of the puzzle.
Earlier in this book, Walcutt and Schatz outlined six elements that need to
be considered for the future learning ecosystem: technology infrastructure,
design, commitment, governance, policy, and human infrastructure. The con-
struction of learning elements—including those theories covered so far in
this chapter—falls into their “design” category. Undoubtedly, the careful de-
sign of learning content, associated delivery and evaluation techniques, and
learner-support methods are critical. However, the other elements in this
framework also warrant consideration.

Certainly, Walcutt and Schatz aren’t the first to suggest a wider aperture.
Badrul Khan,18 for instance, proposed an eight-dimension framework for
e-learning, comprised of institutional, management, technological, peda-
gogical, ethical, interface design, resource support, and evaluation factors.
Shahid Farid and colleagues built on Khan’s work.19 They use empirical data
from stakeholders about roadblocks to e-learning in postsecondary environ-
ments. Farid et al.’s model includes software, technical, institutional, person-
al, and cultural dimensions. Beatrice Aguti and her colleagues also devel-
oped a broader model for higher-education contexts, but this time for blended
learning. Their framework has four dimensions, including e-learning course
delivery strategies, e-learning readiness, quality e-learning systems, and ef-
fective blended e-learning.20 For our purposes, we’re less concerned with the
potential similarities and differences of these various frameworks. Our point
is simply that learning—and particularly technology-enabled learning—hap-
pens within a broader context.

From this broader perspective, it’s clear successful distributed learning
enterprises will rely upon interdisciplinary, effective teams of
practitioners. Where teachers once presided over their classes or principals
over their schools, the emerging learning ecosystem has less defined
boundaries, and it also relies upon a greater diversity of expertise (as
described in more detail in Chapter 19, which discusses learning engineers).

Take care to avoid the Everest Syndrome—the urge to embrace new instructional
technology just because it’s there.

Successful, future (distributed) learning will be developed by organizations
able to build and support multidisciplinary teams. For instance, in lieu of an
isolated instructor, we could imagine a team of three to five members working
together to develop learning experiences. This team might involve an instruc-
tor or content expert, an instructional design or learning science practitioner,
a technology expert, and perhaps even a data scientist. Additional members,
such as usability experts and psychometricians, might also be required.21
Finally, to be truly successful, there needs to be a larger learning organization
(administration) in place to facilitate interactions and coordination.22

This new team structure will also require strong leadership.23 Leaders respon-
sible for learning will need awareness of the expertise available to them and
know how to integrate different kinds of expertise into learning development
processes. They’ll need to understand evaluation, at multiple levels (such as
within the content, to assess learners, but also at an institutional level to
evaluate the learning experience itself), and they’ll need to consider broader impli-
cations, such as privacy, ethics, and social factors. During learning design and
development phases, leaders will need to look for efficiencies. For instance,
they’ll need to embrace the reuse of learning materials, looking for ways to
reduce the cost of development efforts by reusing already developed content
elements, technologies, or tools.

Thus, learning leaders should continually ask themselves questions, such as:

• Do we have all the specific expertise on our team to meet our goals?
• Is the team working effectively as a community with shared purpose?
• Are we making good use of existing reusable resources and tech?
• Are our evaluation processes (at all levels) the best we can achieve?
• Are we aware of the evidence in support of each instructional resource,
method, or technology we use?
• Do we have someone capable of interacting with the output of the
learning science community to identify relevant knowledge that can be
adapted into our process?

The development of instructional materials has sometimes been compared to
software development.24 The development of software in the early days of the
personal computer involved one or a few individuals crafting an application
with a primary focus on function; however, modern software development
involves large teams of different specialists (e.g., software architects, soft-
ware engineers, user-experience designers, cybersecurity specialists) working
together and collectively considering a broad range of design attributes (e.g.,
functionality, security, aesthetics, usability). Modern software developers
are also comfortable with the idea of reuse and “mashups” (combining data
or functionality from different sources). Numerous repositories of reusable
code are readily available on the internet. Also, connections called Applica-
tion Programming Interfaces (APIs) allow different operational software plat-
forms to share data with one another, enabling sophisticated functionality,
such as Google Maps, or up-to-the-minute data, such as from the U.S.
Government’s data.gov, to be embedded in any other application. However, the same ethos
isn’t always found in modern instructional development—both the organiza-
tional dynamics of multidisciplinary instructional teams and the infrastruc-
ture needed to share and integrate learning materials need to be cultivated.
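
To make the mashup idea concrete, the sketch below shows the basic pattern in
Python. It is purely illustrative: both endpoint URLs are hypothetical
placeholders (not real data.gov paths), and a real integration would add
authentication, caching, and fuller error handling.

```python
# Minimal mashup sketch: pull JSON from two unrelated web APIs and merge the
# results into one view. Both URLs below are hypothetical placeholders.
import requests

def fetch_json(url, params=None):
    """GET a URL and return its JSON payload as a Python dictionary."""
    response = requests.get(url, params=params, timeout=10)
    response.raise_for_status()  # surface HTTP errors instead of hiding them
    return response.json()

# Hypothetical open-data catalog and geocoding services
datasets = fetch_json("https://example.gov/api/datasets", {"topic": "education"})
geocode = fetch_json("https://example.org/api/geocode", {"q": "Washington, DC"})

# The "mashup": combine data from independent sources into a single structure
combined = {
    "education_datasets": len(datasets.get("results", [])),
    "map_location": geocode.get("coordinates"),
}
print(combined)
```

The same pattern, small documented interfaces returning structured data, is
what would let learning platforms share content and learner records as
readily as software systems share maps or weather feeds.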

However, promoting successful interdisciplinary teams is challenging, not
because adequately skilled individuals are unavailable but because they often
lack teamwork and collaboration skills—skills which learning professionals
are rarely taught explicitly.25 Thus, a key step towards achieving the future
learning ecosystem will involve the maturation of organizational processes,
teamwork-focused professional development for various contributors, and a
culture shift—similar to one that happened within software engineering.26

90% of students were highly engaged when taught via well-designed,
service-learning methods

Projects That Work is an ongoing research study with the goal of providing
teachers data-driven information to make decisions to use service learning
flexibly, efficiently, and effectively. The premise is that if schools and
teachers have continuously updated lists of projects that were highly rated
by 20 or 25 previous classes around the country, these projects would (a) be
known to teachers and (b) could be replicated, providing all students the
opportunity to realize the potential of what service learning has to
offer.…Preliminary findings revealed that about 90% of students were highly
engaged by service learning and produced positive results from many types of
service learning projects. Many of the findings to date echo prior research
demonstrating the role of well-designed programs that include specific
activities to prepare students with a clear and compelling rationale for the
project and with specific roles and responsibilities. The key to replication
in schools with less expertise in service learning may focus on teachers
having information on key components of projects. It’s important to ensure
that the projects are feasible for teachers and students to do, and that they
lead students to believe that they’re making a difference and to perceive
that they’re learning.

Edward Metz, Ph.D.
Projects That Work

On the content-sharing side of the equation, we’ve already seen significant
efforts to encourage reuse in instructional development, but so far, these
have met with limited success, particularly compared to the level of reuse
and data-sharing that happens in software development. Roughly two decades
ago, SCORM was developed to help facilitate learning content reuse, and there
have been multiple attempts to build repositories of reusable educational
resources, such as MERLOT.27 Another, more recent repository, the Open
Educational Resources Commons,28 offers curated content with open licenses;
it also encourages co-creation and participation by users.

We used AutoTutor for the Office of Naval Research and put it into ALEKS, a
commercial adaptive learning system. It went ok, but then we tried to do a
scale-up in a school district. We were able to get a big teacher preparation
session. They were reasonably optimistic. The strategy was to let them use
ALEKS on their own before getting AutoTutor. We found that initially a lot of
people liked it, but then they had school vacation and then after that they
had a huge snow storm and were out for about 8 days of school. Then they had
a very short time for standardized testing for the state (about 5 weeks) that
resulted in universal attrition. In talking with the teachers, they had to
teach to the test, but ALEKS is based on mastery learning. It won’t allow you
to do topics you’re not ready for.…From a learning perspective, it makes
sense, long-term, but teachers have many logistical needs that aren’t
directly represented in adaptive systems. They have to have the kids know
information/knowledge at a certain time whether or not the student is
technically ready for it—even if they aren’t going to remember it. Their
knowledge repository might collapse later because they didn’t get the
foundational information when they needed it, but it’s what they needed for
the test.

Benjamin Nye, Ph.D.
Director of Learning, Institute for Creative Technologies,
University of Southern California

Newer repositories are now integrating evidence in support of the assets
provided to the community. The What Works Clearinghouse from the Institute
of Education Sciences at the U.S. Department of Education is one example
of a research-evidence repository.29 This clearinghouse identifies stud-
ies with credible and reliable evidence of effectiveness, and it disseminates
free reports and summaries on its website. The What Works Clearinghouse
currently has over 700 summaries on effective educational innovations and
over 10,000 reviewed studies available in its repository. A number of simi-
lar government-sponsored research communities can also be found, such as
CLEERhub30 for National Science Foundation research on Engineering Edu-
cation; the National Academies Press, with open-access e-books on hundreds
of topics, including Behavioral and Social Sciences and Education;31 and the
Defense Technical Information Center for military-funded research.32

CONCLUSION
In summary, extensive research has been conducted to inform instructional
theory, but there continues to be a gap between scholarly findings and their
practical application. However, there are many excellent resources for teach-
ers, trainers, instructional designers, policymakers, and administrators. Un-
fortunately, many of these resources still assume that learning will occur un-
der traditional (Industrial Age) conditions; so, consider them with caution.
Some theories have been developed specifically with instructional technol-
ogies in mind. Seek these out, but also remember it takes years to properly
validate a theory; so, watch out for hype, particularly when commercial profits
or someone’s reputation is on the line. Also, when designing for
technology-supported learning, use a measure of creativity to avoid
succumbing to the “just substitution” mindset. Similarly, be willing to
rethink the design,
delivery, and coordination of learning processes. Emerging technologies are
radically changing the ways we train, educate, learn, and develop, and they’re
similarly changing the ways learning professionals operate—embrace teams,
seek out shared materials, and cultivate a culture of reuse.
Education in the future will be more of an iterative process.
Currently, people pursue their education at the beginning of
their lives and then they go to work. Education beyond that initial
period typically only happens because of some disruption in their
lives—they lose their job or other changes in circumstances. It’s
difficult to access at that stage in life, but in the future, while you’ll still
have early-life education, it might look a bit different—with more of
an emphasis on work-ready skills and learning to continuously learn.
There will also be many more opportunities for dipping in and getting
back out of the workforce throughout someone’s life. Education will be
more just-in-time and based on the needs of the moment. Technology
will support that, but it requires significant change in the way
education institutions operate and the way employers do things.

Martin Kurzweil, J.D.
Director of the Education Transformation Program, Ithaka S+R

CHAPTER 4

LIFELONG LEARNING
J.J. Walcutt, Ph.D. and Naomi Malone, Ph.D.

The world has progressed in so many ways over the past 100 years, yet our ed-
ucational structures have stayed relatively unchanged. Incremental progress
has certainly occurred, to include improvements in classroom organization
and information delivery, but the developmental models, progression of for-
mal educational offerings, and recognition of learning via grades and degrees
have proven resistant to change. As a society, we still focus on controlled
settings for learning and group-based information delivery. The sequence is
linear, the instruction is split into finite end-points, and the whole process is
assessment-oriented.

We rely on outmoded developmental models (such as Jean Piaget’s stages of
cognitive development) and use a failure-focused mindset when measuring
learning; that is, students’ developmental speed and depth of knowledge are
judged against expected averages, largely defined by age-based phases. In a
K–16 setting, those who fail to conform to expectations are “behind in their
development,” and in workforce or military settings, those who lag are judged
as incapable, unmotivated, or possessed of other character flaws. We assign
grades based on achievement and determine progression through the system
based on time factors, such as credit hours or classroom attendance, along
with single-point, high-stakes testing. Similarly, we make strategic-level cur-
riculum decisions based on these goals, such as how to achieve increased
seat-time or time-on-task, assuming that more time spent learning will result
in improved outcomes (even though data suggest that students need non-in-
structional assimilation time and varied experiences to aid comprehension,
and that learning needs to be context-based).1

In the modern world, we’re inundated with unlimited, unfiltered, and
unmanaged data; individuals can learn anything they wish, but we run the very
real risk of increasing low-level information acquisition to the detriment of
higher-order comprehension.

We largely place students in controlled settings (classrooms), where infor-
mation is filtered by a teacher or curriculum designer to ensure its accuracy
and intelligibility, where goals are clearly defined, the level of information
provided is appropriate for learners, the pace is controlled, and someone is
available to help monitor the informational content and its delivery. In many
ways, this is where we’ve seen improvement in learning over the last century.
Many of the advancements in instructional theory have focused on formal
learning experiences, and teachers and administrators made efforts to bring
those findings into the classroom.2

However, learning isn’t confined to the classroom. The world outside the
schoolhouse is filled with limitless sources of potential learning. We’re in-
creasingly exposed to torrents of data, questionable “facts,” and diverse un-
connected information. It’s incumbent upon the individual—the learner—to
determine the value of that information and how it connects to other data or
experiences. The speed and diversity of information in our modern world impact
our abilities to synthesize useful knowledge, effectively retrieve it, and
translate or apply it in practice.

Information overload is a significant and growing issue; volumes of data are
bombarding people at ever-faster, never-ending rates. When exposed to too
much data, the human brain will tend to focus on the clearest, easiest to un-
derstand, most familiar elements—and discard the rest.3 It’s the body’s nat-
ural way of functioning in a focused and emotionally stable state. However,
in today’s data-rich climate, this sometimes means retention of false or mis-
leading information, which can lead to poor decisions at both individual and
collective levels. Thus, as the world continues to become increasingly vola-
tile, uncertain, complex, and ambiguous, we need educational practices that
ensure people are prepared, not only for today’s classroom but for tomorrow’s
global landscape.

That preparation doesn’t end at 18 or 25 (or even 100!) years of age. With
increasing average lifespans4 and the worldwide pace of change, continuous lifelong

learning has become a necessity. New inventions create or destroy whole in-
dustries each year, and AI is altering the nature of work in fundamental ways;
add to that increasing lifespans and the evolving view of employee-company
permanency. All this means that many people will change careers—not just
jobs—multiple times within their lives.5 Thus, we need to expand the time-
frame of learning beyond K–12 and even beyond traditional higher education
and vocational schools. While these forms of formal, developmental educa-
tion are likely to persist for some time, we can expect more learning to occur
later in life—in the 30 to 65 age range.

It’s time to change course by moving away from incremental improvements to
our existing education system and instead, reimagining how foundational
scientific principles can inform a new model of learning—one that spans the
lifetime.

LIFELONG LEARNING VISION


Our vision for lifelong learning takes a more naturalistic perspective,
acknowledging that learning is pervasive. It happens all the time and
everywhere: in the classroom, online, at home, and through lived experience.
Learning is personal, changing in form based on the unique personality,
interests, skills, attributes, circumstances, and beliefs of each individual.
It’s fluid and nonlinear. Various subjects don’t exist in distinct and
disconnected packages; instead, diverse concepts can be learned together.
It’s flexible. People can achieve success in countless ways via
individualized learning trajectories that maximize their unique potential,
rather than boxing them into a finite set of “accepted” developmental boxes.
It’s holistic. Future learning experiences will reach beyond the cognitive
domain to emphasize the whole person, including their social, emotional, and
physical development. Education will be designed to help cultivate people who
can thrive in a complex and chaotic future, rather than simply ushering them
through the linear, K–12 milestones we have today.

INDUSTRIAL AGE PAST vs. FUTURE

FOCUS
Past: Mastery of knowledge and skills (mostly cognitive and psychomotor)
Future: Holistic development across facets, merging cognitive, physical, social, emotional, and so on

EDUCATOR
Past: Expert authority figure; learning designer and director
Future: Facilitator, mentor, and coach, within a larger, connected network

EXPERIENCE
Past: Mostly structured, often passive and linear, with summative assessments
Future: More personalized and active, with a greater formative focus

TIMING
Past: Discrete, episodic, largely age-based (K-12, higher education, career training)
Future: Continuous lifelong learning, integrated across experiences

ACCESS
Past: Limited access choices, usually either in a face-to-face setting or online
Future: More diverse and blended choices, truly enabling “anytime, anywhere”

TECHNOLOGY
Past: Dedicated systems in silos, often focused on formal learning
Future: Distributed systems-of-systems, an interconnected ecosystem

4 KEY TENETS: LIFELONG, HOLISTIC, UBIQUITOUS, AND ASSET-FOCUSED
Our lifelong learning model includes four main principles. First, as its name
implies, it considers learning a continuous, lifelong experience. Today, we
tend to view learning in discrete developmental phases—early childhood,
then K–12, and, finally, higher education or workforce training. In the future,
we’ll view learning as an ongoing process, where information is constantly
synthesized, all the time and from copious sources. The second tenet of this
model is that learning isn’t constrained to cognitive development. Rather, we
must recognize learning as an interplay among cognitive, social, emotional,
and physical skills, attributes, and capabilities. Third, learning involves a mix
of formal, nonformal, and informal activities. Today, we primarily measure
and accredit knowledge and skills acquired in formal settings and assessed
within similar structures. However, in the future, life experience and indepen-
dent, informal learning will also be measured and recognized as much as—
or, in some cases, more than—formal learning. As our capacity to measure
learning and experience improves, we’ll also be able to examine individuals’
experiences more systematically, to better understand what they know, com-
prehend, and are capable of achieving. Finally, this is an asset model, not a
failure model. This means learners of all ages are viewed through a lens that
considers where they are today and where they’ll grow to tomorrow.

Each of these tenets is described in more detail below.



OECD LEARNING FRAMEWORK 2030 (www.oecd.org)

The Learning Framework 2030, from the Organisation for Economic Co-operation
and Development, defines a vision and underlying principles for the future of
educational systems. Still a work-in-progress, the framework is being
developed by a community of experts, school networks,
teachers, students, youth groups, parents, universities,
local organizations, and social partners. Its vision is to help
every learner develop as a whole person, able to fulfill his
or her potential and contribute to worldwide wellbeing.
The current version of the framework emphasizes:

• New solutions for a rapidly changing world with diverse global challenges
• New transformative competencies for innovation, responsibility, and awareness
• Learner agency—the responsibility for one’s own education throughout life
• A new, broad set of desired knowledge, skills, attitudes, and values
• Individual and collective educational goals for wellbeing
• Design principles for eco-systemic change

1. Learning is lifelong

Although 90% of brain volume is attained by age 6, learning occurs across
the lifetime and continues to affect the brain’s capabilities. Certainly, early
childhood experiences impact individuals’ ability to compensate effective-
ly as they age.6 However, research on neuroplasticity demonstrates that the
brain can reroute information and make up for trauma due to brain injury.
Essentially, people can gain or regain skills otherwise lost during the trauma.7
There’s also significant evidence that neural development continues through-
out the lifespan.8 Although cortical thickness, mass, and connectivity seem to
decrease with age, adults can compensate by activating interdependent neural
mechanisms gained from life experience. In other words, although the brain
develops most rapidly in childhood, learning can effectively occur throughout

life and is shaped by individuals’ behaviors.9 What and how much individuals
learn depend on a variety of micro- and macro-level factors. Micro-level fac-
tors include individual choices, motivations, and the ability to self-regulate,
particularly outside of formal education settings. Macro-level factors include
learners’ neighborhoods, societies, and cultures.

Some of these factors make adults particularly well-suited for learning. Clar-
ity of interests and goals, and greater self-awareness make this time-frame
conducive to personal growth and often encourage a greater motivation to
learn. Adults also have a greater wealth of experiences to draw upon, which
can help them synthesize new information more deeply and efficiently.10 How-
ever, placing the control of learning into adults’ own hands may encourage
them to focus too narrowly on limited, task-specific forms of learning. We’ll
need structures that protect and support a comprehensive view of learning.
Otherwise, we risk having deep experts embedded within stovepiped knowl-
edge communities who lack a general understanding of how the pieces fit
together to work within a holistic, efficient system.

2. Lifelong learning must encompass whole-person development

The ability to effectively participate in life is not exclusively determined by
one’s cognitive abilities or educational attainment. Rather, resilience, motiva-
tion, circumstance, exposure, metacognition, self-regulation, and other per-
sonal attributes contribute to a person’s ability to navigate life. This position
is strengthened by the finding that “brain development and cognition (and the
connectivity between cortical areas) are influenced and organized by cultural,
social, emotional, and variability in learning.” 11

In other words, whole-person development necessarily incorporates cognitive,
social, emotional, and physical capabilities, and these are, in turn,
influenced by cultural systems.

We’re training people for jobs that aren’t going to exist anymore.

James Robb
Rear Admiral, U.S. Navy (Ret.)
President, National Training and Simulation Association

Harvard’s 60-Year Curriculum initiative encourages a new paradigm of thinking
about learning and the education process. It recognizes that people learn
throughout their work lives—and often beyond, into retirement.

It’s just a subset of the larger territory that we’re looking at; it’s an under-
appreciated subset but important for our economy and civic health. We
need to recognize that the world is changing and that we don’t leave people
out to dry because their first career fizzled out and dried, and we didn’t have
a mechanism to help them out. Under the spotlight, we have K–12, higher
education, and retirement, but when you have a career change and the world isn’t
helping you, it’s murky. We held a conference recently focused on the concept
of education ages 15–75. We asked, “How do we make that a different span of
life during which people feel supported? Do we need unemployment insurance?”
We’re interested in figuring this out. For example, what if I’m really struggling and
I don’t know if I want to be a researcher or a designer? The real question now is
what do you want to be first? We didn’t have those dialogs in the past; it’s totally
different now.
Christopher Dede, Ed.D.
Wirth Professor in Learning Technologies
Harvard Graduate School of Education

We will need new models of learning and theories of development to effec-
tively address the “whole-person” learning paradigm. To date, much of the
human-development research has focused on the early stages of life (prior to adult-
hood). As we move away from a front-loaded notion of education and towards
a lifelong learning concept, we’ll need to expand this body of research to
incorporate adult learning, changing societal conditions, and the goal of de-
veloping more holistic capabilities across time and space.

COGNITIVE DEVELOPMENT

Although mature theories of cognition and learning already exist, these will
need to be expanded and potentially reevaluated within the future lifelong
learning model. Discussions of cognitive development usually point back to
the foundations built by Jean Piaget (1936) and Lev Vygotsky (1978).12 Piag-
et’s theory of cognitive development defined four critical periods in which a
young child develops sensorimotor intelligence, preoperational thought, con-
crete operations, and, finally, formal operations. Interestingly, the final stage
spans ages 11 to adulthood. People who reach this final stage (and not all do,
according to Piaget) are able to think abstractly. Since we now know that
learning occurs throughout an entire lifetime, what happens after reaching
this stage? Vygotsky’s sociocultural theory of cognitive development offers
some answers; it focuses on a person’s journey to individualized thinking
through a co-constructed process of social and cultural interaction. Therefore,
the individual learns either by using self-regulatory tools (e.g., self-speech) or
by observing and/or taking direction from others. Though both Piaget’s and
Vygotsky’s theories recognize the interplay between self-development and di-
rected learning, they take some opposing views; neither accounts for develop-
ment across the lifetime, and neither considers how a person can achieve a set of
meta-skills across disciplines, experiences, and formal and nonformal learning.

Further, technology is changing the nature of human cognition. We can now
offload data storage and “lower-order” cognitive tasks to computers,
aggregate and analyze large sums of information as never before possible,
and access content ubiquitously. These affordances create the opportunity to
exponentially accelerate human cognitive development, both in time and scale.
For example, if human brains have finite working memory capacities,13 then
computers can expand this—to not only enable humans to work with more
information (without task shedding) but to also better digest and comprehend
wider amounts of information simultaneously. As another example, since humans
are highly influenced by life experiences and a computer can provide
opportunities to experience simulated situations, we can expand our store of
experiences in significantly shorter amounts of time, benefiting from what
might be called “unlived experiences.”

Key hurdles in developing today’s students to be ready for life include a
lack of early childhood experiences and foundational language that can serve
as a springboard for later learning opportunities. Expectations are not
always where they need to be, from teachers or leaders; consistently higher
expectations are needed.

Nathan Oakley
Chief Academic Officer, Mississippi Department of Education

Computers are augmenting human cognitive development, not merely by
increasing access to information but by also affecting our brains structurally
and neurologically. Across its lifetime, the brain will continue to develop and
learn but also, as new generations are born, they will increasingly have the
benefit of access to accumulated knowledge and experiences of those who
came before. In a twist of irony, while the stage theories of Piaget and
Vygotsky have been superseded, the basic belief that cognitive development is
a mixture of natural human capacity building and social-historical influence
remains correct. What they didn’t foresee was the expansion of capabilities
that human-computer interactions could achieve.
WE AREN’T BUILDING RESILIENCE IN AMERICANS ANYMORE.
It starts with little kids, as little as 6 months old. We used to give them a spoon
and a pot, and they were creative with what they had. Now they’re given little
kid toys—each toy has one function. These toys have pre-thought goals, and by
providing them, we’re taking away kids’ creativity. Resilience is not fortitude; it’s the
creativity to find your way out of a hard situation. It also isn’t singular; it’s social and
emotional. We can hide feelings but that’s uncomfortable. Instead, we have to learn
that emotions should be managed; there are times we should be mad and times we
shouldn’t, and we should know the difference.
Betty Lou Leaver, Ph.D.
Director, The Literacy Center; Manager, MSI Press; Former Provost,
Defense Language Institute Foreign Language Center

SOCIAL DEVELOPMENT

Like cognitive developmental models, the lifelong learning paradigm requires
that social developmental theories be expanded. Social development research-
ers have primarily studied younger ages14 or special needs populations.15 No
doubt, developing social skills in young people is a worthy goal; however, re-
searchers have focused on these populations while sparing much less attention
for older populations and lifelong social learning. A body of research does
exist around interpersonal employment skills, but social tendencies, changes,
growth, and goals across the lifetime require more attention.

A recent 20-year retrospective study in the American Journal of Public
Health found participants with higher “social competence” characteristics,
such as sharing and cooperating, were more likely to have higher educational
attainment and better-paying jobs.16

In the lifelong learning model, there’s an expectation that formal education
will evolve to encompass these skills, in both the formative and later years of
life. Additionally, we expect that resumés will acknowledge these skills in the
future. If we’re going to reframe our focus from creating workers to develop-
ing whole persons—individuals who can be successful across life experienc-
es—social skills figure prominently into the holistic model. That means not
only understanding lifelong social skills and finding ways to cultivate them
but also rewarding individuals who possess them.

EMOTIONAL DEVELOPMENT

A prominent model of emotional development, developed by Carolyn Saarni,
measures emotional competence as a set of affect-oriented behavioral, cog-
nitive, and regulatory skills people develop over time within their social en-
vironment.17 These skills include individuals’ awareness of their own emo-
tions, ability to discern and understand others’ emotions based on situational
and expressive cues, and capacity to cope with distressing emotions using
self-regulatory strategies. Similar to Piaget’s and Vygotsky’s models, Saarni’s
model uses phases to categorize the developmental process, and it only ad-
dresses development from early childhood to adolescence. Moving toward a
lifelong learning model of education requires more research on adult emotion-
al development as well as the impact of individuals’ emotional wellbeing (e.g.,
mental health, and ability to deal with stress) across all ages.

The Collaborative for Academic, Social, and Emotional Learning,18 a not-for-
profit dedicated to enhancing social and emotional learning, recommends a
more robust model that integrates intrapersonal, interpersonal, and cogni-
tive competence. It includes five key areas that encompass various behaviors,
mindsets, strategies, and skills:
• Self-awareness, such as accurate self-perception and self-efficacy
• Self-management, for instance, impulse control
• Social awareness, including empathy and respect
• Relationship skills, such as teamwork and communication
• Responsible decision-making, including reflection and ethics

Research suggests that early emotional regulation skills have a significant im-
pact on development and outcomes in later life.19 For example, emotional reg-
ulation is part of the spectrum of skills needed to be successful in the class-
room. The emotional regulation and interpersonal strategies children develop
in early years allow them to navigate the school system, and more than that,
these skills become key tools for success in life—arguably more than the ac-
ademic knowledge itself. But can these skills be taught? Substantial evidence
exists20 that suggests: yes. Explicit teaching of social and emotional skills
leads to better interpersonal skills and decreased anti-social behaviors, and
it also improves students’ academic achievement. The interactions between
social and emotional development and outcome performance make sense. For
instance, consider that distractions of any type during learning, including
internal anxiety, stress, or personal or professional challenges, can detract
from one’s ability to acquire and encode new information. However, emotional
regulation, resilience, and persistence can improve both learning as well
as decision making under stress.21 Accordingly, emotional regulation skills
developed early can improve long-term functioning and can also be improved
with time, experience, and formal education. Nonetheless, more inquiry is
needed to examine how such capabilities directly impact adult performance
and lifelong learning, and importantly, improved developmental metrics and
instructional approaches are needed for honing these skills in life.

There’s a significant emotional impact of constant change and intellectual
learning required for a multi-career expectation. We’re living in constant
acceleration, and we’re trying to keep up with it. So, the question is:
What’s the foundation we need to provide people so they can thrive on chaos?
Some of the answer lies in really raising what we think about
teamwork—something the military studies very deeply. The team becomes the
buffer on which the group defends.

Christopher Dede, Ed.D.
Timothy E. Wirth Professor in Learning Technologies in the Technology,
Innovation, and Education Program, Harvard University

PHYSICAL DEVELOPMENT

Formal investigation into motor and physical development traces its founda-
tions to the 1920s, when doctors began weighing infants to determine if they
met appropriate growth benchmarks.22 More significant research began in
earnest in the 1970s and 1980s, spurring major advancements in the under-
standing of average motor development, constraints both within and external
to a person, and the benefits of aiding, enhancing, and improving motor skills.
However, like other developmental domains, much of the research in physical
development has been limited to early childhood and disorders, with some
unique focus areas for special populations such as sports and military per-
sonnel. Yet, beyond the scope of these specific groups, general physical mat-
uration and the impacts of motor skills and practice have been less studied,
although that is changing.

As a society or culture, if we think about reimagining learning as a lifelong
endeavor, we’ll help so many kids. We need to get out of the structure of
grades and look at learning as “I’ve mastered it now, or I haven’t yet.” We
need to tailor education.

Michelle Cottrell-Williams
Teacher, Wakefield High School, 2018 Virginia State Teacher of the Year

Body development, awareness, health, and wellbeing have large impacts on
long-term functioning. Increasingly, improved methodologies and new
technologies are creating ways to better understand how a body develops into
and across adulthood, how physical capabilities can be honed, and how these
connect to other developmental domains such as emotional stability, social
capabilities, and cognitive development.23 Simultaneously, wearable devices
and the so-called quantified self24 have created enthusiasm about improving
physical activity and the nuances of each individual body. They give
individuals access to personalized data that was previously unavailable,
empowering people to make improved decisions about their health and physical
activities as a result.25 The medical ben-
efits of these technologies have not yet
been fully understood at a societal level,
nor have they been fully utilized for opti-
mizing human motor capabilities outside
of specific, controlled settings, such as
Olympic athletic training. However, as the
research continues, it’s not unreasonable
to believe a new theory of physical and motor development that encompasses
average, lifelong populations will be forthcoming—one that actively incor-
porates considerations for human-technology interaction, the processes and
impacts of physical development across society, psychophysical literacy, and
the interplay of motor development with social, emotional, and cognitive de-
velopment.

Understanding, philosophically, the holistic connectivity of human
capabilities, and how behaviors are enacted across contexts, will be
important within a whole-person development model.26 A better understanding
of the self, to include the physical self, is needed to achieve more
holistic, personalized developmental trajectories.

3. Learning is ubiquitous

Lifelong learning comprises all phases of learning and stages of life, and it
occurs across diverse contexts, from school to the workplace, at home and
within the community.27 Lifelong learning activities can happen in formal
settings (e.g., courses offered by a university), nonformal contexts outside of
fully structured institutions (e.g., meet-up workshops), and in informal and
spontaneous ways (e.g., while chatting with a co-worker or reading a post on
social media).28

Learning already occurs in all of these ways, all the time, and everywhere. To
date, however, we’ve largely documented (and, subsequently, largely valued)
only formal learning experiences. Informal and experiential learning can have
as much, or even more, impact on individuals’ abilities to acquire, assimilate,
and apply knowledge. With the development of data science, machine learn-
ing, and interoperable data standards that allow us to measure and classify
experiences, we’re unlocking the ability to better capture and communicate
a person’s true skill level as well as his or her ability to perform in a variety
of settings and across communities. It’s irrelevant where a person “learned”
something—the transfer of that learning into practice is what matters.

The idea that learning happens everywhere and all the time isn’t new. Rather,
it’s our ability to measure it and communicate about it (e.g., through compe-
tency badging and credentialing) that’s novel. This also ties to the whole-per-
son principle described in the preceding subsection. That is, various skills
contribute to someone’s success in the world. In military contexts, for exam-
ple, there’s much talk of grit and resilience, and in higher education, we often
reference executive functioning and well-roundedness; however, such capaci-
ties are rarely measured or reported in transcripts and personnel records.
Assessing their applications in real-world contexts and giving “credit” for
other lived experiences will also enable us to create personalized learning
trajectories, improve talent management into the future, and create equitable
opportunities for more people.

STACKED CREDENTIALS: For example, in business, a student takes three courses
and completes them satisfactorily, and they earn a certificate in the area of
finance. After navigating that successfully, they take three courses in
marketing and receive another certificate. Then those groups of certificates
are stacked into a personalized master’s degree. This approach allows the
student to acquire credentials in bite-sized chunks and offers more
flexibility.

David Munson, Ph.D.
President, Rochester Institute of Technology

4. Lifelong learning must employ an asset model

In developmental psychology, an “asset model” refers to an approach that rec-
ognizes individuals’ unique assets and focuses on adding capabilities to them.
This concept is compared to the “deficit model,” which focuses on areas of
weakness and involves comparing individuals to group averages. The benefits
of using an asset model are several-fold. First, there’s a psychological benefit
in the form of increased energy and the improved outcomes that result when
a positive focus is used for learners. This can be seen in sports psychology
in relation to performance on the field29 and is directly translatable to the
classroom or boardroom. Building people up to an optimal capability is far
more encouraging than forever attempting to “fix” them.

Second, asset models help support whole-person development. Asset models
better allow for the inclusion of skills and attributes outside of those measured
on averaged, norm-referenced assessments. By looking at these other factors of
success, we can better recognize, help develop, and otherwise enable such skills.

Finally, an asset model can better support a focus on continual, lifelong
learning. The structure of this type of model naturally defines success at
every level, with every addition, and yet offers an infinite number of skills
and competencies that one can attain. Reframing both the learner and the
educational system in this way can help us reimagine how to improve the
system and work toward the optimization of each individual, rather than
focusing on creating able workers ready for an industrialized nation.

IMPLEMENTATION
The previous section outlined a vision for lifelong learning in the future. This
section outlines specific steps we can take towards that vision.

USE MULTIPLE THEORIES TO INFORM EDUCATIONAL DESIGN. Lifelong
learning means learning across time, space, purpose, media, and formality.
We’ll need to transition this strategic-level concept into tactical-level
interventions for classrooms, workshops, training exercises, experiential
learning, and other formal and nonformal activities—implementing and
integrating theoretical approaches from multiple disciplines, including
instructional design, information management, and educational psychology.30

STACK CONTENT-SPECIFIC, CONTENT-AGNOSTIC, AND SOCIAL AND EMOTIONAL
LEARNING. Refocusing education to incorporate a holistic view of human
development will necessarily require a shift in educational requirements.
However, adding more requirements to an already packed schedule isn’t
feasible. Rather, we’ll need to change the
organizational structure of formal education and training pipelines as well
as take advantage of project-based learning options where multiple skills
across the cognitive, emotional, social, and physical domains can be
simultaneously developed. Content-agnostic meta-skills, such as
self-regulation and executive functioning, will also need to be developed at
the same time. A stacking of skills, content, and connectivity across topics
should become the norm, rather than the exception, particularly for formal
and nonformal education.

MAKE TECHNOLOGY INTEROPERABLE TO MEASURE AND CONNECT. This
vision of lifelong learning depends, in part, on the collection and analysis
of learner data. To enable that, we’ll need to first define measures appropriate
for formal, nonformal, and experiential learning. We’ll also need to develop
the associated technology, including interoperable systems that can safely and
ethically aggregate data across time, space, and communities. This “internet
for learning” will need to securely store a person’s data and make it accessi-
ble, across a lifetime, by approved entities who can use those data to person-
alize learning episodes and developmental trajectories.
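
As a sketch of what such interoperability can look like in practice, consider
the Experience API (xAPI), an existing specification for recording learning
events as “actor, verb, object” statements in a Learning Record Store (LRS).
The example below is illustrative only: the learner, activity, LRS endpoint,
and credentials are hypothetical placeholders, and a production system would
also address consent, identity, and access control.

```python
# Illustrative xAPI sketch: record one informal learning event as a statement
# and send it to a Learning Record Store (LRS). The endpoint, credentials,
# learner, and activity below are all hypothetical placeholders.
import requests

statement = {
    "actor": {
        "name": "Jane Learner",
        "mbox": "mailto:jane.learner@example.com",  # one way xAPI identifies people
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://example.com/activities/community-workshop-042",
        "definition": {"name": {"en-US": "Community data-literacy workshop"}},
    },
}

response = requests.post(
    "https://lrs.example.org/xapi/statements",      # placeholder LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},  # version header the spec requires
    auth=("lrs_user", "lrs_password"),              # placeholder credentials
)
response.raise_for_status()
print("Stored statement IDs:", response.json())     # the LRS returns statement IDs
```

Because every system writes the same simple statement format, records from a
classroom, a simulator, or a workplace app can later be aggregated into one
lifelong learner profile.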

USE THE SCIENCE OF LEARNING TO OPTIMIZE THE LIFELONG LEARNING SYSTEM.
Learning can be enhanced by employing a set of instructional principles, such
as specific teaching and assessment techniques.
As described in the preceding chapter (Chapter 3), many existing instruction-
al theories already articulate well-documented best practices for supporting
evidenced-based teaching and testing. However, we need to widen our per-
spectives—to consider whole systems, the range of interacting micro- and
macro-factors, and their interplay across space, time, and purpose. To ac-
commodate individualized pathways through education programs and other
developmental experiences, we’ll also need to change how information flows
and how people progress through the system. This will impact secondary and
postsecondary education, trade training, workforce development, and life
experiences. While it’s possible to allow technology advancements to drive
these changes, it would be wiser to help cultivate the ecosystem more
holistically. We need to collect evidence and recommend best practices about
the elements within it and their collective impact as well as incentivize
those elements that bring out its best features—for individuals and society,
writ large. Learning science, both its extant research and its inquiry
principles, can aid this endeavor, but we must commit to using it for this
larger vision.

The Department of Defense falls short by focusing on eighteen- to
nineteen-year-olds and not thinking about how we can support kids at the
younger ages. So, by the time we get them in DoD, we’re dealing with
resilience issues and putting band-aids on problems. We spend 20 years
building a new weapon system, but our kids in second grade are going to be in
DoD in 10 years. The first thing DoD needs to do is consider learning as a
continuum to include civilian education. Social emotional learning and
executive functioning need to be a focus. There’s a whole bunch of things
that need to be mitigated before we get them in DoD.

Russ Shilling, Ph.D.
Chief Scientific Officer, American Psychological Association
Former Senior Innovation Fellow, Chan Zuckerberg Initiative; Former Executive
Director of STEM, U.S. Department of Education; Former Program Manager,
Defense Advanced Research Projects Agency; U.S. Navy Captain (Ret.)

Too often, we define the symptoms, not the underlying issue. We tend to
problem-solve instead of problem-find. As you
try to create these innovative things, you’ve got to do a really good
job separating problem and symptom as you build the ecosystem.
The idea of an ecosystem is based on interdependencies, so a
technological ecosystem has to work like a biological ecosystem:
At the same time, it must take into account all the components—to
include the people. These solutions are often developed without the
human and user in mind. People don’t always think about the end-user
when they develop technology.

Jeffrey Borden, Ed.D.
Executive Director, Inter-Connected Education; Chief Academic Officer,
Ucroo Digital Campus; Former Chief Innovation Officer, St. Leo College

CHAPTER 5

LEARNING
EXPERIENCE DESIGN
Sae Schatz, Ph.D.

The phrase “fog of war” is generally attributed to Prussian military theorist
Carl von Clausewitz, who wrote his quintessential treatise, On War, in the ear-
ly 19th century. In it, he describes war as the realm of uncertainty; this gives
rise to our classical understanding of the “fog” as a state where information
is scant, unreliable, and hidden from view.1 However, in the modern world of
smartphones, broadband, and social media, this concept is taking on a differ-
ent cast. Today’s “fog” isn’t caused by a dearth of information but rather by
the overwhelming glut of it. The quantity of resources represents only one
part of the challenge. So much of the available information is inaccurate, con-
tradictory, inapplicable, or disconnected. There’s a signal-to-noise problem.
Added to all of this, we’re expected to monitor multiple information feeds,
carry out parallel multitasking, and pay attention to alerts and interruptions.2

Sometimes humorous phrases—infobesity, infoxication, data smog, or info
pollution—describe the phenomenon, but its effects are no laughing matter.
One result of the pace and abundance of resources is, paradoxically, a drop in
productivity. For example, workers need an average of roughly 25 minutes to “reset”
after being interrupted by a work email, and such distractions account for
around one-third of the time a typical knowledge worker spends on the job.3

In addition to issues with efficiency, information overload can profoundly
impact effectiveness. Notably, it dangerously affects attention, encoding, and
decision-making processes. For instance, when overloaded, individuals are
84 | Modernizing Learning

more likely to monitor the most superficial data and defer to familiar con-
cepts while ignoring conflicting evidence. Attention-deficit disorder specialist
Thomas E. Brown has even found that most people, i.e., those without the
syndrome, report symptoms similar to it multiple times a day, including the
inability to concentrate and to pay attention to what needs to be done.4 In
decision-making contexts, overload depletes mental resources, driving indi-
viduals to expedient (rather than optimal) choices, encouraging them to avoid
decisions or defer to negative or default options, and allowing unrelated emo-
tions to play an undue role.

We’re data rich but increasingly knowledge poor.

Unfortunately, as discussed in the preceding chapter (Chapter 4), creating
“more” education and training won’t solve this problem. In fact, as we look to-
wards the future learning ecosystem, with its vision of diverse and pervasive
lifelong learning, we run the risk that—rather than optimizing our learning
and development—we instead add to this destructive cacophony. The learn-
ing ecosystem has other potential pitfalls, too; for instance, like today’s
world wide web, learners might be faced with the daunting task of independently
curating and synthesizing their own instructional resources. Further, with its
reliance on technology, poor usability and breakdowns with other nonfunc-
tional requirements (so-called “-ilities”) could become insurmountable bar-
riers to its effective and efficient use. In other words, without care, there is
an excess of ways that the learning ecosystem could add to the “noise” rather
than strengthening and clarifying the “signal.”

Solving this problem will require several concomitant solutions. Notably,
applying holistic instructional strategies (Chapter 12), developing learners’
self-regulation abilities (Chapter 15), and thoughtfully applying automated
personalization (Chapter 10) are all essential. In addition, the intentional
integration of these practices, along with the strategic design of learning
systems and careful attention to their practical interaction details, must be
considered. Hence, this chapter focuses on the design of learning experiences
as a necessary complement to the other critical elements informing the future
learning ecosystem.

One of the key hurdles in developing students for life is that we’re still
trying to assess them on information from the past—the way we used to teach.
Forty percent of students will work on jobs that don’t exist yet. We need to
teach them the skills to collaborate and innovate.…If we can google it, then
we shouldn’t spend our time teaching it! I need to be able to facilitate
their learning.

Michelle Cottrell-Williams
Teacher, Wakefield High School
2018 Virginia State Teacher of the Year

LXD: DESIGNING HOLISTIC, LEARNER-CENTERED EXPERIENCES
Broadly defined, design refers to a series of interrelated actions, purpose-
fully taken to achieve specific outcomes or goals. People often associate the
word with artistic activities, such as painting or fashion; while it fully applies
to these fields, “design” also pertains to any problem-solving discipline that
uses a combination of grounded knowledge, skill, and creativity. For instance,
teachers may design a curriculum for optimal transfer-of-training, and soft-
ware developers may design a new app for security and reliability. Even mil-
itary leaders discuss operational design as a core element of their planning
processes.

Learning experience design, abbreviated as LX or LXD, is a relatively new
concept, originating around a decade ago.5 It largely grew out of user experi-
ence design:

The term “user experience” or “UX” wasn’t always an overused Silicon
Valley buzzword. Coined in the mid ‘90s by Don Norman, while
he was vice president of advanced technology at Apple, it refers to an
abstract way to describe the relationship between a product and a hu-
man. Back then, Norman argued that technology must evolve to put user
needs first—the opposite of how things were done at the time. It wasn’t
until 2005 that UX gained mainstream relevance: 42 million iPods were
sold that year and the mass market experienced great design at scale. …
Instructional design is now approaching a similar transition.6

With roots in UX, it’s unsurprising that educational technologists were among
the first to embrace LXD, or that much of the discussion around it has con-
centrated on design thinking, usability, and interaction design methods for
technology-aided learning. LXD practitioners also frequently emphasize the
application of user-centered design, sometimes drawing a distinction with con-
ventional instructional design by contrasting LXD’s learner-centered meth-
ods.7 Increasingly, though, LXD proponents are widening its scope beyond
(learning) product design, focusing more on broad learning outcomes with
an extensive toolkit to apply towards this end. For instance, Margaret Weigel
and her colleagues with Six Red Marbles have begun emphasizing LXD’s ho-
listic approach to design and its synthesis of instructional design, educational
pedagogy, neuroscience, social sciences and UI/UX principles.8 There’s also
growing consideration for informal and social learning, game-based learning
methods, neuroscience-informed principles, and the shifting role of teachers
from learning providers to learning facilitators. The field, however, still has
some maturing to do, and several related disciplines can help inform this.

Industrial Knowledge Design, or InKD (pronounced like “inked”), developed
around the same time as LXD, and it shares a similar focus:9

InKD…describes an approach involving interrelated techniques drawn


from diverse evidence-based scientific disciplines, aesthetic principles,
and professional best practices which together help practitioners more
effectively and efficiently achieve purposeful knowledge transfer goals
and objectives.

Like LXD, InKD considers interaction design and usability principles, and in
many practical ways the two concepts overlap. InKD, however, grew out of
different foundations and, as such, contributes some unique perspectives. It
adds to LXD by identifying a set of (1) foundational scholarly fields to draw
upon for theories and concepts as well as (2) practical applied fields from
which to derive actionable tools and processes. Specifically, InKD draws
from information science fields concerned with the analysis, collection, clas-
sification, manipulation, storage, retrieval, movement, dissemination, and
protection of information. These include, for instance, instructional design,
knowledge management, informatics, semiotics, and media design. It synthe-
sizes these with neurocognitive fields concerned with how individuals interact
with data, process information, and form knowledge; these include, for exam-
ple, learning science, cognitive science, human factors psychology, cognitive
ergonomics, and marketing.

The stated goal of InKD practitioners is to use evidence-based techniques to


increase individuals’ motivation to receive information, its effective convey-
ance, recipients’ encoding and later retrieval of that information, its action-
ability, and the overall impact of communications. In contrast to LXD, InKD

has taken a more academic route, which contributes definitions and conceptual
linkages to the burgeoning discipline. This helps ground LXD in established
theory and evidence-based practice, and it gives LXD designers a full “rolodex”
of disciplines with methodologies and tools ripe for use.

But there’s this whole other world, conceptually, in different sectors
who aren’t having conversations with each other. It’s shocked me
that people really are doing it in silos.

Emily Musil Church, Ph.D.
Executive Director of Global Learning, Prize Development and Execution, XPRIZE

For example, marketing and related disciplines such as consumer behavior,
public relations, and advertising offer ample guidance applicable for learning.
While that may sound surprising, in practice, marketing and learning
professionals share many similar goals: Both try to understand their
audiences, generate motivation, capture attention, make their messages
memorable, and affect their audiences’ downstream behaviors. Of course,
marketers generally want to sell products or services, while learning
professionals may seek to foster an accurate and robust understanding. Still, the
techniques are often the same.

One distinctly applicable approach from marketing is experience design. It’s
a practice usually used in business and entertainment contexts to elevate rou-
tine customer “interactions” into more compelling and memorable customer
“experiences.” Experiential designers are successful when they encourage
people to create meaningful emotional and social connections and to construct
personal narratives that involve episodic memories and positive associations
with the artifacts of that experience (such as a product, in marketing terms).10

Experiential design practitioners assert that well-designed experiences con-


vey a more salient “sense” of a product or brand, enhance customer emotions
towards it, build loyalty, and ultimately enhance revenue.11 Applications of it
have supported these claims; for instance, a major hospital faced with increas-
ing competition and declining customer satisfaction used experience design
to create a 13% increase in perceived quality of care and a decrease of 33%
in customer complaints with no other facility management changes.12 Other
successful use cases, from car rentals to circus entertainment, have also been
reported,13 and we’ve likely all experienced the effects of well-designed con-
sumer experiences firsthand at theme parks or popular coffee shops.

Philosophically, experiential design isn’t too different from classical expe-


riential learning. Popularized by David Kolb, experiential learning is “the
process whereby knowledge is created through…the combination of grasp-
ing and transforming experience.” 14 Experiential learning recognizes that not
all experiences enrich learning. Instead, meaningful learning occurs when a
learner “‘touches all the bases’—experiencing, reflecting, thinking, and act-
ing—in a recursive process…” 15

Experiential learning theory offers a useful model for conceptualizing the


processes, and proponents of it have published extensive theories, techniques,
and studies about it—some quite useful for LXD.16 However, like much of
traditional instructional design, experiential learning theory generally takes a
straightforward approach, focused on cognitive processes with less attention
to emotional and social mechanisms, and it tends to treat learners as motivat-
ed, self-regulated, and logical actors. This is a place where marketing can use-
fully augment educational theory. Experiential designers take more holistic
approaches, beyond rational cognition or even the immediate experience, and
they focus more on practical outcomes. For example, experience design offers
a set of tools for selectively manipulating contextual variables to influence
experiences and for creating these outcomes at scale. One popular framework
involves five categories that designers need to affect, and when all five are

successfully integrated, they form a “holistic experience”: 17


• Sense – Reactions to sensory stimuli within or around an experience
• Feel – Emotions and their intensity in response to an experience
• Think – Mental engagement, e.g., problem-solving or creative thinking
• Act – Personal identity and behaviors; a desire to engage or act
• Relate – Experiences that provoke a social identity; co-experiences

Experiential designers, and marketers more broadly, tend to more willingly


accept the reality that humans aren’t rational actors. This gives them more “le-
vers” for affecting outcomes, and it frees them from unfeasible expectations
about the logic of consumers’ (or learners’) thoughts and actions. The study
of why and how people make seemingly illogical decisions has grown in pop-
ularity over the last 20 years. Today, under the name behavioral economics,
practitioners have defined a litany of routine decision-making biases, mental
heuristics, and cognitive filters that, largely, everyone uses.

Behavioral economics grew out of work by Nobel Prize recipients Herbert A.


Simon and Daniel Kahneman (among others), and it’s been popularized by
Dan Ariely 18 and Freakonomics authors, Steve Levitt and Stephen J. Dubner.
It also has roots in the psychology of influence and persuasion, notably from
work by Robert Cialdini.19

Behavioral economists Cass Sunstein and Richard Thaler (who also received
a Nobel Prize for his work) have expanded the field, widening it to explore
ways to “nudge” decisions at large scales. Their canonical book, Nudge,20 out-
lines principles for subtly coaxing people towards better choices. Proponents
have used these to great effect. For instance, Collin Payne and colleagues used
small cues at a grocery store to increase shoppers’ likelihood to buy fresh
fruits and vegetables (e.g., designated sections for produce in shopping carts
and big green arrows on the floor). These yielded a 102% increase in pur-
chasing for fruits and veggies, with 9 out of 10 shoppers following the green
arrows to the produce section when first arriving at the store.21

According to polls conducted jointly by Gallup and the Lumina Foundation,
96% of chief academic officers at higher education institutions felt their
programs were “very” or “somewhat” effective at preparing students for the
world of work—but only 11% of business leaders strongly agreed. Business
leaders said graduates lack the skills and competencies their companies
actually need.
Source: Preety Sidhu and Valerie J. Calderon (2016). https://news.gallup.com

UX design, experience design, behavioral economics, and nudge all highlight


ways in which subtle features and thoughtful design can influence outcomes.
But when designing a new system—whether for learning or performance—
how do you think through all of the factors potentially affecting behavior?
How do you ensure the various elements are designed in harmony and with
common ends in mind? Human–Systems Integration (HSI) offers some an-
swers here.

HSI is a philosophy and set of processes that focus on systems-level human


performance design and development activities. It grew out of the U.S. De-
partment of Defense after a 1981 General Accounting Office report revealed
that 50% of all military equipment failures were caused by human error and a
corresponding U.S. Army report that found that many military human errors
could be traced back to poor development processes that failed to sufficiently
consider human performance concerns.22 Basically, HSI combines systems

Human–Systems Integration is a philosophy and set of


processes that focus on systems-level human performance
concerns throughout a system’s lifecycle. Its purpose is to
mitigate the risk of downstream system failure.

EMPHASIZE HUMANS • OPTIMIZE THE TOTAL SYSTEM • CONSIDER THE FULL LIFECYCLE • FACILITATE DESIGN

engineering methods, human factors principles, and human-centered design


practices—yielding a practical toolkit for designers of any system that in-
cludes people, technology, and desired organizational outcomes.

HSI has four core tenets:

► Emphasize Humans – Emphasize human performance early and often


in the system design process; give humans equal treatment to hardware
and software

► Optimize the Total System – Optimize overall system performance at


the comprehensive (big picture) level and not simply at the individual
component levels

► Consider the Full Lifecycle – Take a long view; maximize a system’s


benefits—while controlling its costs and mitigating risks—across the
entire system lifecycle

► Facilitate Design – Facilitate multidisciplinary design; help “translate”


among specialists in different disciplines as well as between designers
and other stakeholders

Under each tenet, HSI practitioners have developed systematic processes, de-
sign tools, and documentation methods. While many of these are designed
for projects involving highly complex sociotechnical systems (e.g., building
a new aircraft carrier), they can provide LXD designers, at any level, with
inspiration and an extensive toolkit to draw from, and HSI’s core tenets serve
as valuable touchstones for LXD, as well.

Summary

Each of the disciplines discussed in this section can contribute to a maturing


understanding of LXD. The foundations of LXD create its underlying
philosophy and conceptual paradigm, and its underpinnings in UX offer
readymade design thinking principles and user-centered design processes
applicable for learning contexts. InKD widens this aperture to more fully
integrate information science and neurocognitive science, along with their
subfields. In so doing, InKD brings an array of grounded theories and applied
tools to LXD.

Commercial fields also offer useful methods. For instance, experience de-
sign has concepts, methods, and use-cases for constructing memorable and
motivating holistic experiences, often at scale through mass customization
techniques. Similarly, behavioral economics helps us understand more about
individuals’ real-world (“predictably irrational”) decisions, and it teaches us
ways to “nudge” behaviors, whether to persuade individuals or shift whole
communities.

Finally, LXD designers can leverage the four HSI principles as well as its
robust collection of established processes and developer tools. Notably, HSI
uniquely contributes methods for integrating human-centered design princi-
ples with systems engineering, balancing local outcomes against global con-
siderations, and facilitating these designs at scales within production teams
and formal organizations.

RECOMMENDATIONS
Each of the fields of study discussed so far offers a wealth of insights for
learning design. Below is a list of recommendations drawn from across them,
although it surely only scratches the surface.

1. Identify and focus on the actual goal

Across all application areas, a prerequisite of effective design is its conceptu-


alization as a goal-directed process. While this may sound evident, too often
people fail to identify the actual goal, and instead focus narrowly on imme-
diate actions or process outcomes, without thinking through the larger “why.”

Consider, for instance, compliance training—something many of us have en-


dured. Originally, the true goal of a compliance course may have been to
address some actual risk, say, to train employees to avoid cyber-scams. The
program manager assigned to the job, however, may inadvertently change
the goal from reducing cybersecurity incidents to mitigating organizational
risk—a seemingly small change. As the job progresses, the goal drifts further,
from designing training that mitigates organizational risk to creating an inter-
vention that shifts risk. This, in turn, may influence programmatic decisions;
for instance, the program manager might begin to view the mere exposure to
training information (rather than effective transfer-of-training) as sufficient
for shifting the risk.

Logically, then, the program manager may select the most economical ap-
proaches for creating that exposure. Meanwhile, the instructional designer is
likely given a stack of materials and told to “train” employees on them—al-
beit with limited resources. Now, his apparent goal becomes communicating
as much information as possible under challenging constraints. Subsequently,
supervisors’ goals become checking off each employee from a completion
list, and employees’ goals become completing the training as quickly as pos-
sible…and so on until, ultimately, everyone’s best intentions yield limited
actual utility.

We did a fairly broad study with 47 large, well-known companies from
around the world, and we synthesized the attributes of their learning
organizations. In all cases, what we found was that they are mission-
focused. They created an architecture clarifying how data-driven
decisions about training connect to the mission. Their organizational
structures focused on growing internal people and were really helpful
to outcomes and buy-in.

Michael Smith
Senior Technical Specialist, ICF

UX and user-centered design have proven processes for uncovering strategic
goals and designing solutions for them; so, LXD already excels in this area.
Jesse James Garrett’s Elements of User Experience 23 is an oft-cited resource
for learning designers, even though his work focuses on digital product
design more generally. His five-layer model starts with Strategy (defining
goals and user needs), and then progresses through Scope (requirements and
specifications), Structure (interaction models and architectural design),
Skeleton (interface, navigation, and information designs), and Surface
(sensory elements and aesthetics).

Application of Garrett’s methods, or


similar goal-focused design processes, can profoundly and positively affect
learning design. Using these sorts of approaches means focusing on outcomes
rather than processes. They also require that designers (at all levels through-
out the processes) challenge assumptions, strive to understand and work to-
wards strategic (rather than just local) goals, and consider creative approaches
that fall outside of traditional practices, such as using informal interventions,
holistic experience design, or nudge techniques.

2. Apply holistic user-centered design methods

Results published by the National Academies Press show that only 34% of
technology development projects in the U.S. are successful, and projects most
frequently fail because “(1) an inadequate understanding of the intended users
and the context of use, and (2) vague usability requirements, such as ‘the sys-
tem must be intuitive to use.’” 24 As education and training increasingly rely
upon technology, it’s important to incorporate UX, interaction design, human
factors, ergonomics, and other closely related human-centered disciplines into
learning design processes.

User-centered design is more than just usability. It needs to consider peo-


ple holistically. Experience design offers some insights here. For instance,
rather than focusing largely on cognition, also consider other internal pro-
cesses such as emotion, confidence, and motivation. Recently, the Interaction
Design Foundation published an article highlighting how LXD, like all oth-
er human-centered design applications, is really attempting to solve one (or
more) of these five common problems—only one of which directly addresses
cognition: 25
• Lack of knowledge – Doesn’t understand the material or instructions
• Lack of skill – Lacks skill, practice, or ability to apply knowledge
• Lack of confidence – Lacks positive, yet realistic, self-perceptions
• Lack of motivation – Disinterest in applying cognitive effort or action
• Lack of resources or tools – Problems that prevent otherwise
knowledgeable, skilled, confident, and motivated persons from acting

Again, it’s important to consider these dimensions creatively and holistically.


As Bror Saxberg, VP for Learning Science at the Chan Zuckerberg Initiative,
has pointed out: “Even physical and mental health matter—a very hungry
student is unlikely to start, persist, or put in mental effort no matter how
gloriously designed a learning experience he’s put in. Getting students access
to a healthy breakfast is potentially a great personalization of the learning

environment!” 26 In other words, in learning, sometimes more effort needs to


be invested in providing resources (beyond those with apparent education or
training utility), refining the learning context, or instilling confidence. Think
beyond pure information conveyance!

3. Design for real—messy, irrational—humans

Cognitive science and behavioral economics teach that humans are predict-
ably irrational. We’re prone to making expedient (rather than optimal) deci-
sions, substantially more motivated to avoid loss than seek gain, and vulnera-
ble to a slew of other biases. Recognize that learners have these “flaws.” That
doesn’t imply you should deceive or condescend—none of us is a rational
actor! Rather, acknowledge and design for the messiness of humanity. This
may mean, for instance, designing for emotional effect or carefully avoiding
information overload during a learning experience.

As part of a creative, holistic user-centered design approach, also consider


nudge techniques to augment the more obvious learning interventions. Nudg-
es can help individuals overcome inherent biases and might be useful, for
example, in encouraging self-regulated learning practices, such as studying or
reflection. Also, reach beyond the straightforward cognitive domain, and con-
sider nudges related to other behaviors that may impact learning, like well-
being and self-care. Behavioral economics and nudge theory offer excellent
examples to inspire these interventions. Related fields, including industrial
design, graphic design, and communication, also offer tools for designing in-
terfaces, spaces, contexts, and content elements to achieve persuasive effects.

4. Design holistic experiences

Nothing exists as a simple point in time. As experience design and instruc-


tional theory both teach, a given experience is preceded by a preparatory or
anticipatory phase, and it’s followed by a reflective one. Design for these pre-

Some experiences are all about people-to-people interaction. There


are lots of things that can’t be learned online, such as hands-on design
projects or working on a start-up company. Sometimes you have to sit
with your buddies (fairly intensively, for a few years) or travel overseas.
Sometimes you have to be there and you have to immerse yourself.

David Munson, Ph.D.


President, Rochester Institute of Technology

and post-learning phases to the extent possible. Further, an experience has


different components. Drawing from experience design, it’s useful to inten-
tionally design across the full range of sensory stimuli (sense), emotional fac-
tors (feel), cognitive elements (think), personal connections and engagements
(act), and social identity/co-created elements (relate). Also consider the col-
lective effect of integrating these five facets, and think about how to address
them before, during, and after a learning experience.

Similarly, don’t forget about the power of aesthetics when designing for hu-
mans. Psychological research actually shows that “pretty things work bet-
ter”—that is, individuals’ perception of aesthetics directly impacts their per-
formance outcomes.27 Such aesthetic principles have been well codified for
most media by applied creative types; however, practitioners of more “se-
rious” disciplines are often more hesitant to invest in them. In fact, some
subcultures, such as certain academic disciplines or military sectors, wholly
reject the application of aesthetics (under the assumption, presumably, that too
much polish will detract from the “seriousness” of the message—even though
scholarly research supports the positive impact of quality aesthetic design).

We’re only beginning to understand the psychology of emotional design even


though it’s been around for decades. Yet, it has formidable promise for LXD.

5. Use systematic processes to design effectively


within larger organizations

Increasingly, learning professionals are working in diverse production teams


and deploying interventions at larger scales. This marks a shift in the way edu-
cation and training interventions are developed: Where once they were largely
artisan creations crafted independently by experts, they’re progressively more
likely to be designed and implemented by teams and situated within larger
organizations. HSI offers useful tools for navigating the practical challenges
that come with these changes.

A first lesson from HSI is to consider a long view of a system attempting to


maximize its benefits, while controlling costs and mitigating risks, across its
entire lifecycle; that is, through its initial design and development phases,
along with its implementation, operation, and eventual retirement stages. The
point is, when designing a new process, system, or learning experience, con-
sider it within the context of the organization across time: How will it be
designed and eventually built? How will it be rolled-out to stakeholders? How
will it be maintained and continuously improved over time? When should it
be retired?

HSI similarly offers methods for conceptualizing organizational components.


Typically, HSI practitioners recognize the manpower, personnel, training,
safety and occupational health, human factors engineering, habitability, and
survivability domains. They try to take these factors into account when cre-
ating integrated designs. For instance, if enough operators (manpower) aren’t
available, then they might increase the experience requirements of operators
(personnel) so that each can perform more efficiently. While these classical
domains have some applicability for learning systems design, LXD practi-

tioners will likely need to modify this model. What’s more important than
its specifics, however, is the broad, system-wide perspective it encourages.
When designing a learning experience, it’s useful to not only consider its
delivery but also, for instance, how many learning professionals are needed
to implement it (manpower), what skills those professionals need (personnel),
how they’ll be prepared for their roles (training), and the context in which
they’ll deliver the intervention (habitability).

Finally, HSI practitioners often facilitate a multidisciplinary design process,
helping to document and “translate” between specialists in different disci-
plines (e.g., between sociologists and computer scientists) and negotiate re-
quirements among interested parties (e.g., brokering compromises between
training specialists and manpower analysts). In practice, this means that HSI
practitioners spend considerable time eliciting inputs from various stake-
holders, documenting assumptions, clarifying friction points, and developing
“shared representations” that transform these requirements and analyses into
meaningful, unambiguous formats such as storyboards, concept maps, pro-
cess diagrams, and wireframes. These HSI processes and tools
are useful for LXD designers, as well.

6. Maximize global outcomes vs. local processes

Explicit in the “learning ecosystem” concept are the notions of diversity and
interconnectivity—across an entire lifetime (or, at least, career). This con-
nectivity creates new opportunities for us to consider learning experiences
in concert rather than as isolated events. Other chapters in this book discuss
instructional strategies for connecting learning events (Chapters 4 and 12, in
particular). This chapter, however, adds practical considerations that LXD is
uniquely positioned to address.

First, consider the impact of lower-level decisions in aggregate. What’s the


gestalt, or combined impression, they collectively produce? Do certain implicit

The challenge is getting teachers to share the imperfect; they


like to reach perfection, and convincing them to share an
imperfect product is difficult. The historical Vermont paradigm
has been teachers are artisans working in relative isolation, but
that system is breaking down. It’s antiquated and not working
relative to personalizing student learning. We need to scale and
showcase the very good work that’s going on in some of our
districts—but not all.
Daniel French
Secretary of Education, Vermont Agency of Education

messages, such as emotional or motivational suggestions, carry over from one


event to the next? For instance, imagine a multipart workshop taught by four
different instructors. If each asks trainees to complete some kind of pretest,
engage in initial icebreaker activities, and respond to post-training questions,
the trainees are likely to grow bored, lose motivation, and might even become
cognitively overloaded. A clever designer might find ways to tie the different
segments together, introduce novelty throughout the four segments, build-in
time for cognitive reset, and find ways to simplify the overall UX. Similar
considerations apply as we scale-up our reference frames and begin integrat-
ing more diverse learning experiences across time, subject, and media.

Second, when designing learning interventions, it’s tempting to try to opti-


mize each individual event, without considering their collective, long-term
result. For example, consider a company that’s decided to shift from tradition-
al, weeks-long vocational courses to on-the-job, just-in-time training. On the
one hand, this method helps avoid inefficient massed learning where individ-
uals often wastefully forget much of what they learned. On the other, it risks
creating disjoint learning that individuals struggle to meaningfully integrate
and comprehend beyond a superficial level. It may also create unforeseen bur-
dens on more experienced operators in the job environment. There is nothing

inherently wrong with just-in-time learning; rather, the point is to consider


system-wide learning strategies that balance holistic efficiency and longitudi-
nal performance against local optimizations. If each module, course, or indi-
vidual designer develops local optimums in isolation, we risk creating overall
inefficiencies and ineffectiveness. Strategy—informed by learning science—
must be applied to, and integrated across, all levels.

7. Continue to synthesize theories and practices


from diverse disciplines

LXD, like the future learning ecosystem concept writ large, represents a syn-
thesis of varied and emerging disciplines. Learning design teams in the fu-
ture will likely involve instructional designers, learning scientists, learning
engineers, technologists, data scientists, and other professionals. LXD fills a
unique void, helping to integrate the diverse perspectives across these team
members, giving voice to learners’ (and other stakeholders’) needs, and en-
couraging the use of disciplined human-centered design practices.

It’s impossible for any one person to thoroughly know all of the disciplines
that inform LXD, but it’s important for LXD designers to avoid “reinventing
the wheel” with their work. As this chapter has shown, many existing domains
offer useful theories, processes, use-cases, and tools. Seek out these prior
solutions; curate and remix them for your own purposes. Look in creative
places, such as the advertising literature or systems engineering manuals, and
look to conventional principles of instructional design, learning science, and
cognitive psychology, too. This discussion on LXD isn’t meant to supplant
those important fields but rather to supplement them by integrating design
principles that consider human-system interactions, applied cognition, orga-
nizational dynamics, and user experiences. Together, in synthesis, these vari-
ous methods can help learning designers to not only create quality instruction
but to better achieve learning outcomes for real people, in real-world contexts.
Everyone comes through the same
education system, and we get locked
into believing that’s the way we
learn—when we really don’t.
Doug Tharp
Senior Learning Project Manager
Nuclear Regulatory Commission
Technology
Interoperability allows data to easily flow, even among applications
developed for different purposes, using a standardized vocabulary,
structure, and cadence. Interoperability implies common standards
that promote system-to-system communications, potentially across
organizational boundaries and institutional firewalls, using specified
data formats and communication protocols. These standards form
the fundamental building blocks for technology-enabled lifelong
learning by establishing consistent protocols that can be universally
understood and adopted by related systems to enable data exchange
about learners, activities, and experiences.

CHAPTER 6

INTEROPERABILITY
Brent Smith and Prasad Ram, Ph.D.

Interoperability, when applied to learning, transcends the full spectrum of


learning environments, systems, data, and organizational entities that individ-
uals encounter throughout their lives. The highly mobile nature of our popula-
tion requires that information about learning be shared in an efficient manner
across this ecosystem of learning. When individuals advance in their careers,
or transfer from one career to another, or simply progress through the continu-
um of learning from one organization to another, high quality data about their
learning experiences needs to be shared. However, today’s learning ecology
consists of stovepiped organizations using highly customized management
systems, accessing disparate sources of data in any number of nonuniform
architectures. To achieve the future learning ecosystem concept, we’ll need
to exchange data across a full range of products, made by different vendors,
and encountered throughout the entire continuum of lifelong learning. Key
to managing all of this data is interoperability—afforded through the use of
internationally accepted technical specifications and standards.

In today’s digital world, information is readily accessible anywhere and ev-


erywhere. Large-scale social networks, interactive content, and ubiquitous
mobile access are emerging as driving technologies in education and training.
At the same time, data science presents new opportunities for assessing the
effectiveness of learning content for different learners, understanding organi-
zational trends across large volumes of learning data, and using amassed data
to continually improve training and education. Yet, there are interoperability
challenges. Today’s digital learning ecosystem is fragmented. Data from one

system can’t always integrate with data from another, which means learning
records aren’t easily transferable between institutional systems and across or-
ganizations. Training and education institutions don’t even record the same
learner activities or capture learner achievement information in the same for-
mats, which further complicates our ability to aggregate data.

FORMAL LEARNING
PROGRESSION
Beginning with K–12 education, most state educational systems use products
from multiple vendors, and each district deploys their systems independent-
ly. Historically, these applications have used limited (or no) underlying data
standards. Instead, most employ their own internal data models, and inte-
gration across systems requires a patchwork of connections at the state and/
or local levels.1 Consequently, there are gaps in the integration among dis-
parate applications, and many systems are simply not interoperable. Ideally,
data from multiple products, such as learning management systems, student
information systems, and learning object repositories, would be aligned to the
same common data standards, enabling seamless coordination across these
applications.2

The existing higher education system is also its own stovepipe. With its fo-
cus on credit hours, semester-long courses, and formal credentialing, these
institutions often fail to account for new practices available in a digital, and
globally connected, world—such as emerging global online learning envi-
ronments that increasingly blur formal and informal practices. Students are
now much more interested in interactive and self-guided approaches, and with
so much information online (and often available for free), universities are no
longer the only places to find higher-level learning. Consequently, the value

The biggest problem we have is the


lack of connected infrastructure across
postsecondary learning systems.
Amber Garrison Duncan, Ph.D.
Strategy Director, Lumina Foundation

of a degree is gradually decreasing as employers place greater weight on a


candidate’s capabilities developed outside formal education.

Within military education and training there are many different schools and
training programs designed to foster technical, professional, and leadership
skills in service members. Many of these programs, their instructional tech-
nologies and personnel information systems, exist in stovepipes. Further, his-
torically there’s been a separation between the education and training com-
munities across the U.S. Defense Department. Education traditionally occurs
incrementally and involves grappling with ambiguity while thinking and re-
flecting about the concepts being learned.3 Training is linked to readiness and
offers opportunities to apply knowledge, skills, and abilities in a manner that
provides immediate feedback and progress measurement.4 Within the current
context, training and education have different reporting structures, motiva-
tions, and logistical requirements such as fuel, personnel, and the access to the
appropriate environments or equipment. Combined, this leads to data being
acquired from many different sources but with little-to-none of it standard-
ized or connected.

Types of Interoperability

Rapid technological change has become the norm in the modern landscape of
training and education. Within learning contexts, the pressure of such chang-
es is felt acutely by educators, trainers, administrators, and learners alike.
Table 6-1 shows a different view of the various learning technologies, en-
vironments, organizations, and outcomes a given learner might encounter
throughout his or her career. This matrix highlights the numerous types of
interoperability required to facilitate a future learning economy. This is large-
ly due to the organizational design of the current learning landscape as well
as the different reporting structures and responsibilities for when and where
training and education occur.

Some Representative Examples of Where and How Learning Occurs

Learning Technologies: Electronic Classrooms; Interactive e-Books;
Learning Mgmt. Systems; MOOCs; Mobile Devices; AR and VR Systems;
Live, Virtual, and Constructive Simulations; Embedded Training and
Performance Support Systems; IoT Systems; Wearables; Performance
Qualification Systems

Learning Environments: Instructor-Led Classrooms; Live Training Ranges;
At Home and Cafés; In Transit on the Train or Bus; On-the-Job Experiences;
Mentorship; Field Trips and Military Staff Rides; Workshops and
Conferences; Libraries; Navy Ships Afloat; Austere Job Sites and Military
Stations; Simulation Centers

Learning Organizations: K–12 School Districts; Trade Schools; Colleges
and Universities; Postsecondary Accreditation Agencies; Licensing and
Credentialing Bodies; Corporate Human Resources Programs; Military
Manpower, Personnel, Training, and Education; Industry Associations;
International Organizations and NGOs

Learning Outputs: Transcripts; Diplomas and Degrees; Standardized Test
Scores (SAT, ACT, ASVAB); Licenses and Certifications; Digital Badges
and Micro-Credentials; Formal Performance Evaluations; Resume Listings;
Continuing Education Units and Professional Development Units; School
and Workplace Credits

Table 6-1: Learning Activity Matrix



Many types of interoperability are required:

► Systems Interoperability – Digital systems need to work together. The


existing systems we use to collect, manage, analyze, and report on data
are often disconnected and don’t always work well together. Some of the
technology challenges center around data standards, including incon-
sistency of standards and the inability to access data in a usable format.
Progress is being made by numerous ongoing efforts across govern-
ment, industry, and academia.

► Application Interoperability – Systems are comprised of numerous


disconnected applications that, theoretically, must all be capable of com-
municating about how they’re impacting learning for each individual.
Currently, different applications track performance differently, and the
ability to infer information about each activity within an application is
not always well-defined.

► Data Interoperability – The seamless, secure, and controlled exchange


of data between applications is critical to maximizing our ability to un-
derstand individuals’ learning. Not only are data often stored in isolated
silos within applications, but these datasets often use custom or
proprietary data models. Common data standards, along with support-
ing data governance and metadata information, are needed to maximize
return on investment in interoperable applications, perform workforce
planning, and support other derived benefits from data analytics. (A
minimal sketch of such normalization appears just after this list.)

► Human-Machine Interoperability – The different environments where


learning takes place impact the types of learning technologies used. As
new tools and technologies come into play, individuals must become
more technically savvy and industry must find ways to better support
the seamless transition of learning across a multitude of computing plat-
forms, devices, and learning modalities.

► Organizational Interoperability – Data ownership is a critical obsta-


cle that impedes true interoperability. In the knowledge economy, data
is often monetized and leveraged for purposes other than learning. Or-
ganizations as well as savvy individuals are reluctant to share their data.
Creating cross-platform and interorganizational interoperability will
require a change in culture, and, arguably, that poses an even more dif-
ficult challenge than technical interoperability.
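To make the data-interoperability challenge concrete, here’s a minimal
sketch, in Python, of the kind of normalization work that shared standards
are meant to eliminate. All of the field names are hypothetical; a real effort
would map each vendor’s feed onto a published specification (such as the
activity-statement format discussed later in this chapter) rather than onto an
ad hoc model.

    # Minimal sketch: normalizing two vendors' proprietary completion
    # records into one common structure so downstream systems can
    # aggregate them. All field names here are hypothetical.

    def from_vendor_a(record):
        # Vendor A (hypothetically) exports {'usr', 'crs', 'dt'}
        return {"learner_id": record["usr"], "activity_id": record["crs"],
                "verb": "completed", "timestamp": record["dt"]}

    def from_vendor_b(record):
        # Vendor B (hypothetically) exports {'student', 'course_code', 'finished_on'}
        return {"learner_id": record["student"], "activity_id": record["course_code"],
                "verb": "completed", "timestamp": record["finished_on"]}

    # Once both feeds share one model, cross-system aggregation is trivial.
    normalized = [
        from_vendor_a({"usr": "u-100", "crs": "CYB-101", "dt": "2019-03-01"}),
        from_vendor_b({"student": "u-200", "course_code": "CYB-101",
                       "finished_on": "2019-03-02"}),
    ]
    print(len(normalized), "records in the common model")

Every additional vendor multiplies this mapping burden, which is exactly
why common, published data standards matter.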

Already, educational institutions, training organizations, and instructional


technologies collect some learner data, such as, demographics, assessment
results, teacher observations, learner-created content, attendance, and course
grades. However, these data points don’t provide a complete picture of a learn-
er unless connected with data collected throughout the continuum of learning.
Additionally, we’re touched by learning broadly throughout our daily lives
by numerous informal interactions, both with other people and through our
own self-directed efforts, but none of those data are captured in a manner that
allows aggregation, comparison, or analyses.

Resolving these interoperability challenges is key to setting the foundations of


a global learning economy that enables learners to constantly update, retool,
rethink, and relearn.

VISION
Common standards and shared technical specifications create the underpin-
nings needed for the future learning ecosystem, from a technology interopera-
bility perspective. These standards consist of published documents that estab-
lish key interface specifications, communication protocols, and data structures
designed to facilitate interoperability among connected components. In this
context, interoperability specifications form the fundamental building blocks

for lifelong learning by establishing consistent protocols that can be
universally understood and adopted by any component of the learning
ecosystem to enable data exchange about learners, activities, and experiences.
In the future, such interoperability will unlock rich data about learners and
learning activities, empowering organizations to build comprehensive
solutions that meet the needs of their specific populations. Standardized,
documented interfaces will also enable “plug-and-play” replacement of new
or upgraded capabilities on existing platforms. In other words,
interoperability will allow organizations to add, modify, replace,
remove, and support different learning technologies (from different
vendors) throughout their lifecycles.

The more people understand Google and the benefits of cross-domain
work, the more they want it—and the more the silo boxes are a problem.

Jeanne Kitchens
Chair of the Technical Advisory Committee for Credential Engine; Associate
Director of the Center for Workforce Development, Southern Illinois University

Interoperability will facilitate data aggregation across the continuum of learn-


ing. Analyses of these data, in turn, will enable learners to optimize their
learning journeys across their many diverse learning activities, throughout
their careers, and, ultimately, across their lives. These data could also help
address institutional questions, such as determining which academic cours-
es produce the best learning outcomes or predicting workforce skill gaps.
Combined with the science of human capital management, enterprise learning
analytics could also help organizations address their strategic talent manage-
ment goals, including succession planning, career assessments and growth,
development, retention, and knowledge sharing.

Several types of technical interoperability are needed to achieve this vision.


These include standard ways of defining competencies (for use in both learn-
ing and performance contexts), for encoding data about individuals’ perfor-
mance and behaviors, for aggregating and visualizing these performance data
in meaningful ways, and for describing and locating various learning activi-
ties. The following subsections describe each of these in more detail.

Competencies
Interoperable frameworks that form the
“common currency” of the future learning ecosystem

Competencies form the interoperability crossroads of the future learning eco-


system, serving as the Rosetta Stone between different learning systems and
workforce applications. A competency describes a set of skills, knowledge,
abilities, attributes, experiences, personality traits, and motivators needed to
perform a particular task. Competencies might include technical, business,
leadership, social, ethical, or emotional capacities, or any number of other
personal traits and capabilities. Additionally, competencies may be highly
dependent on their usage context; differences in environmental factors, task
complexity, and related processes or policies can all impact their applications.

Organizational competencies need to be encapsulated within a
competency framework to map all learning activities a learner
might encounter within an organization.

A competency model (also called a competency framework) combines
multiple competencies, and their underlying factors, into a framework related
to a particular domain, career, or job area. Some competency models further
separate this information into levels of mastery, such as information about
the level of competence required at different occupational levels,

and these various elements within a competency framework can have many
nonexclusive relationships with one another.

Education and training organizations may use these frameworks to inform


learning outcomes, and they’re also widely used in business for defining and
assessing requirements for both hard and soft skills associated with job perfor-
mance. The use of common competency frameworks will allow data from dif-
ferent sources to be meaningfully interpreted in and translated to other contexts.

One challenge is that there’s no standard for competencies. Different industries,


accreditation authorities, and trade associations use a variety of different exist-
ing frameworks. Some follow any number of specifications and others do not.
Many competency frameworks include rubrics, performance levels, or other
data that can be used to evaluate proficiency while others rely on supplementary,
external components to house assessment and evaluation criteria. Some com-
petencies are linked to the environment in which the competency is expressed,
and others are motivated by training or education objectives (e.g., knowledge,
skills, abilities). To enable the future learning ecosystem vision, shared vocab-
ularies, classifications, and frameworks of competencies will be needed, and
these will need to allow for commonality and reuse of competency objects and
their descriptors across diverse organizations. Shared metadata vocabularies
might also be required, to include descriptors such as the type of skills includ-
ed (e.g., psychomotor or cognitive skills), skill decay estimates, or relevant
environmental factors that impact or inform the description of a competency.
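As a rough illustration of how such a framework might be represented in
software, the sketch below shows a toy competency object carrying the kinds
of descriptors just discussed (skill type, mastery levels, and nonexclusive
relationships to other competencies). The structure and field names are
invented for this example; they are not drawn from any published
competency standard.

    from dataclasses import dataclass, field
    from typing import List

    # Illustrative only: the fields mirror the descriptors discussed above,
    # but the names are hypothetical, not taken from a published standard.
    @dataclass
    class Competency:
        id: str          # stable identifier, reusable across systems
        name: str
        description: str
        skill_type: str  # e.g., "cognitive" or "psychomotor"
        mastery_levels: List[str] = field(default_factory=list)
        related: List[str] = field(default_factory=list)  # links to other competency ids

    phishing = Competency(
        id="org.example:cyber.phishing-recognition",
        name="Phishing Recognition",
        description="Identify and report phishing attempts.",
        skill_type="cognitive",
        mastery_levels=["novice", "proficient", "expert"],
        related=["org.example:cyber.password-hygiene"],
    )

    # A competency model (framework) is then a keyed collection of these
    # objects that learning systems and HR systems can both resolve against.
    framework = {phishing.id: phishing}

The stable identifiers are the crucial design choice: they let a training
record, a job description, and a transcript all point at the same competency.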

Activity Tracking
Data about learners’ performance and behaviors

Activity streams are a nearly ubiquitous feature on many of the applications


we use on a daily basis. For example, newsfeeds on social media use activi-
ty streams to record users’ interactions. Activity streams use serialized data
that consist of statements about behaviors. Such statements typically involve

The American Council on Education has developed a digital competency-based


credential that will enable an individual to transfer learning from work to a degree
path. The T3 Innovation Network* is testing the use of competency translation
algorithms to review curricula and competencies. The algorithms are reviewed
by faculty to confirm and are at an 80% accuracy rate right now—and that will
continue to improve. The ability to use advanced technology will help us start to
harmonize towards a more common competency language since we, as humans,
cannot connect the 1000+ frameworks that exist without technology’s help.

Amber Garrison Duncan, Ph.D., Strategy Director, Lumina Foundation


* The T3 Innovation Network is an initiative of the U.S. Chamber of Commerce
Foundation for exploring emerging technologies and standards in the talent
marketplace to better align student, workforce, and credentialing data.

a subject (the person doing the activity), a verb (what the person is doing), and
a direct object (what the activity is being done to or with); optionally, other
elements that describe the performance context can also be incorporated. The
resulting dataset tells the story of a person performing an activity. Examples
include “Mike posted a photo to his album” or “Emily shared a video.” In
most cases, these components will be explicit, but they may also be implied.

Within the future learning ecosystem, activity streams need to capture what
individuals do, which learning activities they perform, and how they perform.
Each entry in the stream should be timestamped, meaning that a learner can
have progress measured as a function of time, not simply a function of state.
The goal of activity streams is to provide data (and metadata) about activities
in rich, human-friendly formats that are also machine-processable and exten-
sible. This interaction data will need to be published by any activity a learner
engages with. In some instances, data might be generated by a learner’s per-
formance, and, in other cases, a system might generate data based on system
events or key milestones achieved by a learner. Alternatively, data may be

generated to establish the context of the learner, the application, or other com-
ponents within the learning ecosystem.

The subject of an activity is nearly always the learner but could, foreseeably,
be an instructor, cohort, or other human or machine agent. The direct object of
an activity depends on its context, as do the verbs (although to a lesser extent).
Universal terms, particularly verbs, will need to use a common vocabulary
across systems, otherwise the data will lack semantic interoperability and lose
much of its utility. By formalizing a common vocabulary, activities can ref-
erence an established set of attributes along with rules for how the dataset is
stored and retrieved by components in the learning ecosystem.
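Pulling these pieces together, the sketch below shows the general shape of
such a statement, patterned after the Experience API (xAPI) convention of
identifying verbs and activities by URL so that every system interprets them
the same way. The particular names and addresses here are illustrative:

    import json

    # Activity-stream statement: actor, verb, object, timestamp.
    # The identifiers below are illustrative; real systems resolve verbs
    # and activities against shared, published vocabularies.
    statement = {
        "actor": {"name": "Emily", "mbox": "mailto:emily@example.org"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/shared",
            "display": {"en-US": "shared"},
        },
        "object": {
            "id": "http://example.org/videos/intro-to-lxd",
            "definition": {"name": {"en-US": "Intro to LXD (video)"}},
        },
        "timestamp": "2019-05-01T14:32:00Z",  # progress as a function of time
    }

    print(json.dumps(statement, indent=2))

Because the verb is a resolvable identifier rather than a free-text word, two
systems that have never coordinated directly can still agree on what
“shared” means.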

Universal Learner Profiles


A common place to aggregate and visualize learners’ data

The current way learner records are managed is insufficient for the evolving
needs of instructors, learners, and organizations. Today, a transcript is typically
used to record learners’ permanent academic records. Transcripts usually list
the courses taken, grades received, honors achieved, and degrees conferred
from a formal academic institution. Only this most basic of information
follows individuals across their different learning episodes. Teachers and
trainers have little visibility into individuals’ past performance, such as what
other instructors have noted about them, the informal or nonformal learning
they’ve experienced, or their strengths, weaknesses, and individual needs.

In the future, transcripts—or “learner profiles”—will need to expand to
incorporate a broader range of credentials, micro-credentials, and other learn-
ing activity information alongside today’s formal learning information.
also need to become more dynamic, shifting away from being static records
and instead acting as dynamic tools that learners and organizations can use to
determine learners’ unique paths to achieving proficiency in all their desired
competencies.
Society expects us to be innovative.

It’s imperative that we evolve because changes are happening


whether we lead them or not. The demands society places on
innovation mean that we’ve got to stop looking through the
lens of today and start looking through the lens of tomorrow
with a vision for K-12. Our kids today will be the workers,
leaders, academics, or soldiers of tomorrow.

So, the questions to ask are: How can we use technology to


help us pedagogically? Can we conduct formative rather than
just summative assessments of individual aspects of learning
that ultimately enable us to give learners a better education
than they’ve ever had?

Our academic standards are now in a machine-readable format


and we can do true gap analyses to make inferences to inform
teachers’ decisions and also save untold billions of dollars.
Information rich micro-credentials, such as badges, support
measurable progression, process, and evidence of learning.
Using xAPI that records these steps builds a documentation
of learning that lives beyond the institutional level. It supports
lifelong talent-management and allows our systems to be
seamlessly aligned across time and communities. We need
to ensure that the same measurement we use is both useful
today and understandable to the next community.

Learning is truly measurable.

Keith Osburn, Ed.D.


Associate Superintendent
Georgia Virtual Learning
Georgia Department of Education

…it’s more about the person, not the technology.

Emily Musil Church, Ph.D.
Executive Director of Global Learning, Prize Development and Execution, XPRIZE

Learner profiles have the potential to empower personalized learning within
the future learning ecosystem through better data that can inform learning in
new and meaningful ways. As envisioned, a learner profile is analogous to a
mashup of information about a learner, populated from various sources and
consisting of both explicit and derived data. A future learner profile might
include a broad range of information, such as demographic data, data about a
person’s interests and preferences, and existing competencies and those that
need to be developed (in the personal, academic, and career arenas). It also
might include information about someone’s learning strengths, needs, and the
types of learning interventions that have been successful in the past. We use
the term “universal” when describing the learner profile, because we envision
data from multiple systems flowing into a shared representation. Further, as
a learner’s interests change, or as he or she becomes competent in new areas,
the profile would continually update to reflect the latest “state” across time.
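No standard schema for universal learner profiles exists yet, so the sketch
below is only one plausible shape for such a mashup; every field name is
hypothetical. It simply shows the explicit and derived data sources described
above flowing into a single, continually updated representation.

    # Hypothetical sketch of a universal learner profile. No standard
    # schema exists for this yet; all field names are illustrative.
    learner_profile = {
        "learner_id": "urn:example:learner:7f3a",
        "demographics": {"preferred_language": "en-US"},
        "interests": ["cybersecurity", "data visualization"],
        "competencies": {
            "achieved": {"org.example:cyber.phishing-recognition": "proficient"},
            "targeted": ["org.example:cyber.password-hygiene"],
        },
        "learning_preferences": {
            "effective_interventions": ["scenario-based practice"],
        },
        "history": [],  # populated from activity streams across systems
    }

    def apply_statement(profile, statement):
        # Append new activity evidence; derived fields (e.g., competency
        # states) would be recomputed from the accumulated history here.
        profile["history"].append(statement)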

Safeguarding learner data to preserve privacy is an important legal and ethi-


cal consideration. We also imagine that individuals will need to control their
own data; so, we anticipate that individuals would have access to obtain,
share, and interact with these artifacts as well as to control which other
people, organizations, or applications can access them.

Activity Registry
Arrays of diverse learning activities

An activity registry is an approach to capturing, connecting, and sharing data about available learning resources. Unlike federated repositories, search engines, or portals, activity registries form a Resource Distribution Network with open APIs that anyone can use to register, expose, or consume learning resources and information about how those resources are used. Key features include the ability to generate and manage content metadata (data about the publisher, location, content area, standards alignment, ratings, reviews, and more), manage taxonomies and ontologies, manage the alignment of content with competencies, generate and manage paradata (data about the metadata, such as resource usage, comments, rankings, and ratings), perform semantic search services, and create machine-actionable metadata for AI-based recommenders.

We have 11 missions in the Coast Guard and every one of them is diverse, with different stakeholders for each. Our borders, to include our waterways, are very important, and if we were to go to war, we'd be supporting the Navy. So it's important that we're connecting and maintaining readiness; yet, we're bounded by many different DHS and DoD policies. It's critical that we can be disruptive thinkers about training, and it's even more critical that we can interoperate.

Gladys Brignoni, Ph.D.
Deputy Commander, Force Readiness Command, and the U.S. Coast Guard's Chief Learning Officer

An activity registry houses metadata, paradata, assertions, analytical data, identities, and reputations that flow through the distribution network. An activity registry will also contain access information and permissions for different learners. The activity registry requires a trusted relationship with different learning-related activities as well as other essential services, such as launch and discovery. We imagine that any of the communities or organizations that consume a learning resource will also capture information about how those resources are used, such as their context, user feedback, rankings, ratings, and annotations, and these paradata might be incorporated into the activity registries. We imagine that such usage data and third-party analytical data could become valuable for resource discovery and for understanding which learning resources are most effective.
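As a rough illustration of the open-API idea, the sketch below registers a resource and later attaches consumer paradata to it. The endpoint paths, field names, response shape, and token are hypothetical assumptions; an actual registry would define its own API and authentication.

```python
# Hypothetical sketch: registering (exposing) a learning resource and attaching
# paradata via an activity registry's open API. Not a real service.
import requests

REGISTRY = "https://registry.example.org/api/v1"   # assumed base URL
headers = {"Authorization": "Bearer <token>"}       # assumed auth scheme

# Register a resource with its descriptive metadata.
resource = {
    "title": "Intro to Risk Management",
    "publisher": "Example University",
    "location": "https://content.example.org/risk101",
    "competencies": ["comp:risk-management"],
    "standards_alignment": ["LRMI"],
}
resp = requests.post(f"{REGISTRY}/resources", json=resource, headers=headers)
resource_id = resp.json()["id"]   # response shape is an assumption

# Consumers later contribute paradata: usage, ratings, comments.
paradata = {"uses": 412, "avg_rating": 4.6,
            "comment": "Worked well for onboarding cohorts."}
requests.post(f"{REGISTRY}/resources/{resource_id}/paradata",
              json=paradata, headers=headers)
```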

Learning Content Metadata
Data that describe learning resources

To effectively enable activity registries, the resources they point to will need to be described in some manner. Such descriptions are encoded as metadata. In training and education, many different metadata formats have already been explored, including Learning Object Metadata (LOM; IEEE 1484.12.1), which is commonly used with SCORM-managed content, the Dublin Core Metadata Initiative, and the Learning Resource Metadata Initiative (LRMI).5

LRMI is a particularly common metadata framework, used for describing learning resources in web-based instruction. LRMI was adopted by Schema.org 6 in April 2013, which allows anyone who publishes or curates educational content to use LRMI markup to provide rich, education-specific metadata about their resources with the confidence that this metadata will be recognized by major search engines. Founded by Google, Microsoft, Yahoo, and Yandex, Schema.org's vocabularies are developed by an open community process with a mission to create, maintain, and promote schemas for structured data on the internet, including on web pages, in email messages, and beyond. LRMI's adoption into Schema.org provides many benefits. In theory, nearly any Schema.org "thing" could be defined as a learning resource. Therefore, LRMI addresses those metadata properties that distinguish content when it's deliberately used for learning. This was done by adding learning-resource properties to key root types. For example, LRMI extends the CreativeWork type, which includes descriptors such as Educational Use, Educational Alignment, and Course,7 the latter of which is defined as a sequence of one or more educational events and/or other types of CreativeWork that aims to build knowledge, competence, or ability of learners.
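For instance, LRMI markup for a hypothetical resource might look like the following, shown here as a Python dictionary serialized to JSON-LD. The property names (learningResourceType, educationalUse, educationalAlignment, AlignmentObject) are Schema.org/LRMI terms; the resource itself and its alignment target are invented for illustration.

```python
import json

# Minimal LRMI/Schema.org markup for a hypothetical learning resource,
# serialized to JSON-LD for embedding in a web page.
lrmi_markup = {
    "@context": "https://schema.org",
    "@type": "CreativeWork",
    "name": "Fractions Practice Module",
    "learningResourceType": "exercise",
    "educationalUse": "practice",
    "educationalAlignment": {
        "@type": "AlignmentObject",
        "alignmentType": "teaches",
        "educationalFramework": "Common Core State Standards",
        "targetName": "CCSS.Math.Content.4.NF.A.1",
    },
}
print(json.dumps(lrmi_markup, indent=2))
```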

Talent Management
Bridging education, training, and workforce silos

The preceding subsections highlighted the interoperability afforded by technical standards. We've primarily discussed these standards within the context of training and education, but they also apply to workforce activities. The worlds of human resources, training, and education have never been more closely linked. Organizations, employees, departments, data, customers, and partners can no longer function successfully in their own silos. As mentioned above, today's training and education systems are often disconnected from one another; further, they're rarely interoperable with internal or external HR systems. This results in incomplete or duplicative data, inefficient or inaccurate reporting, complex and costly vendor management, and inefficient, manual HR transactional processing.8 Standards and specifications that allow these disparate systems to communicate have the potential to assist organizations of all sizes to improve performance and workforce satisfaction.

The systems around talent management need to work seamlessly. Within the future learning ecosystem, employees' digital records will include data from various stages of their careers related to recruitment, training and development, and performance management. Many new standards relevant to business-crucial areas are actively being developed through the International Organization for Standardization (ISO), addressing compliance and ethics, workforce costs, diversity, leadership, occupational health and safety, organizational culture, productivity, recruitment, mobility and turnover, skills and capabilities, succession planning, and workforce availability. All these areas contain specific metrics and reporting recommendations. Creating systems that combine these workforce data with other training and education information will enable the advancement of evidence-based human capital management policies and provide access to lifecycle data for transaction processing. It will also provide the data needed for workforce planning and strategic decision making.

Talent, too often, is treated as an afterthought. With increasing retirements and a fluid workforce, organizations are finding it more difficult to manage the end-to-end employee data lifecycle due to duplicative HR IT systems across agencies that are unable to interface and exchange data. The different systems in use today are like different countries—with distinct languages, customs, and religions. They use diverse data formats, and moving data between them is difficult; when done, it's often accomplished in nonstandard ways. To improve the interoperability of HR systems, the different applications need a common record that covers all aspects of the employee lifecycle, from hire to retire, for each person. Additionally, to achieve greater synergy within an organization and to drive human capital performance across the breadth and depth of organizational competencies, organizations must shift from ad hoc to strategic talent management programs.

These enhancements to workforce HR systems will benefit learning institutions, as well. Experts commonly agree that most learning takes place on the job.9 Hands-on experience allows individuals to refine their job skills, make decisions, address challenges, and interact with others in the organization. They also learn from their mistakes and receive feedback on their performance, and they may engage in coaching, mentoring, collaborative learning, and other forms of social learning. Rarely (if ever) are these informal learning experiences tracked. By understanding how and when these types of learning take place, we can construct more robust profiles of individuals, whether to inform their learning journeys or to increase the collective intelligence of their organizations.

The capacity of an organization to innovate, change, and become more effective depends on employees' capabilities, thus highlighting the importance of developing those individuals.10 However, just as we need better indicators for undergraduate performance, we need better measures of performance in the workplace. The competitive nature of the global economy and the world stage increases the need to focus on the human capital supply chain that organizations employ. While this concept is attractive to organizations, there are ongoing challenges for its implementation. Workforce planning requires authoritative data for proper modeling and predictive analytics. Recruitment requires integration with onboarding and performance data to improve hiring strategies. By enabling a common language that the different systems can read from and write to, we're able to identify hidden dependencies and relationships within an organization and provide other analytics that help organizations make better and faster data-driven decisions.

In America, companies are struggling to close a skills gap, which is negatively impacting their ability to compete and grow in a global economy. The U.S. Chamber of Commerce Foundation's Talent Pipeline Management initiative is exploring how employers can close the skills gap by improving how they communicate or "signal" their hiring requirements. Through this work, they're creating the Job Data Exchange, a set of open data resources, algorithms, and reference applications that employers and HR technology partners can use to improve how competency-based hiring requirements are defined, validated, and communicated. This provides a critical linkage between job performance data, credentialing systems, and learning record systems.11

IMPLEMENTATION STRATEGIES
The future learning ecosystem promotes an increasingly complex world of interconnected information systems and devices. The promise of these new applications stems from their ability to create, collect, transmit, process, and archive information on a massive scale. However, the vast increase in the quantity of personal information being collected and retained, combined with our increased ability to analyze it and combine it with other information, creates valid concerns about managing these volumes of data responsibly. There is an urgent need to strengthen the underlying systems, component products, and services that make learning data meaningful. The following subsections outline a foundation for an enterprise-wide learning ecosystem that can adapt and grow with the needs of the organization.

1. Identify and describe organizational competencies

Organizations need to inventory the skills required to successfully perform all business functions within their institutions. These include technical, professional, and leadership capabilities across numerous departments, divisions, or lines of business. Each role within the organization will typically include a career trajectory with an accompanying learning trajectory for the knowledge, skills, attitudes, and other contributing factors an employee needs to do their job effectively. Different roles may share the same competency but have different contexts for how that competency is performed or a weighting for how much it impacts a particular job. A competency framework provides the common reference model across HR, training, and education systems, and the critical indicators associated with competencies within it help quantify individuals' performance. As new tools, technologies, and processes transition into the work environment, competency models will need to be continually updated to remain effective.
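A minimal sketch of what such a framework entry might look like, under wholly hypothetical identifiers and weights, appears below; the point is that one competency definition can be referenced by multiple roles, each with role-specific context and weighting.

```python
# Illustrative competency framework entry (all names and numbers hypothetical):
# two roles share the same competency but weight and contextualize it differently.
competency = {
    "id": "comp:data-analysis",
    "name": "Data Analysis",
    "indicators": [   # critical indicators used to quantify performance
        "selects appropriate analytic method",
        "communicates findings to stakeholders",
    ],
}

roles = {
    "hr-analyst":     {"competency": "comp:data-analysis", "weight": 0.8,
                       "context": "workforce planning reports"},
    "safety-officer": {"competency": "comp:data-analysis", "weight": 0.3,
                       "context": "incident trend reviews"},
}
```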

2. Formulate a data strategy

The current landscape of disparate learning and personnel systems will continue to evolve for the foreseeable future. A cohesive data strategy needs to be implemented to help identify all of the relevant data types required to support the human capital supply chain, to define the relevance of different data types over time, to identify an approach for capturing how the importance of different data types decays, and to identify authoritative sources for generating each data type. An effective data labeling strategy will enable automation, increased analytics, and an associated lifecycle for how long the different data elements remain relevant. Data labeling attaches meaning to the different types of data, correlated to the different systems that generate the data across the lifelong learning continuum. This allows all systems in the learning ecosystem to use the data as needed, such as to adaptively tailor learning to individuals. Patterns of data should also be explored to derive additional insights at institutional levels. Consider both structured and unstructured data that may be generated in different areas, and develop clustering strategies for organizing the different data types so that all components have access to the data they need.
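As one hedged illustration of data labeling, the sketch below attaches a source, a relevance lifecycle, and handling flags to two hypothetical data types; the half-life decay rule is a toy stand-in for whatever relevance model an organization actually adopts.

```python
# Hypothetical data-labeling entries: each label ties a data type to its
# authoritative source, a relevance lifecycle, and handling flags.
data_labels = [
    {"type": "course_completion", "source": "lms.example.org",
     "relevance_half_life_days": 1825,   # slow decay: meaningful for years
     "pii": False, "structured": True},
    {"type": "chat_transcript", "source": "collab.example.org",
     "relevance_half_life_days": 90,     # fast decay: context-bound
     "pii": True, "structured": False},
]

def is_relevant(label, age_days, threshold=0.5):
    """Toy decay rule: relevance halves every half-life; drop below threshold."""
    weight = 0.5 ** (age_days / label["relevance_half_life_days"])
    return weight >= threshold
```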

3. Define standards, specifications, and vocabularies

The more we can formalize the requirements for standards, specifications, and shared vocabularies used across the nation, the easier it will be to integrate components into the ecosystem. While there are many automated technologies that can semantically align different terms, there are benefits to designing systems that use shared vocabularies to describe learning activities, digital content, learners, and competencies. Activity tracking across learning activities also works best when each activity uses a common library of terms for different instructional modalities and media types, and when data can roll up to other systems in the organization (e.g., talent management).

4. Define the governance strategy

Organizations need to be responsive and proactive in recruiting, educating, and preparing their existing workforce for the future. The knowledge and skills required to be successful today will change, and new tools, technologies, and methodologies will migrate their way into the organization. Emphasis should be placed on the protection of personally identifiable information, protected intellectual property, and other proprietary organizational data. As new systems come online, all aspects of the data strategy, competency framework, and human capital supply chain will need to be revised. Workforce planning strategies should tie into the lifecycle management of these critical components. Governance should also be addressed in the data strategy so that specific indicators and outcomes can be tracked, measured, and analyzed.

As we think about creating an interoperable system across DoD, my initial thoughts are just how big the problem is. With four different Services, how is it that you generate buy-in from each—who for a long time have been doing their own thing? I think the first question is: What are the digestible parts that would have commonality across the Services? If you can determine that, then how do you get buy-in to the digestible parts? Because in the end we all want to know how we're going to do it better, faster, cheaper; this is a problem of all the Services.

Thomas Baptiste
Lieutenant General, U.S. Air Force (Ret.)
President and CEO, the National Center for Simulation

These four steps provide a strategic framework from which a learning ecosystem can be built. These aren't trivial tasks, and they will be implemented differently in each organization, depending on its size, complexity, and goals. Collectively, these steps allow organizations to embrace the future learning ecosystem concept and to benefit from the rich data it will produce, allowing businesses to maximize their workforces and learning-delivery organizations to optimize and manage the quality of the training and education experiences they offer.
In thinking about risks associated with the learner data needed to power personalized adaptive learning, privacy and security are clearly at the top of my mind. But we need to expand the set of values beyond those two in determining if the use of student data is responsible/ethical. There's value in advancing knowledge, ensuring students are successful, and promoting the development of practices that have the potential to affect a lot of people. This multi-faceted approach isn't new: A lot of these values are considered in the context of human-subjects research reviews. It's important for the academic community to have a similar process for considering such a range of values in evaluating our practice, in addition to our research.

Martin Kurzweil, J.D.
Director, Educational Transformation Program, Ithaka S+R

CHAPTER 7

DATA SECURITY
J.M. Pelletier, Ph.D.

Data breaches, like traffic accidents, are inevitable. Yet progressing, as a nation, toward a digitized learning ecosystem is also a requirement. Accordingly, this chapter describes the ways we can be proactive in simultaneously managing the likelihood, impact, and potential contagion of breaches across learner data systems. An effective learning architecture requires security to preserve privacy, to prevent cheating by the individual, and to prevent intrusion by external threat actors. Balanced effort is therefore required across the three pillars of security: confidentiality, integrity, and accessibility. While most security investigations focus on confidentiality and integrity, it is access to data that enables timely and well-informed decisions. Further, users are highly likely to invalidate security controls if accessibility is inadequate. All of these concerns can be addressed by hardening devices and networks in a way that places users at the center of each improvement. To do this efficiently, data security design should enable individuals and organizations to limit the spread of breaches within current and future learning architectures. Thus, this chapter describes principles and strategies that will allow distributed learning environments to keep pace with developments in cybersecurity.

Data Security Threats and Challenges

Several issues must be addressed as we progress to a nation that embraces the accumulation of data. We must recognize both those elements that protect the human as well as the need to protect the system and the integrity of the data. In America, an individual's privacy is a fundamental right, and security preserves the dignity that privacy allows. Yet a subset of learners will struggle with integrity; learners cheat by bypassing access controls to steal answers or alter grades. Also, foreign adversaries constantly use cyber means in their attempts to assess national capabilities and influence organizational priorities. Finally, regardless of resource investment, we're seeing consistent increases in both the impact and probability of breaches.

You can't hold firewalls and intrusion detection systems accountable. You can only hold people accountable.

Daryl White
Chief Information Officer, Department of Interior
…as quoted in the Information Security Management Handbook, Sixth Edition, V7

AMATEUR THREATS

The most immediate concern is that malicious software is becoming increasingly automated. Learners need little technical ability to steal answers and change grades. There are many thousands of free, step-by-step tutorials that walk would-be attackers through the process of conducting the best-known technical penetrations.1

FOREIGN THREATS

A broader challenge exists in relation to foreign adversaries. As advanced persistent threats become increasingly capable, the former cornerstones of security are quickly becoming obsolete. This is especially true as our nation races to keep pace with adversarial advances in quantum and classical supercomputing capabilities.2 Any widespread use of data management systems should be resilient to known attack methods and provably secure against cryptographic brute force, side channel, and intercept attacks.

SOCIAL ENGINEERING

The value of access to an organization's learning architecture shouldn't be underestimated. Expert-level social engineering results in manipulation of the behavior of entire societies. There's a broad and deep body of knowledge in the use of deception and propaganda to control populations. This happens on an individual level through confidence artistry and is scalable to any number of persons using similar techniques. Centuries of Russian military thought and government experimentation in social engineering provide us with the domain of маскировка (mask-ee-rove-ka), which roughly translates to "masquerade," "disguise," or "deception," and includes the concept of reflective control. Reflective control is the art of strategically injecting (usually truthful) information to cause a person or society to freely choose the actions most beneficial to the other party. The select injection of truth can manipulate perception. Further, a single, well-designed falsehood within a trusted environment has disproportionate network effects. Just as a person can be manipulated to act on behalf of another person's interest, so too can organizations. At a societal level, this shapes political will and, ultimately, public policy.

INVESTMENT MODELS

Recent research in information security economics has attempted to build models that help evaluate optimal levels of information security investment. These generally apply risk management strategies to calculate an optimization function associated with expected monetary loss, assessed vulnerability, and likelihood of a breach. Some of these models consider breach contagion effects, but they aren't prescriptive in suggesting how the economically optimal funding amounts should be invested.3

There seems to be consensus among economists and cybersecurity experts that the only solution is to spend more money on any sort of solution that can lock down the data. That way of thinking about security is analogous to allocating enormous resources to make every car into a tank!

SUMMARY OF CURRENT BEST PRACTICES
Consequently, cyberspace has become an asymmetric battlefield, upon which attackers operate at a disproportionate cost advantage and seek to win through attrition. While these problems may seem intractable, there are specific best practices that can preserve the confidentiality, integrity, and accessibility of distributed learning architectures without exorbitant expense. The single most critical practice for security is regular examination of standards, requirements, protocols, and implementations. Effective cybersecurity requires extensive review of technology specifics, which is far beyond the scope of this document. Instead of an exhaustive review, we consider here a few extant vulnerabilities within the current distributed learning protocols. The goal is two-fold: first, to support immediate improvement and, second, to support ongoing sustainment in security that will result in cost-efficient reliability across distributed learning architectures. The implementation plan at the conclusion of this chapter recommends a process for further review and hands-on validation, which will yield a rank-ordered task list following a structured risk management process.

Future Learning Ecosystem Implementation Layers

Data security is a mature, albeit continuously evolving, field relative to IT in general. Its best practices translate to distributed learning architectures, which build upon common operating systems, servers, and network technologies. However, the future learning ecosystem will also need unique interoperability data formats, transport layers, interfaces, and storage solutions. Two examples of these are xAPI and Kafka.

EXPERIENCE APPLICATION PROGRAMMING INTERFACE

xAPI is an example of an interoperability specification that enables data collection about a wide range of online and offline learning experiences. It provides a standardized data structure and vocabulary for data captured across a variety of learning technologies. xAPI is designed for simplicity and flexibility, and it provides a basis for communicating and evaluating learning throughout the future learning ecosystem. A non-exhaustive list of application areas includes: real-world activities, experiential learning, social learning, simulations, mobile learning, virtual worlds, and serious games.

Systems conformant to the xAPI specification record interaction data, such as between people and learning content. These interactions can occur anywhere and often signal the potential for learning. The recording process involves the transmission of statements to a Learning Record Store (LRS), which is part of the xAPI technical specification. Each LRS can then share the recorded xAPI statements with other LRSs and across a range of other learning technologies (as access controls permit).
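For example, a minimal xAPI statement follows the spec's actor/verb/object structure and is posted to an LRS's statements endpoint. In this sketch, the verb URI comes from the standard ADL vocabulary, while the actor, activity, LRS address, and credentials are hypothetical placeholders.

```python
import requests

# A minimal xAPI statement: "<actor> completed <activity>".
statement = {
    "actor": {"objectType": "Agent", "name": "Jordan Learner",
              "mbox": "mailto:jordan@example.org"},          # hypothetical actor
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},             # standard ADL verb
    "object": {"id": "https://content.example.org/risk101",  # hypothetical activity
               "definition": {"name": {"en-US": "Intro to Risk Management"}}},
}

requests.post(
    "https://lrs.example.org/xapi/statements",   # hypothetical LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3",
             "Content-Type": "application/json"},
    auth=("lrs_user", "lrs_password"),            # LRSs commonly use HTTP Basic auth
)
```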

The xAPI specification provides this interoperability through a series of implementation layers:4

• Layer One: Improves upon previous (SCORM) tracking by adding new capabilities with current best practices
• Layer Two: Records any learning experience, including informal learning
• Layer Three: Designs data to flow seamlessly across applications regardless of semantics
• Layer Four: Correlates training data with broader job performance metrics

Security within and across each of these layers must allow for consistently reliable application without exposing organizations to unnecessary information risk. This is especially important as data become increasingly standardized across the wide range of learning interactions tracked by xAPI. Any security evaluation starts with an assessment of each of the controls currently in place. A preliminary analysis reveals several vulnerabilities that require immediate consideration.

KAFKA

Apache Kafka is an example of a message-oriented middleware system that can process learning record changes at a massive scale. It was developed as the message collection and analysis mechanism at LinkedIn and is probably best known for processing very high volumes of messages, at variable rates, in real time.5 Its features include partitioning, replication, and fault tolerance, which make it ideal for distributed messaging of big data. Generally, it is a unified platform that allows for reliable, asynchronous message exchange.
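As a hedged sketch of how learning record changes might flow through Kafka, the snippet below uses the open-source kafka-python client; the broker address, topic name, and event shape are illustrative assumptions.

```python
from kafka import KafkaProducer   # kafka-python client
import json

# Sketch: streaming learner-record changes through Kafka. JSON serialization
# is one common choice; Kafka itself is payload-agnostic.
producer = KafkaProducer(
    bootstrap_servers="broker.example.org:9092",          # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"learner_id": "a1b2c3d4", "verb": "completed",
         "activity": "https://content.example.org/risk101"}
producer.send("learner-record-changes", value=event)  # async; batched by the client
producer.flush()   # block until queued messages are delivered
```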

ACCESS IS KEY. Our biggest issue right now in readiness: How do we get training to the point of need? Access is really the thing we have to focus on. Our networks are very secure, but they're very slow and performance is lackluster. And then we have BYOD (bring your own device), but not all Marines have tablets and computers though they all have phones. So the real question is: "What is the balance between access and security?" I feel like I'm always fighting against the network guys—how do we get both yeses? Is the mission goal security or learning?

Larry Smith
Technical Director
U.S. Marine Corps College of Distance Education and Training

Other examples of message-based middleware that can work for learning processing are based on the Advanced Message Queuing Protocol 1.0, an international standard (ISO/IEC 19464) with several implementation options that are optimized for smaller systems. Some of these options include ActiveMQ, Apache Qpid, and RabbitMQ.6

VISION FOR DATA SECURITY IN THE FUTURE LEARNING ECOSYSTEM
As standards and best practices change, so should the implementations within the distributed learning architecture. This suggests the need for a theoretical orientation that allows for a practical, continuous evaluation of learner data security.

Nothing humans make is impregnable, yet data sharing is also unavoidable. In the realm of distributed learning, this means we should acknowledge that we can neither completely eliminate the risk of a student hacking a data stream to get examination answers, nor entirely prevent sophisticated attacks from taking or changing information. However, we must find ways to (a) reduce the likelihood of successful attacks and (b) develop barriers to reduce their impact if penetration occurs. The core tenet of normal accident theory is that technological failures are inevitable when a system is complex, tightly coupled, and has catastrophic potential.7

Thus, we must consider the potential issues that can result from failures within interdependencies among complex systems involving identification, access control, authorization, auditing, network segmentation and boundary enforcement, endpoint protections, encryption, and transaction security. The assumptions of normal accident theory include:

• Humans cause errors;
• Small accidents tend to escalate into big ones; and
• The organization of technology—not technology itself—usually causes problems.8

Current research in normal accident theory and organizational reliability suggests we should design strategies that handle breaches as inevitable and aim to prevent their spread.9 Despite our best efforts, systems within every distributed learning architecture have been, or eventually will become, compromised; so, our goal is to minimize that impact. In practical terms for distributed learning, it's recommended to maintain segregation of data lakes through network and organizational segmentation. Each department or agency should maintain separate content networks and, within them, build compartmented sections for each learner type. This will simultaneously control the spread of breaches and preserve data integrity by creating a content blockchain. Further, centrally managed syndication and subscription to content will help preserve confidentiality by avoiding aggregation that can reveal organizational priorities and strategic aims. A few specific security concerns are addressed in the remainder of this section to ensure network and endpoint protections, in the near term, and security sustainability, in the long run.

The most common application of accident theory is on roadways, where we assume there is no set of systems that will entirely prevent all traffic accidents. Instead, safety mechanisms like seat belts and bumpers limit the impact of each accident, and medians and shoulders on high-speed corridors provide spacing that prevents the tragedy of fatal collisions from becoming multi-car catastrophes.

Hardening Networks

The highly technical nature of a firm's information storage and retrieval systems makes the Intrusion Detection System (IDS) and Intrusion Prevention System (IPS) useful components for breach identification. While most intrusion detection and intrusion prevention systems monitor network traffic, host-based anomaly detection can reveal and report unauthorized attempts to access examination answers or to manipulate grades. There are also several commercially available Security Information and Event Management (SIEM) tools, which explicitly monitor network logs and data flows for indicators of compromise. The inclusion of these tools is likely to significantly increase awareness of security compromises, reduce detection timelines, and inform organizational needs for response. For distributed learning, data streams should be designed as one-way valves. Data lakes should be tightly patrolled with a SIEM and organizational Security Operations Centers (SOCs), which monitor the SIEM data and conduct live response around the clock. Several Managed Security Services Providers (MSSPs) provide SOC capabilities for organizations that are too small to maintain their own defenses.

CROSS-LEVELING STATE-OF-THE-ART SECURITY

A more extensive review of the xAPI and Kafka standards, in light of the Kerberos protocol, is likely to yield an elegant alternative to the current security schema. Furthermore, the integration of a robust security layer within the API can provide abstraction that simplifies the instantiation of authentication mechanisms across content providers and distributed learning hosts.

Kerberos was developed as the network authentication protocol for on-campus communications at the Massachusetts Institute of Technology. Its main strength is that it's designed to be secure even when performed over an insecure network. More specifically, passwords never transit the network during the session authentication process. Each transmission is encrypted using a secret key, and attackers can't gain unauthorized access to a service without compromising an encryption key or breaking the underlying encryption algorithm. It's designed to protect against replay attacks, where an attacker eavesdrops and retransmits legitimate communications. Further, the protocol uses symmetric key encryption, which makes it computationally efficient at the device level, and thereby suitable for use on resource-constrained devices. The use of symmetric key encryption also provides resilience to potential compromises of the Certificate Authority within a Public Key Infrastructure. Finally, Kerberos has a widely available open-source implementation, which facilitates non-proprietary integration into government-owned systems.10

EXAMPLE – PRELIMINARY VULNERABILITY ANALYSIS
The most significant residual risks associated with Kerberos occur when endpoints are compromised. If the Authentication Server is compromised, attackers can generate a validly encrypted Ticket Granting Ticket. If the Ticket Granting Server is compromised, attackers can configure it to ignore the initial authentication to the domain controller, as well as obviate the service prescription. That allows the attacker to generate tickets for any service, not just those that would normally be defined by the Authentication Server, but it cannot authenticate new users to the domain or allow offline password cracking. If the Service Server is compromised, there is no fraudulent ticket generation, but it can bypass the need for a client to have a ticket at all.
Note: The Golden Ticket Attack grants tickets and persistent access to any service for 10 years, but can be prevented with a relatively simple network security setting.
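The toy sketch below illustrates the core symmetric-key idea (tickets sealed under keys the client cannot read) in a drastically simplified form. It is not the actual Kerberos protocol; there is no ticket-granting exchange, realm, or timestamp validation here, just a demonstration of why no password needs to cross the network.

```python
from cryptography.fernet import Fernet
import json, time

# Toy illustration only (not real Kerberos). The KDC shares one symmetric key
# with the user and another with the service; only encrypted tickets travel.
user_key = Fernet.generate_key()     # real Kerberos derives this from the password
service_key = Fernet.generate_key()  # known only to the KDC and the service

def kdc_issue_ticket(user_id, service_id):
    session_key = Fernet.generate_key()
    # The ticket is sealed under the *service's* key; the client can't alter it.
    ticket = Fernet(service_key).encrypt(json.dumps(
        {"user": user_id, "service": service_id,
         "session_key": session_key.decode(), "issued": time.time()}).encode())
    # The session key for the client is sealed under the *user's* key.
    sealed_for_client = Fernet(user_key).encrypt(session_key)
    return ticket, sealed_for_client

def service_accept(ticket):
    # The service decrypts with its own key; a valid ticket proves KDC approval.
    claims = json.loads(Fernet(service_key).decrypt(ticket))
    return claims["user"], claims["session_key"]

ticket, sealed_session_key = kdc_issue_ticket("jordan", "grades-api")
print(service_accept(ticket)[0])   # -> 'jordan', verified without any password exchange
```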

HARDENING DEVICES

The best way to harden a device is to take an offensive mindset to determine the ways an attacker might attempt to compromise that system. Through attack/defend exercises, defenders can learn which vulnerabilities result in the most critical exploits and investigate ways to remedy security deficiencies. Regular penetration tests will reveal vulnerabilities on each device and across networks. Across any distributed learning architecture, this is most important for those devices related to access control for a personalized learning data store network (public key repositories, domain controllers, certificate authorities, and authentication servers) and for evaluation materials (content repositories that include answer keys).

SOCIAL HARDENING: DEVELOPING RESILIENCY

Learning management is integral to the concept of social hardening. To economize network and device hardening efforts, an explicit evaluation of human behaviors within the organization, along with associated training interventions, is needed. From a security perspective, social hardening is an opportunity to develop organizational resiliency because humans start to learn the "why" behind the design of technical controls and how they can prevent and contain data breaches. The most important component of social hardening, though, is institutional retention that creates a culture of best practices.

EXAMPLE – SOCIAL HARDENING ACTIVITY
Purple team exercises are one mechanism for social hardening. They teach network defenders what attacks look like on their own network. These testing and training events involve live attack/defend scenarios for IT staff and a cross section of their managerial hierarchy. A team of penetration testers (red team) openly engage in attacks while the threat-hunting defenders (blue team) try to spot and deny those attacks in real time. These scenarios can take place on the organization's environment, in a virtual replica of that environment, or within a simulated disaster scenario on a near-neighbor environment. Purple team scenarios can also involve sets of novice and expert end-users, who are valuable for considering and evaluating impacts based on their experience. The incorporation of end-users can provide insight into how, when, and why users may attempt to bypass security controls. In practice, these exercises reduce the time it takes to detect attacks, test organizational response procedures, discover previously hidden vulnerabilities, and ultimately result in a superior organizational security posture. Purple teaming exercises often work best when performed by internal staff and facilitated by an independent third party. When necessary, external penetration testing can be a strong substitute if internal red teams are unavailable.

Implementation Recommendations

Broadly speaking, the plan for implementing security within the future learning ecosystem should include four phases. Some form of this plan is likely the most rapid and cost-effective way to improve cybersecurity capabilities in an extensible and forward-leaning manner.

PHASE 1: DEFINE SECURITY REQUIREMENTS. The need for security is evident, but which standards to mandate is less clear. Security requirements engineering is a first step in this process because it offers a disciplined look at the interoperability needs across the system-of-systems. This step will identify and stress-test the various security procedures already in place, validate which portions of those protections are inadequate, and conduct collaborative attack/defend exercises with current content providers to validate findings. This will culminate in an economized application of available resources, focused on prioritizing the most likely, critical, and impactful security improvements.

EXAMPLE – IMPLEMENTATION ACTIVITIES
Phase 1 is likely to include a series of purple team exercises. These involve a team of penetration testers (red team) who openly engage in attacks, while the threat-hunting defenders (blue team) try to spot and deny those attacks in real time. These scenarios create local, contextualized learning because they generally take place on the organization's actual environment or in a virtual replica of that environment.
During Phase 2, the process of local learning that occurred during Phase 1's purple team exercises should be transformed into multimodal instructional content. This should involve building the notes and findings from Phase 1 into case studies that educate the wider community. For example, these might include lectures, online labs, and evaluation materials specifically designed to teach learning technologists about the threats and cybersecurity protocols of their own organizations.

PHASE 2: DESIGN, IMPLEMENT, AND EVALUATE SECURITY LEARNING ACTIVITIES. Learning new processes is an optimal way to project continuous improvement into the future. During the execution of Phase 1, there's an opportunity to monitor and evaluate the practices of security requirements engineering. These evaluations can yield individual and organizational learning activities that are derived from use cases within actual distributed learning architectures. Further, integrating Phase 1 and 2 schedules and teams is highly likely to generate cost-efficiency. These learning activities will build understanding regarding the specific processes for security engineering within distributed learning environments, which is highly likely to yield future security improvement across multiple generations of the technology itself.

PHASE 3: DRAFT SECURITY POLICIES AND STANDARDS. In addition to mandating specific security protocols and acceptable technologies in the short term, there's an opportunity to inculcate a long-term process focus within security policies and standards. For example, requiring a third party to conduct an annual security vulnerability assessment is common in the military (e.g., Army FM 3-19.30.2) and has been adopted within the financial industry (e.g., 23 NYCRR 500). An organization's draft security policies and standards should help integrate both product- and process-focused needs into an attempt at establishing lasting security across its learning ecosystem.

► Networks – Network hardening should be the first step in securing learner data. This can take several forms, though it could involve an initial round of vulnerability testing, development, and deployment of a Kerberos-inspired alternative relevant to the learning ecosystem's data formats, repositories, and transportation layers (such as those defined by the xAPI and Kafka standards). This is especially promising given the potential for applying normal accident theory to achieve a high-reliability learner data schema that hardens networks first and devices later.

► Devices – Device hardening is likely to present challenges due to the disparate nature of machines seeking to read, write, and execute the files associated with learner data. Subsequently, this step consists of a systematic review of individual agency standards and will recommend a viable minimum standard for device connectivity.

► Humans – Social hardening is a difficult challenge, especially for technology-focused personnel. Evaluating and improving the human component in data security requires an understanding of both human behavior and technology to define policies and standards that shape behaviors and deny human-enabled cyber-attack vectors. Careful review of existing personnel security standards, such as the Army's Threat Awareness and Reporting Program (AR 381-12), is likely to yield a series of best practices for securing the human element in distributed learning architectures.

PHASE 4: PREPARE EXPECTATIONS AND MANAGE RISK. No security plan can completely eliminate risk. The accelerated pace of technological change makes this especially true for systems that aggregate, store, and process data. The final phase of this plan explicitly examines the risks, controls, and residual risks associated with current security findings in light of expected future technologies. The result of Phase 4 should include an assessment of when the policies and standards drafted in Phase 3 may require updates. Major deliverables should include a list of assumptions, findings, and indicators/warnings of disruptive impact on the analyses conducted in this plan.

CHAPTER 8

LEARNER PRIVACY
Bart P. Knijnenburg, Ph.D. and Elaine M. Raybourn, Ph.D.1

Privacy is particularly important for distributed learning systems because managing learners' trust among disparate sources is like managing the privacy of apps on one's phone—a difficult task that arguably becomes even more pertinent when it deals with sensitive learning data. In particular, some systems explicitly consider every activity of their users as a potential learning activity, thereby playing into people's tendencies to learn and train not just in the classroom but also in natural settings.

Modern digital learning systems employ ubiquitous data collection to enable highly personalized and pervasive learning recommendations. Going beyond a fixed, one-size-fits-all curriculum of activities, these systems track learners' progress in minute detail and tailor subsequent learning activities to their performance. While this helps tremendously in achieving highly efficient learning practices, the data collection and user modeling practices employed by such systems may cause privacy threats that act as a barrier to their adoption. As users' trust in personalization providers is starting to fail, it's crucial to investigate the privacy implications of such data collection and learner modeling practices.

Social networking capabilities, often featured in learning systems, may also introduce privacy considerations that inhibit adoption. Users have expressed severe privacy concerns with social networks, yet they tend to struggle with managing their privacy on these networks. Hence, it's important to provide thoughtfully designed privacy management mechanisms in learning applications.

PRIVACY IN THE FUTURE LEARNING ECOSYSTEM
In many existing learning systems, privacy controls are an afterthought—a series of privacy settings accompanied by a complicated privacy policy. In contrast, the future learning ecosystem should employ the philosophy of privacy by design 2 to allow developers and researchers of such systems to select the characteristics that best alleviate users' concerns. Moreover, the implementation of user-tailored privacy will allow systems to model learners' privacy concerns and provide them with adaptive privacy decision support.3

While this may nominally lengthen the development cycle, it prevents a situation where the system has numerous complex privacy settings and a complicated privacy policy that learners are unable to navigate—or worse yet: no privacy protections at all.

Data Collection

Many types of data might be available through a digital learning system, including learner runtime activity, competencies, and context. Such data can be collected anonymously or identifiably connected to a learner's profile. The data collection practices of a digital learning application can have unique privacy implications depending on the type of data collected, its source, and its potential identifiability. This section discusses how to consider those aspects when defining and developing the data collection practices of a digital learning application.

PRIVACY DECISION-MAKING

Arguably the most important advice for developers of distributed learning systems is to study the privacy concerns and practices of the (potential) users of those systems. One of the most consistent findings in privacy research is that people vary extensively in their information disclosure practices.4 Generally, users of digital systems acknowledge the benefit of data collection for personalization, but when taken too far, the same data collection can deter users from using the system extensively or even dissuade them from using the system at all.5 The point where this happens differs per user. Understanding how different learners make privacy-related decisions can inform strategies that help alleviate these issues.

An often-used conceptualization of people's conscious information disclosure decisions is the "privacy calculus," which suggests people make privacy decisions by balancing the perceived risks and perceived benefits of the available choice options. Therefore, it's important that digital learning systems highlight the relevance of a requested disclosure behavior, and that they refrain from asking for information in situations where relevance is not readily apparent.

Research has also demonstrated that user trust has a significant influence on disclosure behavior in digital systems.6 Therefore, building trust is an important strategy for increasing acceptance of the data collection and tracking practices employed by modern digital learning systems. Trust can be built by ensuring learning applications originate from trustworthy sources, and by employing sensible, transparent data collection practices from the outset.

However, people aren’t always rational in their privacy decision-making:


when they make “heuristic” privacy decisions, they don’t carefully weigh
risks and benefits; instead, they rely on superficial but easily accessible cues,
such as website reputation, ostensible privacy guarantees, and design quality.
Digital learning systems should survey their users to learn more about the
heuristic decision processes that may negatively affect disclosure. Moreover,
146 | Modernizing Learning

they should tailor to learners’ heuristic privacy decision-marking processes


by giving them sensible default settings and providing both rational (e.g., pri-
vacy policy) and heuristic (e.g., privacy certifications or seals) sources of trust.
Learners with low levels of motivation (privacy concerns) and/or low self-ef-
ficacy (privacy literacy) are more likely to make heuristic privacy decisions.
If rational privacy decision-making is required, digital learning systems can
attempt to instill motivation and ability in their users by providing contextu-
alized privacy controls and easy-to-understand privacy information, such as
instructions designed as cartoons or comic strips.7

COMMUNICATION STYLE

Privacy in digital learning systems extends beyond personalization; it's also relevant to the interpersonal ("social networking") aspects of these systems. Social networks typically provide a plethora of mechanisms to manage one's privacy beyond disclosure, and research finds that users tend to employ a wide variety of strategies to limit their disclosure, such as the six privacy management strategies uncovered by Pamela Wisniewski and her colleagues 8 (see Figure 8-1). These archetypes arguably extend to other social network-based systems, including social learning platforms and other applications or features that leverage social networks in learning systems.

Internet users also choose their social network based on their preferred communication style. Research 9 suggests services that broadcast implicit social signals (e.g., location-sharing social networks) are predominantly used by "FYI (For Your Information) Communicators," who prefer to keep in touch with others through posting and reading status updates. They tend to benefit from the implicit social interaction mechanisms provided by broadcast-based social network systems. People who are not FYI Communicators, on the other hand, would rather call others, or otherwise interact with them in a more direct manner. They tend to benefit more from systems that promote more direct interaction. In order to tailor to both types of communicators, digital learning systems should employ both automatic social-network-style sharing (for FYI Communicators) and direct, chat-style interaction (for non-FYI Communicators). Further, since the communication styles of FYI and non-FYI Communicators are at odds, developers should also pay attention to the effects of integrating different communication styles within a single application.

Digital learning systems that employ or implement social network components should tailor their privacy functionality to different privacy management styles.

LEVELS OF IDENTIFIABILITY

The use and sharing of learners' personally identifiable information (PII) deserves special attention, because it presents the risk of revealing the identity of learners to other parties. PII can be defined as any information that could be used on its own or in combination with other details to identify, contact, or locate a person, or to identify a person in context. The privacy concerns associated with PII can be mitigated by allowing users of a digital learning system to remain fully anonymous.

Fully anonymous interaction means that there are no persistent identifiers associated with the user. This is difficult to accomplish in digital learning systems, though, since most learning activities follow a trajectory over multiple interactions, which means that the system must be able to recognize the learner across these interactions. More realistically, users can be allowed to interact with the digital learning system under a pseudonym. The effectiveness of pseudonyms and other means of de-identifying personal data has been called into question, however, since such data may still be at risk of being re-identified, especially in digital learning systems that collect data with high dimensionality and sparsity.10 Regardless, researchers have argued that de-identification of server data is still a good security practice, as it would take considerable effort to re-identify all users if the server is compromised.

Figure 8-1: Privacy Management Archetypes (derived from work by Wisniewski and colleagues). People tend to use various privacy management strategies (such as limiting access control, blocking apps and events, restricting chat, blocking people, altering news feeds, reputation management, friend list management, withholding contact info, withholding basic info, selective sharing, and timeline and wall moderation) to greater or lesser extents. The six archetypes are:

• Privacy Maximizer: Highest level of privacy behavior across the majority of privacy features
• Selective Sharer: Leverages more advanced privacy settings
• Privacy Balancer: Moderate levels of privacy management
• Time Saver: Uses strategies to be a passive consumer, not bothered by others
• Self-Censor: Censors by withholding basic and contact information
• Privacy Minimalist: Lowest level of privacy modification behavior
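One common de-identification tactic consistent with this advice is keyed-hash pseudonymization: identifiers stay linkable across interactions (supporting learning trajectories), but re-identification requires compromising the secret key as well as the data. The sketch below, with a hypothetical key and record shape, shows the idea.

```python
import hmac, hashlib

# Sketch: keyed-hash pseudonymization of learner identifiers. The key should
# be stored separately from the data store it protects (value is hypothetical).
PSEUDONYM_KEY = b"store-this-secret-outside-the-data-lake"

def pseudonymize(learner_id: str) -> str:
    # Same input always yields the same pseudonym, so records stay linkable,
    # but reversing the mapping requires the key.
    return hmac.new(PSEUDONYM_KEY, learner_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"learner": pseudonymize("jordan@example.org"),
          "activity": "quiz-42", "score": 0.87}
```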

COLLECTING LEARNER DATA

Digital learning systems have an opportunity to collect a wide array of data about their users. Long-term persistent data tracking allows learning systems to personalize learning and uncover useful insights about the learner base. However, each type of data also has unique privacy implications that must be considered. At the most granular level, digital learning systems can collect "learner runtime activity"—users' step-by-step actions that can be used to track users' progress and to adapt the learning experience to their specific abilities, knowledge, and pace.

Continuous tracking may create a digital panopticon that restricts user freedom. Therefore, users should be given easy-to-use notice and control mechanisms to manage the boundary between leisure and learning. Additionally, users' runtime activity should be carefully protected through a combination of strict access control, de-identification, obfuscation, encryption, and/or client-side personalization (see later sections).

INFERENCES

Learners' privacy concerns can also be impacted by the inferences made about them by the digital learning system. Users of personalized systems are negatively impacted when these systems make incorrect inferences about them. Even when inferences are correct, they may not always be wanted by the learner or be in their best interest. For example, research has shown that people are intuitively uncomfortable with the idea that sites track their data,11 which may reduce their trust and negatively affect their disclosure behavior. Theories and recommendations for self-regulated learning practices should be incorporated into the trust-building requirements during the development phase.

panopticon / pan·op·ti·con / noun – a circular prison design, built for surveillance, so that all (pan-) inmates could be observed (-opticon) by a single watchman at all times—and so inmates knew they were always being watched.

OUTPUT MODALITIES AND DEVICES

Future digital learning systems are envisioned to be pervasive, multi-device experiences that might include smartphones, smart TVs, e-books, smart watches, and a multitude of other devices. These devices each present unique privacy considerations. Personal devices, such as smartphones and wearables, are ideal for real-time learning but can also create distractions. Therefore, learning experiences on such devices should be structured such that they don't disturb learners or reveal information about them in uncontrolled ways (such as a push reminder displayed as a popup while projecting to a group from a smartphone). Strategies for achieving this include planning notifications carefully, avoiding interruption of a learner's current task, and adapting notification timing to the learner's context. Devices that are shared by multiple people (e.g., smart TVs) should also avoid leaking personal information in social settings. To do so, notifications on such devices should provide generic recommendations that mask details unless they are requested by the learner.

DATA LOCATION AND OWNERSHIP

A typical reason for integrating learning experiences in a distributed platform is to provide recommendation and adaptation capabilities across these learning experiences. This requires the implementation of data collection and storage facilities, communication channels, and adaptation capabilities. In most systems, these components will be centralized, so building trust between the learner and these components is extremely important. This can be done by putting these components under the control of a trusted, local entity, such as a single department or organization. However, this may also shield the components from important insights that can be gained from data collected across instances, and it may make the mobility of learner data more cumbersome.

Instead, one could build a learning platform where all users, departments, and organizations share the same centralized components. However, a single entity that collects the data of all users creates an attractive target for hackers.12 A good trade-off is therefore to put these components at a level that is "low" enough for learners to trust but high enough to allow efficient mobility and user-modeling synergies. In other words, data/insight mobility problems can be reduced through portability requirements and standardized APIs.

Another question is how each learning application on the platform can access learners' data. Since users are likely to trust different applications to different extents, an access control mechanism is needed to allow applications to optimally utilize the learners' data while at the same time respecting each learner's privacy preferences. A recent development in adaptive systems is to perform the calculations required to compute adaptations "client-side" rather than on a centralized server. Research shows that such client-side methods alleviate privacy concerns.13 However, client-side adaptation methods can only use limited inference methods (e.g., if-then rules, simple classification), and research has shown that users are concerned that their data can be hacked if their device is stolen, and that their user model is lost forever in case they lose or break their device.

We think there's a big opportunity to open that data up to an ecosystem concept. For example, predictive analytics can help identify who will do poorly or who will do well in courses…but should we show that to students? Will we create a self-fulfilling prophecy? It's important to consider the possible unethical deployment of this. The way to avoid it is to use a governance system to manage the data systems and be thoughtful about this.

Phill Miller, Chief Learning and Innovation Officer, Blackboard

Given these considerations and limitations, we suggest a three-tier data management and personalization approach: On the first tier, learner competency data is used by the platform to decide what learning applications to recommend to the user (meta-adaptation). On the second tier, individual applications can use similar data—albeit with regulated access control—to make app-level adaptations (macro-adaptation). Finally, on the third tier, client-side mechanisms can use fine-grained learner runtime data and behavioral tracking to make subtle adjustments to the learning experience (micro-adaptation).
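To make the three tiers concrete, here is a minimal Python sketch of how the responsibilities might be divided. Everything in it, including the class names, fields, and thresholds, is invented for illustration and is not drawn from any particular platform.

# A minimal sketch of the three-tier model. All names, fields, and
# thresholds are illustrative, not taken from a real platform.

class PlatformTier:
    """Tier 1 (meta-adaptation): recommends learning apps from competency data."""
    def recommend_apps(self, competencies, catalog):
        # Suggest apps that target the learner's three weakest competencies
        weakest = sorted(competencies, key=competencies.get)[:3]
        return [app for app in catalog if app["skill"] in weakest]

class AppTier:
    """Tier 2 (macro-adaptation): app-level adaptation under access control."""
    def __init__(self, granted_scopes):
        self.granted_scopes = granted_scopes   # set by the learner's privacy prefs

    def select_module(self, learner_record):
        if "competency" not in self.granted_scopes:
            return "default_module"            # degrade gracefully without data
        level = learner_record.get("competency", 0.0)
        return "advanced_module" if level > 0.7 else "practice_module"

class ClientTier:
    """Tier 3 (micro-adaptation): fine-grained runtime data stays on-device."""
    def adjust_hints(self, seconds_idle):
        # Simple if-then rule computed client-side; raw events never leave device
        return "show_hint" if seconds_idle > 30 else "wait"

platform = PlatformTier()
apps = platform.recommend_apps({"writing": 0.4, "math": 0.9},
                               [{"name": "EssayCoach", "skill": "writing"}])
print(apps, AppTier({"competency"}).select_module({"competency": 0.8}),
      ClientTier().adjust_hints(45))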

DATA OWNERSHIP AND STEWARDSHIP

The end-user license agreement of most modern online services claims full
ownership over the personal information they collect about their users. The
legality of this claim is questionable though: The legal concept of “owning
information” is still new, and laws are still being written about this topic.
Moreover, preliminary investigations among users show that there are merits
in granting end-users ownership of their personal information, and it may ex-
pedite the movement of data among different digital learning systems. How-
ever, data ownership is not exclusive, and it may be desirable to give other
entities (e.g., applications, employers, researchers) partial co-ownership over
an individual’s data. These co-owners should request minimal amounts of
data, avoid duplicate storage, and de-identify data where feasible.

Data ownership puts an important responsibility on the shoulders of the learners. It allows them to play an active role in making sharing decisions about their data, but not all users may be motivated and capable of taking on this responsibility.

In the 401(k) model, learners formally own the data, but they can partially delegate the responsibility of making decisions regarding their data to a fiduciary, such as a teacher or administrator. As a "data steward," this fiduciary would then be allowed to make decisions on the learner's behalf; although, there should be a strict policy that outlines the limits of these powers. This policy can outline several practices that are always allowed, never allowed, or require the explicit consent of the user. In the latter case, such consent should not just be a notice with an option to "opt out." Rather, it should ask the user to formally opt in to the proposed practice—this makes it more likely that learners will make an informed consent decision.

Structure data ownership like a 401(k)

Finally, when more than one party has a say over the disclosure and use of cer-
tain data, Private Equality Testing can be used to create a Two-Person Con-
cept solution (a concept proposed by U.S. Air Force Instruction 91-104 [16])
that prevents any single person from intentionally or unintentionally leaking
data or becoming victimized by extortion or social engineering attacks.
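To illustrate the idea (though not the cryptography), here is a toy Python sketch of a two-person check built from keyed hash commitments. A real Private Equality Testing protocol would rely on stronger cryptographic machinery; the decision strings, salt handling, and workflow below are all simplified assumptions.

import hashlib
import hmac
import secrets

def commit(decision: str, shared_salt: bytes) -> str:
    """Each steward commits to a decision without transmitting it in the clear."""
    return hmac.new(shared_salt, decision.encode(), hashlib.sha256).hexdigest()

salt = secrets.token_bytes(16)   # agreed upon out-of-band by the two stewards

# Both stewards independently decide whether to release the learner's data.
commitment_a = commit("approve:release-transcript", salt)
commitment_b = commit("approve:release-transcript", salt)

# Comparing commitments reveals only whether the decisions match, so no
# single person can release the data alone, by accident or under coercion.
if hmac.compare_digest(commitment_a, commitment_b):
    print("Both stewards agree; the disclosure may proceed.")
else:
    print("No consensus; the data stays protected.")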

Data Sharing

Data collected in digital learning systems can be used for purposes outside the system. One such purpose is to make the data available to the learners themselves, which allows quantified self–like innovations. Beyond this, learning systems can allow learning materials, activities, and outcomes to be shared with fellow learners (enabling social learning experiences), researchers (catalyzing learning innovation), and employers (informing organizational decision-making). This section covers the privacy-related consequences of the social, academic, and organizational use of data collected and generated by digital learning systems.

QUANTIFIED SELF

By sharing learner data with the learners themselves, digital learning systems can create a "quantified self" experience that allows individuals to gain insights into their own data. For example, carefully constructed personalized infographics can allow individuals to explore the common and unique sides of their identities.14 Such insights are an important reason for many people to accept the potential privacy intrusions that come with wearable technologies and constant tracking. As such, the quantified self can be a motivating factor behind the data collection efforts of a digital learning system. Also, the quantified self can be a catalyst for learning. Translating self-tracked parameters into a game-like structure can create new motivational and heutagogical support structures that encourage and enable users to push themselves further.

SOCIAL LEARNING EXPERIENCES

Sharing learner data across learning environments can, in some cases, be considered a violation of regulations, such as the Family Educational Rights and Privacy Act or General Data Protection Regulation. Hence, care should be taken that the learner (not the system) makes the decision to disclose such information. Even learners willing to share might not want to share with all of their contacts, because those contacts could be bothered by an overload of social activity.15 As such, users should be allowed to select a subset of their contacts for sharing purposes, and the learning system can actively help them in this process.

RESEARCH AND ORGANIZATIONAL DECISION-MAKING

Learning data can also be used for research and organizational decision-making. Privacy experts argue that secondary use of information should be explicitly communicated to users; otherwise, they may be surprised to find out about it and feel that their privacy is violated.16 Moreover, there are laws and regulations surrounding research and employment-related practices that need to be adhered to. For example, whereas employment discrimination is illegal, algorithmic decisions have been shown to incorporate unwanted biases. Therefore, ethical considerations need to be made before using machine judgment for, e.g., promotion decisions.

Privacy Support Mechanisms

Several techniques for privacy support can be implemented in digital learning systems. This final section discusses their benefits and shortcomings.

PRIVACY NOTICES

Online privacy policies are often written in a legalistic, confusing manner and require a collegiate reading level to understand. Indeed, while many people claim to read online privacy policies, many don't actually review them or don't read closely enough to understand them.17 A lot of work has therefore gone into summarizing privacy statements, but summarized privacy notices are often too simplistic to accurately represent the policies they reflect.18 One approach is to add textured agreements, which add layers of emphasis to make the text more readable,19 but these have been shown to increase (rather than decrease) the amount of time people spend reading the agreements. Although the consensus is that people should be informed about the privacy decisions they are asked to make, the reality is that doing so often makes them more fearful or unwilling to come to a decision. The conclusion, then: It's better not to rely on any privacy notices, but to instead make the privacy decisions themselves simpler.

CONTROL MECHANISMS

Simple privacy controls can help users take control over their privacy settings. For example, in social sharing settings, recipients can be grouped to simplify the decision landscape, and graphical representations of the control matrix can help users understand and manage their sharing patterns. Selective information sharing is just one of many strategies users may employ to alleviate privacy tensions. Likewise, privacy control can be provided in more diverse and intuitive ways than a traditional "sharing matrix" in which users specify who gets to see what. Research has found that it's important to give users the privacy features they want, lest they experience reduced connectedness and miss out on social capital.20

Unfortunately, while users claim to want full control over their data, they often avoid the hassle of actually exploiting this control.21

In combination with overly permissive defaults, users' avoidance of control mechanisms leads to a predominance of over-sharing. In order to facilitate control, digital learning systems should use smart default settings and make the available controls as simple as possible.
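As a small illustration of grouped recipients plus smart defaults, the Python below organizes contacts into "circles" and shares nothing until the learner flips a single, understandable switch. The circle names and members are invented for the example.

# A minimal sharing-matrix sketch: recipient "circles" with conservative
# (non-sharing) defaults. Circle names and members are illustrative.

circles = {
    "mentors":  {"alice", "bob"},
    "cohort":   {"carol", "dan", "erin"},
    "employer": {"hr_system"},
}

# Smart default: share with no circle until the learner opts one in.
sharing_matrix = {circle: False for circle in circles}
sharing_matrix["mentors"] = True   # one simple, explicit decision

def can_view(recipient):
    """True if the recipient belongs to any circle the learner has opened."""
    return any(enabled and recipient in circles[name]
               for name, enabled in sharing_matrix.items())

print(can_view("alice"))   # True  - the mentors circle is enabled
print(can_view("carol"))   # False - the cohort circle stays private by default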

PRIVACY NUDGING

Nudges are subtle yet persuasive cues that make people more likely to decide in one direction or the other. An example of a privacy nudge is a justification that makes it easier to rationalize a privacy decision. Justifications include providing a reason for requesting the information, highlighting the benefits of disclosure, appealing to the social norm, or providing a symbolic character to represent the trustworthiness of a recipient (e.g., a "privacy seal"). Another approach to nudging users' privacy decisions is to provide sensible default settings, which tend to nudge users in the direction of that default.

The privacy nudges evaluated to date usually only work for some users, however, and they leave others unaffected or even dissatisfied. Some researchers argue that this is because nudges take a "one-size-fits-all" approach to privacy.22 Since such nudges are rarely good for everyone, they may actually threaten consumer autonomy. It's therefore best to only use nudges if there's consensus among learners about privacy. In these situations, nudges apply privacy by default but give learners a choice in the event that they want a different setting after all.
Figure 8-2: A schematic overview of user-tailored privacy. [The figure shows a measure, model, adapt pipeline: data about user characteristics, user behaviors, recipients, and other factors feed a user privacy model, which, within organizational constraints and practices, drives defaults, recommendations, and justifications.]

User-tailored privacy aims to strike the balance between giving learners no control over or information about their privacy at all versus giving them full control and an overload of information about it.

TWO EXAMPLES TO HELP ILLUSTRATE THE USER-TAILORED PRIVACY CONCEPT

1. A digital learning system normally tracks users' location (Data) in order to give context-relevant training exercises (Organizational practice). However, user-tailored privacy knows that, like many young mothers (User characteristic), Mary (User) does not want her location (Data) tracked outside work hours (Other factor). It therefore turns the location tracker off by default when Mary is not on the clock (Default).

2. David needs to decide how to share his recent milestones—two certificates he's just earned (Data)—within his organization (Recipient). Due to the rules of his employer (Organizational constraint), user-tailored privacy requires him to share these milestones with his direct supervisor (Recipient). Moreover, from his previous interactions (User behaviors), user-tailored privacy knows that David keeps close ties to several other divisions. User-tailored privacy therefore suggests (Recommendation) that he share his new certifications with the heads of these divisions (Recipient) as well, explaining that they're likely to be interested in exploiting his newly gained skills (Justification).

USER-TAILORED PRIVACY

User-tailored privacy is a novel means to support users' privacy decision-making practices.23 A user-tailored privacy–based system first measures users' privacy-related characteristics and behaviors, then uses these as input to model their privacy preferences, and finally adapts the system's privacy settings to those preferences (see Figure 8-2).

The first step to user-tailored privacy is to measure learners' privacy-related characteristics and behaviors. To accomplish this step, learning system developers should acknowledge the plurality and multi-dimensionality of users' decision-making practices. They should also note the variability of learners' privacy practices, although these can often be captured by a concise set of "privacy profiles," and, similarly, the potential data recipients can often be organized into a number of groups or "circles."

The next step is to model privacy. This can be done in a way that matches the
learners’ current privacy practices; however, in some cases, it may be better
to suggest privacy practices that are complementary to their current practices,
and in still other cases, it may be best to completely move beyond learners’
current practices. The model can also take the practices and constraints of
users’ organizations into account. Finally, using this user model, user-tailored
privacy can personalize the privacy settings of a digital learning application
as well as the justifications it gives for requesting certain information, its pri-
vacy-setting interface, and its learning recommendation practices.

Arguably, user-tailored privacy relieves some of the burden of the privacy decision from a learner by providing the right privacy-related information and the right amount of privacy control, without being overwhelming or misleading.24
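A rough Python sketch of that measure, model, adapt loop appears below. The behavioral signal, profile names, thresholds, and default settings are all invented for illustration; a deployed system would learn these from real interaction data.

# Measure -> model -> adapt, in miniature. Profiles, thresholds, and
# defaults below are illustrative placeholders only.

def measure(decisions):
    """Measure: what fraction of sharing requests did the learner accept?"""
    accepted = sum(1 for d in decisions if d == "share")
    return accepted / max(len(decisions), 1)

def model(share_rate):
    """Model: map observed behavior onto a coarse privacy profile."""
    if share_rate > 0.7:
        return "privacy_minimalist"
    if share_rate > 0.3:
        return "privacy_balancer"
    return "privacy_maximizer"

def adapt(profile):
    """Adapt: pick defaults and justifications that fit the profile."""
    settings = {
        "privacy_minimalist": {"share_by_default": True,  "show_justifications": False},
        "privacy_balancer":   {"share_by_default": False, "show_justifications": True},
        "privacy_maximizer":  {"share_by_default": False, "show_justifications": True},
    }
    return settings[profile]

observed = ["share", "share", "decline", "share"]  # measured user behavior
print(adapt(model(measure(observed))))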

Implementation Recommendations

We recommend several steps in the development process that will both build intuitive privacy controls into the design of the learning ecosystem and create privacy-sensitive recommender agents to guide learners.

1. DECISION-MAKING

Build trust: Ensure that the learning applications originate from trustworthy sources. Employ sensible data collection practices and a privacy-by-design philosophy from the outset. Finally, provide contextualized privacy control mechanisms and easy-to-understand privacy information.

2. COMMUNICATION STYLE

Tailor to different privacy management strategies: Give Selective Sharers the ability to selectively expose data to specific apps and groups of people. Allow Self-Censors to use non-personalized mechanisms for selecting learning material and to restrict their forms of sharing. Allow Time Savers to opt out of active notifications and social features. Give Privacy Maximizers all of the functionality; Privacy Balancers mechanisms for curation, blocking, and avoiding direct interaction; and Privacy Minimalists adaptive and social functionalities within the ecosystem.
U.S. FEDERAL AVIATION ADMINISTRATION

"What we found when we studied the FAA is that the lines between training and operations are blurring. …Aircraft have sensors with analytics; so, they can make profiles and tell if pilots do something unsafe. It allows the FAA to look into a program to provide information back to pilots. But the pilots, being union-driven and structured, said "No, you can't watch us!" So, they made a union the go-between guardian for that data. This way, if there is an issue, there are a series of approvals and guardians of the data, so that the pilot can't be punitively damaged but can be informed." – Michael Smith, Senior Technical Specialist, ICF

3. LEVELS OF IDENTIFIABILITY

Devise appropriate levels of identifiability: Use, but do not rely on, de-identification for privacy purposes, while allowing creative and (self-)evaluative environments to use pseudonymity. Formal and diplomatic settings should enforce a real-name policy.

4. COLLECTION OF DATA TYPES

Protect learner runtime activity: Reduce unfettered context tracking to prevent the creation of a digital panopticon, and provide easy-to-use notice and control mechanisms to manage the boundary between leisure and learning. Protect learner runtime activity using access control, encryption, de-identification, and obfuscation, and, where possible, process and use learner runtime activity data locally.

5. OUTPUT MODALITIES AND DEVICES

Don’t disturb the user:


user Plan notifications carefully and provide easy controls
for notification urgency. Adapt notification timing to the learner’s context.

Prevent leaking personal information in social settings:


settings Provide generic no-
tifications that do not reveal (potentially sensitive) details and change the
amount of information provided in each notification depending on the number
of people who are near the learner.

6. MANAGE ADAPTATIONS

Implement the centralized components of learning platforms at the appropriate level: Put centralized learning components under the auspices of a trusted entity and support the portability of learning models. Allow for interoperability of learning applications through standardized APIs.

Regulate access of individual learning applications to the centrally collected data: Allow learning applications to do their own adaptations, and put access control mechanisms in place to regulate the use of centrally collected data.

Use client-side micro-adaptation: Collect and analyze learner runtime data in client-side applications. Prevent the unnecessary storage of this data, and handle it in an ephemeral manner to prevent data loss or theft.

7. DATA OWNERSHIP AND STEWARDSHIP

Give learners ownership over their data: Allow learners to peruse their raw data and user models, and enable them to take their data with them across different learning institutions or employment organizations.

Give employers and learning applications limited co-ownership: Allow employers and learning applications to co-own appropriate data while requesting minimal amounts of data. Avoid duplicate storage and de-identify data.

Allow learners to designate a "data steward": Allow learners to delegate responsibilities to a "data steward" who manages their data under a fiduciary policy, and implement the Two-Person Concept using Private Equality Testing.

8. SOCIAL LEARNING EXPERIENCES

Give users control over what to share: Refrain from sharing any learning outcomes with others by default. Instead, require an explicit decision from learners before sharing learning outcomes with others. Allow learners to limit their connections to those they deem relevant for each application, and implement a "learning buddy" recommender.

9. RESEARCH AND ORGANIZATIONAL DECISION-MAKING

Let learners know about secondary data use: Communicate secondary data use practices to learners, and indicate exactly the data used and its purpose.

Act responsibly regarding research and organizational decisions: Anonymize research data, and make sure that promotion decisions are made in a non-discriminatory manner.

10. PRIVACY NOTICES

Increase the chance that learners read privacy notices: Use privacy nutrition labels to give learners a quick overview, and make notices textured to emphasize the details. Make the privacy notices attractive and approachable, such as by using comics, and similarly, make the decisions simpler—ideally, to the point that notices are no longer required.

11. CONTROL MECHANISMS

Use accessible, graphical privacy controls: Make controls obvious and easily accessible. Use graphical methods to give users easy-to-understand controls, beyond just information access. Use a privacy setting interface that works for everyone (where possible) and keep it simple.

12. PRIVACY NUDGING

Use nudges if there is a consensus: Use justifications and defaults when virtually all learners agree on the optimal privacy setting, and incorporate nudges to provide learners choice in case they want different settings.

13. USER-TAILORED PRIVACY

Employ user-tailored privacy to support learners' privacy decision-making practices: Measure learners' privacy preferences in context, exploiting their multi-dimensional nature. Carefully balance recommending current, complementary, or novel privacy practices, as well as proactive and conservative adaptation strategies.

CHAPTER 9

ANALYTICS AND
VISUALIZATION
Shelly Blake-Plock

Analytics and data visualization are now mainstream. The maturation of cloud services and the adoption of new web technologies have accelerated both fields. Among the most important innovations has been the development of new streaming data systems. These technologies can handle the exponentially increasing scales of data produced—not only by traditional web and social media technologies but also by machines and sensors deployed in cyber-physical systems such as consumer wearables, smart city implementations, and connected industrial devices.

This chapter summarizes the state-of-the-art in streaming data, learning analytics, and data visualization for non-technical readers. It provides context, lays out a vision, and offers high-level guidance on implementation approaches. The goal is to provide practical knowledge to teachers and trainers, business users, and programmatic decision-makers, helping them to envision how learning analytics and visualization can increase the capacity of learning organizations, as well as the general approach to implementing such systems.

What are we talking about?

There's so much data in the world. Each of us produces a cloud of "data exhaust" with every mouse click or upvote. Learners, too, generate masses of data—information that could inform education and training if we could access, analyze, and meaningfully visualize it. Two closely related fields—educational data mining and learning analytics—are providing tools to meet those goals.

The two fields differ slightly, for instance, in their origins, primary application areas, and preferred AI algorithms.1 Learning analytics grew out of efforts in the semantic web, and its practitioners tend to emphasize big-picture analyses and decision-support for teachers and learners. Educational data mining developed out of the adaptive instructional technologies tradition, and it tends to focus on automated adaptation and reductionist modeling.2 For our purposes in this chapter, we're less concerned with the finer details distinguishing the two disciplines. Instead, we're focused on their shared purpose: understanding and applying data-intensive approaches to education and training, particularly for large-scale learning data—so-called big learning data.3

As the phrase “big data” implies, training and education analytics often (but
not exclusively) employ machine learning techniques. Machine learning is a
subset of AI that uses algorithms to automatically uncover patterns in data to,
for instance, assign classifications, estimate the influence of different variables
on downstream outcomes, or make predictions based upon historical data. In
the training and education domain, these applications have notably matured
over the last 20 years, coalescing into the two communities mentioned above.

But what can you do with these tools? People have applied analytics to a
variety of learning systems. For instance, some applications use analytics to
predict engagement and then recommend personalized resources to encour-
age students’ participation.4 Others can analyze students’ interactions and
proactively alert instructors as to which may need help.5 One well-known
example, Purdue University’s Course Signals, used current data from an LMS
combined with historical data (such as course attendance and prior grades)
to forecast which students would fall behind in a course and then alert both
learners and their teachers about their risk levels.6 Other tools apply similar
retention management approaches across an entire student body, identifying
those at highest risk of dropping out—in time for the administration to intervene.7 Basically, any of the analytics applications we've come to expect of e-commerce systems, from time-sensitive personalized recommendations to system-wide trend analyses, can translate into analytics for learning.8

DIP YOUR TOE INTO THE STREAM
Streaming-data analytics is a uniquely exciting and freshly emerging subfield
within analytics. When we say streaming data, we’re generally talking about
a range of data types that are event-based and track some variety of activities,
whether human or machine in origin. The advent of streaming data has impacted the way we think about what data itself represents, as well as how it's leveraged to guide human insights or automated machine processes.

For instance, in the domain of sales and marketing, event-based data has in-
creased our capacity to understand the market and prospective customers. It
provides a window (for example, via analysis of social media streams) into
the story of the prospect’s journey, both as it relates directly and indirectly to
a product or service offering. In the entertainment industry, streaming data
informs recommendations of content, such as movies and television shows on
Netflix. In politics, streaming data helps analysts identify and capitalize on
public sentiment and social trends.

Data-stream architectures are contrasted against traditional batch-processing systems. Data streams are characterized by data moving at a high velocity. They also have strict constraints for processing the incoming data online, within limited amounts of memory and time, and they must always be ready to provide analytical predictions when queried.

Just as these technologies and data architectures have transformed business, entertainment, and politics, so too are they able to transform learning. In the learning space, the availability of activity-based data streams offers an opportunity to trace and understand learners' journeys. Analytics in the service of these streams of data can provide accessible, automated, and near real-time data visualizations, as well as trigger alerts and interventions based on key performance indicators. These journeys—which comprise learners' activity and behavior profiles—can be considered highly formative, quantifiable micro-assessments.

Digitizing the Analog World

We often see a desire to digitize the analog world. We wear digital watches
that resemble their windable cousins. We create “offices” in our computers,
mirroring the components of the physical workplace. In education, we digitize
chalkboards, loose leaf, and books. But the inclination to recreate the analog
world within the digital domain eventually confronts both the limits of ana-
log practice as well as the more esoteric surprises of what, when it works in
our favor, we call innovation. When we move from tangible “things,” such
as chalkboards and books, to conceptual practices and processes, such as as-
sessment, the situation gets particularly dicey. Esoteric and nuanced concepts
become oversimplified to the point of caricature. This leads to notions being
thrown around such as, AI will replace educators! or Automation could never
substitute for teachers!—arguments that tend to betray a misunderstanding of
both AI and teachers. However, in the world where access to learning is dis-
tributed across the internet, expansive in breadth and always available, there
are practical limits to the analog approach of teaching. While there’s little
danger that AI will “replace” human teachers, their role—and the way we im-
plement training and education, writ large—needs to evolve in collaboration
with evolving technologies.

In a world that needs learning at scale, the real conversation should be: how can AI serve the needs of teachers—and vice versa?

Data at Scale

Contrast the analog “data set” with the contemporary “data assets” created
by social media newsfeeds. These data assets support the creation of time
series–based behavioral profiles that hold the activity records, built up over
time, from users’ behaviors on social media platforms, including likes, com-
ments, shares, photo posts, video watches—all user actions. These become
part of the user’s behavioral profile, and, in turn, become nodes on a vast
social graph. Each node owns a narrative. That data asset is key to the social
media industry’s business model. It’s the aggregate of these profiles that cre-
ates the opportunity for more targeted advertising, and, at scale, it’s a most
impressive record of formative experiences—of individuals, yes, but more so
of vast aggregate populations.

For social media data assets, value isn't encapsulated in a single pinpoint score. It's not even found in the ability to estimate a single user's likelihood of accepting a given advertisement (although this certainly brings some benefit). Rather, or at least more importantly, value derives from the cumulative amalgamation of all these behavioral profiles. The power is in the aggregate. Only the scale of the aggregate provides the rich raw data necessary to uncover the array of patterns, categories of human interest, and shared narratives of human experience. It's a matter of scale. Similarly, the challenge streaming data poses to the traditional view of assessment comes down to a matter of scale. A gradebook at scale will never offer the insights into learning experiences that an activity feed at scale can provide. This isn't to denigrate gradebooks; rather, it's a reminder to recognize their functions and where their value lies.
Consider a typical gradebook full of letter grades and percentages. In
one sense, this table of letters and numbers offers a substantial bit of
information about how one student may have progressed over time or
how she compares to her peer group’s scores. But in another sense—in
the sense informed by a world of streaming data, where data convey a
narrative about students’ digital experiences—the gradebook tells us little
about what actually happened, how it was done, and what it suggests
about the learner. The gradebook, and the modes of assessment that inform it, are analog technologies. They're no worse than digital technologies merely because they're not computerized, but they are technologies reflecting an earlier paradigm—a paradigm ill-equipped to support learning at scale in a digitized, interconnected world.

Supporting Decision Making

Learning practitioners have long sought to increase their insights into forma-
tive development. For instance, teachers may subconsciously wonder, How
far along is each student in his or her learning journey? Unfortunately, dif-
ficulty in gathering the data points needed to make confident and continuous
formative appraisals makes the alternative—a big summative assessment—
seem like the only option. This can be understood as a scale problem. Yet, by
leveraging activity and event-based data in a manner similar to what social
media employs, we can create formative profiles of learners. These, in turn,
can empower (human) educators and trainers to make better decisions about
instruction and help them tailor guidance in ways that would otherwise be im-
possible. We can similarly empower learners, administrators, systems teams,
content and experience providers, and a whole host of constituents across the
learning ecosystem with information relevant to improving, and making more
meaningful, their own pieces of the puzzle.

The result of this merging of activity and event-based streaming data, along
with the subsequent human applications of the knowledge derived from it,
could offer a path towards something of a Golden Age for formative assess-
ment—but this Golden Age doesn’t stand a chance if either the technologies
or instructional strategies employed fail to attend to the matter of scale.

A challenge, therefore, is to reconceptualize assessment from the point of view of learning at scale, as opposed to its traditional analogs found in "un-scaled" contexts. Computational learning analytics are core to this conversation. Any notion of assessment in the digital world must consider the impacts of scalable, continuous, multifactor data. The future of assessment is analytics.

The time is ripe to investigate new models of assessment that take advantage of advancements in cloud services, streaming data architectures, APIs, and a new generation of web-based applications. By applying these tools to learning, we can surface meaningful patterns previously too obscure, if not overly complex, to act upon.

One key topic of focus for future learning is data analytics. We currently use very fanaticized or ritualized measures, like time on task or changes in knowledge in a single area. How do we get that mind reset to the galactic view of learning?

Elliot Masie, Founder, The MASIE Center

This prompts us to consider a wholly new human-machine model of assessment for the digital age, not simply a digitized version of analog assessment at scale. For example, it's routinely noted that automation can maximize the efficiency and timeliness of tactical learning interventions (e.g., micro- and macro-adaptations). However, automation can also help identify those interventions best addressed by a human—who, in a web-scale context, needn't be a single preassigned instructor. Rather, learners could be served by a distributed network of potential teachers and mentors, and based upon various automated analyses, the system could recommend the optimum (human) learning facilitators for different situations (including, potentially, the individual learners themselves). In this way, we enable widespread distribution, not just of individual instruction, but of the entire ecosystem—including its human capital.

This suggests a new paradigm for learning and assessment, one where
machines and humans complement one another—a symbiotic system.

In addition to automating the collection and analysis of data, it's possible to automate its visualization via learning analytics dashboards.9 The idea proposed here is to fully leverage activity and event-based data to provide 360° views of learners in real time.

These dashboards could readily visualize concepts such as:

• Frequency, time, and duration of individual, cohort, or global activities
• Frequency, time, and duration of engagement with specific content
• Outliers among actors or content, in terms of level or type of activity
• Relations between actors, such as shown by a directed network graph
• Individual or cohort performance aligned to KPIs or business goals
• Recommended interventions to support learners' progress
• Trends among content engagement activities and learning pathways
• Outliers among actors, in terms of similarity or dissimilarity of content usage, types of engagement, or times and durations compared to a cohort or global group
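As a rough sketch of how a few of these metrics might fall out of an activity stream, the Python below tallies session frequency and engagement minutes per learner and flags simple statistical outliers. The event tuples loosely mirror xAPI's actor/verb/object idea, but the fields, values, and thresholds are invented for the example.

from collections import defaultdict
from statistics import mean, pstdev

# Illustrative activity events: (actor, content item, minutes engaged)
events = [
    ("ana", "module-1", 12), ("ana", "module-2", 30),
    ("ben", "module-1", 5),  ("ben", "module-2", 6),
    ("cam", "module-1", 55),
]

# Frequency and duration of engagement, per learner
totals = defaultdict(lambda: {"sessions": 0, "minutes": 0})
for actor, _, minutes in events:
    totals[actor]["sessions"] += 1
    totals[actor]["minutes"] += minutes

# Flag outliers: here, anyone beyond one standard deviation of the mean
# (a deliberately loose threshold for this tiny example)
engagement = [t["minutes"] for t in totals.values()]
mu, sigma = mean(engagement), pstdev(engagement)
for actor, t in totals.items():
    flag = "  <- outlier" if sigma and abs(t["minutes"] - mu) > sigma else ""
    print(f"{actor}: {t['sessions']} sessions, {t['minutes']} min{flag}")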

Further, in future iterations—once enough relevant data points have accumulated—machine-learning algorithms could help uncover common learning trajectories or the factors that make different pathways more or less effective for different categories of learners. These sorts of activity patterns could be visualized, for example, by using heatmaps to depict which instructional content successful learners spend the most time with or by using polar graphs to indicate the behavioral trends exhibited by learners of different aptitudes as they interact with a given learning object (e.g., fast-forwarding through parts of a video or abandoning a simulation at certain times). For learners, dashboards can help individuals visualize their own gaps and proficiencies, and help them take steps towards managing their own learning.10 For administrators, these algorithms could help forecast enterprise-level planning issues, inform education and workforce strategic-level decisions, or suggest incremental improvements for the system itself. Ultimately, a "mission control" dashboard comprised of modular data cards—each representing different insights and each providing ways to query the data—could be available to each "persona" within the learning ecosystem, including learners, instructors, content developers, administrators, and policymakers.

It’s almost cliché to say, “learning is a journey.” But when most people use
this platitude, it’s possible they really mean, “Sure, you’re going to find
out new things in the future, but this class ends in three weeks and you’d
better finish this learning by that time.” An assumption of the learning
ecosystem concept, and the closely related philosophy of personalized
lifelong learning, is a shift away from output-focused, time-based
learning—characterized by high-stakes summative tests—and instead
towards more a process-focused outlook on learning—supported by a
steady stream of formative assessments. This represents a fundamental
shift for learning and assessment—away from discrete mathematics and
towards continuous equations.

IMPLEMENTATION
RECOMMENDATIONS
Because the field of streaming data and the capabilities it supports are still
emerging, we expect future innovations to eclipse the suggestions made in
this chapter. But in terms of a starting point, the section below outlines practi-
cal implementation steps to consider when looking to bring this new wave of
digital transformation to bear.

1. Needs Analysis and Data Assessment

As in most processes, the first step involves problem framing. Determine what outcome data are needed and what types, quality, and amounts of data are already available. Ask questions to identify factors such as the state of current and historical data assets and data-producing sources, both within and external to the current system, as well as the status of currently accessible data, including the shape of the data model and where, when, and how it was delivered and stored. Also document the status of the current data architecture and system design, and information about its previous incarnations (if any), including its historical levels of use and expectations for the scale to be served by the new system. Finally, as appropriate for any project, catalog the known risks and protocols (such as privacy, data governance, and security); the objectives and goals of digital transformation, so as to provide guidance on what new data sources will need to be integrated into the system to provide desired metrics and insights; and the timeline, scope, and budget, in order to best enable (what will most often be) a phased approach to implementation of the complete system.

2. Data and Visualization Designs

Practitioners often make mistakes during the data design phase that only surface later in the process. To limit exposure to errors, poor design, and the accumulation of technical debt, it's useful to work backwards. Begin by laying out key questions; simultaneously, it's helpful to draw prospective visualizations for these questions, particularly in collaboration with their respective end-users. Next, identify performance indicators that provide insights into those questions, and determine what data sources may best inform these performance indicators (whether or not those data sources currently exist). Then design the "ideal" data model, incorporating the hypothetical data sources previously identified; take care to deliberately consider how different data sources may relate to one another and how data from multiple sources may be needed to inform recommended actions—possibly including actions taken by other providers within the larger ecosystem. Once this optimum data model is developed, look for available data sources to fill, or at least partially address, its proposed components; also, consider potential limitations or access issues with these data. Finally, revisit and tailor the visualization mock-ups to the final data model.

There are a variety of ways to visualize data. Key factors to consider include
the velocity of data streaming through the system, the shape of the data, se-
mantic features including both human- and machine-readable attributes, po-
tential correlations or potential false flags among the data, and the metrics
necessary to demonstrate progress towards key performance indicators. Ad-
ditionally, strive to design visualizations to be as transparent as possible, to
help end-users build appropriate levels of trust in the algorithms and make
informed decisions based upon the analyses they depict.

Related concerns, such as privacy or access to data streams, should also be considered during the design phase. Adhering to industry or organizational policies, such as learner privacy rules, may limit the ability to create robust profiles. Sparse data may impede the ability to generate analytics using many established big-data methods. It's important to scope the data model and visualizations to a realistic volume and robustness of data, and to determine the minimum amounts needed to produce useful insights around the identified key indicators.

3. Architecture Development

Once the conceptual data model is designed, the next step is to develop it.

To achieve the "future learning ecosystem" vision, learning applications need to capture and structure (or at least semi-structure) learner activity data, to support its aggregation and utility at scale. xAPI is among the most capable and flexible learning-data specifications for this purpose, and it can be leveraged alongside other data formats (either non-activity-based or from non-learning domains) to provide a fuller view of learner experience.

When applying the xAPI specification to capture and store data, an xAPI Profile should be used—either an off-the-shelf Profile or, if none suffice, a new one created for the system. xAPI Profiles define the accepted terms (or variables) within a given implementation as well as their uses and semantic values. xAPI Profiles create clear, domain-based modeling structures that help define the scope of a project, making it easier to deliver human-readable data and provide navigable machine-readable data across the ecosystem. Profiles can also serve as a useful tool to ensure a clear alignment of business processes and learning objectives to the proposed data model before it's implemented.
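For readers unfamiliar with the specification, the sketch below shows the shape of a single xAPI statement—an actor, a verb, and an object, plus an optional result—built here as a Python dictionary. The learner name, homePage, and activity IDs are placeholders; a production system would take its verbs and activity types from its governing xAPI Profile.

import json

# The basic "actor verb object" shape of an xAPI statement.
# All IDs below are illustrative placeholders, not a published Profile.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "account": {"homePage": "https://example.org", "name": "learner-42"},
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/activities/intro-module",
        "definition": {"name": {"en-US": "Intro Module"}},
    },
    "result": {"completion": True, "score": {"scaled": 0.85}},
}

# A learning record store would receive this as JSON via its statements API.
print(json.dumps(statement, indent=2))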

Next, choices will have to be made regarding the integration of other data sources. Some learning data sources may already be delivered natively in xAPI formats. These data will usually be validated and made available by a learning record store, a particular kind of datastore defined by the xAPI specification. Standardized data and APIs, such as those offered by xAPI, make data aggregation relatively easy. However, there may be other learning data or non-learning activity (such as on-the-job workflows across web services) that aren't natively structured as xAPI statements. One option is to instrument the external source to deliver xAPI data, but this can be difficult when working with proprietary third-party software. An alternative is to coerce the data into an xAPI format using API methods. However, it won't make sense to force all data into an xAPI-based data model. There's no reason to transform data into xAPI formats if it's not a good fit. Instead, this heterogeneous data either may be modeled to another specification or just passed directly through the Kafka Streams processor (described below), where it can be subscribed to by different applications and joined with disparate data in downstream analyses.

Once the native data format and external data streams have been defined, they'll need to be implemented within a streaming data architecture. These can follow several models, but we would usually recommend the Kappa Architecture11 as the software architecture pattern for a real-time learning ecosystem. This paradigm treats everything as though it were streaming data and processes these data into a stream that may be leveraged by various microservices. This approach generally makes it easier and more efficient to deal with various forms of data, as opposed to creating polyglot solutions and maintaining a separate code base for batched and non-streaming data or—in the case of xAPI—for each non-conformant data source or data type that may pass through the system (e.g., from student information systems, HR technologies, and legacy databases). In this architectural paradigm, regardless of the nature of the source, the data comes into the stream as logged events. This is a huge benefit to real-time analytics because, from an operational perspective, the subscriber to the data stream never has to request that the data producer batch the data. Instead, the subscriber always has access to the log and can replay the events in the log as necessary to perform operations.
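To ground the idea, here is a brief sketch using the open-source kafka-python client: one producer appends xAPI-style events to a topic's log, and a subscriber replays that log from the beginning. The broker address, topic, and group names are placeholders, and a running Kafka broker is assumed; this illustrates the log-replay pattern, not a production Kappa deployment.

import json
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

BROKER = "localhost:9092"    # placeholder broker address
TOPIC = "learner-activity"   # placeholder topic name

# Producer: every source appends its events to the same ordered log.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)
producer.send(TOPIC, {"actor": "learner-42", "verb": "completed",
                      "object": "intro-module"})
producer.flush()

# Subscriber: replays the log from the earliest event, so nobody ever has
# to ask a producer to re-batch historical data.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    group_id="dashboard-service",   # placeholder consumer group
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
for message in consumer:
    print(message.value)            # e.g., forward to an LRS or a dashboard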

When considering the integration of data from different sources, it's important to carefully consider how users' identities will be handled. Identity management should be organized so that everything is kept orthogonal. When designing a streaming data architecture, it's also best to keep identity management and administrative provisioning matters close to the point of ingress, so that no data elements slip through unaccounted for.

As mentioned above, streaming architecture may be served by implementing an open-source stream processor, such as Apache Kafka.12 Identity management and security applications will need to work in concert with the Kafka implementation. Once set up, data from all sources will flow into Kafka to be processed and sent down into a data stream. Data in that stream may be subscribed to by any application, such as business intelligence tools or a learning record store. The application listens to the stream and pulls out a copy of a piece of data when it recognizes it. Microservices provide these capabilities and help to automate data flow. Ideally, data will automatically go where they're supposed to, so that they can be analyzed, visualized, aggregated, verified, and so on, by various subscribed applications. Meanwhile, all of the original data passing through the stream eventually ends in a data lake, where it may be accessed and queried manually or via machine means later, as necessary. And, as mentioned above, all of the data is now available as logged events—which provides considerable operational efficiencies. This stream processor model is in contrast to point-to-point architectures, where all applications within a learning ecosystem attempt to connect with one another to exchange data bilaterally; point-to-point architectures scale poorly.

The system for military promotion is well known, albeit difficult to use. Whenever you get to a point of appraising someone's abilities, people become serious about what the metric is and how it is being collected. They want to know, "How do I achieve the metric?" They will focus their minds on the details of metric collection, and if they don't get promoted, they will expect a debrief that provides clarity on why they missed the mark. They want to know with credibility; it can't just be a machine saying you just didn't get promoted/recommended. This is all part of treating them right. We will always need humans in the loop when dealing with human performance assessment.

James Robb
Rear Admiral, U.S. Navy (Ret.)
President, the National Training and Simulation Association


Finally, a word of caution: Generally speaking, especially in enterprise-scale implementations, we would refrain from using third-party SaaS integration solutions. They add cost and licensing complications, may affect throughput, and can be a burden in the event things break or the third party ceases to provide the services. Third-party services can also create unanticipated security challenges. In our personal experience, it's almost always better to build natively or to provide data translation services of your own design.

4. Deployment

The fourth implementation step is to choose the deployment environment. There are a variety of commercial and specialized cloud architectures that can support streaming data. Depending on your needs, you'll likely be choosing between enterprise SaaS and Virtual Private Cloud instances and creating the templates to size them appropriately. On-premise deployment is an option, though it may greatly increase complexity and cost, both during deployment and in ongoing maintenance.

Most implementations will follow a general pattern of alpha to beta to production deployment. As part of your alpha deployment, you should identify and address issues around privacy and security protocols, identity management and administrative provisioning, quality assurance, and continuous integration regimes. You'll also need to conduct systems testing. During the beta implementation and testing period, you'll stress test the system with real users; take this opportunity to identify bugs as well as ways to improve the user experience, both for end-users and for those maintaining the system.

5. Production Implementation

Production implementation marks the beginning of a new phase. Depending on the volume and consistency of data, machine learning techniques (potentially including deep learning approaches) can be applied to the real-world data flowing through the system. Deep learning processes could unlock a host of innovations in this space, including ways to link cognitive machine processes with biometric, decision-making, and event-based human learning activities.

Be warned, however: by their very nature, streaming architectures can be fragile. New product development by a vendor may break an endpoint, which will have to be fixed in order for the data from that vendor to flow as intended. Because other services may depend on data from that vendor in order to process jobs, breaks such as this can cause bottlenecks that affect the larger system. For that reason, it's crucial that stream-processing systems be attended to by services teams, either locally or via managed services. Luckily, making fixes is usually a relatively painless process so long as you've done your due diligence into the quality of the data sources feeding into your system. Further, because most breaks will be caused by things like changes to endpoints or reconfigurations of APIs, they're usually well-documented and part of the product plan shared with the team—meaning most breaking changes will be telegraphed well in advance and can be planned for.

Some practitioners use the acronym FATE when discussing Fairness, Accountability, Transparency, and Ethics in AI.

Just as important to the success of the analytics and data visualization services within the future learning ecosystem will be scalability and extensibility. Advances in learning tools, web technologies, and AI are likely to alter future learning analytics and data visualizations. Likewise, social changes in behavior, expectations, methods of instruction, access to learning, and preferences among both formal and informal learners will influence the nature of the events captured in activity data streams. The technologies deployed to serve learning analytics and data visualization objectives, therefore, should be as flexible, extensible, and open as possible. The systems must be built to withstand whatever is thrown at them. Dedication to open source standards and specifications will aid in meeting this need.

Conclusion

In a learning management system, you can get a gradebook, much like analog systems today but available online. But with the advances in assessment analytics, you can delve much deeper to gain insight into how reliably your questions and tests are measuring what they're supposed to measure. You can determine if your question bank is fair, valid, and reliable. You can see it in multiple views in a dashboard, and you can even see it within, and eventually across, education, defense, commercial, and healthcare.

Stacy Poll
U.S. Public Sector Business Development Manager
Senior Account Manager, Questionmark

In the end, the quality of insights gleaned from analytics and visualizations will be tied to the quality of their data models, the velocity and variety of the data they employ, and the accuracy of the data's representations. As the truism goes, there are lies, damned lies, and statistics.13 Statistics—and even more so infographics and visualizations—when misapplied can obfuscate the "truth" of data. It's far too easy to make bogus claims, given any data set—particularly one as complex, personal, and socially and culturally situated as learning. Consequently, the design of the data, application of algorithms, and layout of visualizations are of great consequence. Small decisions during these design and development phases can lead to significant downstream effects—hopefully positive ones—for learners and other learning stakeholders.

CHAPTER 10

PERSONALIZATION
Jeremiah Folsom-Kovarik, Ph.D., Dar-Wei Chen, Ph.D.,
Behrooz Mostafavi, Ph.D., and Michael Freed, Ph.D.

Scientific studies show that personalized learning produces better outcomes than static, one-size-fits-all instructional experiences.1 When instruction is personalized, learners show improved recall and better near- and far-transfer. Personalized learning can engender deeper understanding as well as hone higher-order cognitive skills, such as leadership and adaptive thinking.2

Customized experiences, like those a skillful tutor might craft, are the gold
standard for learning, but these don’t scale well, given the costs and limit-
ed availability of expert teachers and trainers. Computer-assisted instruction
can mitigate scalability issues, and personalized learning technologies can (at
least partially) unlock the benefits of one-on-one learning, similar to working
with a personal mentor.3

Generally speaking, personalized learning technologies attempt to create different experiences for different learners (or for the same learner at different points in time). At the simplest level, this might involve customized settings based on individuals' preferences or differentiated instruction, where predetermined categories of learners receive different instructional packages (e.g., a system that offers unique pathways for novice and intermediate students). More notably, personalized learning can incorporate adaptive mechanisms—adjusting the learning experience based upon a stream of incoming data. This sort of adaptive learning is usually what's meant when people tout the benefits of personalization. (And, on the whole, this chapter focuses on adaptive learning as well.)
Modern technologies increasingly employ a spectrum of personalized learning methods to tailor instructional elements, such as task selection and tutorial examples,4 to better suit individuals' goals and characteristics, prior experiences, demonstrated knowledge and performance, environmental conditions, and/or social contexts. For example, as someone gains proficiency, a system may alter the order and frequency of problems, progression through the curriculum, and types of feedback given. Adaptive learning systems can help ensure learners have truly mastered each required objective, guiding them through activities that exercise and verify each of the enabling objectives and progressively scaffolding learners to reach mastery. Additionally, as evidence accumulates from multiple learners, some systems can use data-driven methods to identify trends, such as portions of the instructional sequence that are problematic or unintuitive. Other systems can use learners' behaviors to recommend peer-to-peer and team matchmaking, or to identify when a student needs human (versus automated) feedback.

There are numerous ways that consumers are already experiencing personalization: coupons printed at grocery store cash registers, dynamic home pages of e-commerce sites based on previous purchases and store browsing, personal-assistant capabilities that recommend restaurants and driving directions to get there. Consumers now expect the benefits of those experiences in other online experiences—like learning. The personalization capabilities become a virtual concierge for learning experiences, making recommendations based on a combination of needs and interests of the learner.

John Landwehr
Vice President and Public Sector Chief Technical Officer, Adobe

Adaptive learning technologies, on average, produce substantially better outcomes than conventional, group-based or non-adaptive learning.5 Adaptive technologies can also make learning more efficient, delivering training and education in less time or at lower run-time costs. For instance, learners can spend less time reviewing material already familiar to them, and they can receive remediation as soon as it's needed. Adaptive systems can also use fewer, or at least shorter, assessments because questions can be carefully chosen to maximize their utility in estimating each learner's capabilities.

LIMITS OF CURRENT PRACTICE


While personalized learning has already been used in various settings, its full
potential hasn’t been achieved. Part of the problem is that these systems are
typically designed to meet specific, narrowly focused instructional needs, and
as such, their benefits tend to be localized. Widespread implementation of id-
iosyncratic solutions also means that methods of development, evaluation, and
reporting are nonstandard. This makes the transfer of data between insular sys-
tems difficult, which limits the available adaptations and means that instruc-
tional episodes are likely to appear disconnected and inconsistent to learners.

Another challenge is their development costs, which, historically, have averaged around 100–300 hours of time—from highly skilled researchers, software engineers, and subject-matter experts—for each hour of learner interaction.6 A significant portion of this time is spent building the learning and behavioral models that make automated adaptation possible. Considering the hundreds of hours of instruction needed for a single domain, along with the personnel and time required for its development and testing, the cost of personalization can be high.

When considering its many benefits as compared to current one-size-fits-all practices, however, even expensive adaptive learning offers an overall advantage. More than that, with the advancement of model-building techniques using machine-learning methods and the increasing availability of authoring tools,7 development is becoming more efficient. Today, a modern system could be built with as few as 20–30 expert hours for one hour of instruction.

Overall, this field is fast growing, and new technologies are improving the
sensitivity, impact, efficiency, and cost-effectiveness of personalized systems
every day. The following sections outline a general approach to designing and
deploying personalized learning, with a particular focus on how new adaptive
learning capabilities will inform the future learning ecosystem.

DESIGNING PERSONALIZED
LEARNING
When preparing to implement a personalized learning approach, it’s useful
to consider which aspects of a learning experience are most impacted by per-
sonal differences as well as how instructional elements might be varied in
response to those differences. The availability of historical, real-time, and
external data sources will also influence the adaptive system. The next three
subsections step through high-level considerations for data collection, data
analysis, and what and how to personalize learning.

Data Sources

Adaptation requires something to adapt to; this could include demographic and background information as well as real-time performance, sensor, and
behavioral (event-based) data from learners. There may also be important
contributions from information sources outside of the learner directly, such as
contextual information and instructor inputs.
Relatively static data, such as learner traits and prior experiences, can inform simpler forms of customization, such as role-based differentiation, or
help seed a new learner profile within a system. Some personal traits shown
to meaningfully inform personalization include goal orientation, general
self-efficacy, computer attitudes, and metacognitive abilities. Constitutional
attributes, such as job title or military rank, can also be useful, in particular,
because they’re often easy to obtain and can somewhat substitute for past-per-
formance information (if those data aren’t available). Prior knowledge and
skills, unsurprisingly, are among the most useful historical data for informing
personalization.8

Learner performance data can include both static data, such as from historical
test results and portfolio scores, as well as more timely data from quizzes,
exercises, simulations, and other activities within the given instructional ex-
perience. Learner performance can be used to inform complex inferences,
through methods such as item-response theory or Bayesian knowledge-trac-
ing; simpler approaches, such as comparisons to threshold metrics and pop-
ulation norms, also provide some utility. However, even basic learner-perfor-
mance data isn’t always easy to collect; sometimes, for instance, individuals
or organizations may feel threatened by the measurement and recording of
their scores. Despite this, learner performance data makes a big difference to
personalization; it’s worth the effort to devise quality measures, collect the
data, and analyze them carefully.
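
To make these alternatives concrete, here is a minimal sketch in Python (the function name and parameter values are illustrative assumptions, not drawn from any particular system) of the one-parameter item-response (Rasch) model: the probability of a correct response grows with the gap between the learner's estimated ability and the item's difficulty.

```python
import math

def rasch_p_correct(ability: float, difficulty: float) -> float:
    """One-parameter IRT (Rasch) model: probability that a learner with
    the given ability estimate answers an item of the given difficulty
    correctly; both values sit on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical values: a learner slightly above average (ability = 0.5)
# attempting a moderately hard item (difficulty = 1.0).
print(round(rasch_p_correct(0.5, 1.0), 2))  # 0.38
```

This is also part of why adaptive systems can use fewer, shorter assessments: selecting the item whose difficulty is closest to the current ability estimate maximizes the information each question provides.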

A new source of data available in some settings comes from sensors, i.e., devices that can measure physical or physiological information about learners
vices that can measure physical or physiological information about learners
objectively, removing some ambiguity surrounding the mediators and mod-
erators of their performance. Some specialized sensors, such as galvanic skin
response and heart-rate variability monitors, can detect learners’ mental and
emotional states (to an extent). What’s more, specialty hardware isn’t always
required; low-cost sensors are already built into many devices, such as laptops
and cell phones, and these can track location, context, gaze direction, pupil
dilation, and various other inputs from voice, gesture, and posture cues. Data from these low-cost sensors has already been used to infer states such as stress, boredom, and confusion. Instrumentation within software can even use keyboard and mouse inputs, such as slower typing or repetitive mouse movements, to infer learners' attentiveness, engagement, or irritation, as well as help confirm a learner's identity or uncover signs of cheating.9

Different people have different strengths, so how can we structure the training based on those differences? How do we deliver the required training in less time and have our military personnel better prepared when they come out the back end?

Thomas Baptiste
Lieutenant General, U.S. Air Force (Ret.)
President and CEO, the National Center for Simulation

Related to both learner performance and sensor data, learner experience data
refers to event-based data that describe what learners see and do. Compared to
learner performance data, learner experience captures not just the outcomes
but all the steps that explain each outcome—the fine-grained, step-by-step
activities a learner (or other relevant human or machine agents in the setting)
performs. These could include pausing a video, selecting (and then changing)
a quiz answer before submitting, or requesting help from an automated tutor.

Important insights may also come from external sources, outside of the immediate delivery technology or instructional activities. For example, other social
interactions, such as casual discussion boards in online courses, can be mined
via natural language processing to learn more about learners’ interests and
attitudes or to inform social network analyses. Contextual information about
the learning environment can also be used. For example, time and location
data can be collected by learners’ sensors and then integrated with external
weather and map databases to inform real-time context-relevant learning ex-
amples. Similarly, logistical considerations may affect learning delivery con-
Personalization | 187

siderations; these could include the digital devices available to that learner
(e.g., smartphone versus laptop), the number of seats available in a particular
course, or cost and time constraints. Organizational factors may also inform
personalization in various ways. As one example, consider how the design
and delivery of learning might change depending on whether someone is
completing a training course for workforce compliance reasons, because of
professional development goals, or out of personal curiosity.

Another form of external data comes from human observations and inputs, including from learners themselves, their peers, instructors, and supervisors.
For instance, an instructor might input a critique about a student’s persuasive
writing, or an observer/trainer might score exercise trainees against a perfor-
mance rubric. A student may even self-report data, or it might come from peer
evaluations or 360° surveys. (The point is, it’s not necessary for all aspects of
the future learning ecosystem to be digitized and automated! In fact, this is
an important area for ongoing research, i.e., how to best integrate technology
with learning facilitators in a symbiotic—rather than substitutional—way.)

Finally, it’s important to note that learner data is often more useful when it’s
more robust, more personal, and more contextualized—but these same char-
acteristics also increase privacy concerns. A balance must be carefully struck.
(Refer to Chapter 8 for a more detailed discussion.)

Data Analyses

Collected data need to be analyzed in some meaningful way, and then the sys-
tem should use those analyses to make diagnoses, predictions, and adaptation
decisions. What kinds of decisions can personalized-learning technologies
make? The most obvious answer is they can estimate learners’ content mas-
tery and then take actions to fill capability gaps and remedy misconceptions.
People learn at different rates, and some of the most impactful interventions
a system can make are simply to ensure each learner progresses at his or her
optimal pace so that all learners reach mastery, without skipping over important subcomponents or suffering through already-known materials.

Definitionally, mastery describes an estimate of a learner's competence, the true value of which is hidden from observation. Mastery results in observable
performance, such as correctness and speed of responses.10 Mastery estimates
can be informed by static data from a learner’s profile or demographic inputs,
particularly initially. During a learning episode, mastery estimates are best
informed by newly generated, contextually relevant data. Take care, howev-
er, to acknowledge the limitations of mastery estimations. Lucky guesses,
accidental inputs, trial-and-error, and any number of accidental or inten-
tional errant behaviors can create inaccuracies. Adaptive systems should al-
ways be designed with a healthy skepticism around learner mastery data and
should incorporate ways to verify and mitigate bad estimates. Some ways to
guard against inaccurate mastery models include via instructor inputs, learn-
er-choice recommendations that override system behaviors, and open learner
models that let learners view (and sometimes directly or indirectly change)
their mastery estimates.
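
To make that skepticism concrete, the sketch below shows one step of Bayesian knowledge tracing, one common mastery-estimation approach (the parameter values here are illustrative assumptions, not calibrated figures). The guess and slip parameters explicitly model lucky guesses and accidental errors, so a single surprising observation shifts the mastery estimate rather than overwriting it.

```python
def bkt_update(p_mastery: float, correct: bool,
               guess: float = 0.2, slip: float = 0.1,
               transit: float = 0.15) -> float:
    """One step of Bayesian knowledge tracing (illustrative parameters).

    guess:   chance an unmastered learner answers correctly anyway
    slip:    chance a mastered learner answers incorrectly by accident
    transit: chance of learning the skill between practice opportunities
    """
    if correct:
        evidence = p_mastery * (1 - slip)
        posterior = evidence / (evidence + (1 - p_mastery) * guess)
    else:
        evidence = p_mastery * slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - guess))
    # Account for learning that may occur after this opportunity.
    return posterior + (1 - posterior) * transit

p = 0.3  # prior belief that the skill is mastered
for observed_correct in (True, True, False, True):
    p = bkt_update(p, observed_correct)
print(round(p, 2))  # ~0.91: high, but tempered by the one error
```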

In addition to mastery, many individual states and traits impact learning and,
thus, can be useful targets of analysis. Learner states are malleable features
that change from moment to moment, while learner traits are more fixed and
change only over longer periods of time, if at all. Affective states, such as
frustration or boredom, can reduce individuals' motivation to learn; physiological states, such as hunger or lack of sleep, can also affect learning, both by impacting emotions and by moderating cognitive functions. As mentioned
earlier, personality traits (e.g., goal orientation and general self-efficacy) can
also provide some insights; additionally, personal characteristics, such as so-
cial identity traits or learning goals, may be useful.

Finally, aggregations of data from many learners over time can inform trend
analyses or, at sufficient scale, be used to train machine-learning algorithms
that uncover hidden patterns. At a minimum, collective data can provide some
general benchmarks, such as average completion time requirements. In more sophisticated systems, these data can also improve automated diagnoses and
adaptation recommendations as well as inform system-wide improvements,
such as identifying problematic sections in the instruction, optimum learning
trajectories for different types of learners, and ways to incrementally improve
the learning interface, content, or delivery.
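
A hedged sketch of the simplest version of this idea, using invented toy data: aggregate first-try pass rates by instructional section, then flag any section that falls below a chosen benchmark for human review.

```python
from collections import defaultdict

# Invented toy log of (learner_id, section, passed_first_try) records.
log = [
    ("a1", "intro", True), ("a1", "loops", False),
    ("b2", "intro", True), ("b2", "loops", False),
    ("c3", "intro", True), ("c3", "loops", True),
]

totals = defaultdict(lambda: [0, 0])  # section -> [attempts, first-try passes]
for _, section, passed in log:
    totals[section][0] += 1
    totals[section][1] += int(passed)

BENCHMARK = 0.5  # illustrative threshold; real systems would derive it from norms
for section, (attempts, passes) in totals.items():
    rate = passes / attempts
    if rate < BENCHMARK:
        print(f"Flag '{section}' for review: first-try pass rate {rate:.0%}")
```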

I want to be in a position where there's truly personalized learning based on a student's
individual needs while at the same time balancing
it with content-standard expectations. I’d love to
see opportunities for students to dig in deeper, to
have responsive educational opportunities.

Nathan Oakley, Ph.D.


Chief Academic Officer
Mississippi Department of Education

Adaptations

The next important consideration concerns the kinds of adaptation the sys-
tem will make. This could involve modifications to many factors, including
display elements, what and when content is presented, the task sequence, the
contents of instructional materials, embedded content features (e.g., selection
of relevant examples), extrinsic content features (e.g., feedback and hints), in-
structional strategies and tactics, delivery methods, delivery devices, perfor-
mance standards, learner goals, and various other interactions. These forms
of adaptation can be expressed, to a greater or lesser extent, at the micro-,
macro-, and meta-levels.

First, the micro-level focuses on task-specific adaptation in response to learner actions within a learning session, problem-solving opportunity, or single
task. This could be, for example, in the context of one algebra problem or
within a simulation scenario. Intelligent tutoring systems produce this sort of
adaptation, albeit usually for fairly constrained purposes and subject areas.
Intelligent tutoring technologies are becoming commodities, and it’s easy to
find commercial and open-source options with an internet search. However,
many of these off-the-shelf tools work best in well-defined subject domains;
so, while there are several available mathematics tutors, there are fewer for
writing and fewer still for social and emotional skills. For ill-defined domains
or specialized material, developing task-specific personalization can be a
time-consuming effort, requiring extensive inputs from learning, engineer-
ing, and subject-matter experts. In those cases, the need for human expertise
creates a bottleneck to their development11 and is part of the greater cost of
personalized learning.

Second, the macro-level focuses on content-wide adaptation. This could involve choosing the next instructional topic, sequencing instructional blocks
within a curriculum, asking learners to repeat unmastered concepts, or allow-
ing them to skip previously learned areas. The granularity of a given “topic”
or “block” can vary widely, but they’re meant to refer to learning episodes
(rather than their component tasks or larger aggregates). Macro- and mi-
cro-level adaptation typically occur within a bounded system, that is, within
a single application.

A third type of personalization is emerging at the meta-level. Meta-adaptation is applied across disparate curricula, learning systems, and/or organizational functions. In contrast to the micro- and macro-levels, adaptation at the
meta-level occurs in system-of-systems environments. Meta-adaptation may
involve, for instance, choosing which application to use to meet a particular
learning goal—e.g., whether to train a medic on a new procedure via an online
simulation, in a blended-learning workshop, or with an on-site mobile training
team. As this example highlights, different learning systems use distinct, and often complementary, approaches.12 Intuitively, each experience might work better (or worse) for each learner. Consider, for instance, how professional development goals, workshop scheduling logistics, available technologies, urgency of earning the licensure, and risk tolerance of the organization might affect the way the hypothetical medic is trained.

In education, we typically focus on supply rather than demand, compliance rather than growth, academic facts rather than context and experience. We're trying to shift this. And we're doing it in such a way to align our workforce, K-12, and post-secondary sectors. We need different pathways because kids are different. So, their journeys through our system should also be different.

Ken Wagner, Ph.D.
Education Commissioner
Rhode Island Department of Education

Meta-adaptation could also augment learning activities within a given system. Imagine that the imaginary medic is learning that new procedure
through a simulation, and the system diagnoses a gap in an interrelated area,
say, pharmacology, that’s not explicitly addressed by the current system. In
this case, the meta-adaptive solution may be able to recommend external
remediation resources, such as a book chapter, micro-learning refresher, or
online course.

Meta-adaptation is a property of modern learning ecosystems, which combine multiple learning systems to let them share data and work together. This
highlights one reason why it’s important to use standardized protocols, ma-
chine-readable data, and well-defined metadata in learning systems. When
data are shared across systems in standardized ways, it enables the personalization of unified and optimal learning paths13—at the broader, lifelong learning scale.
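
As one example, the ADL Initiative's Experience API (xAPI) specification expresses learning experiences as actor-verb-object statements that any conformant system can read. A minimal statement, shown here as a Python dictionary with hypothetical names, IDs, and score (the verb URI is from the standard ADL vocabulary), might look like this:

```python
# A minimal, illustrative xAPI-style statement. The learner, activity ID,
# and score are hypothetical; the verb URI is standard ADL vocabulary.
statement = {
    "actor": {"name": "Pat Learner", "mbox": "mailto:pat@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/pharmacology/module-3",
        "definition": {"name": {"en-US": "Pharmacology Refresher, Module 3"}},
    },
    "result": {"success": True, "score": {"scaled": 0.85}},
}
```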

Technological Considerations

The design, deployment, and impact of personalization are heavily influenced by the technical environment where the system is deployed. This section
highlights a sample of those considerations.

HARDWARE AND SOFTWARE

Computer-based personalization clearly requires hardware and software. Less obviously, these systems may require specialized components, for example, extensive and highly secured digital storage for massive amounts of learner data, flexible servers capable of processing online AI algorithms at scale,
or federated systems that share data across APIs. Similarly, depending upon
the selected data sources, unique hardware devices may be required, such as
wearable sensors, environmental beacons, or instructor input tablets.

BANDWIDTH

Although personalized-learning technologies can function natively on a client application, we imagine most systems will use networked components (and,
likely, software as a service or SaaS solutions). However, bandwidth limita-
tions may affect some deployments. For example, K–12 schools may need
to share a limited internet connection across many users, or military units
afloat or in austere conditions might have long periods without connectivity.
In such cases, personalized-learning applications should be designed to re-
duce network usage, function despite slow response times, or operate without
a connection. Methods for implementing this include batch processing, local
replication, and caching of expected next steps when possible.
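
As a minimal sketch of those strategies (our own illustration; `send_batch` is a stand-in for whatever transport a real deployment would use, such as an HTTP client posting to a learning record store), the buffer below records events locally regardless of connectivity and flushes them in batches once a connection is available.

```python
import queue

class EventBuffer:
    """Buffer learner events locally and flush them in batches when online;
    a sketch of the batch-processing and caching strategies described above."""

    def __init__(self, send_batch, batch_size: int = 50):
        self._events = queue.Queue()
        self._send_batch = send_batch
        self._batch_size = batch_size

    def record(self, event: dict) -> None:
        self._events.put(event)  # cheap local write; works offline

    def flush(self, online: bool) -> None:
        if not online:
            return  # keep accumulating until connectivity returns
        while not self._events.empty():
            batch = []
            while not self._events.empty() and len(batch) < self._batch_size:
                batch.append(self._events.get())
            self._send_batch(batch)

buffer = EventBuffer(send_batch=print)
buffer.record({"event": "video_paused", "time": 42})
buffer.flush(online=True)  # prints the single-event batch
```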

DATA

Personalized learning requires data.
Data models can be informed by extant data, whether collected through large-
scale validation and norming studies, from other applications in a learning
ecosystem, or from centralized data repositories. A word of caution, however:
More isn’t always better. It’s important to judge the extent to which previously
collected data accurately reflects the current population. In precision settings,
for example, bias has been detected from differences as subtle as the order of
questions within a test.14 As this highlights, data quality is a key concern—
whether data come from external sources or from system-collected inputs.
Resilience to error, completeness, objectivity, fairness, timeliness, and consis-
tency (to name a few) are all critical factors for personalization.15

Another key consideration involves storage and processing requirements. Some algorithms require data from hundreds or thousands of learners to calibrate a system before it becomes useful. Furthermore, depending on the al-
gorithms, the amount of data generated could dramatically increase memory
and computational requirements.

MACHINE LEARNING

Data, at large scales, can be used to train machine-learning algorithms. These can, for example, predict which learning paths will work best for
different types of learners or create a self-improving system that detects
obsolete content based on usage patterns.16 Furthermore, machine learning
can automate how personalization works for different populations or uncover
changing interaction patterns over time. However, machine learning is not a
silver bullet. It also requires a significant amount of data, which means many
learners will need to use a system before a machine-learning algorithm is
ready to fully deploy. Furthermore, many organizations will require ongoing
validation of personalized-learning interventions, which may involve human
oversight of an algorithm’s functionality, leading to increased complexity and
costs. Machine learning can also suffer from transparency and explainability
limitations.
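
For a sense of the mechanics, here is a hedged sketch (using scikit-learn, with invented features and data) of training a model on aggregated learner records to recommend a learning path; a production system would need far more data, validation, and oversight, as noted above.

```python
# Invented training records: [prior_test_score, practice_hours, quiz_average]
from sklearn.tree import DecisionTreeClassifier

X = [[0.2, 1.0, 0.4], [0.9, 3.0, 0.8], [0.5, 2.0, 0.6], [0.8, 0.5, 0.9]]
y = ["remedial", "advanced", "standard", "advanced"]  # path that worked best

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(model.predict([[0.6, 1.5, 0.7]]))  # predicted path for a new learner
```

A shallow decision tree is chosen here deliberately: its branching rules can be inspected directly, which previews the transparency concerns discussed next.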

TRANSPARENCY AND EXPLAINABILITY

Personalized learning systems should function transparently, that is, in a way that allows stakeholders to see the data, analyses, and reasons for
actions. Transparency is defined in contrast to black-box technical systems,
which might perform the same actions but without a way for users to trace the
system’s decision processes. Ideally, outputs from personalization should be
available at individual and aggregated levels, and they should allow users to
drill-down (or drill-through) to reach explanatory detailed views. Data visu-
alizations and dashboards designed for learners, instructors, administrators,
supervisors, and/or commanders may prove useful here.

I think it’s an interesting and exciting future, if there are


multiple paths of developing competencies and ultimately
getting the job you want. For way too long, we’ve had this
single path to get to success. It’s often served more as a filter
than as a capability-building mechanism.
Shantanu Sinha
Director, Product Management, Google; Former Founding
President and Chief Operations Officer, Khan Academy

Ideally, personalized learning systems should be explainable as well as transparent; this helps stakeholders understand the system's actions in order to prop-
erly evaluate and accept them.17 Consider this distinction: A technical system
that lacks transparency might contain proprietary functions and black-box
machine learning; however, opening a window onto these algorithms won’t
necessarily make their underlying logic or emergent behaviors understand-
able. Transparency without consideration for end-user explanations can still
create confusion; hence, personalized-learning systems should also provide explanations of the reasons for their estimates and adaptations. As one example, a
personalized-learning system may use probabilistic math to update estimates
and combine them into decisions. Studies show that merely displaying prob-
abilities isn’t useful, because even well-educated users may struggle to intu-
196 | Modernizing Learning

itively understand them. Instead, explainable systems might provide natural


language descriptions and evidence for their decisions in familiar terminolo-
gy. Recent research is also investigating how to construct explanations after
the fact for those complex systems that don’t normally explain themselves.18
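
As a minimal sketch of that idea (with illustrative thresholds and phrasing of our own, not drawn from any deployed system), a system might translate a probabilistic mastery estimate and its supporting evidence into plain language rather than displaying the raw probability:

```python
def explain_estimate(skill: str, p_mastery: float, recent_errors: int) -> str:
    """Translate a probabilistic mastery estimate into a natural-language
    explanation (illustrative thresholds and wording)."""
    if p_mastery < 0.4:
        level = "is still developing"
    elif p_mastery < 0.8:
        level = "is close to mastering"
    else:
        level = "has likely mastered"
    evidence = (f"based on {recent_errors} error(s) in recent practice"
                if recent_errors else "based on consistently correct work")
    return f"The system estimates this learner {level} {skill}, {evidence}."

print(explain_estimate("fraction addition", 0.55, recent_errors=2))
```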

The output of transparent and explainable systems should be actionable for the end-users. Systems shouldn't simply output data; they should help make
data meaningful to the stakeholders who use it—e.g., as open learner models,
instructor dashboards, or visualizations designed for administrators and orga-
nizational decision-makers.19 And when these systems incorporate good ex-
plainability, users are more likely to trust them, understand their limitations,
take actions in response to system recommendations, and continue using the
systems over time.

CONTROL

Transparent and explainable systems let users see why and how an application
works, but what if those stakeholders want to control some of its functions?
Systems can allow learners, instructors, and other human stakeholders to in-
fluence their estimations and/or actions. This sort of human-machine teaming
is an ongoing area of research.20 Ideally, learning stakeholders should be able
to retain the kinds of control they want while they offload tasks to comple-
mentary technology that augments them with faster processing of large or
detailed data.21

USABILITY

Finally, to effectively implement personalized learning, usability and user acceptance are critical performance metrics. Usability stakeholders include not
only learners, instructors, and administrators, but also the instructional de-
signers who plan and implement personalized learning, system engineers who
need to monitor data models and adaptation algorithms, and even developers
of other applications within a learning ecosystem.

BUILDING EFFECTIVE
PERSONALIZED LEARNING
Ultimately, the purpose of personalization is to help individuals achieve learn-
ing objectives more effectively and efficiently. But how do we determine how
well a particular system—its data, analyses, and adaptive interventions—performs? The first question to ask is whether a system is functional, i.e., does it give different learners experiences that fit their needs? Can we verify that
it performs as designed and expected? It’s useful to break these evaluation
factors down into several categories. For instance, how does the system—as
a software application—perform? Consider elements such as: the amount of
work done by a user without help from the system, time-related information
about the work processes, information related to the accuracy of underlying
models, and the behavior of users in interacting with the system. It’s also
useful to evaluate the content within the application, for instance the extent to
which a system produces recommendations for every possible target learning
outcome, quality of the instructional “catalog” the system draws from, and
quality of instructional interventions made.

The quality of instructional interventions can be measured in many dimensions, including the breadth, sensitivity, and completeness of different learn-
ing interventions, the number of unique recommendations the system makes
in proportion to the entire catalog, or how often the system recommends the
same few popular results to different users. Relatedly, questions to ask include:
What were the differences in support and feedback between learners? What
was the difference in the order of progression from one topic to the next? Did
students get stuck at any point during task- and content-specific operations
and, if so, where? How often did trainees drop out of training or pause it?
Were there indicators of off-task behaviors or attempts to game the system?
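
Some of these functional checks reduce to simple metrics. For example, catalog coverage (sketched below with invented data) measures the share of the instructional catalog the system ever recommends; a low value suggests the system keeps surfacing the same few popular items.

```python
def catalog_coverage(recommendation_lists, catalog):
    """Share of the catalog that was ever recommended to any learner."""
    recommended = {item for recs in recommendation_lists for item in recs}
    return len(recommended) / len(catalog)

# Hypothetical: three learners' recommendations over a six-item catalog.
recs = [["a", "b"], ["a", "c"], ["a", "b"]]
print(catalog_coverage(recs, {"a", "b", "c", "d", "e", "f"}))  # 0.5
```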

The next question to ask is whether the system is effective, i.e., does it make
adaptations that enhance the outcomes of the learning experience? Can we
validate that it achieves the broader outcomes we’re seeking? Most obviously,
these may include training effectiveness and efficiency measures, i.e., did the
system produce better topic mastery or faster speeds of completion versus
other methods? More than that, other outcomes may be equally desirable,
such as increasing retention rates, improving motivation, fostering certain at-
titudes, or encouraging social interactions.

Finally, there are practical considerations for evaluating a personalized learning system: What does it cost? How much time and how many resources were
needed to develop it, and what are the costs of its operation and maintenance?
Are the components of the system modular, scalable, extensible, and reusable?
How much data does it collect, and how are those data handled? And, ulti-
mately, is the system providing a good return on investment?

CONCLUSION
Personalization is among the most important ways to achieve effective learn-
ing outcomes, and computer-assisted personalization can bring this benefit to
more learners. The field of learning science has advanced our understanding
of what and how to adapt learning (through decades of research in educational
theory and cognitive science), and innovations in technology are improving
our ability to implement these methods, efficiently and effectively at scale.

The promise of personalized learning will be realized when individual components and learning systems work together, as a system-of-systems, sharing
data and optimizing learner trajectories across longitudinal and diverse expe-
riences. The potential for learning personalization is immense, and research-
ers and educators are just beginning to explore the possibilities.
One of the most important points we
heard…is the need for a list of principles—
and that technology shouldn’t lead. It should
be about technology enabling our systems
to attain equitable and ethical outcomes.
Amber Garrison Duncan, Ph.D.
Strategy Director, Lumina Foundation
Learning Science
If we can assess both formal and informal learning experiences
in our students, what might you use that data to inform?
Teamwork skills? Classroom management? Extracurricular skills? As
a teacher, I have little control over the design of statewide summative
assessments, but I could put those skills and others, like thinking and
problem-solving, at the forefront. Developing these skills is also an
objective. Students are always aware that they’re honing a number of
skills that will help them be successful outside of our class.

Kimberly Eckert
Teacher, Brusly High School; Louisiana State Teacher of the Year 2018

CHAPTER 11

ASSESSMENT AND
FEEDBACK
Debra Abbott, Ph.D.1

The future learning ecosystem will change the management and processing
of learners’ data across systems, communities, and time. As new analytics
capabilities evolve, they will catalyze change in several ways: by increasing
the level of insight into how learners develop over longer periods of time, by
enhancing the ability of instructors to make teaching more responsive and
adaptive, and by recommending experiences and learning pathways designed
to meet the needs of individuals. However, new technologies won’t enhance
learning if they’re applied without purpose. The current system too often elic-
its an abundance of learner performance data without making effective use
of it. And, too often, other factors essential to learning—such as motivation
and long-term goals—are ignored, or learners receive feedback that’s neither
useful nor actionable and, hence, quickly forgotten. This chapter lays out an up-
dated framework for assessment and simultaneously emphasizes the impor-
tance of analyzing the intent behind assessment activities, reforms available
through improvement of formative feedback, and affordances required in a
technology-enabled system of assessment.

Background and the Limits of Current Practice



As technology rapidly transforms training and education, the choices regard-
ing learning assessment have become more confusing for instructors and
riskier for education and training program managers, who must navigate a
bewildering forest of accountability-oriented data on programs, classrooms, and outcomes. Unfortunately, such recordkeeping often takes on a life of its own as data, originally connected to specific learning goals, becomes an enterprise asset to be gathered, maintained, and reported for its own sake. Additionally, developments in research, paradigm shifts in assessment, and changes in the landscape of learning have essentially rewritten the rules of the game. The professional development of education and training stakeholders, however, has not kept pace with these changes, and this has frequently led teachers, instructional designers, and others to operate under outdated models of assessment—where assessments are primarily summative, quantitative, and focused on decontextualized snapshots of learner performance.

We need formative rather than just summative assessments; we need to push and melt these technology tools to do a better job and use the analytics in linear or nodal fashion. The goal is to understand individual aspects for education that ultimately enable us to give them a better education than they've ever had.

Keith Osburn, Ed.D.
Associate Superintendent, Georgia Virtual Learning
Georgia Department of Education

Valerie Shute and Matthew Ventura sum up the consequences of this state of
affairs:

Many of today’s classroom assessments don’t support deep learning or


the acquisition of complex competencies. Current classroom assess-
ments (referred to as “assessments of learning”) are typically designed
to judge a student (or group of students) at a single point in time, without
providing diagnostic support to students or diagnostic information to
teachers.2

Often, instead of providing a clear path towards a solution, the advance of technologies—including algorithms that personalize learning, new delivery
platforms, and a host of other rapidly expanding choices—muddy the waters.
There’s a risk that novelty effects or the complexity of some learning tech-
nologies mask flaws in design. Learning science informed by research-based
principles can help. Whether learning takes place in virtual reality or a class-
room seminar, the history, principles, and processes of learning science con-
stitute a valuable toolkit for learning ecosystem designers and developers.

PRECONDITIONS FOR
ASSESSMENT: THE ESSENTIALS
In Visible Learning, John Hattie names two elements as “essential to learning”:
(1) a challenge for the learner and (2) feedback.3 Similarly, both factors serve
as a foundation, or as the minimum requirements for, assessment. If challenge
is insufficient, neural connections are neither strengthened nor altered in a
learner’s brain, and if useful feedback isn’t present, the learner is acting blind-
ly, unable to relate her performance to either current or future learning goals.

New-age learning analytics have moved the needle considerably as they allow
for continuous, real-time monitoring of performance and can present up-
to-date dashboards to stakeholders. This is a far cry from assessment in the
age of our grandparents. For most of the 20th century, a “factory model” of
training and education prevailed and, with it, an assumption that teaching is a
transmission process, with learners on the receiving end. The goal was to fill
everyone’s head with knowledge and deliver a uniform product, the graduated
student, to society. Instructors were told that a period of teaching needed to be
followed by an assessment, followed by another period of teaching and another
assessment, ad infinitum until a program of instruction ended. Assessment
was thought to be an on-again-off-again occurrence situated in this linear process.

Many decades ago, the design of assessments wasn't considered particularly important, since they were accessory events to the primary focus of
teaching and learning. Paper-based activities such as tests and essays pre-
vailed—except in special settings, such as art, speech, or physical education
where performance mattered. And in this environment, it was assumed that
students could receive feedback in the same manner as they did any other
sort of information: Many instructors never thought twice about red-inking
students’ papers or telling them harshly that they lacked writing or thinking
aptitudes—a practice that would lead some learners to a state of learned-help-
lessness. Conversely, it was acceptable to praise high-performing students’
abilities and intellect, often undermining their growth mindsets and instilling
a false sense of the level of effort required to learn.

Nowadays, most classrooms, whether they exist within a company, at a military base, or on a computer screen, experience at least some differences in assessment practice and the attitudes toward it. The assessment state of the art in many places can be described (albeit cautiously) as more learner-centered than in the past. These changes may be ascribed to the impact of constructivist learning theories and methods such as active learning and learner-centered design. Improved practices and attitudes have also resulted from numerous assessment movements that have achieved notice, if not popularity, over the last few decades: authentic assessment, performance assessment, alternative assessment, formative assessment, portfolio assessment, embedded formative assessment, longitudinal assessment, and assessment for learning (which is distinct from assessment of learning).

This was once the forefront of assessment!

So, in this new age must we always be assessing? What’s best for learners?
For now as well as in the foreseeable future, some forms of student work
and performance will be prioritized above others as the significance of any
given assessment is socially constructed. For example, in adult education, as-
sessments that mirror authentic types of workplace tasks may be more great-
ly valued and better serve to articulate learning objectives. It’s important to
recognize that not all actions or learning artifacts individuals produce will
have equal value relative to learning goals, program objectives, or learning
outcomes. Part of the challenge, therefore, lies not only in designing and de-
livering effective assessments but also in prioritizing their applications and in
considering their broader roles within the learning ecosystem.

Building upon the progress made to date, assessment in the future must con-
tinue to empower education and training stakeholders. Understanding assess-
ment is no small feat, but to start, it’s useful to clarify the true purpose for
systems of assessment, including for singular high-stakes assessments, and to
encourage a mindset shift away from 20th century preconceptions that cou-
ple valid measurement almost exclusively with summative measures such as
tests, papers, quizzes, and the like. It’s also useful to become versed in devel-
opments arising from research in formative assessment, as well as its close
cousin, feedback—which has a symbiotic relationship with learning. Finally,
as we embrace a more technology-centric approach to learning, it’s useful
to consider the affordances that learners will require in environments where assessment may occur in real time and continuously.

Purpose of Assessment

The ostensible reason for assessing learning is to aid decision-making. However, assessments are quite often used to hold an entity or a person account-
able for meeting predefined criteria or achieving certain outcomes. As such,
student learning outcomes are almost always written to reflect some level of
desired change, such as the desire for increased performance on a standard-
ized test; advancement in subject-area ability; or achievement of a curriculum
objective defined by a certification entity, a state-level department of educa-
tion, or an employer. In a classroom, quizzes may be used to hold students
accountable for studying; at an organizational-level, standardized tests may
hold school districts accountable for collective performance, and in workforce
contexts, assessments might be used to assign accountability for adhering to
regulations by verifying employees have completed compliance training.

However, despite their practical utility, these sorts of accountability assessments of learning often have less utility for learning. Susan Hatfield, in Kan-
sas State University’s long-running practitioner paper-series on improving
learning in higher education, highlighted the distinction:

The best way to determine the reason for doing the assessment is by
examining the focus of the plan. Is the focus simply on collecting data?
Or is the focus on using data to improve student learning? Assessment
plans designed to appease others generally involve a lot of data collec-
tion but are rarely put to meaningful use. Plans that focus on student
learning connect collected data to potential courses of action.4

The potential "courses of action" Hatfield mentions might occur at various conceptual levels, from more immediate task- or course-focused perspectives
to organizational and lifelong learning considerations. In other words, whether
used for accountability or formative learning, assessments that inform macro-level decisions should (and generally do) differ from those at micro-levels. Macro-level decisions rarely, if ever, are based on a single source of evidence. In an education system, for example, the higher one goes up the decision-making tree from classroom, to school, to district, to state, the more important it becomes to aggregate the results from multiple, varied assessments and make a thoughtful human judgment called an evaluation. Evaluation is a complex art, dependent upon accurate data and a capacity for judgment derived from knowledgeable instructional practice. Experience with effective assessment and instruction is, in fact, the crucible that enables individuals to make good evaluative judgments.

Recently I took a test for Google for Education Certification. I thought it would be a typical test so I crammed… how I always tested. That's usually how you have to prepare for nearly every standardized test I've ever taken. However, when I started the test I realized it's not a crammable test! It's all practice based so I actually learned while I took it. I had all the tools, it felt like fun, and most of all it was meaningful. I hold this experience dearly!

When I took the Level 2 test in the series, I didn't prepare the same way! I looked at problems and thought through scenarios. I didn't even realize the hours passing in the test. I wasn't bogged down. From then, I pushed myself to start assessing my students in the same way.

Authenticity is the key. We're stuck in a century that's long gone. We need to let go of that and start encouraging the sort of growth mindset that allows students to perform and grow and struggle with dignity. This is how they'll feel prepared for life. The school of life… it's all competency based.

Kimberly Eckert
Teacher, Brusly High School
Louisiana State Teacher of the Year 2018

As evaluation enters the picture, it widens the aperture on the purpose and utility of assessments. Evaluations and other macro-level assessments should emphasize measures of effectiveness, that is, meaningful outcomes in terms of the impact of learning, such as college admittance rates or improvement in job performance. Measures of effectiveness are contrasted against measures of performance, or process-focused measures such as a student's grade-point average or how many people completed a training workshop.

This distinction gets to the heart of training and education. Whether individ-
uals are enrolled in a high school composition course, corporate training pro-
gram, or professional military education seminar, the aim of most formal and
informal learning is to engender practical competence—competence that’s
necessarily instantiated in a particular context or environment. For exam-
ple, if you tell students to achieve a set of general communication outcomes,
they’re likely to shrug and disengage. However, if you focus those students on
writing their college entrance essays, corporate work plans, or five-paragraph
field orders, they’re not only likely to show greater motivation but assessments
of their abilities are apt to be more authentic, meaningful, and reliable.

One of the most persistent problems in (adult) training and education stems
from inadequate understanding of how applied performance—real people
performing real jobs—relates to learning outcomes. Part of the challenge


lies in understanding the distinctions among competence, competencies, and
learning outcomes. Competence is a hidden property, inherent to a person,
team, or organization. It can't be directly assessed. Competencies, on the oth-
er hand, are the clusters of knowledge, skills, attitudes, attributes, and other
characteristics that attempt to itemize competence. In turn, these competency
descriptions can be used to articulate job requirements or to inform learning
outcomes for training and education. (See Chapter 13 for more details on com-
petency-based learning.)

Unfortunately, the more a certain activity requires higher-order cognitive


and social-emotional competence, such as intrapersonal communication or
leadership skills, the more difficult its components are to identify, define,
and assess. Similarly, practical competence requires the interplay of differ-
ent competencies (such as empathy and communication skills combined with
subject-matter expertise), which also creates difficulty. This is the classic “ice-
berg problem.” For example, capabilities your boss thinks are important for
your job are anchored to its most visible aspects, while you know that your job
also involves another set of less visible, less well-defined facets. The same is
true outside of an employment context; those capabilities that prepare some-
one for life, or to be a good member of our society, are problematic to charac-
terize, delineate, and measure.

In summary, having a clear view of the purpose of an assessment is the first step towards increasing its productive utility. The true purpose should be ana-
lyzed: Is the desire to measure the most meaningful, or merely the most conve-
nient, things? Has the system of assessment sufficiently addressed real-world
competencies, and are the assessments of sufficient breadth and depth to real-
istically measure them? Finally, what evidence is there that assessment results
are being used to improve instruction? To the latter question, the results from
assessments can inform instructional adaptations or organizational decisions,
and in particular, they can be used to generate valuable feedback to learners, teachers, trainers, and organizations.

WHAT LEARNERS NEED FROM ASSESSMENTS
By their very existence, assessments affect learning. Individuals will change
their behaviors if they know they’ll be tested, and completing an assessment
encourages learners to recall and exercise their knowledge and skills. Howev-
er, substantially more value comes from actually using the evidence collected
from an assessment. Unfortunately, all too often reams of data are produced
without any practical application of them.

1. Serviceable Feedback

The importance of feedback to assessment is vastly underrated, and what constitutes high-quality feedback is often misunderstood. At the most founda-
tional level, quality feedback should enable an instructional system to close
the loop—to come full-circle—while simultaneously affording learners and
organizations data that improves their development processes. Royce Sadler
observed in his widely cited article on formative assessment:

If the information is simply recorded, passed to a third party who lacks either the knowledge or the power to change the outcome, or is too deep-
ly coded (for example, as a summary grade given by the teacher) to lead
to appropriate action, the control loop cannot be closed and “dangling
data” substitute for effective feedback.5

The "control loop" in Sadler's quotation concerns the system-control function, which conceptualizes learning as a loop and feedback as an intervention
used to iteratively close the gap between the actual level and desired level of a
particular capacity. Assessment results that don’t meaningfully inform some
aspect of teaching and learning, or that fail to help this progression, are considered "dangling data."

The term “feedback” is not only vague but itself a misnomer. Assessment
expert Dylan Wiliam is fond of saying that it more aptly refers to the view
from the front windshield rather than the rearview. It can refer to performance
observations or advice, reflective prompts and questions, or other information
relevant to an individual or group; and it may refer to past, present, or future
performance.

So, as long as teachers and trainers deliver accurate and relevant feedback,
what’s the difficulty? It was Sadler 6 who again uncovered the key: There are
several reasons a learner may have trouble implementing feedback—even if
it’s of exemplary quality and delivered early enough in a period of instruction
to be useful. First, the line may be blurry for the learner between the work
as realized and what was intended; individuals may see potential where in-
structors may see flawed work. Second, terminology or criteria related to the
instructional task may not be understood. Third, students may fail to grasp
tacit knowledge. For example, statements such as “this doesn’t follow logical-
ly from what goes before" make no sense to students who don't recognize
the hallmarks of subpar writing structure: It looks fine to them. Last, learners
often cannot consolidate or apply advice fast enough for learning to stick. To
be effective providers of feedback, then, teachers and trainers need to better
understand learners’ own visions of their work, their challenges, and any gaps
in their learning. Also, learning facilitators would be wise to implement learn-
er self-assessments and peer assessment, since both can go a long way toward
meeting these needs.

Another model for the creation of more comprehensive and appropriate feed-
back comes from the work of John Hattie and Helen Timperley.7 They believe
that learners need three questions to be answered concerning their performance. First, they need information about the performance goal, which an-
swers the question, “Where am I going?” This includes specific and compre-
hensible success criteria and is referred to as the “feed up” stage. It’s followed
by the “feedback” stage, which answers the question, “How am I going?”
Lastly, the question is, “Where to next?” This final stage is called “feed for-
ward,” and it’s probably the most critical juncture for applied learning and
development. Hattie and Timperley also identify four targets for feedback:
feedback about the task, about the processing of the task, about self-regu-
lation, and about the self as a person. Their three questions apply to each of
these categories, and together these twelve targets become a useful, heuristic
catalog for learner feedback.

2. Evidence-Based Systems

As the characteristics of training and education evolve, enabled by the affordances of the future learning ecosystem concept, new models of assessment
and feedback can be more readily supported. For example, the proliferation
of new media devices, wearable sensors, and IoT appliances has correspond-
ingly created an abundance of data. Even without these new hardware tools,
someone’s activities (say, in a social-media app or on an e-commerce site) can
be tracked with uncanny precision. By analyzing an individual’s behaviors, as
revealed by these data, we can start to better understand their attitudes and
capabilities in ways unimaginable with legacy assessments.

Valerie Shute and colleagues popularized the concept of "stealth assessment," which involves interweaving assessments, informed by evidence-centered
design principles, directly and invisibly into the fabric of an application en-
vironment. For instance, they integrated stealth assessment into a popular
video game (Plants vs. Zombies 2), and from player interactions could infer
measures of their problem-solving skills. Shute et al. have recommended this
approach for applied, competency-based assessments, particularly for certain
ill-defined capabilities otherwise difficult to evaluate, such as persistence, creativity, self-efficacy, openness, and teamwork.8

Feed Up, Feedback, Feed Forward. Four targets for feedback: the task, the processing of the task, self-regulation, and the self as a person.

Shute and her colleagues advise against hiding assessments or evaluating in-
dividuals without their awareness; rather, the term "stealth" refers to the fric-
tionless integration of the measurement, where it’s inherently situated within
a task rather than an exogenous activity to it. Two other characteristics of
stealth assessment are that it’s continuous (in contrast to single-point summa-
tive testing) and probabilistic (in contrast to the predefined criteria frequently
used by standardized exams with well-defined correct and incorrect answers).

Stealth assessment can be supported by, or otherwise inform, various data-driven analysis methods. As discussed in Chapter 9 of this volume, learning analytics and educational data mining are two such approaches. Stanford
University Professor Candace Thille has drawn parallels to the way similar
technologies have transformed e-commerce: Companies can predict buying
patterns, use targeted advertising, and employ frequent A/B testing to contin-
uously improve their businesses. Analogous capabilities are being applied to
learning to uncover learner needs by group or type, help personalize learning
based on individual needs and characteristics, or help predict which individu-
als are likely to succeed in a given course.9

“The big power of this technology is that we can construct these interactions,
collect this data on students’ interactions, and use it to drive very powerful
feedback loops in the learning system." – Candace Thille10

However, stealth assessment, learning analytics, and educational data mining can suffer from the "dangling data" problem that Sadler mentioned. In other
words, it’s possible to estimate someone’s problem-solving ability, let’s say,
without taking steps to support its improvement or even communicating the
evaluation results to the learner. Ideally, such data shouldn’t merely be used to
pass external judgment—the results should be put to work, helping individu-
als and organizations better meet their goals. Further, this doesn’t just mean
using the data to inform automated personalization or AI-based adaptation.

With the growing use of automation, we run the risk of disempowering learn-
ers, teachers, and trainers. Despite their enormous potential, automated sys-
tems are only as strong as their weakest link—which is very often the user
interface and user experience. Even today, in arguably simpler times, comput-
er-assisted instruction is fraught with UI/UX design challenges, delivery tool
mismatches, and assessments that learners perceive as irrelevant. While new
technologies can enable more frequent and better attuned assessments, these
may be relatively meaningless if they fail to offer learners and instructors
sufficient interaction affordances, such as for understanding and making use
of the assessments, feedback, and subsequent intervention recommendations.

3. Learner Autonomy

Professor Jon Dron, from Athabasca University, posited a theory of transactional control, which may be relevant here. It builds on Michael Moore's


well-known theory of transactional distance, which essentially shows that the
relative “distance” someone feels in an e-learning context is based on the
amount of interaction and structure in it, rather than the physical separation
between learners and instructors.

Dron extended the transactional distance theory to highlight that control, or
the extent to which choices are made by teachers and learners, is its funda-
mental dynamic. The central idea is that flexibility, negotiation
of control (or “dialogue”), and autonomy all matter a great deal in learning
contexts.11 The solution isn’t as simple as giving learners (or instructors) full
autonomy; rather, a thoughtful approach, considerate of control, is needed. As
Dron explains:

Most learning transactions tend towards control by either the learner or,
more often, the teacher. From a learner perspective, being given control
without the power to utilize it effectively is bad: learners are by defini-
tion not sufficiently knowledgeable to be able to make effective deci-
sions about at least some aspects of their learning trajectory. On the oth-
er hand, too much teacher control will lead to poorly tailored learning
experiences and the learner may experience boredom, demotivation, or
confusion. Dialogue is usually the best solution to the problem, enabling
a constant negotiation of control so that a learner’s needs are satisfied...
The ideal would be to allow the learner to choose whether and when to
delegate control at any point in a learning transaction.12

A key takeaway is that learners must be afforded enough autonomy to remain


engaged, construct their own knowledge and skills, and develop their self-
regulation abilities. Striking the right balance between teacher-controlled—or
AI-controlled—learning and learner-regulated anarchy is key. As Dron’s
quotation highlights, systems that favor negotiated control, as much as
possible, are preferred. In the future learning ecosystem, this prompts us to
consider how control is distributed across individual and collective learners,
teachers, and automated systems.
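
One way to imagine negotiated control in software is sketched below; the
function, thresholds, and activity names are hypothetical illustrations, not
Dron’s formalism:

```python
# A toy sketch of negotiated control: the system recommends an activity, the
# learner may counter-propose, and the share of control delegated to the
# learner rises or falls with demonstrated success. The 0.5 threshold and
# 0.1 step are invented for illustration.
from typing import Optional, Tuple

def negotiate_next(recommended: str, learner_choice: Optional[str],
                   success_rate: float, autonomy: float) -> Tuple[str, float]:
    """Return (activity to deliver, updated autonomy weight in [0, 1])."""
    if learner_choice and learner_choice != recommended:
        # Honor the learner's choice only once enough control is delegated.
        chosen = learner_choice if autonomy >= 0.5 else recommended
        # Dialogue, not anarchy: success earns autonomy; struggle adds structure.
        autonomy += 0.1 if success_rate >= 0.7 else -0.1
    else:
        chosen = recommended
    return chosen, max(0.0, min(1.0, autonomy))

activity, autonomy = negotiate_next("fractions_review", "word_problems",
                                    success_rate=0.8, autonomy=0.5)
```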

RECOMMENDATIONS
Given the principles of assessment and feedback, as well as the opportunities
(and challenges) afforded by new technologies, there are several precepts to
consider regarding assessment and feedback for the future.

1. FIRST AND FOREMOST, CULTIVATE LEARNER MOTIVATION. As long
as designers of instruction strive to cultivate learners’ interest and motivation
with regard to assessment activities, assessments can serve as excellent change
agents.13
When designed and implemented well, assessments afford rich opportunities
to develop learners’ concepts, communication skills, subject-area expertise,
judgments, and abilities.

2. MAKE ASSESSMENT AND FEEDBACK LEARNER-CENTERED. Learn-
ers aren’t merely passive vessels but active participants who seek out useful
feedback when motivated to do so.14 Educators and trainers must try to view
assessment through their eyes. Success in assessment is tied to learner engage-
ment (like everything else in education and training). Even in an imaginary
future, where AI systems have the ability to determine learning priorities,
content, and sequence, learners will still need to be actively engaged, given
explicit feedback, and afforded agency over their own learning.

3. INTERWEAVE ASSESSMENTS THROUGH INSTRUCTION. Instruc-
tion and assessment have a truly symbiotic relationship; they’re inextricably
linked and interactive.15 A variety of types of assessment activities should be
threaded throughout lessons, modules, and courses of instruction. Even so,
assessments will always vary in terms of their relative importance, and this
is as it should be: The extent to which an assessment fulfills the overarching
objective of instruction represents the degree to which it possesses socially
constructed value.

1. Cultivate learner motivation
2. Make assessment and feedback learner-centered
3. Interweave assessments through instruction
4. Vary the types of data collected
5. Mitigate the fluency illusion
6. Plan for curricular alignment early on
7. Integrate feedback into learning design
8. Plan for systemic change
COMPUTERS AND HUMANS WORKING IN CONCERT: At Arizona State University,
we have some huge introductory courses, for example, College Algebra with 3000 students.
About 5 years ago, we created an adaptive general education structure. There are approx-
imately 13 modules for College Algebra, but if students finish early, they can enroll in the
stretch-version of the course—it doesn’t cost anything and it gives them credit for the second semes-
ter. We use a program called ALEKS for instruction, adaptive testing, and adaptive placement
to determine which courses each student is ready to take (Algebra, Precalculus, or Calculus).
Sometimes ALEKS isn’t perfect; so, perhaps someone ends up in College Algebra and they
get through the course in the first month—that’s fine! There’s one more aspect of this, but
it doesn’t scale well: Students are also required to attend class, where they’re coached by
teaching assistants and take exams. Every week or so, they work in small groups with more
difficult problems, and they’re scored as a group because collaborative problem-solving is
an important skill. During those days, it’s active—and the students are loud!—but it helps
keep them engaged in the course. Note that these are undergraduates; so, the in-person
courses also act as a bit of a clinical counseling apparatus. The assistants help mentor, and
if they find struggling students, they can refer them to a counselor. We have a web-based
system for the counseling staff, too. We’re really into helping first-time freshmen!

Courtesy of Kurt VanLehn, Ph.D., Professor, Computing, Informatics,


and Decision Systems Engineering, Arizona State University

4. VARY THE TYPES OF DATA COLLECTED. A functional system of assess-
ment for learning should be eclectic and incorporate a variety of measures
such as quantitative, qualitative, estimated, and predictive data types. This
approach suits the social-science aspect of the measurement objective. Look-
ing ahead, as the vision of an interconnected learning ecosystem comes to
fruition, assessment evidence from highly varied sources can be collected,
stored in persistent learner profiles, and examined in aggregate. This will start
to shed more light on competencies in situ as well as the interplay among di-
verse knowledge, skills, attitudes, and other characteristics.
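
A simplified sketch of what such aggregation might look like follows; the
sources, field names, and scores are invented, loosely in the spirit of standards
such as the Experience API (xAPI):

```python
# Pooling assessment evidence from varied sources into a persistent learner
# profile; real ecosystems would use a standard record format (e.g., xAPI).
from collections import defaultdict

evidence = [
    {"learner": "kim", "source": "simulation",  "competency": "triage", "score": 0.72},
    {"learner": "kim", "source": "peer-review", "competency": "triage", "score": 0.80},
    {"learner": "kim", "source": "field-exam",  "competency": "triage", "score": 0.65},
]

profile = defaultdict(list)
for record in evidence:
    profile[(record["learner"], record["competency"])].append(record["score"])

# Examined in aggregate: an equal-weight average here, though real systems
# might weight sources by reliability or recency.
for (learner, competency), scores in profile.items():
    print(f"{learner}/{competency}: {sum(scores)/len(scores):.2f} "
          f"across {len(scores)} sources")
```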

5. MITIGATE THE FLUENCY ILLUSION. Today, our most highly valued
assessments are usually summative performances (e.g., final exams, formal
presentations, final projects, professional portfolios) that differ in significant
ways from practice and study contexts. This discrepancy can create a “fluency
illusion,” where individuals misjudge their capabilities by thinking that their
fluency—or ability to remember and apply skills—in practice settings will
translate to performance scenarios. To mitigate this, learners require opportu-
nities for practice assessments such as pre-tests or trial performances that are
spaced out in time, occur in a mix of locations or under varying conditions,
and are sequenced in a special way that mixes problems or content elements
(referred to by educators and psychologists as “interleaved” practice).16
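
A minimal scheduling sketch shows how spacing and interleaving can be
combined; the interval lengths and topics are illustrative only:

```python
# Space reviews over expanding intervals and interleave topics rather than
# blocking them. Interval lengths are illustrative, not empirically tuned.

def review_days(start_day: int, gaps=(1, 3, 7, 14)) -> list:
    """Expanding spacing: each review comes after a longer gap than the last."""
    days, day = [], start_day
    for gap in gaps:
        day += gap
        days.append(day)
    return days

# Build one mixed queue across topics, then sort by day: adjacent sessions
# alternate between algebra and geometry instead of finishing one topic first.
queue = []
for topic, start in [("algebra", 0), ("geometry", 1)]:
    queue += [(day, topic) for day in review_days(start)]

for day, topic in sorted(queue):
    print(f"day {day:>2}: practice {topic}")
```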

6. PLAN FOR CURRICULAR ALIGNMENT EARLY ON. Good assessment
is planned for very early in the instructional design process, and it begins
by imagining what post-instructional success looks like. Outcomes and
assessments are like the “bones” of instruction and should be constructed first,
so that lessons may be structured around them.17 This process is referred to as
the backwards design of assessment.18 Relegating assessment to an ancillary
concern typically puts validity at risk by increasing the likelihood of measuring
achievements that are unrelated to the specific learning objective of interest.

7. INTEGRATE FEEDBACK INTO LEARNING DESIGN. As with assessment,
feedback approaches should be incorporated early into the instructional de-
sign process. While feedback as a dialogue between instructors and learners
is highly productive, learners can (and often do) obtain feedback from multi-
ple sources. How these multidirectional and distributed feedback loops fit into
the design of instruction requires planning.19 Explicit and thoughtful efforts
are needed, particularly as automation becomes more profuse, threatening to
reduce individuals’ control and transparency of learning. Good feedback de-
sign ensures that learners receive useful information that’s timely, actionable,
and customized to their needs.

8. PLAN FOR SYSTEMIC CHANGE. The most challenging aspect of assess-
ment is often the sleuthing necessary to figure out how all the parts fit togeth-
er: How do the instructional design, delivery, assessments, and measurement
data collectively tell the story of what a learning experience was like for a
group or individual, and how can we improve such experiences systematical-
ly? Organizationally, there should be a forcing function or mechanism that
causes the results of assessments to be utilized. However, teachers and train-
ers, or automated systems, shouldn’t make those decisions alone. Taking ac-
tion in response to assessment is important, but equally critical is considering
how to bring learners into that equation.

Conclusion

It’s strange that we don’t hear more frequent comparisons made between
the practice of teaching and the practice of medicine. Both require intense
amounts of skill, professional development, and consistent practice. As as-
sessment expert Dylan Wiliam says: Teachers need professional development
because the job of teaching is so difficult, so complex, that one lifetime is not
enough to master it.20 Mastering assessment in teaching is a bit like mastering
triage skills in the emergency room, in that successful intervention depends
on successful evaluation of the unique situation of each individual. And, yes,
because so much of our survival and future success depends on acquiring
effective training and education, one’s learning needs often are (at least in a
theoretical sense) as urgent as many health needs. Perhaps because nearly all
of us have been coaches, trainers to workplace apprentices, or teachers to our
own children, the instructional process may have lost its mystique somewhere
along the way. Hopefully, a clearer vision may help us appreciate the mystery,
regain some enthusiasm, and redefine as well as reimagine assessments to
work more effectively and purposefully to uplift and motivate our students.

CHAPTER 12

INSTRUCTIONAL STRATEGIES
FOR THE FUTURE
Brenda Bannan, Ph.D., Nada Dabbagh,
Ph.D., and J.J. Walcutt, Ph.D.

As education and training opportunities become ever more available—on de-
mand, anywhere, anytime, and across our lifespans—individuals increasingly
experience bursts and waves of disconnected, transitory, and episodic learn-
ing. Hence, it’s our challenge, as learning science practitioners, to help learn-
ers filter data noise, focus on relevant information, and meaningfully connect
new learning to past experiences. Towards that end, this chapter provides a
framework that illustrates a shift in thinking about instructional strategies, re-
focusing these principles to better support the future learning ecosystem and
foster connections across learners’ lived experiences. Building on traditional
instructional strategies shown to be effective in formal learning contexts, we
propose new approaches that cut across individuals’ learning episodes, poten-
tial careers, and lifespans.

Background

For decades, the design of instructional strategies (and learning systems, in


general) has been largely treated as a micro-level, reductionistic, and linear
activity—focused on analyzing particular learning outcomes, aligning them
with suggested instructional strategies, and then delivering instruction in
straightforward ways to elicit desired responses. However, today, learning oc-
curs in a multidimensional frame, blending formal, nonformal, and informal
experiences that transcend time, space, medium, and format. The complexity
of our lives and diversity of available technologies warrant a shift in learning
theory, away from standalone learning episodes that push information in a
singular manner and towards a multipoint, multimodal view where learning
crosses the boundaries of time, context, delivery methods, and devices.

Although networked technologies have already made it possible to support


ubiquitous lifelong learning, our teaching methods and instructional strate-
gies haven’t caught up with these new learning affordances. We’re still de-
signing at the module, course, or program-level, ignoring broader learning
pathways, and discounting the additive peripheral events learners encounter
throughout their lives. We need to modernize our conceptualization of “in-
structional strategies,” and expand these principles to support a more open,
flexible, and personalized learning ecosystem. We need to create continuous
and meaningful lifelong learning and find ways to incorporate elements from
diverse and informal contexts into it.

Fostering more cohesive, coherent learning will likely involve designing some
manner of “macro-level instructional arcs” that span a mosaic of individual
and collaborative learning experiences—meaningfully intersecting different
events across a lifetime. It will also require us to make better use of multimod-
al communication tools to help individuals curate information and generate
knowledge across experiences. This position reflects the connectivist view of
learning, which perceives knowledge as a network, influenced and aided by
socialization and technology.1 From this standpoint, knowledge isn’t only con-
tained within an individual or information artifact; it’s also distributed exter-
nally through networks of internet technologies and communities, accessible
via social-communication tools. Learning takes place in these autonomous, di-
verse, open, interactive, collaborative, and global knowledge systems. Hence,
recognizing relevant information patterns, constructing new connections,
and nurturing and maintaining connections become critical skills for achieve-
ment. Individual learning opportunities can be (and have been) designed
with this paradigm in mind; 2 the full solution, however, requires even more.

At IES [the Institute of Education Sciences within the U.S. Department of
Education], we funded two R&D centers to bridge cognitive science and
education.…This important work was especially useful in demonstrating what the
research to date has not addressed. When you take something that has been exten-
sively researched in the lab setting—like self-explanations, making comparisons, or
studying worked examples—and then implement those principles in the curriculum,
there are a lot of design decisions that need to be made: What kinds of comparisons need
to be made? And how do you present these ideas on a textbook page? What infor-
mation do you highlight and how do you highlight it in a textbook? In the lab, these
types of questions don’t come up. Another issue is, how do you combine learning
principles like retrieval practice, worked examples, etc.? Historically, we’ve studied
these principles in isolation, but when you combine them into a year-long learning
experience, there are many questions about how to do that effectively.

Erin Higgins, Ph.D.


Program officer within the Institute of Education Sciences
U.S. Department of Education

Limits of Conventional Instructional Design

Traditionally, an instructional designer begins with some given set of criteria


such as the lesson’s purpose and subject matter, learners’ general characteris-
tics, and likely some logistical constraints. From these, designers extrapolate
the type (e.g., psychomotor, cognitive, affective) and level of learning out-
comes (e.g., remembering and understanding, applying and understanding),
objectives of the associated assessments (e.g., formative, summative), and oth-
er delivery factors (e.g., course schedule, perhaps). They break the goals into
objectives, the objectives into tasks, and then select some set of instructional
interventions to help learners master each component. They continue work-
ing in this linear fashion—breaking down the plans into smaller and smaller
parts, and carefully considering the content, delivery, and learner activities for
each. This is known as “backwards design.” 3

The traditional approach to designing instruction generally assumes a given


target—a particular individual or cohort—as well as a specific setting and
general set of conditions. It focuses on determining the appropriate configu-
ration of instructional interventions in insular and finite curricular units, such
as a course or training program. However, as we envision learning across
lifetimes, this model no longer suffices. In the future, we need instructional
design that encompasses diverse learning experiences, media, populations,
and contexts—many of which will fall outside the instructional designer’s
purview. In other words, we need an updated approach that:
• Facilitates learning as a gestalt, derived from the collective
sum of all learning events and experiences;
• Recognizes learning outcomes are increasingly self-directed and
stitched across different contexts, networks, and communities; and
• Actively incorporates technology to enable learning—not
only as an instructional delivery mechanism but also as
the “glue” to connect learning events to one another.

Consequently, we need a multidimensional model of instructional design that


integrates traditional micro-level interventions as well as macro-level princi-
ples, that considers not only instructor interventions but also learners’ own
agency, and that actively connects experiences across the crisscrossing land-
scape of learning.

Strategies and Tactics; Instruction and Learning

Instructional design terminology is used in a hodgepodge of ways.4 We won’t


attempt to unkink it, but it’s useful to highlight several terms. First, consider
“instructional strategies” (also frequently called “teaching strategies”). This
is the most common way to refer to the instructional interventions used by
teachers, trainers, and instructional designers. In more careful discussions,
this concept is typically divided into “instructional organizers,” at a more
global level, and “instructional tactics” at a more granular one.5 Exactly where
the lines are drawn between these levels is a bit fuzzy—and largely irrelevant
to our discussion. What’s more applicable is the general idea that there are
instructional design distinctions at different conceptual and granular levels.

The second important distinction comes in comparing instructional strategies


to learning strategies. Where instructional strategies are devised and applied
by learning experts to some planned block of instruction, learning strategies
are personal methods used to improve one’s own knowledge, skills, and expe-
riences across the range of formal and informal learning. In theory, learning
strategies and instructional strategies mirror each other. For example, an in-
structor might design a lecture, provide some illustrative examples, and give
feedback. Meanwhile, a learner may work to memorize terms, mentally com-
pare-and-contrast new ideas to prior knowledge, and reflect on performance.

In many ways, the distinction between instructional strategies and learning


strategies is a question of control. As discussed in the previous chapter, trans-
actional control (or the extent to which the learner makes decisions versus
some external authority, such as the instructor or software) is an important
factor. As one might expect, control of learning can be handled in different
ways: Internally by the learner, externally by some structure or authority, or
insufficiently, without effective support from either internal or external sourc-
es. Also, as Jon Dron’s transactional control theory emphasizes, some form of
negotiated control, in the middle of the internal–external control continuum, is
best.6 Hence, the notable concept here is not only the contrast of instructional
strategies to learning strategies, but also the potential for their integration—
that is, blending learner-directed and authority-directed strategies together.

One final distinction for the future learning ecosystem is belied by its name.
Why is it an ecosystem; why not just a regular, old system? An ecosystem, by
definition, is comprised of interconnected parts, with the behaviors of many
individual agents affecting one another as well as the environment’s over-
all holistic pattern. It’s a dynamic system, in the engineering sense, involv-
ing many dispersed, interdependent, interacting elements, and, notably, it’s


not guided by some top-down, centralized control. Some portions may be
structured and designed, while others act or interact with their own agency.
Consequently, for our learning ecosystem, how we understand instructional
structure and learning is an essential consideration.

THE EXPANDING CONTEXT OF FUTURE LEARNING
To advance instructional theory, it’s necessary to expand its design towards a
modern, longitudinal view of learning, one that facilitates connectivist prin-
ciples and seeks to amplify outcomes throughout an array of teaching and
learning situations, across multiple contexts, diverse learning objectives, and
disparate learning modalities. This section outlines eight principles likely to
shape the purpose and application of instructional strategies in this complex
future context.

1. Connect diverse learning experiences

Explicit in the “ecosystem” concept are the notions of diversity and intercon-
nectivity. Most relevant here are the diversity of learning experiences and
their complex interconnectivity with one another. As humans, all of our experi-
ences naturally affect one another. The question is not simply “how to ensure
learning episodes are somehow additive,” but rather how to intentionally build
meaningful and effective connections among learning episodes that advance
overall learning goals. Even within a relatively constrained setting, like a sin-
gle course, instructors and instructional designers need to broadly consider
multiple and varied learning modes and, importantly, how to help connect
learners’ experiences across them. As a simple example, consider a semester-
long class that incorporates face-to-face seminars, online courseware, an
additional smartphone app used to remediate some students, and informal
resources, such as videos or blogs, that students find online. Courses that
blend these sorts of resources are already common. Part of the challenge,
however, is gracefully navigating the available set of learning-resource options
and intentionally integrating them so that they not only coexist but also
correlate.

This mosaic of learning components, of course, is often more complex than
this example describes. In reality, learning experiences span multiple formal
and informal events, timespans, and contexts, contributing to an ever-evolving
trajectory of reconfigured and connected experiences, through the lifespan,
across multiple contexts, and intersecting with varying developmental
dimensions (such as psychomotor, social, emotional, and cognitive learning).
An ongoing challenge for learning professionals, then, will be to help learners
integrate these myriad experiences in thoughtful ways.

The transitions for learners from K–12 to postsecondary education are
significant, and if we really want to learn about accumulated learning, we
have to have data systems that talk to each other. In the science standards,
we’re thinking about the progression of learning over time. Learners need
time to digest what they’re learning in a deep way.

Heidi Schweingruber, Ph.D.
Director, Board on Science Education, National Research Council,
U.S. National Academies of Sciences, Engineering, and Medicine

2. Connect to, and enable outside connections from,
learning opportunities beyond the planned instruction

The preceding example described the integration of learning resources around


a central unifying core (a single course). This is good, but we need to think
even broader. In addition to the planned activities designed in or around a
particular formal learning event, learning professionals need to consider the
impact of learning activities that take place outside of their direct control or
even full awareness, such as independent self-directed learning, informal ex-
periences, and other external formal activities (such as courses taught by other
teachers on different subjects). Too often, teachers and trainers focus solely
on the activities taking place within their purview, that is, within their formal
learning episode. This may cause those learning professionals to inadvertent-
ly overlook individuals’ prior experiences, concurrent learning activities, or
the future learning events they might encounter. Linking to prior or external
learning isn’t new guidance, but the growing availability of well-designed
informal learning resources combined with interconnected technologies and
interoperable data make these linkages more achievable and more necessary.
For the future, it’s important to consider instructional strategies that tie-in to
these other learning activities and also to create “hooks” in the formal learn-
ing materials we create, so that learners or other learning professionals can
better link our work into their own learning environments.

3. Connect learning across levels of abstraction

When a child learns to read, we first start by teaching sounds and letters; once
these are learned, we teach words, sentences, punctuation, grammar rules,
comprehension, and eventually one day maybe professional investigative
journalism or creative screenwriting. The point is that different capabilities
emerge from the integration of competencies at a given level of analysis. The
“levels of analysis” concept describes the level of abstraction at which some-


thing is affected or evaluated, with the implication that the elements at each
level relate to one another. Computational neuroscientist David Marr has gone
so far as to say:

Almost never can a complex system of any kind be understood as a sim-
ple extrapolation from the properties of its elementary components…If
one hopes to achieve a full understanding of a system…then one must be
prepared to contemplate different levels of description that are linked,
at least in principle, into a cohesive whole, even if linking the levels in
complete detail is impractical.7

In the learning domain, considering learning at different abstraction levels
helps us plan the immediate activities (micro-level interventions), broader but
still bounded experiences (macro-level interventions), and expansive lifelong
learning arcs (meta-level interventions). As indicated in the earlier “Strate-
gies and Tactics; Instruction and Learning” section, precisely distinguishing
where one level ends and another begins is less important than the general
concept. That concept is that we need to consider is how to better combine
the micro- and macro-level approaches to designing instruction (the typical
instructional tactics and strategies experienced designers already use) along
with new macro-level strategies to create a multidimensional, multilayered
model that helps learners aggregate and make sense of learning experiences
across devices, modalities, episodes, and learning dimensions. The idea is to
support learners beyond the context of a given course or training event, to
help them integrate these into a more holistic course of study. For instance, a
university mentor might help a graduate student understand how the differ-
ent courses, job-study projects, and internships coalesce—creating integrated


meaning beyond their individual parts. How do we provide similar support,
but more broadly and outside of a narrow academic context? How do we help
people extrapolate meaning across otherwise unconnected activities and inte-
grate experiences in ways that expand those activities’ individual values? And
how do we do this across longitudinal periods—not only during a semester or
academic program, but at a lifelong learning scale?

4. Consider the “in between” learning spaces

This multilayered model of learning might appear to simply connect pinpoints


of learning across time, space, and modality—like a pointillist painting that
reveals an image from separate daubs of paint. But the concept goes beyond
that. Unlike paint blotches, which are individually contained and otherwise
inert, each learning experience is dynamic and complex. Further, the “space”
between learning experiences—that is, the new value derived from merg-
ing or reconceptualizing learning “frames” in response to their integration or
comparison—differs from the largely additive emergent qualities of a Georg-
es Seurat masterpiece. In other words, the challenge for learning professionals
is this: How do we capitalize on the abundance and diversity of learning ex-
periences in creative and deeply meaningful ways? Can we do more, for in-
stance, than simply reminding students of prior knowledge or asking working
professionals to consider how new concepts fit into their jobs? Can we build
something more than the sum of the learning parts?

Some “levels of analysis” hierarchies include a middle or meso level to refer to


the connections between the other levels. We’re modifying this concept slight-
ly and using the term meso-level to refer specifically to those interventions
aimed not merely at linking across experiences but also producing unique
added value from the correlations. This involves more than just linking across
time horizons or subject matters, although those are both relevant. It also in-
volves aggregating concepts at a given level so that new and integrated capa-
bilities emerge.

5. Help learners filter overload

As discussed in Chapter 4, cognitive overload poses a serious problem for
individuals, who can readily become overwhelmed by the sheer amount and
velocity of information. Learners need new supports that help them filter out
“noise” and meaningfully integrate the relevant “signals.” If not addressed,
we run the risk of increasing information acquisition to the detriment of deep
comprehension and robust knowledge construction. The multilayer, intercon-
nected model we’ve discussed in this section emphasizes this complexity. The
challenge for learning professionals is to help learners navigate through infor-
mation overload and to develop the internal cognitive, social, and emotion-
al capabilities needed to self-regulate against it. Some strategies to support
this have been discussed in prior chapters, including social and emotional

competencies (Chapter 4), self-regulated learning skills (Chapter 15), and so-
cial learning supports (Chapter 14). Mentoring learners in these areas can
help, as can specifically teaching techniques for managing overload including
connectivist skills, curation, and metacognition.

6. Help learners use connectivist learning strategies

Connectivism emphasizes the importance of distributed knowledge and capa-
bility. For example, rather than knowing how to bake banana bread, one sim-
ply needs to know where to find recipes online, how to select the best video
tutorials, and which friend to phone when a little extra assistance is needed.
Navigating through these technical and social networks is a primary skill—a
critical learning strategy—associated with connectivism. Although the multi-
layered, interconnected model discussed so far has emphasized instructional
strategies (i.e., those things learning professionals do to help support learn-
ing), it’s also important to consider learning strategies. By definition, these
must come from the learners, themselves; however, learning professionals can
enhance and support learners’ abilities. Instructors and good instructional de-
sign can help learners develop their connectivist learning skills and associated
self-regulation strategies to help them navigate complex social, cultural, and
informational networks.

7. Help learners curate resources and knowledge

Information and communication technologies offer new ways of discovering,


organizing, and later retrieving information. Often learning instances and
other information can be digitally captured, processed, aggregated, and stored
for retrieval across time, contexts, and devices. This notion relates to connec-
tivism, and it highlights the importance of developing related learning strat-
egies (e.g., how to organize and retrieve curated information). Over the last
decade, personal learning environments have become popular; these online
systems help learners and their teachers manage learning resources. Looking
ahead, learning professionals will need additional tools and mentorship strate-
gies to continue to support such curation activities across increasingly “noisy”
and diverse settings.
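
As a toy sketch of the kind of curation support a personal learning environment
might offer—with an invented data model and placeholder resources—consider:

```python
# A tiny sketch of resource curation: items captured from many contexts are
# tagged for later retrieval. The schema is invented, not a real PLE's model.
from dataclasses import dataclass, field

@dataclass
class Resource:
    title: str
    url: str
    tags: set = field(default_factory=set)

library = [
    Resource("Knot-tying video", "https://example.com/knots", {"first-aid", "video"}),
    Resource("Triage checklist", "https://example.com/triage", {"first-aid", "job-aid"}),
    Resource("Stats lecture notes", "https://example.com/stats", {"math"}),
]

def retrieve(tag: str):
    """Filter the curated library by tag, across devices and contexts."""
    return [r for r in library if tag in r.tags]

for r in retrieve("first-aid"):
    print(r.title, "->", r.url)
```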

8. Blend instructor- and learner-controlled strategies

This section has outlined guidance for instructional strategies as well as pos-
sible interventions to help develop and activate learners’ own internal learn-
ing strategies. This final item highlights that both external expert-directed
learning controls and learner-directed self-regulatory interventions are
critical. Over time, individuals should develop the desire and ability to exert
more independent control. However, many learners need help cultivating their
self-directed learning abilities, hence a negotiated mix of instructor-controlled
and learner-controlled approaches is needed. The role of the instructor in
these new multidimensional contexts, therefore, needs to expand and grow
in flexibility, shifting to encompass the roles of activator, facilitator, coach,
mentor, and advisor.8

STRATEGIES FOR MEANINGFUL FUTURE LEARNING
The prior section outlined eight principles for the application of instruction-
al strategies in the future learning ecosystem context; however, it didn’t de-
scribe the strategies, themselves. Hundreds of instructional strategies and,
likely, thousands of corresponding tactics have been tried and tested. Rather
than provide a litany of these, we’ve identified five generalizable principles
of meaningful learning well-suited for instructional strategies in this context.

These methods will help create active, constructive, cooperative, authentic,
and intentional learning interventions.

Meaningful learning is grounded in and driven by epistemological orienta-
tions and theoretical foundations that are primarily constructivist, social con-
structivist, and connectivist in nature. In constructivism, learning is char-
acterized as “constructing” or creating meaning from experience such that
knowledge comes from our interpretations of our experiences in an environ-
ment and emerges in contexts where it’s relevant.9 In other words, the mind
filters inputs from an environment or experience to produce its own unique
reality or understanding. Therein lies the intentional (goal-directed, regula-
tory), active (manipulative, observant), constructive (articulative, reflective),
and authentic (complex, contextualized) principles of meaningful learning. In
social constructivism and connectivism, learning becomes a process of col-
lection, reflection, connection, and publication.10 Therein lies the cooperative
(collaborative, conversational) principles of meaningful learning.

Strategies in Application: An EMT Example

Consider an example of a young woman who, upon high school graduation,


enrolls in an Emergency Medical Technician (EMT) training program. The
program incorporates multiple courses delivered via didactic instruction and
labs, followed by integrative in-the-field clinical experiences. Throughout
the program, her learning is supplemented by various digital tools including
e-books, practice simulations, and a micro-learning study app.

At a micro-level, the instructional strategy of scaffolding can be used to cre-
ate a supportive and responsive environment to help the novice EMT progress
towards becoming a paramedic. Scaffolding involves assessing what learners
can do, helping them reflect on what they know, identifying needs and goals,
providing individualized assistance towards these goals, and offering oppor-
tunities for learners to internalize and generalize their learning.

[The characteristics of meaningful learning, adapted from Howland, Jonassen,
and Marra (2012): ACTIVE (manipulative and observant), CONSTRUCTIVE
(articulative and reflective), COOPERATIVE (collaborative and conversational),
AUTHENTIC (complex and contextualized), and INTENTIONAL (goal-directed
and regulatory).]

In this example, the instructors might engage the EMT trainee in intentional,
goal-di-
rected, and regulatory behaviors to prompt a connection between what she
learned in the EMT training course and how she can extend the physical and
cognitive dimensions of EMT training into future paramedic training.

The instructional strategies of modeling and explaining can also be used


to help transition learners in their learning trajectories. In modeling and ex-
plaining, instructors demonstrate a process while also sharing insights be-
yond the obvious, such as telling learners about why a task is performed in a
certain way. In the case of the EMT trainee, her instructors—whether human
or AI coaches—can model and explain what, how, and why paramedics per-
form certain procedures while also demonstrating the social and emotional
aspects involved in these tasks. Modeling and explaining can take place in
authentic contexts, which helps present the concepts at the appropriate level
of complexity and portray the interplay of dimensions associated with them.
For instance, for the EMT example, this could be done in a simulated or real
ambulatory run. The EMT trainee, in this case, might be asked to articulate,
reflect, and engage in constructive thinking through observation of expert
performance. She might also be challenged to extend her knowledge beyond
her comfort zone, such as to consider the next phase of her professional and
personal development as a future paramedic.

In addressing more macro-level instructional interventions, we can expand


traditional strategies to incorporate organizational, elaborative, exploratory,
metacognitive, collaborative, and problem-solving elements across the vari-
ous dimensions of learning. These macro-level strategies can be connected
or “threaded” to incorporate higher-level objectives, such as encompassing
a defined career path or advancing a current professional situation. Each in-
dividual’s journey through a lifetime of formal and informal experiences is
somewhat unique and may incorporate multiple contexts and educational
events. Hence mapping and organizing a learner’s cohesive transition, with
the important consideration of “the spaces in-between” (the meso-level of de-
sign), as well as the integration of instructional experiences and major life
events, become important areas of focus for future learning design.

Upon completion of paramedic training, coaching and mentoring can be


used as crossover instructional strategies to further scaffold learners towards
the next phase or experience in their lifelong learning trajectory. Coaching
and mentoring are related. They involve observing learner performance and
offering assistance to bring it closer to expert performance (coaching), as
well as acting as role model, advising, and supporting learners in attaining
goals and in overcoming barriers and challenges (mentoring). As learners set
goals for real-life situations, coaches and mentors provide support through
dialogue, with social negotiation, and by engaging learners in actively seeking
information, researching the issues, and finding solutions to meaningful and
authentic problems.11

STRATEGIES FOR MEANINGFUL LEARNING

Instructional strategies such as scaffolding, modeling and explaining, and coaching and
mentoring can support meaningful learning within and across different levels: 12

COOPERATIVE (collaborative, conversational)


• Enable collaborative and conversational interactions between learners and instructors,
mentors, tutors, or instructional systems
• Encourage learners to engage in collaborative and conversational activities through
sharing ideas, listening to each other’s perspectives, and co-constructing knowledge
• Help learners work together in communities to accomplish the task at hand

AUTHENTIC (complex, contextualized)


• Use authentic processes and contextualized examples to present concepts and domain
knowledge at appropriate levels of complexity
• Engage learners in authentic activities that are complex and contextualized
• Encourage learners to actively seek information, research issues, and find solutions to
meaningful and authentic problems

CONSTRUCTIVE (articulative, reflective)


• Enable active and constructive learning by challenging learners to perform beyond their
comfort zones
• Engage learners in active and constructive thinking, for instance, by representing their
understanding in different ways, using different thought processes, and challenging them
to develop and defend their own mental models
• Create opportunities for learners to think constructively while considering experts’
performance, articulation, and reflective practice

INTENTIONAL (goal-directed, regulatory)


• Encourage goal-directed and regulatory behavior by keeping learners’ intentions at the
forefront of the learning task
• Engage learners in reflective and intentional behavior, encouraging them to analyze their
actions, compare them to others, and, ultimately, to form expert knowledge and skills
• Help learners set achievable goals and manage the pursuit of these goals through a
process of exploration and inquiry

ACTIVE (manipulative, observant)


• Engage learners in active learning through observing the consequences and results of
their actions and by assessing and evaluating their knowledge
• Enable learners to consciously think about their observations and actions thereby
constructing new knowledge and restructuring their understandings accordingly

In the EMT example, this means engaging the EMT trainee, who (let’s say)
is now a paramedic, in authentic (complex, contextualized) and cooperative
(collaborative, conversational) activities to help her think about how to extend
her physical, cognitive, emotional, and social knowledge of being a paramedic
further, maybe encouraging her to consider the perspectives of a physician’s
assistant. This might involve shadowing a physician’s assistant at a hospital,
observing what they do, and actively considering how her current and emerg-
ing medical knowledge and skills as well as her social and emotional compe-
tencies (such as bedside manner) might apply. This type of experience allows
learners to work in authentic settings, and it engages them in collaborative and
conversational interactions with their coach or mentor as well as with their
peers. All this enables them to share ideas, listen to each other’s perspectives,
and co-construct knowledge. As illustrated in this example, the instructional
strategies of scaffolding, modeling and explaining, and coaching and mentor-
ing can be used as crossover instructional strategies to create meaningful con-
nections that help learners transition across experiences, set lifelong learning
goals, and achieve those goals across the lifespan.

Macro-level instructional strategies can inform larger and larger units of in-
structional and professional development, and adding meta-level structures
also helps support a lifetime of growth across multiple careers, experienc-
es, and interests. This supports continual expansion of knowledge, multiple
learning itineraries based on learners’ competencies and interests, and multi-
ple tools for manipulating resources. This includes not only formal learning
experiences but also informal and life experiences, all intimately connected.

Viewing learning across the lifespan as a networked and connected ecosys-
tem of experiences opens new opportunities for instructional strategies. Each
individual may have a different learning trajectory and mosaic of experiences
threaded together across education and training, major career events, multiple
careers, and other lifetime activities. Like a puzzle that’s never quite finished,
learners progressively add to their learning landscapes while also benefiting
from the integration of the elements within them. The technological advances
described throughout this volume have created the capacity to provide learn-
ers with connected and cohesive learning across their lifespans.

SUMMARY
Instructional strategies can incorporate interventions, such as scaffolding,
modeling and explaining, and coaching and mentoring, to provide the glue
that meaningfully supports connected and cohesive experiences across a
learner’s lifetime. Thinking about the continuum of future learning, we need
to consider these strategies at multiple levels—not only within a particular
instructional event or course of study, but across learners’ longitudinal trajec-
tories. Accordingly, a significant challenge for the future is the differentiated
application of instructional interventions across conceptual areas, learners’
developmental phases, content modalities, and levels of abstraction—while
also considering the impact of composite learning experiences.

Such learning experiences can be implemented using experiential, collabo-
rative, and personalized instructional models that target cognitive, psycho-
motor, emotional, and social skills across distributed contexts including in-
dividual and collaborative activities; these, of course, will also be facilitated
by a variety of delivery formats, modalities, and technologies. Thus, we must
consider a new model for how to organize and recommend instructional strat-
egies within this non-linear, lifelong, personalized learning continuum. How
do we ensure such strategies are coherent to learners and that they improve
upon (rather than add noise to) the potentially overloaded learning environ-
ment? How do we help teachers, trainers, mentors, and automated systems, as
well as learners themselves, use appropriate strategies in this crowded future
learning environment? Many other learning science questions persist. How-
ever, it’s clear that to realize the full promise of the future learning ecosystem,
we need to apply considered strategies across it—strategies that combine mi-
cro- and macro-level instructional activities with meta-level considerations,
that identify and support “the spaces in-between” learning episodes at the
meso-level, and that help learners develop and apply their own learning strat-
egies to navigate the complexity of the world around us.
We need a better system to federate and integrate multiple
learning experiences throughout a career, across organizational
units. Transcripts have been used for years, for child to young-adult
education…but there isn’t a good portable transcript system for
professionals to securely identify what learning experiences they’ve
completed and their interests to learn related content areas through
personalization. As workers move through their organizations and
careers, the learning record really should follow them more closely
and accurately.

John Landwehr
Vice President and Public Sector Chief Technical Officer, Adobe

CHAPTER 13

COMPETENCY-BASED
LEARNING
Matthew Stafford, Ph.D.

Competency-based learning isn’t new. It evolved from the following four in-
novations: The parsing of learning into specific chunks of skills and knowl-
edge; the creation of learning outcomes to clearly establish levels of mastery;
assessments that allow learners to demonstrate their mastery; and most re-
cently, a focus on the learner and the learning (outputs) versus a focus on the
teacher, the curriculum, and the time invested (inputs).

The first of these advances traces back centuries to the age of guilds and
apprenticeships. Master craftsmen parsed their specialties into a variety of
discrete tasks and then trained their apprentices to perform those activities to
appropriate levels of mastery. Another remnant of the age of guilds is the con-
cept of varying levels of mastery. Aspiring craftsmen started as apprentices
and advanced through the assorted levels. Only after demonstrating mastery
of every aspect of the craft, would the tradesman graduate the apprenticeship
at the full craftsman status.

This parsed-learning approach still exists across widespread training pro-
grams today. The military employs this approach with its enlisted personnel,
training and certifying members on specific tasks. One can also see it in in-
dustry and, not surprisingly, in the wide variety of vocational-education pro-
grams that prepare students for jobs in industry. These modern settings also
borrow the performance levels from classic trades-training to indicate prog-
ress from novice to master. Ironically, the Air Force—the youngest of the U.S.
Historically, a cooper (“barrel-maker”) would train an apprentice on selecting
trees and forming the individual staves. Equipped with these skills, the apprentice
would progress to assembling the staves into the barrel form, installing the
retaining hoops (forged by a fellow craftsman, a blacksmith) and “rounding”
the barrel’s interior. Next, the apprentice would master the art of finishing the
barrel, so it would seal. Then there was the complex task of cutting the croze—
the groove into which the head and foot rest,
installing the head… It was a complex series of
tasks requiring a variety of specialized tools!
Even after mastering barrel-making, however,
an apprentice had more to learn. In addition
to barrels, coopers also made casts, vats,
buckets, tubs…a sundry of wooden vessels
made from individual wooden staves. Only
after mastering all of the knowledge, tools,
processes, and specific tasks associated with
all of the vessels, would the master craftsman
honor an apprentice with the title “cooper.”

…the unlikely forerunner to competency-based learning

military branches—even employs old “guild language” to label its Airmen’s
skill levels: 1 for helper, 3 for apprentice, 5 for journeyman, and 7 for
craftsman.1

Although born in training, the application of this “levels
of mastery” approach eventually found its way into education, largely due to
research into learning theory. In 1956, for instance, Benjamin Bloom posited
specific levels of mastery within the cognitive domain of learning.2 Equipped
with these descriptions, teachers and instructional designers had consistent
levels of capability they could target. Well-defined cognitive outcomes mark
the second of the four innovations that led to competency-based learning.
What educators needed next were authentic assessments to validate that
learners had reached the desired levels of mastery. Authentic assessments are
those in which students have to demonstrate meaningful applications of their


knowledge and skills. A classroom assessment that matches real-world work-
place activities, for instance, would be “authentic.”

Authentically assessing performance in the cognitive domain, however, is dif-
ficult. The mastery of these less tangible concepts—the ability to formulate an
effective argument, for instance—is complicated. Demonstrating conceptual
mastery is even more so. Educators are forced to “sample” desired behaviors
and then, equipped with these samples, make informed judgments on the lev-
els of mastery students have achieved. Over time, educators have progressed
in this art, creating performance-based assessments that actually measure lev-
els of mastery, even in “soft skills.” Effective, authentic assessments were the
third innovation contributing to competency-based learning; however, assess-
ments play a far more important role than simply measuring mastery—they
actually drive learning.

Contemporary learning theory, based on evidence-informed research and


neuroscience principles, makes it clear that the best results occur when indi-
viduals take responsibility for their learning. Terry Doyle, an accomplished
learning science author and professor emeritus, is fond of reminding his
readers, “The one who does the work does the learning.” 3 Assessments can
empower learning by making learners do the work. For instance, instead of
devising detailed courses of study, teachers can instead focus on designing
effective assessments, describing them to students, and then helping learners
find their own paths to success.

It may sound shocking to some; however, this is how most informal learning
occurs. Someone buys a lawnmower and turns to YouTube to figure out its
assembly and how to get it running. Someone else goes online to figure out
how to change the oil filter in an antique automobile. Gamers have special
websites to share tips on how to win in their favorite video games. Even those
who sit and practice the lost art of “reading the manual” are benefiting from
informal, self-directed learning. In each case, there are no formal classes.

Learners can spend as much or as little time as necessary to achieve their
learning goals. The focus is on reaching the desired level of mastery. This
learner-centric approach, where the source of learning is less important than
the mastery of it, catalyzed the final innovation contributing to competen-
cy-based learning.

The classical model of education posited learning as a somewhat passive
pursuit. Learners sat and listened to lectures or read from books in order
to memorize facts.

This innovation is, arguably, the most rev-
olutionary for contemporary learning pro-
fessionals: In competency-based learning,
performance becomes the constant and
time becomes a variable. This is in direct
contrast to the traditional approach to train-
ing and education, where time is constant.
In this classical model, learners attend
classes that run so many days, in programs
that span so many months… The Carnegie credit-hour system, underpinning
many U.S. educational programs, exemplifies this time-based approach. Sim-
ilarly, traditional learning professionals talk of “seat time” or “contact hours.”
In all cases, time is the constant and performance varies. Some learners sit
through an entire course of instruction and master all of the objectives, earn-
ing an ‘A.’ Others, sitting alongside these top performers the entire time, don’t
do as well. Performance varies.

In competency-based learning, however, all of the learners work to achieve


the desired level of mastery. Some will do it the first day. Others will take
longer. Further, in these outcomes-focused settings, some learners may show
proficiency even before exposure to the prescribed curriculum. Perhaps they
already mastered the skills and knowledge in previous experiences. Regard-
less of the source, if they demonstrate mastery, they earn the credential and
advance in their learning. Others will require a complete program of instruc-


tion. Again, performance is the constant and time is the variable.
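
To make the contrast concrete, here is a brief, hypothetical sketch of a
mastery-gated progression loop; the threshold and function names are invented
for illustration:

```python
# "Performance constant, time variable": learners attempt an authentic
# assessment when ready and advance only on demonstrated mastery, however
# many attempts that takes. The 0.80 threshold is an assumed example.

MASTERY_THRESHOLD = 0.80  # performance standard held constant for everyone

def progress_to_mastery(learner: str, competency: str, perform) -> int:
    """perform(learner, competency) -> score in [0, 1] from an assessment.
    Returns the number of attempts needed; time is the variable."""
    attempts = 0
    while True:
        attempts += 1
        if perform(learner, competency) >= MASTERY_THRESHOLD:
            return attempts  # credential earned, regardless of where or how learned
        # otherwise: self-directed study, coaching, or practice, then reattempt
```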

Another aspect of competency-based learning that causes confusion is the


concept of competencies. There are many different interpretations of this term.
For some, it refers specifically to a performance and encompasses knowledge,
skills, abilities, aptitude, and self-concept. Others define competencies far
more narrowly, describing them in terms of specific skills or specific areas
of knowledge. Looking at the definitions below, it is easy to see why there’s
confusion over the term.

A few of the competing definitions of a competency include:

• “…a clearly defined and measurable statement of the knowledge,


skill, and ability a student has acquired in a designated program,” per
the Southern Association of Colleges and Schools Commission on
Colleges.4

• “…a measurable pattern of knowledge, skills, abilities, behaviors, and other characteristics that an individual needs to perform work roles or occupational functions successfully. Competencies specify the ‘how’ of performing job tasks, or what the person needs to do the job successfully,” per the U.S. Office of Personnel Management.5

• “…observable, measurable pattern of knowledge, skills, abilities, behaviors, and other characteristics needed to perform institutional or occupational functions successfully,” per the U.S. Air Force.6

• “…a student’s ability to transfer content and skill in and/or across content areas,” as defined in the book, Off the Clock, which outlines a roadmap to competency-based education.7

In competency-based learning, performance is key; performance standards are held constant while time may vary. Suppose a coach goes into the team assembly room and explains:

Next Friday, I’m going to put this 48” stick into the ground vertically, like this. I’ll expect each of you to jump over it without touching it. Those who do so will accompany me to the track-and-field competition the next day.

What would happen? The traditional approach would be to build a course that teaches athletes how to jump higher. In this instance, however, the coach has turned the learning task over to the learners: Those athletes who want to attend the competition are going to put a 48” stick into the ground and start practicing ways of jumping over it. Some will try a standing jump (a leap from a stationary position); others will try a running jump. Still others might try the famous “Fosbury Flop,” the popular high-jumping technique where athletes pass over obstacles and land on their backs. Each athlete will approach the task in their own way, leveraging their individual strengths so they can demonstrate mastery of the assigned task.

Harry S. Truman once noted, “It is amazing what you can accomplish if you do not care who gets the credit.” In essence, competency-based learning applies a similar level of humility to learning. It’s amazing what learners can master if we cease caring how or where they learned it and instead focus just on the mastery.

Some commonalities exist among these definitions. Like most competency definitions, these focus on capabilities that are transferable across a range of performance requirements; the notions of functional utility and portability are inherent in that transferability. These definitions also highlight knowledge and skills; the more holistic definitions, however, look beyond these two facets to also include other capabilities that may impact competence. In their 1993 touchstone work on competencies, Competence at Work, Lyle and Signe Spencer listed five components of competencies:8
• Motives – Motives drive, direct, and select behavior towards certain
actions or goals and away from others
• Traits – A person’s habitual or enduring characteristics
• Self-Concept – A person’s attitudes, values, or self-image
• Knowledge – Information a person has in specific content areas

• Skill – The ability to perform a certain physical or mental task

In their 1999 work, The Art and Science of Competency Models, Anntoinette Lucia and Richard Lepsinger offered a slightly different conceptualization.9 Readers can see in the figure below how Lucia and Lepsinger’s approach correlates with Spencer and Spencer’s; however, the pyramid provides better insight into the ways in which some characteristics support others and how, when combined, they all manifest in behaviors, i.e., in performance.

[Figure: The Competency Pyramid per Lucia & Lepsinger, with definitions from Spencer & Spencer. Aptitude and personal characteristics form the base: motives (which drive, direct, and select behavior towards certain actions or goals), traits (a person’s habitual or enduring characteristics), and self-concept (a person’s attitudes, values, or self-image). Skills (the ability to perform a certain physical or mental task) and knowledge (information a person has in specific content areas) form the middle tier. At the apex, all of these manifest in behaviors.]

Lucia and Lepsinger argued that aptitude and personal characteristics are
foundational, and while such characteristics may be innate, they can be influ-
enced. Skills and knowledge, of course, are more easily affected; they can be
imparted through training and education—through development. At the top of
the pyramid, all of the characteristics manifest in behaviors—in performance.

There are two categories of competencies within most institutional models,


core and occupational. Core, or “institutional,” competencies are applicable
to everyone in the organization. Occupational, or “specialty,” competencies
are applicable only to certain vocational specialties, positions, or jobs. For
instance, every employee of a city would need at least some level of proficien-
cy in “teamwork and cooperation” or “initiative,” but only firefighters would
need to master a firefighting competency.

The applicability of competencies to skills development (like firefighting) is,


for most, more easily understood than the relationship between competencies
and cognitive development. This partially explains why competency-based
learning has been adopted more slowly in education than in training. Within
the scholarly literature, however, there are many examples of purely cognitive
competencies, such as analytical thinking, critical thinking, conceptual think-
ing, diagnostic skill, and commitment to learning, to name a few. Like their
vocational counterparts, these cognitive competencies are transferable—ap-
plicable to a wide variety of educational pursuits.

Using Competencies to Guide Learning

Competencies serve as broad targets for learning. Readily available to both


learners and teachers, they serve as a “contract” for learning and describe
the “finish line” for the accompanying learning experience. When learners
achieve desired levels of mastery in all assigned competencies, they progress
to subsequent learning events or complete their programs.

A well-crafted competency model will typically list competencies, provide definitions, and be accompanied by descriptions of proficiency levels. As positions are created, as workers are hired, or as students move through educational programs, competencies and desired proficiency levels are selected. Supervisors, trainers, and faculty members then devise learning experiences and assessments to ensure their people can reach and demonstrate the desired levels of learning. Once the desired competency is demonstrated to the required level of mastery, the performance is credentialed, captured in a certificate, badge, or other record, so there’s a lasting record of this capability.
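To make this structure concrete, here is a minimal sketch, in Python, of how a competency model entry and a resulting credential record might be represented. The field names, proficiency labels, and example values are illustrative assumptions, not a standard.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Competency:
    """One entry in a competency model: a name, a definition, and
    descriptions of observable behavior at each proficiency level."""
    name: str
    definition: str
    level_descriptions: dict = field(default_factory=dict)  # level -> behavior

@dataclass
class Credential:
    """A lasting record (e.g., certificate or badge) that a learner
    demonstrated a competency at a required level of mastery."""
    learner_id: str
    competency: str
    level: str
    awarded: date
    evidence: str  # e.g., a link to the assessment results

speaking = Competency(
    name="communication - speaking",
    definition="Conveys ideas clearly and persuasively to an audience.",
    level_descriptions={
        "novice": "Delivers a short, prepared talk with coaching support",
        "advanced": "Adapts a persuasive argument to an unfamiliar audience",
    },
)

badge = Credential(
    learner_id="learner-042",  # hypothetical identifier
    competency=speaking.name,
    level="advanced",
    awarded=date(2019, 5, 1),
    evidence="https://example.org/assessments/1234",  # hypothetical link
)
```

Whatever the representation, the essential point is that the credential captures the demonstrated level of mastery, not the time spent or the source of the learning.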

Tracking competency development facilitates learning portability. For instance, by credentialing competency completions, learners can prove they possess given capabilities, which is useful for meeting the entry criteria of future learning experiences or for verifying personal qualifications should they move to other jobs. Similarly, tracking competencies gives parent organizations more opportunity to effectively employ workers’ skills and knowledge; that is, organizations can move workers to those areas where their competence is most needed.

Because competency-based learning facilitates precision in tracking and em-


ploying developmental investments, it’s popular within industry. It’s particu-
larly valuable to employers hiring new workers. Prior to competency-based
learning, employers had to assume that prospective employees possessed the
required attributes, attitudes, skills, and knowledge simply based on their for-
mal learning credentials and the limited time spent in interviews. It’s an un-
reliable approach. That prospective employees have high school diplomas, for instance, offers no guarantee they can perform the arithmetic needed to make change at a cash register or even to read its operating instructions! In
contrast, since competencies aren’t awarded until mastery performance has
been demonstrated, employers see exactly what their prospective employees
know and can do. They’ve demonstrated and received credentials for these
capabilities prior to applying for the job.

Competency-based learning is not yet universally accepted within education,


but acceptance is growing. One of the more interesting experiments, in this
vein, is described in Fred Bramante and Rose Colby’s book, Off the Clock:
Moving Education from Time to Competency.10 Bramante served as the Chair-
man of the New Hampshire Board of Education where he faced a high-school
dropout rate of 20%. To address this, he led the school system to embrace
competency-based learning, implementing his approach in 2009. By 2011, the
cumulative dropout rate was 4.68% and still falling. Students were mastering
the competencies necessary to earn their high school diplomas but doing so in
nontraditional ways. The key was focusing on the learners and the learning—
the outcomes. This is at the heart of competency-based learning.

Post-secondary institutions are also gradually embracing competency-based


learning. Educators have found students enjoy the flexibility and the fact that
they can progress as quickly through the programs as their efforts and capa-
bilities allow. Western Governors University was an early adopter of compe-
tency-based learning; however, the benefits of the approach quickly attracted
others. The University of Michigan, the University of Wisconsin system, Purdue University, Northern Arizona University, and Southern New Hampshire University, among many others, are offering competency-based programs.

Considerations for Competency-Based Learning:
• Chunk learning – Parse learning into specific, output-focused chunks
• Authentic assessments – Have learners demonstrate their mastery in realistic applications
• Levels of mastery – Ensure learning outcomes clearly define associated levels of mastery
• Learner-focused – Focus on the outcomes (learning) versus inputs (teacher, time)

CONCERNS
Minimizing Learning

Perhaps foremost among detractors’ arguments is the concern that, in the rush to impart marketable skills to students, competency-based learning institutions are pushing students into “knowledge-less” versions of traditional liberal learning. In other words, those seeking to discredit competency-based learning claim it’s too utilitarian and specific, at the expense of broad-based learning and critical thinking. While such programs lead to a skilled and potentially employable workforce, critics argue the upward mobility of those workers is limited in terms of perspective and their potential to step outside their initial knowledge specializations. This argument also implies (or sometimes openly alleges) that the true aims of competency-based programs are to expedite program completion and ensure high graduate-to-employment statistics, which help sell these programs to future students. Detractors argue competency-based learning institutions are creating a new hierarchy within the educated populace: a distinction between those who receive a “cheap, fast food-style or ‘good enough’” education and those who receive a quality one.11 Said another way, the concern is that competency-based learning graduates receive a lower quality of education, more pointedly focused on vocational development than on habits of the mind, and that habits of mind (supposedly in contrast to the attained competencies) are more transferable and, ultimately, more valuable beyond entry-level positions. It’s a position worth noting.

K–12 SCIENCE STANDARDS: The development of the Next Generation Science Standards is an innovative example of bringing research-based learning to scale. The National Academies of Sciences, Engineering, and Medicine developed the Framework for K–12 Science Education, informed by research on learning that is developmental and that interweaves science and engineering practices with core ideas and crosscutting concepts. Moving from the Framework to standards with clear performance expectations came with a hand-off to Achieve, an education nonprofit established in 1996 by governors and business leaders that works with states to prepare students for college and career readiness.

Achieve reached out to states, inviting them to be lead state partners in developing the standards, to an overwhelmingly positive response that resulted in 26 state partners. This was the start of the tag line, “For States, By States.” The collaborative approach continues today with the launch of Achieve’s Science Peer Review Panel to enhance the implementation and spread of high-quality lessons aligned with the Next Generation Science Standards, creating a sense of ownership and providing the tools to implement. To date, 19 states and the District of Columbia have adopted the Standards, and 21 additional states have developed their own standards based on the Framework.

Susan Singer, Ph.D.
Vice President for Academic Affairs and Provost, Rollins College
www.nextgenscience.org/framework-k-12-science-education

Quality

Competency-based learning certainly has the potential to be of lower quality.


The concern is not so much about competency-based learning, in general, but
in how competency-based learning is operationalized within individual insti-
tutions. A vocationally focused program that grants credit for demonstrating
acceptable levels of supporting, transferable skills, such as speaking, writing,
critical thinking, and active listening, might indeed produce graduates who
aren’t on par with their peers from traditional higher-education institutions,
who’ve had to delve more deeply into these areas as part of their academic
experiences. Again, however, it depends. It depends on the assessments used
within the vocationally focused programs and the degree to which the trans-
ferable skills were tapped and reinforced during the program. If an institution
sets its requirements for performance very high, it can force all but those who
have that level of mastery into its more traditional learning opportunities.

Employing Competencies Effectively

Perhaps the most important concern raised over competency-based learning


isn’t actually a rejection of the concept but, rather, concern over how it and
the resulting competencies are employed. In 2003, George Hollenbeck and
Morgan McCall questioned why the competency-based approach to executive
development hadn’t produced better executives. They wrote:

As we begin the 21st century, evidence abounds that executive and lead-
ership development has failed to meet expectations. Unless we change
our assumptions and think differently about executives and the develop-
ment process, we will continue to find too few executives to carry out
corporate strategies, and the competence of those executives available
will be too often open to question. The “competency model” of the ex-
ecutive, proposing as it does a single set of competencies that account
for success, must be supplemented with a development model based on
leadership challenges rather than executive traits and competencies. Ex-
ecutive performance must focus on “what gets done” rather than on one
way of doing it or on what competencies executives have.12

Hollenbeck and McCall weren’t calling for the rejection of competency-based


learning but were simply arguing that it’s not sufficient to develop or possess
individual competencies; instead, it’s how they’re collectively employed that’s
truly important in terms of occupational success. By way of a metaphor, one
can produce the perfect brick (the competency), and with a stack of these
bricks, one can build a cathedral that soars into the sky or a brick outhouse.
It’s how one employs competencies that matters. This is a valid concern. Application is all-important.

U.S. AIR FORCE EXAMPLE


The U.S. Air Force is attempting to integrate competency assessment and
the credentialing of mastery into its workplace. The Air Force can do this
because, unlike most learning institutions, it has a continuing relationship with
the graduates of its education and training programs, which affords unique
opportunities to ascertain the impact of learning within the work environment.
The effort is already attracting interest even though it’s yet to be executed.
Air Force administrators predict the assessment and tracking mechanisms
will be online by 2022. This use-in-the-workplace example segues to the final competency-based learning concern: its attachment to talent management.

VISION
A national competency-based system will enable a great deal of flexibility.

Learners will learn at their own pace. A common characteristic of competency-based learning is that it enables learners to advance as they reach mastery, because the focus is on outcomes (i.e., mastery of the given competencies) and not on the amount of time spent completing a set curriculum. Said another way, if a learner can prove mastery of a “communication – speaking” competency, developed earlier in life, she won’t have to sit through a class rehashing the material. More than that, articulating competency models helps clarify the instructional domains and gives structure to learner models, both of which aid personalization and automated adaptation of learning. This, in turn, allows learning to be tailored to individuals in multiple ways, not only targeting their individual strengths and weaknesses, but also helping to optimize the availability of instructional opportunities, plan personal schedules, and so on.

Competency-based learning will also increase resource efficiency. Allowing learners to bypass education and training requirements for competencies they’ve already mastered can accelerate individuals through programs. Perhaps they can use this time to pursue other competencies or, instead, they might need to employ the competencies they’ve mastered on the job. Either way, learners and host institutions only expend resources on competence evaluation and on aiding those learners working towards mastery.

Competency-based learning can help individuals better tailor their planning and learning priorities. If, for instance, a learner is working while attempting to master a list of specific competencies, he might choose to “front-load” those competencies most vital to short-term success on the job. Learners might also leverage insight into the competency requirements to choose a learning methodology that they find more effective for themselves or to help inform future career planning. Consider this example: The Department of Energy’s 268-page catalog, Leadership Development Seminars July 2013–2014 Edition (described in the sidebar below), links learning opportunities both within and beyond the government to aid employees seeking to master the Executive Core Qualifications (i.e., the competencies specific to executive-level leadership for the Federal Senior Executive Service).13 The catalog includes government-offered courses, courses offered by various universities and private industry organizations, and even informal learning opportunities, all mapped to the same set of Executive Core Qualifications. Such correlations provide an invaluable tool for motivated learners to build competence in areas specific to their employers’ needs.

Department of Energy Learning Opportunities Correlated to Executive Core Qualifications

The U.S. Department of Energy’s Leadership Development Seminars July 2013–2014 Edition links learning opportunities both within and beyond the government to aid employees seeking to master the Executive Core Qualifications. The catalog lists over 550 courses offered by the U.S. Office of Personnel Management as well as 75 universities, colleges, and private industry organizations throughout the continental U.S., and more than 700 leadership readings, mapped to various Executive Core Qualifications. Each listing is cross-referenced and includes a brief description of the course as well as its date, location, cost, and contact information.

Note: The listing of these courses does not constitute endorsement of their content by the U.S. Department of Energy or any agency of the U.S. Federal Government.

So much of our education system is based on where you live and how much money you have. We’re lacking national equity. But if you learned it, it should count. I don’t care where you learned it. Lots of people aren’t being served by the current system, but they should be. By 2025, 60% of Americans will need a postsecondary credential. We currently don’t have a system that can produce those results unless we leverage every postsecondary learning opportunity and everyone together.

Amber Garrison Duncan, Ph.D.
Strategy Director, Lumina Foundation

Equity and Diversity

A less obvious benefit of competency-based learning is the manner in which it may help address inequities within the U.S. populace. The Lumina Foundation has researched this, noting that competency-based learning offers a mechanism to get education into the hands, and minds, of disadvantaged Americans.14 This includes under- or unemployed adults, adults with some college exposure but with no credential, and historically underserved communities. Education has long been credited as a bridge from poverty to prosperity. Competency-based learning expands access to that bridge.

Translation

Competency-based learning is growing as a “currency” for learning. The transition from the Carnegie credit-hour approach to transcripting education is underway. The Lumina Foundation, an independent, private foundation in Indianapolis, has set for itself Goal 2025: to have 60% of U.S. working-age adults possess meaningful and marketable learning credentials beyond a high school diploma by 2025.15 To achieve this, the Foundation is pressing for “a new, national system of transparent quality credentials” and “a national expansion of competency-based learning…that recognizes measuring academic progress based on demonstrations of what students know and can do.” A leader in competency-based learning, Lumina is working with learning institutions and governmental organizations across the U.S., and they’re not alone. The U.S. Department of Labor, U.S. Department of Education, U.S. Office of Personnel Management, and elements of the U.S. Department of Defense are also pursuing competencies.

There is talk of a “Rosetta Stone” to translate competencies, so associated


credentials can move more easily across organizational lines, expediting in-
dividuals’ progress towards their learning goals. Given the rate at which new
competency models are entering the marketplace, however, this might not
be the best approach. Leveraging the “currency” metaphor, which is enormously popular among competency-based learning proponents, is helpful: A “Rosetta Stone” would serve as a sort of “currency calculator” to compute exchange rates among credentials. That would be a complicated process. Part of the challenge is that one may not be able to track the exchange rates for every currency interaction (pesos to dollars, dollars to rubles, and fenings to pesos, for instance), particularly as their values fluctuate. There would be so many different currencies to track! Why not instead leverage the approach used to compare real currencies? Evaluate the relative value of each currency in relation to commodities. If one knows how many units of a given currency it takes to purchase a commodity, such as a loaf of bread or a barrel of oil, currency conversion is easy.

Within a competency-based system, the commodity is performance—what


individuals know and can do. Hence, to exchange competency information
among institutions, these organizations needn’t learn one another’s competen-
cy models; they need only focus on what those competencies can “buy,” that
is, the credentialed performance.
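To illustrate the commodity idea in code: rather than maintaining pairwise “exchange rates” between every pair of competency models, each institution maps its local competency labels onto shared descriptions of credentialed performance, and translation happens through that common anchor. The sketch below is a toy Python illustration; the labels and performance descriptors are invented.

```python
# Each institution maps its local competency labels onto shared,
# observable performance descriptors (the "commodity").
institution_a = {
    "COMM-301": {"deliver a persuasive briefing", "field audience questions"},
    "MATH-101": {"compute unit conversions"},
}
institution_b = {
    "Public Speaking II": {"deliver a persuasive briefing"},
}

def translate(local_label, source, target):
    """List target-institution competencies whose required performances
    are fully covered by the source institution's credential."""
    performances = source.get(local_label, set())
    return [label for label, required in target.items()
            if required and required <= performances]  # set containment

# Institution B can honor A's credential without learning A's model:
print(translate("COMM-301", institution_a, institution_b))
# -> ['Public Speaking II']
```

The design advantage is the same as with commodities: each institution maintains one mapping to the shared performance descriptors, rather than a conversion for every pair of models.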

However, credentials are only as effective as the reliability of their measurements, and some challenges remain in this area. For instance, the same competency may manifest differently in different contexts: Seemingly universal competencies, such as leadership, may vary widely between professions. As leaders, surgical doctors need more procedural knowledge than business leaders who, in turn, may need more skill in motivating their staff to increase sales. Therefore, while some competencies have similar “bundles” of required knowledge, skills, and other attributes, others require different component sets to determine applied competency. Credentialing bodies must take these issues into consideration when determining how to assess and manage credentials for competencies.

Another challenge involves determining the evaluation criteria for perfor-


mance standards; this can be particularly difficult. Some questions to ask are:
What methods will be used to assess performance (tests, portfolios, writing)?
Who’s responsible for the assessment? How will these assessments be used?16
Clearly, there are numerous questions to address before an integrated, over-
arching competency-based system can be realized. However, the subsection
below outlines some recommended ways forward.

The currency of the future labor market will be skills or


competencies, which will demand competency-based
education in both early-life and lifelong learning.

Martin Kurzweil, J.D., Director, Educational


Transformation Program, Ithaka S+R

IMPLEMENTATION
RECOMMENDATIONS
1. Decide if competency-based learning
is right for the organization

The first step in embracing competency-based learning is to conduct a com-


parative analysis of the change versus the status quo. What’s the demand for
competency-based learning? Can it enhance learning effectiveness and effi-
ciency within the institution? Are the leaders supportive? Are the faculty and
staff supportive? Is there sufficient talent to create the competencies, the re-
sulting competency model, the levels of mastery, and the assessments so vital
to competency-based learning’s success? Albert Einstein is reported to have
said, “If I had an hour to solve a problem, I’d spend 55 minutes thinking about
the problem and five minutes thinking about solutions.” Before embarking
on a change to competency-based learning, consider carefully the potential
benefits and challenges. Make sure that the investment will pay sufficient div-
idends. Lastly, make sure that the organization is willing to make the jour-
ney, too. To understand the relative value of the competency-based learning
approach, consider other organizations that have already embraced it. Find
organizations similar to your own with similar missions and challenges. Look
at what they did to embrace competency-based learning and how they’ve em-
ployed it. …and as much as possible, learn from others’ mistakes!

2. Build a competency model

Next, construct and validate a competency model. There are several approaches one can take. Many institutions simply select from existing models where the competencies, performance levels, and other accoutrements seem to fit their needs, modifying their model as needed during validation. A second method is job analysis. With this approach, researchers dissect the various jobs performed within an organization to determine what core and/or occupational competencies are required and at what proficiency levels. Typically, the researchers will interact with workers to ensure the analysis is thorough and that all competencies have been properly identified. Another method involves leveraging panels of experts, surveys, and interviews to create a competency model. This is a fairly common approach, and it benefits from the fact that most organizations fail to capture the full breadth of tasks and knowledge within their human capital management documentation. A final method, and the one rated most effective by experts, is a criterion sampling method. With this approach, researchers work with organizational members to establish criteria to identify the most outstanding performers. Applying these criteria, the researchers then interview those performers to determine “what makes them tick” and what competencies make them so successful in their jobs. The resulting model helps drive workforce development by focusing on the competencies most closely aligned with success, that is, outstanding performance, thus benefiting both the employer and employees.

Validation can occur even as the model is being created. In essence, validation is a means to ensure the predictability of the competency model. If an employee who reaches the prescribed level of mastery in each of the listed competencies is judged to be an outstanding employee, then the model has a high degree of predictability and validity. If, however, those employees who reach all of the desired levels of mastery are still found wanting, then the model probably needs more work. With predictability as the “gold standard” for competency models, it’s easy to see why criterion sampling is a preferred method for creation and validation. Perhaps not surprisingly, starting with top personnel offers a shortcut to creating a model capable of predicting outstanding performers!

3. Develop authentic assessments for competencies

Once a model has been successfully created and validated, the next step is to
develop authentic assessments through which learners can demonstrate levels
of mastery. For industrial and vocational organizations, assessments can be
based on actual job performance. For most technical skills, workers need only
demonstrate their ability to perform their work-specific tasks correctly to earn
certification for a given level of mastery. For “soft skills” and cognitive com-
petencies, the assessments are usually more difficult. As noted previously, ed-
ucational programs usually rely on samples of behavior and faculty judgment
to assess competency mastery. A student required to demonstrate mastery in
multiplication, for instance, with levels of mastery determined by the number
of digits in the numbers being multiplied, would never be asked to multiply
every possible combination of appropriate-length numbers. That would be
ridiculous. Similarly, a student required to construct and deliver persuasive
arguments would only have to perform this task a limited number of times be-
fore a faculty member felt confident in certifying a level of mastery in the task.
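As a toy illustration of that sampling logic, the snippet below draws a handful of random multiplication items at a given digit level (the proficiency-level parameter) rather than exhausting the domain. It is a sketch of item sampling only, not a full assessment design.

```python
import random

def sample_multiplication_items(digits, k=5, seed=None):
    """Sample k multiplication problems with `digits`-digit operands.
    The full domain is far too large to test exhaustively, so an
    assessment samples a small set of representative items instead."""
    rng = random.Random(seed)
    lo, hi = 10 ** (digits - 1), 10 ** digits - 1
    return [(rng.randint(lo, hi), rng.randint(lo, hi)) for _ in range(k)]

# Five items at the "3-digit" proficiency level:
for a, b in sample_multiplication_items(digits=3, k=5, seed=42):
    print(f"{a} x {b} = ?")
```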

There are standardized tests for soft skills, for example the California Critical Thinking Skills Test and a number of leadership and communication assessments. The key to building or selecting assessments is to ensure they’re valid (i.e., assess what they’re supposed to assess), reliable (i.e., consistently produce similar results), and authentic (i.e., match the challenges learners will encounter outside of the classroom, in the workplace, for instance).

www.onetonline.org

O*NET – OCCUPATIONAL INFORMATION NETWORK


Sponsored by the U.S. Department of Labor, O*NET provides a database of
general occupational descriptions, including typical job and employee attri-
butes, necessary skills and knowledge, and workplace characteristics. These
are provided as free, open-access resources for broad use across businesses,
educators, job seekers, and HR professionals. To date, O*NET contains stan-
dardized descriptors for nearly 1000 occupations across the U.S. economy;
these form a common foundation for codifying occupational competencies.
Looking ahead, O*NET developers are exploring ways to create an overar-
ching architecture across competency frameworks, and they’re starting to use
GUIDs (Global Unique Identifiers) to connect credentials to O*NET compe-
tencies. Ultimately, O*NET developers imagine this work will remake the re-
sumé, perhaps turning it into a clickable or drill-down document that contains
someone’s entire “competency portfolio” but at levels of detail usable by em-
ployers. Further, the capability to relate bundles of competencies to specific education and training modules, classes, or sequences of courses could help individuals determine what competencies they need to achieve their career goals, and how, or where, to acquire those capabilities.
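As a sketch of that GUID-linking idea: give each competency descriptor a stable unique identifier, and let credentials reference descriptors by identifier rather than by name, so records from different institutions can point at the same descriptor. The registry structure below is hypothetical and is not the actual O*NET schema.

```python
import uuid

competency_registry = {}  # GUID -> competency descriptor (hypothetical store)

def register_competency(name, source):
    """Mint a GUID for a competency descriptor and store it."""
    guid = str(uuid.uuid4())
    competency_registry[guid] = {"name": name, "source": source}
    return guid

# Credentials reference competencies by GUID, not by local naming,
# so two institutions' records can resolve to the same descriptor.
ct_guid = register_competency("Critical Thinking", "O*NET")
credential = {
    "learner": "learner-042",                    # hypothetical identifier
    "credential": "Analytical Reasoning Badge",  # hypothetical credential
    "competencies": [ct_guid],
}
print(credential)
```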

4. Develop learning paths to reach desired mastery

This is where creativity and ingenuity can pay big dividends. If program
leaders pursued a criterion-sampling approach to creating and validating a
competency model, they may be able to ask those outstanding performers,
“How did you learn that?” The same is true of others able to demonstrate mas-
tery of competencies without taking any institutional classes or courses. The
answers can be fascinating. It may turn out, for instance, that an employee
who demonstrates mastery of “leadership” gained the associated capabilities through the process of earning a Gold Award (Girl Scouts) or Eagle Scout (Boy Scouts) designation as a child.

There’s no such thing as “nontraditional” education anymore.

Fred Drummond
U.S. Deputy Assistant Secretary of Defense for Force Education and Training

Of course, many students and workers will need help in mastering core and occupational competencies. It’s tempting to offer a single course that covers a wide variety of topics, competencies, and proficiency levels; however, focused learning, addressing the specific desired competencies and proficiency levels, coupled with sufficient time for reflection and practice, is key. Further, it’s more efficient: Institutions invest only what’s needed to achieve success, and learners don’t waste time or effort picking up unnecessary or duplicative skills and knowledge. Obviously, this applies more to workplace learning than to educational applications. Development of the cognitive competencies so foundational to education requires a depth and breadth of learning far broader than a specific vocational focus.

As noted earlier in the “athlete example,” one needn’t create a program or


course for every learning need. Immersive learning experiences, such as
special work assignments, often allow learners to reach their goals more
effectively and efficiently than formal classes. Another option to consider is
the guild approach, as addressed at the start of the chapter. Maybe assigning
an “apprentice” to a “craftsman-mentor” is the key. Also, one shouldn’t
exclude off-duty, nontraditional learning. For instance, an employee or student
struggling with public speaking may not need a speech class; perhaps joining
a local Toastmasters club will foster her skills. Hence, it’s useful to document
the various ways other people have developed their own capabilities; these
can serve as models and potential pathways for those seeking to earn their
own credentials. Consider accumulating this information into a catalog,
where learning experiences are cross-referenced to specific competencies and


proficiency levels.

5. Ensure there’s a mechanism for tracking and reporting competency mastery

This isn’t a simple task. Those responsible for it will have to consider the broad array of users who need access to the information. Certainly, learners need to know how they’re progressing: where they’re strong, where they’re weak, and what they need to do to achieve their learning goals. For educational institutions, faculty and staff will need access to the information. There’s also a need for transcripting learning progress for sharing with learners and other institutions. Industrial entities will have a variety of data-users as well. Like students, workers will want to know where they stand. Supervisors will want to know how their individual workers are progressing in their development, and also where their teams are strong or weak in terms of needed competencies. Similarly, progressive levels of supervision will want insight into this aspect of workforce development.

TECH TOOLS EXAMPLE: JDX

The Job Data Exchange (JDX) is a new set of open data resources, algorithms, and reference applications for employers and their HR technology partners to use in improving how employers communicate competency and credentialing requirements for in-demand jobs. Today, 50% of open, available positions in the U.S. go unfilled because employers can’t find the right talent for their critical positions. At the same time, education, training, and credentialing providers need better, faster, clearer signaling from employers on what skills are most in demand in a changing economy. The JDX isn’t a “job board”; rather, it will be a resource for employers and their HR technology partners to more clearly define competency and credential requirements for jobs distributed to talent sourcing partners such as job boards and preferred education, training, and credentialing partners. The U.S. Chamber of Commerce Foundation and their partners are pilot testing the JDX throughout 2019 across six states and the District of Columbia.

See: www.uschamberfoundation.org/workforce-development/JDX
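Here is a minimal sketch of what such a tracking-and-reporting mechanism might look like, assuming a simple in-memory record store (a real implementation would sit atop a persistent learner record system); the learner and team views correspond to the reporting needs just described.

```python
from collections import defaultdict

# learner_id -> {competency: credentialed level}; a stand-in for a
# persistent learner record store
mastery_records = defaultdict(dict)

def record_mastery(learner_id, competency, level):
    """Record (or update) a learner's credentialed level for a competency."""
    mastery_records[learner_id][competency] = level

def learner_report(learner_id):
    """The individual view: where a learner currently stands."""
    return dict(mastery_records[learner_id])

def team_gap_report(team, required):
    """The supervisor view: for each required competency and level,
    which team members have not yet credentialed it."""
    gaps = {}
    for competency, level in required.items():
        missing = [m for m in team
                   if mastery_records[m].get(competency) != level]
        if missing:
            gaps[competency] = missing
    return gaps

record_mastery("ana", "firefighting", "advanced")
record_mastery("ben", "teamwork and cooperation", "intermediate")
print(team_gap_report(["ana", "ben"],
                      {"firefighting": "advanced",
                       "teamwork and cooperation": "intermediate"}))
# -> {'firefighting': ['ben'], 'teamwork and cooperation': ['ana']}
```

The same records that answer an individual’s “where do I stand?” question can be aggregated for the team- and force-level readiness views discussed below.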

Within the military, the term force readiness describes how ready a military
force is to execute its warfighting mission. Competency-based learning pro-
vides a granular look into force readiness, providing senior leaders insight
into where they need to invest their developmental resources. Prior to World
War II, the U.S. Marine Corps correctly anticipated the nation would face a
war in the Pacific. The Corps purchased equipment to effect beach landings;
however, there was also a corresponding need to teach Marines to fight in this
extraordinarily challenging, sea-to-shore environment. In essence, the Corps
determined a new competency was required, assessed the developmental need
this new competency created (gap analysis), then began training Marines to
execute the new mission.

A holistic look at workforce, student, or military-unit competencies can help leaders make learning investments more wisely.

Summary
As noted at the beginning of this chapter, competency-based learning isn’t
new. It is, however, an exciting way to approach learning. The power it gives
to learners—the control they have over their own learning journeys—creates
an excitement both for the learners and those guiding them to their eventu-
al goals. Competency-based learning also fosters creativity as both learners
and leaders seek new ways to attain and demonstrate mastery. Lastly, com-
petency-based learning offers that “common currency” that permits learners,
workers, and their institutions to both understand developmental needs and to
share achievements across institutional barriers.

CHAPTER 14

SOCIAL LEARNING
Julian Stodd and Emilie Reitz

Formal learning is a story written by an organization and addressed to its people. Social learning, in contrast, is a story largely written by the learners themselves. It’s about tacit, tribal, and lived wisdom that exists within distributed communities. It’s often untidy, diverse, and deeply personal, as people bring their own perspectives and experiences into the learning space. Modern organizations are increasingly interested in how to unlock the power of social learning. This chapter explores that question; it describes what social learning is and elucidates a design methodology of Scaffolded Social Learning.1 This is considered against the backdrop of the Social Age, the evolved reality within which we live, and an understanding of the impacts this has on learning through its forms of power, knowledge, and control.

Living and Learning in the Social Age

Technology is the most visible manifestation of change we see around us: the rise of social collaborative technologies, leading to the proliferation of connectivity and the democratization of organization at scale. Put simply, we’re now connected in many different ways, almost all of which are outside of the oversight or control of any formal organization or entity.2 In network terms, there’s high resilience and great redundancy in our connections, which is significant. Historically, mechanisms of connection were local and tribal, or large-scale and formal. We connected within formal hierarchies and formal organizations, and within those spaces we were expected to conform: to wear the “uniform,” use the appropriate “language,” and accept the imposition of “control.” Today, our global connections, our connections at scale, are broadly social, distributed, and, with the imminent proliferation of synchronous machine translation, often culturally diverse. We’re substantially liberated from language, time, and place. And with these changes comes a shift in individual expectations, feelings of entitlement, and perceptions of fairness.

In turn, this leads to a shift in power across individual and collective, and formal and informal, dynamics. There’s a broad rebalancing taking place around the world, slowly draining power away from formal systems (hierarchy) and into social ones (community). An important part of shifting power dynamics is the fracturing of the social contract between individuals and organizations. The notion of “career” is evolving; it no longer emphasizes lifelong loyalty between an employee and a company. Instead, our public reputations, our personal networks, and the broader communities that surround us become our “job security.”3 This has broad implications for learning and development.

In the Social Age, learning is increasingly dynamic, co-created, and adaptive, and we must invest in that co-creation.
As our commitment to formal organizations becomes increasingly transient and transactional, we’re seeing new entities emerge, or adapt, to fill gaps in adult education, vocational training, credentialing, and other talent management functions. Many of these entities are socially moderated and utilize social learning approaches. We already see early stages of this: Into this void step the MOOCs (democratized teaching), tech entities such as LinkedIn and Udemy (democratized, beyond formal control), and portable credentials such as the Open Badges initiative. Looking ahead, we’re also seeing new “guilds” emerging.4 These guilds hold emergent political powers across institutions, and rather than being constrained by traditional structural organizational boundaries, they’re instead defined by the bounds of knowledge and capability, such as cybersecurity or anesthesiology.5
Social learning is a type of informal learning;
it’s frequently experiential and often facilitated
by distributed communities. It’s generally untidy,
diverse, and deeply personal, as people bring their
own perspectives and experiences to the learning.

The type of learning these new entities offer is different. No longer hindered
by decades of organizational stagnation and “known knowledge,” it’s typi-
cally more dynamic, co-created, contextual, adaptive, and free. This speaks
to the challenge of how organizations need to adapt to the new ecosystem:
Clinging to old models of organizational design (nested power structures),
formal learning (learning as a form of control), formal hierarchies of power
(systems of consequence), and known knowledge (unchallenged, static orga-
nizational dogma), is a surefire way to be disrupted, from the level of organizations up to the scale of nations themselves.6 And hence, the old structures of formal power are ceding some of their relevance, unless they can adapt.7

We’re used to seeing training and education as discrete parts of a stable system, but today, in the context of the Social Age, learning and development are dynamic parts of a dynamic system, and we must adapt them to fit the changing times, not just the new modes of delivery available. In other words, our adaptations must fundamentally readdress the design, facilitation, assessment, and support of learning. We must develop new methodologies for learning, and invest heavily in the communities and social leaders who will deliver these new capabilities, so that we don’t simply survive but thrive, and avoid disruption and failure, in the Social Age.

The New Nature of Knowledge

Delving into semantics may kill us, but let’s briefly consider the nature of
knowledge, not at the deepest philosophical level but at the rather mundane
and practical one: Our ways of knowing are changing. We’ve moved from
“concentration” to “distribution.” Where once we memorized and codified
knowledge, and held it in libraries, books, vaults, and experts (in concentrat-
ed “centers of learning”), today it’s dispersed, distributed, and free—yet, not
without its problems (validity, bias).

Clearly, we still need “formal” knowledge with its mechanisms of valida-


tion, replicability, and rigor. But in many cases, we seek just enough and just
“good-enough” knowledge to get us to the next step of the journey, like the in-
formation we access from our smartphones while racing through the airport,
let’s say, trying to make a swift decision about our connecting flight. Another
key difference between formal learning and social learning is that “formal” is
often abstract and frequently decontextualized while “social” is inherently ap-
plied, because it’s done in the everyday reality. Where formal learning often
takes place in special spaces (classroom, laboratories), social learning more
often occurs in performance settings (around the water cooler) or at the point
of need (a YouTube “how to” video or Reddit answer).

Is this type of distributed, community-moderated knowledge always correct?


Absolutely not, but to be fair, neither was all of our “old” knowledge. And cru-
cially we’re still creating the mechanisms of validation for social knowledge
that may make it ever better. This is a feature of the Social Age that’s often
misunderstood: What we see around us today isn’t the end state. It’s often the
first early prototype. In contrast, the old system is relatively evolved and stat-
ic. The new one is still in constant motion; it’s always improving.

If we worry about validity to the point where we take no action, then we can’t
benefit from social learning. Conversely, if we liberate social learning with no
account of the risks, we’ll be overtaken by it. We must learn to balance both,
in a persistent dynamic tension.
[Figure: network diagrams contrasting hierarchical, hybrid, and social organizational structures]

1. Learning is changing
Against the backdrop of the Social Age, the type of knowledge we engage with every day has changed; it’s often co-created, geolocated, adaptive, and hidden within our social communities.

2. Scaffolded social learning can support social learning
Scaffolded social learning is a design methodology, and a modality of learning, which creates a loose structure, a scaffolding, within which learning communities carry out “sense making” activities, all the while engaging with both formal and informal social knowledge.

3. Learning isn’t confined to formal or controlled structures
A significant amount of learning takes place outside of formal structures and within communities that are trust-bonded, complex, and powerful. Our challenge is to create the conditions for these communities to thrive.

4. Stories fuel social learning, and can benefit those willing to listen
Within these communities, learners create stories, narratives produced both individually and collectively; these stories can inform the wider organization, if it has the humility and willingness to learn from them.

5. Social learning is just one part of a larger, Social Age strategy
Adopting social learning is just one part of a wider cultural transformation, and that transformation could break every other part of an organization.

In a recent research project from a healthcare setting, we (this


chapter’s authors) asked learners which technologies they use to
collaborate. They identified 17 different platforms, only one of which
was sanctioned for official use by their organization. Knowledge has
already flown the coop; denying the change won’t prevent it. Instead,
we must engage to help better the rapidly evolving social system.

Formal and Social Systems: Dynamic Tension

The formal system is everything an organization can see, own, and control.
Formal systems are where we create formal learning, and they’re extreme-
ly good at certain things: collectivism, consistency, and achieving effects
at scale. Flowing around and through the formal system are social systems.
These aren’t held in contractual relationships but in trust-based ones. The
social system is multilayered, contextual, often internally conflicted, and ever
changing. Social systems are also good at certain things that formal ones ar-
en’t: They’re good at creative dissent, gentle subversion of outdated process-
es, questioning of systems, radical creativity, social amplification, movement,
momentum, curiosity, and innovation.

Healthy, modern organizations exist in a “dynamic tension” between the two, and social learning takes place at this intersection, incorporating parts of the formal and parts of the social. Our challenge is to maintain, not deny or destroy, this tension.8 If the formal system triumphs, we get greater consistency and hear the story that the formal organization agrees with, but we may not achieve true learning. If the social system wins, and subverts formal structures entirely, we lose our ability to validate quality, have consistency, and achieve effectiveness at scale. But if we can master both, we can thrive: formal structure and social creativity held in a dynamic tension. To do so requires a scaffolding, an evolution of mindset, and a willingness, on both sides, to listen and learn.

FACILITATING A SOCIAL
LEARNING CULTURE
1. Create the conditions for effective social learning

Authority within formal systems is represented by rank, title, and formal qualification. In social systems, authority is granted by the collective, based upon reputation, trust, fairness, and the investments made over time. It’s this social authority that we draw upon within social learning communities; it’s reputation that counts. In the context of social learning, our ability to learn and collaborate socially depends partly on our social authority as well as our levels of social capital. Much like we need political skills to thrive in formal spaces, so too do we need social skills to thrive in informal spaces. Self-regulated learning abilities, as described in Chapter 15, are also critical. Hence, as we think about ways to enable social learning, it’s important to consider how to foster productive communities as well as how to support the social and learning processes of their various members.

2. Scaffold formal, social, and individual learning

Consider an approach for social learning called Scaffolded Social Learning.9


It’s a methodology for the design, delivery, facilitation, and support of this
type of co-creative learning. It defines principles related to co-creative spaces,
formal learning assets, and learning community support structures that help
formal organizations integrate social learning into their contexts.

First, consider that in social learning, individuals will engage with formal
assets (stories written by the organization, codified and accepted knowledge),
social assets (tribal, tacit knowledge, held within the community), and individ-
ual knowledge (worldview, preconceptions, biases, and existing knowledge).
[Figure: Formal and social systems, side by side. Formal: organizations capture their codified strength in formal stories; they share these stories through formal learning; they use technology for distribution, assessment, and compliance. Great for driving consistency, validity, and standardized strength at scale. Social: communities take the formal story and add local and individual context; they carry out sensemaking activities; we can create spaces and provide support for this to happen, using scaffolded social learning approaches. Great for building diversified strength, radical creativity, and individual capacity.]

From a design perspective, one can, for example, vary the amounts of for-
mal knowledge provided, create conditions for sharing tribal knowledge, and
schedule reflective opportunities for individuals to explore their own experi-
ences. The “scaffold” in Scaffolded Social Learning represents these struc-
tures. In other words, this scaffolding supports specific activities designed
to facilitate and integrate formal, social, and individual learning, and to help
people “make sense” of it all, both individually and collectively as a group.

Second, at a technical level, consider the implementation of Scaffolded Social


Learning. It involves choreographing experiences across these formal, social,
and individual constructs. Like a good play, learning can be sequenced into
a “running order,” so that formal learning assets are released at certain times
that coincide with community activities, such as group storytelling. To extend
the theater metaphor, scaffolded learning also involves a range of supporting
roles, both front of stage and back of house, such as community managers,
storytellers, coaches, and social leaders. These learning facilitators help de-
fine the learning spaces, encourage activities that provoke and support the
manipulation (the processing) of new knowledge, and create opportunities for
people to bring in and demonstrate their own specific expertise. These actions
help manage the learning tempo, maintain its momentum, and drive up en-
gagement.
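To make the “running order” concrete, here is a minimal Python sketch of a scaffolded experience choreographed as a timed sequence of formal, social, and individual activities; the specific events, days, and supporting roles are illustrative assumptions, not a prescribed design.

```python
from dataclasses import dataclass

@dataclass
class LearningEvent:
    day: int          # when the asset is released or the activity convenes
    kind: str         # "formal", "social", or "individual"
    activity: str
    facilitator: str  # supporting role: community manager, coach, etc.

# A scaffolded "running order": formal assets timed to coincide with
# community activities and individual reflection.
running_order = [
    LearningEvent(1, "formal", "Release codified case study", "community manager"),
    LearningEvent(2, "social", "Group storytelling around the case", "storyteller"),
    LearningEvent(4, "individual", "Reflective journal entry", "coach"),
    LearningEvent(5, "social", "Curation: learners bring their own examples", "social leader"),
]

for event in sorted(running_order, key=lambda e: e.day):
    print(f"Day {event.day}: [{event.kind}] {event.activity} ({event.facilitator})")
```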

3. Use gentle learning interventions to


nurture social learning communities

Specific co-creative behaviors can enrich the activities of a social learning community. For instance, putting loose structure into conversations and creating common patterns of activity can help to draw out coherent narrative threads across concepts. As an example, consider the social learning tactic of curation. In Scaffolded Social Learning, the learning facilitator might not bring a formal example of, let’s say, good teamwork or effective problem-solving, but rather would encourage learners to bring their own. Now, one person may bring an example that seems terrible to others, and another person might offer one that seems off-track. Hence, another step is to encourage the co-creative behavior of interpretation. This is where someone writes a narrative, shares a story of precisely why they see the case study as relevant or how it relates to their personal journey. In other words, this involves interpreting the thing they curated and exchanging stories across the community.

Will we agree? Well, that doesn’t matter: Social learning isn’t about conformi-
ty and agreement; it’s about broadened understanding, context, and perspec-
tive. We don’t get to deny the validity of others’ examples, but we’re absolutely
allowed to challenge and engage in debate about them. Indeed, challenge can
be another co-creative behavior: I tell a story, you respond, I try to paraphrase
your story, you respond, we both collaborate and respond to a third story, and
we come together to co-create an overall narrative.

4. Assessment is feasible, but don’t apply it blindly

Our effectiveness as social learning designers is largely tied to our ability


to define and master the usage, combination, and creativity of co-creative
learning approaches, and to use them to craft engaging and effective learning
spaces together. However, it’s worth saying that organizations can measure
the effectiveness of social learning just as well as they can measure the ef-
fectiveness of their formal training and education programs (though, admit-
tedly, that's not saying much!). Like with formal learning, it's generally
worthwhile to triangulate assessment approaches:10 Do learners feel they’ve
learned? Does the community believe they’re learning? Do learners score
more highly on formal knowledge tests or in simulation-based exercises? Are
there any noticeable changes to the processes or products developed outside
of the learning context?

While you can technically measure anything, the pertinent question may be,
How will that information be used? The collaborative technologies often used
to support social learning have many convenient built-in measures; various
systems can report metrics about “engagement” (used as a byword for “click-
ing”) or “interaction.” They can also produce social network graphs or output
all manner of frequency statistics (e.g., log-on averages, average number of
posts). Technology certainly allows us to measure, but hard thinking should
be done on what to measure, how to best measure it, and what to do as a result.
Unless we can answer these three questions clearly, it’s best not to measure
at all. Measurement is enticing and important, but when misapplied, it can
lack value, waste resources, and even impede learning. The best advice is to
consider measurement carefully. Focus on outcomes, and where applicable,
triangulate among (1) self-assessed, (2) observational, and (3) formally mod-
erated measures.
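
To make that triangulation concrete, here is a minimal sketch in Python;
the measure names, the normalization to a 0-1 scale, the weights, and the
review threshold are all hypothetical illustrations, not a validated model.

    # Illustrative triangulation of three classes of learning evidence.
    # All scales, weights, and thresholds here are hypothetical.
    def triangulate(self_assessed, observational, moderated,
                    weights=(0.2, 0.3, 0.5)):
        """Combine three scores (each normalized to 0-1) into a weighted
        composite, and flag large disagreement between the sources."""
        scores = (self_assessed, observational, moderated)
        composite = sum(w * s for w, s in zip(weights, scores))
        spread = max(scores) - min(scores)
        # A wide spread means the measures disagree; review before acting.
        return {"composite": round(composite, 2),
                "needs_review": spread > 0.4}

    print(triangulate(self_assessed=0.9, observational=0.6, moderated=0.5))

When the three sources disagree sharply, the flag matters more than the
composite: it signals that the measurement itself, not just the learner,
deserves a closer look.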

5. Build social learning spaces and foster communities

At the heart of social learning are the learning spaces—the places people
come together to carry out collective sensemaking activities. To be very clear,
space means something very different than community. Consider the analogy
of building a new town: You can build houses, landscape gardens, construct
a mall, and pave a town square. You can even move people into those houses.
But none of this creates the community. It only begins to emerge when two
of those people come together, on a street corner, let’s say, and have a conver-
sation about what a terrible job you’ve done on the brickwork. The buildings
form the space; the conversation forms the foundations of the community.
Spaces for social learning might be a classroom, a chatroom, or some kind of
learning management system—however, none of those are the community.

In social learning, as in our allegorical town, individuals interact across mul-
tiple spaces, on the street corner, at the marketplace, or in someone’s home.
In a learning context, multiple spaces—multiple technologies—may support
a community, and their conversations may span across them, starting in one
and graduating to another. It's useful for the design of social learning
spaces to take this into account and to explicitly design for different
types of social interactions, such as conversational spaces, collaborative
spaces, infrastructure spaces (for formal components), subversive spaces
(to complain about the “brickwork”), assessment spaces, and so forth.

A well-designed Scaffolded Social Learning experience will contain
differentiated learning, rehearsal, and performance spaces.

Each learning space is differentiated by notions such as its permanence
and consequence. For example, a conversational space needs high
impermanence, while a formal assessment one may carry great permanence.
Collaboration spaces should be low consequence, and performance spaces
may carry high consequence. Social learning takes place across these
diverse constructs and associated technologies—it's not bounded by a
single system or conceptual frame. Hence, the ability to construct such
spaces as a coherent ecosystem is a core skill for socially dynamic
organizations, i.e., organizations adapted to benefit from social
learning approaches.

To encourage social learning communities, we need to create the conditions
for them to emerge. Start by dedicating time to growing the community prior
to moving into any formal learning activities. Before you can be purposeful,
you need to be coherent; that is, before meaningful learning can begin, you
first need to establish a high-functioning community.

SENSEMAKING ENTITIES

Coherent communities are sensemaking entities; they help evaluate infor-
mation, identify misinformation, determine value, and recommend responses.
Our social communities help us to filter the signal from the noise, and then to
understand those signals. In the context of social learning, where much of the
sensemaking is done in the community, this helps provide a diversified view,
and the more diverse in worldview, experience, cultural profile, and capability
the community is, the more effective its sensemaking can become.11

MECHANISMS OF ENGAGEMENT

Within formal systems, we’re assigned roles by the organization, but in social
systems, our roles are more nebulous and change more often. Sometimes we
bring specific expertise, resources, or capability; sometimes we bring chal-
lenge, sometimes support, and other times we’re cross-connectors, linking
different communities. Sometimes we simply come to learn. When consider-
ing social learning communities, it’s worth remembering that we don’t need
everyone to engage in a certain way; we just need broad engagement. It’s fine
for people to take diversified roles.

RITUALS AND CHOREOGRAPHY

There’s a role for ritual; in our own research, people described the “rituals of
welcome and engagement” as the single most important factor for their future
success within a community. Such rituals are something within our control;
when designing the scaffolding for social learning, we can actively design rit-
uals or consciously adopt existing ones. We can work with community mem-
bers on their rituals of engagement for new members, for example, and can
work with their formal managers on the rituals they’ll use to share stories of
their learning back to the rest of their teams.12 It’s all part of the choreography
of learning. This means we pay equal attention to every part of the learning
experience, from the email that invites someone to join to the registration
instructions they receive, the way we thank them for sharing stories, and the
ways we graduate them at the end. It’s important to script and craft each part
as an element in the overall running order. Pay attention to them all. Together,
rituals and choreography form a powerful tool of community-building and,
ultimately, of learner engagement.

HIDDEN COMMUNITIES

We’ll never find all the communities with-


in an organization. Some (like our learning
communities) are visible and formally sanc-
tioned, others exist outside our networks and
experience. Some even exist in active oppo-
sition, deliberately hidden from us. When we
We belong to many different ask people what their most valuable com-
communities. Some communities
are visible to both us and the munities are, for learning, they often speak
organizations we work for, while of these hidden communities, formed on
others are hidden deep in our social
WhatsApp or as Facebook groups—places
networks, out of sight from formal
institutional authorities but still beyond formal oversight and consequence.
very relevant and connected to us It’s worth remembering that hidden commu-
individually in our day-to-day.
nities aren’t new; we’ve always existed with-
in a web of communities, but in the context
of the Social Age, the boundaries between
formal and social communities have blurred. Although formal communities
haven’t substantially encroached beyond their organizations, social ones have
invaded that previously sacrosanct space. The difference today is that these
hidden communities can form and operate, at scale, and do so right under our
noses. This is the consequence of the democratization of communication and
connectivity.

SANCTIONED SUBVERSION

Moving ourselves beyond a binary understanding of which answers are “right”
or “wrong” is valuable. Sometimes the answer lies in breaking the question.
Subversion itself can be of great benefit to formal systems, if they’re willing to
listen, because established organizations are typically very bad at subverting
(or evolving) themselves. Consider this: How many organizations put as much
time and effort into deconstructing redundant process and un-writing outdat-
ed rules, as they do into forming new ones? Very few! What happens around
this organizational detritus? Typically, it’s subverted; people work around re-
dundant systems and suboptimal process. And they do so not only individu-
ally, but collectively too; indeed, when people join a new organization, much
of what they learn, at the local or tribal level in the early days of a new job,
comes exactly from this type of crowdsourced subversion, usually under the
generic banner of “this is how we get things done around here.”

Conclusion

Stories, communities, learning—these are all expressions of power, and in the
context of the Social Age, power itself is evolving. As we engage more broad-
ly and more intentionally in social learning, we’ll discover that our formal
power doesn’t carry through into social spaces: within these learning com-
munities, you can shout all you like, but it’s social authority, reputation-based
influence, and social capital that count the most. In the course of adopting
social learning, we inadvertently (but necessarily) erode the power of the for-
mal organization.

As we cultivate the social community, this newly empowered collective
will demand ever greater freedom and power. If our aim is learning
transformation, then this power is what will drive the change.

It’s a champagne bottle to uncork with care. The balance between formal sys-
tems of control and socially moderated ones creates an important dynam-
ic tension. When managed effectively, a socially dynamic organization can
emerge, one that integrates the very best of the formal (system, process, hi-
erarchy, and control) with the very best of the social (creativity, subversion,
innovation, amplification). That’s our challenge: to craft more collaborative
models of learning, and to learn how to build an organizational culture in
which learning can thrive both for today and through our emerging future
learning ecosystem.
Americans should have self-sovereign management of their
lives. Right now, medical records are yours but not so much
your educational records; you don’t really control any of that info right
now. We’re working on envisioning what the future looks like following
these guiding principles: to give each person their own destiny,
balance on the supply and demand side…and put it into the hands of
the ones who want to earn the competencies and credentials. It gives
them the power to drive the marketplace. Currently, the providers
squarely have the advantage, but we need to make it a new space
where learners are empowered.

Jeanne Kitchens
Chair of the Technical Advisory Committee for Credential Engine; Associate
Director of the Center for Workforce Development, Southern Illinois University

CHAPTER 15

SELF-REGULATED
LEARNING
Louise Yarnall, Ph.D., Michael Freed, Ph.D.,
and Naomi Malone, Ph.D.

There’s a growing need for continuous


modes of lifelong learning to cope with Self-regulated learning refers
the acceleration of knowledge produc- to the thoughts, feelings, and
tion and flow aided by new technologies. actions some learners use
In response, both schools and work- to independently attain their
places are progressing towards more learning goals. Self-regulated
learners are metacognitively,
independent, learner-centered forms
motivationally, and behaviorally
of education and development. Poten- active in their own learning.
tial support for lifelong learning comes
from improvements in AI technologies
that permit more personalized learning,
and greater access to mobile and search
technologies that provide ubiquitous access to information. In the workplace,
trainers are increasingly using cloud-based software, augmented reality, and
virtual reality to prepare workers, support their lifelong learning needs, and
enable diverse collaboration methods.1 In higher education, institutions are
increasingly offering online education options and providing students with
information resources and communication tools to aid their independent re-
search and collaboration. However, despite these trends, both educators and
employers report challenges with this shift towards greater learner-control.
For instance, some learners have difficulty taking responsibility for their own
learning,2 and others may struggle to assimilate their diverse experiences—
leading to a situation where they have increased exposure to information but
reduced overall comprehension.

Figure 15-1: The three phases and subprocesses of self-regulated learning,
derived from Barry Zimmerman's work. The forethought phase comprises task
analysis (goal setting, strategic planning) and self-motivation beliefs
(self-efficacy, outcome expectations, intrinsic interest/value, goal
orientation); the performance phase comprises self-control (imagery,
self-instruction, attention focusing, task strategies) and self-observation
(self-recording, self-experimentation); and the self-reflection phase
comprises self-judgment (self-evaluation, causal attribution) and
self-reaction (self-satisfaction/affect, adaptive/defensive responses).

Learners need to become skillful at regulating their learning over time and
across different settings, especially to acquire thinking, writing, and analysis
skills.3 However, individuals often struggle to manage their learning without
effective and perceptive external support, such as what a teacher, mentor, or
well-structured piece of courseware might provide.4 Consequently, developing
effective self-regulated learning skills requires educators and trainers to help
learners notice knowledge gaps, try new strategies, and adopt more proactive
mindsets. Incorporating support for this approach into new technologies can
also help learners acquire the meta-level skills needed to manage their own
learning across their lifetimes.

Empirical research is beginning to identify effective tools and strategies for
aiding self-regulated learning; however, the paradigm originally emerged
during the 1980s when education researchers studied why some K–12 students
succeeded in traditional classrooms better than others. They found the most
effective students demonstrated a set of learning strategies and mindsets in-
cluding metacognitive strategies (e.g., goal-setting, self-monitoring, self-eval-
uation), cognitive strategies (e.g., rehearsal, organization, elaboration), en-
vironmental management strategies (e.g., time management, study area
management), and self-beliefs (e.g., self-efficacy, intrinsic and extrinsic goal
orientation, effort regulation).5 Since these behaviors stemmed from learners’
personal choices, researchers categorized them as “self-regulated” learning.

By the 1990s, researchers agreed that learners self-regulate during three iter-
ative phases: the forethought phase, where a learner plans and initiates action;
the performance phase, during which learning actions occur; and the self-re-
flection phase, in which a learner reflects on and evaluates performance, ad-
justing as necessary. Barry Zimmerman, one of the preeminent scholars in
the self-regulated learning field, developed a model of these three phases,
grounded in social cognitive theory (see Figure 15-1).6

More recent evidence has demonstrated that some self-regulation strategies—
time management, effort regulation, and critical thinking—have positive im-
pacts on academic outcomes, but that other strategies—rehearsal, elaboration,
and organization—have less empirically convincing effects. Further, in both
school and workplace settings, a small number of these strategies have the
largest impacts, accounting for 17% of the overall variation in learning out-
comes.7 These include:

1. CONFIDENCE, SELF-EFFICACY, INTERNAL LOCUS-OF-CONTROL –
Effective learners believe they can learn because they’re in control and tend to
take a more “active” approach to learning. By contrast, less effective learners
doubt they can learn (because they think they’re not smart enough or not in
control) and, consequently, take a more “passive” approach to learning.8

2. GOAL SETTING AND PLANNING – Effective learners set appropriate
learning goals, anticipate the resources required, and set benchmarks for their
progress. By contrast, less effective learners may not set goals or may simply
plunge in, then run out of time or lack access
to appropriate learning resources.9

3. PRIOR KNOWLEDGE AND STRATEGY USE – With stronger prior knowledge,
effective learners engage in greater instances of planning and monitoring,
both independently and in collaboration. With lower prior knowledge, less
effective learners use just a few strategies.10

4. METACOGNITIVE MONITORING – Effective learners note and address gaps
and misunderstandings while they learn. Less effective learners fail to
notice or address such difficulties in their learning.11

…it's not going to replace teachers, it shifts the role
and nature of a teacher to a master facilitator.

Thomas Deale
Major General, U.S. Air Force (Ret.)
Former Vice Director for Joint Force Development on the Joint Staff

5. POST-LEARNING REFLECTION – Effec-
tive learners consider what they’ve learned,
taking stock of what remains to be learned.
Less effective learners fail to reflect sufficiently after learning and may rush
to the next task.12

RECOMMENDATIONS
Helping learners develop better self-regulated learning skills will require new
supports, added into the many contexts where people engage in learning. To
cultivate awareness of Zimmerman’s three phases of self-regulated learning
and to develop effective habits at the cognitive, metacognitive, emotional, and
behavioral levels, we propose three conceptual levels of self-regulated learn-
ing support: micro-, macro-, and meta-interventions. The micro-level focuses
on individuals and the tools they use to better navigate a personalized trajec-
tory. The macro-level focuses on how to navigate the selection and progres-
sion across learning experiences. At the meta-level, there's a recognition that
building appropriate learning habits requires focused practice in the cogni-
tive, social, emotional, and physical capabilities that contribute to resilience,
effective decision-making, and lifelong personal growth. We describe appli-
cations of these three levels in the suggested interventions below.

1. Use formative assessments to personalize support for self-regulation skills and mindsets

Although research shows the benefits of supporting learners’ self-regulation,
these interventions often rely upon the discretion and knowledge of their ed-
ucators. Hence, better supporting self-regulated learning depends, in part, on
enhancing the skills of teachers, workforce trainers, and managers, in addi-
tion to learners, themselves. To start, it’s useful to help stakeholders identify
the specific self-regulation skills and/or mindsets needed in a given learning
situation; a first step towards that is to translate self-regulated learning as-
sessment methods from research into practice. For instance, several diagnos-
tic tools can help identify the signs and symptoms of a learner with weak
self-regulation mindsets or strategies. These diagnostic tools could be embed-
ded into online courseware or used by teachers, trainers, and learners in both
classroom and workplace settings.

Drawing on the three-level support approach to self-regulated learning: Tools
can be devised to support individual educators in learning specific diagnostic
techniques (micro-level), to help them anticipate where self-regulated learn-
ing challenges may occur before any extended learning activity (macro-level),
and to serve as a regular formative assessment to encourage the maintenance
of effective mindsets and habits of self-regulated learning (meta-level). Below
are some self-regulated learning assessments that could be put to use:

SELF-REPORT INSTRUMENTS

Technology can deliver self-report, self-regulated learning assessments; the
results from these may be shared with teachers and trainers or fed into adap-
tive learning algorithms to provide more personalized support to learners.
Such assessments may target key elements known to support self-regulated
learning, including: level of motivation (e.g., The Motivated Strategies for
Learning Questionnaire13) and the skills of goal-setting, time-manage-
ment, help-seeking, preparing the study environment for focused work, and
self-evaluation (e.g., The Online Self-Regulated Learning Questionnaire14).
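
As a rough sketch of how such self-report instruments could be scored in
software, the snippet below averages Likert-style ratings into subscale
means and flags weak areas; the items, subscales, seven-point scale, and
threshold are hypothetical placeholders, not the published questionnaires.

    # Score a hypothetical Likert-style self-regulation questionnaire.
    responses = {          # learner's 1-7 ratings, keyed by item id
        "goal_1": 6, "goal_2": 5,
        "time_1": 2, "time_2": 3,
        "help_1": 4,
    }
    subscales = {
        "goal_setting": ["goal_1", "goal_2"],
        "time_management": ["time_1", "time_2"],
        "help_seeking": ["help_1"],
    }

    # Average the item ratings within each subscale.
    means = {name: sum(responses[i] for i in items) / len(items)
             for name, items in subscales.items()}

    # Flag low subscales so a teacher, trainer, or adaptive algorithm
    # can target support there (the cutoff is arbitrary for illustration).
    flags = [name for name, m in means.items() if m < 3.5]
    print(means, "-> needs support:", flags)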

ASSESSMENTS IN ACTION: GOVERNMENT WORKFORCE EXAMPLE

In Marcus Buckingham’s work, StandOut, he designed an assessment…
One of the things that he applied there—that’s extremely successful—is a weekly
check-in with a supervisor. Once a week, through technology, it sends a request:
These were your goals last week. Were you able to reach these goals? What are your
new goals? Did you use your strengths? What did you like? What did you detest?
Responses help the supervisor know things like, John keeps disliking this, and I need
to get this off his plate and make it less painful for him. This is what he’s liking, where
he’s using his strengths. I need him to do more of this. It allows for side questions,
too, like, how motivated are you in what you do? How are you as an employee in
working with this environment?
Then there were 5 critical questions provided quarterly asking if the team is
growing and learning. It’s a huge help for a leader, and it also prompts me to go in
and say, “This is what John is working on. This one’s important, so can you put it
at the top of your list? Thanks for the great idea; I’m glad you’re working on it.”
These learning interventions cause us to have a conversation in a less threatening
format and talk back-and-forth. There are lots of benefits and these are the kinds
of interventions we intend to apply during the Leadership for a Democratic Society.

Suzanne Logan, Ed.D., SES
Director of the Center for Leadership Development and Federal
Executive Institute, U.S. Office of Personnel Management

STRUCTURED INTERVIEW PROTOCOLS

Drawing from questions in existing research interview protocols, technolo-
gy can be adapted to deliver helpful queries to teachers and trainers. These
may, for example, help them consider and investigate the potential factors
contributing to weak outcomes, either observed among students in school or
personnel in a workplace setting. Factors useful for reflection include assess-
ing learners’ skills for organizing and transforming information, setting goals
and planning to learn, seeking information, keeping records and monitoring
learning progress, preparing their study environment for learning activities,
engaging in self-evaluation, meting out self-consequences, reviewing texts
and notes, help-seeking, and rehearsing and memorizing. (See, for example,
the Self-Regulated Learning Interview Schedule.15)

MEASURING SELF-REGULATION PROCESSES AS EVENTS

Education technology researchers working primarily in learning management
systems are already moving towards designing more complex, process-ori-
ented measures that can determine individuals’ deployment of self-regulat-
ed learning strategies over time. Measurement methods include think-aloud
protocols and technologies that detect errors in tasks or employ online trace
methodologies (e.g., of mood and task steps) that measure individuals as they
go about their learning activities.16 To better support self-regulated learning,
researchers will need to study how to adjust these types of detection methods
for delivery and use across different learning technology platforms, such as
mobile, augmented reality, and virtual reality.
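
As a minimal sketch of what one such trace measure could look like, the
snippet below scans a hypothetical event log for a learner who keeps
failing a task step without seeking help or changing strategy, one
observable signature of weak self-regulation; the event names and the
threshold are invented for illustration.

    # Detect "unproductive persistence" in a hypothetical event trace.
    trace = [  # (seconds_elapsed, event) rows from a learning platform
        (10, "attempt_failed"), (55, "attempt_failed"),
        (90, "attempt_failed"), (130, "attempt_failed"),
    ]

    def flag_unproductive_persistence(trace, max_failures=3):
        """True once consecutive failures exceed max_failures with no
        intervening help request or strategy change."""
        streak = 0
        for _, event in trace:
            if event == "attempt_failed":
                streak += 1
            elif event in ("help_requested", "strategy_changed"):
                streak = 0  # the learner self-regulated; reset the count
            if streak > max_failures:
                return True
        return False

    print(flag_unproductive_persistence(trace))  # True -> offer a hint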

2. Build confidence, self-efficacy, and internal “locus of control” about learning

To realize a vision of self-regulated learning across a lifetime, more needs
to be understood about the preconditions for developing habits of lifelong
learning. International studies indicate wide variation in how well both early
childhood education and family upbringing set the stage for lifelong learn-
ing;17 however, it generally begins with establishing confidence and indepen-
dence as learners. Over the past 35 years, K–12 education researchers have
found evidence that open-ended instructional practices, such as guided inqui-
ry activities, foster confidence and independence in learning more than other
practices, such as traditional close-ended question-and-answer routines.18 In-
troducing open-ended practices in childhood can help set the conditions for
lifelong learning, but continued support for self-regulation is needed even in
adulthood. For example, some research indicates that those countries with the
highest levels of lifelong learning among adults have robust adult education
systems.19

Based on the three-level support approach to self-regulated learning, de-
scribed at the beginning of this section, individual educators can be tutored
in confidence-building techniques (micro-level); in methods for identifying
likely areas of low confidence in an upcoming lesson (macro-level); and in
noting, reflecting on, and accepting their own challenges with maintaining
confidence during learning (meta-level).

The one point I hope every single person can internalize—as
the neuroscience evidence shows us—the brain is learning
every single second of every single day. So, the way every
individual learns is the same, but what they’re learning differs and
that depends on context—internal and external. Our job is to align
our learning goals to what the brain is actually learning. That’s a big
paradigm shift for leadership.

Melina Uncapher, Ph.D.
Director of Education Program, Neuroscape; Assistant Professor of
Neurology, Weill Institute for Neurosciences and Kavli Institute for
Fundamental Neuroscience, University of California San Francisco

3. Develop goal-setting and planning skills

To improve self-regulated learning, goal-setting and planning strategies
should be translated into user-friendly tips to guide individuals while they
learn. Such self-regulated learning support should be made available across
a range of learning contexts, from face-to-face to online environments. The
three-level support approach to self-regulated learning is useful here, too. In-
dividual learners and learning facilitators can be linked to templates and tools
to support goal setting and planning (micro-level). They can be encouraged
to reflect on the pacing and time management required in multiple stages and
phases of upcoming lessons and projects (macro-level), and they can be en-
couraged to confront resistance to goal setting and planning by seeing the
success stories of those who employ these techniques regularly (meta-level).

4. Activate prior knowledge to enrich self-regulated learning strategy use

Past education and experience represent both a potentially rich learning re-
source and a possible threat, since old habits and misunderstandings can block
the grasp of new ideas and procedures. For this reason, educators, trainers,
and instructional designers should incorporate activities and tools to elicit
learners’ prior knowledge and help them reflect on which elements of it are
potential building blocks and which are possible barriers.

Based on the three-level support approach to self-regulated learning, ways
for activating prior knowledge might include: Linking individual learners
and learning facilitators to lessons about how to elicit and document prior
knowledge relevant to a particular lesson (micro-level). Identifying the useful
prerequisite knowledge as well as the naïve concepts that might pose learning
hurdles in upcoming lessons or projects (macro-level), and supporting indi-
viduals’ capacity to activate useful prior knowledge and to counter or encap-
sulate less useful prior knowledge (meta-level).

…it’s not just what you learned but
rather how much it changed you.
Betty Lou Leaver, Ph.D.
Director, The Literacy Center; Manager, MSI Press; Former Provost,
Defense Language Institute Foreign Language Center

More research is needed in this area, however, to uncover new methods for
estimating learners’ prior content knowledge and self-assessed self-regula-
tion skill levels. Since traditional testing can negatively impact learners’ mo-
tivation, finding new assessment methods is a critical step to enhancing per-
sonalization models beyond their current level. Currently, traditional testing
approaches and curriculum sequences favor comprehensiveness and certifi-
cation. Work is needed to understand how adjusting the frequency and forms
of assessment can inspire rather than hinder self-regulated learning. Methods
worth exploring include integration of self-reflective assessments of content
knowledge and self-regulated learning skills with validated measures of tra-
ditional content knowledge and skills.

5. Support metacognitive monitoring

As learning platforms and media proliferate, the community will need a wider
range of ways to gather trace data on how and under what conditions learners
use self-regulated learning supports. This line of research is likely to inno-
vate around new approaches to using xAPI to collect student data, usefully
aggregate datasets across experiences, and apply learning analytic models to
analyze them. Such work need not focus only on individual learners’ patterns,
but should also consider patterns within content pathways from multiple us-
ers. Such data traces can support more personalized and optimal recommen-
dations of what content to review next, and can enable systems to covertly
strengthen or fade self-regulated learning support in a continuous fashion.
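
To make the data-collection idea concrete, a single xAPI-style statement
might be assembled as in the sketch below; the learner, activity identifier,
and result values are hypothetical placeholders, though the actor/verb/object
structure follows the xAPI specification.

    # A minimal xAPI-style statement, built as a plain Python dict.
    import json

    statement = {
        "actor": {"mbox": "mailto:learner@example.com", "name": "A. Learner"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "id": "http://example.com/activities/self-paced-lesson-3",
            "definition": {"name": {"en-US": "Self-paced lesson 3"}},
        },
        # Result fields can carry self-regulation signals, for example
        # (illustratively) whether the learner finished what they planned.
        "result": {"success": True, "duration": "PT11M"},
    }

    # In practice, statements are sent to a Learning Record Store (LRS),
    # where they can be aggregated across platforms and experiences.
    print(json.dumps(statement, indent=2))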

[Diagram: a continuous learning cycle in three stages: EXPLORE (discover,
dabble, bridge, familiarize), STUDY (practice, get help, assess), and
SHARPEN (refresh, use, extend), with a loop onward to the next topic.]

U.S. DEFENSE TECHNOLOGY EXAMPLE


By providing fast access to short-form learning materials (“micro-
content”), mobile applications can make it easy to use brief windows of
available time for learning. Such applications can use AI to identify high-
interest topics, select learning activities most likely to benefit the learner,
and then recommend micro-content on selected topics and activities. For
example, PERLS, a mobile app developed with DoD support, presents
recommendations in the form of electronic cards that users flip through
to find preferred content, and underlying these recommendations is a
dynamic model of self-regulated learning. The app has been evaluated
with several DoD organizations, including U.S. Northern Command and
Joint Knowledge Online to augment training in areas such as Defense
Support of Civil Authorities. Early results show that learners using PERLS
reported heightened enjoyment and motivation to learn, and they
performed as well as others required to take full, formal courses.20
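
A toy sketch of this kind of recommendation logic follows; the scoring
terms (interest, expected benefit, and fit to the available time window)
and all weights are invented for illustration and are not the PERLS model.

    # Rank (hypothetical) micro-content cards for a short time window.
    cards = [  # (title, interest 0-1, expected_benefit 0-1, minutes)
        ("Flood response basics", 0.9, 0.6, 5),
        ("Interagency coordination deep dive", 0.7, 0.9, 25),
        ("Radio protocol refresher", 0.4, 0.8, 4),
    ]

    def rank(cards, minutes_available):
        """Score each card, penalizing content that overruns the window."""
        def score(card):
            _, interest, benefit, minutes = card
            fits = 1.0 if minutes <= minutes_available else 0.3
            return (0.5 * interest + 0.5 * benefit) * fits
        return sorted(cards, key=score, reverse=True)

    for title, *_ in rank(cards, minutes_available=10):
        print(title)
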
If we believe that the exploration of knowledge must continue, then we
can't only teach the knowledge we currently have. Truth and facts are
constantly unfolding. If we just decide that by 2018 we have all the
knowledge we'll ever need then we're making a serious mistake.

Christopher Guymon, Ph.D.
Interim Dean of the Graham School, University of Chicago,
Office of the President

One aspect of self-regulated learning support that has not been adequately
studied concerns understanding both the optimal frequency of self-regulated
learning support and the optimal tools for providing this support. These
factors are likely to vary by the content to be learned as well as the
learning platform (e.g., LMS, mobile smartphone). R&D developers should be
prepared to make the case for which self-regulated learning skills they
plan to target, highlighting those skills most important for learners of
their content and most amenable to support with their particular learning
experience. Such design specifications can improve the field's understanding
of how different technologies can support specific self-regulated learning
skills.

Supporting metacognitive monitoring across
the three levels of abstraction might include:
Connecting individual learners and learning
facilitators to tips and guidelines for notic-
ing and remedying points of confusion, poor
procedure or technique, and weak understanding (micro-level); identifying
points for checking on understanding and procedures in upcoming lessons
and projects (macro-level). Additionally, new methods may be able to track
progress over time, measuring the effectiveness of techniques in reducing
misunderstandings and, in turn, providing systematic feedback that sharpens
procedures over time (meta-level).

6. Foster habits of post-learning reflection

Educators, trainers, and instructional designers need to provide extended
post-training self-regulated learning support for learners, helping them to re-
flect, learn how to reinforce, and know when to refresh past learning. Such
post-training support could be delivered by mentors and coaches, aided by the
parent organization, or take the form of persistent technology-based tools for
self-coaching and reference.

To return once again to the three levels of abstraction, ways to foster post-learn-
ing reflection might include: Providing lessons to individual learners and
learning facilitators about the kinds of useful questions to pose (micro-level);
scheduling and building-on reflection activities across an extended lesson or
project (macro-level), and rewarding learners for engaging in reflection ac-
tivities, such as offering them the chance to unlock a range of new learning
opportunities based on their reflective participation (meta-level).

Summary

Successful self-learners do more than just study and memorize. They stay alert
and are curious to discover new, valuable learning. They skim a lot of content
to find the important points. They search informally to nurture motivation for
intensive study and periodically review afterwards to fight forgetfulness. And
they find the time to do it all.

Though more than 70% of work-related learning is self-learning, few
technologies help self-learners deal with these challenges. Ideally, technology
will reduce the difficulty and friction of all self-learning activities, while
making it easier to learn in small slots of available time, whenever and
wherever these occur. Targeting and supporting self-regulation skills
throughout personalized learning trajectories will aid learners of all ages and
promote enhanced learning efficiency across lifetimes.
Organization
In an era where gaining access to information is no longer
difficult, a continuing culture of high-stakes testing (focused on
testing knowledge recall) runs counter to what we need. We should
instead value the ability to sift through information and to connect,
assimilate, aggregate, interpret, and apply data. If our teachers could
help students view information from societal, cultural, economic, and
other perspectives, and if they could help students practice writing
and data-validation skills and creativity, we could accomplish so much
more than just teaching them how to answer multiple-choice tests.

Anne Little, Ph.D.
Vice President, Training Solutions Development, SAIC

CHAPTER 16

INSTRUCTIONAL DESIGNERS
AND LEARNING ENGINEERS
Dina Kurzweil, Ph.D. and Karen Marcellas, Ph.D.

For over 60 years, instructional designers have supported teaching and learn-
ing, primarily by identifying effective ways to present material in formal edu-
cational and training environments. Given advances in technology, increasing
access to data, and the explosion of formats and venues for learning, designers
in the future will have to gain more knowledge and expertise than ever be-
fore as they develop their professional craft. Consequently, a new concept is
entering this complex field: the learning engineer. Who are these individuals?
What are their areas of expertise? How do their knowledge and skills relate
to, expand upon, or differ from those of instructional designers? This chapter
describes the history of instructional design and explores how the field of
learning engineering will need to develop and expand upon instructional de-
sign methodologies to support teaching and learning in the future.

Background: Design of Instruction

Traditionally, a number of specialists have collaborated in developing learning
experiences and tools. Their titles and roles may differ somewhat depending
on the project or the available personnel, but one commonly used team struc-
ture includes a technologist, a learning science expert, and an instructional
designer. Technologists generally have technology backgrounds and use ei-
ther personal experience with education or some learning science knowledge
to help develop instructional technology tools. Some certainly have robust
educational knowledge, but this isn't the norm. In contrast, learning
scientists are educational researchers who are deeply knowledgeable about
how humans develop and learn, particularly from a cognitive perspective.
Both of these roles can act in support of instructional designers, who apply a
systematic methodology based on theory, research, and/or data to plan ways
to teach content effectively. Instructional designers work in both education-
al and training environments. They’re problem solvers who use different in-
structional models to promote learning. In other words, they’re responsible
for “the theory and practice of design, development, utilization, management,
and evaluation of processes and resources for learning.” 1

A BRIEF HISTORY OF INSTRUCTIONAL DESIGN

The field of instructional design is historically and traditionally rooted in cog-
nitive and behavioral psychology. It first emerged during a period when the
behaviorist paradigm dominated American psychology. Its practice can be
traced back to the late 1950s and early 1960s, but in those early days, one
wasn’t referred to an “instructional designer.” Rather, those who worked in
this field were typically called educational psychologists, media specialists, or
training specialists.2

Through the 1960s and 1970s, the growth of digital computers influenced
learning theories, and many new instructional models adopted an “informa-
tion-processing” approach. The 1970s also heralded the systems approach to
instructional design, including one of its best-known models, the Systems Ap-
proach Model, published by Walter Dick and Lou Carey.3 The Dick and Carey
approach offered a practical methodology for instructional designers, and it
emphasized how each component of the model works together. Dick and Car-
ey also highlighted how technology, media, and research were all impacting
the field at that time and, consequently, how “modern” instructional designers
differed greatly from their counterparts in the 1960s in terms of academic
background, training, research, and tools.4

EXAMPLE: The proliferation of video cameras makes it possible for any
instructor to record videos for use in courses; the role of the instructional
designer is not simply to facilitate the incorporation of video but rather
to examine instructional goals and identify areas where it can be used
most effectively to support student learning, while also possibly identifying
appropriate use of lower-technology and lower-bandwidth solutions in
other areas. They also work with faculty to define content that would
best be suited for video. Continuing with the video example, instructional
designers also look at the video’s effect on learning and develop ways to
improve both the product and the learning outcomes.

Throughout the 1970s and 1980s, the instructional design field continued to
evolve; a later survey of instructional design models found they had differen-
tiated into having a classroom orientation (focused on development of instruc-
tional materials for a single lesson or set of lessons by teachers), a product
orientation (focused on development of specific products by teams), or a sys-
tem orientation (focused on development of curricula by teams).5 Present-day
instructional design continues to have different application specialties, and it
continues to be influenced by technology. However, rather than model instruc-
tional design theories on technology, as in the 1960s and 1970s, contemporary
instructional designers explore ways to incorporate technology into their work.

Experienced instructional designers recognize that technology has numer-
ous uses for learning—but it’s still just a tool. While technology can provide
many benefits, its effective use in training and education requires carefully
defining its role and ensuring it remains subordinate to the learning goals. In
recent history, we’ve seen a push for instructional designers to focus more on
technology, shifting emphasis away from instructional theory. However, the
systematic design, development, implementation, and assessment of teaching
and learning requires that instructional designers keep instructional methods
central to their work and examine all technology with an eye towards promot-
ing more effective learning.

“Technology is not an end in itself; any successful use of training technology
must begin with clearly defined educational objectives.” 6

INSTRUCTIONAL DESIGN ACTIVITIES

Instructional designers’ primary role is to support good instructional practice.
As many professionals in the teaching and learning fields have known for de-
cades:7 Teaching is a complex activity that, when done effectively, is closely
tied to the success of learners.8

Many times, instructional designers work with subject-matter experts, such
as training facilitators, teachers, and/or other faculty members, to help them
translate their content knowledge into effective learning experiences, usually
for formal learning contexts. Often, these content experts have less familiarity
with effective instructional practice; hence, instructional designers introduce
them to key principles and help them incorporate more effective methods.
Instructional designers help their clients think more critically about a range of
issues related to instruction, including the needs of learners, curricula, learn-
ing environments, and associated policies.9

Instructional designers generally use systematic models and methods, such
as the systems approach, backwards design, successive approximation model,
and the Kemp instructional design process. Their approach usually involves
identifying desired outcomes and determining the skills, knowledge, and atti-
tude gaps of a targeted audience. They apply theory and best practices to plan,
create, assess, evaluate, select, and suggest learning experiences to close those
gaps.10 Instructional designers may be involved with the entire instructional
process or with portions of it. For example, early in a project, they’re often in-
volved with the systematic review and critical appraisal of existing materials.
Using research and theory, instructional designers may also conduct analyses
before the actual instructional design and development occur. Later in the
process, instructional designers may emphasize the importance of assessment
and evaluation, to ensure learning experiences have met their intended goals.
A common theoretical and practical understanding of innovation also con-
tributes to instructional designers’ work, and the best instructional designers
ensure their clients, fellow educators and trainers, and leadership recognize
how the different tools, processes, materials, and innovations that make up
learning systems can enhance their learning offerings. Hence, instruction-
al designers need to additionally have a creative spirit of design,11 including
an imaginative, creation-oriented, and interdisciplinary character as well as
the creative spirit to remain flexible and perceptive in their practice. That is,
despite the proliferation of formal processes, such as instructional systems
design, instructional design remains an art—albeit one firmly grounded in
science and theory.

A learning engineer is someone who draws from evidence-based
information about human development—including learning—and seeks to
apply these results at scale, within contexts, to create affordable, reliable,
data-rich learning environments.

Bror Saxberg, Ph.D., M.D.
Vice President of Learning Science, Chan Zuckerberg Initiative

“Designing is a process of pattern synthesis, rather than pattern recognition.
The solution is not simply lying there among the data... it has to be actively
constructed by the designer’s own efforts.” 12

CHANGING CHARACTER OF LEARNING
The growth of technology and access to learner data has led to advances in
learning science and made the learning environment more complex. This, in
turn, affects the roles of instructional designers, who must now interact with
a variety of formal and informal modes of learning, social and experiential
learning theories, as well as new tools, processes, and people. This complex
infrastructure has been called the “learning ecosystem.” It encompasses the
physical and mechanical elements of educational and training environments;
the theories, processes, and procedures that drive their use; and learners’
(complex) relations to and interactions within that environment. This includes
all elements that make up learning, from the formal classroom and those tra-
ditional instructional activities, to the technologies used to support informal
learning. The complexity of the future learning ecosystem is turning instruc-
tional design into an even more dynamic activity, where designers must be
aware of how all these elements come together, how each works, and how to
best orchestrate learning across time, space, and media.

These advancements have similarly transformed the expectations of leaders,
educators, trainers, and learners, and at the same time, they’ve created an
abundance of choice for anytime/anywhere learning. The strategic challenge
is that, unlike when learning occurred primarily in a classroom with limit-
ed technology options, today there are many resources available in personal
learning ecosystems, classrooms, training programs, and beyond. Given that
most of these new resources rely on technology, the challenge is no longer
about mastering a few platforms in a constrained environment—it’s about un-
derstanding the benefits of multiple resources, maintaining awareness of the
wide variety of capabilities, choosing the best ones for learning, and balanc-
ing the entire ecosystem of multiple resources in a way that provides greater
support overall. Such rapid advancements have made it ever more challenging
for conventional training, education, and instructional practitioners to build
effective strategies, tools, policies, and designs; hence, there’s need for a new
player: the learning engineer.

Learning Engineers

In December 2017, the Institute of Electrical and Electronics Engineers
(IEEE) Standards Association Standards Board recommended creation of a
new 24-month working group, called the Industry Connections Industry Con-
sortium on Learning Engineering or ICICLE, to provide definition to and sup-
port for the burgeoning field of learning engineering. Creation of this group
marks a groundswell of attention on the learning engineering field, although
its original concept dates back to the 1960s, from Nobel Laureate Herbert A.
Simon, who wrote at the time:

The learning engineers would have several responsibilities. The most
important is that, working in collaboration with members of the fac-
ulty whose interest they can excite, they design and redesign learning
experiences in particular disciplines. […] In particular, concrete demon-
strations of increased learning effectiveness, on however small a scale
initially, will be the most powerful means of persuading a faculty that a
professional approach to their students’ learning can be an exciting and
challenging part of their lives.13

Learning engineering, as conceived today, is an interdisciplinary approach
based on an in-depth foundation and education in proven theoretical models
and methods, educational paradigms and instructional approaches, and
scientific and analytical methods.

AI: In many ways it’s solving similar problems as before but doing it more
effectively with data. For example, we can search and find content with a
much deeper understanding of its meaning. We can get better at questions
such as: “What’s the student really trying to learn? Can we find the part of
a video that would be most helpful? How else can we make this experience
easier for students?”
Shantanu Sinha
Director, Product Management, Google
Former Founding President and Chief Operations Officer, Khan Academy

Learning engineers use data and knowledge
of enterprise structures to help promote good decision-making in the use of
learning ecosystem components. With its focus on data, and in using validat-
ed methods that put learning data to work in the service of improved learn-
ing outcomes and institutional effectiveness, this emerging field takes a step
beyond traditional instructional design. Learning engineers do this, in part,
by combining big data with design-based research to improve the design of
learning experiences.14 Additionally, learning engineers use theoretical and
practical understanding to scale innovations across the learning ecosystem.

Learning engineers can help with the complexities of integrating various tech-
nologies, workflows, interactions, and data-driven processes to enable learn-
ing. They may engage with widely varying technologies, including learning
management and learning content management systems, mobile learning
applications, course authoring tools, MOOCs, digital simulations and game
environments, virtual/augmented reality, micro-credentials, learning appli-
cations and tool developments, learning records and analytics dashboards,
video and other streaming content, and new applications involving wearable
and IoT technologies. Though learning engineers may not necessarily write
software code or serve as systems administrators, they can influence the de-
sign, development, integration, implementation, and use of a wide variety of
technologies. They might, for instance, recommend AI algorithms, such as
deep learning, to analyze data gathered in rich learning experiences to create
a clearer picture of learners. This information can be used to inform how
learning is supported, for instance, by deepening student engagement in their
courses, improving the efficiency of teachers’ instructional methods, or pro-
viding learning tailored to individual needs.15
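
As a small illustration of that data-driven loop, the sketch below
aggregates hypothetical event records by course module and surfaces modules
whose low pass rates and high time-on-task suggest a design problem; every
field, value, and threshold here is invented for illustration.

    # Surface course modules that may need redesign, from a toy event log.
    from collections import defaultdict

    events = [  # (module, learner, passed, minutes) -- hypothetical rows
        ("m1", "s1", True, 12), ("m1", "s2", True, 15),
        ("m2", "s1", False, 41), ("m2", "s2", False, 38),
        ("m2", "s3", True, 44),
    ]

    totals = defaultdict(lambda: {"n": 0, "passes": 0, "minutes": 0})
    for module, _, passed, minutes in events:
        t = totals[module]
        t["n"] += 1
        t["passes"] += passed       # bool counts as 0/1
        t["minutes"] += minutes

    for module, t in sorted(totals.items()):
        pass_rate = t["passes"] / t["n"]
        avg_time = t["minutes"] / t["n"]
        # Low pass rate plus high time-on-task marks a review candidate.
        if pass_rate < 0.6 and avg_time > 30:
            print(f"{module}: pass rate {pass_rate:.0%}, "
                  f"avg {avg_time:.0f} min -> review design")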

“Bringing together teams of collaborators with different kinds of expertise—
teaching, subject matter knowledge, instructional design, and data analysis—
is a prerequisite for realizing the full potential of learning system data.” 16

The growing and dynamic learning ecosystem means learning engineers are
likely to play much larger roles in the planning, design, development, and
analysis of diverse and complex instruction. Learning engineers, like instruc-
tional designers, will be expected to anticipate changes or new developments
in applicable technologies or in the instructional fields affecting their spe-
cialty areas and programs. They’ll also need to continually improve their
instructional strategies to reliably identify best practices and opportunities
for change. Accordingly, learning engineers need to possess a wide scope of
competencies, including a foundation in learning science as well as the use of
data to improve learning practice. They need to know good learning design
principles, be conversant in learning analytics and enterprise learning tech-
nologies, and have some unique areas of relevant expertise, such as cognitive
science, computer science, or human-computer interaction.

I don’t think that from a military perspective that we’ve completely taken
advantage of large data management. Here’s a great analogy: We have
hundreds, if not thousands, of hours of full motion video, but how much do
we actually analyze based on the current tools…? Eighty-plus percent isn’t
reviewed in detail. Until recently, we were working on automating that and
that’s one element I look at data management for—turning those mountains
of data into decision-quality information.
Thomas Deale
Major General, U.S. Air Force (Ret.)
Former Vice Director for Joint Force Development on the Joint Staff

In general, learning engineers tend to focus more on technology and data-driv-
en decision-making than do instructional designers. At the highest levels of
expertise, learning engineers typically act as partners to provide leadership,
advice, and guidance throughout an organization and to serve in key staff
positions, such as a specialist at agency or major military command head-
quarters, or in a generalist capacity as an educational specialist at a school or
university. Learning engineers’ focus on data could give them an inroad for
working with learning professionals who need grounding in assessment or
in how learning works, such as training facilitators, teachers, and faculty at
educational institutions. Those drawn to evidence-based practices might be
especially interested in working with a learning engineer. In a higher educa-
tion environment, for example, learning engineers could provide a valuable
service by helping to link research and teaching, both promoting current re-
search into effective teaching and encouraging faculty members to conduct
such research. Learning engineers can also work in many different industries,
perform many different tasks at various organizational levels, and, indeed,
work side-by-side with instructional designers and other learning profession-
als—but with a different focus.

Instructional designers and learning engineers should collaborate and partner
to assess learning needs, develop strategies, and implement plans based on all
the component parts and connections within the ecosystem in which learn-
ing occurs. Both instructional designers and learning engineers have valuable
knowledge and competencies that can help make the most effective use of
learning resources, and together, they can contribute to transforming how we
think about teaching, learning, education, and training.

IMPLEMENTATION
Define the Roles

While they can work together and have some overlapping skill sets, there are
important distinctions between learning engineers and instructional design-
ers. Notably, while learning engineers’ skills are grounded in applied learn-
ing sciences, they additionally emphasize data science, analytics, user expe-
rience, and applied research. Learning engineers also have a greater depth
and range of experience, including some expertise in the implementation and
improvement of learning ecosystems—that is, in working with diverse, tech-
nology-enabled, data-driven learning systems.

Before becoming a learning engineer, someone must acquire the highest lev-
els of knowledge in learning theories, models of learning, data about learning,
research into learning, and the management of learning. They’re also likely
to need higher levels of technical experience than instructional designers. As
such, unlike an instructional designer who can start at the entry level and
develop skills over time, learning engineers must have more extensive edu-
cational backgrounds and prior experience. The mix of knowledge and ex-
perience, or, more specifically, the ability to filter expert knowledge through
the lens of practical experience, helps characterize the learning engineering
approach to instructional solutions.

To be clear, education alone won’t give learning engineers the practical
knowledge or integrated experience they need to be successful. A typical
learning engineer wouldn’t come out of an undergraduate program; rather,
we’d expect a learning engineering protégé to build upon undergraduate work
in education or a relevant technical field with applied experiences and sub-
sequent advanced preparation. For instance, someone might first train and
work as an instructional designer and then later seek additional education in
the research, learning sciences, and data-based problem-solving elements of
learning engineering.

Education and Professional Growth

What would the education of a learning engineer look like? As discussed
previously, they must have a solid grounding in learning science as well as
experience with instructional design, curriculum development, evaluation,
and other educational areas. They should understand statistical modeling ap-
proaches for education and training, analysis of large datasets, and the use of
evidence to improve learning. Befitting the word “engineer,” they also need
some background in math or science, to help them identify and solve complex,
sociotechnical problems in logical ways.

We must be cautious in thinking about learning engineering as simply a
university degree. Learning engineering should be a cross-disciplinary program,
likely at the master’s or doctoral level. These programs should also be com-
petitive. Universities should evaluate applicants for sufficient prior knowledge
and experience. Entrants into a program could have various areas of relevant
expertise, and the purpose of the program would be to engage them in devel-
oping a common vocabulary, breadth of awareness, and solid ability to exam-
ine data to identify learning evidence.

A learning engineering graduate program could have various areas of focus to
complement the vocabulary and data elements. For instance, a technology-fo-
cused concentration in a learning engineering program could incorporate ar-
tificial intelligence, simulation, augmented/virtual reality, intelligent tutoring
systems, or UI/UX for learning. But at the heart of any program must be
learning science and design. Using science and theories as guardrails is valu-
able for all types of learning professionals in creating engagement, establish-
ing context, and promoting application. Though technology may be helpful in
many cases, implementing technology is not the goal—good instruction and
learning are the focus.

In the end, the graduate from such a program should be able to design and im-
plement innovative and effective learning solutions in complex systems, po-
tentially at scale, and aided by advanced technologies when appropriate. They
should be able to use data and a solid, theory-based evaluation framework to
improve learning and assessment in practice. Whether applied to industry,
government, military, or academic settings, these graduates should bring val-
ue above-and-beyond that provided by traditional instructional designers.

Job Series, Titles, and Competencies

The path to the job of the instructional designer or learning engineer may begin
with teaching in K-12 or higher education; working in technology in corporate,
government, or military environments; holding an academic research position;
or filling some other responsibility related to educating or training people.

Because the U.S. Federal Government has a strict classification system for
employment, and because it employs so many education professionals, it
serves as a useful lens through which to view the learning engineer role. The
Office of Personnel Management classifies jobs in the Federal Government,
and its General Schedule outlines the occupational groups, series codes, and
classifications of positions including their duties and responsibilities, descrip-
tions, and standards.17 Each occupational group (such as the 1700 “Education
Group”) is indicated by the first two numbers of a four-digit sequence, and the
subspecialties in that group fall within the last two digits of that sequence,
from 00 to 99. The 1700–1799 occupational series covers education and
training–related professions, such as “training instruction” (1712) and “public
health educator” (1725). The requirements and description for learning engi-
neering should be included within this general series.
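To make the numbering scheme concrete, here’s a minimal sketch in Python. The three series codes and titles come from the General Schedule discussion above; the dictionary and range check are hypothetical conveniences for illustration, not an official OPM data structure.

```python
# Illustrative subset of the 1700 "Education Group" series codes; the
# codes and titles appear in OPM's General Schedule, but this structure
# is only a sketch for discussion.
EDUCATION_GROUP_SERIES = {
    1712: "Training Instruction",
    1725: "Public Health Educator",
    1750: "Instructional Systems",  # where instructional design sits today
}

def in_education_group(series_code: int) -> bool:
    """A four-digit series belongs to the 1700 group when it falls in
    the 1700-1799 range (the first two digits identify the group)."""
    return 1700 <= series_code <= 1799

assert in_education_group(1750)       # instructional systems
assert not in_education_group(2210)   # a series outside the education group
```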

Currently, instructional design falls within the 1750 sub-series (i.e., the “in-
structional systems series”). It seems like a clear solution to expand this sub-se-
ries to incorporate the competencies necessary for learning engineers and
related future learning professionals. For instance, the title could change from
“instructional systems series” to “teaching/learning support and instructional
systems series.” This would follow a trend in the industry acknowledging the
importance of supporting teaching and learning, broadly. Also, more detailed
language about the work performed by learning engineers, their education
qualifications, and experience requirements could be added to the description.
Correspondingly, the upper end of this job series should be reviewed to ensure
that pay and benefits are appropriately aligned with the necessary experience
and education. If we don’t reframe this series (or take similar actions), it’s
more likely key learning engineering components will become lost within an
organization or devalued in career planning or performance appraisals; we
also risk learning engineers being conflated with instructional designers.

The success of the instructional designer or learning engineer of the future will
ultimately rest on how institutions and their leaders connect, communicate,
support, and value those specialties. Learning engineers shouldn’t be seen
as “one-time stops” or clearinghouse consultants for educational products.
60YC: THE 60 YEAR CURRICULUM
The dean of DCE [Harvard’s Division of Continuing Education], Hunt Lambert, is leading this
effort to transform lifelong learning, which is now a necessity in our dynamic, chaotic world.
The 60YC initiative is focused on developing new educational models that enable each person
to re-skill as their occupational and personal context shifts. The average lifespan of the next
generation is projected to be 80-90 years, and most people will need to work past age 65 to
have enough savings for retirement. Teenagers need to prepare for a future of multiple careers
spanning six decades, plus retirement. Educators are faced with the challenge of preparing
young people for unceasing reinvention to take on many roles in the workplace, as well as for
careers that do not yet exist.
On-the-job learning is familiar to most adults; many of us take on tasks that fall outside of our
academic training…but our children and students face a future of multiple careers, not just
evolving jobs. I tell my students to prepare for their first two careers, thinking about which is a
better foundation as an initial job—but also building skills for adopting future roles neither they
nor I can imagine now…Given this rate of change, education’s role must be long-term capacity
building—enhancing students’ interpersonal and intrapersonal skills for a lifetime of flexible
adaptation and creative innovation—as well as short-term preparation so that they are college-
or career-ready. Education must also advance two other goals beyond preparation for work:
to prepare students to think deeply in an informed way and to prepare them to be thoughtful
citizens and decent human beings…
The 60YC initiative centers on the least understood aspect of this challenge: What are the
organizational and societal mechanisms by which people can re-skill later in their lives, when
they do not have the time or resources for a full-time academic experience that results in a
degree or certificate? Thus far, attempts to address this issue have centered on what individual
institutions might do. For example, in 2015 Stanford developed an aspirational vision called
Open Loop University. Georgia Tech followed in 2018 with its model for Lifetime Education.
The hallmarks of these and similar models center on providing a lifelong commitment to alumni
that includes periodic opportunities to re-skill through services offered by the institution;
microcredentials, minimester classes, and credit for accomplishments in life; personalized
advising and coaching as new challenges and opportunities emerge; and blended learning
experiences with distributed worldwide availability. I believe a possible third approach is to
reinvent unemployment insurance as “employability insurance,” funding and delivering this
through mechanisms parallel to health insurance…
Much remains to be understood about how 60YC might become the future of higher education.
In my opinion, the biggest barrier we face in this process of reinventing our models for higher
education is unlearning. We have to let go of deeply held, emotionally valued identities in
service of transformational change to a different, more effective set of behaviors. I hope higher
education will increase its focus on the aspirational vision of 60YC as an important step towards
providing a pathway to a secure and satisfying future for our students.

Excerpt from The EvoLLLution online newspaper, 19 October 2018
by Christopher Dede, Ed.D., Wirth Professor in Learning Technologies, Harvard University 18
Instead, they should be leading the way to optimize experiences and systems
of learning (which may or may not involve technology), and helping organi-
zations meet their missions through the growth and evolution of their training
and education programs. This will require learning engineers to work both
together with other experts and on their own to navigate client expectations,
integrate emerging capabilities, choreograph complex interactions, and help
learners achieve more efficient and effective results.

Conclusions and Recommendations

As the learning ecosystem becomes more complex, those who teach others,
whether they are facilitators, faculty members, or other professionals may well
find it difficult to keep up with the changes. Instructional designers and learn-
ing engineers are specialists in education and training; they can help teachers,
trainers, and organizations transform teaching and learning environments for
the modern age, and they can also help fellow learning professionals expand
their own knowledge and skills in the use of best practices for education and
training.

Instructional designers and learning engineers have complementary skills and
knowledge. They both have a thorough grounding in the learning sciences and
an ability to identify appropriate instructional interventions, but learning en-
gineers will provide more data-driven solutions and focus more on advanced
technologies and enterprise-wide elements.

As these positions evolve, we need to ensure instructional designers and learn-
ing engineers have defined responsibilities and roles, so that both they and
their organizations know whom to approach for different needs, understand
how they work together, and can uniquely value each skill set. Overall, we
need to recognize the benefits that both instructional designers and learning
engineers bring and, thus, ensure they continue to play an active, valued role
in project teams, organizations, and the larger learning community.

CHAPTER 17

GOVERNANCE FOR
LEARNING ECOSYSTEMS
Thomas Giattino and Matthew Stafford, Ph.D.

The transition from independent systems to a learner-centered ecosystem is
attractive to learning professionals who have previously undergone a similar
evolutionary process in their field. Readers may recall the first forays into
online learning, which largely emerged within individual programs, depart-
ments, or colleges. As enrollment and interest grew, other organizations went
online as well, resulting in duplication and increased costs. In most instances,
an overarching entity—an agency, industry, school district, university, or uni-
versity system—stepped in to harmonize the e-learning systems, standardize
their technology and approaches, and ensure e-learning results were captured
and reported similarly. Today, as the learning ecosystem reaches maturity, the
question emerges: who will run it and how?

In dealing with competing and constantly changing demands, limited resources,
a vast array of products and capabilities, and a need for integration across
their systems, learning professionals recognized the need for an overarching
governance structure.

Heraclitus of Ephesus noted, “Life is flux; the only thing that is constant is
change.” Learning professionals will certainly agree; their field has changed—
and continues to change so rapidly that it’s difficult to keep abreast of develop-
ments. The proliferation of content, the myriad delivery modalities, and even
the collective understanding of how the human mind actually learns have
driven these professionals into near-constant reconsiderations of their field
and all it encompasses. Learning professionals have reacted to this flood of
capabilities by stitching together patchwork systems of systems. As teachers
approach them with new requirements, usually coupled with a request for
a new technological capability, learning professionals have expanded their
patchworks accordingly. The result is a workable set of individual tools, but
only just. Often, learners and teachers have to switch between capabilities—a
tool for audio/video, another tool for asynchronous chat, still another for syn-
chronous collaboration. It is a “time of plenty,” but it is also plenty confusing!

Learning professionals are starting to describe these product-and-service
composites as “ecosystems,” with the term “ecosystem” adopted from biolo-
gy. Scientists describe ecosystems as groups of living organisms interacting
with one another and with their environment, with a high level of interdepen-
dence. Some ecosystems, such as an ecological biome, are ungoverned, but
others have some centralized mechanisms. A good example of this scientific
understanding is the human body. The various organs each perform specific
functions, but they work together, within an environment that provides ox-
ygen and nutrients, to ensure the overarching system (the body) functions
successfully. It’s a complex system of systems that’s also managed centrally,
as all of these functions are governed by the human brain.

Without centralized governance, the various components of an ecosystem
cannot maximize effectiveness and efficiency.
For our learning ecosystem to function optimally, it needs centralized co-
ordination, but where should it come from? An initial, obvious answer is to
look towards technology vendors. For instance, Apple was an early leader in
the system-of-systems technology movement. Apple realized it could increase
its market share by making all of its devices work with one another and by
simultaneously allowing users to personalize their networks, build content,

Most of our challenges have been cultural and political, not
technical or operational. If people can see the big picture, and
they can see where they are and why it makes sense, it can be very
beneficial. If I can get them to see it, then they can understand it,
and more importantly, they can carry the message to the next office
because it makes sense.

Reese Madsen
Senior Advisor for Talent Development, U.S. Office of Personnel Management;
Chief Learning Officer, Office of the Secretary of Defense (Intelligence and Security)

and control their cross-platform experiences. Microsoft and Google followed
suit. In each case, the connection between customers, their hardware, online
capabilities, and content increased the effectiveness of each component and,
in turn, its value to customers.

Looking to large-scale technology or media companies to orchestrate the
interoperable systems, implementation and operation processes, ethics and
norms, and organizational policies for a learning ecosystem, however, is a
risky prospect. The learning ecosystem concept necessarily involves many
diverse components, likely derived from different vendors, across organiza-
tional boundaries, and for different phases and aspects of learning. Seeking
centralized oversight from a single corporation risks “vendor lock” or confine-
ment to potentially expensive and proprietary solutions. Further, many key as-
pects of governance extend beyond technology, media, data, or delivery. Each
organization will want to answer these sorts of questions for itself, away from
the commercial interests of even the best-intentioned industry organizations.
For instance, how an organization chooses to use learners’ data, how tightly
coupled talent development systems are with human resources functions, and
how best to negotiate between stakeholders’ competing requirements are all
key governance considerations.
We’re such a small state that we can’t build our own systems. This
means we need to be the best “masher-uppers.” We’ve worked with
other New England states, but now we’re focused more broadly.
It’s not so much urban versus rural, it’s that we’re an outlier, a
progressive state that’s always focused on the individual learner.
We’re in a slightly different place than other states because we’re
not a top-down, centralized education system. Rather, we put a lot of
emphasis on local control.

Daniel French
Secretary of Education, Vermont Agency of Education

For the most part, too, education and training vendors have been less con-
cerned about governance and more concerned about sales. Governance is a
customer concern. So then, the question for customers—for those organiza-
tions who design and deliver learning—is: How do we create a governance
structure that both centralizes general oversight of the ecosystem while simul-
taneously maintaining necessary flexibility that allows for content ownership
by communities, data ownership by users, and tool creation by developers?

E PLURIBUS UNUM
(OUT OF MANY, ONE)

A look back through American history provides an instructive example of
how one might develop a governing structure for an ecosystem. Like the in-
dependent systems of the first educational forays into online learning, early
American settlements existed in relative isolation from one another. The set-
tlements were responsive to their inhabitants’ needs but looking holistically,
there was a great deal of overlap and duplication in governmental functions.
Each settlement handled its security, infrastructure, communications, and
transportation needs often without even considering other settlements. As
these settlements grew, interdependencies developed to create colonies. Each
colony had its individual identity, its own governance structure and, as with
settlements, only limited concern for the wants and needs of neighboring col-
onies. This changed, however, with the arrival of a common threat.

The move toward independence from England, which precipitated the arrival
of what was then the world’s most capable military force, drove a loose alli-
ance among colonies. At first, the colonies attempted to keep their indepen-
dent identities, with primarily decentralized control; however, this first gov-
ernance structure, the 1777 Articles of Confederation, proved a failure. The
Articles failed to create a sufficiently strong, centralized government capable
of guiding the fledgling nation. This resulted in infighting and made the cen-
tral government unable to overcome challenges or capitalize on opportunities
collectively.

As the weaknesses of this confederated approach became obvious, represen-
tatives from across the colonies—the men who became the Framers of the
Constitution—gathered to reconsider their centralized form of governance.
Some argued emphatically for simply modifying the Articles, retaining the
balance of power at the colony level. Others took an enterprise approach, ar-
guing that only a strong, centralized government would be able to quell the
bickering that had made the Articles-based government so ineffective.

In 1788, the Framers’ U.S. Constitution was ratified, implementing a unique
“federalized approach”—a state within a state in which the former colonies
(now “states”) were provided the authority for tactical issues, while the
centralized government retained supreme power and oversight to deal with
those issues affecting the enterprise (the entire nation). Such a “federalized
approach” to governance is an ideal structure for learning ecosystems!

The evolution from a loose affiliation of learning-focused entities, each with
its own needs, systems, and rule sets, to an overarching centralized gover-
nance solution parallels the Air Force’s experience in designing and deploying
its “Learning Services Ecosystem.” From the authors’ interactions with other
agencies, the evolutionary track is remarkably similar for a wide variety of or-
ganizations, whether from the industry, academic, or government sectors. In
each instance, success was predicated on the organization’s understanding of,
and commitment to, an enterprise solution coupled with the ability to receive,
evaluate, and act on the varying needs of all organizational constituencies
within the ecosystem. In other words, where governance has proven most suc-
cessful, there’s been an intentional balance between individual constituents’
needs and the centralized needs of the community.

Since humankind first saw the need to join together to satisfy common needs,
there has been some form of governance. Learning ecosystem governance is
no different. An effective governance structure is born out of a small group of
professionals who decide to combine their individual needs, capabilities, and
resources to provide better support for, and service to, their organizational
constituencies. These professionals come together to discover the breadth of
the organization’s stakeholders and the key issues to be addressed. They then
work across the organization to select representatives—the framers—who
discuss the issues, create an ecosystem charter, and manage its governance
over time. It’s a labor-intensive and emotional process, but when successful,
it’s an extraordinarily fulfilling undertaking.

IMPLEMENTATION
The process through which ecosystem administrators can design and imple-
ment a governance structure necessarily includes the following steps:

Step 1: Identify Stakeholders and Select Framers

The first step in establishing governance necessarily involves identifying the
breadth of inclusion: Which entities (colonies) will be included and which
will be left to fend for themselves? Next, there has to be an opportunity for
the entities to come together to share their wants, needs, expectations, and
resources. These stakeholders will become the initial architects of the ecosys-
tem governance structure. To make this opportunity successful, organizers
must ensure the appropriate representatives are selected to participate. These
representatives will become the “framers” of the new ecosystem charter. Or-
ganizers can consult with stakeholders for nominations but may also ask to
have certain personnel appointed for their special skills or knowledge.

Because of the technological focus of ecosystems, organizations are likely
to send representatives from their most technologically advanced programs:
technology experts who understand systems, data, and the capabilities avail-
able in the marketplace. This is expected and desired; however, representa-
tives from all stakeholder groups need to be included as well. Collectively,
the framers will need to understand the entire organization’s needs, products,
processes, and capabilities. Without a holistic understanding of the organiza-
tion, the framers are likely to ignore key constituencies or issues.

The framers should also include members who can think locally, addressing
individual requirements and concerns, while also thinking globally to under-
stand an enterprise perspective. It’s not always possible to find people who
can do both; so, organizers should try to find a balance among the selected
members to ensure all the constituencies are heard. The result shouldn’t be
a patchwork of individual interests, rather the collective perspectives should
inform an overarching strategy for addressing the broadest array of require-
ments and desires.

Step 2: Select Issues

Once the constituencies are determined and framers selected, organizers will
need to consider the breadth of topics to discuss. The selected framers will
undoubtedly expand the discussion when they meet, but it’s necessary to have
“an entering argument”—a list of key questions to answer. These will vary
with each organization’s unique situation; however, the following brief list
might prove helpful in building a governance conference, as they are some-
what common to most organizations:

MEMBERS

1. Who determines who “joins” the ecosystem? One centralized administra-
tive function involves determining who may “join” both in terms of people and
organizations who want to belong, and also in terms of systems and capabil-
ities that constituencies might want to integrate into the ecosystem. The gov-
ernance structure must provide avenues for entry and, simultaneously, ensure
new people and new capabilities aren’t injurious to others within the enterprise.

2. How will constituencies be represented? Representation is foundational to the
success of a governance structure, as it ensures constituencies have a voice
in the design, development, and direction of the ecosystem across its life.
There’s a risk in representation, however. Constituencies need to be heard,

Members
Who determines who joins the ecosystem?
How will constituencies be represented?
How will the governance structure be organized?

Policy
Who is responsible for establishing centralized policy?
Who will enforce policy?
How can enterprise-level functions be supported?

Resources
Who will provide the resources and how?
Who will provide support and how?

Processes
How can the ecosystem address change?
How can the ecosystem remain relevant and responsive?
How does the ecosystem interact with partner/other organizations?
How will users experiment and adapt?

but the governance structure must ensure no single constituency takes control
of the ecosystem to the detriment of others. In addition to rules for expected
behavior, mechanisms are needed to censure misbehaving representatives or
shed inactive ones.

3. How will the governance structure be organized? There are multiple ap-
proaches; however, an approach needs to be selected, coordinated, approved,
and promulgated so all constituents understand where their representation
lies, where decision-making authority lies, and where they can go to request
reconsideration of their proposals should they be denied. The model adopted
by the Framers of the U.S. Constitution (the federalized approach) is worthy
of consideration: The centralized (“national”) government oversees enter-
prise-level concerns while subordinate organizations (“states”) have the capa-
bility to make certain changes to keep their operations moving.

POLICY

1. Who’s responsible for establishing centralized policy? Like the federalized
approach to U.S. governance, some functions and decisions will affect all
constituents, while others are best handled locally. It’s necessary to determine
the functions that affect multiple constituencies, as well as the constituencies’
needs and processes for managing these centralized functions. How will ag-
gregate requirements be identified, needs agreed upon, decisions made, and
results promulgated across the ecosystem?

2. How will the ecosystem address change? Change is difficult. Framers will
need to consider a variety of potential scenarios to devise a system responsive
to change. The following scenarios present examples framers might consider:

U.S. AIR FORCE LEARNING ECOSYSTEM GOVERNANCE


The U.S. Air Force is deploying its Air Force Learning Services
Ecosystem. Air Education and Training Command built the ecosystem
and also established its charter, a managing body that oversees its
operation, and its policy and support structures.
The ecosystem’s governance structures were adopted from the model
prescribed in the IT Infrastructure Library, the British Government’s guide
to IT service management. It’s a hierarchical model, much in line with the
approach prescribed in the U.S. Constitution. For the U.S. Air Force, at
the enterprise level, there’s Force Development Governance, overseeing
how the Service will develop Airmen, how many will be developed, and
in which areas. Below that, there’s an operational level of execution—Air
Education and Training Command—overseeing the specific programs
supporting Force Development and IT/Educational Technology.
Air Education and Training Command manages ecosystem operations
with a level of decentralized execution, so stakeholders can address their
own concerns, but where users’ concerns have the potential to affect
the entire ecosystem, they’re elevated for an enterprise-level solution.

• A new training program is created per a senior leader’s directive. Its
administrators wish to claim high levels of synchronous bandwidth.
Ecosystem administrators need to know how this will be funded.
• Multiple games and simulations run within local systems. Ecosystem
administrators will have to determine which will migrate to the
ecosystem and what opportunities might exist for sharing technological
advances inherent in the best of these with other ecosystem users.
• Senior leaders have opted to increase the workforce. Ecosystem
administrators will have to determine how the enterprise will support
this increase in throughput. If external education/training is required,
they will also have to ascertain how the ecosystem will track learning
occurring outside of the organization.

3. Who will enforce policy? This is an important consideration, as constit-
uencies will often bring special talent to bear to change or incorporate new
capabilities, software, or hardware into the ecosystem. How will unauthorized
variations be detected and how will they be handled?

RESOURCES

1. Who will provide support and how? Support is a complex topic and one
often overlooked in the rush to bring aboard new capabilities. Systems tend
to come with “a maintenance tail” to keep them functioning effectively and
current to industry and security standards. More importantly, users—be they
teachers, learners, data analysts, or records keepers—need support too. The
governance framers, in their desire to balance enterprise-level and individu-
al constituency-level concerns, may opt for the federalized approach, where
some level of support is provided locally and other support nodes are cen-
tralized for the entire ecosystem. Support is often a major hurdle for fram-
ers as new ecosystems come on line: Users will want to retain their existing
support capabilities while ecosystem administrators tend to favor centralized
approaches. This is a critical resource consideration.

2. Who will provide resources and how? This question should drive framers
to discuss the sources, types, and quantities of resources required, and who
can provide them. It’s a broad category, encompassing money, manpower,
machines, infrastructure (facilities, electricity, internet capability, etc.), and
much more. Some resource considerations follow:

► Funding – Centralized funding is attractive for constituencies but, with-
out their investments, they may find it easier to strike out on their own
when decisions don’t go their way. Framers shouldn’t underestimate the
power of constituencies “having skin in the game!” For government en-
tities (and some non-government organizations as well), we often find
financial resourcing extraordinarily complex, as funds are split across
organizations and labeled for very specific expenditures. “Pooling re-
sources” becomes surprisingly difficult, which creates a risk of promot-
ing individual actions and encouraging redundancies. Framers should
ensure their resourcing strategy doesn’t create “insurgencies” within
their organization.

► Manpower – The pooling of manpower is often recommended as an ap-
proach to enhance efficiency; however, it’s often predicated on a notion
that dispersed manpower possesses some level of excess capacity that
will be employed most effectively if amassed. That’s not always the case.
If five people, working in five organizations, are overwhelmed with their
existing workloads, having them bring their workloads to a common
location will simply increase the difficulty they experience serving their
former constituents, making them even more overwhelmed. So, while
there’s often value in centralizing some functions, care must be taken
to be realistic in the level of effort required and to find the best balance
between local and centralized labor resources.

► System integration – For good reasons, constituencies will often argue
to retain their systems. Training and conversion costs, and the trauma
of switching systems, are very real concerns. Yet the governance struc-
ture will have to find efficiencies and ensure that systems work together.
Recognizing potential duplications and overlaps, and dealing with these
fairly, is an important part of technological governance. Those constitu-
encies forced to adapt must receive sufficient assistance to ensure their
operations are not adversely affected.

PROCESSES

1. How will users experiment and adapt? The educational technology market
is changing constantly. Users will want to explore new capabilities to meet
their organizational needs. Restraining creativity will frustrate users and
drive them out of a centralized-governance approach. The best way to counter
this is to provide space for experimentation—an “innovation sandbox.” This
approach supports the insatiable appetite of some users for tinkering; howev-
er, it also drives these users to follow system protocols that govern the entire
ecosystem. This approach benefits all: The ecosystem is not corrupted by ex-
perimentation, and those experiments that prove worthy of pursuing have al-
ready demonstrated an ability to function within the ecosystem successfully.
An additional benefit is the way this approach aids in “policing.” The innova-
tors who leverage the sandbox are much less likely to try to sneak capabilities

OVERALL, TOP-DOWN CONTROL OF SCHOOLS IS PROBLEMATIC. We
need more of a resource model that focuses on the question, “How
can the top help you do your job?” to encourage more autonomy,
diversification, and innovation. When you have a really bloated
bureaucracy, it doesn’t help people.

Benjamin Nye, Ph.D.
Director of Learning, Institute for Creative Technologies, USC

onto the ecosystem (with potentially dire consequences for the enterprise) if
they have an approved place and method for experimentation as well as a way
to advance their successful innovations to the central governance structure for
adoption into the ecosystem.

2. How can enterprise-level functions be supported? There are organization-
al-level decisions that must be addressed within the ecosystem, such as the
workforce end-strength, qualifications the workforce must meet (learning
needs), and the transcripting or certification of learning. All of these ques-
tions, and many more, must be considered by the framers establishing ecosys-
tem governance. For the Air Force, negotiating the enterprise-level functions
required participation from several overarching working groups, including the
Service’s Force Development Council, which addresses strategic-level consid-
erations affecting training, education, and experiential learning; an Air Force
Learning Council, which determines content requirements for specific pro-
grams; and an Air Force Educational Requirements Board, which determines
advanced academic-degree and professional military education requirements
for Airmen. Each of these strategic-level bodies has data requirements, and
each produces decisions that drive ecosystem functions.

3. How does the ecosystem interact with external or partner organizational
systems? If the learning ecosystem is set up to provide certificates, badges,
or some other credential, are talent-management systems capable of leverag-
ing those credentials for decision-making? Will supervisors have the means
to verify employees are properly trained to perform specific tasks? The in-
tegration of the learning ecosystem into the overarching organizational IT
structure is foundational to its value to the organization. This is complicated,
requiring a strategic mindset to establish and maintain integration. Two ex-
amples will help explain an administrator’s concerns, and a brief data-transfer
sketch follows them:
• A partnering organization wants a reciprocal arrangement through
which their employees can learn within your ecosystem and receive
credit electronically, delivered to their personnel records system. Your
leaders want the same for employees who train in their programs.
Ecosystem administrators will have to craft these reciprocal
agreements and develop the data-transfer capabilities to make these
arrangements successful.
• A local community college would like to partner with the
organizational training unit to offer associate degrees. The college
wants access to the employees’ training records as well as the ability
to report courseware completions back to the ecosystem. Senior
leadership agrees; they want this too.
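As a hedged illustration of the first example, the sketch below shows the kind of record such a data-transfer capability might exchange, using an xAPI-style “actor, verb, object” statement. The verb IRI is a published ADL xAPI vocabulary entry, but the learner, course, and partner identifiers are entirely hypothetical, and a real reciprocal agreement would also have to settle authentication, privacy, and record-matching.

```python
import json

# Hypothetical completion record a partner's system might report back to
# the ecosystem for one of our employees. The actor/verb/object layout
# follows the xAPI statement structure; all identifiers are illustrative.
statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "A. Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://partner.example.edu/courses/intro-logistics",
        "definition": {"name": {"en-US": "Introduction to Logistics"}},
    },
}

# The receiving system would validate the statement, then write it to the
# employee's personnel or learner record per the reciprocal agreement.
print(json.dumps(statement, indent=2))
```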

4. How can the ecosystem remain relevant and responsive? There must be
mechanisms in place to ensure stakeholders have awareness of what’s trans-
piring within the ecosystem and have a voice in its evolution. There also has
to be some level of senior-leader oversight to adjudicate disagreements that
arise between stakeholders and ecosystem administrators. Lastly, like the
U.S. Constitution, there must be a mechanism for updating the governance
structures and policies. How will the organization drive change in the ecosys-
tem to ensure it continues to meet the needs of the future?

Step 3: Build a Charter

Once all issues have been debated and preliminary decisions made, the fram-
ers should produce a charter. This, in effect, is the ecosystem’s constitution,
describing the manner in which it will function and prescribing the processes
by which it will remain responsive to organizational and user needs. A pub-
lished charter ensures common understanding of authorities, decision-making,
and resource-allocation processes, and it outlines steps stakeholders can take
to resolve disagreements or seek change. The charter should be coordinated
through stakeholders’ organizations and concerns adjudicated by the framers
before final, senior-level approval and implementation. Once approved, eco-
system administrators must adhere to the charter precisely. Doing so ensures
transparency in ecosystem administration but also serves to reduce the number
of complaints or offer a credible defense should complaints surface.

Returning to the Heraclitus quote that began this chapter, “Life is flux; the
only thing that is constant is change.” Ecosystem administrators will face
change. Charters are created for specific needs at specific moments in time.
Those needs can change. The U.S. Constitution, for instance, was ratified
in 1788. In the course of its existence, 33 amendments have been proposed
by Congress and sent to the states for ratification. Of these, only 27 have
been ratified and have become part of the Constitution. Arguably, each of
these proposed amendments represented a disagreement between contemporary
Americans and the Framers that had to be addressed and
resolved. Through the ratification process, the nation keeps its governance
aligned with its evolving needs. Ecosystem charters need to be simi-
larly responsive. Change should be possible, but the change process should be
sufficiently difficult, so the charter isn’t in constant flux. Should that happen,
the charter will lose its power and meaning. All stakeholders should have a
voice in changes to the charter, so they can weigh the advantages and
disadvantages and respond appropriately.

Step 4: Coordinate (Ratify!) the Charter

There is a tendency, within modern organizations, to employ a hierarchical
“coordination process” for the approval of organizational positions or initia-
tives. This seems logical; however, in returning to the example of the ratifica-
tion of the U.S. Constitution, one can discover even more wisdom in the Fram-
ers’ approach: Although the Constitution established a representative form of
government, where elected and appointed officials would bring the needs of
their people forward for debate, it’s interesting to note this isn’t the system
the Framers established for ratification of their governance structure, their
Constitution. Instead of handing this task to legislative bodies—the estab-
lished hierarchy of governance—the Framers authorized “conventions.” They
understood the existing colonies’ governance structures might not be inclusive
enough, so they authorized this approach. Conventions were held across the
nation. Most had lax participation requirements, much more permissive than
the requirements for a governmental position. As a result, a vast array of
constituencies could step forward, air concerns, and identify strengths and
weaknesses in the proposed constitutional-governance structure.

We need a common space where key actors in postsecondary learning can
coordinate without hampering innovation. It’s an important piece of bringing
the systems together. We need touchpoints without over-programming.

Amber Garrison Duncan, Ph.D.
Strategy Director, Lumina Foundation

Ecosystem administrators should be equally inclusive in establishing their
“governance conventions,” to maximize inclusion. They should provide the
charter to the various stakeholder groups to let them discuss it and provide
input. Certainly technology experts must be consulted—but so must
personnelists, organizational planners, and, of course, trainers and educators.
They will guide organizers in building an ecosystem that delivers
learning effectively. Since the ecosystem will produce data, it’s also important
to include those people who will have to access and employ ecosystem data.
Consider providing the charter to the registrar or human-capital record-keep-
ing departments. Lastly, don’t forget the learners! To maximize effectiveness,
the learning ecosystem will have to be designed, developed, and deployed
with learners in mind. How will the system meet learners’ wants and needs
if they’re excluded from the governance discussion? Consider “conventions”
carefully; those excluded from consideration are likely to become the most
resistant to the resulting governance structure.

Step 5: Build Responsiveness into Administration

Much in the same way that organizers cast a wide net in establishing their
conventions, ecosystem administrators should ensure all stakeholder com-
munities remain aware of and involved in the evolution of the system. This
requires managers to identify and continually refine the requirements of the
supported population. “Responsiveness” is the watchword, requiring that
managers receive and respond to needs quickly and accurately.

In addition to responding, ecosystem administrators must be proactive in
providing feedback to stakeholders on system operations. Metrics, for instance
those addressing support, system functionality and availability, and costs, are in-
valuable for ensuring stakeholders are attuned to the enterprise-level require-
ments of the ecosystem. These must be provided to stakeholders regularly and
can also guide senior leaders’ resourcing decisions for ecosystem investments.
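As one small illustration of such a metric, the sketch below computes a monthly availability figure from hypothetical outage durations; the log format and the choice of a 30-day reporting window are ours, not a prescribed standard.

```python
from datetime import timedelta

# Hypothetical unplanned-outage log for one ecosystem component this month.
outages = [timedelta(minutes=42), timedelta(minutes=7), timedelta(hours=1)]

month = timedelta(days=30)
downtime = sum(outages, timedelta())   # total outage time
availability = 1 - downtime / month    # fraction of the month the system was up

# A simple figure stakeholders can track month over month.
print(f"Availability: {availability:.3%}")  # -> Availability: 99.748%
```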

Although the governance structure has already been addressed, with the need
for each stakeholder to have a voice in system administration, ecosystem ad-
ministrators must ensure there’s transparency in this process. “Frequently

School districts are a traditional, sole-service delivery model, and districts have
exclusive rights over the learning needs of the students assigned based on residence.
They have to be all things, to all kids, all of the time. This is impossible, and it’s arbitrary
because it’s based on just where you live. If a kid wants something but the school
doesn’t have it, we assume the kid is wrong but the system is right. For example, if a
kid in a rural community loves art but the district doesn’t offer much art, we ask the kid
to put the passion on hold and instead get excited about history or some other offering
the district is good at. We say that the district is right and the student is wrong—in a
deep profound way. But the kid and family are right and the system needs to adjust
and adapt to provide those pathway options. Of course, districts can’t do it alone; they
have to form partnerships.

Ken Wagner, Ph.D.
Education Commissioner, Rhode Island Department of Education

asked questions,” chat rooms for stakeholder feedback, meeting minutes for
governance meetings, and regular, open communications between adminis-
trators and stakeholders are critical to building trust across the organization.
Administrators should also alert users to current and upcoming problems,
maintenance schedules, and actions taken to resolve problems. Too much in-
formation is better than too little in building trust. Resources are limited, and
administrators invariably have to deny stakeholders’ requests. Trust and trans-
parency aid considerably in how a negative response is received and perceived.

Administrators must also engage stakeholders in bringing aboard new com-
ponents. Cooperative efforts that maximize participation can expand inter-
est in and support for new capabilities. In addition, administrators may find
stakeholders who can benefit from such initiatives willing to share resources
for their implementation.

Step 6: Address Grievances

Although cooperation is the aim, there will be disagreements. There must be
an organizational “court of appeals” for those situations where administrators
and stakeholders disagree. There must also be a level of strategic oversight to
ensure the ecosystem and all its stakeholders are moving together to satisfy
organizational needs. Most organizations have some level of “corporate struc-
ture” that facilitates strategy-making, execution, and decision-making. Eco-
system administrators must ensure their operations are included within that
corporate structure. To ensure their senior leaders comprehend the value of
ecosystem operations and the challenges faced, administrators should report
regularly to senior leaders. Within the Air Force, for example, ecosystem ad-
ministrators periodically send written reports to Major Command command-
ers and Air Force senior leaders. In addition, there’s an oral report—the State
of the Command—that specifically addresses the ecosystem and is delivered
by the Force Development Commander to the assembled body of Air Force
senior leaders at an annual conference.

We need a new approach that enables our people to innovate
effectively across the DoD training and education domain.
There’s good strategic guidance from our government leadership, but
it’s not carried forward in a meaningful way because it’s negated by
excessive and often misinterpreted policies. The problem is further
compounded by competing interests, stovepipes, and lack of resources.

A new approach should actively pursue and remove irrelevant
administration, process, and governance that kill modernization and
rapid-development initiatives. A new approach should get people out
from behind their desks and away from “business by email.” Finally, a
new approach should encourage face-to-face exchange of ideas, better
cross-organization coordination, and dedicated investment in discovery
activities outside of the traditional R&D mechanisms. Only by doing
these things will we truly achieve the transformational goals expected
of us working in the training and education space.

Dennis Mills
Program Analyst, Naval Education and Training Command, U.S. Navy

SUMMARY
This chapter described the evolution large organizations must make as they
move from functionally isolated information-technology schemes toward
enterprise solutions. The example of the American colonies’ transition from
relatively independent polities, to loosely affiliated states, and later to inter-
dependent states governed by a constitutionally ordained centralized gov-
ernment, provided a foundational metaphor to help readers orient to this
evolutionary process. In our case, today’s functional, formerly independent
learning institutions will need to come together. Although they’ll still require
some level of autonomy to address local events and requirements, events with
a broader impact must be handled at the enterprise level, across the learning
ecosystem, to capitalize on opportunities, reduce costs, and avoid unintended
consequences that can occur within this complex system of systems.

Organizations can take steps to ensure their governance structures remain
attuned to stakeholders’ needs, are stable to ensure dependability, and are
simultaneously flexible to support growth and innovation. Establishing these
internal governance structures is a critical first step in deploying an effective
learning ecosystem that supports, and remains concurrent with, evolving
learning needs and opportunities.

As with nations, initial governance starts at home, by establishing processes,
policies, rules, and norms for managing an ecosystem within a given organi-
zation. Over time, different organizations will encounter more opportunities
for interdependency, and new external governance structures, like the United
Nations or World Trade Organization in our metaphor, will be needed. As we
ponder the enormity of governance required for lifelong learning systems,
however, it’s useful to be reminded of Gall’s law, written by systems theory
critic John Gall:

A complex system that works is invariably found to have evolved from
a simple system that worked. A complex system designed from scratch
never works and cannot be patched up to make it work. You have to start
over with a working simple system.1

The most advanced learning ecosystem efforts will ensure all system com-
ponents work together, that learning is captured and reported across organi-
zational and temporal boundaries, and that the entire construct is learner-fo-
cused, giving users control over their learning and, to the extent possible, their
learning environment. Yet, for such an overarching system to be successful,
it must start locally, with well-developed processes and mature governance
methods within individual enterprises. Over time, then, we can extend those
approaches out, building the complex, lifelong learning ecosystem across our
societies—albeit, one step at a time.
Everything people want is obtainable, but there’s a great deal
of myopic thinking, especially within the government. We hear
too often, “We’ve never done it that way, so why should we change
now?” The focus is too often less about the mission and more about
the change. Having a good change agent is key. We need the executive
branch pushing from the back and Congress pulling from the front. It
needs to be a comprehensive system to be effective.

Reese Madsen
Senior Advisor for Talent Development, U.S. Office of Personnel Management;
Chief Learning Officer, Office of the Secretary of Defense (Intelligence and Security)


CHAPTER 18

CULTURE CHANGE
Scott Erb and Rizwan Shah

Our current learning systems were developed in response to the industrial
revolution and the accompanying shift from an agricultural civilization to an
urban, manufacturing society. The focus of much of education and training
was to produce people ready to enter the workforce with predictable, well-
known, replicable skills that matched the needs of the industrial economy. In
order to produce these workers, the system required teachers who also had
predictable, well-known, replicable skills in teaching. Thus, a system of “nor-
mal schools” was built to train teachers.1 But as we shift to an information
economy, a new set of skills not easily taught within the existing education
and training framework will be required, which will drive a shift in the way
we imagine, approach, and develop learning experiences.

Rethinking learning from the industrial model to the information model
will necessarily be disruptive to existing organizations. The adoption of new
learning science methods and technologies will require change to their cul-
tures, shifting away from incremental compliance cultures with established
delivery and assessment methods to more fluid multiplatform and multimodal
methods, coupled with pervasive data capture and advanced analytics. Orga-
nizations able to successfully navigate this cultural change will thrive; those
unable to do so will be left behind.

This chapter discusses some considerations for the change in culture that will
be needed to modernize learning, remove barriers, and restructure incentives
to inspire the organizational shifts needed in order to achieve the future learn-
ing ecosystem.

The culture shift—that’s probably the most difficult thing to navigate.
Kurt VanLehn, Ph.D.
Professor, Computing, Informatics, and Decision Systems
Engineering, Arizona State University

FEAR OF CHANGE
Undertaking significant organizational changes can create feelings of uncer-
tainty, anxiety, and threat.2 When the area of change is something
as fundamental as learning, such fears can multiply.3 If not adequately ad-
dressed, these feelings can manifest as either passive or active opposition,
resulting in immediate failures and resistance to future attempts.4

There are various human factors that make change difficult,5 some of which
are particularly relevant for the future learning ecosystem. For instance, con-
sider the fear of automation. The potential for AI to replace workers in the
economy, including teachers, doctors, and lawyers, has been widely publi-
cized in the popular press over the last few years.6 This has had the effect of
amplifying the natural fear of one’s skills becoming obsolete in a changing
economy.

Another related example involves the fear of losing control. The impo-
sition of change may make individuals feel their self-determination is being
undermined, particularly when that change involves increased automation,
complexity, and difficult-to-understand data analytics. Individuals might feel
uncertain about their roles, the direction of the organization, or their abilities
to contribute and maintain relevance.7 Team members who were instrumental
in creating the current way of doing business may worry about the perception
that the need for change means their way had failed. Similarly, those who help
administer the current system (such as the present-day teachers, trainers, and
instructional designers) may wonder whether they’ll be able to translate their
current skills into the new environment—will they still be competent and be
viewed as competent by others?

“Nothing so undermines organizational change as the failure to think
through the losses people face.” – William Bridges

Add to these underlying insecurities the fears of increased scrutiny. Data ana-
lytics, increasingly important in all aspects of learning and absolutely critical
to measuring the effectiveness of changes in a learning environment, may
cause concerns that instructors or program managers will be held adversely
accountable if the data reveal anything short of perfection. Learners
may feel exposed and uncomfortable, as well, as the data we can collect and
analyze gets richer and more actively informs a growing range of actions—
not just within a given learning episode but potentially affecting jobs, careers,
and lives.

Another reason some resist change is that it looks like additional work.
In general, production must often continue in the existing system while new
systems are established;8 this is certainly the case for the future learning eco-
system. Add to that the new processes and requirements of the future system,
the looming prospect of ongoing lifelong learning, and the complexity of it all.
It seems like a daunting task.

Combined, these form a landscape that leaders of organizations seeking to
innovate must understand and successfully navigate. Discomfort with change
may manifest in different ways as an organization attempts to implement
it. Within sufficiently established bureaucratic organizations, resistance can
take the form of citing page and paragraph of existing policy, or of con-
structing unnecessarily onerous approval processes. People with a long his-
tory within an organization, and who may have seen several generations of
leadership, may become passive resisters—intent on waiting out the latest fad
while continuing to execute the status-quo processes they’re responsible for.

However, for organizations to remain viable, they must embrace appropriate
change. If not, organizations that were once industry leaders can fall be-
hind, or worse, close their doors altogether. In that case, those who resisted
the change, and the leaders who failed to overcome that resistance, will have
helped bring about the downfall that they believed they were preventing. A
tough situation, but there are proven techniques to facilitate culture change
and maximize the probability that the change leads to better outcomes.

CHANGE MODELS
There are several change management models that are useful across a variety
of settings, and which can inform options for creating acceptance to advance
learning (see the figure below).9 The models vary in complexity. Some have
only a few steps, but these fail to target all the necessary areas; others have
more detail but risk draining resources and time. As such, no pre-packaged
model is sufficient. Rather, a composite of these models, combined with les-
sons learned from working within government, military, and current educa-
tion structures, must be utilized.

General Principles for Encouraging Change


CREATE AND COMMUNICATE THE UNIFYING VISION

To begin, each organization requires a unifying vision of why it exists and
why it’s changing. Organizational consultant and motivational speaker, Simon
Sinek, has written extensively on ways to develop this vision; he emphasizes
that the first step is understanding the organization’s guiding purpose, and
that it’s most powerful when framed as a statement of belief:10 We believe…

[Figure: Change management models]
• Kübler-Ross’s Model (5 Stages of Grief): 1. Denial, 2. Anger, 3. Bargaining, 4. Depression, 5. Acceptance
• ADKAR: Awareness of the need, Desire to support it, Knowledge of how, Ability to do it, Reinforcement to make the change stick
• Lewin’s Model: Unfreeze, Change, Refreeze
• Kotter’s Theory: 1. Increase urgency, 2. Build guiding team, 3. Develop the vision, 4. Communicate for buy-in, 5. Empower action, 6. Create short-term wins, 7. Don’t let up, 8. Make change stick
• Satir’s Model: 1. Late status quo, 2. Resistance, 3. Chaos, 4. Integration, 5. New status quo
• McKinsey’s 7-S: three hard elements (Strategy, Structure, Systems) and four soft elements (Shared Values, Skills, Style, Staff)
• Nudge Theory: no set process; help people change by nudging rather than using traditional methods
• Bridges’ Transition Model: Ending, Transition, New Beginning


The leader must own this statement deeply and personally, yet also develop it
collaboratively with her core leadership team. The leadership team must also
ensure that every member of the organization understands the vision—the
WHY of the organization.

Similarly, it’s important to incorporate a vision for the future, one that stimu-
lates unity of effort and inspires individuals to take initiative to move forward.
A compelling vision of what the organization looks like in the future helps
generate the buy-in and initiative needed to implement change. Sinek stresses
the importance of communicating why change is needed and reinforcing the
message frequently. Communicating the vision once and expecting it to take
hold throughout the organization is a recipe for failure.

Individuals exist within organizations, which exist within communities, which
exist within the larger ecosystem. Accordingly, when communicating across
such a diverse range of constituencies, shaping the narrative for each re-
quires intentional consideration. While managers and administrators may be
focused on inputs and efficiency, instructors tend to be more focused on out-
puts (e.g., how well are the students performing?). Communicating the WHY
for change must acknowledge each team member’s role and remain grounded
in the organization’s overarching purpose.

Finally, helping the entire organization (not just the leadership!) contribute to
this vision creates ownership, builds a common compelling story, and inspires
initiative. It’s also likely to generate ideas that leadership didn’t consider and
to reveal easy early wins to help build momentum. Open-ended questions
can help drive creativity here, for instance: What does the new normal look
like, feel like, and sound like? How do our students or employees say that
they imagine the future? What feedback might instructors give to leadership,
if the new system is working? What feedback would indicate that an experi-
ment isn’t working? What new problems does success create? Are we ready
to recognize and take on the new challenges? What are the characteristics of
a learning organization? How might we change the way we communicate to
improve organizational learning?

It’s also useful to identify key influencers at this phase in the change manage-
ment process; they can help carry messaging throughout an organization and
across different stakeholder groups. The influencers may not necessarily be
the most senior people (those with the formal authority); rather, they should
be those with the social leadership to influence the rest of the organization.
Once adequate levels of initial awareness and buy-in have been achieved, the
organization can begin experimenting with process or technology changes.

ENABLE EFFECTIVE COLLECTIVE ACTION

Innovative organizations rarely fail from lack of vision. Often, ideas are plen-
tiful, while implementation is lackluster. Innovating, especially within large,
established, and successful bureaucratic organizations depends not only on
having a sound vision but also on the ability to manage the organizational
disruption that change entails. However, it’s not the leader’s responsibility to
design and manage the implementation plan; rather, it’s critical that the entire
organization participate. The leader’s job then becomes to do only a few dif-
ficult things: (1) inspiring the team to pursue the “why” by doing things that
generally move the organization in the right direction, at generally the right
speed, (2) ensuring the team has the resources to make progress, frequently
by removing resistance, and (3) creating safety for the team by putting the
innovation and culture change risk on her own shoulders.

The leader must resist, at all costs, the temptation to answer specific ques-
tions in any form of “just tell us what you want us to do.” Providing detailed
instructions for HOW to achieve the WHY is all but guaranteed to derail the
innovation and accompanying culture change efforts. The leader must give
ownership to each team member (at the appropriate levels) to decide what
to build and HOW to build it. There are a variety of ways for the leader to
communicate this transfer of ownership; perhaps the simplest is to ask the
questioner for his intent, followed by asking whether that intent enhances the
organization’s WHY.

The team should then craft a process for how they will implement the innova-
tion and their change ideas. While the team should craft the process
themselves to ensure that the right domain expertise is incorporated and to
provide ownership of the outcomes, some general principles should be fol-
lowed to address common sources of resistance and their underlying causes.

ANTICIPATE AND HANDLE RESISTANCE

Leadership’s approach to implementing change, and ultimately to creating a
culture that thrives in rapidly-changing environments, must acknowledge the
fears that change can cause, recognize how those fears manifest in the organi-
zation, reframe them into aspirations with a strong explanation of “why,” cre-
ate safety for those who implement the change, demonstrate (rather than just
state) that failed experiments are just as (if not more) important as those that
succeed, ensure incentives are aligned to the new culture, and be persistent.

The roll-out for new processes or technology implementations should take
into account the sources and manifestations of resistance to the greatest de-
gree possible. Although attempts at perfection here will undoubtedly result
in unacceptable delays, failing to have a deliberate process that accounts for
resistance will invariably corrupt the results of experiments. If resistance to
a new experience is too high, the data collected will reflect the level of resis-
tance rather than the effectiveness of the new process or technology itself.

Notably, the design of the system itself also matters. Too often, early proto-
types are designed for minimum functionality but lack corresponding reli-
ability, usability, and user experience considerations—which distracts from
the experiment and can turn stakeholders against the entire change process.
For instance, the user interface is important. If the new tool takes more than
cursory training to begin using, the experiment is not yet ready for the audi-
ence. A new technology tool should be easy to understand, easy to use, and
make the end-users feel like they’re more effective with it than without it, all
within a few minutes. A good rule might be “as easy to use as an iPad for
a 10-year-old.” Failure to fully appreciate this will strengthen fears about
competence, skill obsolescence, or added work. Human-centered design and user
interface programming are complex and time-consuming, but users are so
accustomed to well-designed technology that neglecting design early on
may have severe consequences.

Atul Gawande, author of The Checklist Manifesto, notes that he’s never
seen the “Big Bang” approach to change succeed.11 That is, dictating a
change from the top of the leadership structure to happen at a specific
place and time rarely works. Clearly an approach
that respects the intent of leadership while also preserving ownership of
outcomes and processes, and inspiring innovation at the point of contact
between provider and customer (in the old model, between student
and teacher), is needed. This approach should be common enough to be
replicable, but flexible enough to be rapidly tailored to specific cases, and
to grow as an organization’s experience grows. Furthermore, it should
be maintained deliberately to ensure that lessons learned in the change
process are collected, understood, and disseminated. If critics see mistakes
repeated, they’ll become more effective critics! We suggest creating a guide
for introducing new projects within the organization. This guide should be
owned by the innovation leader (who may also be the organizational leader
or a senior member who reports to the leader), and used and updated by the
project managers.

Another unique area of concern involves the use of learning data. Clarify
upfront what data will be collected and how it will be used. Understanding
learning outcomes and modernizing learning will require handling big data
and advanced analytics. In learning environments, it’s tempting to focus most
of our attention on students. Teachers, staff, and program managers will also
want to understand that they and their data are safe.

IMPLEMENT INCENTIVES AND REWARDS FOR EXPERIMENTS

Developing a culture that thrives on change depends on the ability to ex-
periment—to innovate, to rapidly try out new ideas, and to learn from these
attempts. That necessarily means change, innovation, and innovative organi-
zations are dependent on failing early and at low cost. Consequently, leaders
must not only create the time and resources for experiments, but also publicly
reward experimentation—especially when it “fails.” Business, military, and
government leaders are familiar with the value of publicly acknowledging
team members for exceptional performance, but the norms against failure of-
ten make celebrating falsified hypotheses an unfamiliar event.

Astro Teller, director of Alphabet’s moonshot factory, Google [x], has a meth-
od for doing so that may serve as a best practice for innovators in the learning
domain. Teller explains that giving lip service to the idea of “failing fast” isn’t
enough. Employees need to be free of the fear of punishment—and in fact
truly believe they’ll be rewarded—for failing fast, that is, for learning and for
rapidly sifting through possible avenues for innovation and change. As Teller
recently explained in a podcast: 12

When one of our projects that actually has, like, a nontrivial number of
people, at least a few people full time on it, ends their project…we bring
them up on stage, and we say, “This team is ending their project today;
they have done more in ending their project in this quarter than any of
you did to further innovation at [x] in the quarter.” …then I say, “And
we’re giving them bonuses…You know what guys? Take a vacation, and
when you come back the world’s your oyster. You’ll find some new proj-
ect to start or you can pick which project to jump into, depending on
which one’s going best.…The word failure, and trying to get people to
fail is a bit of a misnomer.…Failure when it’s actually just “you got a
negative result for no reason and it’s meaningless” is a bad thing. I’m not
pro-failure; I’m pro-learning.

IMPLEMENTATION PLAN
Connecting the general theories, methods, and models of change manage-
ment, we recommend a hybrid approach that capitalizes on the key points of
each. Six areas of focus are recommended when starting the culture change
process for modernizing learning systems.

Educate

The first step towards preparing an organization to embrace the future learn-
ing ecosystem concept involves communication and foundational (re)educa-
tion. Resetting the WHY of the organization is critical to avoid repeating
the culture change process with ever-increasing frequency. The future
learning ecosystem idea isn’t a defined end-state but, rather, a commitment
to ever-evolving, learner-focused support via interoperable technology and
other emerging capabilities. Our goal, therefore, is to foster an organizational
culture that embraces change as a way of life rather than an organization that
has successfully navigated from one static state to another.

There’s nearly always a fear of change. The goal is to reduce this fear by
increasing education about the change. Extra time needs to be spent helping
people understand what they need to achieve and, of course, why. It’s not just
about garnering their buy-in; it’s about reducing their fear. Step one, then, is
to ensure everyone is educated on the goals for the future learning ecosystem.
For example, explaining the value of interoperability at the technological level
and envisioning the new methods learners and teachers will use to operate in
a human-computer shared space will be important. However, the next step is
to listen: To carefully consider stakeholders’ fears and give them an oppor-
tunity to work through their concerns, contribute to the larger vision, and
become ambassadors for the idea in their own ways.

You have to unleash people
and empower them using
climate and culture.

Ken Wagner, Ph.D.


Education Commissioner
Rhode Island Department of Education

Support

Everyone needs to know where and how to get support, not just philosophi-
cally, but also from a management perspective. Within the U.S. Government,
the Office of Personnel Management’s USALearning program provides an
immediate go-to for the development of this system, and the ADL Initiative
offers support for research associated with new aspects of it. Within higher
education and K-12, other support systems are being developed; for example,
the Lumina Foundation and U.S. Chamber of Commerce are working together
to support employers and employees making this transition. Further, other
standards organizations and professional societies, such as the IEEE, can also
offer guidance and recommendations to government, academic, and industry
constituents.

Providing resources is another important aspect of support, whether those
involve time, labor, or financial investments. We often see situations where
people are given a new mission (e.g., “we expect you to increase employee
engagement”) but aren’t given any ideas, resources, or support to aid the pro-
cess. If people are expected to make changes, they’ll need resources to do
so—not only to support the change, itself, but also to facilitate the overhead
required by the change process. Commitment to change requires resources,
and more than that, it requires a demonstration of “skin in the game” via the
allocation of resources.

Buy-In

What’s the return on investment? That’s the question at the highest level,
but at personal levels, individuals need motivation and will ask themselves,
what’s in it for me? (or WIIFM, usually pronounced as “wiff-um,” a com-
mon acronym in the military). So, both quantitative, logical messages and more
personal, evocative messages need to be crafted. That is, we need to consider
both the ROI and WIIFM for teachers, instructors, managers, leaders, senior
leaders, and learners, as well as for businesses, schools, universities, and gov-
ernment agencies. They need to understand why these changes need to occur
and the pathway through the transition. They need to understand why it will
help them personally and how it will be implemented and/or integrated with
existing systems.

Making the transition feel easy is one of the hardest challenges to tackle
and among the most important to get right. This book is meant to serve as
an initial step in that process. It’s intended to help paint a picture of the “art
of the possible” and take the first steps towards clarifying why these changes
will improve the system; however, the specific buy-in rationale will be unique
to each organization and stakeholder group.

Multi-Messaging

It’s one thing to make changes within a small system or even within a depart-
ment where like-minded or similarly oriented individuals reside. However,
once change is nationwide and includes systems of systems as well as multiple
communities, it requires multiple but complementary messages to be culti-
vated and disseminated. In this case, it’s necessary to achieve two primary
goals: (1) ensure that the messages to the individual communities (e.g., K-12,
higher education, employers, military, and government) are in line with their
singular goals and (2) that there’s a meaningful message that transcends and
unites these communities. In particular, we need to be clear that the benefit
to both human development and our national development lies in the coordination
across these communities, that is, in collectively optimizing learning and
development. The future learning ecosystem requires that we have a shared,
single goal but with an unlimited set of pathways for attaining it.

ORGANIZATIONAL INNOVATION USE-CASE
From Kendy Vierling, Ph.D., Director, Future Learning Group, USMC TECOM

[Figure: a continuous organizational learning cycle: new observations and
exploration; ask questions; assess, evaluate, test, or experiment to gather
data; analyze data and form conclusions; communicate results and
recommendations; inform organizational learning methods, policies, procedures,
systems, and processes; distribute new knowledge, insights, and practical
applications to stakeholders; and integrate and implement new learning insights
where relevant at the individual, group, and organization levels.]

The U.S. Marine Corps Training and Education Command (TECOM) Future Learning
Group demonstrates how an organization can implement evidence-informed
organizational learning processes to support innovation. Established in 2017,
the TECOM Future Learning Group is a special staff unit that advises the
Commanding General of TECOM. Its mission is to seek and assess innovative
methods and technologies to improve Marine Corps training and education. The
figure above shows their process.

Beginning with “new observations and exploration,” the group contributes to
organizational learning by identifying current and future Marine Corps
learning needs, competencies, gaps, and goals—and how they relate to the
individual, group, training and education units, and the overall Marine Corps.
Next, the group scans the horizon for emerging science and technology, such as
augmented and virtual reality–based training simulations, adaptive mobile
learning applications, and new methodologies for enhancing instructor
development. They ask questions to explore the prototypes, test new methods
and technologies, gather data and analyze it to form conclusions, and
ultimately provide recommendations to TECOM leadership. These results and
recommendations go on to inform organizational learning methods, policies,
procedures, systems, and processes.

The TECOM Future Learning Group also shares the knowledge and practical
applications they uncover with stakeholders both within and (as appropriate)
beyond their Command. Findings are also integrated into current and future
Marine Corps programs at the individual, group, and organization levels, and
the results feed back into their organizational learning process, driving the
ongoing improvement cycle to enhance Marine Corps learning. The TECOM Future
Learning Group’s work helps overcome the research–practice gap and more
rapidly integrate new capabilities into Marine Corps programs. It also
facilitates organizational culture change, encouraging more innovation in
Marine Corps training and education—helping the Service move from an
Industrial Age model of learning to an Information Age paradigm.

Compliance and Policy

Individuals in compliance and policy roles need motivation for accepting the
future learning ecosystem concept. The stated goal for compliance and pol-
icy is often to ensure no problems occur—that is, to mitigate risk. This is
especially true in the context of information technology and associated cyber-
security and data handling. But to evolve and optimize, risk must be taken.
Consequently, we need to work with compliance and policy stakeholders to
find the acceptable amount of risk. Who decides that? Who’s responsible if
a breach occurs? These individuals have experience and knowledge, but are
often engaged later in a change process, which creates obstacles to obtaining
their buy-in or integrating their ideas into the fledgling system. We need them
to give their direct input, be part of the conversations for planning, and help
us move smartly towards this new vision of learning.

Implement

Average projects (those not tied to cultural change) usually involve linear
planning and straightforward management, with efficiency among their
performance goals. However, in an innovation context, where culture change
is a necessary precondition, different metrics need to apply. There’s a temptation
to revert to traditional managerial methods, to emphasize speed, to reward
354 | Modernizing Learning

In terms of transforming the education enterprise, we need strict or
serious policy—but not only policy, we also need resourcing, direction,
and enforcement. The devil is in the details here because if you were
to transform the training and education system into something that is
really capabilities-based then the whole flow diagram is going to change.
It wouldn’t be a block, like the “Class of 2028.” Instead, it would be a
continuous flow, and you’d have a completely different process. Some may
finish slower and present a gap in the pipeline. Others may get completed
early and be ready to move on to the next phase. But if the entire system
isn’t reformed, the next phase won’t be ready for them.
James Robb
Rear Admiral, U.S. Navy (Ret.)
President, National Training and Simulation Association

only successful trials, and to backslide into comfortable processes. That will
spell disaster for the future learning ecosystem—it cannot function without
the genuine buy-in of stakeholders or the radical change of participating
organizations.

So, a slower but more deeply rooted approach is needed. Consensus-building
working groups, community standardization efforts, and extensive commu-
nication will need to support collective implementation planning. This isn’t
likely to be a speedy process. Leaders will need to balance a reasonable sense
of urgency with a considered appreciation of the culture-change process.

Each organization will need its own experiments, incentives, and implemen-
tation plans, and these must be devised through collective participation. Sim-
ilarly, the larger community—possibly at a nation-wide level—needs to co-
ordinate. This may require extensive cross-cutting communities of practice
and will certainly mean negotiation of experiments and incentives across do-
mains. Just how this implementation plan is designed and what it will contain
isn’t yet clear; however, it’s apparent that it must serve multiple levels—for the
individual stakeholders, their local organizations, and the collective multi-or-
ganizational community. And it’s also clear that each organization will need
to devise its own messages, measures of commitment, and ways of contribut-
ing to the larger vision. We are just beginning down this pathway. We have the
opportunity to do so “the right way,” in concert and with thoughtful coordi-
nation; it’s important that we resist the urge to speed ahead with shortsighted
implementation plans that sacrifice longevity for temporary achievements. “If
you want to go fast, go alone; but if you want to go far, go together.” 13

Summary

It’s easy to avoid change, to play the cynic, wait out new ideas until the orga-
nization returns to the status quo, or find excuses to avoid uncomfortable ac-
tions (e.g., remaining in the “analysis paralysis” process). Individuals and bu-
reaucratic organizations, in particular, are often remarkably clever at finding
ways to avoid change. It’s also tempting to view the future learning ecosystem
as simply another technology—as a thing that can be installed and activated,
and then fueled with educational materials that instructional designers mer-
rily create using more-or-less conventional methods. But this won’t suffice. If
effective, the future learning ecosystem concept will extensively affect how
we each live, work, and learn. It will affect organizational dynamics, societal
systems, and maybe even the overall zeitgeist of our time. Such impacts can’t
be achieved through technology alone. They require coordination, a shared
vision, and commitment to it. They require a culture change.
…you’ve got to be opportunistic in fixing problems so that
you’re not just fixing one but rather fixing multiples. At the
same time, you have to try to build and control the narrative; use it as
a barometer and take some of the danger out of change. You realize
you’ve reached where you need to when people start to give the story
back to you. It helps to know that you’ve got a narrative from the
beginning that does something that will keep people engaged but will
also allow you to implement it later.

Jeffrey Borden, Ed.D.


Executive Director, Inter-Connected Education; Chief Academic Officer,
Ucroo Digital Campus; Former Chief Innovation Officer, St. Leo College

CHAPTER 19

STRATEGIC PLANNING
William Peratino, Ph.D., Mitchell Bonnett, Ph.D.,
Dale Carpenter, Yasir Saleem, and Van Brewer, Ph.D.

In this chapter, we explore some of the most immediate steps required to re-
alize the future learning ecosystem across educational, academic, business,
government, and military sectors. We discuss the larger system, including
people, processes, and technologies, and recommend considerations related to
its design, development, and implementation.

Today’s Learning Journey

Currently, in the U.S., most children begin formal learning in the conven-
tional education system. Primary and secondary programs follow a fairly lin-
ear, time-based model that creates a conservative, general trajectory where
children progress through academic milestones more or less as an age-based
cohort. Students are largely taught as groups in classrooms and provided with
similar lessons and homework. Usually, these curricula focus on key areas
of knowledge acquisition, including mathematics, reading and writing, sci-
ence, and history, often with a few additional areas included such as art, mu-
sic, physical education, and health. Frequently, development of self-regulated
learning capacities as well as social, emotional, and physical competencies
aren’t formally included, although some students may encounter outstanding
teachers or participate in extracurricular activities that foster these abilities.

As students approach postsecondary schooling, more differentiation occurs.
They can choose elective classes (although often limited by local availability),
358 | Modernizing Learning

and in some districts, school choice programs offer more diverse options such
as magnet, charter, virtual, home, and private schools. Increasingly, students
can even opt for fully online high schools, including relatively low-cost na-
tional and international programs.1 Enterprising students, as well as their
teachers and mentors, also have access to an increasing wealth of education-
al resources, which they’re exposed to at younger and younger ages, from
sources including the National Academies, Khan Academy, TED, and various
MOOCs, as well as associated resource repositories such as MERLOT, OER
Commons, and Connexions. There’s also an unprecedented volume of infor-
mal (and sometimes questionable) online resources from YouTube, Wikipedia,
and Reddit to countless other blogs, web sites, and apps.

Once students graduate from high school, they can enter the public or pri-
vate-sector workforce, seek additional vocational training, or matriculate to
higher-education institutions. Postsecondary education traditionally involves
two- and four-year degree options as well as trade and certificate programs.
Colleges and universities also frequently offer advanced degrees in the form
of graduate certificates, master’s degrees, and doctorates. While many schools
still follow traditional methods, the higher-education sector is rapidly evolv-
ing with various new choices including competency-based degrees, fully on-
line options, and hybrid programs.

Increasingly, individuals can also acquire credentials outside of a formal high-
er-education institution; for instance, intensive “bootcamps” have become
popular in fields such as software coding, project management, and cyberse-
curity. We expect this trend to continue and, in the future, we’ll see more and
more varied credentials—including experience-based credentials earned out-
side of structured programs. In other words, we anticipate that more programs
will be available to accredit individuals for their capabilities and knowledge,
regardless of whether they acquired those competencies in formal or informal
settings. This will substantially shift the way we view formal learning as well
as many related human resources processes (e.g., recruitment and promotion).

It’ll change the resumé, too, putting less emphasis on the jobs someone has
held or the degrees earned, and more on his or her demonstrated capabilities.

After individuals enter the workforce, their learning journeys continue. They
can seek vocational training and additional credentials, attend workshops and
seminars, or pursue any number of informal and self-directed learning oppor-
tunities. Some companies also offer continuing education or professional de-
velopment programs for their employees. In the U.S. alone, businesses spend
roughly $90 billion annually on corporate training (as of 2018).2 These offer-
ings range in formality. On the more formal side, there are programs such as
McDonald’s Hamburger University, the “Harvard of the fast food industry,” 3
which trains more than 7,500 students a year,4 and Starbucks helps its em-
ployees earn first-time bachelor’s degrees online through their partnership
with Arizona State University.5 Less formal programs come in many shapes
and sizes, including corporate coaching and mentorship, developmental sem-
inars, official and informal feedback, corporate e-learning and webinars, and
numerous informal learning approaches. There are abundant resources avail-
able, and individuals and organizations have a whole slew of learning and
development opportunities to choose from.

A complementary phenomenon to consider is the increased workforce “churn”
(the word economists use to refer to people switching jobs). A large, longitu-
dinal study by the Bureau of Labor Statistics found that baby boomers held
an average of 11.9 jobs between the ages of 18 and 50,6 and in another report, the
Bureau found the median tenure at a given employer, across all ages of wage
and salary workers, was only 4.2 years as of January 2018.7 Many expect to
see continued workforce churn in future years, and, increasingly, we also an-
ticipate individuals will have more careers across their lifetimes. As the pace
of global and technological change continues, jobs will increasingly morph
or become obsolete, and individuals at all levels of work will need to engage
in additional learning as they progress through their careers. In other words,
as discussed in Chapter 4, we’ll see increasing, and increasingly necessary,
continuous, lifelong learning—including ongoing up-skilling and re-skilling
for workers.

Like the private sector, the public sector and military workforce face similar
opportunities and challenges. In general, the same informal learning opportu-
nities exist for these special populations. Agencies throughout the U.S. Gov-
ernment offer wide-ranging learning and development programs, covering
the full gamut of formality. For example, the Office of Personnel Management
hosts the Federal Executive Institute that provides training in strategic devel-
opment for senior executives. The National Park Service provides access to a
wide range of personal learning opportunities through its internal Common
Learning Portal, and the Department of State uses its Virtual Student Feder-
al Service program to provide on-the-job experiential learning opportunities
to students around the country. But the U.S. Department of Defense is most
notable among these agencies. It’s been considered the “greatest training or-
ganization of all time” 8 and invests more funds in innovating education and
training for its workforce than any organization in history, with the bulk of
these efforts focused on programs for its military personnel.

The DoD conducts formal individual, collective, and staff programs, and it
actively encourages mentorship, peer-to-peer learning, and self-development.
It employs the spectrum of learning modalities including in-resident and com-
puter-aided instruction, simulation-based and embedded training, m-learning,
augmented and virtual reality, and hands-on experiential learning. The DoD
also has strict education and training requirements tied to assignment and
promotion, and particularly for key accession points, it employs several stan-
dardized tests, such as the Armed Services Vocational Aptitude Battery and the
Tailored Adaptive Personality Assessment System.

Unlike the private sector, service members generally have fairly constrained
entry- and exit-points into the military workforce, and almost always, individ-
uals separate from active duty military service before they fully retire from all
work. Once service members separate from the military, they may return to
the DoD or federal government as a civilian or contractor or seek employment
in another sector. The latter often requires some retraining as well as careful
translation of military-centric capabilities into private-sector roles.9

BUILDING TOMORROW’S
LEARNING JOURNEY
Never before have so many high-quality opportunities for learning existed.
Yet, tomorrow’s learning environment will be even more advanced as infor-
mation and communication technologies, automation, and innovation contin-
ue to change how we interact, behave, and learn. We have great momentum,
but how do we optimize this future system? Towards that end, we’ve inte-
grated a set of 10 near-term strategic recommendations for the wider future
learning ecosystem—drawn from across this book.

1. Bridge existing silos

Public and private school enrollments in the U.S. have steadily risen over
the preceding decades.10 The education, training, and talent development in-
dustries are similarly expanding along with corresponding increases in both
not-for-profit and open-access resources. However, many of these expansions
are happening in isolation. For example, learner records are typically housed
in stovepiped data silos. Someone might spend 13 years in school as a child
and then graduate with a high school diploma and a transcript with letter
grades. Any additional specialties, sub-competencies, extracurricular activi-
ties, or other insights are usually absent from this documentation. The same is
true of the university or vocational school outputs, and, typically, for previous
work experiences, which may be documented (say, on a resumé) but are rarely
assimilated as meaningful data. A similar story happens throughout service
members’ and civil servants’ careers—there’s a lack of robust data as well as
a lack of permeability of data between formal and informal learning, and
among academic, business, and government institutions.

The future learning ecosystem will enable an environment where the different
tools, technologies, and systems a person encounters can communicate data
about his or her performance and the contributions of different activities to
it. Key to this vision, the various systems will need to interoperate, collect
and share meaningful data, and use that compiled information to promote
tailored instruction. In other words, we’ll need greater interoperability
across learning systems and, correspondingly, greater portability of
learning-related data. Part of the change will also likely involve creating
systems of learner-owned and managed data that use metadata to ensure
authenticity, respect learners’ privacy needs, and broker across different
systems. This will require a unique set of capabilities to accommodate
security, privacy, architecture, and content, and it will place demands on the
development, deployment, employment, and assessment of learning systems. This
technological architecture for learning forms the essential backbone of the
future learning ecosystem—connectivity across time and space makes the entire
vision possible—hence why interoperability, data specifications, and
learner-centric universal profiles top our recommendations list.

A universal learning profile will act as an external repository where
individuals can hold their data and share it as desired to drive educational
choices, personalization, employment eligibility, and personal growth.

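To make the data-portability idea concrete, consider how a single learning
activity might be captured as a shareable record. The sketch below is
illustrative only: it loosely follows the actor, verb, and object structure of
the ADL Initiative’s Experience API (xAPI) specification, but the helper
function, learner email, and activity URL are hypothetical simplifications
rather than a definitive implementation.

    import json
    from datetime import datetime, timezone

    def build_activity_statement(learner_email, verb, activity_id,
                                 activity_name, score=None):
        """Assemble a minimal, xAPI-style actor-verb-object record that
        describes one learning activity in a form other systems can consume."""
        statement = {
            "actor": {"objectType": "Agent", "mbox": f"mailto:{learner_email}"},
            "verb": {"id": f"http://adlnet.gov/expapi/verbs/{verb}",
                     "display": {"en-US": verb}},
            "object": {"id": activity_id,
                       "definition": {"name": {"en-US": activity_name}}},
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        if score is not None:
            # xAPI expresses scaled scores on a -1.0 to 1.0 range.
            statement["result"] = {"score": {"scaled": score}}
        return statement

    # Hypothetical example: a learner finishes an informal webinar.
    record = build_activity_statement(
        learner_email="pat.learner@example.com",
        verb="completed",
        activity_id="https://example.com/activities/data-literacy-webinar",
        activity_name="Data Literacy Webinar",
        score=0.92,
    )
    print(json.dumps(record, indent=2))

Because each record is self-describing, any system that understands the shared
vocabulary can consume it, which is the kind of interoperability this
recommendation calls for.
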
2. Foster full-spectrum competencies

Increasingly, schools and employers are recognizing the impact of social,
emotional, metacognitive (self-regulatory), and physical development. While
these competencies have always been important, there’s greater recognition of
their impact on lifelong functioning and, as a result, greater motivation to
more actively and intentionally support their development. Developing these
“full-spectrum” competencies must begin at the earliest opportunities and
should also continue throughout our lives, because as individuals grow they’ll
encounter new challenges that will continue to test their holistic
capabilities and require personal strategies to navigate effectively.

[Figure: Ten near-term strategic recommendations]

1. Bridge existing silos
• Enable system interoperability and data sharing
• Develop learner-owned universal learner profiles
• Research questions of security, privacy, architecture, and content-sharing

2. Foster full-spectrum competencies
• Integrate social, emotional, metacognitive, and physical development
• Apply asset models (versus norm-referenced developmental models)
• Use personalized interventions across the developmental dimensions

3. Reveal and enable informal learning
• Acknowledge and integrate informal learning
• Foster individuals’ self-regulated learning capabilities
• Make it easier for groups to engage in social learning

4. Improve assessment
• Limit high-stakes summative assessments, particularly in K–12
• Integrate more formative, portfolio-based, and experiential assessments
• Make assessment data and feedback visible to learners

5. Up-skill and empower learning professionals
• Help learning professionals develop the new capabilities they need
• Reevaluate the organization of learning professionals; focus on teams
• Define and support the development of learning engineers

6. Plan for integration across learning and personnel functions
• More tightly integrate training and education with talent management
• Update organizational systems to better accommodate informal learning
• Consider up-skilling and re-skilling programs

7. Facilitate a mindset shift
• From cognitive and teacher-centric to holistic and learner-centric systems
• From linear and time-based to personalized and nonlinear
• From isolated to more interconnected learning systems

8. Enable learning at scale, technologically and methodologically
• Build extensible, open-architecture components
• Research methods that support interconnected, lifelong learning
• Consider changes across the social and organizational structures

9. Design for convenience and equity of access
• Make UI/UX considerations paramount
• Ensure all learners have sufficient connectivity and access to technology
• Carefully consider the social implications of the learning ecosystem

10. Ensure laws, policies, and governance keep pace
• Evaluate (and update) formal laws and policies
• Encourage participation in cross-cutting professional organizations
• Develop cross-cutting working groups and governance processes

Beginning in the foundational years (kindergarten through 8th grade), it’s
necessary to widen the curriculum focus to include social, emotional, metacog-
nitive, and physical development as part of formal education. Creating ob-
jectives for teachers in these areas provides the policy justification they need
to spend time in the classroom explicitly focused on developing the whole
student. Inclusion of these competencies, however, necessitates a shift to an
asset model of growth, which places more emphasis on what students can do
currently and what they need to learn next—as opposed to focusing on areas
where improvement is needed to meet norm-referenced or “typical develop-
ment” milestones. This shift from achievement-orientation to growth-orien-
tation can also improve motivation to learn and promote lifelong interest in
self-driven learning.

In the secondary and post-secondary institutions, we should continue to inte-
grate these competencies into the more traditional curricula, while still recog-
nizing that individuals grow and mature at different rates. In other words, the
asset-model of development will need to continue into early-adult and adult
education, which will create a greater need for personalized education as the
range of individuals’ potential capabilities widens.

In workplace settings, too, we anticipate employers will increasingly value
and seek to hire for “full-spectrum” competencies, hence developing and
measuring them throughout adulthood will grow ever more critical. However,
developing these competencies and assessing their current levels within each
person is challenging, particularly in the less-controlled post-secondary and
workforce settings. Consequently, it’ll be necessary to better leverage and
be able to measure the impacts of informal and nonformal learning on these
outcomes. Data that meaningfully capture these experiences, as well as the in-
terests and noncognitive abilities (social, emotional, and physical) individuals
demonstrate in these settings, can help inform these assessments as well as
drive future learning and development opportunities, encourage learners’ mo-
tivation for self-regulatory learning, and help stitch together learning across
separate episodes. Among other things, this will also require “lifting the veil”
between workplace and learning-place settings, allowing closer integration
between learning and performance (or operational) venues.

3. Reveal and enable informal learning

The future learning ecosystem moves us toward a holistic approach
to learning that connects the various “above ground” and
“below ground” learning structures, processes, and systems.

In learning and development circles, there’s a popular notion called the
70:20:10 model.11 It estimates that around 70% of learning is informal or on-
the-job, about 20% involves peer and social learning, and only about 10% is
formal training and education. While this model is merely a general concept,
not a firm quantitative rule, it helps underscore the importance of surfacing
informal learning—i.e., the 90% of learning that occurs outside of formal
settings. Informal learning is pervasive and interwoven into the fabric of our
professional, academic, and personal lives, and we must be able to reveal and
understand this complex set of behaviors to achieve the goal of holistic, life-
long learning.

As we progress to a more chaotic and data-saturated world, self-regulated
learning skills, or the ability to monitor and motivate oneself in learning,
will become even more important. Accordingly, learning in the future will rely
not only on one’s ability to learn provided material but also on the ability
to seek out new information, determine its accuracy and relevance, and
assimilate it in a manner accessible and translatable to the real world. We
will need to educate and empower people to distinguish accurate from falsified
data, manage data saturation and information overload, and cultivate
persistent energy for lifelong learning. However, individuals’ abilities to
engage in effective informal learning vary. Hence, it’s important to foster
individuals’ self-regulation capabilities and facilitate their active
engagement in self-directed learning, for instance, by providing access to
resources, making learning content more easily “findable” (such as via
metadata), or encouraging learning through personalized prompts.

When you take away the performance aspect, people act differently. In
“practice” settings they’re freer to make mistakes without someone reviewing
and judging them. When they’re being assessed, though, people bring to bear a
different mindset and focus. If we replace more traditional assessment with
“stealth” assessment, will we introduce a paradigm that’s counter to a growth
mindset and to how learning happens best? If they have to be always “on,” that
could be a really challenging dynamic for our learners.

Michelle Barrett, Ph.D.
Vice President of Research Technology, Data Science, and Analytics, ACT
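
As a small illustration of the “findable via metadata” point above, the sketch
below tags learning resources with simple metadata and filters them on demand.
The catalog, tag vocabulary, and helper function are invented for this example;
real implementations would draw on established metadata standards.

    from dataclasses import dataclass, field

    @dataclass
    class LearningResource:
        """A piece of learning content described by searchable metadata."""
        title: str
        url: str
        format: str                    # e.g., "video", "article", "podcast"
        topics: set = field(default_factory=set)
        minutes: int = 0               # estimated time commitment

    # A hypothetical catalog of informal learning content.
    catalog = [
        LearningResource("Intro to Data Visualization", "https://example.com/r1",
                         "video", {"data literacy", "visualization"}, 12),
        LearningResource("Giving Better Peer Feedback", "https://example.com/r2",
                         "article", {"feedback", "social learning"}, 8),
        LearningResource("Metacognition on the Job", "https://example.com/r3",
                         "podcast", {"self-regulated learning"}, 25),
    ]

    def find_resources(catalog, topic, max_minutes=None):
        """Return resources tagged with a topic, optionally capped by length."""
        hits = [r for r in catalog if topic in r.topics]
        if max_minutes is not None:
            hits = [r for r in hits if r.minutes <= max_minutes]
        return hits

    # Surface short data-literacy resources for a learner's spare 15 minutes.
    for resource in find_resources(catalog, "data literacy", max_minutes=15):
        print(resource.title, "-", resource.url)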

Of the various informal learning modalities, social learning seems
particularly pertinent, as well as practically supportable. It’s all about collabora-
tion—bringing in some of the social aspects that enable individuals to share
and learn from each other. Enabling collaboration, the sharing of information,
and co-creation of ideas are important, whether in professional or academic
settings, and not all interactions need to be formally organized. Side conver-
sations at the water cooler and informal feedback have surprisingly significant
impacts on how work is accomplished.

Already, many commercial vendors are developing solutions to support and
integrate such informal opportunities. The learning and development industry
also seeks to gather related analytics that can help provide learners with per-
sonal, relevant, and engaging social learning experiences. Nonetheless, many
research questions remain that the scientific community still needs to address,
including maturing our understanding of how to develop individuals’ self-reg-
ulated learning capabilities, how to best support informal learning in applied
organizational contexts, and how to quantify the range of formal-to-informal
learning.

4. Improve assessment

Assessments, along with the learning data they generate and evaluations they
enable, play foundational roles in training and education. In the future, with
the increased emphasis on personalization and data-driven systems, assess-
ments will only grow in importance. However, the nature of the assessments
will change.

At the K–12 level, the number and use of standardized assessments currently
pose several challenges to learning. Today, in the U.S. system, students are
required to take numerous standardized tests; the results from these are used
to identify struggling students or, in aggregate, to uncover underperforming
school systems. In both cases, the assessments serve as accountability devic-
es. Once a child or school is shown wanting, more assessments are used to
focus and monitor their remediation. While this sounds logical, in practice,
time spent on such detailed work can be emotionally and cognitively taxing
as well as a drain on overall learning time. Emphasis on such high-stakes
summative testing has been shown to shift the focus away from true learning
and instead to encourage superficial “teaching to the test”—tests that
typically emphasize cognitive abilities to the exclusion of the
“full-spectrum” competencies described above.12

The problem is, with all the assessments we have to bombard students with, we
don’t have time to do the project-based work. That, in and of itself, is
defeating. Teachers have the lessons, but they don’t have the time to develop
them with the students because of all the testing.

Sandra Maldonado-Ross
President, Seminole Education Association (Florida)

In the future, assessments for secondary and postsecondary education need to
focus on feedback and “feed forward” support—across all
developmental dimensions. To the
extent possible, and with balance so
that individuals aren’t continuously
being monitored, a rhythm of au-
tomatic assessments in forms more
closely resembling integrated for-
mative assessments, stealth assess-
ments, portfolio evaluations, and
experiential trials should be considered. Hence, significant attention needs to
be paid to understanding new ways to prove capabilities beyond the current
forms of assessments, articulation of grades, or standardized testing meth-
ods. Our concepts of assessment also need to expand in scope. For instance,
assessments of large-scale outcomes, such as mission success and task ac-
complishment, can provide significant, valid data for determining competen-
cy. Such organizational assessments, however, must be linked to the learning
institutions that can foster such performance or respond to gaps in it. We can
no longer consider assessments of learning and performance as sequential,
causal, chronological occurrences; rather, both become inextricably linked
and interdependent.

The relationship between assessment, feedback, and self-driven learning is
particularly meaningful to consider. To better enable self-regulated learning,
individuals, groups, and organizations need access to their data, both at the
discrete level (e.g., data from one assessment) and in aggregate (e.g., across
learning arcs). Such data can inform better learning and development choic-
es—but they don’t guarantee it. Data alone aren’t enough; data need to be
presented in ways that support decision-making. However, large volumes of
data can create complexity and overload, making them indigestible and less
useful, and “noise” can enter the system, reducing the clarity or true meaning
of the data. Hence, tools are needed to help individuals turn data into insights
and actions. Accordingly, big data analyses and accompanying visualizations
are needed to help learners, learning facilitators, and learning organizations
navigate modern learning systems.
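
As a toy illustration of presenting data “in ways that support
decision-making,” the sketch below collapses raw assessment records into a
per-competency average and trend that a learner or instructor could act on.
The record format, competency names, and scoring scale are assumptions made
for the example.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical assessment records accumulated across learning episodes.
    records = [
        {"competency": "written communication", "score": 0.72, "week": 1},
        {"competency": "written communication", "score": 0.81, "week": 6},
        {"competency": "collaboration", "score": 0.65, "week": 1},
        {"competency": "collaboration", "score": 0.64, "week": 6},
    ]

    def summarize(records):
        """Reduce raw scores to a per-competency average and simple trend,
        so the data inform decisions rather than overwhelm the learner."""
        by_competency = defaultdict(list)
        for r in records:
            by_competency[r["competency"]].append((r["week"], r["score"]))
        summary = {}
        for competency, points in by_competency.items():
            points.sort()                      # order observations by week
            scores = [score for _, score in points]
            summary[competency] = {
                "average": round(mean(scores), 2),
                "trend": round(scores[-1] - scores[0], 2),  # first-to-last change
            }
        return summary

    for competency, stats in summarize(records).items():
        direction = "improving" if stats["trend"] > 0 else "flat or declining"
        print(f"{competency}: average {stats['average']} ({direction})")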

5. Up-skill and empower learning professionals

As learning contexts evolve, so too do the roles and requirements for learning
professionals within them, notably teachers, trainers, educational technolo-
gists, and instructional designers. The speed of progress in this sector means
they’ll need to learn continuously—keeping abreast of the latest research,
technologies, and regulations. Ongoing professional development to re-skill
and up-skill learning professionals, using formal and informal methods across
diverse media, will be critical.

Learning professionals will also need new teamwork skills. Historically,
someone could be a great teacher in an isolated classroom, without requiring
the support of other learning professionals. In the future, teams of special-
ists—each with unique areas of expertise—will be required. Pedagogues will
need to work with data scientists; AI developers will need to collaborate with
media designers; and human resource specialists will need to coordinate with
education and training leaders. At a personal level, individuals will need to
develop appropriate collaborative skills, and organizationally, new adminis-
trative structures may be needed. For instance, instead of assigning a single
teacher to design, develop, and implement a course, a composite team may be
required. Some of these individuals may reside within a centralized “pool” of
shared talent (say, for the data analysts), while others may be dedicated to the
given program (for instance, the main teacher). Holistic solutions will include
cross-training talent management professionals, institutional administrators,
and operational supervisors—all of the 70:20:10 components of learning.

The future learning ecosystem will likely be a highly technical and collab-
orative environment supporting both micro- and macro-level instructional
strategies, maybe even leveraging the “in-between” learning experiences and
events—between classes, courses, and life events—to adapt to learners’ inter-
ests, needs, prior knowledge, and resources. As we begin to look at learning
across the lifetime, leveraging big learning data and new learning strategies,
learning professionals will need new knowledge and skills. This has driven
efforts to define the concept of a learning engineer (see Chapter 16), to close
the gaps between technology and instructional design, and between isolated
instructional events and larger-scale learning systems. We’ll need new con-
ceptual models that define learning engineering: its professional practices,
certification and skills, professional development processes, and integration
into teams and organizations.

6. Plan for integration across learning and personnel functions

The future learning ecosystem vision views learning as an integral and ongoing aspect of life, woven throughout work and personal contexts. This has unique implications for employers, who will no doubt leverage it as a learning and performance ecosystem that "enhances individual and organizational effectiveness by connecting people and supporting them with a broad range of content, processes, and technologies to drive performance."13 In other words, we imagine employers will seek to leverage it for their talent management, performance support, knowledge management, access to experts, social networking and collaboration, and structured learning functions.

I just did a survey on issues for teachers, asking them what their biggest issues in the classrooms are. Four major issues were found in the survey.

First, academic freedom doesn't exist anymore: "Learn as you live is gone."

Some of the other big issues were about the evaluations. They pressure teachers and administrators. In any other job, you're evaluated on what's seen, but people don't go into surgeries and second-guess everything the surgeon does. That doesn't happen in a regular job; they don't get an evaluation that nitpicks every possible thing they do to ensure it fits into the rules of what they're told is important. The administrators, many of them don't like how strenuous and stressful the process is.

The third was stress in the classroom and stress in the working conditions. If there isn't strong contract language, then when there's an issue it's tough to fix.

Finally, there are professional development trainings. Education is changing so quickly, but how are you expecting me to go to professional development training on top of 60 hours of working? If you don't go to the trainings, you can't learn what new stuff is available, but if you do go, then you're missing the classroom work.

Sue Carson
President, Seminole Education Association (Florida)

From these components, organizations can craft an infinite number of dynamic solutions for developing and employing individuals, and for optimizing their institutions, writ large. For instance, organizations will be able to better select and place individuals, shifting away from gross measures of someone's capacity (such as a degree title) and towards competency composites. Psychological and behavioral analytical data will aid developmental recommendations, identification of talent, and connection across employers and educational experiences. Those same data can be used to improve task assignments or encourage higher rates of retention.
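
As a toy sketch of what selecting on competency composites (rather than degree titles) could look like, the snippet below scores candidate profiles against a role's required proficiency levels. The competency names, the 0-to-1 scale, and the scoring rule are all invented for illustration; they are not part of any cited model:

```python
# Invented role requirements and candidate profiles on an assumed 0-1 scale.
role_requirements = {"data-analysis": 0.7, "instruction": 0.5, "collaboration": 0.6}

candidates = {
    "avery": {"data-analysis": 0.9, "instruction": 0.4, "collaboration": 0.8},
    "blake": {"data-analysis": 0.6, "instruction": 0.7, "collaboration": 0.7},
}

def fit(profile, requirements):
    """Average how fully each required competency is met (capped at 1.0),
    so exceeding one requirement can't mask a shortfall in another."""
    met = [min(profile.get(c, 0.0) / level, 1.0) for c, level in requirements.items()]
    return sum(met) / len(met)

ranked = sorted(candidates, key=lambda n: fit(candidates[n], role_requirements), reverse=True)
print([(n, round(fit(candidates[n], role_requirements), 2)) for n in ranked])
# [('blake', 0.95), ('avery', 0.93)]
```

In practice such profiles would rest on validated assessments and far richer evidence; the point is the shift in selection logic, from credentials as proxies to demonstrated competencies matched against requirements.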

Organizational processes will, therefore, need to evolve to support greater multidirectional integration among training, education, human resources, and talent management systems. The Federal Human Capital Business Reference Model could be used as a guide. This model was developed jointly as a public-private partnership mixing human resources, policy, and industry experts to create a streamlined and simplified HR system. The model denotes functions, sub-functions, authority, and policy. It also clarifies the Human Capital Management lifecycle government-wide. Ultimately, this model directly informs how HR practitioners plan for, work with, and organize people, policy, process, service delivery, and data categorization and reporting.14

In the future, we expect to see greater churn across roles, companies, and careers. As workers increasingly value flexibility, fluid work/life structures, and personal experiences, we may also see more "gig economy" careers, where individuals or teams are available for project-based work or consulting services but don't work directly for a single company. Correspondingly, there may need to be greater permeability across the workforce, encouraging people to move into and out of formal learning, full-time jobs, and personal developmental experiences. Continuous up-skilling and re-skilling of the workforce will become paramount. Taken together, this implies individuals' competencies will likely need to constantly evolve, meaning someone skilled at learning will be highly prized, and organizations will need to better accommodate a variety of lifelong learning mechanisms, notably nonlinear, informal, and nonformal learning options. We may also need new social paradigms, for example, for things like re-employment insurance that can be used to fill the gaps between careers.

If we're going to align learning with employer needs, we need to deal with job descriptions and postings, and how they're organized on the web. Advancements in data standards now allow us to create structured, dynamic data on the web. Thus our goal is to: 1. Extend and improve data schemas for jobs, and 2. Link it to the web semantically. Structured, linked data maximizes our ability to search, discover, and compare data about jobs, and to notify anyone instantly when a job has changed and in what way. By organizing jobs-data in this way, we can create an entirely new labor market information system, directly from the hiring systems employers use.

Jason Tyszko
VP, Center for Education and Workforce, U.S. Chamber of Commerce
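
To make the structured jobs data described above concrete, the sketch below marks up a posting using schema.org's JobPosting vocabulary. The vocabulary itself is real; the particular values, and the minimal set of properties shown, are illustrative assumptions:

```python
import json

# A minimal, illustrative JSON-LD job posting; all values are invented.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Learning Data Analyst",
    "datePosted": "2019-04-01",
    "validThrough": "2019-06-30",
    "hiringOrganization": {"@type": "Organization", "name": "Example Agency"},
    "jobLocation": {"@type": "Place", "address": "Washington, DC"},
    # Structured skill data is what makes postings searchable and comparable.
    "skills": ["data visualization", "instructional design", "statistics"],
}

# Embedded in a web page as <script type="application/ld+json">, this markup
# lets crawlers discover the posting and detect when and how it changes.
print(json.dumps(job_posting, indent=2))
```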

7. Facilitate a mindset shift

The success of the future learning ecosystem concept is, in large part, predicated on culture change. A significant mindset shift will need to accompany any advancement from the Industrial Age of learning towards the future learning ecosystem vision. Incremental changes or mere additions to the current system won't suffice; stakeholders, which includes nearly everyone in our society, need to willingly embrace the new paradigm.

We need to change how we typically perceive education and training. Very broadly speaking, today's systems tend to emphasize formal learning. Individuals largely progress through prescriptive and similar paths, most commonly based on time factors. Courses (including technology-based offerings) are often authority- or instructor-centric; learners' roles are to receive the experts' knowledge and then engage in predefined practice activities. In the future, we must be willing to embrace more flexible and personalized, outcome-focused learning that happens across diverse places, times, and modalities.

I think the challenge for the Department of Defense is that we're using the same outlooks and perspectives that we've been using since 1947. The Department has to change its thinking. My own personal answer to that is borne out through my work on the Force of the Future reports; the only way you're going to change how the Department thinks is to bring in people who look different. It's fundamental; we need to increase the intellectual diversity of the Department.

Morgan Plummer
Director, MD5 National Security Technology Accelerator
U.S. Department of Defense

We also need to shift our perceptions of teachers and trainers, from the sources of learning to its facilitators, and in turn, more genuinely emphasize learner-centric methods. For instance, beginning with primary education, this mindset shift might mean transforming formal education spaces from places where students receive information to places where they co-create it. In the secondary school years, students might take greater control of their own learning journeys, which may mean more lenient rules for required courses and greater encouragement for self-directed learning.

Closely related, we'll need to embrace mastery learning and nonlinear, tailored learning. While such concepts have been touted for decades, most systems—whether formal schools or workplace development programs—still tend to emphasize time factors and minimum achievement standards. To move ahead, we'll need to let go of the idea of "minimally acceptable" as an advancement criterion. Similarly, we'll need to allow more flexibility in systems, moving away from massed education and training approaches, with predefined linear curricula, and instead towards more nonlinear, personalized trajectories.
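
To sharpen the contrast with time-based advancement, here is a deliberately tiny sketch of a mastery-based advancement check; the 0.8 threshold, competency names, and evidence values are invented for illustration:

```python
# Toy mastery-based (rather than seat-time) advancement rule; the threshold
# and evidence format are assumptions, not a validated standard.
MASTERY_THRESHOLD = 0.8

def ready_to_advance(evidence, required_competencies):
    """Advance once every required competency shows mastery, regardless of
    when or where the evidence was earned (courses, work, self-study)."""
    return all(
        evidence.get(competency, 0.0) >= MASTERY_THRESHOLD
        for competency in required_competencies
    )

evidence = {"algebra": 0.92, "statistics": 0.85, "writing": 0.60}
print(ready_to_advance(evidence, ["algebra", "statistics"]))  # True
print(ready_to_advance(evidence, ["algebra", "writing"]))     # False
```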

Finally, we'll need to change how we approach the "ownership" of learning. Today, we have separate silos of learning and separate "owners" of those silos—who usually also claim ownership of the data within them. The future learning ecosystem represents a dramatic shift away from a single organization trying to meet all education and training needs with a top-down design. In the future, separate entities will need to negotiate within a shared "marketplace" of learning that has no single owner, leverages the power of self-discovery by placing tools into learners' hands, and relies on integration across a system-of-systems. Without careful plans, we may find increased separation and artificial "walls" between segments of the ecosystem, as different commercial vendors or learning institutions try to sell proprietary solutions or promote systems that are intentionally cumbersome to export data from or move away from—so-called "vendor lock." Changing mindsets (and incentives) to embrace this new model may prove challenging, both for individuals and organizations.

8. Enable learning at scale, technologically and methodologically

When I got a job here, I wanted to change people's minds about math. I could do it in the dozens or maybe hundreds. Now, educators are more empowered because it's not just who they can teach in the classroom—now they can reach thousands.

Ralucca Gera, Ph.D.
Associate Provost for Graduate Education and Professor of Mathematics
Naval Postgraduate School

To make the learning ecosystem practical, particularly from a resource perspective, it will need to support a large number of learners and organizations. Similarly, it will need to be future-proofed: designed to meet today's requirements but also with a structure that can evolve to meet future needs and advancements. Technologically, such a complex system can't be built from a single blueprint; it can only be achieved through an open systems architecture approach. This necessarily emphasizes interoperability, modular designs, common technical specifications, shared data standards, and negotiated data rights, as well as extensibility in all components to help grow the system over time. A long-term strategy, including broad community coordination for training and education technologies, data and metadata policies, and collective technical governance, is needed. Paradoxically, this learning at scale will accompany an increase in the intimacy of learning, as the same technology that enables access also increasingly supports the "mass customization" of tailored, personal experiences.
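
One way to picture the open systems architecture idea is a shared, minimal contract that independently built modules implement. Everything in the sketch below (the interface, method names, and example modules) is a simplifying assumption rather than a reference design:

```python
from typing import Protocol

class LearningService(Protocol):
    """Assumed common contract: any component can join the ecosystem by
    describing itself and emitting records in a shared format."""
    def describe(self) -> str: ...
    def emit_record(self) -> dict: ...

class SimulatorModule:
    def describe(self) -> str:
        return "VR triage simulator"
    def emit_record(self) -> dict:
        return {"source": "simulator", "activity": "triage", "score": 0.8}

class MobileCourseModule:
    def describe(self) -> str:
        return "mobile microlearning course"
    def emit_record(self) -> dict:
        return {"source": "mobile", "activity": "safety-brief", "completed": True}

# Components from different vendors interoperate because they honor the same
# minimal contract; new module types plug in without redesigning the system.
services: list[LearningService] = [SimulatorModule(), MobileCourseModule()]
for service in services:
    print(service.describe(), "->", service.emit_record())
```

The extensibility the text calls for falls out of this design choice: the system grows by adding modules, not by rebuilding the blueprint.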

Similarly, various social and organizational structures will need to be reconceptualized, as these changes in learning will create wide-ranging impacts, from the way our K–12 schools function to the nature of work across society. For instance, the timing for students attending school may change; the movement into and out of employment organizations may change and grow in frequency. Widening the aperture and access to learning might also change the nature of traditional trade schools and colleges, and it will likely create new markets for different educational experiences. For example, new entities may appear in this learning market to provide "credit" to "students" who take part in experiences ranging from rock climbing and travel excursions to bootcamps and competency-based micro-degrees.

From a methodological standpoint, there's an imperative to conduct deliberate investigations across the range of new learning approaches, including those outside the dominant paradigm. For example, how do contextual factors, including culture, social context, instruction, and time of life, impact learning?15 How does technology influence the psychology of learners, and what are the design requirements for nonlinear lifelong learning? How can learners aggregate and make sense of learning across multiple experiences, while minimizing cognitive friction? There's much to consider. Our models of learning and teaching must evolve, both in theory and practice, and be translated into reference sets, use cases, and other formal representations to inform the design and delivery of learning at scale. To support that, however, we first need the technological backbone and the mindset shift. These questions can't be answered within the current system, because it impedes access to, integration of, and sufficient measurement of learning—all of which are necessary for addressing them. Thus, the only option is to create a system-of-systems approach that supports its own continuous evolution.

Once realized, the technological architecture doesn’t just allow for improved
access to status-quo learning opportunities—it creates an entirely new capa-
bility. Metaphorically, consider the components of a car (the steering wheel,
tires, pistons, and so on); separately, they’re functional objects, but when con-
nected together, they can produce an entirely new capability—transportation.
Similarly, the future learning ecosystem, by the aggregate nature of the systems
that comprise it, will create unimaginable new capacities, more than merely
the sum of its parts or the incremental expansion of today’s learning paradigm.

9. Design for convenience and equity of access

Usability is often the limiting factor of technical systems. No matter how brilliant a new application or hardware solution, if real people in real-world contexts can't use it, it won't achieve its goals. At this most obvious level, this means system usability—across its various user interfaces and user experiences—plays a major role in its success. It's necessary to put a focus on UI/UX, making all aspects of the system as intuitive, modern, and effective as possible, to increase adoption and facilitate its customization to unique requirements across the broad stakeholder community.

There are 417 national parks and monuments. We're all over the globe. We're in remote locations. You can lose your cell service. It says on our website that we had 340,000 volunteers in 2016. How do we train all those people? We don't have any subject-matter experts in our Washington office—the expertise is out in the parks. How do we get that knowledge into our system and then out to the workforce?

We created the Common Learning Portal. It's a web portal—a marketplace for training. It opens in April 2019. The government's cybersecurity processes (FedRAMP) kept us from opening the doors sooner; it's been in pilot-project mode, but operational, for two years now. It provides a comprehensive learning performance ecosystem, a holistic view of learning. The system enables us to put information, people, and other learning resources in places where people can find them, even on a mobile device. We hope our personnel and volunteers who have been out in the field for work can go back to their offices and do their training, which they have to do at the beginning of every cycle. Already, we have over 500,000 page views and 4,000 registered users—without even formally launching. It's caught on by word of mouth. Some trainers got excited. We had support from leadership and people. It was a grassroots effort.

So that's where we are going tomorrow. We're not throwing away formal learning, but we're trying to pull in performance support, microlearning, and things that allow us to better do our jobs.

Courtesy of Dale Carpenter
Superintendent (Acting), National Park Service

Similarly, issues of network connectivity and technical access are equally im-
portant and extend beyond technology, touching on social and societal consid-
erations. Access issues already limit educational opportunities for children in
many rural or underserved areas. As more of our learning becomes digitized
and networked, we must carefully ensure equity in access to it—not only for
ethical reasons but also to maximize the diverse capabilities of society and
enable all to realize their unique potentials. If not, we risk widening the edu-
cation gap, creating greater disparity in access to quality education and train-
ing, and potentially creating a bifurcation between the “haves” and the “have-
nots.” In other words, we could inadvertently build a divide between those
with access to open, unstructured junk information versus those with access
to higher-fidelity, semi-automated methods of transmitting quality knowledge
within and across communities.

The advent of the learning ecosystem could affect populations, workforces, wealth distribution, and other social factors. Unless and until the ability to adapt to the pace of change becomes irrelevant, adaptation will grow ever more important, and it will rest squarely upon the ability to learn. We must carefully consider not only the social implications of establishing the learning ecosystem but also its impact on those unable to fully benefit from it. We have an ethical imperative to consider if, and how, access to learning is protected and enabled throughout society—and perhaps even across the globe. If managed effectively, however, "education for humanity" becomes a real possibility, bringing knowledge and elevating the capabilities of our entire world.

10. Ensure laws, policies, and governance keep pace

Holistic solutions will demand holistic governance, as well as new laws and policies. These may span broad areas of consideration, from technical frameworks and interoperability standards to content and data exchange processes, and equity, ethics, and fairness of use. Below are a few considerations, but this discussion will require a much more extensive treatment, as well as interorganizational coordination, to fully outline.

Starting with K–12 education, new policies and processes are needed in several areas. For example, moving to competency-based learning methods will be key for fostering a learner-centered, development-oriented system. In the U.S., the Common Core Standards, in theory, allow for a similar kind of coordination. In practice, however, these standards have become inflexible requirements added to already overloaded schedules. Transitioning to a competency-based model would better allow teachers to personalize learning and let students earn credit for knowledge acquired outside the classroom. Teachers will need policy to support them in exercising the academic freedom required by this model, to be able to adjust content and methods to meet each student's unique developmental needs. Additionally, as indicated above, competency goals will need to be augmented to incorporate social, emotional, metacognitive, and physical elements for "whole person" development. Further, the Every Student Succeeds Act focuses on providing funds to schools that use evidence-based practices, yet teachers and administrators rarely receive formal training on research design and statistics. Expanding existing governmental programs (e.g., the Education Innovation Programs, U.S. Department of Education) that aid in closing this research-practice gap will help optimize learning and can also support the up-skilling and re-skilling of learning professionals (as described in Recommendation 5, above).

For post-secondary and workforce education, policies must address key cross-community issues, including funding allocation, data sharing, and data usage rights. The diversity of learning venues obviates any unitary solution, but certain characteristics will be common, such as privacy, continual assessment, and security.

The big thing here, that everyone's dealing with, is the challenge of: How do you transition into a performance organization? How do you support an organization trying to become a performance-based one, and what are the other things that wrap around that structure? For example, talent management systems are important, but the way HR people manage now is mainly through "box-checking," like the things that you're required to do for mandatory training. We'll have to rework so many things—HR, the compliance stuff, and assignment decisions… Conceptually, though, it's always been the same thing: How do you choose the right people?

Michael Freeman
Consultant, Training and Learning Technologies

As we progress to a digitized nation, we will need to ensure ethical use of learning data, both for students and employees. As such, policy must be written that ensures individuals can own their own data, with updated laws designed to protect them in current and evolving contexts. Existing laws, such as the Family Educational Rights and Privacy Act, provide some measure of protection, but weren't designed for the kinds of data-rich, technology-enabled learning that's emerging. They also tend to focus on the education "silo" rather than a lifelong perspective. A balance across stakeholders, notably between the public and private sectors, will be needed, particularly given the commercial value of data as a sharable resource. Such laws and policies need to be reconsidered for the future context and designed in a way that balances privacy with functionality.

We know we have a shortage of talent (human capital) for certain positions, but we can't just "up" our recruiting tactics… We've got to transform how we put them into that pipeline. The big thing I want is to integrate government with industry and universities—to formalize partnerships sooner, so we can get smarter about what people need to be ready to participate in our workforce. Right now, the people we need aren't coming out of the universities; so, we've got incentive to work together. We could start small with a representative government agency that could be our champion and also couch the program as a college internship or co-op. We have to figure this problem out—we have to grow our workforce.

Anne Little, Ph.D.
Vice President, Training Solutions Development, SAIC

New policy considerations will likely also concern accreditation standards and assessment validation. For example, consider a medical student who's gained competencies through means other than formal education (e.g., an internship as a teenager, combined with lifeguard training, volunteering as a medic for local disasters, and online personal study). She could, in theory, graduate from medical school earlier than her peers—however, only if valid assessments can be used to comprehensively evaluate her capabilities across the full spectrum of necessary competencies. Beyond developing such assessments (as mentioned in Recommendation 4), who will validate them, update them, and accredit their use across learning and workforce systems? Further, how will schools that provide degrees based on mixed methods for competency attainment be formally accredited or ranked? To continue our example, laws involving medical insurance, malpractice determinations, and formal licensure might be impacted.

Broadly, there's also a need to develop interoperability specifications. Professional organizations, such as the IEEE Learning Technology Standards Committee or ISO IT for Learning, Education and Training, help formalize technical standards across communities. However, this still only applies to the interface or data layers. Each organization still makes many independent decisions about instructional media, new technologies, and their technical and programmatic factors. While organizations should retain their autonomy, there's opportunity to increase coordination, give collective guidance, and, at least within organizations or alliances, to create shared processes. A federated data system is an extensive endeavor. We have to constantly ask: how can we protect this system and yet keep information as open as possible? Thus, additional governing agreements in the areas of cybersecurity, privacy, and identity, as well as considerations for copyrights and data ownership, are important. For example, for the U.S. Department of Defense, DoD Instruction 1322.26 ("Distributed Learning") provides guidance on best practices for distributed learning and permissions to collect, aggregate, and assess data. This is one of many policies that could be reviewed to encourage greater unity of action across the military services and other defense components, as well as the U.S. Government, writ large.
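
As one concrete example of the shared data standards such specifications enable, the sketch below assembles a simplified learning-record statement in the actor, verb, object pattern popularized by the Experience API (xAPI). The learner, activity, and result shown are invented, and a production statement would be validated against the full specification and sent to a Learning Record Store:

```python
import json
from datetime import datetime, timezone

# Simplified, illustrative xAPI-style statement; values are invented and
# the structure is abbreviated relative to the full specification.
statement = {
    "actor": {"mbox": "mailto:[email protected]", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.org/activities/triage-sim",
        "definition": {"name": {"en-US": "Triage Simulation"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# Because many systems emit the same statement structure, records from
# simulators, courses, and mobile apps can flow into one shared store.
print(json.dumps(statement, indent=2))
```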

We need to develop effective forms of governance for a diverse and disparate community of practice, including government, academic, and industry partners. This macro concern for governance is a mirror of what must occur within the future learning ecosystem itself: as components federate to attain a capability, the need for agile partnerships will grow, enabling rapid aggregation (or disaggregation) of federated capabilities. Such collective governance will need to set approaches, policies, and management strategies that education and training stakeholders can adopt to enable effective learning—not only within a given silo—but across the composite, collective system.

Finally, there is a broader imperative that frames our approaches—policy and methodology—for moving towards the future learning ecosystem. We do not, and cannot, fully appreciate the impacts of exponential technological change, particularly as we approach the point beyond which it's impossible to fathom (the "singularity"). Ethical considerations must be an innate characteristic of our process methodologies, or we will sacrifice the human nature of progress. Simultaneously, our span of regard will grow to include machine learning as an essential and constant complement to human learning and employment; areas that we cannot approach reductively. Consequently, process perspectives and non-linear contexts will characterize the evolution of the future learning ecosystem—perhaps the final step away from Industrial Age thinking.

CONCLUSION
In this chapter, we’ve offered several recommendations for the advancement
of learning. Throughout this process, we’ve assumed that technologies, nota-
bly automation and data analytics, will continue to advance. In other words,
we felt it safe to assume that such capabilities are (or would be) technological-
ly feasible. The challenge lies not in developing the technologies but in their
validation, effective integration into learning systems, and consideration for
the corresponding social, organizational, and societal changes they’ll produce.

It's not feasible, though, nor frankly advisable, to plan out every piece of this future learning ecosystem; the rapid pace of change and its complexity necessarily require its design to be dynamic, flexible, and collaborative. However, we've attempted to apply systems-thinking approaches to the planning process, considering the comprehensive "talent development system," including formal and informal training and education. We've also tried to harmonize across the principles of learning science, learning technology, data science, organizational dynamics, and policy, and to consider a lifelong learning continuum that includes K–16, the public and private workforce, military service, and self-directed learning. Specific solutions should be grounded within this larger tapestry so that, when implemented, they're most likely to work in concert across technology, design, commitment, governance, policy, and human infrastructure factors.

The immediate and enduring relevance of this discussion is clear: we're conducting basic research now that will provide knowledge to reframe our future paradigms, bounding the unknowable to both enable and constrain future choices. We recognize whatever choices we make will have consequences, but learning, itself, is essential for making those future choices. The confluence of learning and technology—the evolution from traditional schools, to distributed learning, and now to "ubiquitous learning"—is driving us towards learning across time, space, and function, using tools and techniques from diverse venues to enable seamless lifelong learning, whether training, education, or experience, as part of a holistic approach to empowering human potential. Interdisciplinary stewardship will be essential to extend and connect learning science, policy, and technology, to address today's challenges, and to be prepared for the unknowable future.

If we don't like the rules,
why don't we change the rules?

– Reese Madsen
Senior Advisor for Talent Development, U.S. OPM
Chief Learning Officer, OSD (Intelligence and Security)
End Matter
388 | Modernizing Learning

6 Friedman, T. L. (2016). Thank you for being late:


An optimist’s guide to thriving in the age of acceler-
ENDNOTES FOR SECTION 1 ations. New York: Farrar, Straus and Giroux.
(FOUNDATIONS) 7 Ibid. National Academies (2018). Endnote 1-4.
1-4

Chapter 1 Endnotes Chapter 2 Endnotes


1 Atkinson, R.C., & Shiffrin, R.M. (1968). Human 1 See Niper, S. (1989). Third generation distance
memory: A proposed system and its control pro- learning and computer conferencing. In Mason, R.
cesses. Psychology of Learning and Motivation, 2, and Kaye, A. (eds.), Mindweave: Communication,
89–195. Computers and Distance Education (pp. 63–73).
Oxford: Pergamon Press.
2 Baddeley, A.D. (1966). The influence of acoustic
and semantic similarity on long-term memory for Also, James Taylor expanded Niper’s model, di-
word sequences. The Quarterly Journal of Experi- viding its third generation into a fourth generation
mental Psychology, 18(4), 302–309. focused on flexible online learning delivery and
fifth that adds intelligent automation to the flexible
3 Many classic studies in cognitive psychology web-based delivery. See Taylor, J.C. (2001). Fifth
could be listed here; see, for example: generation distance education. Instructional Sci-
Anderson, M.C. & Neely, J.H. (1996). Interference ence and Technology, 4(1), 1-14.
and inhibition in memory retrieval. In E. L. Bjork 2 Simpson, M. & Anderson, B. (2012). History and
& R.A. Bjork (Eds.), Memory: Handbook of per- heritage in open, flexible and distance education.
ception and cognition (2nd ed.). San Diego: Aca- Journal of Open, Flexible, and Distance Learning,
demic Press. 16(2), 1–10.
Bjork, R.A. (1989). Retrieval inhibition as an adap- 3 Moore, M.G. & Anderson, W.G. (Eds.). (2003).
tive mechanism in human memory. In H.L. Roedi- Handbook of distance education. Mahwah/Lon-
ger, II & I.M. Craik (Eds.), Varieties of memory & don: Lawrence Erlbaum Associates.
consciousness: Essays in honor of Endell Tulving
(pp. 309–330). Hillside, NJ: Erlbaum. 4 Saettler, P. (1990). The Evolution of American Ed-
ucational Technology. Englewood, CO: Libraries
Sohn, M.H., Goode, A., Stenger, V.A., Jung, K.J., Unlimited, Inc.
Carter, C.S., & Anderson, J.R. (2005). An infor-
mation-processing model of three cortical regions: 5 Spector, J.M., Merrill, M.D., Elen, J., & Bishop,
Evidence in episodic memory retrieval. Neuroim- M.J. (Eds.). (2014). Handbook of research on ed-
age, 25(1), 21–33. ucational communications and technology. New
York: Springer.
Sweller, J. (1994). Cognitive load theory, learning
difficulty, and instructional design. Learning and 6 Molenda, M. (2013). Historical foundations. In
Instruction, 4(4), 295–312. Spector, J.M., Merrill, M.D., Elen, J., & Bishop,
M.J. (Eds.). Handbook of research on educational
4 National Academies of Sciences, Engineering, and communications and technology (pp. 3–18). New
Medicine (2018). How people learn II: Learners, York: Springer.
contexts, and cultures. Washington, DC: The Na-
tional Academies Press. 7 See discussion and original citations from ibid.
Molenda, 2013, p. 16. Endnote 2-6.
2-6
NOTE: A digital version of this book is openly and
publicly available at nap.edu/24783 8 Pask, G. (1960). The teaching machine as a control
mechanism. Transactions of the Society for Instru-
5 See, for example: ment Technology, 12(2), 72–82.
Manuti, A., Pastore, S., Scardigno, A.F., Giancas- Pask, G. (1982). SAKI: Twenty-five years of adaptive
pro, M.L., & Morciano, D. (2015). Formal and training into the microprocessor era. International
informal learning in the workplace: A research re- Journal of Man-Machine Studies, 17(1), 69–74.
view. International journal of training and devel-
opment, 19(1), 1–17. 9 Kulik, J.A. (1994). Meta-analytic studies of find-
ings on computer-based instruction. Technology
Mocker, D.W. & Spear, G.E. (1982). Lifelong assessment in education and training, 1, 9–34.
learning: Formal, nonformal, informal, and
self-directed (ERIC No. ED220723). https://eric. 10 For more information on this historic adaptive in-
structional systems, see, for example:
ed.gov/?id=ED220723
Endnotes | 389

Lesgold, A. (1988). SHERLOCK: A coached prac- 20 For example, see:


tice environment for an electronics troubleshoot- Piaget, J. (1964). Part I: Cognitive development in
ing job (ERIC No. ED299450). https://eric.ed.gov children: Piaget development and learning. Journal
/?id=ED299450 of Research in Science Teaching, 2(3), 176–186.
Anderson, J.R., Conrad, F.G., & Corbett, A.T. Vygotsky, L.S. (1978). Mind in society: The devel-
(1989). Skill acquisition and the LISP tutor. Cogni- opment of higher psychological functions. London:
tive Science, 13(4), 467-505. Harvard University Press.
Brown, J.S., Burton, R., & de Kleer, J. (1982). Also, for an historical overview, see: Matthews, W.
Pedagogical, natural language and knowledge en- J. (2003). Constructivism in the classroom: Episte-
gineering techniques in SOPHIE I, II, and Ill. In D. mology, history, and empirical evidence. Teacher
Sleeman & J.S. Brown (Eds.), Intelligent Tutoring Education Quarterly, 30(3), 51–64.
Systems. Academic Press.
21 Gamoran, A., Secada, W.G., & Marrett, C.B.
11 For meta-analyses on intelligent tutors, see: (2000). The organizational context of teaching and
Dodds, P.V.W., & Fletcher, J.D. (2004). Opportu- learning. In M.T. Hallinan (Ed.), Handbook of the
nities for new “smart” learning environments en- Sociology of Education (pp. 37-63). New York:
abled by next generation web capabilities. Journal Kluwer Academic/Plenum Publishers.
of Education Multimedia and Hypermedia, 13(4), 22 For a review, see: Au, K.H. (1998). Social con-
391–404. structivism and the school literacy learning of stu-
Kulik, J.A., & Fletcher, J.D. (2016). Effectiveness dents of diverse backgrounds. Journal of Literacy
of intelligent tutoring systems: A meta-analytic re- Research, 30(2), 297–319.
view. Review of Educational Research, 85, 171–204. 23 For Xerox’s NoteCards, see: Halasz, F. (1988). Re-
12 U.S. Office of Technology Assessment. (1988). flections on Notecards: Seven issues for the next
Power On!-New Tools for Teaching and Learning generation of hypermedia systems. Communica-
(OTA-SET-379). Washington, DC: U.S. Govern- tions of the ACM, 31(7), 836–852.
ment Printing Office. For Carnegie-Mellon University’s Andrew, see
13 U.S. Congress, Office of Technology Assessment for example: Morris, J.H., Satyanarayanan, M.,
(1989). Linking for learning: A new course for ed- Conner, M.H., Howard, J.H., Rosenthal, D.S., &
ucation (OTA-SET-430). Washington, DC: U.S. Smith, F.D. (1986). Andrew: A distributed person-
Government Printing Office. NOTE: Refer to page al computing environment. Communications of the
27 for the blockquote and page 26 for the in-text ACM, 29(3), 184–201.
quote following it. 24 Scardamalia, M., Bereiter, C., McLean, R.S., Swal-
14 Molenda, M. (2003). In search of the elusive ADDIE low, J., & Woodruff, E. (1989). Computer-support-
model. Performance improvement, 42(5), 34–36. ed intentional learning environments. Journal of
15 Ibid. Molenda (2013). Endnote 2-6.
2-6 Educational Computing Research, 5(1), 51–68.
16 Ibid. Kulik (1994). Endnote 2-9.
2-9 See page 18. 25 Hiltz, S. R. (1994). The virtual classroom: Learn-
ing without limits via computer networks. Intellect
17 Peters, O. (1994). Distance education and indus- Books. Norwood, NJ: Ablex Publishing Corpora-
trial production: A comparative interpretation in tion. See pages 5–6.
outline (1967). Otto Peters on distance education:
The industrialization of teaching and learning (D. 26 Bell, M. W. (2008). Toward a definition of “virtu-
Keegan, Ed., pp. 107–127). See page 111. al worlds.” Journal For Virtual Worlds Research,
1(1). See page 2.
18 Two useful articles to begin with are, e.g.:
27 Naimark, M. (1997). A 3D moviemap and a 3D
Sweller, J. (1988). Cognitive load during problem panorama. Proceedings of SPIE 3012, Stereoscop-
solving: Effects on learning. Cognitive science, ic Displays and Virtual Reality Systems IV. doi.
12(2), 257–285. org/10.1117/12.274471
Sweller, J. (2008). Cognitive load theory and the 28 Morningstar, C., & Farmer, F.R. (2008). The les-
use of educational technology. Educational Tech- sons of Lucasfilm’s habitat. Journal For Virtual
nology Publications, 48(1), 32–35. Worlds Research, 1(1).
19 Bloom, B.S. (1984). The 2 sigma problem: The
search for methods of group instruction as effec-
tive as one-to-one tutoring. Educational Research-
er, 13(6), 4–16.
390 | Modernizing Learning

29 For example, see: Schatz, S., Nicholson, D., & Lester, J.C., Towns, S.G., & Fitzgerald, P.J. (1998).
Dolletski, R. (2012). A system’s approach to sim- Achieving affective impact: Visual emotive com-
ulations for training: Instruction, technology, and munication in lifelike pedagogical agents. Interna-
process engineering. In P. J. Mosterman (Series tional Journal of Artificial Intelligence in Educa-
Ed.) Real-time Simulation Technologies: Princi- tion, 10, 278–291.
ples, Methodologies, and Applications (pp. 371– 37 D’Mello, S.K., Picard, R., & Graesser, A.C. (2007).
388). Boca Raton, FL: CRC Press. Toward an affect-sensitive AutoTutor. IEEE Intel-
30 Web-based Education Commission (2000). The ligent Systems, 22, 53–61.
power of the internet for learning: Moving from Kort, B., Reilly, R., & Picard, R.W. (2001). An af-
promise to practice. Washington, DC: Web-Based fective model of interplay between emotions and
Education Commission. www2.ed.gov/offices/AC/ learning: Reengineering educational pedagogy—
WBEC/FinalReport/index.html See pages 75–77. Building a learning companion. In T. Okamoto, R.
31 For example, see: Hartley, Knshuk, & J.P. Klus (Eds.), Proceedings
Beaumont, I., & Brusilovsky, P. (1995). Educa- of the IEEE International Conference on Advanced
tional applications of adaptive hypermedia. In Hu- Learning Technologies (pp. 43–46).
man-Computer Interaction (pp. 410–414). Spring- 38 For example, see: Calvo, R.A., & D’Mello, S. K.
er, Boston, MA. (2010). Affect detection: An interdisciplinary review
Brusilovsky, P., Pesin, L., & Zyryanov, M. (1993). of models, methods, and their applications. IEEE
Towards an adaptive hypermedia com-ponent for Transactions on Affective Computing, 1, 18–37.
an intelligent learning environment. In L.J. Bass, 39 Mayer, R.E. (1997). Multimedia learning: Are we
J. Gornostaev, & C. Unger (Eds.), International asking the right questions. Educational Psycholo-
Conference on Human-Computer Interaction (pp. gist, 32, 1–19.
348-358). Springer, Berlin, Heidelberg. 40 For example, see: Garrison, D.R., Anderson, T., &
32 For example, see: Archer, W. (2003). A theory of critical inquiry in
Koedinger, K.R., Anderson, J.R., Hadley, W.H., online distance education. In M. Moore and G. An-
& Mark, M.A. (1997). Intelligent tutoring goes to derson (Eds.), Handbook of Distance Education.
school in the big city. International Journal of Ar- (pp.113–127). New York: Erlbaum.
tificial Intelligence in Education, 8, 30–43. 41 Garrison, R. (2000). Theoretical challenges for
Ritter, S., Anderson, J.R., Koedinger, K.R., & Cor- distance education in the 21st century: A shift from
bett, A. (2007). Cognitive Tutor: Applied research structural to transactional issues. The Internation-
in mathematics education. Psychonomic Bulletin & al Review of Research in Open and Distributed
Review, 14(2), 249–255. Learning, 1(1). doi.org/10.19173/irrodl.v1i1.2

33 Ibid. Kulik & Fletcher (2016). Endnote 2-11.


2-11 42 Peters, O. (1993). Distance education in a postin-
dustrial society. In D. Keegan (Ed.), Theoretical
34 Emerging in the 1990s, the learning gains fostered Principles of Distance Education (pp. 39–58).
by intelligent tutors became approximately equiva- London: Routledge. See page 40.
lent to those from human tutors. See reviews by:
43 Daniel, J. (1996). Mega-universities and Knowl-
Graesser, A.C., Rus, V., Hu, X. (2017). Instruction edge Media: Technology Strategies for Higher Ed-
based on tutoring. In R.E. Mayer and P.A. Alexan- ucation. London: Kogan Page. See page 5.
der (Eds.), Handbook of Research on Learning and
Instruction (pp. 460–482). New York: Routledge. 44 Ibid. Web-based Education Commission (2000).
Endnote 2-30.
2-30 Refer to page 1.
VanLehn, K. (2011). The relative effectiveness of
human tutoring, intelligent tutoring systems and 45 Ibid. Web-based Education Commission (2000).
other tutoring systems. Educational Psychologist, Endnote 2-30.
2-30 Refer to page 23.
46, 197–221 46 Youngblut, C. (1998). Educational uses of virtual
35 Picard R.W. (1997). Affective computing. The MIT reality technology (No. IDA-D-2128). Alexandria
Press, Cambridge, MA. VA: Institute for Defense Analyses.
36 For example, see: 47 Page, E.H., & Smith, R. (1998). Introduction to
military training simulation: A guide for discrete
Graesser, A.C., Wiemer-Hastings, K., Wiemer- event simulationists. In D.J. Medeiros, E.F. Wat-
Hastings, P., Kreuz, R., & the Tutoring Research son, J.S. Carson, & M.S. Manivannan (Eds.),
Group. (1999). AutoTutor: A simulation of a hu- Winter Simulation Conference Proceedings (pp.
man tutor. Cognitive Systems Research, 1, 35–51. 53–60). Piscataway, NJ: IEEE.
Endnotes | 391

48 Henninger, A.E., Cutts, D., Loper, M., Lutz, R., 59 For example, see: Cavanagh, S. (2018, November
Richbourg, R., Saunders, R., & Swenson, S. 21). Ed. Dept. pulls plug on ‘Learning Registry,’ an
(2008). Live virtual constructive architecture road- Obama-Era tech initiative. EdWeek Market Brief.
map (LVCAR) final report. Alexandria VA: Insti- marketbrief.edweek.org
tute for Defense Analyses. 60 Johnstone, S. M. (2005). Open educational resourc-
49 El Kaliouby, R. & Robinson, P. (2005). General- es serve the world. Educause Quarterly, 28(3), 15.
ization of a vision-based computational model of 61 Howe, J. (2006, June 2). Crowdsourcing: A defini-
mind-reading. In International Conference on Af- tion. Wired. ww.wired.com/2006/06/crowds
fective Computing and Intelligent Interaction (pp.
582-589). Springer, Berlin, Heidelberg. 62 Siemens, G. (2005). Connectivism: A learning
theory for the digital age. International Journal of
50 Director for Readiness and Training (1999, April Instructional Technology and Distance Learning,
30). Department of Defense Strategic Plan for Ad- 2(1), 3–10.
vanced Distributed Learning (Report to the 106th
Congress). Washington, DC: U.S. Office of the 63 National Research Council (2000). How people
Deputy Under Secretary of Defense for Readiness. learn: Brain, mind, experience, and school: Ex-
apps.dtic.mil/dtic/tr/fulltext/u2/a470552.pdf.
apps.dtic.mil/dtic/tr/fulltext/u2/a470552.pdf panded edition. Washington, DC: The National
Academies Press.
51 Ibid. El Kaliouby & Robinson (2005). Endnote 2-49.
2-49
NOTE: A digital version of this book is openly and
52 Motlik, S. (2008). Mobile learning in developing publicly available at doi.org/10.17226/9853
doi.org/10.17226/9853.
nations. The International Review of Research in
Open and Distributed Learning, 9(2). www.irrodl. 64 Anderson, L., & Krathwohl, D.E. (2001). A Tax-
org/index.php/irrodl/article/view/564/1039 onomy for learning teaching and assessing: A re-
vision of Bloom’s taxonomy of educational objec-
53 Crompton, H. (2013). A historical overview of tives. New York: Addison.
m-learning: Toward learner-centered education. In
Z.L. Berge & L. Muilenburg (Eds.), Handbook of 65 Merrill, M.D. (2002). First principles of instruc-
Mobile Education. Hoboken: Taylor and Francis. tion. Educational Technology Research and Devel-
opment, 50(3), 43–59.
54 Watson, J., Murin, A., Vashaw, L., Gemin, B., &
Rapp, C. (2010). Keeping Pace with K-12 Online 66 Fiore, S.M., & Salas, E.E. (2007). Toward a sci-
Learning: An Annual Review of Policy and Prac- ence of distributed learning. Washington, DC:
tice, 2010. Evergreen Education Group. American Psychological Association.

55 For example, see: 67 Pashler, H., Bain, P., Bottge, B., Graesser, A.,
Koedinger, K., McDaniel, M., & Metcalf, J. (2007).
Muoio, A. (2000, October). Cisco’s Quick Study. Organizing instruction and study to improve stu-
Fast Company. www.fastcompany.com/41492/cis- dent learning (NCER 2007-2004). Washington,
cos-quick-study DC: National Center for Education Research, In-
Seufert, S. (2001). E-learning business models, stitute of Education Sciences, U.S. Department of
framework and best practice examples. In M.S. Education. http://ncer.ed.gov
Raisinghani (Ed.), Cases on Worldwide E-Com- 68 Magoulas, G.D., & Chen, S.Y. (Eds.). (2006).
merce: Theory in Action (70–94). New York: Idea Advances in web-based education: personalized
Group. learning environments (ERIC No. ED508909).
56 Fletcher, J. D. (2009). Education and training tech- Hershey, PA: Information Science Publishing.
nology in the military. Science, 323(5910), 72–75. 69 Woolf, B.P. (2009). Building intelligent tutoring
Wisher, R. A. & Khan, B. H. (Eds.), Learning on systems. Burlington, MA: Morgan Kaufman.
demand: ADL and the Future of e-Learning. Wash- 70 King, A. (1993). From sage on the stage to guide
ington DC: Department of Defense. on the side. College teaching, 41(1), 30–35.
57 Fletcher, J. D. (2005). The Advanced Distributed 71 O’Flaherty, J., & Phillips, C. (2015). The use of
Learning (ADL) vision and getting from here to flipped classrooms in higher education: A scoping re-
there (No. IDA/HQ-D-3212). Alexandria VA: In- view. The Internet and Higher Education, 25, 85–95.
stitute for Defense Analyses. apps.dtic.mil/dtic/tr/
fulltext/u2/a452053.pdf. See page 7.
fulltext/u2/a452053.pdf 72 Ibid. Pashler et al. (2007). Endnote 2-67.
2-67

58 Rehak, D., Dodds, P., & Lannom, L. (2005, May). 73 Kelley, P. (2008). Making minds. New York: Rout-
A model and infrastructure for federated learning ledge. See page 4
content repositories. Paper presented at the 14th 74 Ibid. Graesser et al. (1999). Endnote 2-36.
2-36
World Wide Web Conference, Chiba, Japan.
392 | Modernizing Learning

Graesser, A. C. (2016). Conversations with AutoTu- 87 For example, see: Hampson, R.E., Song, D., Rob-
tor help students learn. International Journal of Ar- inson, B.S., Fetterhoff, D., Dakos, A. S., Roeder, B.
tificial Intelligence in Education, 26(1), 124–132. M., et al. (2018). Developing a hippocampal neural
Nye, B.D., Graesser, A.C., & Hu, X. (2014). Auto- prosthetic to facilitate human memory en-coding
Tutor and family: A review of 17 years of natural and recall. Journal of Neural Engineering, 15(3),
language tutoring. International Journal of Artifi- 036014.
cial Intelligence in Education, 24(4), 427–469. 88 See www.gifttutoring.org
75 Rowe, J.P., Shores, L.R., Mott, B.W., & Lester, J. Also refer to, for example:
C. (2010). Integrating learning and engagement in Sottilare, R.A., Goldberg, B. S., Brawner, K.W.,
narrative-centered learning environments. In: V. & Holden, H.K. (2012). A modular framework to
Aleven, J. Kay, & J. Mostow (Eds.), Internation- support the authoring and assessment of adaptive
al Conference on Intelligent Tutoring Systems. ITS computer-based tutoring systems (CBTS). In Pro-
2010. Lecture Notes in Computer Science, vol 6095 ceedings of the I/ITSEC. Arlington, VA: National
(pp. 166–177). Berlin, Heidelberg: Springer. Training and Simulation Association.
76 Johnson, W.L., & Valente, A. (2009). Tactical lan- 89 Sinatra, A., Graesser, A., Hu, X., & Brawner, K.,
guage and culture training systems: Using AI to (2019). Design Recommendations for Intelligent
teach foreign languages and cultures. AI Magazine, Tutoring Systems: Artificial Intelligence (Volume
30(2), 72. 6). U.S. Army Research Laboratory.
77 Ibid. Pashler et al. (2007). Endnote 2-67.
2-67 90 IEEE Competency Data Standards Work Group
78 Roediger, H.L., and Karpicke, J.D. (2006). The (CDSWG20 P1484.20.1). (2018). sites.ieee.org/
power of testing memory: Basic research and im- sagroups-1484-20-1
plications for educational practice. Perspectives on 91 Yang, I. (2014, October 30). Grading adults on life
Psychological Science, 1(3), 181–210. experience. The Atlantic. www.theatlantic.com
79 For example, see: Landauer, T.K., Laham, D., & 92 For example, see: Anderson, L. (2018). Compe-
Foltz, P.W. (2003). Automatic essay assessment. tency-based education: Recent policy trends. The
Assessment in Education: Principles, Policy & Journal of Competency-Based Education, 3(1).
Practice, 10(3), 295–308. doi: 10.1002/cbe2.1057
80 Siemens, G. (2006). Knowing knowledge. www. 93 Kazin, C. (2017, August 15). Microcredentials, mi-
knowingknowledge.com cromasters, and nanodegrees: What’s the big idea?
81 Baker, R.S.J.D. & Yacef, K. (2009). The state of edu- The EvoLLLution. www.evolllution.com
cational data mining in 2009: A review and future vi- 94 Dede, C., Richards, J. & Saxberg, B. (2018). Learn-
sions. Journal of Educational Data Mining, 1, 3–16. ing Engineering for Online Education. Routledge.
82 For example, see: Baker, R.S.J.D. & Inventado, P.S. 95 As quoted by Blake-Plock, S. (2018, January).
(2014). Educational data mining and learning ana- Learning engineering: Merging science and data
lytics. In J.A. Larusson & B. White (Eds.), Learn- to design powerful learning experiences. Getting
ing Analytics (pp. 61-75). New York: Springer. Smart. www.gettingsmart.com
83 Evans, D. (2011, April). The Internet of Things: 96 Saxberg, B. (2016, July). “Learning engineering”
how the next evolution of the internet is changing making its way in the world. Getting Smart. www.
everything. CISCO white paper. www.cisco.com gettingsmart.com
84 Gómez, J., Huete, J.F., Hoyos, O., Perez, L., &
Grigori, D. (2013). Interaction system based on in-
ternet of things as support for education. Procedia Chapter 3 Endnotes
Computer Science, 21, 132–139.
1 Allen, I.E., & Seaman, J. (2016). Online Report
85 For example, see: Bower, M., & Sturman, D.
Card: Tracking Online Education in the Unit-
(2015). What are the educational affordances of
ed States (ERIC No. ED572777). Babson Park,
wearable technologies? Computers & Education,
MA: Babson Survey Research Group. eric.ed.gov
88, 343–353.
/?id=ED572777
86 D’Mello, S.K., Kappas, A., & Gratch, J. (2018).
2 U.S. Department of Education, National Center for
The affective computing approach to affect mea-
Education Statistics. (2018). Digest of Education
surement. Emotion Review, 10(2), 174–183.
Statistics 2016 (NCES 2017-094). nces.ed.gov/
pubs2017/2017094.pdf. See Table 311.15.
pubs2017/2017094.pdf
Endnotes | 393

3 Association for Talent Development Research. 17 For example, see: Puentedura, R. (2014). Learning,
(2017). Next Generation E-Learning: Skills and technology, and the SAMR model: Goals, process-
Strategies (Product Code 191706). Alexandria,VA: es, and practice. Ruben R. Puentedura’s blog. hip-
ATD Research. pasus.com/blog/archives/127
4 Shah, D. (2018, March 10). A product at every 18 Khan, B.H. (2003). The global e-learning frame-
price: A review of MOOC stats and trends in 2017. work. STRIDE Handbook, 42–51.
MOOC Report by Class Central. www.class-cen- 19 Farid, S., Ahmad, R., & Alam, M. (2015). A hier-
tral.com/report/moocs-stats-and-trends-2017 archical model for e-learning implementation chal-
5 World Bank (2018). World Development Report lenges using AHP. Malaysian Journal of Computer
2018: Learning to Realize Education’s Promise. Science, 28(3), 166–188.
Washington, DC: World Bank. doi: 10.1596/978- 20 Aguti, B., Wills, G.B., & Walters, R.J. (2014). An
1-4648-1096-1. See page 16.
1-4648-1096-1 evaluation of the factors that impact on the effective-
NOTE: This report, which includes a notable amount ness of blended e-learning within universities. In
of useful empirical data, is freely available under Proceedings of the International Conference on In-
the Creative Commons Attribution license. formation Society (pp. 117–121). Piscataway: IEEE.
6 Ibid. National Academies (2018). Endnote 1-4.
1-4 21 Roscoe, R.D., Branaghan, R., Cooke, N.J., & Craig,
7 Peggy Ertmer and Timothy Newby offer a highly S.D. (2017). Human systems engineering and edu-
readable comparison of these theories in the con- cational technology. In R.D. Roscoe, S.D. Craig &
text of instructional design in their 1993 article: I. Douglas (Eds.), End-user considerations in edu-
Ertmer, P.A. & Newby, T.J. (1993). Behaviorism, cational technology design. (pp. 1–34). New York:
cognitivism, constructivism: comparing critical IGI Global.
features from an instructional design perspective 22 Sohoni, S., Craig, S.D. & Vedula, K. (2017). A
(reprint). Performance Improvement Quarterly, blueprint for an ecosystem for supporting high
6(4), 50–72. quality education for engineering. Journal of Engi-
8 Ibid. Merrill (2002). Endnote 2-65.
2-65 neering Education Transformation, 30(4), 58–66.
9 Ambrose, S. A., Lovett, M., Bridges, M. W., DiP- 23 Dick, W., Carey, L., & Carey, J.O. (2011). The sys-
ietro, M., & Norman, M. K. (2010). How learning tematic design of instruction (7th ed.). Upper Sad-
works: Seven research-based principles for smart dle River, NJ: Pearson.
teaching. San Francisco, CA : Jossey-Bass. 24 Douglas, I. (2006). Issues in software engineering
10 Chi, M.T.H., & Wylie, R. (2014). The ICAP frame- of relevance to instructional design. TechTrends,
work: Linking cognitive engagement to active 50(5), 28–35.
learning outcomes. Educational Psychologist, 25 Cooke, N.J. & Hilton, M.L. (2015). Enhancing the
49(4), 219–243. See page 220. effectiveness of team science. Washington, D.C.:
11 Winne, P.H. (2011). A cognitive and metacogni- National Academies Press.
tive analysis of self-regulated learning. In D.H. Fiore, S.M., Graesser, A.C., & Greiff, S. (2018).
Schunk, & B. Zimmerman (Eds.), Handbook of Collaborative problem-solving education for the
self-regulation of learning and performance (pp. twenty-first-century workforce. Nature Human Be-
15-32). Ney York: Routledge. haviour, 2, 367–369.
12 Ibid. Pashler et al. (2007). Endnote 2-67.
2-67 26 For an informative and lighthearted discussion
13 Graesser, A.C. (2009). Cognitive scientists prefer on software development teams see: Fitzpatrick,
theories and testable principles with teeth. Educa- B., & Collins-Sussman, B. (2012). Team geek: a
tional Psychologist, 44(3), 193–197. software developer’s guide to working well with
others. Sebastopol, CA: O’Reilly Media, Inc.
14 Jarvis, P. (2012). Towards a comprehensive theory
of human learning, Vols. 1–3. New York: Rout- 27 www.merlot.org
ledge. The Multimedia Educational Resource for Learn-
15 Saettler, P. (1990). The evolution of American educa- ing and Online Teaching (MERLOT) project began
tional technology. Englewood: Libraries Unlimited. in 1997, when the California State University Cen-
ter for Distributed Learning developed and provid-
16 Mayer, R. E. (2017). Using multimedia for e-learn- ed free access to open educational resources.
ing. Journal of Computer Assisted Learning, 33(5),
403–423.
394 | Modernizing Learning

28 www.oercommons.org
These resources can range from lectures that are publicly available on websites like YouTube, to electronic books, to entire online courses.
29 ies.ed.gov/ncee/wwc
30 stemedhub.org
31 www.nap.edu
32 www.dtic.mil

Chapter 4 Endnotes

1 Students need non-instructional assimilation time—for example, see: Baepler, P., Walker, J.D., & Driessen, M. (2014). It’s not about seat time: Blending, flipping, and efficiency in active learning classrooms. Computers & Education, 78, 227–236.
…varied experiences to aid comprehension—for example, see: Kolb, D.A. (2014). Experiential learning: Experience as the source of learning and development. FT Press.
…learning is context-based—for example, see: Berns, R.G., & Erickson, P.M. (2001). Contextual teaching and learning: Preparing students for the new economy. The Highlight Zone: Research@Work No. 5 (ERIC No. ED452376). Washington, DC: Office of Vocational and Adult Education. eric.ed.gov/?id=ED452376
2 For example, the U.S. Every Student Succeeds Act of 2015, Pub. L. No. 114-95 § 114 Stat. 1177 (2015–2016). www.ed.gov/essa
3 Ibid. Sweller, J. (1994). Endnote 2-18.
4 Mathers, C.D., Stevens, G.A., Boerma, T., White, R.A., & Tobias, M.I. (2015). Causes of international increases in older age life expectancy. The Lancet, 385(9967), 540–548.
5 Bialik, C. (2010, September 4). Seven careers in a lifetime? Think twice, researchers say. The Wall Street Journal. www.wsj.com
6 For example, see: Park, D.C. & Reuter-Lorenz, P.A. (2009). The adaptive brain: Aging and neurocognitive scaffolding. Annual Review of Psychology, 60, 173–196.
7 Bryck, R.L., & Fisher, P.A. (2012). Training the brain: Practical applications of neural plasticity from the intersection of cognitive neuroscience, developmental psychology, and prevention science. American Psychologist, 67(2), 87–100.
8 Power, J.D., & Schlaggar, B.L. (2017). Neural plasticity across the lifespan. Wiley Interdisciplinary Reviews: Developmental Biology, 6(1), e216.
9 Ibid. National Academies (2018). Endnote 1-4.
10 Medel-Añonuevo, C., Ohsako, T., & Mauch, W. (2001). Revisiting lifelong learning for the 21st century (ERIC No. ED469790). Hamburg, Germany: United Nations Educational, Scientific, and Cultural Organization. eric.ed.gov/?id=ED469790
11 Ibid. National Academies (2018). Endnote 1-4. See page 3.
12 Ibid. For representative examples of Piaget’s and Vygotsky’s work, see Endnote 2-20.
13 Ibid. Sweller, J. (1994). Endnote 2-18.
14 For example, see:
Erikson, E.H. (1950). Growth and crises of the “healthy personality.” In M.J.E. Senn (Ed.), Symposium on the healthy personality (pp. 91–146). Oxford, England: Josiah Macy, Jr. Foundation.
Bronfenbrenner, U. (1979). The ecology of human development. Harvard University Press.
15 For instance, as demonstrated by the Diagnostic and statistical manual of mental disorders (DSM-5®), published 2013.
16 Jones, D.E., Greenberg, M., & Crowley, M. (2015). Early social-emotional functioning and public health: The relationship between kindergarten social competence and future wellness. American Journal of Public Health, 105(11), 2283–2290.
17 Saarni, C. (2011). Emotional development in childhood. In R.E. Tremblay, M. Boivin, R. Peters, & M. Lewis (Eds.), Encyclopedia on Early Childhood Development [online]. www.child-encyclopedia.com/emotions/according-experts/emotional-development-childhood
18 casel.org
19 Schore, A.N. (2015). Affect regulation and the origin of the self: The neurobiology of emotional development. New York: Routledge.
20 For evidence that intra- and interpersonal capabilities can be enhanced through learning see, for example:
Durlak, J.A., Weissberg, R.P., Dymnicki, A.B., Taylor, R.D., & Schellinger, K.B. (2011). The impact of enhancing students’ social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82(1), 405–432.
Sklad, M., Diekstra, R., Ritter, M.D., Ben, J., & Gravesteijn, C. (2012). Effectiveness of school-based universal social, emotional, and behavioral programs: Do they enhance students’ development in the area of skill, behavior, and adjustment? Psychology in the Schools, 49(9), 892–909.
21 See, for example:
Vogel-Walcutt, J.J., Ross, K.G., & Phillips, J.K. (2016). Instructional design roadmap: Principles to maximize learning across developmental stages. In Proceedings of the I/ITSEC. Arlington, VA: National Training and Simulation Association.
Vogel-Walcutt, J.J., Ross, K.G., Phillips, J.K., & Knarr, K.A. (2015). Marine Corps instructor mastery model: A foundation for Marine faculty professional development. In Proceedings of the I/ITSEC. Arlington, VA: National Training and Simulation Association.
Morrison, J.G., Kelly, R.T., Moore, R.A., & Hutchins, S.G. (1998). Implications of decision-making research for decision support and displays. In J.A. Cannon-Bowers & E. Salas (Eds.), Making decisions under stress: Implications for individuals and teams (pp. 375–406). Washington, DC: APA.
22 Payne, V.G., & Isaacs, L.D. (2017). Human motor development: A lifespan approach (9th ed.). London/New York: Routledge.
23 Edwards, L.C., Bryant, A.S., Keegan, R.J., Morgan, K., & Jones, A.M. (2017). Definitions, foundations and associations of physical literacy: A systematic review. Sports Medicine, 47(1), 113–126.
24 For example, see: Swan, M. (2013). The quantified self: Fundamental disruption in big data science and biological discovery. Big Data, 1(2), 85–99.
25 Ghayvat, H., Liu, J., Mukhopadhyay, S.C., & Gui, X. (2015). Wellness sensor networks: A proposal and implementation for smart home for assisted living. IEEE Sensors Journal, 15(12), 7341–7348.
26 Ibid. Edwards et al. (2017). Endnote 4-23.
27 Laal, M. (2011). Lifelong learning: What does it mean? Procedia – Social and Behavioral Sciences, 28, 470–474.
28 Kalz, M. (2015). Lifelong learning and its support with new technologies. International Encyclopedia of the Social & Behavioral Sciences (2nd ed.), 93–99.
29 For example, see: Weinberg, R.S., & Gould, D.S. (2018). Foundations of sport and exercise psychology (7th ed.). Human Kinetics.
30 Sae Schatz and her colleagues have dubbed this integrated approach “Industrial Knowledge Design” and outline it in an article, see: Schatz, S., Berking, P., & Raybourn, E.M. (2017). Industrial knowledge design: An approach for designing information artefacts. Theoretical Issues in Ergonomics Science, 18(6), 501–518.
Also refer to Chapter 5 in this volume.

Chapter 5 Endnotes

1 von Clausewitz, C. (1989). On war (M. Howard and P. Paret, Eds. and Trans.). Princeton University Press. (Original work published 1832.) Or refer to the O. Jolles translation (1943).
2 Viegas, S., Antunes Teixeira, L.A., Andrade, D.F., & Moreira Silva, J.T. (2015). The information overload due to attention, interruptions and multitasks. Australian Journal of Basic and Applied Sciences, 9(27), 603–613.
3 Hemp, P. (2009). Death by information overload. Harvard Business Review, 87(9), 82–89.
Spira, J.B., & Feintuch, J.B. (2005). The cost of not paying attention: How interruptions impact knowledge worker productivity. Report from Basex, Inc.
4 Brown, T.E. (2006). Attention deficit disorder: The unfocused mind in children and adults. Yale University Press. (As discussed in ibid. Viegas et al., 2015, Endnote 5-2.)
5 Benedek, A. (2013). Mobile multimedia-based knowledge transfer: A toolkit and a 3.0 reference model. In L. Gómez Chova, A. López Martínez, & I. Candel Torres (Eds.), Proceedings of EDULEARN 13 (pp. 1926–1936). IATED.
6 Kilgore, W. (2018, December 27). UX to LX: The rise of learner experience design. EdSurge. Retrieved from www.edsurge.com
7 Floor, N. (2018, March 5). Learning experience design is NOT a new name for instructional design [blog post]. LearningExperienceDesign.com. Retrieved from www.learningexperiencedesign.com
8 Weigel, M. (2015, April 2). Learning experience design vs. user experience: Moving from “user” to “learner” [blog post]. Six Red Marbles. www.sixredmarbles.com
9 Schatz, S., Berking, P., & Raybourn, E.M. (2017). Industrial knowledge design: An approach for designing information artefacts. Theoretical Issues in Ergonomics Science, 18(6), 501–518.
10 Battarbee, K., & Mattelmäki, T. (2003). Meaningful product relationships. In Design and Emotion, Vol. 1 (pp. 337–344).
11 For example, see:
Pine, B.J. & Gilmore, J.H. (1998). Welcome to the experience economy. Harvard Business Review, 76, 97–105.
Pullman, M.E. & Gross, M.A. (2004). Ability of experience design elements to elicit emotions and loyalty behaviors. Decision Sciences, 35(3), 551–578.
Schmitt, B. (1999). Experiential marketing. The Free Press.
12 Berry, L.L., Carbone, L.P., & Haeckel, S.H. (2002). Managing the total customer experience. MIT Sloan Management Review.
13 Ibid. See Pullman & Gross (2004) and Berry et al. (2002), respectively. Endnote 5-11.
14 Kolb, D. (1984). Experiential learning: Experience as the source of learning and development. Prentice Hall. See page 41.
15 Kolb, A.Y. & Kolb, D.A. (2009). The learning way: Meta-cognitive aspects of experiential learning. Simulation & Gaming, 40(3), 297–327. See page 298.
16 National Society for Experiential Education (2013, December 9). Eight principles of good practice for all experiential learning activities. National Society for Experiential Education. www.nsee.org/8-principles
17 Ibid. Schmitt (1999). Endnote 5-11.
18 Ariely, D. (2008). Predictably irrational: The hidden forces that shape our decisions. New York: HarperCollins.
19 Cialdini, R. (2009). Influence: Science and practice. Boston, MA: Pearson Education.
20 Thaler, R.H., & Sunstein, C.R. (2009). Nudge: Improving decisions about health, wealth, and happiness. London: Penguin.
21 See, for example:
Moss, M. (2013, August 27). Nudged to the produce aisle by a look in the mirror. The New York Times. www.nytimes.com
Bannister, J. (2010, July 19). NMSU researchers shop around for healthier grocery carts [press release]. New Mexico State University. newscenter.nmsu.edu
22 Kerwin, W.T., Blanchard, G.S., Atzinger, E.M., & Topper, P.E. (1980). Man/machine interface – a growing crisis (DTIC Accession No. ADB071081). U.S. Army Material Systems Analysis Activity.
23 Garrett, J.J. (2010). The elements of user experience: User-centered design for the web and beyond. Pearson Education.
24 Pew, R.W., Mavor, A.S., et al. (2007). Human-system integration in the system development process: A new look. National Academies Press. See page 191.
NOTE: A digital version of this book is openly and publicly available at nap.edu/11893
25 Interaction Design Foundation (2017). Learning experience design: The most valuable lessons [blog post]. The Interaction Design Foundation. www.interaction-design.org
26 Rifai, N., Rose, T., McMahon, G.T., Saxberg, B., & Christensen, U.J. (2018). Learning in the 21st century: Concepts and tools. Clinical Chemistry, 64(10), 1423–1429.
27 Norman, D. (2005). Emotional design: Why we love (or hate) everyday things. Basic Books.

ENDNOTES FOR SECTION 2 (TECHNOLOGY)

Chapter 6 Endnotes

1 Siemens, G., Dawson, S., & Eshleman, K. (2018, October 29). Complexity: A leader’s framework for understanding and managing change in higher education. EDUCAUSE Review, 53(6). educause.edu
2 Rainie, L. & Anderson, J. (2017, May 3). The future of jobs and jobs training. Pew Research Center. www.pewinternet.org
3 Johnson-Freese, J. (2012). The reform of military education: Twenty-five years later (ADA570086). Philadelphia, PA: Foreign Policy Research Institute. apps.dtic.mil/docs/citations/ADA570086
4 Task Force on Defense Personnel (2017). The building blocks of a ready military: People, funding, tempo. Washington, DC: Bipartisan Policy Center. bipartisanpolicy.org
5 lrmi.dublincore.org
6 schema.org
7 schema.org/Course
8 U.S. Office of Personnel Management (2018). OPM strategic plan fiscal years 2018–2022. Washington, DC: OPM. www.opm.gov
9 Growth Engineering (2017, May). Informal learning: What is the 70:20:10 model? [blog post]. Growth Engineering. www.growthengineering.co.uk/70-20-10-model
10 Blackman, D.A., Johnson, S.J., Buick, F., Faifua, D.E., O’Donnell, M., & Forsythe, M. (2016). The 70:20:10 model for learning and development: An effective model for capability development? In Academy of Management Proceedings, Vol. 2016 (p. 10745). Briarcliff Manor, NY: Academy of Management.
11 Tyszko, J.A., Sheets, R.G., & Reamer, A.D. (2017). Clearer signals: Building an employer-led job registry for talent pipeline management. Washington, DC: U.S. Chamber of Commerce Foundation. www.luminafoundation.org

Chapter 7 Endnotes

1 For example, see: Cybrary.it. Free hacking training. www.cybrary.it/freehackingtraining
2 Fu, H., Liao, J., Yang, J., Wang, L., Song, Z., Huang, X., et al. (2016). The Sunway TaihuLight supercomputer: System and applications. Science China Information Sciences, 59(7), 072001.
Hruby, D. (2018). Putting China’s science on the map. Nature, 553(7688).
3 Gordon, L.A., Loeb, M.P., Lucyshyn, W., & Zhou, L. (2015). Externalities and the magnitude of cyber security underinvestment by private sector firms: A modification of the Gordon-Loeb model. Journal of Information Security, 6(1), 24–30.
4 Rustici Software (n.d.). The layers of Experience API. xapi.com/the-layers-of-experience-api-xapi
5 Ramirez-Padron, R. (2017, July). Pushing xAPI statements in real time: Part 3. tlacommunity.com/pushing-xapi-statements-in-real-time-part-3
6 Ibid. Ramirez-Padron (2017). Endnote 7-5.
7 Perrow, C. (1984). Normal accidents: Living with high-risk technologies. New York: Basic Books.
8 Ibid. Perrow (1984). Endnote 7-7.
9 Bambauer, D.E. (2014). Ghost in the network. University of Pennsylvania Law Review, 162, 1011–1091.
Lally, L. (2005). Information technology as a target and shield in the post 9/11 environment. Information Resources Management Journal, 18, 14–28.
10 Massachusetts Institute of Technology (2018). Kerberos: The network authentication protocol. web.mit.edu/kerberos

Chapter 8 Endnotes

1 Sandia National Laboratories is a multimission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-NA0003525.
2 Cavoukian, A. (2009). Privacy by design. Ontario, Canada: Information and Privacy Commissioner of Ontario. www.ontla.on.ca/library/repository/mon/23002/289982.pdf
3 Knijnenburg, B.P. (2015). A user-tailored approach to privacy decision support (Doctoral dissertation, UC Irvine).
4 Westin, A.F., Harris, L., et al. (1981). The dimensions of privacy: A national opinion research survey of attitudes toward privacy. New York: Garland.
5 Chellappa, R.K. & Sin, R.G. (2005). Personalization versus privacy: An empirical examination of the online consumer’s dilemma. Information Technology and Management, 6(2), 181–202.
6 Malhotra, N.K., Kim, S.S. & Agarwal, J. (2004). Internet users’ information privacy concerns (IUIPC): The construct, the scale, and a nomological framework. Information Systems Research, 15(4), 336–355.
7 Knijnenburg, B.P. & Cherry, D. (2016, June). Comics as a medium for privacy notices. Paper presented at the SOUPS 2016 workshop on the Future of Privacy Notices and Indicators, Denver, CO.
8 Wisniewski, P.J., Knijnenburg, B.P., & Lipford, H.R. (2017). Making privacy personal: Profiling social network users to inform privacy education and nudging. International Journal of Human-Computer Studies, 98, 95–108.
9 Ibid. Wisniewski et al. (2017). Endnote 8-7.
10 Narayanan, A. & Shmatikov, V. (2008). Robust de-anonymization of large sparse datasets. In IEEE Symposium on Security and Privacy (pp. 111–125).
11 Ibid. Chellappa & Sin (2005). Endnote 8-4.
12 Gootman, S. (2016). OPM hack: The most dangerous threat to the federal government today. Journal of Applied Security Research, 11(4), 517–525.
13 Kobsa, A., Cho, H., & Knijnenburg, B.P. (2016). The effect of personalization provider characteristics on privacy attitudes and behaviors: An elaboration likelihood model approach. Journal of the Association for Information Science and Technology, 67(11), 2587–2606.
14 Knijnenburg, B.P., Sivakumar, S., & Wilkinson, D. (2016). Recommender systems for self-actualization. In S. Sen & W. Geyer (Eds.), Proceedings of the 10th ACM Conference on Recommender Systems (pp. 11–14). New York: ACM.
15 Page, X., Knijnenburg, B.P., & Kobsa, A. (2013). FYI: Communication style preferences underlie differences in location-sharing adoption and usage. In F. Mattern & S. Santini (Eds.), Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (pp. 153–162). New York: ACM.
16 Teltzrow, M. & Kobsa, A. (2004). Impacts of user privacy preferences on personalized systems: A comparative study. In C.M. Karat, J. Blom, & J. Karat (Eds.), Designing Personalized User Experiences for eCommerce (pp. 315–332). Norwell, MA: Kluwer Academic Publishers.
17 Compañó, R. & Lusoli, W. (2010). The policy maker’s anguish: Regulating personal data behavior between paradoxes and dilemmas. In T. Moore, D. Pym, & C. Ioannidis (Eds.), Economics of Information Security and Privacy (pp. 169–185). Boston, MA: Springer.
18 Nissenbaum, H. (2011). A contextual approach to privacy online. Daedalus, 140(4), 32–48.
19 Kay, M. & Terry, M. (2010). Textured agreements: Re-envisioning electronic consent. In L.F. Cranor (Ed.), Proceedings of the Sixth Symposium on Usable Privacy and Security (pp. 13:1–13:13). New York: ACM.
20 Wisniewski, P., Islam, A.K.M.N., Knijnenburg, B.P., & Patil, S. (2015). Give social network users the privacy they want. In D. Cosley & A. Forte (Eds.), Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 1427–1441). New York: ACM.
21 Ibid. Compañó & Lusoli (2010). Endnote 8-16.
22 Spiekermann, S., Grossklags, J., & Berendt, B. (2001). E-privacy in 2nd generation e-commerce: Privacy preferences versus actual behavior. In M.P. Wellman & Y. Shoham (Eds.), Proceedings of the 3rd ACM Conference on Electronic Commerce (pp. 38–47). New York: ACM.
23 Ibid. Knijnenburg (2015). Endnote 8-2.
24 Ibid. Knijnenburg (2015). Endnote 8-2.

Chapter 9 Endnotes

1 For those curious about the similarities and differences between Educational Data Mining and Learning Analytics, Ryan Baker and Paul Salvador Inventado authored a detailed chapter, see: Ibid. Baker & Inventado (2014). Endnote 2-82.
2 Siemens, G. & Baker, R.S.J.D. (2012). Learning analytics and educational data mining: Towards communication and collaboration. In S.B. Shum, D. Gasevic, & R. Ferguson (Eds.), Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 252–254). New York: ACM.
3 The phrase “big learning data” is a nod to Masie, E. (Ed.) (2014). Big learning data. Alexandria, VA: American Society for Training and Development.
4 Crowder, M., Antoniadou, M., & Stewart, J. (2018). To BlikBook or not to BlikBook: Exploring student engagement of an online discussion platform. Innovations in Education and Teaching International, 1–12.
5 Mazza, R. & Dimitrova, V. (2004). Visualising student tracking data to support instructors in web-based distance education. In S. Feldman & M. Uretsky (Eds.), Proceedings of the WWW Alt. ’04: 13th International World Wide Web Conference on Alternate Track Papers and Posters. New York: ACM.
6 Arnold, K.E., & Pistilli, M.D. (2012). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 267–270). New York: ACM.
NOTE: However, readers are cautioned that Course Signals may have produced undesirable effects, possibly encouraging under-performing students to withdraw. This highlights the importance of considering, not just how to analyze data, but how to effectively use it to achieve the desired outcomes.
7 For a succinct overview of learning analytics and educational data mining, as well as several use-case examples, see: Charlton, P., Mavrikis, M., & Katsifli, D. (2013). The potential of learning analytics and big data. Ariadne. www.ariadne.ac.uk/issue/71/charlton-et-al
8 For additional examples and a useful historical account of educational data mining and learning analytics, see: Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317.
9 For a systematic review of learning analytics and educational dashboards, from scholarly articles published between 2011 and 2015, see: Schwendimann, B.A., Rodriguez-Triana, M.J., Vozniuk, A., Prieto, L.P., Boroujeni, M.S., Holzer, A., et al. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30–41.
10 “Open-learner” (or “open-student”) models provide visualizations of a system’s underlying user model to the learners themselves. Open-learner models appear to enhance outcomes (particularly for lower performing students) and increase motivation, improve self-awareness, and support self-directed learning. Robert Bodily and colleagues recently published a review comparing learning analytics dashboards with open learner models:
Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., & Verbert, K. (2018). Open learner models and learning analytics dashboards: A systematic review. In A. Pardo, K. Bartimote-Aufflick, & G. Lynch (Eds.), Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 41–50). New York: ACM.
Some other useful sources include:
Mitrovic, A., & Martin, B. (2002). Evaluating the effects of open student models on learning. In P. De Bra, P. Brusilovski, & R. Conejo (Eds.), International Conference on Adaptive Hypermedia and Adaptive Web-Based Systems (pp. 296–305). Berlin, Heidelberg: Springer.
Chou, C.Y., Tseng, S.F., Chih, W.C., Chen, Z.H., Chao, P.Y., Lai, K.R., et al. (2017). Open student models of core competencies at the curriculum level: Using learning analytics for student reflection. IEEE Transactions on Emerging Topics in Computing, 5(1), 32–44.
Brusilovsky, P., Somyürek, S., Guerra, J., Hosseini, R., & Zadorozhny, V. (2015, June). The value of social: Comparing open student modeling and open social student modeling. In F. Ricci, K. Bontcheva, O. Conlan, & S. Lawless (Eds.), User Modeling, Adaptation and Personalization. UMAP 2015. Lecture Notes in Computer Science, Vol. 9146 (pp. 44–55). Cham: Springer.
11 For background information on Kappa see: github.com/milinda/kappa-architecture.com
The original paper describing this paradigm came out of work done at LinkedIn, see: Kreps, J. (2014, July 2). Questioning the Lambda Architecture: The Lambda Architecture has its merits, but alternatives are worth exploring. O’Reilly. www.oreilly.com/ideas/questioning-the-lambda-architecture
Also, for a deeper dive, see the presentation by Martin Kleppmann at youtu.be/fU9hR3kiOK0
12 kafka.apache.org
13 The full quote is, “There are three kinds of lies: lies, damned lies, and statistics.” This is often attributed to Mark Twain who, in turn, attributes it to the British prime minister Benjamin Disraeli. It’s not entirely clear who invented the witticism, but its sentiment is obvious. For more delightful etymology about it, see: Velleman, P.F. (2008). Truth, damn truth, and statistics. Journal of Statistics Education, 16(2).

Chapter 10 Endnotes

1 Ibid. Kulik & Fletcher (2016). Endnote 2-11.
2 Raybourn, E.M., Deagle, E., Mendini, K., & Heneghan, J. (2005). Adaptive thinking and leadership simulation game training for special forces officers. In Proceedings of the I/ITSEC. Arlington, VA: National Training and Simulation Association.
3 Steenbergen-Hu, S., & Cooper, H. (2014). A meta-analysis of the effectiveness of intelligent tutoring systems on college students’ academic learning. Journal of Educational Psychology, 106(2), 331.
4 VanLehn, K. (2011). The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4), 197–221.
5 Ibid. Kulik & Fletcher (2016). Endnote 2-11.
6 Murray, T. (1999). Authoring intelligent tutoring systems: An analysis of the state of the art. International Journal of Artificial Intelligence in Education, 10, 98–129.
7 Koedinger, K.R., McLaughlin, E.A., & Stamper, J.C. (2014). Data-driven learner modeling to understand and improve online learning: MOOCs and technology to advance learning and learning research. Ubiquity, 2014(May), 3.
8 Goldberg, B., Schatz, S., & Nicholson, D. (2010). A practitioner’s guide to personalized instruction: Macro-adaptive approaches for use with instructional technologies. In D. Kaber & G. Boy (Eds.), Proceedings of the 2010 Applied Human Factors and Ergonomics Conference: Advances in Cognitive Ergonomics (pp. 735–745). Boca Raton, FL: CRC Press.
9 Young, J.R. (2018). Keystroke dynamics: Utilizing keyprint biometrics to identify users in online courses (Doctoral dissertation, Brigham Young University).
10 Beck, J.E., & Woolf, B.P. (2000, June). High-level student modeling with machine learning. In International Conference on Intelligent Tutoring Systems (pp. 584–593). Berlin, Heidelberg: Springer.
11 Sottilare, R.A., Brawner, K.W., Goldberg, B.S., & Holden, H.K. (2012). The generalized intelligent framework for tutoring (GIFT). Orlando, FL: U.S. Army Research Laboratory – Human Research & Engineering Directorate. gifttutoring.org
12 Nye, B.D. (2016). ITS, the end of the world as we know it: Transitioning AIED into a service-oriented ecosystem. International Journal of Artificial Intelligence in Education, 26(2), 756–770.
13 Folsom-Kovarik, J.T., Jones, R.M., & Schmorrow, D. (2016). Semantic and episodic learning to integrate diverse opportunities for life-long learning. In Proceedings of MODSIM World. Arlington, VA: National Training and Simulation Association.
14 Weinstein, Y., & Roediger, H.L. (2012). The effect of question order on evaluations of test performance: How does the bias evolve? Memory & Cognition, 40(5), 727–735.
15 For a review of data quality, see: Pipino, L.L., Lee, Y.W., & Wang, R.Y. (2002). Data quality assessment. Communications of the ACM, 45(4), 211–218.
For a discussion on data fairness, see also: Brun, Y., & Meliou, A. (2018). Software fairness. In Proceedings of the 26th ESEC/FSE (pp. 754–759). New York: ACM.
16 Soh, L.K., & Blank, T. (2008). Integrating case-based reasoning and meta-learning for a self-improving intelligent tutoring system. International Journal of Artificial Intelligence in Education, 18(1), 27–58.
17 Folsom-Kovarik, J.T., Wray, R.E., & Hamel, L. (2013). Adaptive assessment in an instructor-mediated system. In Proceedings of the International Conference on Artificial Intelligence in Education (pp. 571–574). Berlin, Heidelberg: Springer.
18 Krening, S., Harrison, B., Feigh, K.M., Isbell, C.L., Riedl, M., & Thomaz, A. (2017). Learning from explanations using sentiment and advice. IEEE Transactions on Cognitive and Developmental Systems, 9(1), 44–55.
19 Aleven, V., Xhakaj, F., Holstein, K., & McLaren, B.M. (2016). Developing a teacher dashboard for use with intelligent tutoring systems. In IWTA@EC-TEL (pp. 15–23). New York: ACM.
20 Czarkowski, M., & Kay, J. (2006). Giving learners a real sense of control over adaptivity, even if they are not quite ready for it yet. In M. Czarkowski & J. Kay (Eds.), Advances in Web-based education: Personalized learning environments (pp. 93–126). Hershey, PA: IGI Global.
21 Schmorrow, D., Nicholson, D., Lackey, S.J., Allen, R.C., Norman, K., & Cohn, J. (2009). Virtual reality in the training environment. In P.A. Hancock, D.A. Vincenzi, J.A. Wise, & M. Mouloua (Eds.), Human Factors in Simulation and Training. Boca Raton, FL: CRC Press.

ENDNOTES FOR SECTION 3 (LEARNING SCIENCE)

Chapter 11 Endnotes

1 NOTE: The views expressed in this chapter are entirely those of the author, a contractor with Metis Solutions, and do not necessarily reflect the views, policy, or position of the United States Government, Department of Defense, United States Special Operations Command, or the Joint Special Operations University.
2 Shute, V. & Ventura, M. (2013). Stealth assessment: Measuring and supporting learning in video games. Boston, MA: MIT Press.
3 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Abingdon, UK: Routledge. See page 24.
4 Hatfield, S. (2009). Assessing your program-level assessment plan. Number 45 in IDEA paper series. Manhattan, KS: IDEA Center. See page 1.
5 Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144. See page 121.
6 Sadler, D.R. (2010). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550.
7 Hattie, J. & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
8 For examples of stealth assessment, see:
Shute, V. & Spector, J.M. (2008). SCORM 2.0 white paper: Stealth assessment in virtual worlds. Unpublished manuscript.
Ibid. Shute & Ventura (2013). Endnote 11-2.
9 Thille, C. (2015, November 19). Big data, the science of learning, analytics, and transformation of education [Video file]. Presented at the mediaX conference Platforms for Collaboration and Productivity, Stanford University. youtu.be/cYqs0Ei2tFo
10 Ibid. Thille (2015). Endnote 11-9.
11 Dron, J. (2007a). Designing the undesignable: Social software and control. Educational Technology and Society, 10(3), 60–71.
Dron, J. (2007b). The teacher, the learner and the collective mind. AI & Society, 21, 200–216.
12 Ibid. Dron (2007a). Endnote 11-11. See page 61.
13 Ibid. Hattie & Timperley (2007). Endnote 11-7.
14 Scott, S. (2013). Practicing what we preach: Towards a student-centered definition of feedback. Teaching in Higher Education, 19(1), 49–57.
See also Ibid. Dron, J. (2007a & b). Endnote 11-11.
15 Boud, D. & Molloy, E. (2013a). Feedback in higher and professional education: Understanding it and doing it well. Abingdon: Routledge.
Boud, D. & Molloy, E. (2013b). Rethinking models of feedback for learning: The challenge of design. Assessment & Evaluation in Higher Education, 38(6), 698–712.
16 Carey, B. (2014). How we learn: The surprising truth about when, where, and why it happens. New York: Random House.
17 Wiggins, G. & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.
18 Michael, N. & Libarkin, J. (2016). Understanding by design: Mentored implementation of backward design methodology at the university level. Bioscene, 42(2), 44–52.
Reynolds, H. & Kearns, K. (2017). A planning tool for incorporating backward design, active learning, and authentic assessment in the college classroom. College Teaching, 65(1), 17–27.
19 Ibid. Boud & Molloy (2013b). Endnote 11-15.
Hattie & Timperley (2007). Endnote 11-7.
20 Wiliam, D. (2018). Embedded formative feedback. Bloomington, IN: Solution Tree Press. See page 29.

Chapter 12 Endnotes

1 Siemens, G. (2006). Connectivism: Learning theory or pastime of the self-amused? Unpublished manuscript.
2 Del Moral-Pérez, E., Cernea, A., & Villalustre, L. (2013). Connectivist learning objects and learning styles. Interdisciplinary Journal of e-Skills and Lifelong Learning, 9, 105–124.
Siemens, G. (2008). Learning and knowing in networks: Changing roles for educators and designers. ITFORUM for Discussion, 27, 1–26.
Ibid. Siemens (2006). Endnote 12-1.
3 Wiggins, G., & McTighe, J. (1998). What is backward design? Understanding by Design, 1, 7–19.
4 Akdeniz, C. (2016). Instructional strategies. In C. Akdeniz (Ed.), Instructional Process and Concepts in Theory and Practice (pp. 57–105). Singapore: Springer.
5 Jonassen, D.H., Grabinger, R.S., & Harris, N.D.C. (1990). Analyzing and selecting instructional strategies and tactics. Performance Improvement Quarterly, 3(2), 29–47.
6 Refer to Chapter 11 in this volume.
7 Marr, D. (1982). Vision. San Francisco: W.H. Freeman. See pages 19–20.
8 Ibid. Hattie, J. (2009). Endnote 11-3.
9 Ertmer, P.A., & Newby, T.J. (2013). Behaviorism, cognitivism, constructivism: Comparing critical features from an instructional design perspective. Performance Improvement Quarterly, 26(2), 43–71.
10 Ibid. Del Moral-Pérez et al. (2013). Endnote 12-2.
11 Dabbagh, N., Marra, R.M., & Howland, J.L. (2019). Meaningful online learning: Integrating strategies, activities and learning technologies for effective designs. New York: Routledge.
12 Ibid. Dabbagh et al. (2019). Endnote 12-11.

Chapter 13 Endnotes

1 Air Force Instruction 36-2201, Air Force Training Program, see paragraph 4.1.2.
2 Bloom’s taxonomy identified the following general categories associated with mastery in the cognitive domain (starting with the simplest and moving to the most complex): knowledge, comprehension, application, analysis, synthesis, and evaluation.
Bloom, B.S., Engelhart, M.D., Furst, E.J., Hill, W.H., & Krathwohl, D.R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: David McKay Company.
3 Doyle, T. (2008). Helping students learn in a learner-centered environment: A guide to facilitating learning in higher education. Sterling, VA: Stylus.
4 See the Direct Assessment: Competency-Based Educational Programs policy from the Southern Association of Colleges and Schools Commission on Colleges. www.sacscoc.org/pdf/081705/DirectAssessmentCompetencyBased.pdf
5 U.S. Office of Personnel Management (n.d.). Assessment & selection: Competencies. Retrieved January 13, 2019 from www.opm.gov
6 U.S. Air Force (2014). Institutional competency development and management (Air Force Manual 36-2647). www.e-publishing.af.mil
7 Bramante, F., & Colby, R. (2012). Off the clock: Moving education from time to competency. Thousand Oaks, CA: Corwin. See page 65.
8 Spencer, L. & Spencer, S.M. (1993). Competence at work: Models for superior performance. New York: John Wiley & Sons.
9 Lucia, A.D., & Lepsinger, R. (1999). The art and science of competency models: Pinpointing critical success factors in organizations. San Francisco: Jossey-Bass/Pfeiffer.
10 Ibid. Bramante & Colby (2012). Endnote 13-7.
11 Ward, S.C. (2016, February 1). Let them eat cake (competently). Inside Higher Education. www.insidehighered.com
12 Hollenbeck, G. & Morgan, M. (2013). Competencies, not competencies: Making global executive development work. Advances in Global Leadership (pp. 101–119). Emerald Group Publishing Limited.
13 U.S. Department of Energy (2013). U.S. Department of Energy leadership development programs 2013–2014: Readings by executive core qualifications. www.opm.gov
14 Krauss, S.M. (2017). How competency-based education may help reduce our nation’s toughest inequities (Issue Paper). Indianapolis: Lumina Foundation. www.luminafoundation.org
15 www.luminafoundation.org/priorities
16 Voorhees, R.A. (2001). Competency-based learning models: A necessary future. New Directions for Institutional Research, 2001(110), 5–13.

Chapter 14 Endnotes

1 Stodd, J. (2015, October 30). An introduction to scaffolded social learning [blog post]. Julian Stodd’s Learning Blog. Retrieved August 28, 2018 from julianstodd.wordpress.com
2 Foster, R.E. & Fletcher, J.D. (2002). Computer-based aids for learning, job performance, and decision-making in military applications: Emergent technology and challenges (IDA Document D-2786). Alexandria, VA: Institute for Defense Analyses.
3 Raybourn, E., Schatz, S., Vogel-Walcutt, J.J., & Vierling, K. (2017). At the tipping point: Learning science and technology as key strategic enablers for the future of defense and security. In Proceedings of the I/ITSEC. Arlington, VA: National Training and Simulation Association.
4 Stodd, J. (2018, July 10). Context of the Social Age [blog post]. Julian Stodd’s Learning Blog. Retrieved August 28, 2018 from julianstodd.wordpress.com
5 The Economist (2017, January 14). The return of the MOOC: Established education providers v new contenders. The Economist. Retrieved February 2, 2018 from www.economist.com
6 Stodd, J. & Reitz, E.A. (2016). Black swans and the limits of hierarchy. In Proceedings of the I/ITSEC. Arlington, VA: National Training and Simulation Association.
7 Brafman, O., & Beckstrom, R.A. (2006). The starfish and the spider: The unstoppable power of leaderless organizations. Penguin.
8 Stodd, J. (2017, January 17). 10 tips for designing effective social learning [blog post]. Julian Stodd’s Learning Blog. Retrieved August 28, 2018 from julianstodd.wordpress.com
9 Ibid. Stodd (2015). Endnote 14-1.
10 O’Neil, H.F., Perez, R.S., & Baker, E.L. (Eds.) (2014). Teaching and measuring cognitive readiness. New York, NY: Springer.
11 Jarche, H. (2014, January). The seek-sense-share framework. Inside Learning Technologies. jarche.com/2014/02/the-seek-sense-share-framework
12 St. Clair, R.N., Thome-Williams, A.C., & Su, L. (2005). The role of social script theory in cognitive blending. In M. Medina & L. Wagner (Eds.), Intercultural Communication Studies, 15(1), 1–7.

Chapter 15 Endnotes

1 Yarnall, L., Remold, J., & Shechtman, N. (2018, October). Developing employability skills: Harvesting ideas from the field. Presentation at the annual principal investigators’ conference of the Advanced Technological Education program, National Science Foundation, Washington, DC.
2 Winters, F.I., Greene, J.A., & Costich, C.M. (2008). Self-regulation of learning within computer-based learning environments: A critical analysis. Educational Psychology Review, 20(4), 429–444.
3 Azevedo, R. (2014). Issues in dealing with sequential and temporal characteristics of self- and socially-regulated learning. Metacognition and Learning, 9, 217–228.
4 Marsick, V.J., & Watkins, K.E. (2001). Informal and incidental learning. New Directions for Adult and Continuing Education, 2001(89), 25–34.
5 Hu, H., & Driscoll, M.P. (2013). Self-regulation in e-learning environments: A remedy for community college? Educational Technology & Society, 16(4), 171–184.
Sitzmann, T., & Ely, K. (2011). A meta-analysis of self-regulated learning in work-related training and educational attainment: What we know and where we need to go. Psychological Bulletin, 137(3), 421.
6 Zimmerman, B.J. (1990). Self-regulating academic learning and achievement: The emergence of a social cognitive perspective. Educational Psychology Review, 2, 173–201.
7 Ibid. Sitzmann & Ely (2011). Endnote 15-5.
8 Bandura, A. (1991). Social cognitive theory of self-regulation. Organizational Behavior and Human Decision Processes, 50, 248–287.
Rotter, J.B. (1990). Internal versus external control of reinforcement: A case history of a variable. American Psychologist, 45(4), 489.
9 Zimmerman, B.J. (2000). Attaining self-regulation: A social-cognitive perspective. In M. Boekaerts & P.R. Pintrich (Eds.), Handbook of Self-Regulation (pp. 13–39). New York: Academic Press.
10 For example, see: Moos, D.C. & Azevedo, R. (2008). Exploring the fluctuation of motivation and use of self-regulatory processes during learning with hypermedia. Instructional Science, 36(3), 203–231.
11 Baker, L. & Brown, A.L. (1984). Metacognitive skills and reading. In P.D. Pearson (Ed.), Handbook of reading research (pp. 353–394). Mahwah, NJ: Erlbaum.
12 Zimmerman, B.J. (1998). Developing self-fulfilling cycles of academic regulation: An analysis of exemplary instructional models. In D.H. Schunk & B.J. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice. New York, NY: Guilford Press.
Zimmerman, B.J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P.R. Pintrich & M. Zeidner (Eds.), Handbook of Self-Regulation. San Diego: Academic Press.
13 Pintrich, P.R. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (ERIC No. ED338122). Ann Arbor, MI: National Center for Research to Improve Postsecondary Teaching and Learning. eric.ed.gov/?id=ED338122
14 Lan, W.Y., Bremer, R., Stevens, T., & Mullen, G. (2004, April). Self-regulated learning in the online environment. Paper presented at the annual meeting of the American Educational Research Association, San Diego, California.
15 Zimmerman, B.J., & Pons, M.M. (1986). Development of a structured interview for assessing student use of self-regulated learning strategies. American Educational Research Journal, 23(4), 614–628.
16 González-Torres, M.C., & Torrano, F. (2008). Methods and instruments for measuring self-regulated learning. In Handbook of Instructional Resources and their Applications in the Classroom (pp. 201–219). New York, NY: Nova Science.
17 Organisation for Economic Co-operation and Development (2000). Literacy in the information age: Final report of the International Adult Literacy Survey. Paris: OECD. www.oecd.org
18 Ibid. Chi & Wylie (2014). Endnote 3-10.
19 Ibid. OECD (2000). Endnote 15-17.
20 Freed, M., Yarnall, L., Spaulding, A., & Gervasio, M. (2017). A mobile strategy for self-directed learning in the workplace. In Proceedings of the I/ITSEC. Arlington, VA: National Training and Simulation Association.

ENDNOTES FOR SECTION 4 (ORGANIZATION)

Chapter 16 Endnotes

1 Januszewski, A. & Molenda, M. (2008). Educational technology: A definition with commentary. New York: Taylor & Francis Group.
2 Glover, J. and Ronning, R. (1987). Historical foundations of educational psychology. New York: Plenum Press.
3 For example, see:
Dick, W., & Carey, L. (1978). The systematic design of instruction (1st ed.). Chicago: Scott, Foresman and Company.
Ibid. Dick, Carey, & Carey (2001). Endnote 3-23.
4 Ibid. Glover and Ronning (1987). Endnote 16-2.
5 Gustafson, K.L., & Branch, R.M. (1997). Revisioning models of instructional development. Educational Technology Research and Development, 45(3), 73–89.
6 Gagné, R.M., Wager, W.W., Golas, K.C., & Keller, J.M. (2005). Principles of instructional design (5th ed.). Belmont, CA: Wadsworth/Thomson Learning.
7 See, for example:
Darling-Hammond, L. (2005). Teaching as a profession: Lessons in teacher preparation and professional development. Phi Delta Kappan, 87(3), 237–240.
Gage, N.L. (1989). The paradigm wars and their aftermath: A “historical” sketch of research on teaching since 1989. Educational Researcher, 18(7), 4–10.
Seidel, T., & Shavelson, R.J. (2007). Teaching effectiveness research in the past decade: The role of theory and research design in disentangling meta-analysis results. Review of Educational Research, 77(4), 454–499.
8 See, for example:
Angelo, T.A., & Cross, K.P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass, Inc.
Shulman, L.S. (2004). Visions of the possible: Models for campus support of the scholarship of teaching and learning. In W.E. Becker & M.L. Andrews (Eds.), The Scholarship of Teaching and Learning in Higher Education: Contributions of Research Universities (pp. 9–24). Bloomington, IN: Indiana University Press.
Sorcinelli, M.D., Austin, A.E., Eddy, P.L. & Beach, A.L. (2006). Creating the future of faculty development. San Francisco: Jossey-Bass, Inc.
9 Campbell, K., Schwier, R.A., & Kenny, R.F. (2009). The critical, relational practice of instructional design in higher education: An emerging model of change agency. Educational Technology Research and Development, 57(5), 645–663.
10 Rothwell, W.J. & Kazanas, H.C. (1998). Mastering the instructional design process: A systematic approach (2nd ed.). San Francisco: Jossey-Bass/Pfeiffer.
11 McDonald, J.K. (2011). The creative spirit of design. TechTrends, 55(5), 53–57.
12 Cross, N. (1982). Designerly ways of knowing. Design Studies, 3(4), 221–227. See page 224.
13 As quoted from Willcox, K.E., Sarma, S., & Lippel, P. (2016). Online education: A catalyst for higher education reforms. Cambridge: MIT.
For the original, see: Simon, H.A. (1967). Job of a college president. Educational Record, 48(1), 68–78. Washington, D.C.: American Council on Education.
14 Dede, C., Richards, J., & Saxberg, B. (2018). Learning engineering for online education: Theoretical contexts and design-based examples. New York: Routledge.
15 Ibid. Dede et al. (2018). Endnote 16-14.
16 Ibid. Dede et al. (2018). Endnote 16-14. See page 29.
17 For more information, see: www.opm.gov/policy-data-oversight/classification-qualifications/general-schedule-qualification-standards/
18 Dede, C. (2018, October 19). The 60 year curriculum: Developing new educational models to serve the agile labor market. The EvoLLLution. evolllution.com (Included with permission.)

Chapter 17 Endnotes

1 Gall, J. (1975). General systemantics: An essay on how systems work, and especially how they fail, together with the very first annotated compendium of basic systems axioms: A handbook and ready reference for scientists, engineers, laboratory workers, administrators, public officials, systems analysts, etc., etc., etc., and the general public. General Systemantics Press. See page 71.

Chapter 18 Endnotes

1 Godin, S. (2016, March 6). Stop stealing dreams [blog post]. Medium. medium.com
2 Bordia, P., Hunt, E., Paulsen, N., Tourish, D., & DiFonzo, N. (2004). Uncertainty during organizational change: Is it all about control? European Journal of Work and Organizational Psychology, 13(3), 345–365.
3 See, for example:
Lane, I.F. (2007). Change in higher education: Understanding and responding to individual and organizational resistance. Journal of Veterinary Medical Education, 34(2), 85–92.
Zell, D. (2003). Organizational change as a process of death, dying, and rebirth. The Journal of Applied Behavioral Science, 39(1), 73–96.
4 Clarke, J.S., Ellett, C.D., Bateman, J.M., & Rugutt, J.K. (1996). Faculty receptivity/resistance to change, personal and organizational efficacy, decision deprivation and effectiveness in Research 1 universities (ERIC No. ED402846). Paper presented at the Annual Meeting of the Association for the Study of Higher Education, Memphis, TN. eric.ed.gov/?id=ED402846
5 Riley, W. (1989). Understanding that resistance to change is inevitable [Monograph]. Managing Change in Higher Education, 5, 53–66.
6 Susskind, R., & Susskind, D. (2016, October 11). Technology will replace many doctors, lawyers, and other professionals. Harvard Business Review. hbr.org
7 Bordia, P., Hobman, E., Jones, E., Gallois, C., & Callan, V.J. (2004). Uncertainty during organizational change: Types, consequences, and management strategies. Journal of Business and Psychology, 18(4), 507–532.
8 Ibid. Lane (2007). Endnote 18-3.
9 Mulholland, B. (2017, July 14). 8 critical change management models to evolve and survive. Process.st. www.process.st
10 Sinek, S. (2011). Start with why: How great leaders inspire everyone to take action. New York: Portfolio/Penguin.
11 Gawande, A. (2010). The checklist manifesto: How to get things right. New York: Henry Holt.
See also: Shane Parrish’s interview of Dr. Atul Gawande on “The Learning Project with Shane Parrish” [podcast]. (2018, October 2).
12 Teller, A. (2016, April 20). Celebrating failure fuels moonshots [audio blog interview]. Retrieved October 8, 2018 from ecorner.stanford.edu/podcast/celebrating-failure-fuels-moonshots/
13 This truism is often attributed as an “African proverb,” which is most likely inaccurate and certainly overly unspecific. It’s been often quoted by various well-known speakers, including by Al Gore when accepting his Nobel Peace Prize; yet, it’s unclear where the saying originally derives from. Regardless, it’s still relevant, no matter the source! (jezebel.com/on-the-origin-of-certain-quotable-african-proverbs-1766664089)

Chapter 19 Endnotes

1 For example, see: Allen, I.E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. Babson Survey Research Group.
2 Statista (2018, December). Total training expenditures in the United States from 2012 to 2018 (in billion U.S. dollars). www.statista.com/statistics/788521/training-expenditures-united-states/
3 Epstein, E.A. (2010, April 11). Chew on this: Inside McDonald’s Hamburger University—the “Harvard of the fast food biz.” The Daily Mail. www.dailymail.co.uk
4 McDonald’s (n.d.). Hamburger University. corporate.mcdonalds.com
5 Starbucks (n.d.). Future leaders start here. www.starbucks.com
6 Bureau of Labor Statistics (2017, August 24). Number of jobs, labor market experience, and earnings growth among Americans at 50: Results from a longitudinal survey (USDL-17-1158). www.bls.gov
7 Bureau of Labor Statistics (2018, September 20). Employee tenure summary (USDL-18-1500). www.bls.gov
8 Aldrich, C. (2003). Simulations and the future of learning. San Francisco: Wiley. See page 7.
9 Loughran, D.S. (2014). Why is veteran unemployment so high? Santa Monica, CA: RAND Corporation. www.rand.org
10 Snyder, T.D. (2018). Mobile digest of education statistics, 2017 (NCES 2018-138). U.S. Department of Education. Washington, DC: National Center for Education Statistics. nces.ed.gov/pubs2018/2018138.pdf
11 Ibid. Blackman et al. (2016). Endnote 6-10.
12 Zimmerman, B.J. & Dibenedetto, M.K. (2008). Mastery learning and assessment: Implications for students and teachers in an era of high-stakes testing. Psychology in the Schools, 45(3), 206–216.
13 Rosenberg, M. (2014, October 14). Marc my words: In learning and performance ecosystems, the whole is greater than the sum of the parts. Learning Solutions. www.learningsolutionsmag.com
14 U.S. OPM (n.d.). HR line of business: HC business reference model. www.opm.gov
15 Ibid. National Academies (2018). Endnote 1-4.