Challenge: Learning Seen As A Cost Without A Defined Benefit
The conversations between Chief Learning Officers (CLOs) and business leaders have changed in
just the past few years. Learning organizations have always contributed to the success of a business strategy
by imparting needed information to employees about new processes, technology, techniques, business skills,
or professional development for next-generation leaders. In some cases learning's benefit is presumed: the
cost isn't considered, and the value is assumed to be intrinsic. In most cases, however, learning's contribution
is seen as a cost without a well-defined benefit. That is often the result of learning professionals trying to
measure whether new learning occurred without measuring the benefit of learning new things.
Learning results reporting has typically measured how good the training was through the eyes of the
participants, traditionally at four levels.
1. Level 1, Measures The Learner's Reaction To The Training
Did the learner like the training environment? Did the training accomplish the stated objectives? Almost all
learning organizations that deliver training seek this type of data, which can be loosely associated with a
customer satisfaction measurement.
2. Level 2, Measures What The Learner Learned In The Training Environment
This is a measure of instructional design and facilitation integrity: Did the material and the instructor teach
the student what the student needed to know/do or how to act?
3. Level 3, Gauges The Extent To Which Workplace Behavior Changes And Indicates Skill Acquisition
Did the learner translate the new skills, knowledge or behavior to the job? Is the learner still
performing with the new knowledge, skills or behavior some 60, 90 or even 120 days later? Often the
learning organization gathers this data by reaching out to the students' managers for feedback. This
level measures the effectiveness of the training as it relates to how closely the training modeled the work,
processes, tools and instructions the student encounters on the job.
4. Level 4, Measures Business Results
These results are typically reported at a high level, without defining what learning could have and should
have contributed to specific business goals.
A more complete measurement approach follows five steps.
1. Formalize measurement
In the learning planning stage, meet with your organization's business leaders to learn the skills their teams
need in order to achieve specific business goals. Define learning impacts by the organization's business
metrics.
2. Assign roles and responsibilities
Identify measurement sponsors, champions, team leaders, stakeholders, support organizations, etc.
3. Define scope
Confirm regional and global measurement goals and responsibilities.
4. Document results
Track results by the agreed business metrics; automate reports; provide predictive measures that point to
business opportunities or flag risks; communicate in a business language that resonates with business
leaders.
5. Automate
Understand where the information behind your reports and dashboards is stored and retrieved, and
streamline those processes.
While this is the ideal state of measurement, there is wide divergence from company to company,
and even within one learning organization, on how learning measurement is done in terms of tools,
methods, and techniques. In fact, more than half of today's learning professionals are dissatisfied with how
they are measuring learning, from access to success metrics to the resources spent on tracking those metrics.
M3 offers a measurement improvement strategy by:
1. Requiring learning organizations to clearly define their success criteria, mapped to the business goals
they support
2. Benchmarking the learning organization's current measurement practices against those criteria
3. Prescribing specific next steps to improve the learning organization's measurement capabilities and
maturity
As the learning organization's measurement maturity level increases, it will be able to evolve from
reactive measurements (traditional training impacts) to actionable measurements (linking learning results to
business metrics) to proactive measurements (predictive analytics pointing to business opportunities as well
as risk areas the company or organization should focus on).
M3 comprises five stages, following the standard Capability Maturity Model (CMM) approach.
1. Initial Stage
The Initial Stage is typically evidenced by a paper-based evaluation system administered locally and
without global governance. Often this evaluation system is inconsistent in its application. Work is ad hoc
and not well organized or systematic. In some cases, processes are performed differently within the same
organization due to geographic dispersion and autonomy of the learning function in other regions. There is
no capability of pulling information together quickly or creating reports without a great deal of manual
manipulation. Metrics used in this stage are primarily training-impact measurements focused on learning
operations (traditional Level 1 and 2 measurements). This sort of measurement strategy makes it nearly
impossible to aggregate training information across an organization for the purpose of determining the
health of training activity or making informed management decisions.
2. Repeatable Stage
Organizations in the Repeatable Stage are doing the same thing the same way with some rigor and
consistency. Tools may be electronic instead of paper-based and there is an element of process discipline.
Examples include measurement methodologies such as Metrics That Matter, survey deployment tools,
and homegrown data marts for data collection. For global learning organizations, the goal in this stage is to
create consistent processes for pulling data and generating basic reports across all regions. Organizations at
this stage find it difficult to provide solid business metrics, other than efficiency measures such as how
many, how often, and how much. When it comes to customer satisfaction, often the inconsistency in
measurement methodology or the variation in survey instruments makes comparison of customer sentiment
against any benchmark difficult.
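To make the efficiency measures concrete, the sketch below tallies "how many, how often, and how much" from a handful of training delivery records. It is only an illustration in Python with hypothetical field names and figures, not a prescribed M3 artifact.

```python
from collections import Counter
from datetime import date

# Hypothetical export of training delivery records (illustrative data only).
records = [
    {"course": "Sales Fundamentals", "date": date(2024, 3, 14), "learners": 25, "cost": 4200.0},
    {"course": "Sales Fundamentals", "date": date(2024, 4, 11), "learners": 19, "cost": 3900.0},
    {"course": "New Manager Basics", "date": date(2024, 4, 20), "learners": 30, "cost": 5100.0},
]

# "How many": learners trained and sessions delivered.
total_learners = sum(r["learners"] for r in records)
total_sessions = len(records)

# "How often": sessions delivered per course.
sessions_per_course = Counter(r["course"] for r in records)

# "How much": total delivery cost and cost per learner.
total_cost = sum(r["cost"] for r in records)
cost_per_learner = total_cost / total_learners

print(f"Learners trained: {total_learners}, sessions delivered: {total_sessions}")
print(f"Sessions per course: {dict(sessions_per_course)}")
print(f"Total cost: ${total_cost:,.0f}, cost per learner: ${cost_per_learner:,.2f}")
```

Even this simple aggregation only works reliably once records are captured the same way across regions, which is the point of the Repeatable Stage.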
3. Defined Stage
Organizations next move into the Defined Stage, starting with metrics definition. Extending beyond
traditional training impact measures, learning organizations collaborate with business leaders to understand
their organizational business goals and identify the skills needed to get to these goals. They then develop
learning strategies and programs and define metrics in the context of the business goals.
Measurement application also becomes formalized. Measurement champions and sponsors
are
assigned and announced to the organization. Data sources are identified and rationalized where needed.
Data marts are created for easy data retrieval and report generation.
Measurement processes as well as strategies are properly documented. Learning organizations in
this stage may enhance traditional Levels 3 and 4 training impact measures
by adopting more
comprehensive standards such as Talent Development Reporting Principles (TDRp). Applicable TDRp
reporting templates include:
a. Efficiency Statement
Reports on the organization's activities and investment in learning and development. Examples
include the number of learners attending training, the number of programs delivered, and the
cost to produce training.
b. Effectiveness Statement
Reports on how well learning activities contribute to the business outcomes. Examples include
alignment to the organization's goals, quality of content and delivery, application to job, and
business impact.
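As a rough sketch of how the two statements differ, the Python below groups the example metrics from items a and b into two simple structures; the field names and figures are hypothetical, not part of the TDRp specification.

```python
from dataclasses import dataclass

@dataclass
class EfficiencyStatement:
    # Item a: activity and investment in learning and development.
    learners_attending: int
    programs_delivered: int
    cost_to_produce_training: float

@dataclass
class EffectivenessStatement:
    # Item b: how well learning activities contribute to business outcomes.
    aligned_business_goal: str
    content_and_delivery_quality: float  # e.g. average evaluation score
    application_to_job_rate: float       # share of learners applying skills on the job
    estimated_business_impact: str

# Hypothetical quarter-end figures for one business unit.
efficiency = EfficiencyStatement(learners_attending=480,
                                 programs_delivered=12,
                                 cost_to_produce_training=95_000.0)
effectiveness = EffectivenessStatement(aligned_business_goal="Increase sales by 10%",
                                       content_and_delivery_quality=4.4,
                                       application_to_job_rate=0.72,
                                       estimated_business_impact="~1.2% incremental sales")
print(efficiency)
print(effectiveness)
```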
Learning professionals at this stage will also start estimating and isolating the business impact of
training. The first step is for the learning organization, typically a learning consultant, to have
planning sessions with the business unit leader and understand the top business goals. For
example, the business leader may have a goal to increase sales by 10%. The next step is for the
business leader to list the key types of input that will have an impact on sales results. Examples
of such input may include new product launches, marketing programs, sales incentive changes,
sales programs, manager realignment, and sales training. The learning consultant then works
with the business leader to estimate the relative impact of each input, resulting in a table that
might look like this: (Figure)
In this example, to achieve 10% sales growth with sales training estimated to make a 12% impact
on the results, we can calculate the actual incremental sales expected to be brought along by
training: 10% x 12% = 1.2%. The table can be updated as follows: (Figure)
To wrap up this example, the learning organization will most likely use surveys and interview
selected learners to estimate the incremental sales as an outcome of the training. This will be the
beginning of a formal approach to evaluating business impact brought along by training.
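A minimal sketch of the attribution arithmetic described above, in Python: the list of inputs mirrors the example, but every percentage other than the 12% attributed to sales training is a hypothetical placeholder the business leader would supply.

```python
# Business goal: 10% sales growth. Each input is assigned an estimated share of
# that result; the shares are the business leader's estimates and must sum to 100%.
sales_growth_goal = 0.10

estimated_contribution = {
    "new product launches":    0.30,
    "marketing programs":      0.20,
    "sales incentive changes": 0.15,
    "sales programs":          0.13,
    "manager realignment":     0.10,
    "sales training":          0.12,
}

assert abs(sum(estimated_contribution.values()) - 1.0) < 1e-9, "shares must total 100%"

# Incremental sales growth attributed to each input = goal x estimated share.
for source, share in estimated_contribution.items():
    print(f"{source:25s} {share:5.0%} of result -> {sales_growth_goal * share:.1%} sales growth")

# For sales training: 10% x 12% = 1.2% incremental sales growth.
training_share = estimated_contribution["sales training"]
print(f"Training-attributed growth: {sales_growth_goal * training_share:.1%}")
```

The assertion guards the key assumption in this method: that the estimated shares account for the whole result.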
Unfortunately, many organizations become satisfied with this level of measurement and end their
pursuit at the Defined Stage, not taking into account the business value of the learning.
4. Managed Stage
At the Managed Stage of the Maturity Model, we see the first comprehensive, automated learning
dashboard: business intelligence presented in a dashboard format, linking learning results to business
performance. Business leaders can review the dashboard easily and make decisions based on its insights.
Dashboards may exist in earlier stages, but they are typically updated manually, consuming
significant amounts of learning professionals' time and effort, and they are prone to human error. The dashboard
in the Managed Stage is dynamically and automatically updated directly from source data. A data mart
connects different databases and enables complex reporting. Rich-content reports also become possible,
allowing business leaders to push different buttons to change their view of the reports, apply
different organizational or audience filters, or zoom in on different business metrics and associated learning
results.
Reports in the Managed Stage will provide diagnostic analyses, laying out clearly what the data
means and whether there are business risks or opportunities for the
organization. In addition, success case methodology may be used to understand how specific learning
programs have produced targeted, measurable business results for specific organizations, thus projecting a
return on learning investment. In this stage, learning organizations should have solid data to produce a
business outcome report for various business units using the TDRp template:
Business Outcome Statement
Reports on an organization's desired business results and learning activities' impact on those results.
Examples: revenue, market share, quality, and cost reduction. In addition to estimating business impact as
described in the Defined Stage, learning organizations will have better established and more comprehensive
data points to validate business results. For example, CLOs may now have access to a data feed from the
finance department, allowing them to retrieve actual revenue and gross margin results. (Figure)
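As a rough illustration of how such a finance data feed could be used to validate estimates, the sketch below compares actual revenue growth with the share attributed to sales training in the earlier example; the feed format, figures, and attribution share are hypothetical.

```python
# Hypothetical monthly revenue feed from the finance department (actuals).
finance_feed = [
    {"month": "2024-01", "revenue": 10_000_000.0},
    {"month": "2024-02", "revenue": 10_400_000.0},
    {"month": "2024-03", "revenue": 10_900_000.0},
]

baseline_revenue = finance_feed[0]["revenue"]
latest_revenue = finance_feed[-1]["revenue"]

# Actual growth to date, to compare against the 10% goal.
actual_growth = (latest_revenue - baseline_revenue) / baseline_revenue

# Apply the agreed attribution share for sales training (12% in the earlier example)
# to estimate the revenue uplift attributable to training.
training_share = 0.12
training_attributed_revenue = (latest_revenue - baseline_revenue) * training_share

print(f"Actual growth to date: {actual_growth:.1%}")
print(f"Revenue uplift attributed to training: ${training_attributed_revenue:,.0f}")
```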
Transitioning from the Defined Stage to the Managed Stage is the most challenging, as linking
learning results to business performance is no easy task. The business intelligence yielded, however, is of
significant value.
5. Optimizing Stage
The final stage of maturity is the Optimizing Stage. All measurement data is captured in one of the
three reporting statements: Efficiency, Effectiveness, and Business Outcome.
Efficiency and Effectiveness reports, while they are automated and supplied via push button
reporting, rely on static data and represent only a snapshot in time. They are often used as day-to-day
management tools to help the learning organization improve its operational efficiencies and training
quality.
Business Outcome reports, on the other hand, demonstrate the value and contribution of learning to
the overall business goals, written in business language. Most importantly, measurement is no longer simply
retrospective but also proactive. For example, evaluation benchmark alarms can be set to alert designated
learning leaders if an organization's training scores drop below benchmark, before a
negative business impact is even detected. Learning leaders can then investigate the problems and take
immediate remedial action. The report could also project budget performance based on the burn rate of training
development costs, so learning leaders can proactively readjust development priorities, methods, and spend
patterns (a minimal sketch of such automated reporting and alerting appears after the five-step list below). At
this advanced stage, the learning measurement process is formalized into the following five steps.
1. Formalize measurement
In the learning planning stage, agree with business leaders on the skills their organizations need to
achieve specific business goals. Define learning impact by the organization's business metrics: for example,
a percentage increase in sales, an increase of revenue in dollars, or a reduction in employee turnover among
critical talent.
2. Assign roles and responsibilities
Identify measurement sponsors, champions, team leaders, stakeholders, support organizations, etc. This
might sound like a simple thing to do, but it takes some thought. You need to decide which person fits which
role and be able to sell them on why the effort is important and why they are critical to that role. Because
this may create additional work for them and they may have other priorities, you must be prepared to
logically address any concerns they might have and help them see the value to them in this initiative.
3. Define scope
Confirm regional and global measurement goals and responsibilities. To do this, identify those in the
organization who support learning and make them the point of data collection, or the funnel through which
information reaches you. This can be an additional duty, or the person may have more of a dotted-line role,
providing reports via templates or input forms.
4. Document results
Track results by the agreed business metrics. Automate reports; typically this is done through the use of
database tools, which could be as simple as Microsoft Access or as sophisticated as commercial databases
such as SQL Server or Oracle (a minimal sketch of such an automated report follows this list). The idea is to
have the capability to run a report and send it to stakeholders. The more accurately you can display results,
the more credible you become. Provide predictive measures that point to business opportunities or flag
risks. Communicate in the business language that resonates with business leaders.
5. Automate
Understand where you store and retrieve the information required to create reports and dashboard views.
Make sure you automate clean processes with an eye to streamlining work, not just automating for the sake
of automation. Start with a "what if" mindset: What if I could store data for stakeholders and they could see
it displayed graphically on their phone, tablet, or laptop? How would I go about that?
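Tying together step 4 above and the alarm triggers described earlier in this stage, the sketch below pulls the agreed metrics from a small SQLite database (standing in for Access, SQL Server, Oracle, or a data mart) and flags any evaluation score that falls below a benchmark; the schema, data, and threshold are hypothetical.

```python
import sqlite3

BENCHMARK = 4.0  # hypothetical minimum acceptable average evaluation score

# In practice this would connect to the learning data mart; an in-memory
# SQLite database stands in here so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE training_results (
    business_unit TEXT, program TEXT, avg_eval_score REAL, learners INTEGER)""")
conn.executemany(
    "INSERT INTO training_results VALUES (?, ?, ?, ?)",
    [("Sales NA",   "Solution Selling",   4.5, 120),
     ("Sales EMEA", "Solution Selling",   3.6,  85),
     ("Services",   "New Manager Basics", 4.2,  40)])

# Automated report: one row per business unit, ordered by evaluation score.
report = conn.execute("""
    SELECT business_unit, program, avg_eval_score, learners
    FROM training_results ORDER BY avg_eval_score""").fetchall()

for unit, program, score, learners in report:
    line = f"{unit:12s} {program:20s} score={score:.1f} learners={learners}"
    # Alarm trigger: flag results below the benchmark for follow-up.
    if score < BENCHMARK:
        line += "  <-- ALERT: below benchmark, investigate"
    print(line)

conn.close()
```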
M3 Maturity Matrix
Maturity levels: Level 1 Initial (Poor), Level 2 Repeatable (Less than Average), Level 3 Defined, Level 4 Managed, Level 5 Optimizing (Superior).

SCOPE
Measurement planned or conducted
Level 1: Little/no measurement conducted.
Level 2: Basic measurement execution is repeated successfully.
Level 3: Formalized measurement process exists and is conducted regularly, with the same methodology as the rest of the organization.
Level 5: Champions engaged and held accountable by the leadership team (Sponsors); an optimal measurement vision exists and is communicated.

Clear roles and responsibilities
Level 1: Little/no awareness of measurement go-to contact(s).
Level 2: Limited/local awareness of measurement go-to contact(s), as well as of measurement ownership, leadership, and sponsorship.
Level 3: Measurement Sponsors, Champions, Team Leaders, and Team Members identified, communicated, and engaged with an understanding of their responsibilities individually and to each other; proper two-way interaction with Measurement Stakeholders.
Level 5: Common understanding of all measurement roles and responsibilities, including Sponsors, Champions, Team Leaders, Team Members, a Measurement Execution Owner, Stakeholders, and all support organizations; all roles continually challenge the status quo.

Global business/regions addressed
Level 1: Little/no consideration given to expanding to a global process.
Level 2: Basic awareness of the measurement needs and activities of all regions and sub-regions (divisions, countries).

Regional Measurement Champions and Subject Matter Experts (SMEs)
Lower levels: Regional Champions and SMEs identified; measurement needs and improvement actions defined.
Higher levels: Regional Champions and SMEs active in maturity improvement, knowledge sharing, and coordinating continuous improvement; commonality increasing and ultimately maximized as appropriate across stakeholders.

DOCUMENTATION
Measurement documentation created
Level 1: Little/no documentation or reports exist.
Level 2: Limited documentation or reports exist, or they may exist only at a local level (region, function, facility).
Level 3: Global high-level measures defined; documentation or reports exist.
Level 4: Global measures defined; automated documentation or reports exist (dashboard) and analysis of the data is provided.
Level 5: Global measures defined and automated; documentation or reports exist (dashboard), analysis of the data is provided, forward-looking or predictive measures are provided, and alarms or other warning triggers are in place to provide business intelligence.

PROCESS
Measurement process documented and maintained
Level 1: Little/no understanding of measurement processes; work is ad hoc.
Level 2: Complete understanding of the measurement process.
Level 3: Measurement processes defined and documented.
Level 4: Measurement processes automated; a data dictionary exists and is reviewed and modified or updated as needed.
Level 5: The process continuously drives maturity through change and digitization.
NOTE: The amount of time it takes to progress from one Maturity Level to the next depends on where you are today, how much funding and dedicated people
resources you have available, and the number and type of conflicting priorities that may exist in your organization.
SUMMARY
Measurement metrics on traditional training impacts are still necessary but insufficient, requiring
learning organizations to make a paradigm shift. Learning professionals must link learning directly to
business goals and show how learning has impacted the organization's achievement of such goals. The
value of learning to the business can no longer be assumed; it must be demonstrated. While many learning
professionals agree with this direction, they often aren't sure where to start or what the end state should
look like.
The Motorola Solutions Measurement Maturity Model (M3) provides a roadmap that begins with
learning professionals retooling their metrics to match those of their business leaders. Five measurement
maturity levels are defined: Initial, Repeatable, Defined, Managed, and Optimizing. As the measurement
maturity level advances, the link between learning and business performance becomes more direct. At the
more advanced levels, measurement reports can be used to predict risks and opportunities and make an even
greater impact on the business outcome.