Developing a measurement of employee learning agility

Jinju Lee and Ji Hoon Song
Department of Educational Technology, Hanyang University, Seoul, Republic of Korea

Received 28 January 2021; Revised 17 March 2021; Accepted 23 March 2021

Abstract
Purpose – The purpose of this study is to develop a reliable and valid measurement of employee learning agility for use in human resources development.
Design/methodology/approach – This paper analyzed a total of 365 cases collected from Korean
organizations. This paper conducted exploratory factor analysis, confirmatory factor analysis and Rasch
analysis to validate the measurement. This paper conducted a measurement invariance (MI) test to confirm
the generalizability of the measurement and used Harman’s single factor test to assess the common method
variance (CMV).
Findings – This paper derived a learning agility measurement with six subfactors (Cronbach’s α = 0.893). An MI test verified that the measurement can be applied equally to supervisors and subordinates, and a single factor test confirmed that CMV was not a significant concern. Based on I-ADAPT theory, the learning agility measurement can be applied to assess not only leader competency but also general employee competency.
Research limitations/implications – Practitioners may use this model of learning agility for developing competency-based training programs and evaluations.
Originality/value – This study is meaningful because it extends the concept of learning agility. In
particular, the MI test indicated that there are no differences between supervisors and employees regarding
the utility of the concept. The concept in this paper is distinguished from other studies by applying the Rasch
model from an item response theory perspective.
Keywords Measurement invariance, Scale development, Rasch analysis, Organizational Agility,
I-ADAPT theory, Learning agility
Paper type Research paper
Learning agility is the willingness and ability to learn new things to obtain results under
difficult or different conditions (De Meuse et al., 2010; Lombardo and Eichinger, 2000). The
business environment is changing rapidly, and organizations increasingly seek to determine
the potential competence of individual employees rather than their current achievements
when assessing core human resources (Corporate Leadership Council, 2005; De Meuse,
2017). In previous studies, learning agility has mainly been studied in the context of leader
success (De Meuse, 2017). Lombardo and Eichinger (2000) emphasized that greater learning
abilities are associated with greater potential within organizations and defined learning
agility as being able to rapidly practice what is learned in a new or first encounter situation
based on ability and willingness to learn from experience (Eichinger et al., 2010; Lombardo
and Eichinger, 2000). However, learning agility is not only required for leaders, it is a
competency critical for all organization members because human resources that adapt
quickly and continuously to changing environments are essential for the sustainability of an organization’s competitive advantage (Barney and Clark, 2007; Garvin et al., 2008).

[European Journal of Training and Development, Vol. 46 No. 5/6, 2022, pp. 450-467. © Emerald Publishing Limited, ISSN 2046-9012. DOI 10.1108/EJTD-01-2021-0018]

Although the learning agility of adult learners is closely related to mental and psychological factors, including experience and learning disposition (De Meuse, 2017), there is still confusion about what learning agility is, how to measure it and its relationship to leader success (De Meuse, 2017; De Meuse et al., 2012; Gravett and Caldwell, 2016). Prior research has been conducted to derive learning agility characteristics and develop measurement techniques (Bedford, 2011; Burke et al., 2016; De Meuse et al., 2012; Mitchinson and Morris, 2012; Smith, 2015). However, as there is no consensus definition of learning agility, there is no widely accepted measure (De Meuse, 2017). Therefore, in this study, we develop and justify measurements of learning agility used in academia and in practice. We
also consider the mechanisms of individuals, teams and organizations as factors that affect
organizational performance in a multidimensional manner (Colquitt et al., 2011). From a
human resource development (HRD) perspective, applying individual experience and
knowledge to job performance is critical to organizational performance. People with high
learning agility must consider individual and organizational learning perspectives
simultaneously because they are continually evolving, growing and trying to maintain a
new way of thinking (Lee and Song, 2020).
The purpose of this study is to develop a valid and reliable measurement of learning
agility. The measurement developed in this study was performed according to the procedure
presented by DeVellis (2016). Content validity was confirmed through two expert reviews,
and the difficulty of the items was considered by performing a Rasch analysis based on item
response theory (IRT). We secured statistical validity by performing measurement invariance testing to confirm the generalizability of the measurement developed in this study. Collecting and reviewing
data on learning agility in this study enabled logical and clear conceptual reasoning,
through which the conceptual integrated model of learning agility can be justified (Whetten,
1989).
Finally, it is necessary to apply learning agility measurements to all members of an
organization rather than confining the assessment to leaders. Learning agility has been
studied in the context of leader competence (De Meuse et al., 2010). However, modern
organizational structure increasingly emphasizes the efficiency of small-scale, team-based
and project-based operations to enable more agile and expeditious responses. Thus, learning
agility is not an aptitude that only a small number of leaders need to have, but rather a
criterion that can differentiate individual competence levels among employees (Ployhart and
Bliese, 2006; Pulakos et al., 2000).
Therefore, in this study we examine existing measurement tools used to capture learning
agility and develop a valid and reliable measurement technique using statistical tests,
including Rasch analysis, based on the conceptual integration model.
Theoretical background
Concept of learning agility
Learning agility is the ability to flexibly change thoughts and behavior and adapt to new
environments based on willingness to learn from experience when faced with unfamiliar
environments (Lombardo and Eichinger, 2000). It focuses on the ability to learn new behaviors and change attitudes, to exhibit competence, to learn from experience and to perform successfully in new and unfamiliar environments. Prior studies of learning agility can be largely classified into
two perspectives (De Meuse, 2017). The bandwidth approach to learning agility is seen as a
multidimensional characteristic of individuals learning effectively through experience (De
Meuse et al., 2012), whereas in the limited approach, learning agility is seen as a factor
related to the speed and flexibility of learning through experience (DeRue et al., 2012a; Ryu and Oh, 2016).
However, as there is still no standard definition of learning agility, there are differences
in research approaches. Although most studies have addressed learning agility in the
leadership domain, it is not desirable to limit considerations of learning agility to leaders’
capabilities, since every employee, regardless of position, also needs to acquire new skills and knowledge quickly for individual performance and organizational change. In the
following sections, we analyze the main models of learning agility, reflecting the flow of
learning agility studies.
Methods
Samples
In this study, measurements were not focused on a specific industry group or occupation,
and we performed random sampling of workers in various industries and companies.
Learning agility is not specific to particular industry groups and is not limited to specific
positions, genders and ages in modern organizations.
Second, in this study, we conducted an MI test to verify the generalizability of the
measurement. To this end, our sample was divided into employees and managers when
collecting data to determine whether learning agility, which was considered a concept for
predicting the potential of leaders, can be extended and applied to general workers.
We collected data from 390 respondents by distributing online and offline surveys to 450
Korean private sector employees (response rate 86.6%). Responses with missing values and multivariate outliers identified by Mahalanobis distance were excluded from the final data set for analysis (Kline, 2015).
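The Mahalanobis screening step described above can be sketched as follows. This is a minimal illustration, not the authors’ actual procedure: the simulated data and the α = 0.001 chi-square cutoff are assumptions for demonstration only.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_screen(X, alpha=0.001):
    """Flag multivariate outliers by squared Mahalanobis distance.

    X: (n, p) array of case scores. Cases whose D^2 exceeds the
    chi-square critical value at `alpha` (df = p) are flagged.
    """
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    inv_cov = np.linalg.pinv(cov)        # pseudo-inverse guards against singular covariance
    diff = X - mu
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
    cutoff = chi2.ppf(1 - alpha, df=X.shape[1])
    return d2, d2 > cutoff

# Illustrative data: 100 cases, 4 variables, with one planted extreme case.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
X[0] = 25.0
d2, flagged = mahalanobis_screen(X)
```

In practice, flagged rows would be dropped before the factor analyses, mirroring the screening reported for the 365-case final sample.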
The total number of respondents included in the final sample was 365, including 121
males (33.2%) and 244 females (66.8%). The majority of the respondents were in their 30s,
with 1–3 years of work experience. There were 165 managers in the sample (45.2%) and 200
general employees (54.8%).
Measures
As learning agility is a complex concept, a methodological approach that includes a
multiple-item scale rather than a single-item scale should be used for measurement. We
followed the process proposed by DeVellis (2016) to develop the scale. Measurement type
(scale type, response category, response format) should be chosen to minimize questionnaire
information loss (DeVellis, 2016). The item set for the learning agility measure developed in
this study was based on a conceptual model of learning agility (Lee and Song, 2020). Factors
included self-directed learning, seeking constructive feedback, critical reflection, challenging
experience, rational problem-solving and adaptation to the job environment; for each factor, items were drafted from statements similar in context to the source measurements summarized in Table 1.
Confirmation of predictive validity requires examining criteria related to future behavior, such as the use of reminders and purchasing intent, to predict changes in a measured concept or attribute at a future point. This study measured task performance, contextual performance and innovative work behavior, which have frequently been used as outcome variables for learning agility in previous studies.
Task performance relates to core organizational maintenance tasks and services that
contribute to the effectiveness of the organization by supporting the core function (Borman
and Motowidlo, 1993). In this study, we include six task performance items developed by
Lee and Yoo (2016).
Table 1. Sources used to develop the learning agility measurement

Citation (year): Subfactors (no. of items)
Lombardo and Eichinger (2000): People agility (6), mental agility (6), change agility (6), results agility (7)
De Meuse et al. (2012): Seeking (5), sense-making (5), internalizing (5), applying (5), self-awareness (5)
Mitchinson and Morris (2012): Innovating, performing, reflecting, risking, defending
Smith (2015): Feedback seeking (5), information seeking (5), reflection (9), experimenting (9), agility (5), other (10)
Bedford (2011): Learning agility (9)
Job performance rating (JPR, 2009): Learning agility (9), interpersonal effectiveness (3), high potential (6), building collaboration (3), leading courageously (3), creating alignment (3), team leadership (3), developing leaders (3), strategic thinking (3), business acumen (3), critical thinking and judgment (3), planning and organizing (3), managing execution (5), drive for results (3), innovation and risk taking (3), resilience (3), integrity (3), overall judgments (10)
Kember et al. (2000): Habitual action (4), understanding (4), reflection (4), critical reflection (4)
Im et al. (2017): Self-awareness (5), growth-oriented (5), flexible thinking (5), reflective behavior seeking (5), behavioral change (5)
Procedures
Expert review. Two expert reviews were conducted to assess the content validity of
measures based on the concept of learning agility. First, we consulted three bilingual Korean-English speakers in the HRD field, two with over ten years of experience in HRD practice and a third who teaches at a university in the USA. The experts reviewed the overall concept, a brief description of the concept and the learning agility components, and provided feedback by e-mail. Overall, the experts noted that the concept of “jobs” in learning agility is rooted in the organizational context, so the organizational context must be refined to define learning agility. In addition, the six factors mentioned earlier lacked specificity; including “learning agility” among the factor names could lead to multicollinearity. Accordingly, the learning agility factor was specified and modified to meet the purpose of the measurement development. After the first expert review, the items were revised to improve readability, remove overlapping expressions and reassess wording that the experts flagged as a possible source of misunderstanding.
The second review was conducted with seven experts, including HRD practitioners and holders of doctoral degrees in educational technology. The item-level content validity index (I-CVI) was calculated for each item, and items with low validity were considered for deletion (Fehring, 1987). The items that were retained had I-CVI coefficients higher than 0.80. The opinions and I-CVI coefficients collected through the first and second expert reviews are summarized in Table 2; the items removed based on feedback are identified, and the corrections made for each item are noted.
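The I-CVI computation is simply the proportion of experts who rate an item as content-relevant (Fehring, 1987). A minimal sketch, with hypothetical ratings on an assumed 4-point relevance scale:

```python
def i_cvi(ratings, relevant=(3, 4)):
    """Item-level content validity index: the proportion of experts
    rating the item as relevant (3 or 4 on a 4-point relevance scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Seven hypothetical expert ratings for one item (not the paper's data).
ratings = [4, 4, 3, 4, 2, 4, 3]
cvi = round(i_cvi(ratings), 2)   # 6 of 7 experts rate the item relevant -> 0.86
```

An item with I-CVI at or above the study’s 0.80 cutoff would be retained; lower values mark the item for modification or deletion, as in Table 2.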
Rasch analysis. In this study, we examined the six factors that make up the learning agility measurement using a rating scale model in jMetrik software (Meyer, 2009–2018). We adopted six learning agility factors derived from a conceptual model developed through an integrated literature review process (Lee and Song, 2020) and determined (a) whether the item discrimination was similar to or different from other items within the measurement and (b) whether the measurement was composed of items with varying levels of difficulty to measure different psychological traits of respondents. Therefore, in this study, we tested whether the distribution of item difficulty was appropriate by analyzing item difficulty levels and item characteristic curves.
Normality. Before verifying reliability and validity, we examined the standard deviation, skewness and kurtosis of the collected data to confirm normality. Skewness ranged from −0.527 to 0.041. Kurtosis refers to the degree of sharpness at the center of a distribution and equals zero for the standard normal distribution; kurtosis ranged from −0.571 to 0.642. The data collected in this study were therefore confirmed to be acceptable in terms of skewness and kurtosis.
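The normality screen above can be reproduced with standard tools. The data here are simulated Likert responses, not the study’s sample, and the |skewness| < 2 and |kurtosis| < 7 thresholds are a common rule of thumb rather than the authors’ stated criteria.

```python
import numpy as np
from scipy.stats import skew, kurtosis

# Hypothetical 5-point Likert responses for one item (illustrative only).
rng = np.random.default_rng(1)
item = rng.integers(1, 6, size=365).astype(float)

sk = skew(item)                    # 0 for a perfectly symmetric distribution
ku = kurtosis(item, fisher=True)   # excess kurtosis; 0 for the standard normal

# A widely used screening rule for factor analysis treats
# |skewness| < 2 and |excess kurtosis| < 7 as acceptable.
acceptable = abs(sk) < 2 and abs(ku) < 7
```

The reported ranges (skewness −0.527 to 0.041; kurtosis −0.571 to 0.642) fall comfortably inside any such rule of thumb, consistent with the authors’ conclusion.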
Factor analysis. The purposes of factor analysis are as follows. First, the number of variables is reduced and unnecessary subfactors are removed: many variables are condensed into a small number of factors by bundling related items into a limited set of relevant variables that illustrate learning agility. Second, as related variables are bundled, the characteristics of the variables can be better grasped; because the bundled factors are mutually independent, the characteristics of each variable become clear. Third, the validity of the measurement can be evaluated. Finally, factor scores can be applied in further studies, such as regression analysis or cluster analysis (Kline, 2015; Thompson, 2004).
The first procedure in exploratory factor analysis (EFA) is to determine the number of factors after selecting variables; we used varimax rotation for the factor rotation and extracted factors with eigenvalues greater than one (Osborne et al., 2014; Williams et al., 2010; Thompson, 2004).
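The eigenvalue-greater-than-one (Kaiser) retention rule can be sketched directly from the eigenvalues of the item correlation matrix. The two-factor toy data below are an assumption for illustration, not the study’s items.

```python
import numpy as np

def kaiser_count(X):
    """Number of factors to retain under the eigenvalue-greater-than-one
    rule, from the eigenvalues of the item correlation matrix."""
    R = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    eigvals = np.linalg.eigvalsh(R)
    return int(np.sum(eigvals > 1.0))

# Illustrative data: two blocks of three strongly correlated items each,
# so the Kaiser rule should suggest two factors.
rng = np.random.default_rng(2)
f1, f2 = rng.normal(size=(2, 500))
noise = 0.5 * rng.normal(size=(500, 6))
X = np.column_stack([f1, f1, f1, f2, f2, f2]) + noise
n_factors = kaiser_count(X)
```

A full EFA would then rotate the retained loadings (varimax, as in the paper); the Kaiser count only sets how many factors enter that rotation.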
Confirmatory factor analysis (CFA) can verify factors by analyzing the relationships
between measured and latent variables and evaluating the model’s overall fit (Brown, 2014;
Kline, 2015; Thompson, 2004). A non-significant chi-square result indicates that the data are consistent with the hypothesized research model. However, as the chi-square goodness-of-fit statistic tends to inflate as the sample size increases, we also examined several fit indices (Hair et al., 1998, 2016) and estimated internal consistency reliability (Cronbach’s α).
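Cronbach’s α, reported as 0.893 for the final measurement, follows a simple formula over item and total-score variances. A self-contained sketch with toy data (the perfectly parallel items below are an illustration, so α comes out exactly 1):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances
    / variance of the total score), for an (n, k) response matrix."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Three perfectly parallel items (each a shifted copy of the same scores).
base = np.arange(10, dtype=float)
parallel = np.column_stack([base, base + 1, base + 2])
alpha = cronbach_alpha(parallel)
```

With real multi-factor data, α would typically be computed per subfactor as well as for the full scale.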
Measurement invariance and single factor test. We based assessments of validity on (a) test content, (b) internal structure, (c) external criteria and (d) generalization, following the validity framework of Messick (1995). MI tests should be implemented to determine
whether data are used equally without discrimination between supervisors and
subordinates (Cheung and Rensvold, 2002; Van de Schoot et al., 2012; Meredith, 1993;
Vandenberg and Lance, 2000; Steinmetz et al., 2009).
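In multigroup CFA, invariance is typically judged by comparing fit between the unconstrained (configural) model and a model with equality constraints. Cheung and Rensvold (2002), cited above, propose that a drop in CFI of no more than 0.01 supports invariance; the fit values below are hypothetical.

```python
def invariance_supported(cfi_constrained, cfi_unconstrained, threshold=0.01):
    """Cheung and Rensvold's (2002) decision rule: invariance holds when
    adding equality constraints lowers CFI by no more than `threshold`."""
    return (cfi_unconstrained - cfi_constrained) <= threshold

# Hypothetical CFI values for configural vs. metric-invariance models.
ok = invariance_supported(0.948, 0.952)       # delta CFI = 0.004 -> supported
bad = invariance_supported(0.930, 0.952)      # delta CFI = 0.022 -> rejected
```

The actual constrained and unconstrained models would be estimated in SEM software; only the decision rule is shown here.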
In addition, we minimized CMV, which can be caused by self-reporting. CMV was
minimized in the research design and data collection stages as well as the statistical
processing stage. Before data were collected, the items were revised twice for content
Table 2. Expert review to ensure content validity

Item | Comments | I-CVI | Modifications
L1 | Unnecessary modifiers and vocabulary | 1 | Item modified
L2 | – | 0.92 | –
L3 | – | 0.67 | Consider deletion
L4 | The learning opportunities offered by the company are less relevant to self-directed learning | 0.5 | –
L5 | – | 0.92 | –
L6 | – | 0.92 | –
L7 | – | 0.75 | –
L8 | Inappropriate adjectives must be modified | 0.67 | Consider modification or deletion
F1 | Inappropriate adjectives must be modified | 0.92 | Item modified
F2 | – | 1 | –
F3 | – | 0.92 | –
F4 | Two perspectives are expressed together | 0.92 | Separated into two questions
F5 | Modify the sentence order considering readability | 0.92 | Item modified
F6 | – | 0.92 | –
F7 | The contents of the item overlap with F1 | 0.92 | –
F8 | The contents of the item overlap with F1 | 0.83 | Consider deletion
R1 | May be interpreted as a different factor | 0.92 | Separate from other factors
R2 | – | 0.75 | –
R3 | Inappropriate expression included | 0.83 | Item modified
R4 | – | 0.75 | Consider deletion
R5 | Needs to be more specific in terms of readability | 0.83 | Item modified
R6 | Redundant expression | 0.83 | Item modified
R7 | This item can cause biased responses owing to assertive expressions | 0.83 | Item modified
R8 | – | 0.58 | Consider deletion
E1 | – | 0.92 | –
E2 | Two perspectives are expressed together | 0.75 | Item modified
E3 | Modify the sentence order considering readability | 0.75 | Item modified
E4 | – | 0.83 | –
E5 | Two perspectives are expressed together | 0.75 | Consider modification or deletion
E6 | Need to delete unnecessary expressions for clarity | 0.92 | Item modified
E7 | – | 0.92 | –
E8 | Correct the sentence in the same order as the previous question for consistency | 0.75 | Item modified
P1 | Item does not describe the variable | 0.67 | Consider deletion
P2 | – | 0.83 | –
P3 | Two perspectives are expressed together | 0.58 | Consider separation or deletion
P4 | – | 0.83 | –
P5 | – | 0.67 | Consider deletion
P6 | – | 0.92 | –
P7 | – | 0.92 | –
P8 | – | 0.5 | Consider deletion
A1 | Unclear wording included | 0.83 | –
A2 | Redundant expression | 0.75 | Consider modification or deletion
A3 | Need to use clearer expressions | 0.75 | Item modified
A4 | – | 0.92 | –
A5 | Unclear wording included | 0.83 | Item modified
A6 | Unclear wording included | 0.83 | Item modified
A7 | Need to delete unnecessary expressions for clarity | 0.58 | Consider modification or deletion
A8 | Redundant expression | 0.92 | Item modified
validity by expert review. For statistical processing, we conducted Harman’s single factor test to check for the occurrence of CMV in the collected data (Podsakoff et al., 2003).
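Harman’s single factor test checks whether one unrotated factor dominates the common variance. A minimal sketch with simulated two-factor data; the 50% threshold is the convention commonly attributed to this test, not a figure stated by the authors.

```python
import numpy as np

def harman_first_factor_share(X):
    """Share of total variance captured by the first unrotated principal
    component of the correlation matrix; CMV is commonly judged
    problematic when a single factor accounts for more than 50%."""
    R = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
    return eigvals[0] / eigvals.sum()

# Illustrative data driven by two distinct factors, so no single factor
# should dominate.
rng = np.random.default_rng(3)
f1, f2 = rng.normal(size=(2, 400))
X = np.column_stack([f1, f1, f2, f2]) + 0.6 * rng.normal(size=(400, 4))
share = harman_first_factor_share(X)
```

A first-factor share well below 0.5, as here, is the pattern Podsakoff et al. (2003) describe as not indicative of serious common method variance.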
Results
Rasch analysis results
We focused on the following variables to identify each item’s difficulty level and the respondents’ trait scores for the items, and we compared the distributions of the difficulty levels. An item was considered inappropriate if its infit or outfit statistic was below 0.7 or above 1.3, or if its Z value was less than −2 or greater than 2 (Bond and Fox, 2001). Four items were deleted based on the results of the Rasch analysis (Table 3). In addition to the Rasch analysis, eight items were deleted based on the first and second expert reviews owing to I-CVI coefficients of less than 0.80.
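The item-screening rule above (infit/outfit within 0.7–1.3 and |Z| ≤ 2) is straightforward to apply to fit output. The rows below are hypothetical jMetrik-style statistics, not the paper’s Table 3 values.

```python
def rasch_misfit(items, lower=0.7, upper=1.3, z_crit=2.0):
    """Flag items whose infit/outfit mean squares fall outside
    [lower, upper] or whose standardized fit |Z| exceeds z_crit
    (Bond and Fox, 2001)."""
    flagged = []
    for name, infit, outfit, z in items:
        fits_ok = lower <= infit <= upper and lower <= outfit <= upper
        if not (fits_ok and abs(z) <= z_crit):
            flagged.append(name)
    return flagged

# Hypothetical rows: (item, infit MS, outfit MS, standardized fit).
rows = [("L2", 0.95, 1.02, 0.4),
        ("R8", 1.42, 1.55, 2.6),
        ("E4", 1.05, 0.98, -0.3)]
misfits = rasch_misfit(rows)   # only "R8" violates the criteria
```

Items flagged this way would be candidates for deletion, as with the four items removed in the study.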
[Table 3: Rasch analysis results, reporting item difficulty, standard error and weighted/unweighted mean square fit statistics (WMS, UMS); values not recovered.]

[Figure 1: Scree plot]
Table 4. Factor analysis results

Item | L | R | E | F | A | P
L7 | 0.767 | 0.301 | 0.222 | 0.051 | 0.156 | 0.214
L2 | 0.751 | 0.061 | 0.126 | 0.238 | 0.083 | 0.112
L5 | 0.743 | 0.294 | 0.246 | 0.051 | 0.145 | 0.154
R6 | 0.069 | 0.847 | 0.146 | 0.034 | 0.100 | 0.063
R7 | 0.244 | 0.732 | 0.164 | 0.082 | 0.143 | 0.234
R5 | 0.219 | 0.682 | 0.175 | 0.164 | 0.071 | 0.102
E6 | 0.108 | 0.132 | 0.798 | 0.131 | 0.197 | 0.189
E4 | 0.188 | 0.285 | 0.728 | 0.150 | 0.085 | 0.036
E8 | 0.293 | 0.135 | 0.705 | 0.119 | 0.011 | 0.258
F2 | 0.003 | 0.113 | 0.215 | 0.812 | 0.139 | 0.036
F6 | 0.077 | 0.119 | 0.012 | 0.805 | 0.122 | 0.246
F4 | 0.359 | 0.029 | 0.150 | 0.719 | 0.120 | 0.050
A3 | 0.151 | 0.103 | 0.019 | 0.019 | 0.860 | 0.035
A4 | 0.009 | 0.171 | 0.071 | 0.229 | 0.781 | 0.169
A6 | 0.186 | 0.014 | 0.228 | 0.194 | 0.601 | 0.282
P5 | 0.104 | 0.031 | 0.143 | 0.102 | 0.004 | 0.854
P8 | 0.222 | 0.256 | 0.129 | 0.059 | 0.207 | 0.714
P7 | 0.176 | 0.278 | 0.283 | 0.120 | 0.374 | 0.608
Eigenvalue | 2.234 | 2.226 | 2.113 | 2.094 | 2.077 | 2.036
Variance (%) | 12.409 | 12.367 | 11.737 | 11.632 | 11.541 | 11.314
Cumulative variance (%) | 12.409 | 24.776 | 36.513 | 48.145 | 59.686 | 71.000

Notes: L = self-directed learning, F = seeking constructive feedback, R = critical reflection, E = challenging experience, P = rational problem solving, A = adaptation to the job environment
[Tables of descriptive statistics and correlations (M, SD, LA, TP, CP, IWB, AVE, CR, VIF) and regression results (IV, DV, B, SE, β, t, p, R²) not recovered.]
Conclusions
Findings
The results of this study can be summarized as follows. First, we derived six subfactors (self-directed learning, seeking constructive feedback, critical reflection, challenging experience, rational problem-solving and adaptation to the job environment).

[Table of latent mean comparisons between supervisors and employees (SE, CR, p, variance, effect size) not recovered.]
References
Anderson, J.C. and Gerbing, D.W. (1988), “Structural equation modeling in practice: a review and
recommended two-step approach”, Psychological Bulletin, Vol. 103 No. 3, pp. 411-423.
Barney, J.B. and Clark, D.N. (2007), Resource-Based Theory: Creating and Sustaining Competitive
Advantage, Oxford University Press.
Bedford, C.L. (2011), “The role of learning agility in workplace performance and career advancement”,
Unpublished doctoral dissertation, University of Minnesota, [S.l.], available at: www.riss.kr/link?
id=T12725161
Borman, W.C. and Motowidlo, S.J. (1993), “Expanding the criterion domain to include elements of
contextual performance”, in Schmitt, N. and Borman, W.C. (Eds), Personnel Selection in
Organizations, Jossey-Bass.
Burke, W.W., Roloff, K.S. and Mitchinson, A. (2016), “Learning agility: a new model and measure”,
Working Paper.
Bond, T.G. and Fox, C.M. (2001), Applying the Rasch Model: Fundamental Measurement in the Human
Sciences, Lawrence Erlbaum.
Brown, T.A. (2014), Confirmatory Factor Analysis for Applied Research, Guilford.
Cheung, G.W. and Rensvold, R.B. (2002), “Evaluating goodness-of-fit indexes for testing measurement
invariance”, Structural Equation Modeling: A Multidisciplinary Journal, Vol. 9 No. 2, pp. 233-255,
doi: 10.1207/S15328007SEM0902_5.
Cohen, J. (2001), Statistical Power Analysis for the Behavioral Sciences, 2nd ed., Erlbaum.
Colquitt, J.A., Lepine, J.A., Wesson, M.J. and Gellatly, I.R. (2011), Organizational Behavior: Improving Performance and Commitment in the Workplace, 5th ed., McGraw-Hill Irwin.
Corporate Leadership Council (2005), Realizing the Full Potential of Rising Talent, Corporate Executive Board.
De Meuse, K.P. (2017), “Learning agility: its evolution as a psychological construct and its empirical relationship to leader success”, Consulting Psychology Journal: Practice and Research, Vol. 69 No. 4, pp. 267-295, doi: 10.1037/cpb0000100.
De Meuse, K.P., Dai, G. and Hallenbeck, G.S. (2010), “Learning agility: a construct whose time has come”, Consulting Psychology Journal: Practice and Research, Vol. 62 No. 2, pp. 119-130, doi: 10.1037/a0019988.
De Meuse, K.P., Dai, G., Swisher, V.V., Eichinger, R.W. and Lombardo, M.M. (2012), “Leadership development: exploring, clarifying, and expanding our understanding of learning agility”, Industrial and Organizational Psychology, Vol. 5 No. 3, pp. 280-286, doi: 10.1111/j.1754-9434.2012.01445.x.
DeRue, D.S., Ashford, S.J. and Myers, C.G. (2012a), “Learning agility: in search of conceptual clarity and
theoretical grounding”, Industrial and Organizational Psychology, Vol. 5 No. 3, pp. 258-279, doi:
10.1111/j.1754-9434.2012.01444.x.
DeVellis, R.F. (2016), Scale Development: Theory and Applications, 4th ed., Sage publications, Inc.
Eichinger, R.W., Lombardo, M.M. and Capretta, C.C. (2010), FYI for Learning Agility, Lominger
International: A Korn/Ferry Company.
Fehring, R.J. (1987), “Methods to validate nursing diagnoses”, Heart and Lung: The Journal of Critical
Care, Vol. 16 No. 6 Pt 1, pp. 625-629.
Fornell, C. and Larcker, D.F. (1981), “Evaluating structural equation models with unobservable
variables and measurement error”, Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Garvin, D.A., Edmondson, A.C. and Gino, F. (2008), “Is yours a learning organization?”, Harvard
Business Review, Vol. 86 No. 3, pp. 109-120.
Gravett, L.S. and Caldwell, S.A. (2016), Learning Agility, Springer, doi: 10.1057/978-1-137-59965-0.
Hair, J.F., Jr, Hult, G.T.M., Ringle, C. and Sarstedt, M. (2016), A Primer on Partial Least Squares
Structural Equation Modeling (PLS-SEM), 2nd ed., Sage publications, Inc.
Hair, J.F., Black, W.C., Babin, B.J., Anderson, R.E. and Tatham, R.L. (1998), Multivariate Data Analysis,
5th ed., Prentice hall, Upper Saddle River, NJ.
Hong, S., Malik, M.L. and Lee, M.-K. (2003), “Testing configural, metric, scalar, and latent mean invariance
across genders in sociotropy and autonomy using a non-western sample”, Educational and Psychological
Measurement, Vol. 63 No. 4, pp. 636-654, doi: 10.1177/0013164403251332.
Hu, L. T. and Bentler, P.M. (1999), “Cutoff criteria for fit indexes in covariance structure analysis:
conventional criteria versus new alternatives”, Structural Equation Modeling: A
Multidisciplinary Journal, Vol. 6 No. 1, pp. 1-55, doi: 10.1080/10705519909540118.
Im, C., Wee, Y. and Lee, H. (2017), “A study on the development of the learning agility scale”, The
Korean Journal of Human Resource Development Quarterly, Vol. 19 No. 2, pp. 81-108.
Kember, D., Leung, D.Y., Jones, A., Loke, A.Y., McKay, J., Sinclair, K. and Wong, M. (2000),
“Development of a questionnaire to measure the level of reflective thinking”, Assessment and
Evaluation in Higher Education, Vol. 25 No. 4, pp. 381-395, doi: 10.1080/713611442.
Kline, R.B. (2015), Principles and Practice of Structural Equation Modeling, 4th ed., Guilford
publications, New York, NY.
Kim, D.Y. and Yoo, T.Y. (2002), “The relationships between the Big Five personality factors and
contextual performance in work organizations”, Korean Journal of Industrial and Organizational
Psychology, Vol. 15 No. 2, pp. 1-24.
Koh, K.H. and Zumbo, B.D. (2008), “Multi-group confirmatory factor analysis for testing measurement
invariance in mixed item format data”, Journal of Modern Applied Statistical Methods, Vol. 7
No. 2, pp. 471-477, doi: 10.22237/jmasm/1225512660.
Lee, J. and Song, J.H. (2020), “Developing a conceptual integrated model for the employee’s learning
agility”, Performance Improvement Quarterly, Online first, doi: 10.1002/piq.21352.
Lee, C. and Yoo, T. (2016), “The effect of personality on task performance and adaptive performance: the mediating effect of job crafting and the moderating effect of leader’s empowering behavior”, Korean Journal of Industrial and Organizational Psychology, Vol. 29 No. 4, pp. 607-630.
Lombardo, M.M. and Eichinger, R.W. (2000), “High potentials as high learners”, Human Resource Management, Vol. 39 No. 4, pp. 321-329, doi: 10.1002/1099-050X(200024)39:4<321::AID-HRM4>3.0.CO;2-1.
Meredith, W. (1993), “Measurement invariance, factor analysis and factorial invariance”, Psychometrika, Vol. 58 No. 4, pp. 525-543.
Messick, S. (1995), “Validity of psychological assessment: validation of inferences from persons’
responses and performances as scientific inquiry into score meaning”, American Psychologist,
Vol. 50 No. 9, pp. 741-749, doi: 10.1037/0003-066X.50.9.741.
Mitchinson, A. and Morris, R. (2012), “Learning about learning agility”, White Paper, Center for Creative Leadership.
Motowildo, S.J., Borman, W.C. and Schmit, M.J. (1997), “A theory of individual differences in task and
contextual performance”, Human Performance, Vol. 10 No. 2, pp. 71-83, doi: 10.1207/
s15327043hup1002_1.
Osborne, J.W., Costello, A.B. and Kellow, J.T. (2014), Best Practices in Exploratory Factor Analysis,
CreateSpace Independent Publishing Platform, Louisville, KY.
Ployhart, R.E. and Bliese, P.D. (2006), “Individual adaptability (I-ADAPT) theory: conceptualizing the
antecedents, consequences, and measurement of individual differences in adaptability”, in Salas, E. (Ed.),
Understanding Adaptability: A Prerequisite for Effective Performance within Complex Environments,
Emerald Group Publishing Limited, Oxford, pp. 3-39, doi: 10.1016/S1479-3601(05)06001-7.
Podsakoff, P.M., MacKenzie, S.B., Lee, J.-Y. and Podsakoff, N.P. (2003), “Common method biases in
behavioral research: a critical review of the literature and recommended remedies”, Journal of
Applied Psychology, Vol. 88 No. 5, pp. 879-903, doi: 10.1037/0021-9010.88.5.879.
Pulakos, E.D., Arad, S., Donovan, M.A. and Plamondon, K.E. (2000), “Adaptability in the workplace:
development of a taxonomy of adaptive performance”, Journal of Applied Psychology, Vol. 85
No. 4, pp. 612-624, doi: 10.1037/0021-9010.85.4.612.
Rasch, G. (2001), Studies in Mathematical Psychology: I. Probabilistic Models for Some Intelligence and Attainment Tests, Nielsen & Lydiche.
Ryu, H. and Oh, H. (2016), “Learning agility: issues and challenges”, The Korean Journal of Human
Resource Development Quarterly, Vol. 18 No. 4, pp. 119-147.
Scott, S.G. and Bruce, R.A. (1994), “Determinants of innovative behavior: a path model of individual innovation
in the workplace”, Academy of Management Journal, Vol. 37 No. 3, pp. 580-607, doi: 10.2307/256701.
Smith, B.C. (2015), “How does learning agile business leadership differ? Exploring a revised model of
the construct of learning agility in relation to executive performance”, Unpublished doctoral
dissertation, Columbia University, [S.l.], available at: www.riss.kr/link?id=T14028784
Steenkamp, J.-B.E. and Baumgartner, H. (1998), “Assessing measurement invariance in cross-national
consumer research”, Journal of Consumer Research, Vol. 25 No. 1, pp. 78-90, doi: 10.1086/209528.
Steinmetz, H., Schmidt, P., Tina-Booh, A., Wieczorek, S. and Schwartz, S.H. (2009), “Testing measurement
invariance using multigroup CFA: differences between educational groups in human values
measurement”, Quality and Quantity, Vol. 43 No. 4, pp. 599-616, doi: 10.1007/s11135-007-9143-x.
Thompson, B. (2004), Exploratory and Confirmatory Factor Analysis: Understanding Concepts and
Applications, American Psychological Association, Washington, DC.
Van de Schoot, R., Lugtig, P. and Hox, J. (2012), “A checklist for testing measurement invariance”, European
Journal of Developmental Psychology, Vol. 9 No. 4, pp. 486-492, doi: 10.1080/17405629.2012.686740.
Vandenberg, R.J. and Lance, C.E. (2000), “A review and synthesis of the measurement invariance
literature: suggestions, practices, and recommendations for organizational research”,
Organizational Research Methods, Vol. 3 No. 1, pp. 4-70, doi: 10.1177/109442810031002.
Whetten, D.A. (1989), “What constitutes a theoretical contribution?”, Academy of Management Review, Vol. 14 No. 4, pp. 490-495, doi: 10.5465/AMR.1989.4308371.
Williams, B., Onsman, A. and Brown, T. (2010), “Exploratory factor analysis: a five-step guide for novices”, Australasian Journal of Paramedicine, Vol. 8 No. 3, pp. 1-13.
Corresponding author
Ji Hoon Song can be contacted at: [email protected]