
This article was downloaded by: [Temple University Libraries]
On: 17 November 2014, At: 17:06
Publisher: Taylor & Francis
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Theoretical Issues in Ergonomics Science
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/ttie20

Hierarchical task analysis vs. cognitive work analysis: comparison of theory, methodology and contribution to system design

Paul Salmon (a), Daniel Jenkins (b), Neville Stanton (c) and Guy Walker (d)

(a) Human Factors Group, Monash University Accident Research Centre, Clayton Campus, Monash University, Victoria 3800, Australia
(b) Sociotechnic Solutions, 2 Mitchell Close, St Albans, Herts, AL1 2LW, UK
(c) Transportation Research Group, School of Civil Engineering and the Environment, University of Southampton, Highfield, Southampton, SO17 1BJ, UK
(d) School of the Built Environment, Heriot Watt University, Edinburgh, UK

Published online: 08 Feb 2010.

To cite this article: Paul Salmon, Daniel Jenkins, Neville Stanton & Guy Walker (2010) Hierarchical task analysis vs. cognitive work analysis: comparison of theory, methodology and contribution to system design, Theoretical Issues in Ergonomics Science, 11:6, 504-531, DOI: 10.1080/14639220903165169

To link to this article: http://dx.doi.org/10.1080/14639220903165169

Theoretical Issues in Ergonomics Science
Vol. 11, No. 6, November–December 2010, 504–531

Hierarchical task analysis vs. cognitive work analysis: comparison of theory, methodology and contribution to system design

Paul Salmon (a)*, Daniel Jenkins (b), Neville Stanton (c) and Guy Walker (d)

(a) Human Factors Group, Monash University Accident Research Centre, Clayton Campus, Monash University, Victoria 3800, Australia; (b) Sociotechnic Solutions, 2 Mitchell Close, St Albans, Herts, AL1 2LW, UK; (c) Transportation Research Group, School of Civil Engineering and the Environment, University of Southampton, Highfield, Southampton, SO17 1BJ, UK; (d) School of the Built Environment, Heriot Watt University, Edinburgh, UK

(Received 15 May 2008; final version received 29 May 2009)

The cognitive work analysis framework continues to attract increasing attention from the human factors and ergonomics community. Conversely, hierarchical
task analysis has been, and remains, the most popular of all human factors and
ergonomics methods. This article compares the two approaches in terms of their
theoretical underpinning, methodological approach and potential contributions
to system design and evaluation. To do this, recent analyses, involving both
approaches, of a military rotary wing mission planning software tool are
compared and contrasted in terms of their methodological procedure and analysis
outputs. The findings indicate that, despite the very different theoretical and
methodological nature of the two approaches, and also the entirely different
analyses derived, the two methods provide highly complementary outputs. In
conclusion, it is argued that there is benefit in applying both approaches to inform
the design and/or evaluation of the same product or system.
Keywords: cognitive work analysis; hierarchical task analysis; mission planning;
human factors methods

1. Introduction
Out of the abundance of human factors and cognitive engineering methods available,
hierarchical task analysis (HTA; Annett et al. 1971) and cognitive work analysis (CWA;
Vicente 1999a) are arguably the most popular. The former represents the traditional task
analytic approach; the latter represents the more modern system design framework. Both
approaches have distinct theoretical underpinnings and approach the analysis of systems
in quite different ways, yet they are often discussed in the same breath. It is important to
clarify what the theoretical and methodological differences between the two are and, even
more so, to identify if, and how, the two approaches can be used in tandem during
complex system design and evaluation efforts (Hajdukiewicz and Vicente 2004). This
article presents a comparison of the two approaches in terms of their theoretical
underpinning, methodological procedure and contribution to the system design life cycle.
To do this the methods are first discussed in terms of their theoretical underpinning and
methodological nature, following which recent HTA and CWA analyses of a military
rotary wing mission planning system (MPS) software tool are compared and contrasted.

*Corresponding author. Email: [email protected]

ISSN 1464–536X online
© 2010 Taylor & Francis
DOI: 10.1080/14639220903165169
http://www.informaworld.com

2. Hierarchical task analysis


The ‘task’ in HTA is something of a misnomer. HTA does not focus exclusively on tasks; rather, it is concerned with goals (an objective or end state) and these are hierarchically
decomposed (Annett and Stanton 1998). HTA’s origins go as far back as the early 1900s to
the so-called scientific management movement of that time. Scientific management
methods, advocated by the likes of Frederick Taylor and the Gilbreths, were used to
analyse tasks in order to investigate more efficient ways in which to undertake them. Early
methods focused on how the work was performed, what was needed to perform the work,
why the work was performed in this way and how the work could be improved (Stanton
2006). Inspired by Miller et al.’s work on ‘plans and the structure of behaviour’ (Miller
et al. 1960), HTA was developed in the 1960s in response to a need to better understand
complex cognitive tasks (Annett 2004). The changing nature of industrial work processes
around that time meant that tasks were becoming more cognitive in nature and
approaches that could be used to describe and understand these novel work processes were
subsequently required. Notwithstanding its scientific management origins, HTA was
unique at the time in that, in addition to the physical tasks being performed, it also
attempted to describe the cognitive aspects of goal attainment (something that is often
overlooked by critics of the method). Thus, HTA represented a significant departure
from existing approaches of the time since it focused on goals and related cognitive and
physical processes, rather than merely the physical and observable aspects of task
performance.
Stanton (2006) describes the heavy influence of control theory on the HTA
methodology and demonstrates how the test-operate-test-exit unit (central to control
theory) and the notion of hierarchical levels of analysis are similar to HTA representations
(plans and sub-goal hierarchy). HTA itself works by decomposing systems into a hierarchy
of goals, sub-ordinate goals, operations and plans; it focuses on: ‘what an operator . . . is
required to do, in terms of actions and/or cognitive processes to achieve a system goal’
(Kirwan and Ainsworth 1992, p. 1). It is important to note here that an ‘operator’ may be
a human or a technological operator (e.g. system artefacts such as equipment, devices and
interfaces). HTA outputs, therefore, specify the overall goal of a particular system, the
sub-goals to be undertaken to achieve this goal, the operations required to achieve each of
the sub-goals specified and the plans, which are used to ensure that the goals are achieved.
The plans component of HTA is especially important since plans specify the sequence in which, and the conditions under which, different sub-goals have to be achieved in order to satisfy the requirements of a super-ordinate goal.
The HTA process is straightforward, involving collecting data about the task or system under
analysis (through techniques such as observation, questionnaires, interviews with subject
matter experts (SMEs), walkthroughs, user trials and documentation review to name but a
few) and then using these data to decompose and describe the goals and sub-goals
involved. The HTA procedure is presented in Figure 1.
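The sub-goal hierarchy and plans just described can be sketched as a simple tree structure. The following Python sketch is purely illustrative: the field names and the ‘make cup of tea’ content (an example returned to later in this article) are assumptions for demonstration, not part of HTA’s specification.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """One node in an HTA sub-goal hierarchy."""
    name: str
    plan: str = ""  # when and in what order the sub-goals are pursued
    subgoals: list["Goal"] = field(default_factory=list)

def outline(goal: Goal, depth: int = 0) -> list[str]:
    """Flatten the hierarchy into indented outline lines, plans included."""
    lines = ["  " * depth + goal.name]
    if goal.plan:
        lines.append("  " * depth + "  Plan: " + goal.plan)
    for sub in goal.subgoals:
        lines.extend(outline(sub, depth + 1))
    return lines

# The ubiquitous tea-making example, decomposed one level at a time.
tea = Goal(
    name="0. Make cup of tea",
    plan="Do 1, then 2, then 3",
    subgoals=[
        Goal("1. Boil water", plan="Do 1.1, then 1.2",
             subgoals=[Goal("1.1 Fill kettle"), Goal("1.2 Switch kettle on")]),
        Goal("2. Add tea bag to cup"),
        Goal("3. Pour boiling water into cup"),
    ],
)

print("\n".join(outline(tea)))
```

The plan attached to each node is what distinguishes this representation from a plain task list: it records the sequence and conditions under which the sub-goals satisfy their super-ordinate goal.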
Despite the vast range of human factors and ergonomics methods available, the
popularity of the HTA methodology is unparalleled and the approach is the most popular
and ubiquitous, not just out of task analysis methods, but out of all human factors and
ergonomics methods (Kirwan and Ainsworth 1992, Annett 2004, Stanton et al. 2005).
HTA has been applied now for over 40 years in all manner of domains and its heavy use
shows no signs of abating, certainly not within human factors and ergonomics circles.
Although the process of constructing a HTA is enlightening in itself (i.e. the analysts’ understanding of the task under analysis increases significantly), HTA’s popularity is due largely to the flexibility and utility of its output.

Figure 1. Hierarchical task analysis procedure (source: Stanton 2006). [Flowchart: state the overall goal; state subordinate operations and the plan; check the adequacy of the redescription, revising it if necessary; consider each suboperation in turn, redescribing further where required or terminating the redescription; repeat until no operations remain.]

In addition to the exhaustive goal-based
description provided, HTA outputs can be used to inform various additional human
factors analyses and many other human factors methods require an initial HTA as part of
their data input (Stanton et al. 2005). This flexibility has allowed HTA to be applied for
a wide range of system design and evaluation purposes, including interface design and
evaluation (e.g. Hodgkinson and Crawshaw 1985, Stammers and Astley 1987, Shepherd
2001), job design (Bruseberg and Shepherd 1997), training programme design and
evaluation (e.g. Piso 1981), human error prediction (e.g. Lane et al. 2007) and analysis (e.g.
Adams and David 2007), team task analysis (Annett 2004, Walker et al. 2006), allocation
of functions analysis (e.g. Marsden and Kirby 2004), workload assessment (Kirwan and
Ainsworth 1992) and procedure design (Stanton 2006).

3. Cognitive work analysis


More of a framework than a rigid methodology, CWA was originally developed at the
Risø National Laboratory in Denmark (Rasmussen 1986) for use within nuclear power
process control applications. Underlying the approach was a specific need to design for
new or unexpected situations; in a study of industrial accidents and incidents, Risø
researchers found that most accidents began with non-routine operations. CWA’s
theoretical roots lie in general and adaptive control system theory and also Gibson’s
ecological psychology theory (Fidel and Pejtersen 2005). The approach itself is concerned with constraints rather than goals; this focus is based on the notion that making constraints explicit in an interface can potentially enhance human performance (Hajdukiewicz and Vicente 2004).
The CWA framework comprises five phases, each modelling different constraint sets:
work domain analysis (WDA); control task analysis (CTA); strategies analysis; social
organisation and cooperation analysis (SOCA); and worker competencies analysis. A brief
description of each phase is provided below. The CWA phases, methods and outputs are presented in Figure 2.

Figure 2. Cognitive work analysis phases along with their associated methods and outputs.

(1) The WDA phase involves modelling the system in question based on its purposes and the constraints imposed by the environment. The abstraction hierarchy (AH) and abstraction decomposition space (ADS) approaches are used for this purpose. WDA identifies a fundamental set of constraints that are imposed on the actions of any actor (Vicente 1999a). In modelling a system in this way, the systemic
constraints that shape activity are specified. This formative approach leads to an
event, actor and time independent description of the system (Vicente 1999a,
Sanderson 2003).
(2) The second phase, CTA, is used to identify the tasks that are undertaken within the
system and the constraints imposed on these activities during different situations.
CTA focuses on the activity necessary to achieve the purposes, priorities and values
and functions of a work domain (Naikar et al. 2006). Rasmussen’s decision ladder
(Rasmussen 1976; cited in Vicente 1999a) and Naikar et al.’s (2006) contextual
activity template are used for the CTA phase.
(3) The strategies analysis phase is used to identify how the different activities can be
achieved. Vicente (1999a) points out that whereas the CTA phase provides a
Downloaded by [Temple University Libraries] at 17:06 17 November 2014

product description of what needs to be done, strategies analysis provides a process


description of how it can be done. Jenkins et al. (2008a) point out that the
strategies analysis phase fills the ‘black-box’ that is left by the CTA phase. It
involves the identification of the different strategies that agents might employ when
performing control tasks. Of the five phases of CWA, this phase is arguably the
closest to HTA, since it offers a description of the different strategies that could be
used to achieve different functions (similar to the goal decomposition and plans
component of HTA).
(4) The fourth phase, SOCA, is used to identify how the activity and the associated
strategies required can be distributed amongst human operators and technological
artefacts within the system in question and also how these agents could
communicate and cooperate (Vicente 1999a). The objective of this phase is to
determine how social and technical factors can work together in a way that
enhances system performance (Vicente 1999a).
(5) The fifth and final stage, worker competencies analysis, attempts to identify the
competencies that an ideal worker should exhibit (Vicente 1999a). It focuses on the
cognitive skills that are required during task performance. Vicente (1999a)
emphasises that this phase is the point where CWA finally addresses the strengths
and weaknesses of the human operator in relation to the systems design. Worker
competencies analysis uses Rasmussen’s skill, rule and knowledge framework in
order to classify the cognitive activities employed by agents during task
performance.
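To contrast the WDA phase’s output with HTA’s goal hierarchy, the abstraction hierarchy can be sketched as a set of levels linked by means-ends relations. In this illustrative Python sketch the five level names follow Vicente (1999a); the kettle entries (anticipating the kettle example in Section 4) are this sketch’s own assumptions, not drawn from the article.

```python
# Minimal abstraction hierarchy sketch (WDA phase). Level names follow
# Vicente (1999a); the kettle entries are illustrative assumptions.
abstraction_hierarchy = {
    "functional purpose":   ["provide boiling water"],
    "abstract function":    ["conservation of energy", "mass of water"],
    "generalised function": ["heat water", "contain water", "indicate readiness"],
    "physical function":    ["element converts electricity to heat",
                             "thermostat senses temperature"],
    "physical form":        ["heating element", "vessel", "switch", "thermostat"],
}

def means_ends(levels, focus):
    """Means-ends reading of the hierarchy: the level above a node answers
    'why' it exists, the node itself 'what' it is, and the level below
    'how' it is achieved."""
    i = levels.index(focus)
    why = levels[i - 1] if i > 0 else None
    how = levels[i + 1] if i < len(levels) - 1 else None
    return why, focus, how

levels = list(abstraction_hierarchy)
print(means_ends(levels, "generalised function"))
# → ('abstract function', 'generalised function', 'physical function')
```

Because the entries describe the work domain rather than any particular task, the same structure supports the event-, actor- and time-independent reading of the system described above.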
The different CWA phases therefore allow practitioners to specify the constraints
related to why the system exists, as well as with what the activity under analysis is
conducted (WDA), what activity is conducted (CTA), how the activity is conducted
(strategies analysis and worker competencies analysis) and also who the activity is
conducted by (SOCA).
The CWA framework is currently receiving great attention (e.g. Bisantz and Burns
2008, Jenkins et al. 2008a). Its attractiveness relates to its flexibility and the varying
perspectives on complex systems that it can provide; it deals with constraints that affect the
who, where, how, why and what associated with a system and its activities. It is also appealing as the first real human factors design method: rather than merely analysing or describing a system or behaviour, the approach ostensibly can make telling contributions
to system design, as evidenced by a range of successful design applications (e.g. Bisantz
et al. 2003, Ahlstrom 2005). Further, due to its formative nature, it is alleged that it can
inform the design of so-called ‘first of a kind systems’ (Vicente 1999a, Naikar et al. 2003).
Its current popularity is such that it has been applied in various complex domains for a
number of different purposes, including system modelling (e.g. Hajdukiewicz 1998), system
design (e.g. Bisantz et al. 2003), automation (Mazaeva and Bisantz 2007), training needs
analysis (e.g. Naikar and Sanderson 1999), training programme evaluation and design (e.g.
Naikar and Sanderson 1999), interface design and evaluation (Vicente 1999), information
requirements specification (e.g. Ahlstrom 2005), tender evaluation (Lintern and Naikar
2000, Naikar and Sanderson 2001), team design (Naikar et al. 2003), allocation of
functions (e.g. Jenkins et al. 2008c), the development of human performance measures (e.g.
Yu et al. 2002, Crone et al. 2003, 2007) and error management strategy design (Naikar and
Saunders 2003). These applications have taken place in a variety of complex safety critical
domains, including air traffic control (e.g. Ahlstrom 2005), aviation (e.g. Naikar and
Sanderson 2001), health care (e.g. Watson and Sanderson 2007), hydropower (e.g.
Memisevic et al. 2005), nuclear power (e.g. Olsson and Lee 1994), naval (e.g. Bisantz et al.
2003), manufacturing (e.g. Higgins 1998), military command and control (e.g. Jenkins
et al. 2008a,b), petrochemical (e.g. Jamieson and Vicente 2001), process control (e.g.
Vicente 1999a), rail (e.g. Jansson et al. 2006) and road transport (e.g. Salmon et al. 2007).

4. Theoretical comparison
From a theoretical point of view, it would appear that there are two fundamental
differences between the two methods. First, HTA focuses on system goals whereas CWA
focuses on the constraints present within a system and, second, HTA fits somewhere
between being descriptive and normative in nature, since it describes how goals actually
are, or should be, achieved, whereas CWA is formative in nature and so describes how
functions and purposes could potentially be achieved.
The goals vs. purposes and constraints distinction is a logical starting point of any
comparison between the two; however, this distinction is not as clear cut as it first appears.
HTA focuses on the hierarchical subdivision of goals, whereas CWA focuses on the
purposes and constraints present within the domain in which these goals are pursued. The
notion that CWA does not focus on goals, however, is contentious. For example, it might
be argued by some that CWA, through the CTA phase, focuses on the goals linked to
‘known recurring activities’; however, this is a moot point. Reviewing the literature on
decision ladder applications, it is clear that some applications focus on goals (e.g.
Rasmussen et al. 1994, Naikar 2009, Jenkins et al. in press), whereas some do not
(Rasmussen 1986, Vicente 1999a, Kilgore et al. 2009, Lintern 2009). Certainly, the decision
ladder method has a goals element to it; however, since CWA is a framework rather than a
rigid methodology, the underlying theory is more important in this sense. It could be
argued that CWA does not analyse goals in the true semantic sense, rather it considers the
influence of goals on decision making and these goals may or may not be considered
dependent on the situation and environmental constraints. The key difference seems to be
that HTA begins with the system goal and uses this as the basis for analysis; whereas CWA
(or more correctly, ‘activity analysis in work domain terms’) considers them but,
importantly, does not use them as the driving force for the analysis. It is also worth
pointing out that if the CWA approach did truly analyse goals, then it would be moving
from formative description to normative description, since goals can only be specified for
situations that are known or can be anticipated. If this were the case, then only the WDA
component of the framework could be considered to be formative.

Moving back to the discussion on goals vs. constraints and purposes, Vicente (1999a)
offers the following definitions for these terms:
- Constraints – Relationships between, or limits on, behaviour. Constraints remove degrees of freedom (Vicente 1999a, p. 6).
- Purpose – The overarching intentions that a work domain was designed to achieve. Note that purposes are properties of work domains, not actors, and that they are relatively permanent (Vicente 1999a, p. 9).
- Goal – A state to be achieved, or maintained, by an actor at a particular time. Note that goals are attributes of actors, not work domains, and that they are dynamic (Vicente 1999a, p. 7).
Put simply, operators perform activities to achieve a goal of some sort and
the successful achievement of goals allows a system to enact its purpose. According to
Vicente (1999a), the difference between a goal and a purpose relates to ownership and
dynamicity. Goals, according to Vicente, are dynamic and are held by actors, whereas
Annett and colleagues originally defined a goal as: ‘the objective of the system in some real
terms of production units, quality or other criteria’ (Annett et al. 1971, p. 4). Annett and
Stanton (2000) refer to goals as those things that a person is seeking to achieve. Purposes,
on the other hand, are typically permanent and are a property of the work domain rather
than the actors undertaking tasks in the work domain. Constraints reflect those conditions
in the work domain that affect the activities being performed. This distinction represents
the first significant difference between the two approaches. HTA is concerned with states
that are to be achieved and how they are achieved, whereas CWA is concerned with the
purposes of a system and the limiting factors imposed on purpose-driven behaviour within
that system. When analysing a kettle, for example, one can take the ubiquitous ‘make cup
of tea’ example that is so frequently used to demonstrate HTA. The goal here is normally
expressed as ‘make cup of tea’. The purpose of the kettle ‘system’, however, would be
expressed as ‘provide boiling water’. The goal of ‘make cup of tea’ is dynamic and linked
to an actor at a specific time; the same actor’s goal, or another actor’s goal, on another
occasion may be very different. The goal is therefore dynamic and will change periodically,
for example, think ‘make cup of tea’ vs. ‘make cup of hot chocolate’ vs. ‘boil water to wash
car’. The kettle’s purpose of ‘provide boiling water’, however, will never change, regardless
of the situation or actor involved.
The second fundamental difference between the two, the focus on how activity actually does unfold vs. how it can potentially unfold, is also significant. HTA lies somewhere
between being a descriptive and normative approach to work analysis (Jenkins et al. 2008),
whereas CWA instead formatively describes how work can unfold based on a multi-
perspective view of the constraints that can possibly impact it. HTA focuses on how goals
should be (or are) achieved, whereas CWA focuses on how activity could be undertaken.
Vicente argues that CWA is distinct from approaches such as HTA in that, rather than
prescribe how work should be done or describe how it is currently being done, it seeks to
identify how work could be done if the appropriate tools were made available (Vicente 1999a, p. 340). This has led to many, such as Naikar et al. (2005), claiming that CWA can
identify the information or knowledge that workers need for dealing with a wide variety of
situations, including novel or unanticipated events, whereas HTA can only specify the
information or knowledge that workers need for dealing with routine or anticipated
situations. Annett (2004), however, counters this argument by stating that HTA
seeks to represent system goals and plans rather than focusing solely on observable aspects
of performance. The normative vs. formative output provided by the two is the main focus
of those postulating an advantage of one method over the other, since many argue that
CWA can deal with unanticipated, non-routine events, whereas HTA cannot (e.g. Vicente
1999b, Miller and Vicente 2001, Hajdukiewicz and Vicente 2004). This is also where the
notion that CWA can be used to design ‘first-of-a-kind’ systems comes from; since it is a
formative approach that focuses on the purposes of a system, it can be used to analyse
systems that do not yet exist. It is notable, however, that applications in which CWA has
been applied to the design of a novel, first of a kind system are scarce.

5. Methodological comparison
Methodologically, the two approaches also differ. HTA has a well-versed, step-by-step
methodology (e.g. Annett 2004, Stanton 2006); whereas CWA, more a framework than a
methodology, does not restrict analysts to specific methodologies for each phase, let alone
a rigid procedure. Of the various methods available for each of the five phases, those used
for the WDA (e.g. AH and ADS) and the CTA (e.g. decision ladder and contextual
activity template) phases have the most methodological guidance associated with them
(e.g. Naikar et al. 2005, 2006, Jenkins et al. 2008a) and it is notable that there is a general
lack of guidance for the latter phases (e.g. strategies analysis, SOCA and worker
competencies analysis). The two approaches are also similar in that the level of granularity
pursued is largely down to the analyst(s) involved and the purposes of the analysis itself.
For example, both can be used in a rapid fashion for high level descriptions of systems,
whereas should the analysis requirements dictate, both can go to an extremely fine level
of detail.
There are also similarities in the methodological procedures followed when conducting
analyses with both approaches. Both involve the use of a range of sources for collecting the
data required, including observation, questionnaires, interviews with SMEs, walk-
throughs, user trials and documentation review. This is useful since it often allows both
methods to be applied to the same data, which saves considerable time but produces a
comprehensive output. When applying each method, both require a high degree of
iteration, often with appropriate SMEs, before the analysis is completed to a satisfactory
level. The first pass of both approaches will never produce the final output. Both also
provide highly visual outputs and often require a drawing software package in order to
create the outputs developed.
There are similarities too in the flaws associated with both approaches. Both are
criticised for the high level of resource usage incurred when applied to complex tasks or
systems (although many, including these authors, would argue that the utility of the
outputs generated from both methods far outweighs the cost associated with constructing
them) and concerns are often raised regarding the reliability and validity of both methods. In
response to the high resource usage problem, both approaches have been subject to the
development of software tool support (e.g. Jenkins et al. 2008a, Salmon et al. 2009).
Probably the most important methodological difference between the two approaches
lies in the methodological extensions applied. As referred to above, a range of methods can
be applied to HTA outputs; once the sub-goal hierarchy is complete, there are a number of
additional human factors methods that can be applied in order to extend the analysis
outputs further. For example, a recent HTA software tool developed by the authors
(Salmon et al. 2009) includes the following add-on methods: allocation of functions
analysis (Marsden and Kirby 2004); context and constraints analysis (Shepherd 2001);
DIF analysis, human–computer interaction analysis, interface design analysis
(Hodgkinson and Crawshaw 1985); interface design questions, task decomposition analysis (Kirwan and Ainsworth 1992); job design analysis (Bruseberg and Shepherd
1997); keystroke level model (Card et al. 1983); operational performance statement,
probability and cost of failure analysis, the systematic human error and prediction
approach (SHERPA; Embrey 1986); team task analysis (Burke 2004); training design
questions (Piso 1981); workload assessment (NASA TLX; Hart and Staveland 1988);
subjective workload assessment technique (Reid and Nygren 1988); and the workload
profile technique (Tsang and Velazquez 1996). To the authors’ knowledge, CWA outputs
currently do not act as the input to additional human factors analyses, although work is in
progress investigating possible extensions to the framework (e.g. Jenkins et al. 2008a). This
can be circumvented slightly by the argument that CWA directly informs design and so
does not require additional methods. However, it is notable that not all of the HTA
extensions described are solely design-based (e.g. error prediction and workload
assessment can be undertaken on operational systems).
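One of the listed extensions, the keystroke level model (Card et al. 1983), illustrates how such add-on analyses work: expert execution time is predicted by summing standard operator times. A minimal sketch (the operator times are the commonly cited KLM estimates; the example task sequence is hypothetical, not drawn from the case study):

```python
# Keystroke-level model (KLM) sketch: predicted expert execution time is the
# sum of standard operator times (values are the commonly cited estimates).
KLM_OPERATORS = {
    "K": 0.20,   # keystroke or button press
    "P": 1.10,   # point with mouse to a target
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
    "B": 0.10,   # mouse button press or release
}

def klm_time(sequence):
    """Predict execution time (seconds) for a string of KLM operator codes."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical sub-task: think, point at the map, click to drop a waypoint.
print(round(klm_time("MPB"), 2))  # 1.35 + 1.10 + 0.10 = 2.55
```

Summed over a complete HTA task sequence, such estimates support interface comparison at the level of predicted task time.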

6. Comparisons based on applications


Previous attempts have been made to compare the two approaches in light of specific
applications. Vicente (1999b) compared WDA and the sub-goal template method, a task
analysis-based approach, when used to identify information requirements for interface
design. In discussing their relative advantages and disadvantages, Vicente suggested that
whilst task analysis approaches are more efficient and economical for the analyst, WDA
is more flexible, has a broader scope of application and has a greater ability to inform the
recovery from errors. Vicente’s argument seems to centre on the notion that WDA can
cope with unanticipated events, whereas HTA cannot; it identifies what needs to be done
and how it should be done, but not how it could be done in unforeseen circumstances.
Miller and Vicente (2001) compared the AH (from the WDA phase) and HTA when
used to identify display requirements for the DURESS II feedwater domain (see Vicente
(1999a) for description). The resultant requirements generated suggested that both offer
unique contributions; the display requirements produced by the two approaches were
substantially different but also complementary (Miller and Vicente 2001). Despite this,
Miller and Vicente (2001) did identify the advantages of each approach over the other.
According to Miller and Vicente (2001), the ADS provides explicit knowledge regarding
the affordances (constraints and capabilities for behaviour) of the domain, more readily
identifies information requirements and is independent of the context in which the system
is used, whereas HTA provides procedural knowledge, is more human-centred in that it
focuses on what operators need to do, more readily identifies when, how and with what
priority information will be required and is more dependent on the context of use. In
conclusion, Miller and Vicente (2001) suggested that although providing different
perspectives, both approaches produce complementary information about the interaction
that users of a system will have. Further, they noted that the ADS approach, wherever
possible, should precede the HTA approach.
Hajdukiewicz and Vicente (2004) attempted to clarify the relationship between WDA (the
first phase of CWA) and task analysis. Based on a case study using the two approaches to
analyse the DURESS thermal hydraulic process control microworld, Hajdukiewicz and
Vicente (2004) depicted the relationship between the two as a series of transformations from
a complete work domain structure (WDA) to a specification of a set of actions and work
domain states (task analysis). They argue that the WDA presents a model of the complete
Theoretical Issues in Ergonomics Science 513

work domain structure and that moving from this to a relevant work domain
structure, which shows all action possibilities for a particular category of events,
represents the first step away from WDA and toward task analysis. Following this, the
transformation from a relevant work domain structure to a utilised work domain structure
depicting the subset of utilised action possibilities at a particular point in time takes a
further step away from WDA and toward task analysis. Next, the transformation from a
utilised work domain structure to a desired work domain state represents the final step
toward task analysis, which maps current states onto desired states via a set of human or
automated actions (Hajdukiewicz and Vicente 2004). They describe WDA as event and
time independent, which supports worker adaptation to novelty and change, and task
analysis as event and time dependent, which supports worker performance in anticipated
situations.
It is notable that the comparisons discussed are presented, in the main, by proponents
of the CWA approach (e.g. Vicente, Hajdukiewicz); indeed, it could be argued that they
are slightly one-sided in favour of CWA. Both Miller and
Vicente (2001) and Hajdukiewicz and Vicente (2004) seem to take a constrained view of
HTA and its capabilities and tend to ignore the various extensions that can be added. For
example, in a retort to Miller and Vicente (2001), Stanton (2006) argued that some of the
shortcomings identified by Miller and Vicente (2001) could have been removed had they
considered the various extensions of HTA (e.g. interface design). Stanton (2006) also
argued that Miller and Vicente’s (2001) analysis indicated that they may have been using
HTA in a constrained manner (i.e. focusing on human agents and actions as opposed to
the whole system).
It is also interesting to note that the comparisons discussed do not consider the latter
phases of the CWA framework; rather they compare only the first phase, WDA, with
HTA. This is due partly to the fact that there has been only limited application of the latter
phases of CWA (published in the open literature at least) and that there is only limited
guidance on how to conduct the latter phases. Recently, however, applications of the
latter phases have been described and guidance on how to undertake these analyses has
emerged (e.g. Naikar et al. 2006, Jenkins et al. 2008a).

7. Rotary wing mission planning system case study


For the purposes of this article, recent HTA and CWA analyses of a rotary wing
mission planning software tool are presented. The software tool was recently developed in
line with the military’s movement towards so-called network-enabled capability (NEC)-
based systems. NEC is a currently popular organisational paradigm that involves the use
of advanced technology to enhance decision making during operations (Bolia 2005). As
the name suggests, NEC involves the use of linked technological artefacts to enhance
information sharing and interaction between elements of warfare systems. The Ministry of
Defence's Joint Services Publication (JSP) 777 (Ministry of Defence 2005) defines NEC as:
The coherent integration of sensors, decision-makers, weapons systems and support
capabilities to achieve the desired effect. It will enable us to operate more effectively in
the future strategic environment through the more efficient sharing and exploitation of
information within the UK Armed Forces and with our coalition partners. The bottom line is
that it will mean better-informed decisions and more timely actions leading to more
precise effects.

Underpinning NEC then is the use of digitised warfare systems, which permit
connectivity between multiple actors and the rapid dissemination of data between them.
Accordingly, there has been a recent spate of digitised mission support systems being
developed, tested and even introduced in theatre. The analysis in this case focuses on a
digitised MPS that, at the time of analysis, was used to support planning for military
helicopter missions.
Mission planning is an essential part of flying a military aircraft. Whilst in the air,
pilots are required to perform, in parallel, cognitively intensive activities, including time
keeping, hazard perception and off-board communication, all whilst navigating through
a 3-D airspace. Pilots are
required to constantly evaluate the effects that their actions have on others within the
domain. Decisions need to be made that consider any number of both military and
non-military services, organisations and civilian groups. Calculations need to be made based
upon a number of physical considerations. These include environmental constraints,
aircraft performance and payloads. Pilots also need to balance mission objectives with
rules of engagement and high order strategic objectives. Pre-flight planning is one essential
method used to alleviate some of the pilot’s airborne workload. This planning process,
which was formerly conducted on paper maps, is now supported by the MPS tool focused
on in this article.
The MPS tool analysed provides and processes digital information on battlefield data,
threat assessment, intervisibility, engagement zones, communication details, transponder
information and identification friend or foe settings. In short, the MPS is used to plan and
assess single and multiple aircraft sortie missions. When using the MPS system, mission
plans are generated prior to take off on PC-based MPS terminals. Key information
developed in the software tool is transferred to the aircraft via a digital storage device
called a ‘data transfer cartridge’ (DTC). Information is presented on the aircraft’s onboard
flight display. This multi-function display can be used by the pilot to assist in navigation
and target identification. The HFI-DTC consortium were invited by the system’s creator
to undertake a human factors analysis of the MPS in order to generate system redesign
recommendations.

8. Methodology
8.1. Participants
Two human factors researchers from the Ergonomics Research Group at Brunel
University, each with significant experience in the application of both HTA and CWA
in complex systems analyses, took part in this study. For the purposes of data collection
and validation of their analyses, the analysts were given access to four participants who
were domain and mission planning SMEs. The SMEs were all qualified pilots with
significant experience in planning missions using the old paper map system and the new
MPS. One SME is a UK flight instructor and trainer for the MPS and the other three are
serving airmen who use the MPS regularly.

8.2. Materials
The materials used included a laptop computer containing the MPS software tool,
HTA and CWA software tools (developed by the HFI-DTC, see www.hfidtc.com
for details on the tools and how to obtain free copies) and video and audio
recording equipment. A number of documents relating to the mission planning procedure
were also used (e.g. training manuals, standard operating instructions, navigation cards,
etc). Paper maps, acetates and drawing equipment were also used to demonstrate the
existing paper map planning process.

8.3. Procedure
An initial 2-day meeting was held in order to introduce the analysts involved to the mission
planning process and to familiarise them with the MPS software tool. Following this, the
data collection procedure involved the conduct of a number of interviews with the four
SMEs and also SME walkthroughs of MPS planning tasks. The interviews and
walkthroughs were recorded using audio and video recording equipment. In total, three
interview/walkthrough meetings were held, each lasting approximately 5 hours.
The data collected were used to inform two separate analyses: one involved using
the CWA framework to analyse the MPS tool; the other involved using the HTA
methodology to analyse the MPS tool. For the CWA analysis, WDA, ConTA, SOCA
and example strategies analyses were developed for a three-phase mission-planning
scenario. For the HTA, four HTAs were constructed: one for a three-phase mission
planning scenario; one for an Afghanistan operational scenario; one generic HTA for
the MPS software tool; and one for the traditional paper map (pre-MPS) planning process.
This article focuses on the three-phase mission analyses. In short, the three-phase
mission scenario involved one SME generating a workable plan for a three-phase
mission. A three-phase mission typically involves ingress (travelling from holding point
to target area), delivery of effect (i.e. destroy target) and egress (travelling out of target
area to holding point). Planning for this mission included analysing the battlefield area
(e.g. in terms of hazards, cover and concealment), planning ingress and egress routes,
identifying suitable holding point and battle positions (e.g. suitable position to destroy
target from), calculating intervisibility for the target and calculating and configuring
fuel and payload required. The three-phase mission planning HTA was then used to
inform the conduct of a SHERPA (Embrey 1986) human error identification analysis
and a task decomposition (Kirwan and Ainsworth 1992) analysis. Each of the outputs
derived from both analyses were reviewed by the MPS SMEs and subsequently refined
based on their feedback.
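Intervisibility checks of the kind supported by the MPS amount to line-of-sight tests over a terrain profile between observer and target. A simplified sketch (flat-earth geometry and invented terrain heights, not the actual MPS algorithm):

```python
def has_line_of_sight(observer_h, target_h, terrain):
    """Return True if no terrain sample rises above the straight sight line.

    terrain: list of (fraction_along_path, height) samples between the observer
    (fraction 0.0) and the target (fraction 1.0). Flat-earth simplification.
    """
    for frac, height in terrain:
        sight_line_h = observer_h + frac * (target_h - observer_h)
        if height > sight_line_h:
            return False  # terrain masks the target at this sample
    return True

# Invented terrain profile: a 120 m ridge at 60% of the path.
ridge = [(0.3, 40.0), (0.6, 120.0), (0.8, 30.0)]
print(has_line_of_sight(observer_h=100.0, target_h=50.0, terrain=ridge))   # False
print(has_line_of_sight(observer_h=200.0, target_h=150.0, terrain=ridge))  # True
```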

9. Results
An extract of the three-phase mission planning scenario HTA is presented in Figure 3.
Within Figure 3, only the high level goal and sub-goals are displayed (each sub-goal was
decomposed to button or key press level), along with an example decomposition. An
extract of the SHERPA human error analysis is presented in Table 1 and an extract of the
task decomposition analysis is presented in Table 2.
The AH developed for the MPS is presented in Figure 4; this was subsequently used to
inform the development of ConTA, SOCA and strategies analyses. Extracts from these
three analyses are presented in Figure 5.
0. Plan three-phase mission using MPS
   1. Undertake preliminary activities
   2. Establish environment & constraints
   3. Configure aircraft
   4. Plan positions & routes
      4.1. Plan and add positions
         4.1.1. Plan and add holding position
         4.1.2. Plan and add battle position
      4.2. Plan & add routes
         4.2.1. Plan and add ingress route
         4.2.2. Plan and add effect routes
            4.2.2.1. Plan and add route in
            4.2.2.2. Plan and add route out
         4.2.3. Plan and add egress route
   5. Plan aircraft performance
   6. Check routes & add SA
   7. Prepare DTC for data download
   8. Write data for DTC

Figure 3. Three-phase mission hierarchical task analysis extract; figure shows example decomposition for sub-goal 4. Plan positions & routes. MPS = mission planning system; SA = situation
awareness; DTC = data transfer cartridge.
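The sub-goal hierarchy in Figure 3 maps naturally onto a tree structure, which is essentially what HTA software tools store and traverse. A minimal sketch (node labels follow the figure; plans and the sub-goals not shown in the extract are omitted):

```python
# Minimal tree representation of the Figure 3 HTA extract: each node is a
# (label, children) pair; plans are omitted for brevity.
hta = ("0 Plan three-phase mission using MPS", [
    ("4 Plan positions & routes", [
        ("4.1 Plan and add positions", [
            ("4.1.1 Plan and add holding position", []),
            ("4.1.2 Plan and add battle position", []),
        ]),
        ("4.2 Plan & add routes", [
            ("4.2.1 Plan and add ingress route", []),
            ("4.2.2 Plan and add effect routes", [
                ("4.2.2.1 Plan and add route in", []),
                ("4.2.2.2 Plan and add route out", []),
            ]),
            ("4.2.3 Plan and add egress route", []),
        ]),
    ]),
])

def print_hierarchy(node, depth=0):
    """Print the sub-goal hierarchy with indentation reflecting decomposition."""
    label, children = node
    print("  " * depth + label)
    for child in children:
        print_hierarchy(child, depth + 1)

print_hierarchy(hta)
```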

10. Discussion
The purpose of this article was to compare two popular contemporary human factors
approaches, HTA, a traditional task analytical methodology, and CWA, a contemporary
design framework. First, the outputs of the two are compared and contrasted and then the
potential for applying both approaches in a complementary fashion is discussed.
In the case study presented, each method produced entirely different outputs. It is in
the application of the methods where perhaps the only real similarities between the two
exist. The data collection process for both methods involved observation of process,
interviews with SMEs and reviews of relevant documentation, such as standard operating
procedures. In this case, both incurred similar data collection times and levels of SME
input; however, it is notable that, despite offering more outputs, the CWA approach was
the quickest to use, with an application time of approximately half of that of the HTA in
this case. This was ostensibly down to the level of granularity pursued in the HTA analysis
(for human error and task decomposition analysis purposes a button press level of
granularity was required).
HTA provided a goal-oriented description of the three-phase mission planning process
when using the MPS software tool, including a detailed description of goal attainment
throughout the planning process. Additionally, the plans component described the
contextual conditions that dictate the order in which the goals and sub-goals are
completed. The HTA description is particularly useful since it details exactly what goals
and sub-goals need to be achieved and how these goals and sub-goals are achieved, in
order to plan a three-phase mission using the MPS system. The process of constructing the
HTA itself enabled the analyst to develop a deep understanding of the domain and task
Table 1. Systematic human error reduction and prediction approach (SHERPA) mission planning system extract.

Task Step | Error Mode | Description | Consequences | Recovery | P | C | Remedial Measures
4.2.3.1.1.1. Check current system mode | C1 | User fails to check current system mode | System may be in inappropriate mode for desired operation (i.e. user might drop a CM rather than zoom in to map) | Immediate/4.2.3.1.1.1 | H | L | Current Mode Display; requirement to select desired function prior to data input; system reverts to Standard Mode every time
4.2.3.1.2.1.1. Check Friendly & Enemy Lines of Sight on Intervisibility | R2 | User misreads line of sight on intervisibility display | User misunderstands line of sight and line of sight may be inadequate or threatening | None | L | H | System auto generates Line of Sight Rating & Recommendations
4.2.3.1.2.1.2. Check Back Drop on Intervisibility | R2 | User misreads Back Drop on intervisibility display | User misunderstands Back Drop and line of sight may be inadequate or threatening | None | L | H | System auto generates Back Drop Rating & Recommendations
4.2.3.2.1. Compute Radar Range | A7 | User computes radar range incorrectly | Radar range is calculated incorrectly and the wrong radar range data is presented to the user | None | L | H | –
4.2.3.2.2. Check Radar Ranges | C1 | User fails to check radar ranges | Radar range is not checked | 4.2.3.2.2 | L | H | System prompt to check radar ranges post intervisibility function; system warning if route/positions are in radar range; system auto-checks radar ranges
4.2.3.3.3. Draw line from Target to Battle Position using Offset Tool | A5 | Draw offset line from/to incorrect point on the map | Offset range and bearing data is inappropriate | Immediate | M | M | System auto-connects offset line to relevant features on the map; mouse over function presents item details to user
4.2.3.3.4.1. Check Range | R2 | Misread Range | User misunderstands range details | – | L | H | Larger text; colour coding
4.2.3.3.4.1. Check Bearing | R2 | Misread Bearing | User misunderstands Bearing details | – | L | H | Larger text; colour coding
4.2.3.4.3. Left click and hold on symbol on map display | A6 | User selects the wrong map display symbol | Wrong symbol is selected and wrong symbol may be moved on the map display | Immediate | H | M | Mouse over function that presents details of each item on the map display
4.3.2.3.1. Select 'add to end of route' function on the toolbar | A6 | User selects the wrong function (e.g. add to start of route instead of add to end of route) on the toolbar | Wrong function is selected | Immediate | M | L | Clearer control labelling; greater control icon separation
4.3.2.3.6.1. Select CM function on the toolbar | A8 | User fails to select CM function on the toolbar | CM function is not selected | 4.1.3.2 | L | L | Current Mode Display; requirement to select desired function prior to data input; system reverts to Standard Mode every time
4.3.2.3.6.1. Select CM function on the toolbar | A7 | User selects overlay drop down menu by mistake | CM function is not selected | 4.1.3.2 | L | L | Current Mode Display; requirement to select desired function prior to data input; system reverts to Standard Mode every time
4.3.2.3.6.1. Enter waypoint name | A7 | User enters the wrong waypoint name | Wrong waypoint name is entered | Immediate | L | L | N/A
4.3.2.3.6.2. Click on map to drop waypoint | A5 | User clicks on the wrong area on the map | Waypoint is dropped in the wrong area on the map | Immediate | M | L | N/A
Table 2. Task decomposition extract.

Task Step: 4.2.1. Identify Suitable Battle Positions
  Type of Activity: Check; Decision; Action
  Initiating Cue/Event: Beginning of Battle Position selection planning process
  Information Required: Map/Area Info; Terrain; Target Info; Route Info; Fuel & Load Performance Info; Weapons Info; Enemy Info; Environment and Constraints Info
  Location: Map Display
  Controls & Displays Used: Map Display; zoom in and out controls; Intervisibility; Offset Tool
  Actions Required: Check area (targets, routes, hazards, towns etc.); zoom in and out; scope out terrain and positions; set up and check intervisibility; select appropriate Battle Positions
  Decisions Required: Identification of most suitable Battle Positions
  Complexity: H; Difficulty: H; Criticality: H
  Output: Battle Positions
  Feedback: N/A
  Comments: Automation of process – intelligent MPS system identifies suitable Battle Positions based on target and route information

Task Step: 4.2.2. Add Battle Position(s) to map
  Type of Activity: Action; Check
  Initiating Cue/Event: Selection of suitable Battle Positions
  Information Required: Battle Positions; Map/Area Info; Terrain; Target Info; Route Info; Fuel & Load Performance Info; Weapons Info; Enemy Info; Environment and Constraints Info; area features on map
  Location: Map Display
  Controls & Displays Used: Mouse; keyboard; Map Display; Intervisibility; CM function; Edit Symbol Window; Offset Tool; overlays
  Actions Required: Select CM icon on the toolbar; click on desired area on map; classify CM as Battle Position and enter Battle Position details (name, colour, quadrant etc.); check Battle Position details; check Battle Position on map; set up and check intervisibility
  Decisions Required: Battle Position details and placement on map
  Complexity: L; Difficulty: L; Criticality: H
  Output: Marked up Battle Positions on the Map Display
  Feedback: Battle Positions presented on map display
  Comments: Dynamic intervisibility system – intervisibility system auto recalculates upon movement of Battle Position; intervisibility comparison function – MPS system compares different Battle Positions and rates each one

Task Step: 4.2.3. Check and Modify Battle Positions
  Type of Activity: Action; Check
  Initiating Cue/Event: Selection of suitable Battle Positions
  Information Required: Battle Positions; Route Info; Map/Area Info; Terrain; Target Info; Fuel & Load Performance Info; Weapons Info; Enemy Info; Environment and Constraints Info; area features; intervisibility heights; weapons range; backdrop; radar coverage
  Location: Map Display
  Controls & Displays Used: Mouse; keyboard; Map Display; Intervisibility; CM function; Edit Symbol Window; Offset Tool; overlays
  Actions Required: Set up intervisibility; check intervisibility; consider Battle Position aspects (e.g. range, weapons, backdrop etc.); modify Battle Position (see 4.2.2); compute radar ranges; check radar ranges; check range using offset tool
  Decisions Required: Determine the most suitable Battle Position(s) based on target, weapons, intervisibility outputs, backdrop, radar coverage, ingress and egress routes etc.
  Complexity: H; Difficulty: H; Criticality: H
  Output: Selection of most suitable Battle Positions for the mission; positions marked up on the map
  Feedback: N/A
  Comments: Dynamic intervisibility system – intervisibility system auto recalculates upon movement of Battle Position; intervisibility comparison function – MPS system compares different Battle Positions and rates each one; Battle Position checklist; Battle Position rating system
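The Cx/D/Cr (complexity, difficulty, criticality) ratings in Table 2 can likewise be queried, for example to flag automation candidates of the kind the Comments column suggests. A sketch using the three tabulated task steps:

```python
# Task decomposition ratings from Table 2 (Cx = complexity, D = difficulty,
# Cr = criticality); steps rated high on all three are automation candidates.
ratings = {
    "4.2.1 Identify Suitable Battle Positions": ("H", "H", "H"),
    "4.2.2 Add Battle Position(s) to map": ("L", "L", "H"),
    "4.2.3 Check and Modify Battle Positions": ("H", "H", "H"),
}

automation_candidates = [step for step, (cx, d, cr) in ratings.items()
                         if (cx, d, cr) == ("H", "H", "H")]
print(automation_candidates)
```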
Figure 4. Mission planning system abstraction hierarchy.
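In general terms, an AH of the kind shown in Figure 4 connects nodes on adjacent levels of abstraction through means–ends links, so that links upward answer why a node exists and links downward answer how it is achieved. A generic sketch (the node names are invented placeholders, not the actual content of Figure 4):

```python
# Generic abstraction hierarchy sketch: nodes on adjacent levels are connected
# by means-ends links. Node names are invented placeholders for illustration.
means_ends = {
    ("plan safe, effective mission", "manage risk exposure"),
    ("manage risk exposure", "assess intervisibility"),
    ("assess intervisibility", "compute line of sight"),
    ("compute line of sight", "intervisibility display"),
}

def whys_and_hows(node):
    """Return (why the node exists, how it is achieved) from the links."""
    whys = [end for end, means in means_ends if means == node]
    hows = [means for end, means in means_ends if end == node]
    return whys, hows

print(whys_and_hows("assess intervisibility"))
```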


Figure 5. Extracts from control task, strategies, and social organisation and cooperation analyses.

under analysis and the output produced has utility for both system design and evaluation.
For system design, this HTA output is useful for interface design (by specifying
information requirements), training design (by specifying task requirements and
sequences), procedure design (by specifying task sequences) and allocation of functions
(by specifying which agent does what and when) purposes. For system evaluation, the
output can be used to inform all manner of evaluations, including error prediction and
analysis, interface evaluation, knowledge, skills and attitudes analysis (which can be used
to design and evaluate training programmes) and team task analysis. In this case, the
SHERPA and task decomposition approaches were used to evaluate the MPS in terms of
interface design and potential for design-induced user errors.
CWA, on the other hand, provided four separate but related outputs, including the
ADS, the AH, the ConTA and the SOCA. These encompass a description of the
constraints imposed on activity, a formative description of the main activities involved and
the impact of systemic and situational constraints on them and a description of the
potential allocation of these activities between the actors working within the planning
system. Collectively, this enabled the mission-planning process (regardless of whether it
was undertaken using the MPS or paper-based maps) to be exhaustively described in terms
of constraints, which also led to the analyst acquiring a deep understanding of the domain
and mission-planning task. Decomposing activity-related constraints in this way also
offers a powerful description of the system in question and this information can be used to
directly inform system and interface design, training design and allocation of functions.
Despite being less detailed, and at a higher level of analysis than the HTA, this case
study suggests that, based on each method's outputs alone, CWA offers more (especially
when one considers its quicker application time in this case). The CWA output offers
more perspectives on the system in question; its five-phase approach allows analysts to
describe the constraints on the system under analysis more exhaustively, albeit at a coarser
level of granularity than HTA. Whilst HTA extends its utility by acting as the input to
various other analysis approaches, such as human error identification and interface
analysis approaches, its initial output alone is far less comprehensive in terms of
the different perspectives on activities that CWA offers.
The difference in the level of granularity offered by the two analyses is significant. The
HTA goal decomposition went as far down as detailing button press activities (e.g. click on
intervisibility on drop down menu, click right mouse button to activate mapping menu,
click mouse on map to drop waypoint), whereas the CWA analysis worked at a much
higher level. This indicates one way in which the two can be used in a complementary
system design framework: CWA can specify, at a high level, what functions are
required, how they can potentially be undertaken (in terms of high level strategies) and by
whom; HTA, at a more detailed level of granularity, can then be used to describe
exactly how these strategies might unfold, which, in turn, can be used to predict any errors
likely to emerge during task performance. By beginning at a high level of granularity and
describing the system and how activities can potentially unfold with CWA, one can then
specify at a greater level of detail, through HTA, how activities should occur and what
likely problems may be encountered. It is worth pointing out that the difference in
granularity of the outputs seen in this case does not mean that CWA cannot achieve the
same levels of granularity as HTA. However, for CWA to produce outputs of a similar
level of granularity to HTA, considerably more application time and data collection time
would be required.
One distinct advantage that HTA currently has over the CWA framework is its ability
to inform other analysis methods. For example, the HTA software tool used for this
analysis boasts 17 additional analysis modules, which use the initial HTA output as their
input (Salmon et al. 2009). At the time of writing this article, CWA does not have any
additional analysis methods available and is limited to its five phases and analyst
interpretation, although work is in progress investigating possible extensions to the
framework (e.g. Jenkins et al. 2008a). This advantage of HTA over CWA is tempered
slightly by the fact that CWA directly informs design and so does not require additional
methods. However, to the authors’ knowledge there are no formal, structured human
factors methods that are underpinned directly by an initial CWA.
What was done with the outputs is an interesting point of comparison
between the two. The detailed task description offered by HTA is useful, but it is more
descriptive than analytical and, save for information requirements specification, offers little
direct input into design; only when other additional methods are applied to the HTA
output can it be used to inform the design process. In this case, SHERPA human error
identification analysis (Embrey 1986) and task decomposition analysis (Kirwan and
Ainsworth 1992) were undertaken, both of which offered significant design recommendations
for future iterations of the MPS. The SHERPA analysis was used to identify
elements of the current MPS graphical user interface that were inadequate and could
potentially induce user errors during the mission-planning task. For example, the menu
structures used were found to be too deep, which could potentially induce incorrect
selection of items from the menus by the user. Also, there were many instances in which
the MPS required the user to input data that were already contained within the system
(i.e. had already been entered by the user elsewhere), which created potential for
incorrect data entry errors. Other examples included instances where the user was
required to make calculations that the system could feasibly make, which created the
potential for miscalculation errors (e.g. of routes, fuel and loads). Remedial measures
were specified for the errors identified. These included primarily simple interface
modifications, such as automatic data propagation between windows, the provision of a
mouse-over function (i.e. pertinent information is displayed when user holds mouse over
object on map), the use of scalable icons on the map (i.e. icons that scale automatically
when the user zooms in and out of the map area), the use of shorter and simpler menu
structures, the use of standardised symbology (i.e. consistent between MPS and aircraft
being used) and the provision of a filter function, whereby the user could filter the map as
desired to remove unwanted clutter and reduce map-reading errors (i.e. filtering
functions recommended included enemy, friendly, route, ingress, holding areas, battle
positions, egress and route features).
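The recommended filter function amounts to toggling the visibility of display layers and drawing only symbols on layers the user has left visible. A hypothetical sketch (the layer names follow the recommendation above; the symbol records are invented):

```python
# Hypothetical map layer filter, following the recommended filter categories.
LAYERS = {"enemy", "friendly", "route", "ingress", "holding areas",
          "battle positions", "egress", "route features"}

def filter_map(symbols, visible_layers):
    """Return only the map symbols whose layer the user has left visible."""
    return [s for s in symbols if s["layer"] in visible_layers]

# Invented symbols: a holding point, a battle position and an enemy contact.
symbols = [
    {"name": "HP1", "layer": "holding areas"},
    {"name": "BP2", "layer": "battle positions"},
    {"name": "EN3", "layer": "enemy"},
]
decluttered = filter_map(symbols, visible_layers={"enemy"})
print([s["name"] for s in decluttered])
```

Restricting the drawn symbol set in this way is one plausible route to the reduced map clutter and fewer map-reading errors the recommendation targets.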
The task decomposition analysis was used in order to decompose and analyse the
component mission-planning tasks to a deeper level of detail. The task decomposition
categories focused on included the type of activity (e.g. action, check, communication), the
initiating event or cue, information requirements, interface features (e.g. location, controls
and displays used), actions and decisions required, complexity and difficulty, outputs and
feedback. The task decomposition analysis in this case was useful in that it specified
explicitly the step-by-step task sequences involved in mission-planning operations. This
enabled the identification of a lack of links between the component parts of the MPS
software tool and the requirement for data propagation and additional automation of
planning tasks.
One of the most attractive aspects of the CWA framework is that it is couched as a
design methodology. It is alleged that CWA directly informs the design of complex
socio-technical systems (e.g. Vicente 1999a), although it must be noted that there has
been limited evidence of this in the open literature. In this case, the CWA outputs
Theoretical Issues in Ergonomics Science 525

Table 3. Mission planning system (MPS) cognitive work analysis (CWA) and hierarchical
task analysis (HTA).
Downloaded by [Temple University Libraries] at 17:06 17 November 2014

Note: The table shows what analysis each method produced, what extensions were applied and
what input to the MPS design each method offered.

indicated that the MPS interface was inappropriately designed in relation to the mission-
planning process. In particular, they suggested that the current interface embodies a
physical, rather than functional structure (i.e. the interface design and ordering of tasks
does not bear a strict resemblance to the actual functional structure of mission
planning). This conclusion was used to inform the redevelopment of the MPS training
syllabus structure. The WDA highlighted how the MPS uses a system of different
windows (e.g. aircraft configuration window, payload window, fuel window, route leg
window) to support the mission-planning process and that current training focuses on
training the users how to use each of these component windows, rather than on how to
plan a mission. It was concluded that this might lead to users developing a physical
understanding of the mission-planning process (i.e. understanding of how each
component window works) rather than a functional understanding (i.e. understanding
of the different functions involved and the relationships between them). The WDA
output, therefore, suggested that MPS training should focus initially on the mission-
planning process and then on the MPS functions that support the process, rather than
focus primarily on the MPS software tool. The means–ends links specified, and the
526 P. Salmon et al.

structure of the AH, were used to form the basis for training lesson sequencing and
teaching structure, which, in turn, has led to a more activity-focused teaching structure,
as opposed to the current application-focused training (Jenkins et al. 2008c). The
descriptions provided by the two approaches and the ways in which they were used in
this case are presented in Table 3.
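As a loose sketch of the idea, the means–ends links of an AH map each planning function to the physical windows that support it, and a function-first lesson sequence can be read straight off that mapping. The functions, window names and structure below are invented for illustration and do not reproduce the actual MPS abstraction hierarchy.

```python
# Hypothetical fragment of an abstraction hierarchy: each mission-planning
# function (end) maps to the physical MPS windows (means) that support it.
means_ends = {
    "Plan route": ["route leg window", "map display"],
    "Configure aircraft": ["aircraft configuration window", "payload window"],
    "Manage fuel": ["fuel window"],
}

def lesson_sequence(ah):
    """Order lessons function-first: introduce each planning function,
    then the windows (means) that support it."""
    lessons = []
    for function, windows in ah.items():
        lessons.append(f"Function: {function}")
        lessons.extend(f"  Tool: {w}" for w in windows)
    return lessons

for lesson in lesson_sequence(means_ends):
    print(lesson)
```

The point of the sketch is the ordering: the activity (function) leads and the application windows follow, mirroring the shift from application-focused to activity-focused training described above.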
It is clear that there are significant advantages associated with each form of analysis.
The outputs of both are useful in their own right and, whether designing new systems or
analysing existing ones, using both creates no conflict: neither output contradicts the
other; rather, each describes the system in a different manner. In addition, both
approaches can often be applied to the same input data. This potential for using the
approaches in a complementary manner requires further discussion. The practical
question to ask is whether there are significant gains to be made by applying both
methods during system design or evaluation. Certainly, there is no loss. For system
design purposes, most agree
that there is significant utility in applying both (e.g. Vicente 1999b, Miller and Vicente
2001, Hajdukiewicz and Vicente 2004). At the most obvious level, the two can be used in
a complementary fashion by applying them at either end of the system design life cycle.
The formative nature of CWA means that it is better employed during the early stages of
idea forming and concept design specification, whereas the normative nature of HTA
means that it is likely to be more useful during the latter stages when there is an actual
system or design concept present to evaluate and refine. This represents one level on which
the two can be complementary: CWA as the design specification approach; HTA as the
design concept evaluation and redesign approach. In this way, CWA acts as the front end
revolutionary design approach and HTA acts as the latter design phase evolutionary
design approach.
It should not be forgotten, however, that both approaches are perfectly capable of
design specification, problem identification and remedial measure specification when used
in isolation. Exactly how undertaking both forms of analysis better informs the design
process therefore needs exploring. HTA, along with its various extensions (e.g. human
error identification, interface evaluation), is probably better at problem identification,
such as identifying the errors that are likely to occur (CWA requires significant SME
input or analyst expertise in order to identify valid issues), whereas CWA is probably
better at generating new designs to solve the problems identified. One complementary
application in this sense would be to use HTA to predict the errors likely to occur with a
given design concept (produced by CWA) and CWA to generate new design solutions
that eradicate the errors identified.
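A minimal sketch of this division of labour is to tag HTA-derived bottom-level task steps with candidate error modes. The task steps, the keyword rules and the error-mode wording below are all hypothetical stand-ins for analyst judgement; a real SHERPA-style analysis classifies behaviours against a taxonomy, not keywords.

```python
# Hypothetical bottom-level task steps from an HTA of a design concept.
task_steps = [
    "Enter waypoint coordinates",
    "Check fuel calculation",
    "Select route on map",
]

# Crude keyword-to-error-mode rules standing in for analyst judgement.
rules = {
    "Enter": "action error: wrong data entered",
    "Check": "checking error: check omitted",
    "Select": "selection error: wrong object selected",
}

def predict_errors(steps, rules):
    """Tag each task step with a candidate error mode (illustrative only)."""
    predictions = {}
    for step in steps:
        first_word = step.split()[0]
        predictions[step] = rules.get(first_word, "no credible error identified")
    return predictions

for step, err in predict_errors(task_steps, rules).items():
    print(f"{step} -> {err}")
```

In the complementary workflow described above, a list like this would be handed back to the CWA-led design activity, which would generate design solutions addressing each predicted error.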
But does doing both really add anything? As discussed, both approaches are perfectly
capable of being used, in isolation, throughout the entire system design life cycle, from
design concept specification to operational system, and so one could argue that sticking
with one approach throughout may suffice. Exactly what gains are associated with
applying both methods is therefore of interest. First, the argument that CWA acts as the
revolutionary design specification method and HTA acts as the design evaluation and
refining method is reiterated. The advantage of using both in this sense is that a system
design specification is produced by one and then refined by the other. Second, the
comprehensiveness of the output is assured. In dealing with goals and sub-goals and
purposes and constraints, the system is described in a comprehensive manner. The goals
and sub-goals description shows the activities that go on within the system, whereas the
purposes and constraints analysis shows the constraints affecting activity and also the
activity that could go on within the system. Further, the who, what, why, when and where
associated with these goals and constraints is dealt with. Third, the system in question can
be analysed from both a high and low level of granularity. CWA offers a high level
description of the system, whereas HTA offers a more detailed description of activities at
the minute level. Fourth, and finally, all manner of human factors and ergonomics
analyses can be undertaken. As articulated previously, HTA offers numerous human
factors analysis extensions. CWA, on the other hand, deals with the work domain, the
control tasks and decisions, strategies for undertaking tasks, allocation of function across
humans and technological agents and the cognitive skills required.
Analyst experience, skill and methodological preference aside, the selection of one
method over the other is likely to be a function of the analysis requirements, the phase of
design at which the input is required and/or the domain in which the analysis is to take
place. Obviously, for first of a kind system design, or for input at the beginning of the
system design life cycle, CWA is more appropriate. For describing and analysing an
existing system or concept, or for making small incremental design modifications, HTA is
more appropriate (in combination with one of its many extensions). If both methods
satisfy the analysis requirements, however, and analysis constraints (e.g. time available,
analyst skill set) mean that only one can be applied, then looking at the domain in which
the analysis is to take place is also useful. For domains in which activity is more structured
and proceduralised, HTA is likely to be more useful; however, for more complex domains,
CWA is likely to be more beneficial.
In closing, it is clear that the two approaches are fundamentally different. HTA describes
systems normatively in terms of goals, plans and activities, whereas CWA describes
systems formatively in terms of the constraints imposed on activities. Both have their
utility and indeed their place in the human factors practitioner's armoury. Whilst this
case study showed that CWA offers more in terms of the number of outputs and
perspectives on the system under analysis, it also showed that HTA offers more scope
for further analysis through established, structured approaches (e.g. error prediction,
interface design, allocation of functions, training programme design), albeit at a further
time cost. It is concluded, however, that, although inherently different in terms of their
theoretical underpinning, approach and output, HTA and CWA can be used as
complementary approaches for either evaluating or designing complex systems. In
particular, CWA appears more suited to design specification, whereas HTA appears
more suited to design modification.

Acknowledgements
This work from the Human Factors Integration Defence Technology Centre was part-funded by the
Human Sciences Domain of the UK Ministry of Defence Scientific Research Programme. The
authors would like to thank Nick Wharmby, Shaun Wyatt, Jan Ferraro and Sean Dufosee for their
assistance in the data collection, data analysis and interpretation of the analysis products. Also, the
authors would like to thank the anonymous reviewers whose comments helped to improve this
article significantly.

References

Adams, P. and David, G.C., 2007. Light vehicle fuelling errors in the UK: the nature of the problem,
its consequences and prevention. Applied Ergonomics, 38 (5), 499–511.
Ahlstrom, U., 2005. Work domain analysis for air traffic controller weather displays. Journal of
Safety Research, 36, 159–169.
Annett, J., 2004. Hierarchical task analysis. In: D. Diaper and N.A. Stanton, eds. The handbook of
task analysis for human–computer interaction. Mahwah, NJ: Lawrence Erlbaum Associates,
67–82.
Annett, J. and Stanton, N.A., 1998. Task analysis. Ergonomics, 41 (11), 1529–1536.
Annett, J., et al., 1971. Task analysis. Department of Employment Training Information Paper 6.
London: HMSO.
Annett, J. and Stanton, N.A., 2000. Task analysis. London, UK: Taylor & Francis.
Bisantz, A.M. and Burns, C.M., 2008. Applications of cognitive work analysis. Boca Raton, FL: CRC
Press.
Bisantz, A.M., et al., 2003. Integrating cognitive analyses in a large-scale system design process.
International Journal of Human-Computer Studies, 58, 177–206.
Bolia, R.S., 2005. Intelligent decision support systems in network-centric military operations. In:
Intelligent decisions? Intelligent support? Pre-proceedings for the International Workshop on
Intelligent Decision Support Systems: Retrospect and prospects, 3–7.
Bruseberg, A. and Shepherd, A., 1997. Job design in integrated mail processing. In: D. Harris, ed.
Engineering psychology and cognitive ergonomics. Volume two: Job design and product design.
Aldershot, Hampshire: Ashgate Publishing, 25–32.
Burke, S.C., 2004. Team task analysis. In: N.A. Stanton, A. Hedge, K. Brookhuis, E. Salas and
H. Hendrick, eds. Handbook of human factors and ergonomics methods. Boca Raton, FL: CRC
Press, 56.1–56.8.
Card, S.K., Moran, T.P., and Newell, A., 1983. The psychology of human computer interaction.
Hillsdale, New Jersey: Lawrence Erlbaum Associates.
Crone, D.J., Sanderson, P.M., and Naikar, N., 2003. Using cognitive work analysis to develop
a capability for the evaluation of future systems. Proceedings of the 47th Annual
Meeting Human Factors and Ergonomics Society, Denver, CO. Santa Monica, CA: HFES,
1938–1942.
Crone, D., et al., 2007. Selecting sensitive measures of performance in complex multivariable
environments. Proceedings of the 2007 Simulation Technology Conference (SimTecT 2007),
Brisbane, Australia, 4–7 June 2007.
Embrey, D.E., 1986. SHERPA: A systematic human error reduction and prediction approach. Paper
presented at the International Meeting on Advances in Nuclear Power Systems, 1986, Knoxville,
Tennessee, USA.
Fidel, R. and Pejtersen, A.M., 2005. Cognitive work analysis. In: K.E. Fisher, S. Erdelez and
E.F. McKenzie, eds. Theories of information behaviour: A researchers guide. Medford, NJ:
Information Today.
Hart, S.G. and Staveland, L.E., 1988. Development of a multi-dimensional workload rating scale:
Results of empirical and theoretical research. In: P.A. Hancock and N. Meshkati, eds. Human
mental workload. Amsterdam, The Netherlands: Elsevier.
Hajdukiewicz, J.R., 1998. Development of a structured approach for patient monitoring in the
operating room. Thesis (Masters). University of Toronto.
Hajdukiewicz, J.R. and Vicente, K.J., 2004. A theoretical note on the relationship between
work domain analysis and task analysis. Theoretical Issues in Ergonomics Science, 5 (6),
527–538.
Higgins, P.G., 1998. Extending cognitive work analysis to manufacturing scheduling. In: P. Calder
and B. Thomas, eds., Proceedings of the 1998 Australian Computer Human Interaction
Conference, OzCHI’98, November 30–December 4, Adelaide, IEEE, 236–243.
Hodgkinson, G.P. and Crawshaw, C.M., 1985. Hierarchical task analysis for ergonomics research.
An application of the method to the design and evaluation of sound mixing consoles. Applied
Ergonomics, 16 (4), 289–299.
Jamieson, G.A. and Vicente, K.J., 2001. Ecological interface design for petrochemical applications:
Supporting operator adaptation, continuous learning and distributed, collaborative work.
Computers and Chemical Engineering, 25, 1055–1074.
Jansson, A., Olsson, E., and Erlandsson, M., 2006. Bridging the gap between analysis and design:
improving existing driver interfaces with tools from the framework of cognitive work analysis.
Cognition, Technology & Work, 8 (1), 41–49.
Jenkins, D.P., et al., 2008a. Cognitive work analysis: coping with complexity. Aldershot, UK:
Ashgate.
Jenkins, D.P., et al., 2008b. Applying cognitive work analysis to the design of rapidly reconfigurable
interfaces in complex networks. Theoretical Issues in Ergonomics Science, 9 (4), 273–295.
Jenkins, D.P., et al., 2008c. Using cognitive work analysis to explore activity allocation within
military domains. Ergonomics, 51 (6), 798–815.
Jenkins, D.P., et al., in press. Using the decision-ladder to add a formative element to naturalistic
decision-making research. International Journal of Human Computer Interaction.
Kirwan, B. and Ainsworth, L.K., 1992. A guide to task analysis. London, UK: Taylor & Francis.
Kilgore, R.M., St-Cyr, O., and Jamieson, G.A., 2009. From work domains to worker competencies:
A five-phase CWA for air traffic control. In: A. Bisantz and C. Burns, eds. Applications of
cognitive work analysis. Boca Raton, FL: Taylor and Francis Group, LLC, 15–47.
Lane, R., Stanton, N.A., and Harrison, D., 2007. Applying hierarchical task analysis to medication
administration errors. Applied Ergonomics, 37 (5), 669–679.
Lintern, G. and Naikar, N., 2000. The use of work domain analysis for the design of training teams.
Proceedings of the joint 14th triennial congress of the International Ergonomics Association/44th
Annual Meeting of the Human Factors and Ergonomics Society (HFES/IEA 2000), San Diego,
CA. Santa Monica, CA: Human Factors and Ergonomics Society, 198–201.
Lintern, G., 2009. The theoretical foundation of cognitive work analysis. In: A. Bisantz and
C. Burns, eds. Applications of cognitive work analysis. Boca Raton, FL: Taylor and Francis
Group, LLC, 321–355.
Marsden, P. and Kirby, M., 2004. Allocation of functions. In: N.A. Stanton, A. Hedge,
K. Brookhuis, E. Salas and H. Hendrick, eds. Handbook of human factors and ergonomics
methods. Boca Raton, FL: CRC Press.
Mazaeva, N. and Bisantz, A.M., 2007. On the representation of automation using a work domain
analysis. Theoretical Issues in Ergonomics Science, 8 (6), 509–530.
Memisevic, R., et al., 2005. Work domain analysis and ecological interface design for hydropower
system monitoring and control. In: Proceedings of the IEEE Conference on Systems, Man, and
Cybernetics (IEEE-SMC2005), Hawaii, USA, 3580–3587.
Miller, G.A., Galanter, E., and Pribram, K.H., 1960. Plans and the structure of behaviour. New York:
Holt.
Miller, C.A. and Vicente, K.J., 2001. Comparison of display requirements generated via Hierarchical
Task and Abstraction-Decomposition Space Analysis Techniques. International Journal of
Cognitive Ergonomics, 5 (3), 335–355.
Ministry of Defence, 2005. Joint service publication 777 – Network enabled capability. Version 1,
Edition 1 [online]. Available from: http://www.mod.uk/DefenceInternet/AboutDefence/
CorporatePublications/ScienceandTechnologyPublications/NEC/ [Accessed 18 July 2007].
Naikar, N., Hopcroft, R., and Moylan, A., 2005. Work domain analysis: theoretical concepts and
methodology. Defence Science & Technology Organisation Report, DSTO-TR-1665.
Fishermans Bend, Australia: Air Operations Division.
Naikar, N., Moylan, A., and Pearce, B., 2006. Analysing activity in complex systems with cognitive
work analysis: Concepts, guidelines, and case study for control task analysis. Theoretical
Issues in Ergonomics Science, 7 (4), 371–394.
Naikar, N. and Sanderson, P.M., 1999. Work domain analysis for training-system definition.
International Journal of Aviation Psychology, 9, 271–290.
Naikar, N. and Sanderson, P.M., 2001. Evaluating design proposals for complex systems with work
domain analysis. Human Factors, 43, 529–542.
Naikar, N. and Saunders, A., 2003. Crossing the boundaries of safe operation: A technical training
approach to error management. Cognition Technology and Work, 5, 171–180.
Naikar, N., et al., 2003. Technique for designing teams for first-of-a-kind complex systems with
cognitive work analysis: Case study. Human Factors, 45 (2), 202–217.
Naikar, N., 2009. Beyond the design of ecological interfaces: Applications of work domain analysis
and control task analysis to the evaluation of design proposals, team design and training.
In: A. Bisantz and C. Burns, eds. Applications of cognitive work analysis. Boca Raton, FL:
Taylor and Francis Group, LLC, 69–94.
Olsson, G. and Lee, P.L., 1994. Effective interfaces for process operators. The Journal of Process
Control, 4, 99–107.
Piso, E., 1981. Task analysis for process-control tasks: The method of Annett et al. applied.
Journal of Occupational Psychology, 54, 247–254.
Rasmussen, J., 1986. Information processing and human-machine interaction: an approach to cognitive
engineering [online]. New York: North-Holland. Available from: http://www.ischool.
washington.edu/chii/portal/literature.html [Accessed 14 August 2008].
Rasmussen, J., Pejtersen, A.M., and Goodstein, L.P., 1994. Cognitive systems engineering.
New York: Wiley.
Reid, G.B. and Nygren, T.E., 1988. The subjective workload assessment technique: A scaling
procedure for measuring mental workload. In: P.S. Hancock and N. Meshkati, eds. Human
mental workload. Amsterdam, The Netherlands: Elsevier.
Salmon, P.M., et al., 2007. Work domain analysis and road transport: Implications for vehicle
design. International Journal of Vehicle Design, 45 (3), 426–448.
Salmon, P.M., et al., 2009. Analysing the analysis in task analysis. Unpublished manuscript.
Sanderson, P.M., 2003. Cognitive work analysis across the system lifecycle: Achievements,
challenges, and prospects in aviation, In: P. Pfister and G. Edkins, eds., Aviation Resource
Management, 3, Aldershot: Ashgate.
Shepherd, A., 2001. Hierarchical Task Analysis. London: Taylor & Francis.
Stammers, R.B. and Astley, J.A., 1987. Hierarchical task analysis: Twenty years on. In: Megaw,
E.D., ed., Contemporary Ergonomics 1987. Taylor & Francis, London, 135–139.
Stanton, N.A., 2006. Hierarchical task analysis: Developments, applications, and extensions. Applied
Ergonomics, 37, 55–79.
Stanton, N.A., et al., 2005. Human factors methods: A practical guide for engineering and design.
Aldershot, UK: Ashgate.
Tsang, P.S. and Velazquez, V.L., 1996. Diagnosticity and multidimensional subjective workload
ratings. Ergonomics, 39, 358–381.
Vicente, K.J., 1999a. Cognitive work analysis: Toward safe, productive, and healthy computer-based
work. Mahwah, NJ: Lawrence Erlbaum Associates.
Vicente, K.J., 1999b. Wanted: psychologically relevant, device and event independent work analysis
techniques. Interacting with Computers, 11, 237–254.
Walker, G.H., et al., 2006. Event analysis of systemic teamwork (EAST): a novel integration of
ergonomics methods to analyse C4i activity. Ergonomics, 49, 1345–1369.
Watson, M.O. and Sanderson, P.M., 2007. Designing for attention with sound: Challenges and
extensions to ecological interface design. Human Factors, 49 (2), 331–346.
Yu, X., et al., 2002. Toward theory-driven, quantitative performance measurement in ergonomics
science: the abstraction hierarchy as a framework for data analysis. Theoretical Issues in
Ergonomics Science, 3 (2), 124–142.

About the authors


Dr Paul Salmon is a senior research fellow in the Human Factors team at the Monash University Accident
Research Centre.
Dr Dan Jenkins is director of Sociotechnic Solutions Limited, a company specialising in the design and
optimisation of complex sociotechnical systems.

Professor Neville Stanton holds a Chair in Human Factors in the School of Civil Engineering and the
Environment at the University of Southampton.

Dr Guy Walker is a senior lecturer in the school of the Built Environment at Heriot Watt University.