Cognitive Vs Hierarchical TA
To cite this article: Paul Salmon , Daniel Jenkins , Neville Stanton & Guy Walker (2010)
Hierarchical task analysis vs. cognitive work analysis: comparison of theory, methodology and
contribution to system design, Theoretical Issues in Ergonomics Science, 11:6, 504-531, DOI:
10.1080/14639220903165169
Downloaded by [Temple University Libraries] at 17:06 17 November 2014
Theoretical Issues in Ergonomics Science
Vol. 11, No. 6, November–December 2010, 504–531
School of Civil Engineering and the Environment, Highfield, Southampton, SO17 1BJ, UK;
School of the Built Environment, Heriot Watt University, Edinburgh, UK
(Received 15 May 2008; final version received 29 May 2009)
1. Introduction
Out of the abundance of human factors and cognitive engineering methods available,
hierarchical task analysis (HTA; Annett et al. 1971) and cognitive work analysis (CWA;
Vicente 1999a) are arguably the most popular. The former represents the traditional task
analytic approach; the latter represents the more modern system design framework. Both
approaches have distinct theoretical underpinnings and approach the analysis of systems
in quite different ways, yet they are often discussed in the same breath. It is important to
clarify what the theoretical and methodological differences between the two are and, even
more so, to identify if, and how, the two approaches can be used in tandem during
complex system design and evaluation efforts (Hajdukiewicz and Vicente 2004). This
article presents a comparison of the two approaches in terms of their theoretical
underpinning, methodological procedure and contribution to the system design life cycle.
To do this the methods are first discussed in terms of their theoretical underpinning and
methodological nature, following which recent HTA and CWA analyses of a military
rotary wing mission planning system (MPS) software tool are compared and contrasted.
around that time meant that tasks were becoming more cognitive in nature and
approaches that could be used to describe and understand these novel work processes were
subsequently required. Notwithstanding its scientific management origins, HTA was
unique at the time in that, in addition to the physical tasks being performed, it also
attempted to describe the cognitive aspects of goal attainment (something that is often
overlooked by antagonists of the method). Thus, HTA represented a significant departure
from existing approaches of the time since it focused on goals and related cognitive and
physical processes, rather than merely the physical and observable aspects of task
performance.
Stanton (2006) describes the heavy influence of control theory on the HTA
methodology and demonstrates how the test-operate-test-exit unit (central to control
theory) and the notion of hierarchical levels of analysis are similar to HTA representations
(plans and sub-goal hierarchy). HTA itself works by decomposing systems into a hierarchy
of goals, sub-ordinate goals, operations and plans; it focuses on: ‘what an operator . . . is
required to do, in terms of actions and/or cognitive processes to achieve a system goal’
(Kirwan and Ainsworth 1992, p. 1). It is important to note here that an ‘operator’ may be
a human or a technological operator (e.g. system artefacts such as equipment, devices and
interfaces). HTA outputs, therefore, specify the overall goal of a particular system, the
sub-goals to be undertaken to achieve this goal, the operations required to achieve each of
the sub-goals specified and the plans, which are used to ensure that the goals are achieved.
The plans component of HTA is especially important since they specify the sequence, and
under what conditions, different sub-goals have to be achieved in order to satisfy the
requirements of a super-ordinate goal.
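The goal–plan structure just described lends itself to a simple tree representation. The sketch below is our own illustration (it is not part of the HTA method specification, and the kettle-related goals are invented for the example); it prints a numbered sub-goal hierarchy in the style of an HTA outline:

```python
from dataclasses import dataclass, field


@dataclass
class Goal:
    """A node in an HTA sub-goal hierarchy: a goal, an optional plan
    governing its sub-goals, and the sub-goals themselves."""
    name: str
    plan: str = ""  # e.g. "Do 1 then 2; when water boils, do 3"
    subgoals: list["Goal"] = field(default_factory=list)

    def outline(self, number: str = "0") -> list[str]:
        """Flatten the hierarchy into numbered lines, HTA-style."""
        label = f"{number}. {self.name}"
        if self.plan:
            label += f"  [plan: {self.plan}]"
        lines = [label]
        for i, sub in enumerate(self.subgoals, start=1):
            child = f"{i}" if number == "0" else f"{number}.{i}"
            lines.extend(sub.outline(child))
        return lines


# Hypothetical example: the overall goal, its plan and three sub-goals.
hta = Goal(
    "Boil water in kettle",
    plan="Do 1 then 2; when water boils, do 3",
    subgoals=[Goal("Fill kettle"), Goal("Switch kettle on"), Goal("Switch kettle off")],
)
print("\n".join(hta.outline()))
```

In a fuller sketch, each bottom-level goal would also carry its operations, and plans would be attached at every level of the hierarchy, mirroring the super-ordinate/sub-ordinate structure described above.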
The HTA process is straightforward, involving collecting data about the task or system under
analysis (through techniques such as observation, questionnaires, interviews with subject
matter experts (SMEs), walkthroughs, user trials and documentation review to name but a
few) and then using these data to decompose and describe the goals and sub-goals
involved. The HTA procedure is presented in Figure 1.
Figure 1. The hierarchical task analysis procedure: the analyst states the overall goal and its plan, checks each redescription, determines whether further redescription is required and terminates the redescription of an operation once no further operations remain.

Despite the vast range of human factors and ergonomics methods available, the
popularity of HTA is unparalleled: it is the most widely used, not just of task analysis
methods, but of all human factors and ergonomics methods (Kirwan and Ainsworth 1992,
Annett 2004, Stanton et al. 2005). HTA has now been applied for over 40 years in all
manner of domains and its heavy use shows no signs of abating, certainly not within
human factors and ergonomics circles. Although the process of constructing an HTA is
enlightening in itself (i.e. the analysts' understanding of the task under analysis increases
significantly), HTA's popularity is due largely to the flexibility and utility of its output.
In addition to the exhaustive goal-based
description provided, HTA outputs can be used to inform various additional human
factors analyses and many other human factors methods require an initial HTA as part of
their data input (Stanton et al. 2005). This flexibility has allowed HTA to be applied for
a wide range of system design and evaluation purposes, including interface design and
evaluation (e.g. Hodgkinson and Crawshaw 1985, Stammers and Astley 1987, Shepherd
2001), job design (Bruseberg and Shepherd 1997), training programme design and
evaluation (e.g. Piso 1981), human error prediction (e.g. Lane et al. 2007) and analysis (e.g.
Adams and David 2007), team task analysis (Annett 2004, Walker et al. 2006), allocation
of functions analysis (e.g. Marsden and Kirby 2004), workload assessment (Kirwan and
Ainsworth 1992) and procedure design (Stanton 2006).
CWA was originally developed at the Risø National Laboratory in Denmark (Rasmussen 1986) for use within nuclear power
process control applications. Underlying the approach was a specific need to design for
new or unexpected situations; in a study of industrial accidents and incidents, Risø
researchers found that most accidents began with non-routine operations. CWA’s
theoretical roots lie in general and adaptive control system theory and also Gibson’s
ecological psychology theory (Fidel and Peijtersen 2005). The approach itself is concerned
with constraints rather than goals, which is based on the notion that making constraints
explicit in an interface can potentially enhance human performance (Hajdukiewicz and
Vicente 2004).
The CWA framework comprises five phases, each modelling different constraint sets:
work domain analysis (WDA); control task analysis (CTA); strategies analysis; social
organisation and cooperation analysis (SOCA); and worker competencies analysis. A brief
description of each phase is provided below. The CWA phases, methods and outputs are
presented in Figure 2.
(1) The WDA phase involves modelling the system in question based on its purposes
and the constraints imposed by the environment. The abstraction hierarchy (AH)
and abstraction decomposition space (ADS) approaches are used for this purpose.
WDA identifies a fundamental set of constraints that are imposed on the actions of
any actor (Vicente 1999a). In modelling a system in this way, the systemic
constraints that shape activity are specified. This formative approach leads to an
event, actor and time independent description of the system (Vicente 1999a,
Sanderson 2003).

Figure 2. Cognitive work analysis phases along with their associated methods and outputs.
(2) The second phase, CTA, is used to identify the tasks that are undertaken within the
system and the constraints imposed on these activities during different situations.
CTA focuses on the activity necessary to achieve the purposes, priorities and values
and functions of a work domain (Naikar et al. 2006). Rasmussen’s decision ladder
(Rasmussen 1976; cited in Vicente 1999a) and Naikar et al.’s (2006) contextual
activity template are used for the CTA phase.
(3) The strategies analysis phase is used to identify how the different activities can be
achieved. Vicente (1999a) points out that whereas the CTA phase provides a
Its current popularity is such that it has been applied in various complex domains for a
number of different purposes, including system modelling (e.g. Hajdukiewicz 1998), system
design (e.g. Bisantz et al. 2003), automation (Mazaeva and Bisantz 2007), training needs
analysis (e.g. Naikar and Sanderson 1999), training programme evaluation and design (e.g.
Naikar and Sanderson 1999), interface design and evaluation (Vicente 1999a), information
requirements specification (e.g. Ahlstrom 2005), tender evaluation (Lintern and Naikar
2000, Naikar and Sanderson 2001), team design (Naikar et al. 2003), allocation of
functions (e.g. Jenkins et al. 2008c), the development of human performance measures (e.g.
Yu et al. 2002, Crone et al. 2003, 2007) and error management strategy design (Naikar and
Saunders 2003). These applications have taken place in a variety of complex safety critical
domains, including air traffic control (e.g. Ahlstrom 2005), aviation (e.g. Naikar and
Sanderson 2001), health care (e.g. Watson and Sanderson 2007), hydropower (e.g.
Memisevic et al. 2005), nuclear power (e.g. Olsson and Lee 1994), naval (e.g. Bisantz et al.
2003), manufacturing (e.g. Higgins 1998), military command and control (e.g. Jenkins
et al. 2008a,b), petrochemical (e.g. Jamieson and Vicente 2001), process control (e.g.
Vicente 1999a), rail (e.g. Jansson et al. 2006) and road transport (e.g. Salmon et al. 2007).
4. Theoretical comparison
From a theoretical point of view, it would appear that there are two fundamental
differences between the two methods. First, HTA focuses on system goals whereas CWA
focuses on the constraints present within a system and, second, HTA fits somewhere
between being descriptive and normative in nature, since it describes how goals actually
are, or should be, achieved, whereas CWA is formative in nature and so describes how
functions and purposes could potentially be achieved.
The goals vs. purposes and constraints distinction is a logical starting point of any
comparison between the two; however, this distinction is not as clear cut as it first appears.
HTA focuses on the hierarchical subdivision of goals, whereas CWA focuses on the
purposes and constraints present within the domain in which these goals are pursued. The
notion that CWA does not focus on goals, however, is contentious. For example, it might
be argued by some that CWA, through the CTA phase, focuses on the goals linked to
‘known recurring activities’; however, this is a moot point. Reviewing the literature on
decision ladder applications, it is clear that some applications focus on goals (e.g.
Rasmussen et al. 1994, Naikar 2009, Jenkins et al. in press), whereas some do not
(Rasmussen 1986, Vicente 1999a, Kilgore et al. 2009, Lintern 2009). Certainly, the decision
ladder method has a goals element to it; however, since CWA is a framework rather than a
rigid methodology, the underlying theory is more important in this sense. It could be
argued that CWA does not analyse goals in the true semantic sense, rather it considers the
influence of goals on decision making and these goals may or may not be considered
dependent on the situation and environmental constraints. The key difference seems to be
that HTA begins with the system goal and uses this as the basis for analysis; whereas CWA
(or more correctly, ‘activity analysis in work domain terms’) considers them but,
importantly, does not use them as the driving force for the analysis. It is also worth
pointing out that if the CWA approach did truly analyse goals, then it would be moving
from formative description to normative description, since goals can only be specified for
situations that are known or can be anticipated. If this were the case, then only the WDA
component of the framework could be considered to be formative.
Moving back to the discussion on goals vs. constraints and purposes, Vicente (1999a)
offers the following definitions for these terms:
. Constraints – Relationships between, or limits on, behaviour. Constraints remove
degrees of freedom (Vicente 1999a, p. 6).
. Purpose – The overarching intentions that a work domain was designed to
achieve. Note that purposes are properties of work domains, not actors, and that
they are relatively permanent (Vicente 1999a, p. 9).
. Goal – A state to be achieved, or maintained, by an actor at a particular time.
Note that goals are attributes of actors, not work domains, and that they are
dynamic (Vicente 1999a, p. 7).
In a simplistic manner, operators perform activities to achieve a goal of some sort and
the successful achievement of goals allows a system to enact its purpose. According to
Vicente (1999a), the difference between a goal and a purpose relates to ownership and
dynamicity. Goals, according to Vicente, are dynamic and are held by actors, whereas
Annett and colleagues originally defined a goal as: ‘the objective of the system in some real
terms of production units, quality or other criteria’ (Annett et al. 1971, p. 4). Annett and
Stanton (2000) refer to goals as those things that a person is seeking to achieve. Purposes,
on the other hand, are typically permanent and are a property of the work domain rather
than the actors undertaking tasks in the work domain. Constraints reflect those conditions
in the work domain that affect the activities being performed. This distinction represents
the first significant difference between the two approaches. HTA is concerned with states
that are to be achieved and how they are achieved, whereas CWA is concerned with the
purposes of a system and the limiting factors imposed on purpose-driven behaviour within
that system. When analysing a kettle, for example, one can take the ubiquitous ‘make cup
of tea’ example that is so frequently used to demonstrate HTA. The goal here is normally
expressed as ‘make cup of tea’. The purpose of the kettle ‘system’, however, would be
expressed as ‘provide boiling water’. The goal of ‘make cup of tea’ is dynamic and linked
to an actor at a specific time; the same actor’s goal, or another actor’s goal, on another
occasion may be very different. The goal is therefore dynamic and will change periodically,
for example, think ‘make cup of tea’ vs. ‘make cup of hot chocolate’ vs. ‘boil water to wash
car’. The kettle’s purpose of ‘provide boiling water’, however, will never change, regardless
of the situation or actor involved.
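The goal/purpose distinction can be made concrete by sketching the kettle 'system' as a small abstraction hierarchy, with its fixed purpose at the top and physical objects at the bottom. This is purely an illustrative sketch: the level names follow one common labelling of Rasmussen's abstraction hierarchy, and the kettle entries are our own assumptions, not taken from any published analysis.

```python
# A minimal abstraction hierarchy (AH) for the kettle 'system', stored as
# ordered (level, nodes) pairs from functional purpose down to physical form.
abstraction_hierarchy = [
    ("Functional purpose", ["Provide boiling water"]),
    ("Values & priority measures", ["Time to boil", "Energy used", "Safety"]),
    ("Purpose-related functions", ["Heat water", "Contain water"]),
    ("Object-related processes", ["Electrical heating", "Thermal cut-off"]),
    ("Physical objects", ["Heating element", "Vessel", "Switch"]),
]

for level, nodes in abstraction_hierarchy:
    print(f"{level}: {', '.join(nodes)}")
```

Note that the actor's goal ('make tea', 'wash the car') appears nowhere in the hierarchy: the purpose 'provide boiling water' is a fixed property of the work domain, whereas goals belong to actors and change from occasion to occasion.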
The second fundamental difference between the two, the focus on how activity actually
does unfold vs. how it can potentially unfold is also significant. HTA lies somewhere
between being a descriptive and normative approach to work analysis (Jenkins et al. 2008),
whereas CWA instead formatively describes how work can unfold based on a multi-
perspective view of the constraints that can possibly impact it. HTA focuses on how goals
should be (or are) achieved, whereas CWA focuses on how activity could be undertaken.
Vicente argues that CWA is distinct from approaches such as HTA in that, rather than
prescribe how work should be done or describe how it is currently being done, it seeks to
identify how work could be done if the appropriate tools were made available (Vicente
1999a, p. 340). This has led many, such as Naikar et al. (2005), to claim that CWA can
identify the information or knowledge that workers need for dealing with a wide variety of
situations, including novel or unanticipated events, whereas HTA can only specify the
information or knowledge that workers need for dealing with routine or anticipated
situations. Annett (2004), however, would counter this argument by stating that HTA
seeks to represent system goals and plans rather than focusing solely on observable aspects
of performance. The normative vs. formative output provided by the two is the main focus
of those postulating an advantage of one method over the other, since many argue that
CWA can deal with unanticipated, non-routine events, whereas HTA cannot (e.g. Vicente
1999b, Miller and Vicente 2001, Hajdukiewicz and Vicente 2004). This is also where the
notion that CWA can be used to design ‘first-of-a-kind’ systems comes from; since it is a
formative approach that focuses on the purposes of a system, it can be used to analyse
systems that do not yet exist. It is notable, however, that applications in which CWA has
been applied to the design of a novel, first of a kind system are scarce.
5. Methodological comparison
Methodologically, the two approaches also differ. HTA has a well-defined, step-by-step
methodology (e.g. Annett 2004, Stanton 2006); whereas CWA, more a framework than a
methodology, does not restrict analysts to specific methodologies for each phase, let alone
a rigid procedure. Of the various methods available for each of the five phases, those used
for the WDA (e.g. AH and ADS) and the CTA (e.g. decision ladder and contextual
activity template) phases have the most methodological guidance associated with them
(e.g. Naikar et al. 2005, 2006, Jenkins et al. 2008a) and it is notable that there is a general
lack of guidance for the latter phases (e.g. strategies analysis, SOCA and worker
competencies analysis). The two approaches are also similar in that the level of granularity
pursued is largely down to the analyst(s) involved and the purposes of the analysis itself.
For example, both can be used in a rapid fashion for high level descriptions of systems,
whereas should the analysis requirements dictate, both can go to an extremely fine level
of detail.
There are also similarities in the methodological procedures followed when conducting
analyses with both approaches. Both involve the use of a range of sources for collecting the
data required, including observation, questionnaires, interviews with SMEs, walkthroughs,
user trials and documentation review. This is useful since it often allows both
methods to be applied to the same data, which saves considerable time while still
producing comprehensive outputs. When applying each method, both require a high degree of
reiteration, often with appropriate SMEs, before the analysis is completed to a satisfactory
level. A first pass with either approach will rarely, if ever, produce the final output. Both also
provide highly visual outputs and often require a drawing software package in order to
create the outputs developed.
There are similarities too in the flaws associated with both approaches. Both are
criticised for the high level of resource usage incurred when applied to complex tasks or
systems (although many, including these authors, would argue that the utility of the
outputs generated from both methods far outweighs the cost associated with constructing
them) and concerns are often raised regarding the reliability and validity of both methods. In
response to the high resource usage problem, both approaches have been subject to the
development of software tool support (e.g. Jenkins et al. 2008a, Salmon et al. 2009).
Probably the most important methodological difference between the two approaches
lies in the methodological extensions applied. As referred to above, a range of methods can
be applied to HTA outputs; once the sub-goal hierarchy is complete, there are a number of
additional human factors methods that can be applied in order to extend the analysis
outputs further. For example, a recent HTA software tool developed by the authors
(Salmon et al. 2009) includes the following add-on methods: allocation of functions
analysis (Marsden and Kirby 2004); context and constraints analysis (Shepherd 2001);
DIF analysis, human–computer interaction analysis, interface design analysis
extensions described are solely design-based (e.g. error prediction and workload
assessment can be undertaken on operational systems).
work domain structure and that moving from this to a relevant work domain
structure, which shows all action possibilities for a particular category of events,
represents the first step away from WDA and toward task analysis. Following this, the
transformation from a relevant work domain structure to a utilised work domain structure
depicting the subset of utilised action possibilities at a particular point in time takes a
further step away from WDA and toward task analysis. Next, the transformation from a
utilised work domain structure to a desired work domain state represents the final step
toward task analysis, which maps current states onto desired states via a set of human or
automated actions (Hajdukiewicz and Vicente 2004). They describe WDA as event and
time independent, which supports worker adaptation to novelty and change, and task
analysis as event and time dependent, which supports worker performance in anticipated
situations.
It is notable that the comparisons discussed are presented, in the main, by proponents
of the CWA approach (e.g. Vicente, Hajdukiewicz); indeed, it could be argued that the
comparisons discussed are slightly one sided in the favour of CWA. Both Miller and
Vicente (2001) and Hajdukiewicz and Vicente (2004) seem to take a constrained view of
HTA and its capabilities and tend to ignore the various extensions that can be added. For
example, in a retort to Miller and Vicente (2001), Stanton (2006) argued that some of the
shortcomings identified by Miller and Vicente (2001) could have been removed had they
considered the various extensions of HTA (e.g. interface design). Stanton (2006) also
argued that Miller and Vicente’s (2001) analysis indicated that they may have been using
HTA in a constrained manner (i.e. focusing on human agents and actions as opposed to
the whole system).
It is also interesting to note that the comparisons discussed do not consider the latter
phases of the CWA framework; rather they compare only the first phase, WDA, with
HTA. This is due partly to the fact that there has been only limited application of the latter
phases of CWA (published in the open literature at least) and that there is only limited
guidance on how to conduct the latter phases. In recent times, however, applications of the
latter phases (e.g. Naikar et al. 2006, Jenkins et al. 2008a) have been described and also
guidance on how to undertake these analyses has emerged (e.g. Naikar et al. 2006, Jenkins
et al. 2008a).
Underpinning NEC (network enabled capability) then is the use of digitised warfare systems, which permit
connectivity between multiple actors and the rapid dissemination of data between them.
Accordingly, there has been a recent spate of digitised mission support systems being
developed, tested and even introduced in theatre. The analysis in this case focuses on a
digitised MPS that, at the time of analysis, was used to support planning for military
helicopter missions.
Mission planning is an essential part of flying a military aircraft. Whilst in the air,
pilots are required to perform, in parallel, cognitively intense activities, including time
keeping, hazard perception and off-board communication. These activities are all
conducted whilst attending to the task of navigating through a 3-D airspace. Pilots are
required to constantly evaluate the effects that their actions have on others within the
domain. Decisions need to be made that consider any number of both military and non-
Downloaded by [Temple University Libraries] at 17:06 17 November 2014
military services, organisations and civilian groups. Calculations need to be made based
upon a number of physical considerations. These include environmental constraints,
aircraft performance and payloads. Pilots also need to balance mission objectives with
rules of engagement and high order strategic objectives. Pre-flight planning is one essential
method used to alleviate some of the pilot’s airborne workload. This planning process,
which was formerly conducted on paper maps, is now supported by the MPS tool focused
on in this article.
The MPS tool analysed provides and processes digital information on battlefield data,
threat assessment, intervisibility, engagement zones, communication details, transponder
information and identification friend or foe settings. In short, the MPS is used to plan and
assess single and multiple aircraft sortie missions. When using the MPS system, mission
plans are generated prior to take off on PC-based MPS terminals. Key information
developed in the software tool is transferred to the aircraft via a digital storage device
called a ‘data transfer cartridge’ (DTC). Information is presented on the aircraft’s onboard
flight display. This multi-function display can be used by the pilot to assist in navigation
and target identification. The HFI-DTC consortium were invited by the system’s creator
to undertake a human factors analysis of the MPS in order to generate system redesign
recommendations.
8. Methodology
8.1. Participants
Two human factors researchers from the Ergonomics Research Group at Brunel
University, each with significant experience in the application of both HTA and CWA
in complex systems analyses, took part in this study. For the purposes of data collection
and validation of their analyses, the analysts were given access to four participants who
were domain and mission planning SMEs. The SMEs were all qualified pilots with
significant experience in planning missions using the old paper map system and the new
MPS. One SME was a UK flight instructor and trainer for the MPS and the other three were
serving airmen who used the MPS regularly.
8.2. Materials
The materials used included a laptop computer containing the MPS software tool,
HTA and CWA software tools (developed by the HFI-DTC, see www.hfidtc.com
for details on the tools and how to obtain free copies) and video and audio
recording equipment. A number of documents relating to the mission planning procedure
were also used (e.g. training manuals, standard operating instructions, navigation cards,
etc). Paper maps, acetates and drawing equipment were also used to demonstrate the
existing paper map planning process.
8.3. Procedure
An initial 2-day meeting was held in order to introduce the analysts involved to the mission
planning process and to familiarise them with the MPS software tool. Following this, the
data collection procedure involved the conduct of a number of interviews with the four
SMEs and also SME walkthroughs of MPS planning tasks. The interviews and
walkthroughs were recorded using audio and video recording equipment. In total, three
interview/walkthrough meetings were held, each lasting approximately 5 hours.
The data collected were used to inform two separate analyses: one involved using
the CWA framework to analyse the MPS tool; the other involved using the HTA
methodology to analyse the MPS tool. For the CWA analysis, WDA, CTA, SOCA
and example strategies analyses were developed for a three-phase mission-planning
scenario. For the HTA, four HTAs were constructed: one for a three-phase mission
planning scenario; one for an Afghanistan operational scenario; one generic HTA for
the MPS software tool; one for the traditional paper map (pre-MPS) planning process.
This article focuses on the three-phase mission analyses. In short, the three-phase
mission scenario involved one SME generating a workable plan for a three-phase
mission. A three-phase mission typically involves ingress (travelling from holding point
to target area), delivery of effect (i.e. destroy target) and egress (travelling out of target
area to holding point). Planning for this mission included analysing the battlefield area
(e.g. in terms of hazards, cover and concealment), planning ingress and egress routes,
identifying suitable holding point and battle positions (e.g. suitable position to destroy
target from), calculating intervisibility for the target and calculating and configuring
fuel and payload required. The three-phase mission planning HTA was then used to
inform the conduct of a SHERPA (Embrey 1986) human error identification analysis
and a task decomposition (Kirwan and Ainsworth 1992) analysis. Each of the outputs
derived from both analyses was reviewed by the MPS SMEs and subsequently refined
based on their feedback.
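The SHERPA step described above can be read as a simple loop over the bottom-level task steps of the HTA: classify each step's behaviour type, then record each credible error mode from that category. The sketch below is a minimal illustration only; the taxonomy labels are paraphrased from Embrey's (1986) behaviour categories, and the task step shown is quoted from the extract in Table 1 rather than reproducing the full MPS analysis.

```python
# Abridged, paraphrased error-mode taxonomy keyed by behaviour type.
# (Illustrative subset only; SHERPA defines more categories and modes.)
SHERPA_CATEGORIES = {
    "action": ["A4 Operation too little/too much", "A8 Operation omitted"],
    "checking": ["C1 Check omitted", "C2 Check incomplete"],
    "retrieval": ["R2 Wrong information obtained"],
}


def sherpa(task_steps):
    """Return one analysis row per credible error mode per task step.

    In a real analysis each row would also record the description,
    consequences, recovery step, probability/criticality ratings and
    remedial measures, as in Table 1.
    """
    rows = []
    for step, behaviour in task_steps:
        for mode in SHERPA_CATEGORIES.get(behaviour, []):
            rows.append({"task_step": step, "error_mode": mode})
    return rows


steps = [("4.2.3.1.1.1 Check current system mode", "checking")]
for row in sherpa(steps):
    print(row["task_step"], "->", row["error_mode"])
```

The loop makes the dependency explicit: SHERPA consumes the bottom level of a completed HTA, which is why a button-press level of decomposition was required in this case study.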
9. Results
An extract of the three-phase mission planning scenario HTA is presented in Figure 3.
Within Figure 3, only the high level goal and sub-goals are displayed (each sub-goal was
decomposed to button or key press level), along with an example decomposition. An
extract of the SHERPA human error analysis is presented in Table 1 and an extract of the
task decomposition analysis is presented in Table 2.
The AH developed for the MPS is presented in Figure 4; this was subsequently used to
inform the development of the CTA, SOCA and strategies analyses. Extracts from these
three analyses are presented in Figure 5.
Figure 3. Three-phase mission hierarchical task analysis extract; the figure shows the overall goal ('0: Plan three-phase mission using MPS') and the example decomposition for sub-goal 4, 'Plan positions & routes'. MPS = mission planning system; SA = situation awareness; DTC = data transfer cartridge.
10. Discussion
The purpose of this article was to compare two popular contemporary human factors
approaches, HTA, a traditional task analytical methodology, and CWA, a contemporary
design framework. First, the outputs of the two are compared and contrasted and then the
potential for applying both approaches in a complementary fashion is discussed.
In the case study presented, the two methods produced entirely different outputs. It is in
the application of the methods where perhaps the only real similarities between the two
exist. The data collection process for both methods involved observation of process,
interviews with SMEs and reviews of relevant documentation, such as standard operating
procedures. In this case, both incurred similar data collection times and levels of SME
input; however, it is notable that, despite offering more outputs, the CWA approach was
the quickest to use, with an application time of approximately half of that of the HTA in
this case. This was ostensibly down to the level of granularity pursued in the HTA analysis
(for human error and task decomposition analysis purposes a button press level of
granularity was required).
HTA provided a goal-oriented description of the three-phase mission planning process
when using the MPS software tool, including a detailed description of goal attainment
throughout the planning process. Additionally, the plans component described the
contextual conditions that dictate the order in which the goals and sub-goals are
completed. The HTA description is particularly useful since it details exactly what goals
and sub-goals need to be achieved and how these goals and sub-goals are achieved, in
order to plan a three-phase mission using the MPS system. The process of constructing the
HTA itself enabled the analyst to develop a deep understanding of the domain and task
Table 1. Systematic human error reduction and prediction approach (SHERPA) mission planning system extract.

| Task Step | Error Mode | Description | Consequences | Recovery | P | C | Remedial Measures |
|---|---|---|---|---|---|---|---|
| 4.2.3.1.1.1. Check current system mode | C1 | User fails to check current system mode | System may be in inappropriate mode for desired operation (i.e. user might drop a CM rather than zoom in to map) | Immediate/4.2.3.1.1.1 | H | L | Current Mode Display; requirement to select desired function prior to data input; system reverts to Standard Mode every time |
| 4.2.3.1.2.1.1. Check Friendly & Enemy Lines of Sight on Intervisibility | R2 | User misreads line of sight on intervisibility display | User misunderstands line of sight and line of sight may be inadequate or threatening | None | L | H | System auto-generates Line of Sight Rating & Recommendations |
| 4.2.3.1.2.1.2. Check Back Drop on Intervisibility | R2 | User misreads Back Drop on intervisibility display | User misunderstands Back Drop and line of sight may be inadequate or threatening | None | L | H | System auto-generates Back Drop Rating & Recommendations |
| 4.2.3.2.1. Compute Radar Range | A7 | User computes radar range incorrectly | Radar range is calculated incorrectly and the wrong radar range data is presented to the user | None | L | H | |
| 4.2.3.2.2. Check Radar Ranges | C1 | User fails to check radar ranges | Radar range is not checked | 4.2.3.2.2 | L | H | System prompt to check radar ranges post intervisibility function; system warning if route/positions are in radar range; system auto-checks radar ranges |
| 4.2.3.3.3. Draw line from Target to Battle Position using Offset Tool | A5 | Draw Offset line from/to incorrect point on the map | Offset range and bearing data is inappropriate | Immediate | M | M | System auto-connects offset line to relevant features on the map; mouse-over function; system reverts to Standard Mode every time |
| 4.3.2.3.6.1. Select CM function on the toolbar | A7 | User selects overlay drop down menu by mistake | CM function is not selected | 4.1.3.2 | L | L | Current Mode Display; requirement to select desired function prior to data input; system reverts to Standard Mode every time |
| 4.3.2.3.6.1. Enter waypoint name | A7 | User enters the wrong waypoint name | Wrong waypoint name is entered | Immediate | L | L | N/A |
| 4.3.2.3.6.2. Click on map to drop waypoint | A5 | User clicks on the wrong area on the map | Waypoint is dropped in the wrong area on the map | Immediate | M | L | N/A |

Note: P = probability; C = criticality.
Table 2. Task decomposition analysis mission planning system extract.

| Task Step | Description | Type of Activity | Initiating Cue/Event | Information | Location | Controls & Displays Used | Actions Required | Decisions Required | Cx | D | Cr | Output | Feedback | Comments |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4.2.1. | Identify Suitable Battle Positions | Check; Decision; Action | Beginning of Battle Position selection planning process | Map/Area Info; Terrain; Target Info; Route Info; Fuel & Load Performance Info; Weapons Info; Enemy Info; Environment and Constraints Info | Map Display | Map Display; zoom in and out controls; Intervisibility; Offset Tool | Check area (targets, routes, hazards, towns etc.); zoom in and out; scope out terrain and positions; set up and check intervisibility; select appropriate Battle Positions | Identification of most suitable Battle Positions | H | H | H | Battle Positions | N/A | Automation of process: intelligent MPS system identifies suitable Battle Positions based on Target and Route Information |
| 4.2.2. | Add Battle Position(s) to map | Action; Check | Selection of suitable Battle Positions | Battle Positions; Map/Area Info; Terrain; Target Info; Route Info; Fuel & Load Performance Info; Weapons Info; Enemy Info; Environment and Constraints Info | Map Display | Mouse; keyboard; Map Display; CM function; Edit Symbol Window; Intervisibility; Offset Tool; Overlays | Select CM icon on the toolbar; click on desired area on map; classify CM as Battle Position and enter Battle Position details (name, colour, quadrant etc.); check Battle Positions | Battle Position details and placement on map | L | L | H | Marked-up Battle Positions on the Map Display | Battle Positions presented on map display | Dynamic Intervisibility System: intervisibility system auto-recalculates upon movement of Battle Position; Intervisibility Comparison Function (MPS) |
| 4.2.3. | Check and Modify Battle Positions | Action; Check | Selection of suitable Battle Positions | Battle Positions; Route Info; Map/Area Info; Terrain; Target Info; Fuel & Load Performance Info; Weapons Info; Enemy Info; Environment and Constraints Info; Area Features; Intervisibility Heights | Map Display | Mouse; keyboard; Map Display; CM function; Edit Symbol Window; Intervisibility; Offset Tool; Overlays | Set up intervisibility; check intervisibility; consider Battle Position aspects e.g. range, weapons, backdrop etc.; modify Battle Position (see 4.2.2); compute radar ranges; check radar ranges; check range using offset tool | Determine the most suitable Battle Position(s) based on target, weapons, intervisibility, backdrop, radar coverage, ingress and egress routes etc. | H | H | H | Selection of most suitable Battle Positions for the mission; positions marked up on the map; intervisibility outputs | N/A | Dynamic Intervisibility System: intervisibility system auto-recalculates upon movement of Battle Position; Intervisibility Comparison Function: MPS system compares different Battle Positions and rates each one |

Note: Cx = complexity; D = difficulty; Cr = criticality.
Figure 5. Extracts from control task, strategies, and social organisation and cooperation analyses.
under analysis and the output produced has utility for both system design and evaluation.
For system design, this HTA output is useful for interface design (by specifying
information requirements), training design (by specifying task requirements and
sequences), procedure design (by specifying task sequences) and allocation of functions
(by specifying which agent does what and when) purposes. For system evaluation, the
output can be used to inform all manner of evaluations, including error prediction and
analysis, interface evaluation, knowledge, skills and attitudes analysis (which can be used
to design and evaluate training programmes) and team task analysis. In this case, the
SHERPA and task decomposition approaches were used to evaluate the MPS in terms of
interface design and potential for design-induced user errors.
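As a purely illustrative aside, the nested goal structure that HTA produces lends itself to a simple tree representation. The Python sketch below (with goal numbering and wording paraphrased rather than taken from the full analysis) shows how the bottom-level operations can be collected from such a hierarchy:

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A node in an HTA goal hierarchy (structure assumed for illustration)."""
    number: str
    name: str
    plan: str = ""  # conditions governing the order of sub-goal execution
    subgoals: list["Goal"] = field(default_factory=list)

def leaf_operations(goal: Goal) -> list[str]:
    """Collect the bottom-level operations (e.g. button or key presses)."""
    if not goal.subgoals:
        return [f"{goal.number} {goal.name}"]
    ops: list[str] = []
    for sg in goal.subgoals:
        ops.extend(leaf_operations(sg))
    return ops

# Illustrative fragment only; numbering and wording are paraphrased.
mission = Goal("0", "Plan three-phase mission using MPS", subgoals=[
    Goal("4", "Plan positions & routes", subgoals=[
        Goal("4.2.1", "Identify suitable Battle Positions"),
        Goal("4.2.2", "Add Battle Position(s) to map"),
    ]),
])
```

Traversing to the leaves in this way is, in effect, what the SHERPA and task decomposition extensions operate over: each bottom-level operation becomes a row to be classified.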
CWA, on the other hand, provided four separate but related outputs, including the
ADS, the AH, the ConTA and the SOCA. These encompass a description of the
constraints imposed on activity, a formative description of the main activities involved and
the impact of systemic and situational constraints on them and a description of the
potential allocation of these activities between the actors working within the planning
system. Collectively, this enabled the mission-planning process (regardless of whether it
was undertaken using the MPS or paper-based maps) to be exhaustively described in terms
of constraints, which also led to the analyst acquiring a deep understanding of the domain
and mission-planning task. Decomposing activity-related constraints in this way also
offers a powerful description of the system in question and this information can be used to
directly inform system and interface design, training design and allocation of functions.
Despite being less detailed, and at a higher level of analysis than the HTA, this case
study suggests that, based on each method’s outputs alone, CWA offers more (especially
when one considers the quicker application time in this case). The CWA output offers
more perspectives on the system in question; its five-pronged approach allows analysts to
more exhaustively describe the constraints on the system under analysis, albeit at a lower
level of granularity than HTA. Whilst HTA enhances its utility by acting as the input to
various other analysis approaches, such as human error identification and interface
analyses approaches, its initial output alone is nowhere near as comprehensive in terms of
the different perspectives on activities that CWA offers.
The difference in the level of granularity offered by the two analyses is significant. The
HTA goal decomposition went as far down as detailing button press activities (e.g. click on
intervisibility on drop down menu, click right mouse button to activate mapping menu,
click mouse on map to drop waypoint), whereas the CWA analysis worked at a much
higher level. This indicates one way in which the two can be used in a complementary
system design framework, since CWA can specify, at a high level, what functions are
required, how they can potentially be undertaken, in terms of high level strategies and by
whom, whereas HTA, at a more detailed level of granularity, can then be used to describe
exactly how these strategies might unfold, which, in turn, can be used to predict any errors
likely to emerge during task performance. By beginning at a high level of granularity and
describing the system and how activities can potentially unfold with CWA, one can then
specify at a greater level of detail, through HTA, how activities should occur and what
likely problems may be encountered. It is worth pointing out that the difference in
granularity of the outputs seen in this case does not mean that CWA cannot achieve the
same levels of granularity as HTA. However, for CWA to produce outputs of a similar
level of granularity to HTA, considerably more application time and data collection time
would be required.
One distinct advantage that HTA currently has over the CWA framework is its ability
to inform other analysis methods. For example, the HTA software tool used for this
analysis boasts 17 additional analysis modules, which use the initial HTA output as their
input (Salmon et al. 2009). At the time of writing this article, CWA does not have any
additional analysis methods available and is limited to its five phases and analyst
interpretation, although work is in progress investigating possible extensions to the
framework (e.g. Jenkins et al. 2008a). This advantage of HTA over CWA is tempered
slightly by the fact that CWA directly informs design and so does not require additional
methods. However, to the authors’ knowledge there are no formal, structured human
factors methods that are underpinned directly by an initial CWA.
What was done with the outputs is an interesting point of comparison
between the two. The detailed task description offered by HTA is useful, but it is more
descriptive than analytical and, save information requirements specification, offers little
direct input into design; only when other additional methods are applied to the HTA
output can it be used to inform the design process. In this case, SHERPA human error
identification analysis (Embrey 1986) and task decomposition analysis (Kirwan and
Ainsworth 1992) were undertaken, both of which offered significant design recommen-
dations for future iterations of the MPS. The SHERPA analysis was used to identify
elements of the current MPS graphical user interface, which were inadequate and could
potentially induce user errors during the mission-planning task. For example, the menu
structures used were found to be too deep, which could potentially induce wrong
selection of items from the menus by the user. Also, there were many instances in which
the MPS required the user to input data that were already contained within the system
(i.e. had already been entered by the user elsewhere), which created potential for
incorrect data entry errors. Other examples included instances where the user was
required to make calculations that the system could feasibly make, which created the
potential for miscalculation errors (e.g. of routes, fuel and loads). Remedial measures
were specified for the errors identified. These were primarily simple interface
modifications, such as automatic data propagation between windows, the provision of a
mouse-over function (i.e. pertinent information is displayed when user holds mouse over
object on map), the use of scalable icons on the map (i.e. icons that scale automatically
when the user zooms in and out of the map area), the use of shorter and simpler menu
structures, the use of standardised symbology (i.e. consistent between MPS and aircraft
being used) and the provision of a filter function, whereby the user could filter the map as
desired to remove unwanted clutter and reduce map-reading errors (i.e. filtering
functions recommended included enemy, friendly, route, ingress, holding areas, battle
positions, egress and route features).
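The prioritisation step implicit in the SHERPA output (the P and C columns of Table 1) can be sketched as a simple filter. The rows and threshold below are hypothetical stand-ins for the full analysis, not a reproduction of it:

```python
# Hypothetical rows mirroring the shape of Table 1 (P = probability, C = criticality).
sherpa_rows = [
    {"task": "4.2.3.1.1.1 Check current system mode",     "mode": "C1", "P": "H", "C": "L"},
    {"task": "4.2.3.2.1 Compute Radar Range",             "mode": "A7", "P": "L", "C": "H"},
    {"task": "4.3.2.3.6.2 Click on map to drop waypoint", "mode": "A5", "P": "M", "C": "L"},
]

RANK = {"L": 0, "M": 1, "H": 2}  # ordinal scale used in the table

def design_priorities(rows, min_level="M"):
    """Return task steps whose error probability or criticality meets the threshold."""
    threshold = RANK[min_level]
    return [r["task"] for r in rows
            if RANK[r["P"]] >= threshold or RANK[r["C"]] >= threshold]
```

Filtering at level "H", for example, surfaces the unchecked-mode and radar-range errors as the candidates most in need of remedial interface measures.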
The task decomposition analysis was used to decompose and analyse the
component mission-planning tasks to a deeper level of detail. The task decomposition
categories focused on included the type of activity (e.g. action, check, communication), the
initiating event or cue, information requirements, interface features (e.g. location, controls
and displays used), actions and decisions required, complexity and difficulty, outputs and
feedback. The task decomposition analysis in this case was useful in that it specified
explicitly the step-by-step task sequences involved in mission-planning operations. This
enabled the identification of a lack of links between the component parts of the MPS
software tool and the requirement for data propagation and additional automation of
planning tasks.
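A task decomposition row of the kind described above can be sketched as a simple record. The field names below are paraphrased from the categories listed in the text, and the example values are abridged from Table 2:

```python
from dataclasses import dataclass

@dataclass
class TaskDecompositionRow:
    """One row of a Kirwan and Ainsworth-style task decomposition
    (field names paraphrased from the categories in the text)."""
    step: str
    activity_type: list[str]       # e.g. action, check, communication
    initiating_cue: str
    information: list[str]
    interface_features: list[str]  # location, controls and displays used
    complexity: str                # H/M/L
    difficulty: str
    criticality: str
    outputs: str
    feedback: str

# Abridged example based on Table 2.
step_421 = TaskDecompositionRow(
    step="4.2.1 Identify Suitable Battle Positions",
    activity_type=["Check", "Decision", "Action"],
    initiating_cue="Beginning of Battle Position selection",
    information=["Map/Area Info", "Terrain", "Target Info"],
    interface_features=["Map Display", "Zoom controls", "Intervisibility"],
    complexity="H", difficulty="H", criticality="H",
    outputs="Battle Positions", feedback="N/A",
)
```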
One of the most attractive aspects of the CWA framework is that it is couched as a
design methodology. It is alleged that CWA directly informs the design of complex
socio-technical systems (e.g. Vicente 1999a), although it must be noted that there has
been limited evidence of this in the open literature. In this case, the CWA outputs
Table 3. Mission planning system (MPS) cognitive work analysis (CWA) and hierarchical
task analysis (HTA).
Note: The table shows what analysis each method produced, what extensions were applied and
what input to the MPS design each method offered.
indicated that the MPS interface was inappropriately designed in relation to the mission-
planning process. In particular, they suggested that the current interface embodies a
physical, rather than functional structure (i.e. the interface design and ordering of tasks
does not bear a strict resemblance to the actual functional structure of mission
planning). This conclusion was used to inform the redevelopment of the MPS training
syllabus structure. The WDA highlighted how the MPS uses a system of different
windows (e.g. aircraft configuration window, payload window, fuel window, route leg
window) to support the mission-planning process and that current training focuses on
training the users how to use each of these component windows, rather than on how to
plan a mission. It was concluded that this might lead to users developing a physical
understanding of the mission-planning process (i.e. understanding of how each
component window works) rather than a functional understanding (i.e. understanding
of the different functions involved and the relationships between them). The WDA
output, therefore, suggested that MPS training should focus initially on the mission-
planning process and then on the MPS functions that support the process, rather than
focus primarily on the MPS software tool. The means–ends links specified, and the
structure of the AH, were used to form the basis for training lesson sequencing and
teaching structure, which, in turn, has led to a more activity-focused teaching structure,
as opposed to the current application-focused training (Jenkins et al. 2008c). The
descriptions provided by the two approaches and the ways in which they were used in
this case are presented in Table 3.
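To make the lesson-sequencing idea concrete, the sketch below encodes a toy set of means–ends links (the window names come from the text, but the link structure is an assumption, not the published AH) and orders teaching top-down from planning functions to the windows that support them:

```python
# Illustrative means-ends links; the link structure is assumed for this sketch.
means_ends = {
    "Plan a mission": ["Configure aircraft", "Plan route", "Manage fuel & payload"],
    "Configure aircraft": ["aircraft configuration window"],
    "Plan route": ["route leg window"],
    "Manage fuel & payload": ["fuel window", "payload window"],
}

def lesson_sequence(top_goal: str) -> list[str]:
    """Breadth-first walk of the means-ends links: teach the planning
    functions first, then the component windows that support them."""
    order, queue = [], [top_goal]
    while queue:
        node = queue.pop(0)
        order.append(node)
        queue.extend(means_ends.get(node, []))
    return order
```

Ordering lessons this way mirrors the recommendation above: a functional understanding of mission planning is built before any individual MPS window is introduced.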
It is clear that there are significant advantages associated with each form of analysis.
The outputs of both are useful in their own right and using both does not seem likely to
create any conflict when designing systems; rather, the outputs can be used in a
complementary manner. Further, when analysing existing systems the outputs derived
from both approaches are complementary. One does not disagree with the other; rather,
each describes the system in a different manner. In addition, often both approaches can be
applied to the same input data. This potential for using the approaches in a
activity that could go on within the system. Further, the who, what, why, when and where
associated with these goals and constraints is dealt with. Third, the system in question can
be analysed from both a high and low level of granularity. CWA offers a high level
description of the system, whereas HTA offers a more detailed description of activities at
the minute level. Fourth, and finally, all manner of human factors and ergonomics
analyses can be undertaken. As articulated previously, HTA offers numerous human
factors analysis extensions. CWA, on the other hand, deals with the work domain, the
control tasks and decisions, strategies for undertaking tasks, allocation of function across
humans and technological agents and the cognitive skills required.
Analyst experience, skill and methodological preference aside, the selection of one
method over the other is likely to be a function of the analysis requirements, the phase of
design at which the input is required and/or the domain in which the analysis is to take
place. Obviously, for first-of-a-kind system design, or for input at the beginning of the
system design life cycle, CWA is more appropriate. For describing and analysing an
existing system or concept, or for making small incremental design modifications, HTA is
more appropriate (in combination with one of its many extensions). If both methods
satisfy the analysis requirements, however, and analysis constraints (e.g. time available,
analyst skill set) mean that only one can be applied, then looking at the domain in which
the analysis is to take place is also useful. For domains in which activity is more structured
and proceduralised, HTA is likely to be more useful; however, for more complex domains,
CWA is likely to be more beneficial.
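The selection heuristic described in this section can be caricatured as a two-question decision rule; this is a deliberate simplification of the authors' guidance, not a formal procedure (real method selection also weighs analyst skill, time available and analysis goals):

```python
def choose_method(first_of_a_kind: bool, structured_and_proceduralised: bool) -> str:
    """Toy encoding of the selection heuristic discussed above (assumed rules)."""
    if first_of_a_kind:
        return "CWA"   # early in the design life cycle / novel system
    if structured_and_proceduralised:
        return "HTA"   # existing, proceduralised domain; pair with extensions
    return "CWA"       # complex, less structured domain
```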
In closing, it is clear that both approaches are entirely different. HTA describes
systems normatively in terms of goals, plans and activities, whereas CWA describes
systems formatively in terms of the constraints imposed on activities. Both have their
utility and indeed their place in the human factors practitioner's armoury. Whilst this
case study found that CWA offers more in terms of the number of outputs and
perspectives on the system under analysis, it also showed that HTA
offers more scope for further analysis through established, structured approaches (e.g.
error prediction, interface design, allocation of functions, training programme design,
etc.), albeit at a further time cost. It is concluded, however, that, although inherently
different in terms of their theoretical underpinning, approach and output, HTA and CWA
can be used as complementary approaches for either evaluating or designing complex
systems. In particular, it appears that CWA may be more suited to design specification,
whereas HTA may be more suited to design modification.
Acknowledgements
This work from the Human Factors Integration Defence Technology Centre was part-funded by the
Human Sciences Domain of the UK Ministry of Defence Scientific Research Programme. The
authors would like to thank Nick Wharmby, Shaun Wyatt, Jan Ferraro and Sean Dufosee for their
assistance in the data collection, data analysis and interpretation of the analysis products. Also, the
authors would like to thank the anonymous reviewers whose comments helped to improve this
article significantly.
References
Adams, P. and David, G.C., 2007. Light vehicle fuelling errors in the UK: the nature of the problem,
its consequences and prevention. Applied Ergonomics, 38 (5), 499–511.
Ahlstrom, U., 2005. Work domain analysis for air traffic controller weather displays. Journal of
Safety Research, 36, 159–169.
Annett, J., 2004. Hierarchical task analysis. In: D. Diaper and N.A. Stanton, eds. The handbook of
task analysis for human–computer interaction. Mahwah, NJ: Lawrence Erlbaum Associates,
67–82.
Annett, J. and Stanton, N.A., 1998. Task analysis. Ergonomics, 41 (11), 1529–1536.
Annett, J., et al., 1971. Task analysis. Department of Employment Training Information Paper 6.
London: HMSO.
Annett, J. and Stanton, N.A., 2000. Task analysis. London, UK: Taylor & Francis.
Bisantz, A.M. and Burns, C.M., 2008. Applications of cognitive work analysis. Boca Raton, FL: CRC
Press.
Bisantz, A.M., et al., 2003. Integrating cognitive analyses in a large-scale system design process.
International Journal of Human-Computer Studies, 58, 177–206.
Bolia, R.S., 2005. Intelligent decision support systems in network-centric military operations. In:
Intelligent decisions? Intelligent support? Pre-proceedings for the International Workshop on
Jansson, A., Olsson, E., and Erlandsson, M., 2006. Bridging the gap between analysis and design:
improving existing driver interfaces with tools from the framework of cognitive work analysis.
Cognition, Technology & Work, 8 (1), 41–49.
Jenkins, D.P., et al., 2008a. Cognitive work analysis: coping with complexity. Aldershot, UK:
Ashgate.
Jenkins, D.P., et al., 2008b. Applying cognitive work analysis to the design of rapidly reconfigurable
interfaces in complex networks. Theoretical Issues in Ergonomics Science, 9 (4), 273–295.
Jenkins, D.P., et al., 2008c. Using cognitive work analysis to explore activity allocation within
military domains. Ergonomics, 51 (6), 798–815.
Jenkins, D.P., Stanton, N.A., Salmon, P.M., Walker, G.H., and Rafferty, L. (In Press). Using the
decision-ladder to add a formative element to naturalistic decision-making research.
International Journal of Human Computer Interaction.
Kirwan, B. and Ainsworth, L.K., 1992. A guide to task analysis. London, UK: Taylor & Francis.
Kilgore, R.M., St-Cyr, O., and Jamieson, G.A., 2009. From work domains to worker competencies:
A five-phase CWA for air traffic control. In: A. Bisantz and C. Burns, eds. Applications of
cognitive work analysis. Boca Raton, FL: Taylor and Francis Group, LLC, 15–47.
Lane, R., Stanton, N.A., and Harrison, D., 2007. Applying hierarchical task analysis to medication
administration errors. Applied Ergonomics, 37 (5), 669–679.
Lintern, G. and Naikar, N., 2000. The use of work domain analysis for the design of training teams.
Proceedings of the joint 14th triennial congress of the International Ergonomics Association/44th
Annual Meeting of the Human Factors and Ergonomics Society (HFES/IEA 2000), San Diego,
CA. Santa Monica, CA: Human Factors and Ergonomics Society, 198–201.
Lintern, G., 2009. The theoretical foundation of cognitive work analysis. In: A. Bisantz and
C. Burns, eds. Applications of cognitive work analysis. Boca Raton, FL: Taylor and Francis
Group, LLC, 321–355.
Marsden, P. and Kirby, M., 2004. Allocation of functions. In: N.A. Stanton, A. Hedge,
K. Brookhuis, E. Salas and H. Hendrick, eds. Handbook of human factors and ergonomics
methods. Boca Raton, FL: CRC Press.
Mazaeva, N. and Bisantz, A.M., 2007. On the representation of automation using a work domain
analysis. Theoretical Issues in Ergonomics Science, 8 (6), 509–530.
Memisevic, R., Sanderson, P.M., Choudhury, S., and Wong, W., 2005. Work domain analysis and
ecological interface design for hydropower system monitoring and control. In: Proceedings of
the IEEE Conference on Systems, Man, and Cybernetics (IEEE-SMC 2005), Hawaii, USA,
3580–3587.
Miller, G.A., Galanter, E., and Pribram, K.H., 1960. Plans and the structure of behaviour. New York:
Holt.
Miller, C.A. and Vicente, K.J., 2001. Comparison of display requirements generated via Hierarchical
Task and Abstraction-Decomposition Space Analysis Techniques. International Journal of
Cognitive Ergonomics, 5 (3), 335–355.
Ministry of Defence, 2005. Joint service publication 777 – Network enabled capability. Version 1,
Edition 1, http://www.mod.uk/DefenceInternet/AboutDefence/CorporatePublications/
ScienceandTechnologyPublications/NEC/, accessed 18th July 2007.
Naikar, N., Hopcroft, R., and Moylan, A., 2005. Work domain analysis: theoretical concepts and
methodology. Defence Science & Technology Organisation Report, DSTO-TR-1665.
Fishermans Bend, Australia: Air Operations Division.
Naikar, N., Moylan, A., and Pearce, B., 2006. Analysing activity in complex systems with cognitive
work analysis: Concepts, guidelines, and case study for control task analysis. Theoretical
Issues in Ergonomics Science, 7 (4), 371–394.
Naikar, N. and Sanderson, P.M., 1999. Work domain analysis for training-system definition.
International Journal of Aviation Psychology, 9, 271–290.
Naikar, N. and Sanderson, P.M., 2001. Evaluating design proposals for complex systems with work
domain analysis. Human Factors, 43, 529–542.
Naikar, N. and Saunders, A., 2003. Crossing the boundaries of safe operation: A technical training
approach to error management. Cognition Technology and Work, 5, 171–180.
Naikar, N., et al., 2003. Technique for designing teams for first-of-a-kind complex systems with
cognitive work analysis: Case study. Human Factors, 45 (2), 202–217.
Naikar, N., 2009. Beyond the design of ecological interfaces: Applications of work domain analysis
and control task analysis to the evaluation of design proposals, team design and training.
In: A. Bisantz and C. Burns, eds. Applications of cognitive work analysis. Boca Raton, FL:
Taylor and Francis Group, LLC, 69–94.
Olsson, G. and Lee, P.L., 1994. Effective interfaces for process operators. The Journal of Process
Control, 4, 99–107.
Piso, E., 1981. Task analysis for process-control tasks: The method of Annett et al. applied.
Occupational Psychology, 54, 347–354.
Rasmussen, J., 1986. Information processing and human-machine interaction: an approach to
cognitive engineering. New York: North-Holland.
Dr Dan Jenkins is director of Sociotechnic Solutions Limited, a company specialising in the design and
optimisation of complex sociotechnical systems.
Professor Neville Stanton holds a Chair in Human Factors in the School of Civil Engineering and the
Environment at the University of Southampton.
Dr Guy Walker is a senior lecturer in the School of the Built Environment at Heriot Watt University.