Adapting Usability Investigations For Agile User-Centered Design

Keywords
usability method, Agile, XP, iterative development, software, case study, field study, contextual inquiry, ethnography, formative usability testing, user-centered design, iterative design

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Copyright 2006, ACM.
Figure 1. In a waterfall development cycle, analysis, design, coding, and quality assurance testing are separate stages of a software
release that spans months or years. In Agile development, each of a set of incremental mini-releases (each created in 2-4 weeks) has
these stages. Adapted from Cutter Consortium [8].
At the beginning of each iteration cycle, the full, cross-functional Agile team meets to do cycle planning. They determine the theme, or user story, of the next working version, and the features to put in it. Future cycles remain more loosely planned, since each cycle planning session is based on the most current information.

Cycle planning is guided by an overall vision or plan for the release. At Alias, Agile teams did release-level planning during a brief 4- to 6-week phase called Cycle Zero. The first iteration immediately follows Cycle Zero.

Each feature (the smallest development component, as defined by developers) is described on an index card called a feature card. Feature cards are grouped into iteration cycle clusters, and displayed in a public space for the whole Agile team as a communication artifact, in lieu of more traditional planning documents. Each feature card describes the acceptance criteria that determine when that feature is complete, and also includes a time estimate for completion.

The Agile team meets daily at a short, stand-up meeting (sometimes called a scrum), where team members each describe what they are working on, and any blocking issues. Scrums, through face-to-face communication, take the place of detailed documents to guide project planning.

Working versions are periodically delivered to users or to customers to validate the acceptance criteria for the feature cards. Their feedback influences current cycle implementation, and directs future cycle planning.

Note that in Agile terminology, a customer is not a person external to the product team who purchases or uses the product, but a role filled by one or more members of the product team. The duties of the Agile customer include acting as the voice of the end-user on the development team, and helping to prioritize and plan cycles and releases. Miller [18] suggests that interaction designers who are willing to understand and accept Agile development concepts are well-suited to take on the Agile customer role.

This is particularly relevant to Agile usability investigations because Agile projects are highly feedback-driven, yet product teams often rely on user opinion in situations where observation is more appropriate (such as the focus group elicitation strategy described earlier). Usability practitioners can be the best-suited members of an Agile team to prevent this type of data bias because of their skills in gathering and analyzing user experience data. On the Agile projects that our User Experience Team works on, interaction designers assume the role of the Agile customer.

The iterative and incremental lifecycles of Agile development methods described in Figure 1 are similar to those of other iterative development processes (such as the Rational Unified Process). They differ mainly in the length of the iteration timeboxes (in Agile, measured in weeks rather than months), the fixed nature of the cycle end dates, and the highly collaborative and document-light form of project planning and implementation. (There are other differences as well, but these affect developers rather than usability practitioners.)

Because of the similarities in their development lifecycles, the adaptations to usability investigations described in this paper may also benefit usability practitioners working on projects using an iterative development process.

Changes to the timing of usability investigations

Problems with the timing of waterfall UCD
Previously, in waterfall development projects, the User Experience Team conducted usability investigations as early as we could during the development cycle.

We performed contextual inquiry (sometimes combined with field usability testing of the prior release) before or at the onset of a project, often during a market validation for a release. During the design phase, we would rapidly iterate on key designs for a release, using formative in-house usability testing to direct the redesign of prototypes [19,20]. We would then describe the validated designs as feature specifications, and pass them on to development to be incorporated into the implemented code.

In theory, the analysis and design phases preceded the implementation (see Figure 2). However, in practice, developers would begin coding at the onset of a project without waiting for feature specifications. The result was that the implementation of some features would begin before they were designed. To combat this tendency, we investigated, designed, and validated well in advance, often conducting usability investigations almost a full release ahead. However, this led to writing many unused or out-of-date feature specifications, since we could not anticipate all planning issues (such as business goals).
Figure 2. In the perfect theoretical version of waterfall development, usability investigations contributing to the analysis and design
phases were supposed to precede coding, but in reality developers would begin coding immediately.
Also, since months often would pass between when we specified a design and the time it was coded, implementation would sometimes drift from the design intent.

Finally, development for all features going into a release would begin simultaneously, with one developer working on each feature. Because there were more developers than interaction designers, the result was that some features in a release were designed while other features were not. Furthermore, because work on all features was partially implemented, waterfall products often shipped with incomplete features, despite delaying the release date to try to accommodate feature completion.

We were caught between wanting to push our requirements gathering as late as possible, so that the most timely information would inform the product direction, and yet not leaving it so late that too many undesigned features started being implemented [13].

Just-in-time design
In contrast, on Agile projects the Agile team only focuses on a few new features at a time. This means that the User Experience Team does not have to work on all the designs in a release at the same time. Instead, we can focus on the most important designs, a few at a time.

At any given time during the Agile development cycle for a release, we conduct usability activities for only those key designs. We then work closely with developers to ensure that implementations of the designs do not drift from the validated design intent.

Because developers are working on only a subset of features at one time, and interaction designers are designing the same subset, this also means that any features that require careful UCD work receive it. Since everything that goes into the product must be fully done, traces of half-complete features don't impede the user experience.

In waterfall UCD, field investigation data would usually not have a visible product impact until the next release. Just-in-time design spreads out contextual inquiry and field work through the whole development process, instead of concentrating those activities at the beginning of the lifecycle (or in the lifecycle of the previous release). Consequently, the data that we bring to Agile projects during cycle planning is up-to-the-minute. This allows product improvements to be implemented within the current release—sometimes even in the next working version.

Two parallel tracks: iterating the design and implementation separately, but simultaneously
A key principle of our User Experience Team's UCD process is design iteration; we need to be able to catch design failures early, change designs as many times as needed, and then incorporate the design fixes [19]. Therefore, we not only need to conduct formative usability tests to check our prototypes, but we need to do so before coding begins, while the design is still malleable. Because coding begins immediately in Agile development, we needed to find a way to separate design iterations from implementation iterations.

To do this, the UCD work was done in an Interaction Designer Track while developers worked in a separate and parallel Developer Track [13,14]. The Agile UCD parallel tracks for development and interaction design are illustrated in Figure 3.
Figure 3. To allow the User Experience Team to iterate on designs, we usability tested prototypes at least one cycle ahead of
developers, and then passed on the validated designs to be implemented. We would also conduct contextual inquiry for workflows at
least two cycles ahead, and usability test the implemented working version to check for design drift.
Usability investigation activities in Cycle Zero
Cycle Zero is the brief requirements-gathering phase at the start of the project. Usability investigation activities in Cycle Zero depend on whether the product is the next release of an existing product or completely new. They can include the following activities:

- Gathering data to refine or hone product- and release-level goals. Facilitating the alignment of all team members' understanding of these goals, so they constitute a shared vision.

- (For a completely new product) Interviewing or conducting contextual inquiry during customer site visits for market validation. Preparing high-level exploratory designs for market validation. Based on these data, deriving the design principles that inform and guide design decisions for the product.

- (For an ongoing release) Analyzing and summarizing prior contextual inquiry and usability test data. Based on these data, elucidating release-level design goals to inform and guide design decisions through all iterations.

- (For a completely new market or capability) Developing brief and vivid descriptions of target users and workflows (light personas and scenarios) from investigations.
For example, we conducted an investigation during Cycle Zero for SketchBook Pro v2.0, a digital sketching application developed specifically for use with tablet input devices. Version 1.0 of the product had a free trial version. The release goal of v2.0 was "to improve the rate of conversion of trial users to paid users by removing barriers to purchase." The User Experience Team helped to focus the feature set by conducting a survey targeted at people who had downloaded SketchBook v1.0, but who had not purchased it. These data helped refine the feature set for v2.0 from over 100 potential features to five major workflows. The release goal also informed design prioritization during cycle planning throughout the release lifecycle.

Usability activities for Cycle Zero of the first release of Autodesk Showcase (a real-time automotive 3D visualization product) were different. We helped the team prepare for a market validation trip to Europe, and also traveled there with the project manager and subject matter expert. We interviewed potential purchasers about their work activities for the areas that the new product would support. We then reported these back to the larger team. Also, these data were the foundation for the design principles we wrote for Autodesk Showcase that allowed us to make prioritization and design decisions as development progressed.

Cycle Zero usability activities are those that most closely resemble their waterfall UCD counterparts. However, they occur in weeks rather than months.

Usability investigation activities in iteration cycles
In Cycle 1, usability investigation activities can include:

- Designing prototypes for Cycle 2, and conducting rapid formative usability testing to refine their design.

- Conducting contextual inquiry and interviews to investigate designs for Cycle 3.

During the first few early cycles, to give interaction designers time to do these usability investigations, developers work on coding software architecture (which requires no user interface design) or important features that need only minor design.

For example, during Cycle 1 for SketchBook Pro v2.0, developers worked on adding Adobe Photoshop export. This was identified as a key issue that affected users' purchasing decisions. It required significant development effort, but had a very simple UI (adding an item to the File Save As type list).

In Cycle 2, the designs from Cycle 1 are presented to developers, who begin coding them. Interaction designers work closely with developers to answer questions about the design as it is implemented. Cycle 2 usability investigation activities can include:

- Prototyping and usability testing for Cycle 3 designs, using the requirements information gathered in Cycle 1.

- Contextual inquiry to investigate designs for Cycle 4.
This pattern of designing at least one cycle ahead of developers, and gathering requirements at least two cycles ahead, continues until the product is released.

In cycles later in the release, while we continue to focus on checking the implementation of designs, we can also begin some contextual investigations to prepare for the Cycle Zero of the next release [13,14].

Changes to the granularity of usability investigations

Problems with the size of the problems to investigate
The parallel tracks allowed the User Experience Team to iterate designs before they were implemented. However, we still had to deal with the reality of cycles that were only two to four weeks long. We could complete small designs in this timeframe, but complex designs required more than four weeks to finish. We needed to figure out how to do usability investigations for designs spanning more than one Agile cycle.

Furthermore, the overall speed of Agile UCD was much faster than when we were doing waterfall UCD. We had to move much more quickly toward design solutions, with fewer usability tests within a release.

Design chunking: Breaking designs apart into cycle-sized pieces
We looked to the Agile model of implementation for hints about how to approach this dilemma. Working versions are implemented mini-releases that incrementally build on each other. Based on the same principles, we decided to create mini-designs that incrementally build on each other.

We break large designs into small, cycle-sized pieces called design chunks that incrementally add elements to the overall design. We investigate, prototype, and usability test design chunks in the Interaction Designer Track, carrying the progressively built design forward in this track until it is complete. Then, we pass the finished design to the Developer Track for implementation [14].

Interaction designers are trained to consider experiences holistically, so breaking designs into pieces—especially into pieces that do not initially support workflows—can be difficult at first, but it is a skill that comes with practice. Design chunking yields many benefits in Agile UCD, which we will describe in later sections.

To deconstruct a large design into smaller pieces, it is essential to start with well-defined design goals and to understand the high-level design intent. Our design goals are derived from observation, which is why contextual inquiry plays a critical role in our Agile UCD process. Each design chunk lets us progressively achieve a subset of the design goals.

The priority and sequence of the design chunks are determined by what we can validate at any given time in the product lifecycle. We examine the full list of design goals, and decide which we can attain with the current resources within a cycle's length. There is also an ordering dependency. In Agile projects, components build on one another, so early design chunks must be low-level and fundamental—design attributes that will not change as more design chunks are added on top of them. For example, for SketchBook Pro v2.0, we needed to design the ability to move, rotate, and scale a selected area in a canvas. Contextual investigation during usability testing of the prior release told us that these functions were experienced as one single high-level activity (positioning and fitting a selection).
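The sequencing judgment described above, pick the design goals attainable within a cycle's capacity and schedule fundamental chunks before the chunks that build on them, can be illustrated with a small scheduling sketch. This is purely illustrative: the paper describes a manual planning decision, not an algorithm, and the function name, chunk names, and effort numbers below are invented.

```python
# Illustrative sketch only: the paper describes a manual planning
# judgment, not an algorithm. All names and numbers are invented.
def plan_cycles(chunks, deps, capacity_per_cycle):
    """Greedily assign design chunks to cycles so that a chunk is only
    scheduled after every chunk it builds on, and each cycle stays
    within the team's capacity (in days of design effort)."""
    done, cycles = set(), []
    while len(done) < len(chunks):
        # A chunk is "ready" once all of its prerequisites are designed.
        ready = [c for c in chunks if c not in done and deps.get(c, set()) <= done]
        if not ready:
            raise ValueError("circular dependency between design chunks")
        cycle, load = [], 0
        for c in ready:
            if load + chunks[c] <= capacity_per_cycle:
                cycle.append(c)
                load += chunks[c]
        if not cycle:  # a chunk larger than one cycle must be split further
            raise ValueError("chunk too large for one cycle; split it")
        done |= set(cycle)
        cycles.append(cycle)
    return cycles

# Example: the move/rotate/scale interactions are low-level chunks;
# the combined positioning workflow builds on all three.
chunks = {"move": 3, "rotate": 3, "scale": 3, "combined positioning": 5}
deps = {"combined positioning": {"move", "rotate", "scale"}}
print(plan_cycles(chunks, deps, capacity_per_cycle=10))
# [['move', 'rotate', 'scale'], ['combined positioning']]
```

The greedy loop mirrors the described practice: each pass fills one cycle with whatever ready chunks fit, so fundamental chunks land in an earlier cycle than the workflow-level chunk that depends on them.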
Other observations were that users didn't tend to need this functionality in the first few hours of use in the application, and that it was often used with 2D layers.

A few of the design goals that we derived from these investigations included:

- Cursor travel should be minimized when swapping between move, rotate, and scale.

- The interaction should feel natural and smooth.

- Users should be able to figure out how to position a selected area with out-of-box materials, but not necessarily in the first hour of use.

To evaluate these design alternatives, we asked internal users (quality assurance, training, support, or subject matter experts) to use high-fidelity prototypes to rotate images, and observed the interaction. The usability acceptance criteria for the design chunk included checking whether users could rotate the canvas accurately without instruction, and whether the interaction felt smooth to them. We could get feedback from each tester in less than two minutes—we did not even have to set up informal usability test sessions to achieve this. We could just drop by someone's desk with a tablet PC that had the prototypes installed, and ask for a few minutes of their time.

The interactions for the Move, Rotate, and Scale modes were good choices as early design chunks for the following reasons:

- We could prototype them very quickly. Our student intern coded more than a dozen variations for this design chunk within a cycle.

- It was easy to usability test them. In-house testers were sufficient to evaluate the usability criteria, and the prototypes for all modes took less than 15 minutes to test.

- We anticipated that some set-up or explanation would be needed to run the prototypes, and we knew that the testing tasks would be highly artificial, and operation-level, rather than workflow-level. For example, we told our testers that we had two different algorithms for rotating (without explaining how each one worked), and we had to intervene to switch between the two prototypes. This was fine for internal users, but it would be inappropriate and confusing for an external end-user to evaluate these prototypes.

- The interaction was a fundamental, low-level design component. There could be many different ways that we would approach how to position a selected area, but no matter what happened for later design chunks, we would need to decide on the drag interaction.

- The design problem could be validated in isolation. For the design goal we were looking at (a natural and smooth interaction), we didn't have to evaluate the three modes in combination, so prototypes for each mode could be quickly built separately. Once all the modes were combined in a later design chunk, we could address other more interdependent design problems.

In contrast, the types of design chunks that we usually complete in later cycles include the following:

- Prototypes that require an implementation or technology that hasn't been completed yet.

- Design chunks that provide workflow-level, rather than operation-level, functionality.

- Design chunks to support any investigation of a discoverability or learning goal, such as the design of how a user will access a new function. Since these are designs that depend on the first experience of a user, you need to replicate that experience to test them. These prototypes should be incorporated into a copy of a working version to avoid task bias.

- Design chunks that are hubs for other designs. For example, many different designs converged in the Brush Palette, which is why it was one of the last designs that we completed for SketchBook Pro v2.0.

At the same time that we break big designs into these small chunks, we are still completing small designs to pass to development for the next iteration cycle. Key to our success as interaction designers on Agile teams is that we keep ahead of development, feeding a steady stream of designs into the Developer Track. For this reason, we only use design chunking for a few key large designs per interaction designer in a release.

All of the illustrating examples in this article are slightly simplified for clarity. It is possible to chunk more complex designs than the one described. This is because design chunks are not complete designs. They are simply design components that can be prototyped, iterated, and validated within Agile timeframes. By design chunking, we do not ignore high-level design considerations; instead, we work toward them in cycle-sized steps.

Progressive refinement of protocols: breaking usability testing, contextual inquiry, and recruiting into cycle-sized pieces
Agile UCD presents particular challenges in protocol design for usability investigations, because of two considerations:

- The progressively incremental character of both implementation and design. It is one thing to decide to design in chunks that build incrementally, but how is it possible to validate and investigate many small pieces for different designs simultaneously? It is impossible to usability test early-release design chunks with external users, and seemingly impossible to conduct meaningful contextual investigations to understand the work users might do with them. Yet we needed to explore our users' work domains to derive relevant activities, both to design later design chunks and also to provide real-world (and hence, unbiased) validation activities.

- The fixed number of usability investigations that fit within the timeframe of a cycle. Because Agile development is faster than waterfall, the time to create a release is briefer. Thus, we have fewer usability tests per design than we did in waterfall UCD, and in particular, we have fewer opportunities to test full workflows before they are implemented. Yet, we need to uncover more information during each investigation, since we need to collect contextual information as we progress. In essence, we need to mine more ore while drilling fewer holes.

To overcome these hurdles, we took the Agile idea of progressive mini-iterations one step further. In addition to design chunking, we also progressively performed the following activities:

- defined test protocols

- recruited test participants

- conducted contextual investigations.

Just as working versions are mini-implementations that get progressively closer to the final release, and design chunks are mini-designs that get progressively closer to the final designs, we use usability testers who get progressively closer to our end-users, ask them to let us observe them doing work that gets progressively closer to their real-world activities, and then ask them to do those activities as usability test tasks [21].

Because usability investigations are a limited resource, we need to both maximize the information we collect per session, and to hoard the validation commodity of our external users. We reserve external users to test only mid- to late-stage design chunks, and the focus of those usability tests is on validating design goals that can only be determined by an actual user.

For the earliest design chunks, as described in the previous section, we use in-house users who share some characteristics with our end-users (that is, people who are not developers, and with the same domain knowledge as our users). We ask them to do operation-level tasks that would probably not occur in isolation in real-world work (such as arbitrarily rotating an image to different angles, for no reason).
In later design chunks that have prototypes capable of evaluating more holistic activities, we refine usability test activities. Beginning with our internal testers, we prepare very artificial tasks. For example, as shown in Figure 5, we asked SketchBook Pro testers to position and resize the shapes to fit within the appropriate boundary lines. This was a completely arbitrary and unnatural task. We admitted this to internal usability testers, and then asked for a more realistic example of when they might use the function. We used their examples as our next iteration of the usability test task.

… students, or students with digital sketching experience from industrial design programs or art colleges.

We continue to refine our usability test tasks with these participants, asking them whether the activities we ask them to do represent how they might really use the tool, and adjust the protocols for later testers.

Finally, when late-stage design chunks are available that can emulate partial in-product workflows, we bring these workflow prototypes to actual end-user sites for usability testing. We call these users Design Partners. They commit to evaluating a longitudinal series of workflow prototypes, and sometimes also working versions. They act as expert usability testers, and also as observable beta testers. (Users who cannot commit to a series of visits are used to test mid-stage design chunks at our lab.)

- Asking external users to bring work artifacts (identified by a remote interview) to our company. This does not yield any environmental data about our users, but can still provide us workflow context.

- Asking Design Partners to set up work artifacts at their site. We visit them to watch both a high-level walkthrough of the artifacts, and a compressed workflow demonstration to view detailed interactions.

- Installing a working version at a Design Partner's site, and then watching an artifact walkthrough and compressed workflow demonstration (on- or off-site).

This last stage—the ability to observe how using the actual implemented product changes the work behavior of users—is unique to Agile contextual investigations. This contextual data can inform future designs within the timeframe of a release. Comparable waterfall contextual inquiry sessions could only guide designs for the following release.

Mixing design chunks: studying more than one design problem at a time
There seems to be an overwhelming number of usability investigation activities for any given cycle in the Interaction Designer Track described in Figure 3. Design chunking is what gives us the freedom to solve the problem of how to do so much at the same time in fewer sessions. We combine smaller-scaled investigations for different design chunks into single usability investigation sessions.

Here are three hypothetical examples of usability investigation sessions for different design chunks of several complex designs:

- (Early design cycle) In-house usability test, with internal users. A 15-minute session where a QA person evaluates 6-8 very low-level prototypes for two designs by performing operation-level tasks. During the session, we ask for a better activity-level task for each tool.

- (Mid-release design cycle) In-house usability test, with external users. A one-hour session. Before the test, we interview the tester by telephone about a workflow for a future design (two cycles later), and ask her to bring some relevant files on a USB drive. We begin the session with the contextual investigation, watching her walk us through the files and demonstrate key interactions. We also usability test four prototypes exploring different stages for two design chunks we are designing in the current cycle. During the session, we check that our test activities are representative. If needed, we will adjust the tasks for the tester coming in the next day.

- (Late design cycle) Usability investigation, at a Design Partner's site. A 2-hour session, the second in a series of longitudinal visits to the same user. Before our visit, we ask the tester to set up some files to show us what type of work he was doing with the working version that we left with him on our last visit. We also interview him concerning a workflow for a future design (for the next release), and ask if he can show us the relevant files. When we arrive, we observe as he walks us through the files and demonstrates the interactions. We note any issues that we observe.
In addition to keeping in touch with the whole Agile team through the daily scrum, we work with developers very closely throughout design and development. Although the dual tracks depicted in Figure 3 seem separate, in reality, interaction designers need to communicate every day with developers. This is not only to ensure that designs are being implemented correctly, but also so that we have a thorough understanding of technical constraints that affect design decisions [13].

Issue Cards: communicating to persuade
We report information gathered from our Design Partners the day after the visit in a review meeting that happens after the scrum. Any interested Agile team members can stay for the presentation, and it's usually the full team. This is far more people than those who read our reports in the waterfall UCD days, and a wider cross-section of the team than those who used to listen to our debrief presentations. It often includes, for example, technical writers and QA people.

We present the following kinds of information in the form of verbal stories, supplemented where necessary by a demonstration of the prototypes or working versions to represent the observed interactions:

- major usability problems with design prototypes

- feature requests

- bugs in working versions.

With these stories and demos, we replace personas with people, and scenarios with workflows and sample files.

We use index cards as a reporting artifact for these data, so the team is reminded of the presentation during later cycles. To prepare for this, on the way back from a Design Partner site, the interaction designers write out individual usability issues, feature requests, and bugs on issue cards, which are blue to differentiate them from feature cards, and have no implementation time estimates [14].

After we present the information to the Agile team, team members decide what to do about feature requests or unexpected uses of the product. Sometimes an issue card is moved immediately into the cycle planning board, and becomes a feature or design card. Any bugs are logged in our bug-tracking database.

The remaining issue cards are tracked on a User Experience board in the same public space as the cycle planning board. (See Figure 6.) On each issue card, the interaction designers note any fixes for a design issue, the usability testers who found it, and the iteration progress.
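Feature cards carry acceptance criteria and a time estimate, while issue cards carry neither; an issue card can later be promoted onto the cycle planning board as a feature or design card. As a purely hypothetical sketch of that distinction (the paper describes physical index cards, and every field name below is invented):

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical model of the physical index cards described in the text;
# all field names are invented for illustration.
@dataclass
class Card:
    title: str

@dataclass
class FeatureCard(Card):
    acceptance_criteria: list = field(default_factory=list)  # define "complete"
    estimate_days: Optional[float] = None                    # completion estimate

@dataclass
class IssueCard(Card):  # blue, to differentiate from feature cards
    kind: str = "usability issue"          # or "feature request" / "bug"
    fixes: list = field(default_factory=list)     # noted as iteration progresses
    found_by: list = field(default_factory=list)  # usability testers who found it
    # deliberately no estimate field: issue cards carry no time estimates [14]

def promote(issue: IssueCard) -> FeatureCard:
    """An issue card moved onto the cycle planning board becomes a feature
    (or design) card; the team then adds criteria and an estimate."""
    return FeatureCard(title=issue.title)

card = promote(IssueCard(title="Export to Photoshop", kind="feature request"))
print(card.estimate_days is None)  # True: estimate is added during cycle planning
```

Modeling promotion as constructing a new FeatureCard, rather than mutating the IssueCard, mirrors the physical act of moving a card between boards.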
word entirely. However, we had a key insight: we ourselves are the primary readers of design documents. We can, therefore, write the shortest possible documents that still capture the information that we need to reference [14].

This information is the design history for a feature or set of features. The purpose of the record is principally to avoid "design thrash," where design decisions are accidentally re-visited, particularly between versions, or if a new User Experience Team member is added to a project.

We generally write one document for each implementation chunk. The document is written in a medium that is easy to update (such as a Wiki page).

Information in a design history document can include the following:

- Design goals and a brief description of the problems that the design addresses.

- A high-level description of the design, including rough sketches, and a pointer to the last design prototype.

- Links to related design history documents.

- A reverse chronology of the design iterations, including the reasons for the design changes, and design limitations and constraints defined by usability investigation data as design work proceeds. Relevant usability investigation data are recorded in this chronology. This section of the document is written as design cycles progress. The oldest entry in this design chronology describes technology constraints.

As working versions are completed, the design chronology is extended to include additional workflow information and links to bugs related to changes in design, or unexpected uses.

To give an idea of the length of this type of light specification, the Move/Rotate/Scale selection feature for SketchBook Pro was written as two documents (one for each implementation chunk). One described the look and interaction behavior for the Move/Rotate/Scale UI widget, and the other described the interaction of this widget within a selected area of the canvas. The description sections of these documents (excluding the design chronology and feature cards) were, respectively, 1,215 and 735 words long.

Design history documents are available to the whole Agile team, but we have found that few team members read them, preferring the immediacy of face-to-face conversation to clarify issues.

Reflections

Five years ago, our User Experience Team faced the challenge of adjusting our practices. We didn't anticipate it then, but now we prefer Agile user-centered design for the following reasons:

- More of the product is designed than before.

- Usability investigations are conducted throughout the entire product release lifecycle, rather than clustered at the front end of a release, or in the prior release.

- The most important designs are worked on first, and there is no effort wasted writing unused designs.
- Prototype demonstrations and daily conversation have largely replaced detailed documents, such as usability test reports and UI specifications, when communicating with the product team. Documents are now written for interaction designers, to record a history of design decisions.

References

[1] Constantine, L. L. (2002). Process Agility and Software Usability. Information Age, August 2002.

[2] Beck, K., and Cooper, A. (2002). Extreme Programming vs. Interaction Design. Retrieved on December 8, 2006 from www.fawcette.com/interviews/beck_cooper/

[3] Patton, J. (2004). Interaction Design Meets Agility. Retrieved on December 8, 2006 from www.agilealliance.org/system/article/file/1368/file.pdf

[4] Pearson, G., and Pearsall, S. (2005). Becoming Agile: Usability and Short Project Cycles. User Experience, 4(4), 2005.

[5] McInerney, P., and Maurer, F. (2005). UCD in Agile Projects: Dream Team or Odd Couple? interactions, 12(6), November + December 2005, 19-23.

[6] Lee, J.C. (2006). Embracing Agile Development of Usable Software Systems. Proceedings of CHI 2006. Montréal: ACM.

[7] Lievesley, M.A., and Yee, J.S.R. (2006). The Role of the Interaction Designer in an Agile Software Development Process. Proceedings of CHI 2006. Montréal: ACM.

[8] Highsmith, J. A. III (2002). Agile Software Development (course notes). Arlington, MA: Cutter Consortium.

[9] Dumas, J., and Redish, J. (1999). A Practical Guide to Usability Testing (revised edition). Bristol, U.K.: Intellect.

[10] Rubin, J. (2001). Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. New York, NY: John Wiley & Sons.

[11] Holtzblatt, K., and Beyer, H. (1996). Contextual Design: Defining Customer-Centered Systems. San Francisco, CA: Morgan Kaufmann Publishers.

[12] Holtzblatt, K., Wendell, J.B., and Wood, S. (2005). Rapid Contextual Design. San Francisco, CA: Morgan Kaufmann/Elsevier.

[13] Miller, L. (2005). Case Study of Customer Input For a Successful Product. Proceedings of Agile 2005. Denver: Agile Alliance.

[14] Sy, D. (2005). Strategy & Tactics for Agile Design: A Design Case Study. Proceedings of UPA 2005. Montréal: Usability Professionals' Association.

[15] Highsmith, J. A. III (2000). Adaptive Software Development: A Collaborative Approach to Managing Complex Systems. New York, NY: Dorset House Publishing Co., Inc.

[16] Beck, K. (2000). Extreme Programming Explained: Embrace Change. Boston, MA: Addison-Wesley Professional.

[17] Schwaber, K., and Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

[18] Miller, L. (2006). Interaction Designers and Agile Development: A Partnership. Proceedings of UPA 2006. Denver/Broomfield: Usability Professionals' Association.

[19] Schrag, J. (2006). Using Formative Usability Testing as a Fast UI Design Tool. Proceedings of UPA 2006. Denver/Broomfield: Usability Professionals' Association.