Lecture 6
Usability Testing
Analytical evaluation
Aims:
• Describe the key concepts associated
with inspection methods.
• Explain how to do heuristic evaluation
and walkthroughs.
• Explain the role of analytics in evaluation.
Inspections
• Several kinds.
• Experts use their knowledge of users &
technology to review software usability.
• Expert critiques (crits) can be formal or
informal reports.
• Heuristic evaluation is a review guided
by a set of heuristics.
• Walkthroughs involve stepping through
a pre-planned scenario noting potential
problems.
Heuristic evaluation
• Developed by Jakob Nielsen in the early 1990s.
• Based on heuristics distilled from an
empirical analysis of 249 usability
problems.
• These heuristics have been revised for
current technology.
• Heuristics being developed for mobile
devices, wearables, virtual worlds, etc.
• Design guidelines form a basis for
developing heuristics.
Nielsen’s original heuristics
• Visibility of system status.
• Match between system and real world.
• User control and freedom.
• Consistency and standards.
• Error prevention.
• Recognition rather than recall.
• Flexibility and efficiency of use.
• Aesthetic and minimalist design.
• Help users recognize, diagnose, and recover from errors.
• Help and documentation.
Design Principles and Usability Heuristics
You can inspect an interface for usability problems with these principles
Design principles
• broad usability statements that guide a developer’s design efforts
• derived by evaluating common design problems across many
systems
Heuristic evaluation
• same principles used to “evaluate” a system for usability
problems
• becoming very popular
– user involvement not required
– catches many design flaws
• is an “expert review”
Design principles and usability heuristics (II)
Advantages
• the “minimalist” approach
– a few general guidelines can correct for the majority of usability
problems
– easily remembered, easily applied with modest effort
Heuristic Evaluation
Usability principles
• Nielsen’s “heuristics”
– there are several slightly different sets (we will see one) of heuristics
• supplementary list of category-specific heuristics
– competitive analysis & user testing of existing products
Phases of Heuristic Evaluation
1) Pre-evaluation training
• give evaluators needed domain knowledge and information
on the scenario
2) Evaluation
• individuals evaluate and then aggregate results
3) Severity rating
• determine how severe each problem is (priority)
4) Debriefing
• discuss the outcome with design team
Severity Rating
Used to allocate resources to fix problems
Combination of
• frequency
• impact
• persistence (one time or repeating)
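As a rough illustration only (Nielsen's 0–4 severity scale ultimately relies on evaluator judgment), the three factors could be averaged into a single score; the names and weighting here are assumptions, not part of the method:

```python
# Illustrative sketch, not Nielsen's official procedure: rate each factor
# 0 (no problem) to 4 (severe), then average into a 0-4 severity score.
def severity(frequency, impact, persistence):
    """Return a 0-4 severity rating from three 0-4 factor ratings."""
    return round((frequency + impact + persistence) / 3)

print(severity(4, 3, 2))  # frequent, high-impact, one-time problem -> 3
```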
Nielsen’s Example Ratings List
Debriefing
Conduct with evaluators, observers, and development team
members
Results of Using HE
http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Why Multiple Evaluators (cont.)?
Good? Bad? (Slide example; judgments have changed over time as people moved away from audio tape in their lives.)
1 Simple and natural dialogue
Present exactly the information the user needs.
• less is more
– less to learn, to get wrong, to distract...
2 Speak the users’ language
Use terminology based on users’ language for task.
3 Minimize user’s memory load
Promote recognition over recall.
• computers are good at remembering things; people, not so much…
• menus, icons, choice dialog boxes vs command lines, field formats
• relies on visibility of objects to the user (but less is more!)
4: Be consistent
Consistency of effects.
• same words, commands, actions will always have the same effect in
equivalent situations
– predictability
• forms follow boiler plate
• same visual appearance across the system (e.g. widgets)
– e.g. different scroll bars in a single window system!
Consistency of input.
• consistent syntax across complete system
4: Be consistent
In application suites, keep individual applications consistent with one another.
5: Provide feedback
Continuously inform the user about:
• what it is doing
• how it is interpreting the user’s input
• user should always be aware of what is going on
5. Provide feedback
• What mode am I in now?
• What did I select?
• How is the system interpreting my actions?
5. Provide feedback
Should be as specific as possible, based on user’s input.
Best within the context of the action rather than with a dialog box.
5. Provide feedback
Response time is important…
• how users perceive delays
0.1 second max: perceived as “instantaneous”
1 second max: user’s flow of thought stays uninterrupted, but delay noticed
10 seconds: limit for keeping user’s attention focused on the dialog
> 10 seconds: user will want to perform other tasks while waiting and might
think that the application has failed
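These thresholds can be summarized as a small decision helper; this is an illustrative sketch (function name and return strings are assumptions), not a standard API:

```python
# Maps an expected delay to a feedback strategy, using the
# perception thresholds listed above (0.1 s / 1 s / 10 s).
def feedback_style(expected_seconds):
    if expected_seconds <= 0.1:
        return "none"                  # perceived as instantaneous
    if expected_seconds <= 1.0:
        return "busy cursor"           # noticeable, but flow of thought intact
    if expected_seconds <= 10.0:
        return "progress indicator"    # keep attention on the dialog
    return "percent-done dialog"       # user may switch tasks; show progress

print(feedback_style(0.05))  # none
print(feedback_style(30))    # percent-done dialog
```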
5. Provide feedback
Dealing with long delays…
• Cursors
– for short transactions
• Percent-done dialogs
– for longer transactions
• how much left
• estimated time
• what it is doing
NOTE: When giving this type of feedback, take care to do so in a meaningful
fashion based upon percent of time. For example, if doing a progress bar for an
e-mail client, rather than the % of messages sent, use % of size of messages.
• “Still Working”
– for unknown/changing times
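The e-mail example in the note above can be sketched as follows; the message sizes are hypothetical:

```python
# Progress weighted by bytes sent rather than message count, so one huge
# attachment doesn't leave the bar stuck at "2 of 3 messages".
def progress_by_size(sent_sizes, all_sizes):
    """Fraction complete, weighted by message size in bytes."""
    return sum(sent_sizes) / sum(all_sizes)

sizes = [2_000, 3_000, 5_000_000]          # hypothetical message sizes
done = progress_by_size(sizes[:2], sizes)  # 2 of 3 messages already sent
print(f"{done:.1%}")                       # ~0.1%: almost all bytes remain
```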
6. Provide clearly marked exits
“How do I get out of this?”
Strategies:
• Cancel button (for dialogs waiting for user input)
• Universal Undo (can get back to previous state)
• Interrupt (especially for lengthy operations)
• Quit (for leaving the program at any time)
• Defaults (for restoring a property sheet)
7. Provide shortcuts
Experienced users should be able to perform frequently
used operations quickly!
Strategies:
• keyboard and mouse accelerators
– abbreviations
– command completion
– menu shortcuts
– function keys
– double clicking vs menu selection
• navigation jumps
– e.g., going to window/location directly, and avoiding intermediate nodes
• history systems
– WWW: ~60% of pages are revisits
Slide examples:
• keyboard accelerators for menus
• customizable toolbars and palettes for frequent actions
• right-click raises toolbar dialog box
• right-click raises object-specific menu
• scrolling controls for page-sized increments
• alternate representation for quickly doing a different set of tasks
• toolset brought in appropriate to this representation
Errors we make
• Mistakes
– arise from conscious deliberations that lead to an error instead of the correct
solution
• Slips
– unconscious behavior that gets misdirected en route to satisfying goal
•e.g. drive to store, end up in the office
Types of slips
Capture error (habit)
• a frequently performed activity takes charge “on autopilot” instead of the
one intended at the time
– occurs when common and rarer actions have same initial sequence
–change clothes for dinner and find oneself in bed (William James, 1890)
–confirm saving of a file when you don’t want to replace it
(“I can’t believe I pressed Yes...”)
Types of slips
Description error
• intended action has much in common with others that are possible
– usually occurs when right and wrong objects physically near each other
–pour juice into bowl instead of glass
–go jogging, come home, throw sweaty shirt in toilet instead of laundry basket
–move file to trash instead of to folder
Loss of activation
• forgetting what the goal is while undergoing the sequence of actions
– start going to room and forget why you are going there
– navigating menus/dialogs and can’t remember what you are looking for
– but continue action to remember (or go back to beginning)!
Mode errors
• people do actions in one mode thinking they are in another
– refer to file that’s in a different directory
– look for commands / menu options that are not relevant
Designing for slips
General rules
• Prevent slips before they occur
• Detect and correct slips when they do occur
• User correction through feedback and undo
Examples
• capture errors
– instead of confirmation, make actions undoable
– allows reconsideration of action by user
•e.g. Mac trash can can be opened and “deleted” file taken back out
• description errors
– in icon-based interfaces, make sure icons are not too similar,
– check for reasonable input, etc.
• loss of activation
– if system knows goal, make it explicit
– if not, allow person to see path taken
• mode errors
– have as few modes as possible (preferably none)
– make modes highly visible
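The capture-error bullet above (“instead of confirmation, make actions undoable”) can be sketched as a trash-can style soft delete; the class and names are hypothetical:

```python
# Soft delete: the file goes to a trash list instead of being destroyed,
# so a captured "Yes" can be undone, as with the Mac trash can.
class Trash:
    def __init__(self):
        self._items = []

    def delete(self, filename):
        self._items.append(filename)   # moved to trash, not destroyed

    def undo(self):
        # Take the most recently "deleted" file back out, if any.
        return self._items.pop() if self._items else None

t = Trash()
t.delete("report.txt")
print(t.undo())  # report.txt
```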
Warn
• warn people that an unusual situation is occurring
• when overused, becomes an irritant
– e.g.,
•audible bell
•alert box
Generic system responses for errors continued...
Do nothing
• illegal action just doesn’t do anything
• user must infer what happened
– enter letter into a numeric-only field (key clicks ignored)
– put a file icon on top of another file icon (returns it to original position)
Self-correct
• system guesses legal action and does it instead
• but leads to a problem of trust
– spelling corrector
Teach me
• system asks user what the action was supposed to have meant
• action then becomes a legal one
8 Deal with errors in a positive and helpful manner
Bad: “Try again… Error 25”
Better: “Cannot open ‘chapter 5’ because the application ‘Microsoft Word’ is not on your system.”
8 Deal with errors in a positive and helpful manner
Prevent errors.
• try to make errors “impossible” to make
• modern widgets: only “legal commands” selected, or “legal data” entered
(which of these might allow you to enter February 29th, 2014?)
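To answer the question above: February 29th, 2014 is invalid because 2014 is not a leap year, and a date widget’s validation can lean on the standard library’s calendar rules rather than ad-hoc checks. A minimal Python sketch:

```python
from datetime import date

def is_valid_date(year, month, day):
    """True if the calendar date exists; datetime rejects impossible dates."""
    try:
        date(year, month, day)
        return True
    except ValueError:
        return False

print(is_valid_date(2014, 2, 29))  # False: 2014 is not a leap year
print(is_valid_date(2016, 2, 29))  # True: 2016 is a leap year
```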
9. Provide help
Help is no substitute for good design!
Simple systems:
• walk up and use; minimal instructions
Most other systems:
• feature rich
• some users will want to become “experts” rather than “casual” users
• intermediate users need reminding, plus a learning path
Help is often consulted when users are in some kind of panic; they will want (and perhaps need) immediate help.
• indicates need for online documentation, good search/lookup tools
• online help can be specific to current context
• Kindle “Mayday” option?
Types of help
Tutorial and/or getting started manuals.
• short guides that people are likely to read when first obtaining their systems
– encourages exploration and getting to know the system
– tries to get conceptual material across and essential syntax
Types of help
Reference manuals.
• used mostly for detailed lookup by experts
– rarely introduces concepts
– thematically arranged
• on-line HTML
– search / find
– table of contents
– index
– cross-index
Types of help
Reminders to the user.
• short reference cards used to be VERY popular
– expert user who just wants to check facts
– novice who wants to get overview of system’s capabilities
Types of help
Context-sensitive help.
• system provides help on the interface component the user is currently
working with
– Macintosh “balloon help”
– Microsoft “What’s this” help
•brief help explaining whatever the user is pointing at on the screen
Example balloon for a title bar: “To move the window, position the pointer in the title bar, press the button, and drag it to the new position.”
Types of help
Wizards specific to task.
• walks user through typical tasks
• but dangerous if user gets stuck
(“What’s my computer’s name? Fred? Intel? AST?”)
Types of help
Tips to the user.
• provides migration path to learning system features
• also context-specific tips on being more efficient
• must be “smart”, otherwise boring and/or tedious and/or interrupts the user’s workflow (i.e., the Office Assistant had good and bad elements)
3 stages for doing heuristic evaluation
• Briefing session to tell experts what to
do.
• Evaluation period of 1-2 hours in which:
– Each expert works separately;
– Take one pass to get a feel for the product;
– Take a second pass to focus on specific
features.
• Debriefing session in which experts
work together to prioritize problems.
Advantages and problems
• Few ethical & practical issues to
consider because users not involved.
• Can be difficult & expensive to find
experts.
• Best experts have knowledge of
application domain & users.
• Biggest problems:
– Important problems may get missed;
– Many trivial problems are often identified;
– Experts have biases.
Heuristics for websites focus
on key criteria (Budd, 2007)
• Clarity
• Minimize unnecessary complexity &
cognitive load
• Provide users with context
• Promote positive & pleasurable user
experience
Cognitive walkthroughs
• Focus on ease of learning.
• Designer presents an aspect of the
design & usage scenarios.
• Expert is told the assumptions
about user population, context of
use, task details.
• One or more experts walk through
the design prototype with the
scenario.
• Experts are guided by 3 questions.
The 3 questions
• Will the correct action be sufficiently evident
to the user?
– Will users know what to do to achieve tasks?
• Will the user notice that the correct action is
available?
– Will users see how to do it (feasible functionality)?
• Will the user associate and interpret the
response from the action correctly?
– Will users receive feedback on correct and
incorrect actions?
As the experts work through the scenario they note
problems.
Pluralistic walkthrough
• Variation on the cognitive walkthrough
theme.
• Performed by a carefully managed team.
• The panel of experts begins by working
separately.
• Then there is managed discussion that
leads to agreed decisions.
• The approach lends itself well to
participatory design.
Analytics