
CMPT 363: User Interface Design

Fall 2022

Week 2: Design Heuristics, Usability Testing


Instructor: Victor Cheung, PhD
School of Computing Science, Simon Fraser University

© Victor Cheung, 2022


Recap from Last Lecture

• HCI & UX
• HCI: concerned with design, evaluation, & implementation, as well as phenomena surrounding them
• UX: encompasses all aspects of the end-user’s interaction
• Importance of good user interfaces
• How do we design good user interfaces
• Usability (goals)
• Design principles (guidelines)
• Affordances for computer interfaces
• Some differences from traditional objects with physical forms

Review from Last Lecture

• For each design, state which principle it is most likely based on:
• Make the “add to cart” button in a shopping website bigger and increase its contrast against the
background → Visibility
• Change the cursor shape to a ↔ when the user hovers the mouse cursor over an edge of the
window to let them adjust the window size → Affordance (or perceived affordance)
• All the labels of buttons in the interface begin with an action verb like “remove”, “add”, “edit” →
Consistency
• Show a loading animation when the system is processing the user’s request → Feedback
• If an item is not available for purchase, grey it out and make it not selectable → Constraints
Today

• Evaluating Interfaces
• Heuristic Evaluation (will continue in the next lecture)

• Assignment 1 is available on Canvas (based on heuristic evaluation), ask questions at Canvas Discussion
• Office hours (starting from Week 3)
• Arshdeep Singh Ahuja | Mondays 10a-11a @Zoom (link can be found in the Teaching Team page on Canvas)
• Rishabh Kaushal | Wednesdays 8:30a-9:30a @Zoom (link can be found in the Teaching Team page on Canvas)

Evaluating Interfaces

Why? What? Where? When?

Why Do We Want to Evaluate a User Interface?

• Users expect more than just a usable system


• Not just “does it get the job done?”, but also “how well does it get the job done?”
• Can be conducted before product launch to discover flaws (and fix them)

https://xkcd.com/2072/
What to Evaluate?

• All levels
• From prototype to final product
• All parts
• From individual components to the interface as a whole
• All attributes
• Aesthetics, safety, speed, error rate, duration

Where to Evaluate?

• Laboratory
• More control over the procedures & environment, good for pinpointing causal relationships (internal validity)
• Example: invite people to a facility and measure how fast they can find something in the interface
• Natural setting (aka in-the-wild)
• More realistic scenarios, typically more generalizable to the rest of the public (external validity)
• Example: give people an evaluation unit and have them try it out at home for a week

• Depends on what to evaluate


• For example, performance vs experience, specific vs general, stage of development

When to Evaluate?

• Any time!
• The idea is to make sure users’ needs are being met at every step during the design process
• It can be a formative process where the interface is evaluated while it is being formed (qualitative-driven),
or a summative process where the outcome of using the interface is evaluated (quantitative-driven)

“when the cook tastes the soup, that’s formative; when the guests taste the soup, that’s summative.” – R. Stake, 2004, p. 17

Stake, R. Standards-Based and Responsive Evaluation. Sage Publications, 2004.


Issues about Evaluation

• Factors that could impact validity of evaluation


• The surroundings (e.g., weather, noise, distraction)
• Bias (e.g., users doing the evaluation, questions being asked)
• Hawthorne effect (people modify their behaviour due to awareness of being observed)
• Ways to mitigate these factors
• Better control of the environment
• Involve more users in the evaluation, and ask non-biasing questions
• Better design of the evaluation process (will talk more about this in Week 12 – experimental design)

Types of Evaluation (ID-Book Ch 14.3)

• Controlled settings directly involving users


• Usually done in labs to provide the most control (mostly called usability testing/studies/experiments)
• Natural settings involving users
• Usually done outside labs, in the places where the interface is designed to be used (mostly called in-the-wild studies)

• Any settings not directly involving users


• Consultants/field experts instead of users (mostly called analytical evaluation)

Analytical Evaluation

• Done without involving users; instead, experts predict user behaviour and identify usability problems
based on knowledge of usability, users’ behaviour, the contexts in which the system will be used, and the kinds
of activities that users undertake (ID-Book, 14.3.3)
• Useful for uncovering key design issues before usability testing is conducted
• Examples:
• Heuristic Evaluation (we’ll cover this today)
• Cognitive Walkthrough (details in later lectures)
• Fitts’ Law Analysis (details in later lectures)

Heuristic Evaluation

Part of Discount Usability Engineering

Discount Usability Engineering

• “Something is better than nothing” – Jakob Nielsen


• Simple testing (5 users)
• Simple prototypes (for early testing)
• Heuristic evaluation

• Quick read (and a 3m16s video) on Discount Usability: 20 Years, by Jakob Nielsen
https://www.nngroup.com/articles/discount-usability-20-years/

The Heuristic Evaluation Method

• Experts evaluate the user interface using guidelines to “inspect” its usability
• Involving multiple evaluators can cover more usability problems

Key findings
• the easier problems are found by many evaluators
• the harder problems are found by only a few evaluators
• no single evaluator is the best (the harder problems are
not found by those who are the most successful)

Illustration showing which evaluators found which usability problems in a heuristic evaluation of a banking system.
Source: https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
How to Conduct Heuristic Evaluation

• Have multiple “evaluators” individually go through the interface several times


• Each evaluator inspects various components and compares them with usability principles (heuristics)
• General rules that seem to describe common properties of usable interfaces (e.g., Nielsen’s 10)
• Any other relevant ones for a specific element/product (e.g., redundancy for critical actions)
• Evaluators communicate and aggregate their findings

• Typically takes 1-2 hours. Split into smaller sessions for large/complicated interfaces

Nielsen, J. 1990. Paper versus computer implementations as mockup scenarios for heuristic evaluation.
Proc. IFIP INTERACT90 Third Intl. Conf. Human-Computer Interaction (Cambridge, U.K., August 27-31), 315-320.
How Many Evaluators?

• Nielsen suggests 3 to 5

Illustration showing the proportion of usability problems found (left) and the benefit-to-cost ratio (right) for
various numbers of evaluators, assuming $15,000 of value per problem found and $4,000 + $600n to hire n evaluators.
Source: https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
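To make the cost-benefit reasoning concrete, here is a minimal Python sketch of the Nielsen-Landauer prediction model that curves like these are based on. The dollar figures come from the slide; the single-evaluator detection rate (about 31%, the average Nielsen reports) and the total problem count are illustrative assumptions, not figures from the slide.

```python
# A minimal sketch of the Nielsen-Landauer model behind curves like these.
# Assumptions (not from the slide): one evaluator finds about 31% of the
# problems (Nielsen's reported average), and the interface hides 40 problems.
LAMBDA = 0.31                # proportion of problems a single evaluator finds
N_PROBLEMS = 40              # hypothetical total number of usability problems
VALUE_PER_PROBLEM = 15_000   # $ per problem found (figure from the slide)

def proportion_found(n: int) -> float:
    """Expected proportion of all problems found by n evaluators."""
    return 1 - (1 - LAMBDA) ** n

def benefit_to_cost(n: int) -> float:
    """Benefit-to-cost ratio, with cost $4,000 + $600n (from the slide)."""
    benefit = N_PROBLEMS * VALUE_PER_PROBLEM * proportion_found(n)
    return benefit / (4_000 + 600 * n)

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} evaluators: {proportion_found(n):6.1%} found, "
          f"benefit/cost = {benefit_to_cost(n):5.1f}")
```

With these made-up numbers the ratio peaks at roughly four evaluators, which is consistent with the 3-5 recommendation above.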
Outputs of Heuristic Evaluation

• A list of usability problems, each referencing the usability principle(s) violated (a hypothetical example record is sketched below)


• Design advice
• As an extension of the HE method
• Typically involves evaluators, observers, and design team representatives
• Brainstorm possible redesigns to address identified problems
• Discuss positive aspects of the design (HE doesn’t normally include this)

• Note: it is possible to evaluate a system even when it exists only on paper (Nielsen 1990)

Nielsen, J. 1990. Paper versus computer implementations as mockup scenarios for heuristic evaluation.
Proc. IFIP INTERACT90 Third Intl. Conf. Human-Computer Interaction (Cambridge, U.K., August 27-31), 315-320.
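As a concrete illustration of what an aggregated finding might look like, here is a minimal Python sketch. The record format and field names are hypothetical (the method does not prescribe one); the 0-4 severity scale follows Nielsen's severity ratings.

```python
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    """One finding from a heuristic evaluation (hypothetical format)."""
    location: str     # where in the interface the problem occurs
    description: str  # what the evaluator observed
    heuristic: str    # which usability principle is violated
    severity: int     # Nielsen's scale: 0 (not a problem) to 4 (catastrophe)
    evaluator: str    # who reported it, kept for aggregation

# Evaluators inspect individually; their findings are then aggregated:
findings = [
    UsabilityProblem("checkout page", "no indication of steps remaining",
                     "#1 Visibility of system status", 3, "Evaluator A"),
    UsabilityProblem("settings menu", "jargon label 'purge cache'",
                     "#2 Match between system & the real world", 2, "Evaluator B"),
]
for p in sorted(findings, key=lambda p: -p.severity):
    print(f"[sev {p.severity}] {p.location}: {p.description} ({p.heuristic})")
```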
Nielsen’s 10

• Visibility of system status


• Match between system + real world
• User control and freedom
• Consistency and standards
• Recognition rather than recall
• Error prevention
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, and recover from errors
• Help and documentation

#1 Visibility of System Status

• Keep users informed about what is going on in the system, through appropriate feedback within
reasonable time
• Example 1: Show which page users are on and what part of a process they are in (# of steps left)
• Example 2: Indicate offline mode, whether the connection is lost or offline is chosen by the user
• Increases transparency, predictability, sense of control, and trust

Indication of normal/incognito modes in Google Chrome.
#2 Match between System & The Real World

• Speak the users’ language, follow real-world conventions for showing information, and use real-life
metaphors to convey meaning
• Example 1: Say things from users’ perspective “You have bought…” instead of “We have sold you”
• Example 2: Interface components mimic corresponding tools in real-life
• Prevents errors, facilitates understanding, and promotes feelings of familiarity and care, leading to loyal users

Use an icon that reflects its real-life counterpart; use words that are common for shopping.
#3 User Control & Freedom

• Expect mistakes, provide an “emergency exit”, support undo and redo when possible (a minimal undo/redo sketch follows below)
• Example 1: Provide a “start over” option for form filling
• Example 2: Allow users to remove items from a shopping cart
• Allows flexibility and promotes confidence. Even frequent users make mistakes!
• Note: this heuristic doesn’t always refer to mistakes; it can also mean ease & availability of control

Allowing users to undo email actions in the Gmail app.
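This heuristic is also a small implementation pattern. Below is a minimal, hypothetical Python sketch of undo/redo support using two stacks of reversible actions; the names are illustrative and not tied to any real toolkit.

```python
# Minimal undo/redo sketch using two stacks of (do, undo) action pairs.
# All names here are hypothetical; real toolkits use richer command objects.

class UndoManager:
    def __init__(self):
        self._undo_stack = []  # actions that can be undone
        self._redo_stack = []  # actions that can be redone

    def perform(self, do, undo):
        """Run an action and remember how to reverse it."""
        do()
        self._undo_stack.append((do, undo))
        self._redo_stack.clear()  # a new action invalidates redo history

    def undo(self):
        if self._undo_stack:
            do, undo = self._undo_stack.pop()
            undo()
            self._redo_stack.append((do, undo))

    def redo(self):
        if self._redo_stack:
            do, undo = self._redo_stack.pop()
            do()
            self._undo_stack.append((do, undo))

# Usage: archiving an email, then undoing it (as in the Gmail example above).
inbox, archive = ["msg1"], []
mgr = UndoManager()
mgr.perform(do=lambda: archive.append(inbox.pop()),
            undo=lambda: inbox.append(archive.pop()))
mgr.undo()             # the message is back in the inbox
print(inbox, archive)  # ['msg1'] []
```

The design choice is the classic command pattern: every user action carries its own inverse, which is what makes a general “emergency exit” cheap to offer.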

#4 Consistency & Standards

• Use the same wordings, associate the same meaning with the same actions, as frequently as possible
• Example 1: Same command results in the same effect (Ctrl+C = copy)
• Example 2: Layout in the same website remains the same
• Promotes familiarity (branding); people like consistency (a very powerful attribute of usability)
• Fun read: https://www.nngroup.com/articles/power-law-learning/

The button for adding content is consistently available at the lower right corner across Google apps.

#5 Recognition Rather Than Recall

• Minimize users’ memory load by making objects, actions, and options visible
• Example 1: Provide history of actions or sites visited
• Example 2: Auto-fill previously entered emails or addresses
• Relieves users from having to remember information from one part of the dialogue to another
Illustration comparing two interfaces side by side; previews also help recognition.
Summary

• Evaluating Interfaces
• Why? What? Where? When?
• Heuristic Evaluation
• Expected output: usability problems, design advice
• Nielsen’s Heuristics (first 5 of 10)

Post-Lecture Activity

• Read/watch these (and those in the slides)


• Chapters 14 & 15 of ID-Book: Introducing Evaluation & Evaluation Studies
• Thinking Aloud: The #1 Usability Tool
https://www.nngroup.com/articles/thinking-aloud-the-1-usability-tool/
• Usability evaluation and analysis
https://www.hotjar.com/usability-testing/evaluation-analysis/

• Think about these as exercises:


• Suppose you are designing the interface for a new smartwatch for fitness. What will you test? Where will you test it? When in the
development process will you test it?
• Which heuristics do you think are the more important ones for a new user of the interface?

Exercises

• MC Question – Which of the following is *not* an expected benefit of usability testing?
• A - Validate the design with end users
• B - Make potential customers want to buy your product
• C - Identify issues of the interface
• D - Develop empathy towards end users

