
Institute of Engineering, Dept. of Computer Engineering

Course Introduction Session

Third Year, Computer Engineering

310245B Elective-I: Human Computer Interface

Vipin K. Wani
Course Objectives:
❑ To understand the importance of the HCI design process in software development
❑ To learn the fundamental aspects of designing and implementing user interfaces
❑ To study HCI from technical, cognitive, and functional perspectives
❑ To acquire knowledge about a variety of effective human-computer interactions
❑ To co-evaluate technology with respect to adapting to changing user requirements in
interacting with computers



Course Outcomes:
On completion of the course, students will be able to –
o Design effective human-computer interfaces for all kinds of users
o Apply and analyze user interfaces with respect to the golden rules of interface design
o Analyze and evaluate the effectiveness of a user-interface design
o Implement interactive designs for feasible data search and retrieval
o Analyze the scope of HCI in various paradigms such as ubiquitous computing, virtual reality,
multimedia, and World Wide Web related environments
o Analyze and identify user models, user support, and stakeholder requirements of HCI
systems



Course Contents
Unit IV: Usability Evaluation and Universal Design
User interface design process: Designing for People: Seven commandments, Usability Assessment in the Design process,
Common Usability problems, Practical and Objective measures of Usability, Formative and Summative evaluation, Usability
specifications for evaluation, Analytic methods, Model based analysis, GOMS model, Empirical methods, Field studies, Usability
testing in Laboratory, Controlled experiments, Heuristic Evaluation, Cognitive Walkthrough.
Evaluation framework: Paradigms and techniques, DECIDE: a framework to guide evaluation, Universal design principles,
Multi-modal interaction, Designing for diversity.



Unit IV

Usability Evaluation and Universal Design



Pre-Requisite Discussion on Design Process:
➢ To know the characteristics of human beings.
➢ To understand human considerations and common usability problems in
interfaces.
➢ To understand the methods used to analyze business requirements.
➢ To gain knowledge of design standards and style guidelines.



Obstacles and Pitfalls in the Development Path
Gould (1988) has made these general observations about design:

➢ Nobody ever gets it right the first time.


➢ Development is chock-full of surprises.
➢ Good design requires living in a sea of changes.
➢ Making contracts to ignore change will never eliminate the need for change.
➢ Even if you have made the best system humanly possible, people will still make mistakes
when using it.
➢ Designers need good tools.
➢ You must have behavioral design goals, just as you have performance design goals.
The first five conditions listed will occur naturally because people are people, both as users and as developers.
These kinds of behavior must be understood and accepted in design. User mistakes, while they will always
occur, can be reduced.



Obstacles and Pitfalls in the Development Path (contd.)
Pitfalls arise from a flawed design process, including a failure to
address critical design issues, an improper focus of attention, or failures in
development-team organization.
Common pitfalls are:
✓ No early analysis and understanding of the user’s needs and expectations.
✓ A focus on using design features or components that are “neat” or “glitzy.”
✓ Little or no creation of design element prototypes.
✓ No usability testing.
✓ No common design team vision of user interface design goals.
✓ Poor communication between members of the development team.



Designing for People: The Five Commandments
The complexity of a graphical or Web interface will always magnify any problems that do
occur. Pitfalls can be eliminated if the following design commandments remain foremost in
the designer’s mind.
Five Commandments are:

✓ Gain a complete understanding of users and their tasks.


✓ Solicit early and ongoing user involvement
✓ Perform rapid prototyping and testing
✓ Modify and iterate the design as much as necessary
✓ Integrate the design of all the system components



Usability Assessment in the Design process
Usability

The term usability is used to describe the effectiveness of human performance.


The term usability is defined as “the capability to be used by humans easily and
effectively, where

easily = to a specified level of subjective assessment,


effectively = to a specified level of human performance.”



Common Usability Problems

Mandel (1994) lists the 10 most common usability problems in graphical systems
as reported by IBM usability specialists. They are:
1. Ambiguous menus and icons.
2. Languages that permit only single-direction movement through a system.
3. Input and direct manipulation limits.
4. Highlighting and selection limitations.
5. Unclear step sequences.
6. More steps to manage the interface than to perform tasks.
7. Complex linkage between and within applications.
8. Inadequate feedback and confirmation.
9. Lack of system anticipation and intelligence.
10. Inadequate error messages, help, tutorials, and documentation.



Some Practical Measures of Usability

➢ Are people asking a lot of questions or often reaching for a manual?


➢ Are frequent exasperation responses heard?
➢ Are there many irrelevant actions being performed?
➢ Are there many things to ignore?
➢ Do a number of people want to use the product?



Some Objective Measures of Usability
Shackel (1991) presents the following more objective criteria for measuring usability
(a minimal sketch of checking such criteria against test data follows the list).
➢ How effective is the interface? Can the required range of tasks be accomplished:
❖ At better than some required level of performance (for example, in terms of speed and errors)?
❖ By some required percentage of the specified target range of users?
❖ Within some required proportion of the range of usage environments?
➢ How learnable is the interface? Can the interface be learned:
❖ Within some specified time from commissioning and start of user training?
❖ Based on some specified amount of training and user support?
❖ Within some specified relearning time each time for intermittent users?
➢ How flexible is the interface? Is it flexible enough to:
❖ Allow some specified percentage variation in tasks and/or environments beyond those first specified?
➢ What are the attitudes of the users? Are they:
❖ Within acceptable levels of human cost in terms of tiredness, discomfort, frustration, and personal effort?
❖ Such that satisfaction causes continued and enhanced usage of the system?
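To make these criteria concrete, here is a minimal sketch, not from the slides, of how the effectiveness question ("by some required percentage of the target users, at some required level of performance") might be checked against test data; all field names and thresholds are illustrative assumptions.

```python
# Minimal sketch of checking Shackel-style effectiveness criteria
# against usability-test data (illustrative fields and thresholds).
sessions = [
    {"user": "P1", "time_s": 30, "errors": 0},
    {"user": "P2", "time_s": 45, "errors": 2},
    {"user": "P3", "time_s": 70, "errors": 1},
]

MAX_TIME_S = 60     # required level of performance: speed
MAX_ERRORS = 1      # required level of performance: errors
TARGET_SHARE = 0.7  # required percentage of the target range of users

ok = [s for s in sessions if s["time_s"] <= MAX_TIME_S and s["errors"] <= MAX_ERRORS]
share = len(ok) / len(sessions)
print(f"{share:.0%} of users met the performance level "
      f"(target {TARGET_SHARE:.0%}): {'PASS' if share >= TARGET_SHARE else 'FAIL'}")
```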



Formative & Summative Evaluation
➢ Formative evaluation is typically conducted during the development or improvement of a program or
course.
➢ Summative evaluation involves making judgments about the efficacy of a program or course at its
conclusion.
➢ Formative evaluation takes place during the design and development process.
➢ Summative evaluation is aimed at measuring quality.
➢ Formative evaluation focuses on determining which aspects of the design work well or poorly, and why.
➢ Summative evaluation describes how well the design performs by comparing it against a benchmark.
Analytical Model
An analytical model is primarily quantitative or computational in nature and represents the system in terms
of a set of mathematical equations that specify parametric relationships and their associated parameter
values as a function of time, space, and/or other system parameters.
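A classic example of such an analytical model in HCI is Fitts’ law, which predicts pointing (movement) time from target distance and width. The slide does not name it; the sketch below is for illustration only, and the device constants a and b are assumed values (in practice they are fitted per device and user population).

```python
# Fitts' law, a classic analytic model of pointing time:
#   MT = a + b * log2(D / W + 1)   (Shannon formulation)
# D = distance to the target, W = target width.
# a and b are device/user dependent; the values here are illustrative.
from math import log2

def fitts_mt(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time in seconds."""
    return a + b * log2(distance / width + 1)

print(f"{fitts_mt(distance=300, width=20):.2f} s")  # small, distant target
print(f"{fitts_mt(distance=300, width=80):.2f} s")  # larger target is faster
```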
GOMS Model
GOMS (Goals, Operators, Methods, and Selection rules)
• The GOMS model was proposed by Card et al. (1983)

➢ Each user task is described by a goal and a method


➢ A method is a sequence of steps, each consisting of one or more
operators
➢ There can be more than one method for a task, in which case a
selection rule is needed
GOMS (Goals, Operators, Methods and Selection rules)
Goals, operators, methods, and selection rules form a technique from human-computer interaction (HCI) for

constructing a description of human performance. The level of granularity will vary based on the needs of the analysis.

✓ The Goal is what the user wants to accomplish.

✓ The Operator is what the user does to accomplish the goal.

✓ The Method is a series of operators that are used to accomplish the goal.

✓ Selection rules are used if there are multiple methods, to determine how one is selected over the others.
GOMS
▪ GOMS provides a higher-level language for task analysis and UI
modeling
▪ Generates a set of quantitative and qualitative measures based on
description of the task and user interface
▪ Provides a hierarchy of goals and methods to achieve them
▪ Different GOMS variants use different terms, operate at various
levels of abstraction, and make different simplifying assumptions
GOMS analysis – dragging a file to a destination
– Method for goal: drag item to destination (a rough KLM-style time estimate for this method is sketched below).
• Step 1. Locate icon for item on screen.
• Step 2. Move cursor to item icon location.
• Step 3. Hold mouse button down.
• Step 4. Locate destination icon on screen.
• Step 5. Move cursor to destination icon.
• Step 6. Verify the destination icon.
• Step 7. Release mouse button.
• Step 8. Return with goal accomplished.
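As referenced above, a rough execution-time estimate for this method can be obtained with the keystroke-level model (KLM), a GOMS variant. The encoding of the steps into operators below is one plausible reading, and the operator times are the commonly cited Card, Moran, and Newell averages; treat both as illustrative.

```python
# Keystroke-level model (KLM) estimate for the drag method above.
# Operator times are commonly cited averages, not definitive values.
KLM_TIMES = {
    "M": 1.35,  # mental preparation (locate/verify an icon)
    "P": 1.10,  # point with the mouse
    "B": 0.10,  # press or release the mouse button
}

# Steps 1-8 encoded as operators: locate item (M), move cursor (P),
# press button (B), locate destination (M), move cursor (P),
# verify destination (M), release button (B).
drag_method = ["M", "P", "B", "M", "P", "M", "B"]

total = sum(KLM_TIMES[op] for op in drag_method)
print(f"Estimated execution time: {total:.2f} s")  # -> 6.45 s
```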
GOMS example
GOAL: CLOSE-WINDOW
. [select GOAL: USE-MENU-METHOD
. MOVE-MOUSE-TO-FILE-MENU
. PULL-DOWN-FILE-MENU
. CLICK-OVER-CLOSE-OPTION
GOAL: USE-CTRL-W-METHOD
. PRESS-CONTROL-W-KEYS]

For a particular user:

Rule 1: Select USE-MENU-METHOD unless another rule applies.
Rule 2: If the application is GAME, select USE-CTRL-W-METHOD.
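A minimal sketch of these selection rules as code; the function name and application label are hypothetical, added only to show how a selection rule resolves to one method.

```python
# Selection rules for GOAL: CLOSE-WINDOW, expressed as a function.
# Names are hypothetical, for illustration only.
def select_close_window_method(application: str) -> str:
    # Rule 2: if the application is GAME, select the Ctrl-W method.
    if application == "GAME":
        return "USE-CTRL-W-METHOD"
    # Rule 1: select the menu method unless another rule applies.
    return "USE-MENU-METHOD"

print(select_close_window_method("EDITOR"))  # USE-MENU-METHOD
print(select_close_window_method("GAME"))    # USE-CTRL-W-METHOD
```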
GOMS
• Analysis is developed through hierarchical goal decomposition
➢ Start with the highest-level goals: what the user is trying to accomplish in the application
domain
➢ Provide a method in terms of high-level operators
➢ Each operator can potentially be decomposed:
• Replace the operator with an equivalent sub-goal, e.g.,
Step 2: OPEN MENU → Step 2: Accomplish_goal: OPEN MENU
• Specify a method for the sub-goal:
• Method_for_goal: OPEN MENU
Step 1: MOVE-CURSOR MENU-BAR
Step 2: CLICK MENU-NAME
Step 3: Return_with_goal_accomplished
Advantages of GOMS
➢ Gives qualitative & quantitative measures

➢ Model explains the results

➢ Less work than user study – no users!

➢ Easy to modify when UI is revised


Disadvantages of GOMS
➢ Only works for goal-directed tasks

➢ Does not model the novice user (assumes practiced, error-free behavior)

➢ Not as easy to apply as heuristic analysis or guidelines


Cognitive models
• Cognitive modeling is an area of computer science that deals with
simulating human problem-solving and mental processing in a
computerized model.
• Such a model can be used to simulate or predict human behavior or
performance on tasks similar to the ones modeled and improve human-
computer interaction
Cognitive models
• goal and task hierarchies

• linguistic

• physical and device

• architectural
Cognitive models
• They model aspects of user:
• understanding
• knowledge
• intentions
• processing
• Common categorisation:
• Competence vs. Performance
• Computational flavour
• No clear divide
Goal and task hierarchies
• Mental processing as divide-and-conquer
• Example: sales report
produce report
gather data
. find book names
. . do keywords search of names database
. . . … further sub-goals
. . sift through names and abstracts by hand
. . . … further sub-goals
. search sales database - further sub-goals
layout tables and histograms - further sub-goals
write description - further sub-goals
Motivation
• From task analysis to formal description and prediction of interaction
➢ GOMS is a task analytic notation for procedural knowledge
➢ Syntax and semantics similar to a programming language
➢ Assumes a simplified cognitive architecture (the Model Human Processor from
previous lectures)
➢ Can be executed in simulation (production-rule system)
➢ Static and run-time properties can provide quantitative predictions of usability, e.g.:
• time to complete a task
• complexity/difficulty/knowledge requirements
• short-term memory load
• novice vs. expert behaviour, rate of learning
• effects on all of these of changing the interface
DIRECT METHODS
Usability Laboratory Testing

▪ Users at work are observed, evaluated, and measured in a specially constructed laboratory

to establish the usability of the product at that point in time.
▪ Usability tests uncover what people actually do, not what they think they do (a common
problem with verbal descriptions).
▪ The same scenarios can be presented to multiple users, providing comparative data from
several users.



DIRECT METHODS
Advantages

▪ The significant advantage of the direct methods is the opportunity they provide to hear the
user’s comments in person and firsthand.
▪ Person-to-person encounters permit multiple channels of communication (body language,
voice inflections, and so on) and provide the opportunity to immediately follow up on
vague or incomplete data.
▪ Here are some recommended direct methods for getting input from users.



An evaluation framework
Motivation

• since it can be difficult to put together a customized evaluation plan


for each project, it is often useful to follow a template or framework
• evaluation paradigms have been identified over time as practical
combinations of methods

Objectives

• Explain key evaluation concepts & terms.


• Describe the evaluation paradigms & techniques used in interaction
design.
• Discuss the conceptual, practical and ethical issues that must be
considered when planning evaluations.
• Introduce the DECIDE framework.

Evaluation paradigm
• Any kind of evaluation is guided explicitly or implicitly by a set of
beliefs
• these beliefs are often supported by theory
• The beliefs and the methods associated with them are known as an
‘evaluation paradigm’
e.g. Role of users, who controls, location, when used, type of data…

Evaluation Techniques

• usability testing
• field studies
• predictive evaluation
• combined approaches
• walkthroughs
Usability Testing
• recording the performance of typical users
• on typical tasks in controlled settings
• field observations may also be used
• users are watched
• recorded on video
• their activities are logged
• mouse movements, key presses
• evaluation (see the sketch after this list)
• calculation of performance times
• identification of errors
• explanation of why the users did what they did
• user satisfaction
• questionnaires and interviews are used to elicit the opinions of users
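As referenced in the list above, a minimal sketch of computing such performance measures from logged test sessions; the field names and numbers are illustrative, not from the slides.

```python
# Summarizing logged usability-test sessions (illustrative data).
from statistics import mean

sessions = [
    {"user": "P1", "task_time_s": 42.0, "errors": 1, "completed": True},
    {"user": "P2", "task_time_s": 55.5, "errors": 3, "completed": True},
    {"user": "P3", "task_time_s": 90.0, "errors": 5, "completed": False},
]

completed = [s for s in sessions if s["completed"]]
print(f"Completion rate: {len(completed) / len(sessions):.0%}")
print(f"Mean time on task (completed runs): "
      f"{mean(s['task_time_s'] for s in completed):.1f} s")
print(f"Mean errors per session: {mean(s['errors'] for s in sessions):.1f}")
```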
Field Studies
• done in natural settings
• to understand what users do naturally and how technology impacts
them
• in product design field studies can be used to
- identify opportunities for new technology
- determine design requirements
- decide how best to introduce new technology
- evaluate technology in use
Predictive(Analytical) Evaluation
• experts apply their knowledge of typical users to predict usability
problems
• often guided by heuristics
• another approach involves theoretical models
• users need not be present
• relatively quick & inexpensive
Overview of Techniques
• observing users
• asking users about their opinions
• asking experts about their opinions
• testing the performance of users
• modeling the task performance of users
DECIDE:
A framework to guide evaluation
• Determine the goals the evaluation addresses.
• Explore the specific questions to be answered.
• Choose the evaluation paradigm and techniques to answer the
questions.
• Identify the practical issues.
• Decide how to deal with the ethical issues.
• Evaluate, interpret and present the data.
Determine the Goals
• What are the high-level goals of the evaluation?
• Who wants it and why?
• The goals influence the paradigm for the study
• Some examples of goals:
−Identify the best metaphor on which to base the design.
−Check to ensure that the final interface is consistent.
−Investigate how technology affects working practices.
−Improve the usability of an existing product.
Explore the Questions
• All evaluations need goals & questions to guide them so time is not wasted
on ill-defined studies.
• For example, the goal of finding out why many customers prefer to
purchase paper airline tickets rather than e-tickets can be broken down into
sub-questions:
- What are customers’ attitudes to these new tickets?
- Are they concerned about security?
- Is the interface for obtaining them poor?
• What questions might you ask about the design of a cell phone?
Choose the Evaluation Paradigm & Techniques

• The evaluation paradigm strongly influences the techniques used,


how data is analyzed and presented.
• E.g. field studies do not involve testing or modeling
Identify Practical Issues
• For example, how to:
• select users
• stay on budget
• stay on schedule
• find evaluators
• select equipment
Decide on Ethical Issues
• Develop an informed consent form
• Participants have a right to:
- know the goals of the study
- what will happen to the findings
- privacy of personal information
- not to be quoted without their agreement
- leave when they wish
- be treated politely
Evaluate, Interpret and Present Data
• How data is analyzed and presented depends on the paradigm and
techniques used.
• The following also need to be considered:
- Reliability: can the study be replicated?
- Validity: is it measuring what you thought?
- Biases: is the process creating biases?
- Scope: can the findings be generalized?
- Ecological validity: is the environment of
the study influencing it?
• e.g. the Hawthorne effect
Pilot studies
• A small trial run of the main study.
• The aim is to make sure your plan is viable.
• Pilot studies check:
• that you can conduct the procedure
• that interview scripts, questionnaires, experiments, etc. work
appropriately
• It’s worth doing several to iron out problems before doing the main
study.
Key points
• An evaluation paradigm is an approach that is influenced by particular theories
and philosophies.
• Five categories of techniques were identified: observing users, asking users,
asking experts, user testing, modeling users.
• The DECIDE framework has six parts:
- Determine the overall goals
- Explore the questions that satisfy the goals
- Choose the paradigm and techniques
- Identify the practical issues
- Decide on the ethical issues
- Evaluate ways to analyze & present data
• Do a pilot study
Activity: DECIDE Framework
• apply the DECIDE framework to one of the projects in this class
• design and development tools tutorial
• Learning Commons storyboard
An assignment for you …
• Find an evaluation study from the list of URLs on this site or one of your
own choice.
• Use the DECIDE framework to analyze it.
• Which paradigms are involved?
• Does the study report address each aspect of DECIDE?
• Is triangulation used? If so which techniques?
• On a scale of 1-5, where 1 = poor and 5 = excellent, how would you rate this
study?
Universal design
➢ Universal design refers to designing interfaces for all people.

➢ This means that the interface design should be usable by everyone in any situation.

➢ Universal design doesn't only mean making the design accessible for people with disabilities; instead, it

means designing the interface for all people inclusively.


Universal design principles
• equitable use
• flexibility in use
• simple and intuitive to use
• perceptible information
• tolerance for error
• low physical effort
• size and space for approach and use
Multi-Sensory Systems
• More than one sensory channel in interaction
• e.g. sounds, text, hypertext, animation, video, gestures, vision
• Used in a range of applications:
• particularly good for users with special needs, and virtual reality
• Will cover
• general terminology
• speech
• non-speech sounds
• handwriting
• considering applications as well as principles
Usable Senses
The 5 senses (sight, sound, touch, taste and smell) are used by us every
day
• each is important on its own
• together, they provide a fuller interaction with the natural world
Computers rarely offer such a rich interaction
Can we use all the available senses?
• ideally, yes
• practically – no
We can use • sight • sound • touch (sometimes)
We cannot (yet) use • taste • smell
Speech

Human beings have a great and natural mastery of speech

• this makes it difficult to appreciate its complexities, but it is an easy

medium for communication
Structure of Speech
phonemes
• 40 of them
• basic atomic units of sound
• sound slightly different depending on the context they are in; these
context-dependent variants are …
allophones
• all the distinct sounds in the language
• between 120 and 130 of them
• these are formed into …
morphemes
• the smallest units of language that have meaning.
Speech (cont.)
Other terminology:
• prosody
• alteration in tone and quality
• variations in emphasis, stress, pauses and pitch
• impart more meaning to sentences.
• co-articulation
• the effect of context on the sound
• transforms the phonemes into allophones
• syntax – structure of sentences
• semantics – meaning of sentences
Speech Recognition Problems
• Different people speak differently:
• accent, intonation, stress, idiom, volume, etc.
• The syntax of semantically similar sentences may vary.
• Background noises can interfere.
• People often “ummm.....” and “errr.....”
• Words are not enough - semantics is needed as well
• requires intelligence to understand a sentence
• context of the utterance often has to be known
• also information about the subject and speaker
e.g. even if “Errr.... I, um, don’t like this” is recognised, it is a fairly
useless piece of information on its own
Heuristic evaluation

1. Heuristic evaluation is a process where experts use rules of thumb to measure the
usability of user interfaces in independent walkthroughs and report issues.

2. Evaluators use established heuristics (e.g., Nielsen and Molich’s) and reveal insights

that can help design teams enhance product usability from early in development.

3. A heuristic evaluation is one where an evaluator or group of evaluators walks

through the design of an interface and decides whether or not it complies with these
“rules of thumb”. (A sketch of how the number of evaluators affects the share of
problems found appears below.)
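As referenced in item 3, the payoff of adding evaluators can be estimated with the Nielsen-Landauer formula, found = 1 - (1 - L)^n, where L is the probability that a single evaluator finds a given problem (about 0.31 on average in Nielsen’s studies). A minimal sketch; treat the constant as illustrative, since L varies by system and evaluator skill.

```python
# Nielsen-Landauer estimate of the share of usability problems found
# by n independent evaluators: 1 - (1 - L)**n. L ~ 0.31 is the average
# single-evaluator detection rate reported by Nielsen (illustrative).
L = 0.31

for n in (1, 3, 5, 10):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} evaluator(s) -> ~{found:.0%} of problems found")
```

Running this shows why a small panel suffices: roughly 31% of problems are found with one evaluator, about 84% with five, and about 98% with ten, with diminishing returns beyond that.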

Multi-modal interaction

Overview
• Introduction
• Problem statement
• Technologies used
• Speech
• Hand gesture input
• Gaze tracking
• Design of the system
• Multimodal issues

Overview
• Testing
• program tests
• usability tests
• human factors studies
• Conclusions and recommendations
• Future work
• Video

Multimodal HCI
• Currently: mouse, keyboard input

• More natural communication technologies available:


• sight
• sound
• touch
• Robust and intelligent combination of these technologies

Multi-modal vs. Multi-media
• Multi-modal systems
• use more than one sense (or mode) of interaction
e.g. visual and aural senses: a text processor may speak the words as well
as echoing them to the screen

• Multi-media systems
• use a number of different media to communicate information
e.g. a computer-based teaching system may use video, animation, text, and
still images: different media all using the visual mode of interaction; it may
also use sounds, both speech and non-speech: two more media, now using
a different mode

Thank You
