Hci Unit 01

Chapter 1 introduces Human-Computer Interaction (HCI), emphasizing its importance in designing user-friendly interfaces across various technologies. It discusses the interplay between humans, computers, and interactions, highlighting the need for user-centric design principles to enhance usability and accessibility. The chapter also outlines the multi-disciplinary applications of HCI in fields such as healthcare, education, and entertainment, showcasing its relevance in improving user experiences and system efficiency.


chapter 1

the human
HCI
Reference: Alan Dix, Janet Finlay, Gregory D. Abowd, Russell Beale, Human–Computer Interaction, third edition

CH 1 : Introduction
Introduction to HCI, Importance of good interface design, Notions-
Human, Computer & Interaction. Multi-disciplinary Applications of
HCI.
Interfaces in the Real World
• Not just computers!
– Mobile
– Wristwatch
– Car Music system
– Copier
– Car
– Plane cockpit
– Airline reservation
– Air traffic control
– Running shoes!

Introduction to HCI

A few reasons why combining HCI with Front-End Development is valuable:


• User-Centric Design: HCI emphasizes designing technology solutions with the end-user in mind. By
incorporating HCI principles into Front-End Development, you can create interfaces that are intuitive, easy to
navigate, and aligned with user expectations.
• Enhanced User Experience (UX): HCI principles contribute to a positive UX. Considering factors like user
behavior, preferences, and accessibility can result in interfaces that are not only functional but also enjoyable for
users.
• Usability Testing: HCI involves methods like usability testing to evaluate the effectiveness of an interface.
Integrating these testing techniques into your Front-End Development process helps identify and address
potential issues before deployment.
• Adaptability and Accessibility: HCI promotes designing interfaces that are accessible to a diverse range of
users, including those with disabilities. Adhering to accessibility standards in your Front-End work ensures that
your applications are inclusive and can be used by a broader audience.
• Continuous Improvement: HCI encourages an iterative design process. Applying this approach to Front-End
Development means that you can continuously refine and improve your interfaces based on user feedback and
changing requirements.

• The human, the user, is, after all, the one whom computer systems are designed to assist.
• A person’s interaction with the outside world occurs through information being received and sent: input
and output. In an interaction with a computer the user receives information that is output by the
computer, and responds by providing input to the computer – the user’s output becomes the computer’s
input and vice versa.

• In today’s world, technology has permeated every aspect of our lives, from smartphones to smart homes.
However, this evolution was only possible by carefully considering user experience. This is where
human-computer interaction (HCI) comes into play. HCI is a specialized field of study and research
focused on creating user-friendly, interactive human–computer interfaces.

• HCI, which stands for Human-Computer Interaction, refers to studying and designing how humans interact
with computers and other technological systems. It encompasses understanding users’ behavior, needs, and
preferences and designing interfaces and interactions that are intuitive, efficient, and enjoyable. HCI
involves a multidisciplinary approach, drawing from psychology, design, computer science, and
ergonomics fields. By considering factors such as usability, accessibility, and user experience, HCI aims to
create user-centered technology that enhances human capabilities, ultimately improving the interaction and
communication between humans and machines.

• Human-computer interaction (HCI) is a multidisciplinary field of
study focusing on the design of computer technology and, in
particular, the interaction between humans (the users) and
computers. While initially concerned with computers, HCI has since
expanded to cover almost all forms of information technology
design.

• Important qualities of user interface design are the following:
1. Simplicity :
1. The user interface design should be simple.
2. Fewer mouse clicks and keystrokes should be required to accomplish a task.
3. New features should be added only if there is a compelling need for them and they add significant value
to the application.
2. Consistency :
1. The user interface should be consistent.
2. Consistency also prevents information chaos, ambiguity, and instability for users.
3. Typeface, style, and size conventions should be applied consistently to all screen components; this aids
screen learning and improves readability. Permanent objects can be provided as unchanging reference
points around which the user can navigate.
3. Intuitiveness :
1. The most important quality of good user interface design is intuitiveness.
2. An intuitive user interface is easy to learn, so users can pick it up quickly and easily.
3. Icons and labels should be concise and cogent. A clear, unambiguous icon helps make an interface
intuitive, and a good practice is to make labels conform to the terminology the application supports.
4. Prevention :
1. A good user interface design prevents users from performing an inappropriate task; this is
accomplished by disabling or “graying out” certain elements under certain conditions.
5. Forgiveness :
1. This quality encourages users to use the software to its full extent.
2. Designers should provide users with a way out when they find themselves somewhere they should not go.
6. Graphical User Interface Design :
1. A graphical user interface provides screen displays that create an operating environment for the user and
form an explicit visual and functional context for the user’s actions.
2. It includes standard objects such as buttons, icons, text fields, windows, images, and pull-down and
pop-up menus.
Importance of good interface design

Good interface design is important because it can:


• Improve user experience: A well-designed interface can enhance how users interact with a website or
application.
• Build trust: Good design can help build trust with customers and make a strong first impression.
• Increase sales: Good design can boost sales and improve a company's financial situation.
• Create a strong customer base: A good interface can create a strong link between a website and its
customers.
• Reduce problems: A good interface can reduce problems and increase user involvement.
• Improve functionality: A good interface can refine and perfect functionality.
• Ensure compatibility: Identifying interfaces can help ensure compatibility between a system and other
systems.
• Avoid cost overruns: Missing or incorrectly defined interfaces can cause cost overruns and product failures.

Some qualities of good interface design include:


• Being user-centered
• Having easy-to-understand navigation
• Being clear and concise
• Being consistent
• Using simple language and terminology

• Notions of Human, Computer, and Interaction
• Human-Computer Interaction (HCI) is an interdisciplinary field focused on the design, evaluation, and implementation of interactive
systems for human use. It involves understanding how humans interact with computers and designing systems that improve this
interaction. To grasp the core of HCI, we must understand the three key notions:

1. Human
• The human element in HCI represents the users interacting with the system. It involves studying human capabilities, limitations, and
needs.
• Characteristics:
• Cognitive Abilities: Humans process information differently, influenced by memory, attention, learning, and problem-solving.
• Physical Abilities: Include motor skills like typing or touch gestures.
• Perceptual Abilities: Vision, hearing, and touch are critical in understanding interface design.
• Emotional Responses: The user’s feelings, such as frustration or satisfaction, play a role in how they perceive and interact with a
system.
• Goals for HCI:
• Design systems that cater to human strengths and compensate for weaknesses.
• Ensure usability by considering accessibility, ergonomics, and user diversity.
2. Computer
• The computer refers to the hardware and software systems with which humans interact. This includes devices, applications, and digital
environments.
• Characteristics:
• Hardware: Includes physical components like keyboards, monitors, touch screens, and wearable devices.
• Software: Operating systems, applications, and interfaces are the layers humans engage with.
• Processing Power: Determines responsiveness and capability for real-time interactions.
• Interaction Models: Such as Command-Line Interfaces (CLI), Graphical User Interfaces (GUI), Natural User Interfaces (NUI), and
Virtual/Augmented Reality (VR/AR).
• Goals for HCI:
• Build systems that respond efficiently to user actions.
• Develop interfaces that communicate system capabilities clearly and intuitively

3. Interaction
• Interaction is the dynamic exchange between humans and computers to achieve a specific goal. It involves
input from humans and feedback from computers.
• Characteristics:
• Input Methods: Includes typing, clicking, swiping, voice commands, gestures, or gaze tracking.
• Output Methods: Visual (screens), auditory (speakers), and tactile (vibration/haptic feedback).
• Feedback Mechanisms: The system’s response to user input, such as a confirmation sound, a progress bar, or
an error message.
• Context of Use: Interaction is influenced by the environment, task complexity, and user expertise.
• Goals for HCI:
• Create seamless, efficient, and engaging interaction processes.
• Minimize user errors and support effective task completion.

The Interplay of Human, Computer, and Interaction


• The essence of HCI lies in designing systems that:
1. Recognize human capabilities and limitations.
2. Use computer systems’ strengths effectively.
3. Facilitate efficient and satisfying interactions.
• By understanding these three notions and their relationships, designers can build systems that not only meet
functional requirements but also enhance the overall user experience.

• Multi-Disciplinary Applications of HCI
• Human-Computer Interaction (HCI) is inherently multi-disciplinary, blending insights and methods from various fields to
design interactive systems that meet user needs. Its applications span diverse domains, reflecting its adaptability and
relevance in solving real-world challenges. Below is an overview of some key multi-disciplinary applications of HCI:

1. Healthcare
• HCI applications in healthcare improve patient outcomes, medical training, and operational efficiency.
• Telemedicine: User-friendly interfaces for remote consultations, appointment scheduling, and medical record access.
• Medical Devices: Usability of diagnostic tools (e.g., MRI machines, glucose monitors) to ensure safety and efficiency.
• Assistive Technologies: Devices for individuals with disabilities, such as screen readers, prosthetics, and communication
aids.
• Health Apps: Fitness trackers and mobile apps for monitoring physical activity, diet, and mental health.

2. Education and E-Learning


• HCI plays a significant role in creating accessible, engaging, and effective learning environments.
• Learning Platforms: Interactive tools like MOOCs (e.g., Coursera, Khan Academy) that support diverse learning styles.
• Educational Games: Gamified learning experiences to enhance engagement and retention.
• Accessibility: Adaptive interfaces for learners with disabilities, such as text-to-speech tools and closed captioning.
• Virtual and Augmented Reality (VR/AR): Immersive experiences for practical learning, such as virtual labs or historical
reconstructions.

3. Business and E-Commerce


• HCI enhances user experiences and drives customer satisfaction in commercial systems.
• User Interfaces (UIs): Intuitive designs for websites, apps, and software platforms.
• Customer Support: Chatbots, virtual assistants, and automated support systems.
• Data Visualization: Dashboards and tools for analyzing business metrics.
• Personalization: AI-driven recommendations based on user behavior and preferences.

4. Entertainment and Gaming
• HCI innovations transform how users interact with entertainment systems.
• Game Design: User-centered approaches to create immersive and accessible games.
• Streaming Platforms: Interfaces for services like Netflix, Spotify, and YouTube, emphasizing simplicity and engagement.
• Immersive Experiences: VR/AR for gaming, storytelling, and concerts.
• Accessibility: Features like customizable controls and audio descriptions for inclusivity.

5. Transportation
• HCI contributes to designing safer and more efficient transportation systems.
• Autonomous Vehicles: Interfaces for monitoring and controlling self-driving cars.
• Navigation Systems: User-friendly GPS interfaces and voice-guided systems.
• Public Transportation Apps: Tools for trip planning, real-time updates, and ticketing.
• Aircraft Cockpits: Usability improvements for pilots to ensure flight safety.

6. Social Media and Communication


• HCI principles are crucial in designing platforms for connectivity and expression.
• Social Networking Sites: Intuitive interfaces for platforms like Facebook, Instagram, and LinkedIn.
• Messaging Apps: Features like emojis, GIFs, and reactions for enhanced interaction.
• Accessibility: Support for multilingual users and assistive technologies.

7. Smart Cities and Internet of Things (IoT)


• HCI drives the usability of systems designed for urban living and connected devices.
• Smart Home Interfaces: Apps and voice commands for managing devices like thermostats and security cameras.
• Urban Infrastructure: Interfaces for monitoring traffic, pollution, and energy consumption.
• Wearable Technology: Smartwatches and fitness bands for health tracking.

Human-Computer Interaction (HCI) has many applications, including:
• Virtual reality
HCI can create more immersive and natural interfaces that provide users with a new perspective.
• Voice search
Voice search apps like Amazon Alexa and Google Voice Search allow users to interact with devices by
speaking.
• Eye-tracking
Eye-tracking systems can help businesses monitor employee focus and distractions.
• Emotion recognition
Emotion recognition can help HCI applications better understand users' preferences and intentions.
• Ergonomics
HCI can help create user-friendly interfaces and systems that cater to users' needs.
• Automated teller machines
ATMs use HCI to make the process of withdrawing and depositing cash efficient.
• Self-service kiosks
Kiosks use HCI to allow users to purchase tickets efficiently.
• Hospitality
HCI is used in the hospitality industry to manage hotel operations with a central computer.
• Healthcare
HCI can be used to enhance patient safety and optimize healthcare processes.
Other applications of HCI include:
• Wearable computing
• Internet of Things (IoT)
• Artificial intelligence (AI)
• Accessibility
• Intelligent tutoring
• Computer games
• E-Health applications

HCI
• Graphical user interfaces (GUIs). Users interact with visual representations
on digital control panels. A website menu is a GUI.
• Voice-controlled interfaces (VUIs). Users interact with these through their
voices. Most smart assistants (Google Assistant, Cortana, Siri on iPhone, and
Alexa on Amazon devices) are VUIs.
• Gesture-based interfaces. Users engage with 3D design spaces through
bodily motions: e.g., in virtual reality (VR) apps and games.
the human

• Information i/o …
– visual, auditory, haptic, movement
• Information stored in memory
– sensory, short-term, long-term
• Information processed and applied
– reasoning, problem solving, skill, error
• Emotion influences human capabilities
• Each person is different
Vision

Two stages in vision:
• physical reception of stimulus
• processing and interpretation of stimulus
The Eye - physical reception

• mechanism for receiving light and transforming it into electrical energy
• light reflects from objects
• images are focused upside-down on the retina
• retina contains rods for low light vision and cones for colour vision
• ganglion cells (brain!) detect pattern and movement
Human
• The capabilities and limitations of visual processing

Optical Illusions

Reading

• Several stages:
– visual pattern perceived
– decoded using internal representation of language
– interpreted using knowledge of syntax, semantics, pragmatics
• Reading involves saccades and fixations
• Perception occurs during fixations
• Word shape is important to recognition
• Negative contrast improves reading from computer screen
Hearing

• Provides information about environment: distances, directions, objects etc.
• Physical apparatus:
– outer ear – protects inner ear and amplifies sound
– middle ear – transmits sound waves as vibrations to inner ear
– inner ear – chemical transmitters are released and cause impulses in auditory nerve
• Sound
– pitch – sound frequency
– loudness – amplitude
– timbre – type or quality
Hearing (cont)

• Humans can hear frequencies from 20Hz to 15kHz
– less accurate distinguishing high frequencies than low.
• Auditory system filters sounds
– can attend to sounds over background noise.
– for example, the cocktail party phenomenon.
Touch

• Provides important feedback about environment.
• May be key sense for someone who is visually impaired.
• Stimulus received via receptors in the skin:
– thermoreceptors – heat and cold
– nociceptors – pain
– mechanoreceptors – pressure
(some instant, some continuous)
• Some areas more sensitive than others e.g. fingers.
• Kinesthesis - awareness of body position
– affects comfort and performance.
Movement

• Time taken to respond to stimulus:
reaction time + movement time
• Movement time dependent on age, fitness etc.
• Reaction time - dependent on stimulus type:
– visual ~ 200ms
– auditory ~ 150ms
– pain ~ 700ms
• Increasing reaction time decreases accuracy in the unskilled operator
but not in the skilled operator.
Movement (cont)
• Fitts’ Law describes the time taken to hit a screen target:
• This scientific law predicts that the time required to rapidly move to a target
area is a function of the ratio between the distance to the target and the
width of the target. Thus, the longer the distance and the smaller the
target’s size, the longer it takes.

Mt = a + b log2(D/S + 1)

where:
– Mt is movement time
– a and b are empirically determined constants
– D is distance to the target
– S is size (width) of the target

• Design implications:
– make targets as large as possible
– make distances as small as possible
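A minimal Python sketch of the formula above; the constants a and b here are illustrative placeholders, not empirically measured values:

```python
import math

def movement_time(distance: float, size: float,
                  a: float = 0.1, b: float = 0.1) -> float:
    """Predicted movement time Mt = a + b * log2(D/S + 1)
    for a target of width `size` at `distance` (Shannon formulation)."""
    return a + b * math.log2(distance / size + 1)

# Bigger targets and shorter distances reduce predicted movement time:
print(movement_time(distance=200, size=20))  # far, small target
print(movement_time(distance=50, size=60))   # near, large target
```

The comparison illustrates the design implications listed above: the far, small target yields a noticeably larger Mt than the near, large one.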
Deductive Reasoning

• Deduction:
– derive logically necessary conclusion from given premises.
e.g. If it is Friday then she will go to work
It is Friday
Therefore she will go to work.
• Logical conclusion not necessarily true:
e.g. If it is raining then the ground is dry
It is raining
Therefore the ground is dry
Inductive Reasoning

• Induction:
– generalize from cases seen to cases unseen
e.g. all elephants we have seen have trunks
therefore all elephants have trunks.

… but useful!
• Humans not good at using negative evidence
e.g. Wason's cards.
Abductive reasoning

• reasoning from event to cause
e.g. Sam drives fast when drunk.
If I see Sam driving fast, assume drunk.
• Unreliable:
– can lead to false explanations
the computer
The Computer

a computer system is made up of various elements

each of these elements affects the interaction
– input devices – text entry and pointing
– output devices – screen (small & large), digital paper
– virtual reality – special interaction and display devices
– physical interaction – e.g. sound, haptic, bio-sensing
– paper – as output (print) and input (scan)
– memory – RAM & permanent media, capacity & access
– processing – speed of processing, networks
Keyboards

• Most common text input device
• Allows rapid entry of text by experienced users
• Keypress closes connection, causing a character code to be sent
• Usually connected by cable, but can be wireless
layout – QWERTY

• Standardised layout
but …
– non-alphanumeric keys are placed differently
– accented symbols needed for different scripts
– minor differences between UK and USA keyboards
• QWERTY arrangement not optimal for typing
– layout to prevent typewriters jamming!
• Alternative designs allow faster typing but large social
base of QWERTY typists produces reluctance to change.
QWERTY (ctd)
special keyboards

• designs to reduce fatigue for RSI
• for one handed use
e.g. the Maltron left-handed keyboard
phone pad and T9 entry
• use numeric keys with multiple presses
2 – abc    3 – def    4 – ghi    5 – jkl
6 – mno    7 – pqrs   8 – tuv    9 – wxyz
hello = 4433555[pause]555666
surprisingly fast!
• T9 predictive entry
– type as if single key for each letter
– use dictionary to ‘guess’ the right word
– hello = 43556 …
– but 26 -> menu ‘am’ or ‘an’
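The multi-tap scheme can be sketched in Python; the pause needed between two consecutive letters on the same key is rendered here as a space, matching the [pause] in the example above:

```python
# Multi-tap text entry on a standard phone keypad.
KEYPAD = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}
LETTER_TO_KEY = {ch: key for key, letters in KEYPAD.items()
                 for ch in letters}

def multitap(word: str, pause: str = ' ') -> str:
    """Encode a word as multi-tap key presses; the letter's position
    on its key determines how many times the key is pressed."""
    out = []
    prev_key = None
    for ch in word.lower():
        key = LETTER_TO_KEY[ch]
        presses = key * (KEYPAD[key].index(ch) + 1)
        if key == prev_key:   # same key twice in a row needs a pause
            out.append(pause)
        out.append(presses)
        prev_key = key
    return ''.join(out)

print(multitap('hello'))  # 4433555 555666
```

Note the double 'l' in "hello": both letters live on key 5, which is why the pause is required between them.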
Speech recognition

• Improving rapidly
• Most successful when:
– single user – initial training and learns peculiarities
– limited vocabulary systems

• Problems with
– external noise interfering
– imprecision of pronunciation
– large vocabularies
– different speakers
Numeric keypads

• for entering numbers quickly:
– calculator, PC keyboard
• for telephones

not the same!!

telephone:    calculator:
1 2 3         7 8 9
4 5 6         4 5 6
7 8 9         1 2 3
* 0 #         0 . =

ATM like phone
positioning, pointing and drawing

mouse, touchpad
trackballs, joysticks etc.
touch screens, tablets
eyegaze, cursors
Stylus and light pen

Stylus
– small pen-like pointer to draw directly on screen
– may use touch sensitive surface or magnetic detection
– used in PDAs, tablet PCs and drawing tablets

Light Pen
– now rarely used
– uses light from screen to detect location

BOTH …
– very direct and obvious to use
– but can obscure screen
Eyegaze

• control interface by eye gaze direction
– e.g. look at a menu item to select it
• uses laser beam reflected off retina
– … a very low power laser!
• https://www.youtube.com/watch?v=XrVJ1JaMSew
Cursor keys

• Four keys (up, down, left, right) on keyboard.
• Very, very cheap, but slow.
• Useful for not much more than basic motion for text-editing tasks.
• No standardised layout, but inverted “T” is most common.
display devices

bitmap screens (CRT & LCD)
large & situated displays
digital paper
bitmap displays

• screen is vast number of coloured dots

Cathode ray tube
• Stream of electrons emitted from electron gun, focused and directed by
magnetic fields, hit phosphor-coated screen which glows
• used in TVs and computer monitors
Liquid crystal displays

• How it works …
– Top plate transparent and polarised, bottom plate reflecting.
– Light passes through top plate and crystal, and reflects back to
eye.
– Voltage applied to crystal changes polarisation and hence colour
– N.B. light reflected not emitted => less eye strain
– https://www.youtube.com/watch?v=82KaxKyuwV4
special displays

Random Scan (Directed-beam refresh, vector display)
– draw the lines to be displayed directly
– no jaggies
– lines need to be constantly redrawn
– rarely used except in special instruments

Direct view storage tube (DVST)
– Similar to random scan but persistent => no flicker
– Can be incrementally updated but not selectively erased
– Used in analogue storage oscilloscopes
virtual reality and 3D interaction

positioning in 3D space
moving and grasping
seeing 3D (helmets and caves)

positioning in 3D space

• the 3D mouse
– six degrees of freedom: x, y, z position plus roll, pitch, yaw
• data glove
– fibre optics used to detect finger position
• VR helmets
– detect head motion and possibly eye gaze
• whole body tracking
– accelerometers strapped to limbs or reflective dots and video processing
physical controls

• specialist controls needed …
– industrial controls, consumer products, etc.
• examples from the illustrations: easy-clean smooth buttons, a
multi-function control, large buttons with clear dials, tiny buttons
paper: printing and scanning

print technology
fonts, page description, WYSIWYG
scanning, OCR

Types of dot-based printers
• dot-matrix printers
– use inked ribbon (like a typewriter)
– line of pins that can strike the ribbon, dotting the paper
– typical resolution 80-120 dpi
• ink-jet and bubble-jet printers
– tiny blobs of ink sent from print head to paper
– typically 300 dpi or better
• laser printer
– like photocopier: dots of electrostatic charge deposited on drum, which
picks up toner (black powder form of ink) rolled onto paper which is then
fixed with heat
– typically 600 dpi or better.
Fonts
• Font – the particular style of text
Courier font
Helvetica font
Palatino font
Times Roman font
• ×∝≡ℜ⊗ (special symbol font)
• Size of a font measured in points (1 pt is about 1/72”),
(vaguely) related to its height
This is ten point Helvetica
This is twelve point
This is fourteen point
This is eighteen point
and this is twenty-four point
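The point-size arithmetic above (1 pt = 1/72 inch) can be sketched as a conversion to pixels for a display of known density; the 96 dpi default below is the common CSS reference density, used here only as an example:

```python
# 1 point = 1/72 inch, so a glyph of `points` size spans
# points / 72 inches, i.e. points * dpi / 72 device pixels.

def points_to_pixels(points: float, dpi: float = 96.0) -> float:
    """Convert a font size in points to pixels at the given density."""
    return points * dpi / 72.0

print(points_to_pixels(12))        # 16.0 px at 96 dpi
print(points_to_pixels(72, 300))   # 300.0 dots: a 1-inch glyph on a 300 dpi printer
```

This is why the same 12 pt font occupies a different number of dots on a 96 dpi screen and a 600 dpi laser printer while keeping the same physical height.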
chapter 3

the interaction
The Interaction

• interaction models
• translations between user and system
• ergonomics
– physical characteristics of interaction
• interaction styles
• the nature of user/system dialog
• context
– social, organizational, motivational
Gestalt theory

• Proximity
• Similarity
• Continuity
• Synchrony
Proximity

Definition: Objects that are close together are perceived as a group.

Example: Form fields for "First Name" and "Last Name" placed close together are seen as
related inputs.
Similarity

Definition: Elements that look similar (in color, shape, size) are seen as belonging together.

Example: Menu items in the same font and size look like part of the same group.
Continuity (or Good Continuation)

Definition: Users tend to follow lines or curves; elements aligned in a path are seen as related.

Example: Navigation links arranged in a horizontal line are seen as a continuous menu.
Common Fate (Synchrony)

Definition: Elements that move together are perceived as part of the same group.

Example: Animations where buttons move together when a menu expands indicate they’re functionally
related.
What is interaction?

communication between user and system

but is that all … ?
– see “language and action” in chapter 4 …
models of interaction

terms of interaction
Norman model
interaction framework
Some terms of interaction

domain – the area of work under study
e.g. graphic design
goal – what you want to achieve
e.g. create a solid red triangle
task – how you go about doing it
– ultimately in terms of operations or actions
e.g. … select fill tool, click over triangle
Donald Norman’s model

• Seven stages
– user establishes the goal
– formulates intention
– specifies actions at interface
– executes action
– perceives system state
– interprets system state
– evaluates system state with respect to goal

• Norman’s model concentrates on user’s view of the interface
an example based on a user using a smart thermostat at home:
1. Goal Formation
•The user forms a goal they want to achieve.
•Example: The user wants to set the room temperature to 72°F because it feels too cold.
2. Forming the Intention
•The user decides on a course of action to achieve the goal.
•Example: The user decides to adjust the thermostat using the mobile app.
3. Specifying an Action
•The user plans the specific action required to achieve their intention.
•Example: The user decides to open the thermostat app, navigate to the temperature control screen,
and adjust the temperature to 72°F.
4. Executing the Action
•The user performs the planned action.
•Example: The user opens the app, selects the thermostat device, and slides the temperature control to
72°F.
5. Perceiving the State of the System
•The user observes the system's feedback to their action.
•Example: The app displays that the temperature has been set to 72°F, and the thermostat emits a
confirmation beep.
6. Interpreting the State of the System
•The user interprets the feedback to ensure the action had the desired effect.
•Example: The user sees that the app and the thermostat both show the updated temperature,
indicating the action was successful.
7. Evaluating the Outcome
•The user compares the new system state with their original goal.
•Example: After a while, the user notices the room feels comfortable at 72°F, confirming that their goal
has been achieved.
•This process is iterative, and if any stage fails (e.g., the thermostat doesn’t adjust the temperature),
the user may need to repeat or revise their actions.
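The iterative cycle described above can be sketched as a simple loop; the thermostat functions (`read_temperature`, `set_temperature`) are hypothetical stand-ins for the smart-thermostat example, not a real API:

```python
# A minimal sketch of Norman's execution/evaluation cycle as a loop:
# execute an action toward the goal, then perceive, interpret, and
# evaluate the system state; repeat or revise if the goal is not met.

def norman_cycle(goal_temp, read_temperature, set_temperature,
                 max_attempts=3):
    """Repeat execute-then-evaluate until the goal state is observed."""
    for _ in range(max_attempts):
        # execution side: formulate intention, specify and execute action
        set_temperature(goal_temp)
        # evaluation side: perceive, interpret, evaluate system state
        if read_temperature() == goal_temp:
            return True    # goal achieved; cycle ends
    return False           # goal not met; user must revise their actions

# usage with a toy "thermostat" whose state is a dict:
state = {'temp': 65}
ok = norman_cycle(72,
                  read_temperature=lambda: state['temp'],
                  set_temperature=lambda t: state.update(temp=t))
print(ok)  # True
```

The `max_attempts` cap mirrors the observation that a user abandons or revises the plan when repeated attempts fail to close the gap between goal and perceived state.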
execution/evaluation loop

[diagram: goal → execution → system → evaluation → back to goal]

• user establishes the goal
• formulates intention
• specifies actions at interface
• executes action
• perceives system state
• interprets system state
• evaluates system state with respect to goal
Using Norman’s model

Some systems are harder to use than others.

Gulf of Execution
The gap between what the user wants to do (goal/intention) and the
actions provided by the system to achieve that goal.

Gulf of Evaluation
The gap between the system’s output and the user’s ability to understand
whether their action was successful – the degree to which the system
provides representations that can be directly perceived and interpreted in
terms of the expectations and intentions of the user.
Abowd and Beale framework

extension of Norman…
their interaction framework has 4 parts:
– user (U)
– input (I)
– system (S)
– output (O)

[diagram: system core S and user task U linked through input I and output O]

each has its own unique language
interaction ⇒ translation between languages

problems in interaction = problems in translation
Using Abowd & Beale’s model

user intentions
→ translated into actions at the interface
→ translated into alterations of system state
→ reflected in the output display
→ interpreted by the user

a general framework for understanding interaction
– not restricted to electronic computer systems
– identifies all major components involved in interaction
– allows comparative assessment of systems
– an abstraction
ergonomics

physical aspects of interfaces
industrial interfaces

Ergonomics

• Study of the physical characteristics of interaction
• Also known as human factors – but this term can also be used to mean much of HCI!
• Ergonomics is good at defining standards and guidelines for constraining the way we design certain aspects of systems
Ergonomics - examples

• arrangement of controls and displays
  e.g. controls grouped according to function or frequency of use, or sequentially
• surrounding environment
  e.g. seating arrangements adaptable to cope with all sizes of user
• health issues
  e.g. physical position, environmental conditions (temperature, humidity), lighting, noise
• use of colour
  e.g. use of red for warning, green for okay, awareness of colour-blindness etc.
interaction styles

interaction as a dialogue between computer and user

distinct styles of interaction
Common interaction styles

• command line interface
• menus
• natural language
• question/answer and query dialogue
• form-fills and spreadsheets
• WIMP
• point and click
• three-dimensional interfaces
Command line interface

• Way of expressing instructions to the computer directly
  – function keys, single characters, short abbreviations, whole words, or a combination
• suitable for repetitive tasks
• better for expert users than novices
• offers direct access to system functionality
• command names/abbreviations should be meaningful!

Typical example: the Unix system
Menus

• Set of options displayed on the screen
• Options are visible
  – less recall – easier to use
  – rely on recognition, so names should be meaningful
• Selection by:
  – numbers, letters, arrow keys, mouse
  – combination (e.g. mouse plus accelerators)
• Options often hierarchically grouped
  – sensible grouping is needed
• Restricted form of full WIMP system
Natural language

• Familiar to the user
• speech recognition or typed natural language
• Problems – ambiguity, e.g.
  “Boy hit the dog with the stick”
  (did the boy use the stick, or was it the dog with the stick?)
• Solutions
  – try to understand a subset of the language
  – pick out key words
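The “pick out key words” strategy can be sketched as a tiny interpreter: rather than parsing full natural language, it just scans the input for known command words. The keyword table and action names below are hypothetical, invented for illustration.

```python
# Minimal key-word spotting: ignore grammar, look only for known commands.
KEYWORDS = {"open": "open_file", "save": "save_file", "quit": "exit_app"}

def interpret(utterance):
    """Return the action for the first recognised key word, else None."""
    for word in utterance.lower().split():
        if word in KEYWORDS:
            return KEYWORDS[word]
    return None  # no key word recognised

print(interpret("Please save my work"))   # save_file
print(interpret("Boy hit the dog"))       # None
```

This sidesteps ambiguity by refusing to interpret anything outside its small vocabulary – which is exactly the trade-off the slide describes.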
Query interfaces

• Question/answer interfaces
  – user led through the interaction via a series of questions
  – suitable for novice users but restricted functionality
  – often used in information systems
• Query languages (e.g. SQL)
  – used to retrieve information from a database
  – require understanding of the database structure and language syntax, hence require some expertise
Form-fills

• Primarily for data entry or data retrieval
• Screen is like a paper form
• Data put in the relevant place
• Requires
  – good design
  – obvious correction facilities
Spreadsheets

• first spreadsheet was VisiCalc, followed by Lotus 1-2-3; MS Excel is most common today
• sophisticated variation of form-filling
  – grid of cells, each containing a value or a formula
  – a formula can involve the values of other cells, e.g. the sum of all cells in this column
  – the user can enter and alter data; the spreadsheet maintains consistency
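The cell/formula idea can be sketched with a tiny grid in which a formula cell is recomputed from the values of other cells, so consistency is maintained automatically when data changes. This is a toy model for illustration, not how any real spreadsheet is implemented.

```python
# Toy spreadsheet: a cell holds either a plain value or a formula
# defined over the other cells.
cells = {"A1": 10, "A2": 20, "A3": 30}
formulas = {"A4": lambda c: c["A1"] + c["A2"] + c["A3"]}  # sum of the column

def get(ref):
    """Look up a cell, evaluating its formula if it has one."""
    return formulas[ref](cells) if ref in formulas else cells[ref]

print(get("A4"))   # 60
cells["A1"] = 15   # the user alters data ...
print(get("A4"))   # 65 ... and the dependent cell stays consistent
```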
WIMP Interface

Windows
Icons
Menus
Pointers

… or windows, icons, mice, and pull-down menus!

• default style for the majority of interactive computer systems, especially PCs and desktop machines
Point and click interfaces

• used in:
  – multimedia
  – web browsers
  – hypertext
• just click something!
  – icons, text links or a location on a map
• minimal typing
Three-dimensional interfaces

• virtual reality
• ‘ordinary’ window systems
  – highlighting flat buttons …
  – visual affordance says “click me!”
  – but indiscriminate use is just confusing!
• 3D workspaces … or sculptured interfaces
  – use the third dimension for extra virtual space
  – light and occlusion give depth
  – distance effects
elements of the wimp interface

windows, icons, menus, pointers
plus buttons, toolbars, palettes, dialog boxes

also see supplementary material on choosing wimp elements
Windows

• Areas of the screen that behave as if they were independent
  – can contain text or graphics
  – can be moved or resized
  – can overlap and obscure each other, or can be laid out next to one another (tiled)
• scrollbars
  – allow the user to move the contents of the window up and down or from side to side
• title bars
  – describe the name of the window
Icons

• small picture or image
• represents some object in the interface
  – often a window or action
• windows can be closed down (iconised)
  – small representation ⇒ many accessible windows
• icons can be many and various
  – highly stylized
  – realistic representations
Pointers

• important component
  – the WIMP style relies on pointing and selecting things
• uses mouse, trackpad, joystick, trackball, cursor keys or keyboard shortcuts
• wide variety of graphical images
Menus

• Choice of operations or services offered on the screen
• Required option selected with the pointer

problem – menus take a lot of screen space
solution – pop-up: menu appears only when needed
Kinds of Menus

• Menu bar at top of screen (normally), menu drags down
  – pull-down menu – mouse hold and drag down menu
  – drop-down menu – mouse click reveals menu
  – fall-down menu – mouse just moves over bar!
• Contextual menu appears where you are
  – pop-up menus – actions for selected object / right click
  – pie menus – options arranged in a circle
    • easier to select an item (larger target area)
    • quicker (same distance to any option)
    … but not widely used!
Menus extras

• Cascading menus
  – hierarchical menu structure
  – menu selection opens a new menu
• Keyboard accelerators
  – key combinations with the same effect as a menu item
  – two kinds:
    • active when menu open – usually first letter
    • active when menu closed – usually Ctrl + letter
    usually different!!!
Menus design issues

• which kind to use
• what to include in menus at all
• which words to use (action or description)
• how to group items
• choice of keyboard accelerators
Buttons

• individual and isolated regions within a display that can be selected to invoke an action
• Special kinds
  – radio buttons – a set of mutually exclusive choices
  – check boxes – a set of non-exclusive choices
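The difference between the two kinds can be modelled directly in code. The classes below are an illustrative sketch of the selection logic only, not the API of any real GUI toolkit.

```python
class RadioGroup:
    """Mutually exclusive: selecting one option replaces any previous choice."""
    def __init__(self, options):
        self.options = options
        self.selected = None
    def select(self, option):
        self.selected = option          # at most one option selected

class CheckBoxes:
    """Non-exclusive: any subset of the options may be selected."""
    def __init__(self, options):
        self.options = options
        self.selected = set()
    def toggle(self, option):
        self.selected ^= {option}       # add if absent, remove if present

size = RadioGroup(["S", "M", "L"])
size.select("S")
size.select("L")
print(size.selected)                    # L  (only one at a time)

toppings = CheckBoxes(["cheese", "olives", "ham"])
toppings.toggle("cheese")
toppings.toggle("ham")
print(sorted(toppings.selected))        # ['cheese', 'ham']
```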
Toolbars

• long lines of icons … but what do they do?
• fast access to common actions
• often customizable:
  – choose which toolbars to see
  – choose what options are on them
Palettes and tear-off menus

• Problem: the menu is not there when you want it
• Solution:
  – palettes – little windows of actions
    • shown/hidden via a menu option
    • e.g. the available shapes in a drawing package
  – tear-off and pin-up menus
    • a menu ‘tears off’ to become a palette
Dialogue boxes

• information windows that pop up to inform the user of an important event or to request information

e.g. when saving a file, a dialogue box is displayed to allow the user to specify the filename and location. Once the file is saved, the box disappears.
• https://www.interaction-design.org/literature/topics/human-computer-interaction?srsltid=AfmBOorz0W5Ut_dEK7zsCtUrJYE_MA1-c8cYmbjEkb71lScv6XSW1uWr#what_is_human-computer_interaction_(hci)?-0

• https://www.youtube.com/watch?v=cK3YXzvfZIE&list=PLlgPXNRcnX3EtESM59QVohcQBJ6N7H4JF&index=3
How do Humans Interact with Computers. A Cycle of Interaction in HCI