HCI Automation
INTERACTION
Christopher Tan
PSY 340 · christopher.tan@help.edu.my
OVERVIEW
• Human-computer interaction
o HCI principles in UI design
• The study of how people use complex technological artifacts (May, 2001)
o And how these artifacts can be designed to facilitate this use
• HCI activity → Computer mediating between user and task (Card, 2014)
o Task can be done without computer but with other tools
o Similar in principle, but not HCI
• Asks the question: How can technology help people achieve their goals?
o How does interface design support goal-directed behaviour?
• Examples:
o Personal change – health & fitness, time management, financial monitoring
o Consumers – shopping, banking, gaming
o Work – productivity, collaboration
o Social relationships – social media, communication
HUMAN-COMPUTER INTERACTION
User experience & interface
• Usability
o Learnability – ease of understanding the system
o Ease and efficiency of achieving goals
• E.g., Workers performing same task may configure their computers differently
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Make system robust to errors
• Make commission of severe errors difficult – i.e., avoid accidental activation
o Confirmation buttons, warnings, multiple steps
• User must know that error occurred, what the error is, and how to undo it
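The error-robustness guidelines above can be sketched in code. This is a minimal, illustrative pattern (all class and method names here are hypothetical, not from the lecture): a severe action demands explicit confirmation, reports what happened, and remains undoable.

```python
# Illustrative sketch of error-robust interaction design:
# a destructive action is hard to trigger by accident, the system
# tells the user what occurred, and the action can be undone.

class RecordStore:
    def __init__(self, records):
        self.records = list(records)
        self._undo_stack = []  # keeps deleted batches so errors are reversible

    def delete_all(self, confirm_phrase):
        # Make commission of a severe error difficult: require a typed
        # phrase (a "multiple steps" safeguard), not a single click.
        if confirm_phrase != "DELETE ALL":
            return "Nothing deleted: type 'DELETE ALL' to confirm."
        self._undo_stack.append(self.records)
        self.records = []
        # Tell the user that the action occurred, what it was, and how to undo it.
        return "All records deleted. Use undo() to restore them."

    def undo(self):
        if not self._undo_stack:
            return "Nothing to undo."
        self.records = self._undo_stack.pop()
        return "Restored deleted records."
```

The confirmation phrase plays the role of the confirmation buttons and warnings mentioned above; the undo stack ensures the error, once noticed, can be reversed.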
WEBSITE & APPLICATION DESIGN
Guidelines & practices
• HCI in typical computers focused on supporting info search & retrieval
o How easily & efficiently can the user navigate the system?
• Legibility
o Font choice – Arial, Helvetica, Verdana, etc.
o Size
• Navigation
o Good UI design guides users through task; makes navigation easy
o Simplify steps needed to achieve goals
o “3 click rule” → find important info within 3 clicks
• Always doable? Not necessarily – large, deep content hierarchies may require more than 3 clicks
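One way to audit the "3 click rule" is to treat a site as a link graph and compute the minimum number of clicks from the home page to every other page with a breadth-first search. A small sketch, assuming a simple adjacency-list site map (page names are hypothetical):

```python
from collections import deque

def click_depths(site_map, home):
    """Breadth-first search over a site's link graph: returns the
    minimum number of clicks needed to reach each page from `home`."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for linked in site_map.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

# Hypothetical site map: each page lists the pages it links to.
site = {
    "home": ["products", "about"],
    "products": ["laptops"],
    "laptops": ["laptop-x"],
}
depths = click_depths(site, "home")
# Pages deeper than 3 clicks violate the guideline.
hard_to_reach = [p for p, d in depths.items() if d > 3]
```

Here "laptop-x" sits exactly at the 3-click limit; any page the search cannot reach at depth ≤ 3 would surface in `hard_to_reach`.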
• Readability – The “F-Shaped Visual Scan”
o Users scan computer screens in an F-shaped pattern (Nielsen, 2006)
• The pattern applies to the content area, not the page as a whole
o The more an ad looks like a native site component, the more fixation time it gets
• For web designers → ensure page content does not resemble banner ads; keep all components consistent with the other design elements
• For advertisers → disguise ads so they look like part of the webpage they are placed on
Wearable technology
• HCI integrated into clothing/accessories → practical functions/features
o E.g., Fitbit, Apple Watch, Android Wear, Google Glass
Computers in cars
• Cars now provide drivers with a lot of info and added functionality/features
o E.g., entertainment, vehicle control, connectivity, automated activities
HCI design considerations
• Limit distractions & amount of visual info on displays
o Distracts drivers from primary task of driving
o E.g., Lengthy written texts, info clutter
• Simplify interactions
o Reduce no. of options, steps, & screens
o Systems demanding glances longer than 2 s increase crash risk
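The 2 s glance guideline above lends itself to a simple screening check over logged glance data. A sketch, assuming glance durations have already been measured per interaction task (task names and data are hypothetical):

```python
def risky_glances(glance_durations_s, limit_s=2.0):
    """Flag in-vehicle display tasks whose individual glance durations
    exceed the limit (2 s, per the guideline above).
    Input: mapping of task name -> list of glance durations in seconds.
    Returns only the offending tasks with their over-limit glances."""
    return {
        task: [d for d in durations if d > limit_s]
        for task, durations in glance_durations_s.items()
        if any(d > limit_s for d in durations)
    }
```

For example, a radio-tuning task with glances of 0.8 s and 1.2 s would pass, while an address-entry task with a 3.4 s glance would be flagged for redesign (fewer options, steps, and screens).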
AUTOMATION
OVERVIEW
• Benefits of automation
Examples:
• Manufacturing, lifting
• Kneading bread
• Floor sweeping
• Heating/cooling systems
• Driving → automated parking, adaptive cruise control
• Autopilot function
• Hazard detectors
• Predictive displays
AUTOMATION
Why automate?
• When tasks are impossible/hazardous
o Robotic handling of hazardous materials (or in hazardous environments)
o Heavy-lifting beyond human capacities
o Complex mathematical processes (statistical analysis)
o Automatic readers for visually impaired
• When tasks are difficult
o Operators can carry out, but effortful & error-prone
• E.g., ‘Simple’ calculations, assembly, autopilot systems, medical diagnosis & decision-making
• Automation reliability & trust calibration
[Figure: subjective trust plotted against automation reliability – trust above the calibration line is over-trust, trust below it is under-trust (Wickens et al., 2013)]
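The calibration idea can be expressed numerically: compare a user's subjective trust with the automation's measured reliability, and classify the gap as calibrated, over-trust, or under-trust. A minimal sketch – the 0–1 scales and the tolerance band are illustrative assumptions, not values from Wickens et al.:

```python
def classify_trust(subjective_trust, reliability, tolerance=0.1):
    """Compare subjective trust (0-1) with automation reliability (0-1).
    Trust well above reliability -> over-trust (risk of complacency/misuse);
    trust well below reliability -> under-trust (risk of disuse).
    The tolerance band is an illustrative assumption."""
    gap = subjective_trust - reliability
    if gap > tolerance:
        return "over-trust"
    if gap < -tolerance:
        return "under-trust"
    return "calibrated"
```

For instance, strong trust (0.9) in automation that is only 60% reliable would be classified as over-trust, while weak trust (0.4) in 90%-reliable automation would be under-trust – the disuse pattern described below.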
PROBLEMS WITH AUTOMATION
Trust calibration & distrust
• Poor calibration of trust
o Distrust – fail to trust automation as much as is appropriate; leads to disuse
• E.g., Preference for manual control; alarms; Excel formulas; perception-enhancing automation
• E.g., Train engineers taping over alert speakers because alarms were typically false (Sorkin, 1989)
• Causes of complacency:
o Top-down processing – human tendency to let experience guide expectations; top-down processing can override bottom-up cues
o Path of least cognitive effort
o Perceived authority or reliability
• E.g., A pilot following the advice of an automated flight-planning system even though it was wrong
• E.g., Flight simulation experiment (Mosier et al., 1992):
o 75% of pilots wrongly shut down an engine because of the automation’s incorrect diagnosis & recommendation
o Only 25% of pilots committed same error when using traditional checklist (i.e., checking raw data)
PROBLEMS WITH AUTOMATION
Overtrust, complacency, OOTLUF
• Automation overtrust & overdependence → lead to deskilling
o Ability to manually perform automated task declines over time
o E.g., Skill loss among pilots of highly automated aircraft; mitigated by occasionally hand-flying (Wiener, 1988)
o E.g., Calculators
REFERENCES
• Lee, J. D., Wickens, C. D., Liu, Y., & Boyle, L. N. (2017). Designing for People: An Introduction to Human Factors Engineering.
• Wickens, C. D., et al. (2013). Engineering Psychology and Human Performance (4th ed.).