Chapter 1 - The Human Part 1

The document discusses human-computer interaction, focusing on input and output channels through senses like vision, hearing, and touch. It explores the mechanisms of perception, memory, reasoning, and problem-solving, emphasizing the complexities of human cognition and the importance of understanding these processes for effective interface design. Additionally, it highlights the role of errors and mental models in human behavior and decision-making.


The Human

Input Output Channels
This section discusses the interaction between humans and computers through
input and output channels. It explores how senses like sight, hearing, and
touch are used to receive information from computers, while motor
control, primarily through fingers, is used to provide input. It also touches
on the potential roles of taste and smell in specialized computer systems.
Overall, it emphasizes the interconnectedness of input and output
channels in human-computer interaction.
Vision
Human vision involves both the physical reception of stimuli and the
subsequent processing and interpretation of those stimuli. While there are
limitations to what can be seen due to the properties of the eye and
visual system, visual processing can construct images from incomplete
information. Understanding both stages is crucial for designing effective
computer systems. This discussion will start with an examination of the
eye's role as a physical receptor before delving into the processing
aspects of basic vision.
The human eye
The human eye receives light and converts it into electrical energy, which
is then transmitted to the brain. It consists of various components such
as the cornea, lens, and retina. The retina contains two types of
photoreceptors: rods, which are highly sensitive to light and allow vision
in low illumination but cannot resolve fine detail, and cones, which are
less sensitive to light but enable color vision. Rods dominate peripheral
vision, while cones are mainly concentrated in the fovea, facilitating
detailed vision and color perception.
Visual Perception
Visual perception involves more than just the physical mechanisms of the
eye. It encompasses the filtering and processing of information received
by the visual apparatus, enabling recognition of scenes, determination of
distances, and differentiation of colors. Understanding how we perceive
size, depth, brightness, and color is crucial for designing effective visual
interfaces.
Perceiving size and depth

Our visual system effortlessly interprets images, accounting for size and
distance, aiding in distance judgment. Despite varying appearances,
objects are identified consistently. Visual angle, determined by an object's
size and its distance from the eye, indicates how much of the field of view
the object occupies. Visual acuity is crucial for
detecting fine detail. Size constancy maintains consistent object size
perception despite changes in visual angle with distance. Depth
perception cues like overlap and familiarity assist in determining object
distance. Familiarity with object sizes can influence distance judgments,
as seen in a humorous advertising example.
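
The visual angle idea above can be made concrete with a small worked
example. This is an illustrative sketch, not taken from the text: it uses
the standard geometric relation that an object of height h viewed at
distance d subtends an angle of 2 x arctan(h / 2d).

import math

def visual_angle_degrees(height, distance):
    # Angle subtended at the eye by an object, from 2 * arctan(h / 2d).
    return math.degrees(2 * math.atan(height / (2 * distance)))

# The same object subtends a much smaller visual angle when further away,
# which is what size constancy has to compensate for.
print(visual_angle_degrees(1.7, 2.0))    # a person seen from 2 m, about 46 degrees
print(visual_angle_degrees(1.7, 20.0))   # the same person from 20 m, about 4.9 degrees
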
Perceiving Brightness
Brightness perception is subjective, influenced by luminance, which
depends on an object's reflective properties and light received. Contrast,
the difference in luminance between an object and its background, also
affects brightness perception. Visual acuity improves with increased
luminance, but flicker becomes more noticeable. Flicker, perceived as
rapid switching of light, is more prominent in peripheral vision and with
larger displays.
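
One common way to quantify the contrast mentioned above is Weber contrast,
the luminance difference between an object and its background relative to
the background luminance. The formula and values below are illustrative
assumptions, not taken from the text.

def weber_contrast(object_luminance, background_luminance):
    # (L_object - L_background) / L_background: positive when the object is
    # brighter than its background, negative when it is darker.
    return (object_luminance - background_luminance) / background_luminance

print(weber_contrast(120.0, 80.0))   # 0.5: object brighter than its background
print(weber_contrast(40.0, 80.0))    # -0.5: object darker than its background
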
Perceiving Color
Color perception involves hue, intensity, and saturation. Hue is determined
by the wavelength of light, with blue corresponding to short wavelengths,
green to medium, and red to long. About 150 hues can be discerned by the
average person. Cones in the eye detect different colors, with color vision
best in the fovea. Color blindness, affecting 8% of males and 1% of females,
often impairs red-green discrimination.
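
The hue/intensity/saturation decomposition above roughly corresponds to the
HSV color model used in software. As an illustrative sketch, not an example
from the text, Python's standard colorsys module converts such a description
into the RGB values a display actually uses.

import colorsys

# Hue is a fraction of the color wheel: 0.0 is red, about 1/3 is green and
# about 2/3 is blue; saturation and value (intensity) both lie in [0, 1].
r, g, b = colorsys.hsv_to_rgb(2 / 3, 1.0, 1.0)   # a fully saturated, bright blue
print(r, g, b)                                   # 0.0 0.0 1.0, i.e. pure blue
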
Hearing
Hearing provides significant information about our environment, though it is
often underestimated compared with sight. By listening, we can discern various
sounds, determine their sources, and estimate distances. For example,
we can identify passing cars, nearby machinery, aircraft overhead, and
bird songs, discerning their locations and distances. The auditory system
conveys detailed environmental information, but how does it operate?
The human ear
Hearing begins with sound waves, which the ear receives and transmits
through three sections: the outer ear, middle ear, and inner ear. The
outer ear, comprising the pinna and auditory canal, protects the middle
ear and amplifies sounds. The middle ear, connected to the outer ear by
the tympanic membrane, contains ossicles that transmit vibrations to the
cochlea in the inner ear. This relay is necessary because the cochlea is
filled with denser liquid. In the cochlea, vibrations stimulate hair cells that
release chemical transmitters, triggering impulses in the auditory nerve.
Processing Sound
Sound is characterized by pitch, loudness, and timbre, with the ear
detecting frequencies from 20 Hz to 15 kHz. The auditory system filters
sounds, facilitating selective hearing and focus on important information.
Despite its potential, sound is underutilized in interface design, although
multimedia interfaces can incorporate it effectively. Sound recognition
capabilities suggest its potential for conveying system information in
interface design.
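
As a rough illustration of the terms above, and not an example from the text,
pitch corresponds mainly to frequency and loudness to amplitude. The sketch
below generates raw samples of a short 440 Hz tone, the kind of simple audio
cue an interface might play; the parameter values are arbitrary.

import math

def tone_samples(freq_hz=440.0, amplitude=0.5, duration_s=0.1, sample_rate=8000):
    # Sine-wave samples in [-amplitude, amplitude]; a higher freq_hz sounds
    # higher in pitch, a larger amplitude sounds louder.
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

samples = tone_samples()
print(len(samples))   # 800 samples for 0.1 s at an 8 kHz sample rate
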
Touch
Touch, often underestimated compared to sight and hearing, provides
crucial environmental information, warning of dangers like hot surfaces
and aiding in tasks such as lifting objects. In human-computer interaction,
touch offers vital feedback, like feeling button depressions. For those with
impaired senses, touch can be primary, as seen with braille users.
Mechanoreceptors in the skin respond to pressure, with rapidly and slowly
adapting receptors distinguishing immediate and continuous pressure.
Sensitivity varies across the body, with fingers being most acute.
Kinesthesis, awareness of body and limb positions, influences comfort
and performance, such as in touch typing.
Movement
Motor control involves multiple processing stages from receiving a
stimulus to generating a response. Movement time depends on physical
characteristics like age and fitness, while reaction time varies based on
sensory channel. Skill and practice can reduce reaction time, but fatigue
may increase it. Speed and accuracy in movement impact interactive
system design, with Fitts' law formalizing the relationship between target
size, distance, and movement time. Larger targets and shorter distances
are preferable, influencing menu design considerations.
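
Fitts' law is commonly written as MT = a + b log2(D/W + 1), where D is the
distance to the target, W is its width, and a and b are constants fitted
empirically for a particular device and user group. The sketch below uses
arbitrary placeholder constants purely for illustration.

import math

def movement_time_ms(distance, width, a=50.0, b=150.0):
    # Predicted time to acquire a target of the given width at the given
    # distance; a and b here are illustrative, not measured, values.
    return a + b * math.log2(distance / width + 1)

# Larger, nearer targets are predicted to be faster to hit, which is why big
# menu items and short pointer travel distances are preferred.
print(movement_time_ms(distance=400, width=20))   # small, far target: slower
print(movement_time_ms(distance=100, width=80))   # large, near target: faster
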
Human Memory
Memory plays a crucial role in everyday activities, storing factual
knowledge, procedural skills, and past experiences. Our memory system
enables us to recall information, perform actions, and maintain our sense
of identity. Understanding how memory works involves exploring its
capabilities and limitations. Memory is typically categorized into three
types: sensory buffers, short-term memory (working memory), and long-
term memory. While there's debate over whether these represent
separate systems or functions within one system, they interact to process
and store information.
Sensory Memory
Sensory memory briefly stores incoming visual, auditory, and tactile stimuli
before they are overwritten. Iconic memory
retains visual stimuli briefly, demonstrated by the persistence of images
after stimulus removal. Echoic memory briefly holds auditory information,
allowing playback of sounds. Attention filters stimuli for transfer to short-
term memory, guided by arousal and interest. Selective attention enables
focusing on relevant stimuli, illustrated by the cocktail party effect.
Information in sensory memory is quickly replaced or transferred to more
permanent storage.
Short Term Memory
Short-term memory, akin to a mental scratch-pad, stores temporary
information needed for immediate tasks, like mental calculations or
comprehension during reading. It allows rapid access but decays quickly,
lasting only about 200 milliseconds. Its capacity is limited, typically
holding 7 ± 2 items or chunks of information. Chunking, or grouping
information into meaningful units, can increase this capacity by
optimizing memory use. Closure, the successful formation of chunks, aids
in task completion and minimizes errors.
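
As a small illustration of the chunking idea above, and not an example from
the text, the sketch below regroups a string of digits into a few larger
chunks so that it fits more comfortably within the 7 +/- 2 limit.

def chunk(digits, size=4):
    # Split a digit string into fixed-size groups.
    return [digits[i:i + size] for i in range(0, len(digits), size)]

raw = "441865556789"   # twelve separate digits are hard to hold at once
print(chunk(raw))      # ['4418', '6555', '6789']: three chunks are far easier
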
Long Term Memory Structure
Long-term memory consists of episodic memory and semantic memory.
Episodic memory stores personal experiences in chronological order,
allowing us to recall past events. Semantic memory contains factual
information, concepts, and skills acquired over time. Information in
semantic memory is derived from episodic memory, facilitating learning
from experiences. Semantic memory is often conceptualized as a
network, enabling access to information, representation of relationships
between concepts, and inference.
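
A minimal sketch of the network idea above, using an assumed structure for
illustration rather than the text's own notation: concepts are nodes, labeled
links record relationships, and a simple inference walks the is_a links to
inherit properties from more general concepts.

# Each concept points to a parent concept and lists abilities stored on it.
semantic_net = {
    "canary": {"is_a": "bird", "can": ["sing"]},
    "bird":   {"is_a": "animal", "can": ["fly"]},
    "animal": {"is_a": None, "can": ["breathe"]},
}

def can_do(concept, ability):
    # Follow is_a links upwards until the ability is found or the chain ends.
    while concept is not None:
        node = semantic_net[concept]
        if ability in node["can"]:
            return True
        concept = node["is_a"]
    return False

print(can_do("canary", "fly"))      # True, inherited from 'bird'
print(can_do("canary", "breathe"))  # True, inherited from 'animal'
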
Long Term Memory Process
Long-term memory involves storage, forgetting, and retrieval processes.
Information is stored through rehearsal, with distributed practice being
more effective. Meaningful information is easier to remember than
meaningless data. Forgetting can occur due to decay or interference,
influenced by emotional factors. Retrieval involves recall and recognition,
aided by cues like categories and vivid imagery. Vivid imagery helps
people remember details not explicitly stated, as demonstrated in
experiments where subjects visualize scenes.
Thinking: Reasoning and Problem Solving
We've explored how information is processed and manipulated in the
human system, a complex area that distinguishes humans from other
information-processing systems. While animals can receive and store
information, humans uniquely use it to reason and solve problems, even
when information is partial or unavailable. Human thought is conscious
and self-aware, allowing us to identify the products of our thought
processes. Thinking can vary in complexity, from simple calculations to
understanding complex topics like politics. We'll delve into two main
categories of thinking: reasoning and problem-solving, recognizing that
they often overlap in practice.
Reasoning
Reasoning is the process by which we use the knowledge we have to
draw conclusions or infer something new about the domain of interest.
There are a number of different types of reasoning: deductive, inductive
and abductive. We use each of these types of reasoning in everyday life,
but they differ in significant ways.
Inductive Reasoning
Inductive reasoning involves generalizing from observed cases to infer
information about unobserved cases. While unreliable, it's a useful
process in learning about the environment. We often rely on induction to
make assumptions about the world, even though it can't be proven true,
only false. Positive evidence is typically given more weight than negative
evidence, as shown in Wason's experiment where participants tend to
focus on confirming evidence rather than disconfirming evidence.
Abductive Reasoning
Abductive reasoning involves inferring the cause or explanation for
observed facts or events. It's the method used to derive explanations for
what we observe. Despite its unreliability, people often rely on abduction
until evidence supports an alternative explanation. In interactive
systems, if an event consistently follows an action, users may assume
causation unless evidence to the contrary is provided, leading to
potential confusion or errors.
Problem Solving
Problem solving is about finding solutions to new tasks using existing
knowledge. The Gestalt view and problem space theory are two major
theories explaining how humans solve problems. The Gestalt view
emphasizes knowledge reuse and insight, while problem space theory
sees the mind as a limited information processor. Later interpretations
combined aspects of both theories into comprehensive models of problem
solving.
Gestalt Theory
Gestalt theory proposed that problem solving involves both reproductive
and productive processes. Reproductive problem solving relies on past
experience, while productive problem solving involves insight and
restructuring of the problem. Experimental evidence, such as Kohler's
observations of apes and Maier's pendulum problem with humans,
supported these claims. However, Gestalt theory lacks sufficient evidence
and structure to fully explain problem solving. Despite its limitations, it
paved the way for the development of information-processing theory.
Problem Space Theory
Newell and Simon's problem space theory posits that problem solving
involves moving through problem states using operators. Heuristics, like
means-ends analysis, help select operators to reach the goal state. This
theory accounts for human processing constraints and has been applied
to well-defined problem domains. Real-world problems may pose
challenges in finding necessary knowledge or specifying clear goals.
Despite this, the problem space framework offers a clear theory of
problem solving applicable to various scenarios.
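
To make the state/operator idea concrete, here is a small illustrative
sketch, not Newell and Simon's own formulation: states are numbers, operators
transform one state into another, and a breadth-first search finds a sequence
of operators from the initial state to the goal state. Means-ends analysis
would instead prefer, at each step, the operator that most reduces the
difference between the current state and the goal.

from collections import deque

operators = {
    "double":    lambda s: s * 2,
    "add_three": lambda s: s + 3,
}

def solve(initial, goal):
    # Breadth-first search through the problem space; returns a shortest
    # sequence of operator names leading from the initial state to the goal.
    frontier = deque([(initial, [])])
    seen = {initial}
    while frontier:
        state, path = frontier.popleft()
        if state == goal:
            return path
        for name, op in operators.items():
            nxt = op(state)
            if nxt not in seen and nxt <= goal * 2:   # crude bound to keep the search finite
                seen.add(nxt)
                frontier.append((nxt, path + [name]))
    return None

print(solve(1, 11))   # ['add_three', 'double', 'add_three']: 1 -> 4 -> 8 -> 11
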
Analogy in problem solving
Analogy plays a crucial role in problem solving by leveraging knowledge
from familiar domains to tackle novel problems through analogical
mapping. Researchers have explored this phenomenon using analogous
stories, demonstrating significant improvements in problem-solving
success rates when subjects are provided with relevant analogies.
However, the effectiveness of analogies depends on their semantic
closeness to the problem domain, highlighting the importance of
recognizing and applying analogous information in problem-solving
contexts.
Skill acquisition
Skill acquisition involves the gradual development of expertise within a
specific domain, contrasting with the handling of unfamiliar problems. By
examining the disparities between novice and expert behavior in various
domains, we gain insight into the process of acquiring skills and its
impact on problem-solving abilities.
Errors and mental models
Human errors vary in severity, from trivial inconveniences to catastrophic
events like plane crashes or nuclear accidents. Understanding why errors
occur is crucial for prevention. Errors can stem from changes in familiar
patterns of behavior or from flawed mental models—personal theories
used to interpret systems. Mental models are often incomplete, unstable,
internally inconsistent, unscientific, and based on misinterpretations of
evidence. Identifying and correcting errors requires addressing these
underlying causes.
