Module 1. Lesson 1 & 2 HCI

Human-computer interaction (HCI) focuses on designing and evaluating interactive systems to improve user experience and productivity. Key principles of HCI include understanding user needs, reducing memory load, striving for consistency, and preventing errors. Effective user interface design is crucial for usability, impacting user satisfaction and organizational performance.


Module 1 Lesson 1

What is HCI?

Human-computer interaction is a discipline concerned with the design, evaluation, and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them. Human–computer interaction (HCI), alternatively man–machine interaction (MMI) or computer–human interaction (CHI), is the study of interaction between people (users) and computers.

A basic goal of HCI is to improve the interactions between users and computers by making computers more usable and receptive to the user's needs.
A long-term goal of HCI is to design systems that minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task.

Interaction is a concept to be distinguished from a similar term, interface. Interaction refers to an abstract model by which humans interact with the computing device for a given task, while an interface is a particular technical realization (hardware or software) of such an interaction model. The letter I in HCI refers to both interaction and interface, encompassing the abstract model and the technological methodology.
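To make the distinction concrete, here is a minimal sketch (illustrative only; the function names are not from the text) of one abstract interaction model, asking the user to confirm an action, realized by two different interfaces: a text console and a graphical dialog.

```python
# A minimal sketch: the interaction model is "ask the user to confirm an
# action"; each function below is a different technical realization
# (interface) of that same abstract model.

def confirm_via_console(prompt: str) -> bool:
    """Text-based interface: the user types y or n."""
    return input(f"{prompt} [y/n]: ").strip().lower().startswith("y")

def confirm_via_dialog(prompt: str) -> bool:
    """Graphical interface: the user clicks Yes or No in a dialog box."""
    import tkinter as tk
    from tkinter import messagebox
    root = tk.Tk()
    root.withdraw()                      # no main window is needed
    answer = messagebox.askyesno("Confirm", prompt)
    root.destroy()
    return answer

# Either realization satisfies the same interaction model, for example:
# if confirm_via_console("Delete the file?"): ...
```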

High usability means that the resulting interfaces are easy to use, efficient for the task, safe, and lead to correct completion of the task. Usable and efficient interaction with the computing device in turn translates into higher productivity.

WHY IS HCI IMPORTANT?

• User-centered design is playing an increasingly crucial role
• It is becoming more important today to increase competitiveness via HCI studies (Norman, 1990)
• High-cost e-transformation investments
• Users lose time with badly designed products and services
• Users may even give up using a badly designed interface
– Ineffective allocation of resources

Principles of HCI

• Know Thy User

This principle simply states that the interaction and interface should cater to the needs
and capabilities of the target user of the system in design. This overall concept was well
captured by the phrase, “Know thy user,” coined by Hansen in 1971.

• Understand the Task


Another almost-commonsensical principle is to base HCI design on the understanding
of the task. The term task refers to the job to be accomplished by the user through the
use of the interactive system.

• Reduce Memory Load

Designing interaction with as little memory load as possible is a principle that also has a
theoretical basis. Humans are certainly more efficient in carrying out tasks that require
less memory burden, long or short term.

Keeping the user's short-term memory load light is of particular importance with regard to the interface's role as quick and easy guidance toward completion of the task. The capacity of human short-term memory (STM) is about 5–9 chunks of information (or items meaningful with respect to the task), famously known as the "magic number".
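As a small illustration of keeping short-term memory load light, the sketch below (a hypothetical formatting helper, not from the text) displays a long digit string in chunks so the user holds a few meaningful chunks instead of sixteen separate digits.

```python
# A minimal sketch of chunking: grouping a long digit string keeps the
# displayed information within the 5-9 chunk capacity of short-term memory.

def chunked(digits: str, size: int = 4) -> str:
    """Return the digit string grouped into space-separated chunks."""
    return " ".join(digits[i:i + size] for i in range(0, len(digits), size))

print(chunked("4111111111111111"))   # -> 4111 1111 1111 1111
```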

• Strive for Consistency

In the longer term, one way to unburden the memory load is to maintain consistency. This applies both within an application and across different applications, and to both the interaction model and the interface implementation. One way Microsoft Windows–based applications maintain their competitiveness is by promoting consistent and familiar interfaces.

• Remind Users and Refresh Their Memory

Any significant task will involve the use of memory, so another good strategy is to employ interfaces that give continuous reminders of important information and thereby refresh the user's memory. Human memory loses information quite quickly, and this is especially true when switching tasks in multitasking situations (which is a very prevalent form of interaction these days).

• Prevent Errors/Reversal of Action


While supporting quick completion of the task is important, error-free operation is equally important. As such, the interaction and interface should be designed to avoid confusion and mental overload. Naturally, all of the aforementioned principles apply here. In addition, one effective technique is to present or solicit only the relevant information or action required at a given time. Inactive menu items are a good example of such a technique. Also, having the system require the user to choose from possibilities (e.g., a menu system) is generally a safer approach than relying on recall, as the sketch below illustrates.
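The sketch below is a minimal example using Python's standard tkinter toolkit (the labels and options are made up for illustration). It shows both techniques: a menu item that stays inactive until it becomes relevant, and a read-only list control that lets the user choose from valid possibilities instead of recalling and typing a value.

```python
# A minimal error-prevention sketch: an inactive (disabled) menu item and
# selection from a fixed list of possibilities rather than free recall.
import tkinter as tk
from tkinter import ttk

root = tk.Tk()
root.title("Error-prevention sketch")

# "Save" stays disabled until a file has been opened, so the user cannot
# issue the action while the system is in an invalid state.
menubar = tk.Menu(root)
file_menu = tk.Menu(menubar, tearoff=0)

def open_file():
    # Once a file is open, enable the previously inactive "Save" entry.
    file_menu.entryconfig("Save", state="normal")

file_menu.add_command(label="Open", command=open_file)
file_menu.add_command(label="Save", state="disabled")   # inactive menu item
menubar.add_cascade(label="File", menu=file_menu)
root.config(menu=menubar)

# A read-only combo box restricts input to valid options, so typing errors
# are impossible and the user recognizes rather than recalls the value.
ttk.Label(root, text="Paper size:").pack(padx=10, pady=(10, 0))
size = ttk.Combobox(root, values=["A4", "Letter", "Legal"], state="readonly")
size.current(0)
size.pack(padx=10, pady=10)

root.mainloop()
```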

• Naturalness

The final major HCI principle is to favor "natural" interaction and interfaces. Naturalness refers to a trait that is reflective of various operations in our everyday life. For instance, a perfect HCI may one day be realized when a natural language–based conversational interface is possible, because this is the prevalent way that humans communicate. However, it can be tricky to directly translate real-life styles and modes of interaction into interaction with a computer.

DEFINING THE USER INTERFACE


User interface design is a subset of a field of study called human-computer interaction
(HCI). Human-computer interaction is the study, planning, and design of how people and
computers work together so that a person's needs are satisfied in the most effective way.
HCI designers must consider a variety of factors:

– what people want and expect, and the physical limitations and abilities people possess,
– how information processing systems work,
– what people find enjoyable and attractive,
– the technical characteristics and limitations of the computer hardware and software.

The user interface is the part of a computer and its software that people can see, hear, touch, talk to, or otherwise understand or direct. The user interface has essentially two components: input and output.

Input is how a person communicates his / her needs to the computer. Some common input
components are the keyboard, mouse, trackball, one's finger, and one's voice.
Output is how the computer conveys the results of its computations and requirements to
the user. Today, the most common computer output mechanism is the display screen,
followed by mechanisms that take advantage of a person's auditory capabilities: voice and
sound. The use of the human senses of smell and touch as output in interface design still remains largely unexplored.

Proper interface design will provide a mix of well-designed input and output mechanisms that satisfy the user's needs, capabilities, and limitations in the most effective way possible. The best interface is one that is not noticed, one that permits the user to focus on the information and task at hand, not on the mechanisms used to present the information and perform the task.

The Importance of Good Design


With today's technology and tools, we are motivated to create really effective and usable interfaces and screens. But we never seem to have time to find out what makes good design, nor to properly apply it. After all, many of us have other things to do in addition to designing interfaces and screens. So we take our best shot given the workload and time constraints imposed upon us. The result, too often, is woefully inadequate. If interface and screen design were really a matter of common sense, we developers would have been producing almost identical screens for representing the real world. Example of a bad design: a fully solid closed wooden door; suggestion: a glass door.

THE IMPORTANCE OF THE USER INTERFACE


A well-designed interface and screen is terribly important to our users. It is their window
to view the capabilities of the system. It is also the vehicle through which many critical
tasks are presented. These tasks often have a direct impact on an organization's
relations with its customers, and its profitability. A screen's layout and appearance affect
a person in a variety of ways. If they are confusing and inefficient, people will have
greater difficulty in doing their jobs and will make more mistakes. Poor design may even
chase some people away from a system permanently. It can also lead to aggravation,
frustration, and increased stress.

A BRIEF HISTORY OF THE HUMAN-COMPUTER INTERFACE


The need for people to communicate with each other has existed since we first walked upon
this planet. The lowest and most common level of communication modes we share are
movements and gestures. Movements and gestures are language independent, that is, they
permit people who do not speak the same language to deal with one another. The next
higher level, in terms of universality and complexity, is spoken language. Most people can
speak one language, some two or more. A spoken language is a very efficient mode of
communication if both parties to the communication understand it. At the third and highest
level of complexity is written language. While most people speak, not all can write. But for those who can, writing is still nowhere near as efficient a means of communication as speaking. In modern times, we have the typewriter, another step upward in communication complexity. Significantly fewer people type than write. (While a practiced typist can find typing faster and more efficient than handwriting, the unskilled may not find this the case.) Spoken language, however, is still more efficient than typing, regardless of typing skill level.
Through its first few decades, a computer's ability to deal with human communication was
inversely related to
what was easy for people to do. The computer demanded rigid, typed input through a
keyboard; people responded slowly using this device and with varying degrees of skill.
The human-computer dialog reflected the computer's preferences, consisting of one
style or a combination of styles using keyboards, commonly referred to as Command
Language, Question and Answer, Menu selection, Function Key Selection, and Form
Fill-In. Throughout the computer's history, designers have been developing, with varying
degrees of success, other human-computer interaction methods that utilize more
general, widespread, and easier-to-learn capabilities: voice and handwriting. Systems
that recognize human speech and handwriting now exist, although they still lack the
universality and richness of typed input.

A BRIEF HISTORY OF SCREEN DESIGN


Although developers have been designing screens since a cathode ray tube display was first attached to a computer, more widespread interest in the application of good design principles to screens did not begin to emerge until the early 1970s, when IBM introduced its 3270 cathode ray tube text-based terminal. A 1970s screen often resembled the one pictured in Figure 1.1. It usually consisted of many fields with very cryptic and often unintelligible captions.

Figure 1.1. 3270 cathode ray tube text-based terminal

It was visually cluttered, and often possessed a command field that challenged the user to remember what had to be keyed into it. Ambiguous messages often required referral
to a manual to interpret. Effectively using this kind of screen required a great deal of
practice and patience. Most early screens were monochromatic, typically presenting
green text on black backgrounds.

At the turn of the decade, guidelines for text-based screen design were finally made widely available, and many screens began to take on a much less cluttered look through concepts such as grouping and alignment of elements, as illustrated in Figure 1.2. User memory was supported by providing clear and meaningful field captions, by listing commands on the screen, and by enabling them to be applied through function keys.
Messages also became clearer. These screens were not entirely clutter-free, however.
Instructions and reminders to the user had to be inscribed on the screen in the form of
prompts or completion aids such as the codes PR and Sc. Not all 1980s screens looked
like this, however. In the 1980s, 1970s-type screens were still being designed, and
many still reside in systems today.

Figure 1.2. Text Based Screen Design

The advent of graphics yielded another milestone in the evolution of screen design, as illustrated in the figure above. While some basic design principles did not change (groupings and alignment, for example), borders were made available to visually enhance groupings, and buttons and menus for implementing commands replaced function keys.

Multiple properties of elements were also provided, including many different font sizes
and styles, line thicknesses, and colors. The entry field was supplemented by a
multitude of other kinds of controls, including list boxes, drop-down combination boxes,
spin boxes, and so forth. These new controls were much more effective in supporting a
person's memory, now simply allowing for selection from a list instead of requiring a
remembered key entry. Completion aids disappeared from screens, replaced by one of
the new listing controls. Screens could also be simplified, the much more powerful
computers being able to quickly present a new screen.

In the 1990s, our knowledge concerning what makes effective screen design continued
to expand. Coupled with ever-improving technology, the result was even greater
improvements in the user-computer screen interface as the new century dawned.

GRAPHICAL SYSTEMS ADVANTAGES AND DISADVANTAGES


ADVANTAGES

• Symbols recognized faster than text

• Faster learning

• Faster use and problem solving

• Easier remembering

• More natural

• Exploits visual/spatial cues

• Fosters more concrete thinking

• Provides context

• Fewer errors

• Increased feeling of control

• Immediate feedback

• Predictable system responses

• Easily reversible actions

• Less anxiety concerning use

• More attractive

• May consume less space

• Replaces national languages

• Easily augmented with text displays

• Smooth transition from command language system

DISADVANTAGES

• Greater design complexity.

• Learning still necessary


• Lack of experimentally-derived design guidelines

• Use of a pointing device may also have to be learned

• Working domain is the present

• Human comprehension limitations

• Window manipulation requirements

• Production limitations

• Few tested icons exist

• Inefficient for touch typists

• Inefficient for expert users

• Not always the preferred style of interaction

• Not always fastest style of interaction

• Increased chances of clutter and confusion

• May consume more screen space

• Hardware limitations

CHARACTERISTICS OF THE GRAPHICAL USER INTERFACE


A graphical system possesses a set of defining concepts. Included are sophisticated visual presentation, pick-and-click interaction, a restricted set of interface options, visualization, object orientation, extensive use of a person's recognition memory, and concurrent performance of functions.

Sophisticated Visual Presentation:

Visual presentation is the visual aspect of the interface. It is what people see on the screen. The sophistication of a graphical system permits displaying lines, including drawings and icons. It also permits the displaying of a variety of character fonts, including different sizes and styles. The display of 16 million or more colors is possible on some screens. Graphics also permit animation and the presentation of photographs and motion video. The meaningful interface elements visually presented to the user in a graphical system include windows (primary, secondary, or dialog boxes), menus (menu bar, pull-down, pop-up, cascading), icons to represent objects such as programs or files, assorted screen-based controls (text boxes, list boxes, combination boxes, settings, scroll bars, and buttons), and a mouse pointer and cursor. The objective is to reflect visually on screen the real world of the user as realistically, meaningfully, simply, and clearly as possible.


Restricted Set of Interface Options:

The array of alternatives available to the user is what is presented on the screen or may
be retrieved through what is presented on the screen, nothing less, nothing more. This
concept fostered the acronym WYSIWYG.

Pick-and-Click Interaction:
Elements of a graphical screen upon which some action is to be performed must first be identified. The motor activity required of a person to identify this element for a proposed action is commonly referred to as pick, and the signal to perform an action as click. The primary mechanism for performing this pick-and-click is most often the mouse and its buttons. The user moves the mouse pointer to the relevant element (pick) and the action is signaled (click). Pointing allows rapid selection and feedback; the hand and mind seem to work smoothly and efficiently together. The secondary mechanism for performing these selection actions is the keyboard; most systems permit pick-and-click to be performed using the keyboard as well.
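As a small sketch (again using tkinter for illustration; the Ctrl+D binding is an arbitrary choice), the same action below can be triggered either by the primary pick-and-click mechanism, clicking the button with the mouse, or by the secondary keyboard mechanism.

```python
# A minimal sketch: one action wired to both the mouse (pick-and-click)
# and a keyboard equivalent.
import tkinter as tk

def do_action(event=None):
    print("Action performed")

root = tk.Tk()
button = tk.Button(root, text="Do it (or press Ctrl+D)", command=do_action)
button.pack(padx=20, pady=20)
root.bind("<Control-d>", do_action)   # keyboard equivalent of the click
root.mainloop()
```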

Visualization:

Visualization is a cognitive process that allows people to understand information that is difficult to perceive because it is either too voluminous or too abstract. Presenting specialized graphic portrayals facilitates visualization. The best visualization method for an activity depends on what people are trying to learn from the data. The goal is not necessarily to reproduce a realistic graphical image, but to produce one that conveys the most relevant information. Effective visualizations can facilitate mental insights, increase productivity, and allow faster and more accurate use of data.

Object Orientation:
A graphical system consists of objects and actions. Objects are what people see on screen. They are manipulated as a single unit. Objects can be composed of sub-objects. For example, an object may be a document; the document's sub-objects may be a paragraph, sentence, word, and letter.

A collection is the simplest relationship: the objects share a common aspect. A collection might be the result of a query or a multiple selection of objects. Operations can be applied to a collection of objects. A constraint is a stronger object relationship: changing an object in a set affects some other object in the set. A document being organized into pages is an example of a constraint. A composite exists when the relationship between objects becomes so significant that the aggregation itself can be identified as an object. Examples include a range of cells organized into a spreadsheet, or a collection of words organized into a paragraph. A container is an object in which other objects exist. Examples include text in a document or documents in a folder.
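A minimal sketch of these object relationships follows (the class names are illustrative, not from the text): a Document is a composite built from sub-objects, and a Folder acts as a container that controls which kinds of objects it will accept.

```python
# A minimal sketch of composite and container relationships between
# interface objects, using illustrative class names.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Word:
    text: str

@dataclass
class Paragraph:
    words: List[Word] = field(default_factory=list)      # composite of sub-objects

@dataclass
class Document:
    title: str
    paragraphs: List[Paragraph] = field(default_factory=list)

class Folder:
    """Container: holds other objects and controls what it accepts."""
    def __init__(self) -> None:
        self.contents: List[Document] = []

    def add(self, item: object) -> None:
        if not isinstance(item, Document):                # controls accepted types
            raise TypeError("This folder only accepts Document objects")
        self.contents.append(item)

doc = Document("Report", [Paragraph([Word("Hello"), Word("world")])])
folder = Folder()
folder.add(doc)                                           # accepted
print(len(folder.contents))                               # -> 1
```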

A container often influences the behavior of its content. It may add or suppress certain properties or operations of objects placed within it, control access to its content, or control the kinds of objects it will accept. These relationships help define an
object's type. Similar traits and behaviors exist in objects of the same object type.
Another important object characteristic is persistence.

Persistence is the maintenance of a state once it is established. An object's state (for example, window size, cursor location, scroll position, and so on) should always be automatically preserved when the user changes it.
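A minimal sketch of persistence is shown below (the file name and state keys are assumptions for illustration): the window state is written out whenever the user changes it and read back the next time, so it is restored automatically.

```python
# A minimal persistence sketch: save the object's state when it changes
# and restore it on the next launch.
import json
from pathlib import Path

STATE_FILE = Path("window_state.json")    # assumed location for this sketch

def save_state(width: int, height: int, scroll_pos: int) -> None:
    STATE_FILE.write_text(json.dumps(
        {"width": width, "height": height, "scroll_pos": scroll_pos}))

def load_state() -> dict:
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())
    return {"width": 800, "height": 600, "scroll_pos": 0}   # first-run defaults

save_state(1024, 768, 42)    # the user resizes and scrolls; state is preserved
print(load_state())          # the next session restores the same state
```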
Use of Recognition Memory:
Continuous visibility of objects and actions encourages use of a person's more powerful recognition memory. The "out of sight, out of mind" problem is eliminated.

Lesson 2
Interaction Design
A central concern of interaction design is to develop interactive products that
are usable.
By this we mean products that are generally easy to learn, effective to use, and
provide an enjoyable user experience. A good place to start thinking about how
to design usable interactive products is to compare examples of well-designed
and poorly designed ones. Through identifying the specific weaknesses and
strengths of different interactive products, we can begin to understand what it
means for something to be usable or not.

Designing interactive products requires considering who is going to be using them, how they are going to be used, and where they are going to be used.
Another key concern is to understand the kind of activities people are doing when
interacting with these products. The appropriateness of different kinds of
interfaces and arrangements of input and output devices depends on what kinds
of activities are to be supported. For example, if the activity is to enable people to
bank online, then an interface that is secure, trustworthy, and easy to navigate is
essential. In addition, an interface that allows the user to find out information
about new services offered by the bank without it being intrusive would be useful.

The world is becoming suffused with technologies that support increasingly diverse activities. Just think for a minute about what you can currently do using
digital technology: send messages, gather information, write essays, control
power plants, program, draw, plan, calculate, monitor others, and play games—
just to name but a few. Now think about the types of interfaces and interactive
devices that are available. They too are equally diverse: multitouch displays,
speech-based systems, handheld devices, wearables, and large interactive
displays—again, to name but a few. There are also many ways of designing how
users can interact with a system, for instance, via the use of menus, commands,
forms, icons, gestures, and so on. Furthermore, ever more innovative everyday
artifacts are being created using novel materials, such as e-textiles and
wearables (see Figure 1.1).

Figure 1.1 Turn signal biking jacket using e-textiles, developed by Leah Buechley

The Internet of Things (IoT) now means that many products and sensors can be
connected to each other via the Internet, which enables them to talk to each other.
Popular household IoT-enabled products include smart heating and lighting and home
security systems where users can change the controls from an app on their phone or
check out who is knocking on their door via a doorbell webcam. Other apps that are
being developed are meant to make life easier for people, like finding a car parking
space in busy areas.

The interfaces for everyday consumer items, such as cameras, microwave ovens,
toasters, and washing machines, which used to be physical and the realm of product
design, are now predominantly digitally based, requiring interaction design (called
consumer electronics). The move toward transforming human-human transactions into
solely interface-based ones has also introduced a new kind of customer interaction.
Self-checkouts at grocery stores and libraries are now the norm where it is
commonplace for customers to check out their own goods or books themselves, and at
airports, where passengers check in their own luggage. While more cost-effective and
efficient, it is impersonal and puts the onus on the person to interact with the system.
Furthermore, accidentally pressing the wrong button or standing in the wrong place at a
self-service checkout can result in a frustrating, and sometimes mortifying, experience.

What is Interaction Design?

• Designing interactive products to support the way people communicate and interact in their everyday and working lives.
• Put another way, it is about creating user experiences that enhance and augment the way people work, communicate, and interact.
• More generally, Terry Winograd originally described it as "designing spaces for human communication and interaction" (1997, p. 160).

John Thackara viewed it as “the why as well as the how of our daily interactions
using computers” (2001, p. 50), while

Dan Saffer emphasized its artistic aspects: “the art of facilitating interactions
between humans through products and services” (2010, p. 4).

A number of terms have been used since to emphasize different aspects of what is
being designed, including user interface design (UI), software design, user-centered
design, product design, web design, user experience design, and interactive system
design.
Interaction design is generally used as the overarching term to describe the field,
including its methods, theories, and approaches. UX is used more widely in industry to
refer to the profession. However, the terms can be used interchangeably; which one a company adopts often depends on its ethos and brand.

The Components of Interaction Design


We view interaction design as fundamental to many disciplines, fields, and approaches
that are concerned with researching and designing computer-based systems for people.
Figure 1.2 presents the core ones along with interdisciplinary fields that comprise one or
more of these, such as cognitive ergonomics. It can be confusing to try to work out the
differences between them as many overlap. The main differences between interaction
design and the other approaches referred to in the figure come largely down to which
methods, philosophies, and lenses they use to study, analyze, and design products.
Another way they vary is in terms of the scope and problems they address. For
example, information systems is concerned with the application of computing
technology in domains such as business, health, and education, whereas ubiquitous
computing is concerned with the design, development, and deployment of pervasive
computing technologies (for example, IoT) and how they facilitate social interactions
and human experiences.

Figure 1.2 Relationship among contributing academic disciplines, design practices, and
interdisciplinary fields concerned with interaction design (double-headed arrows mean
overlap).

Who Is Involved in Interaction Design?


Designers need to know many different things about users, technologies, and the
interactions among them to create effective user experiences. At the least, they need to
understand how people act and react to events and how they communicate and interact
with each other. To be able to create engaging user experiences, they also need to
understand how emotions work, what is meant by aesthetics, desirability, and the role of
narrative in human experience. They also need to understand the business side,
technical side, manufacturing side, and marketing side. Clearly, it is difficult for one
person to be well versed in all of these diverse areas and also know how to apply the
different forms of knowledge to the process of interaction design.

Interaction design is ideally carried out by multidisciplinary teams, where the skill sets of
engineers, designers, programmers, psychologists, anthropologists,
sociologists, marketing people, artists, toy makers, product managers, and others
are drawn upon. It is rarely the case, however, that a design team would have all of
these professionals working together. Who to include in a team will depend on a
number of factors, including a company’s design philosophy, size, purpose, and product
line.

One of the benefits of bringing together people with different backgrounds and training
is the potential of many more ideas being generated, new methods developed, and
more creative and original designs being produced. However, the downside is the costs
involved. The more people there are with different backgrounds in a design team, the
more difficult it can be to communicate and make progress with the designs being
generated. Why? People with different backgrounds have different perspectives and
ways of seeing and talking about the world. What one person values as important
others may not even see (Kim, 1990). Similarly, a computer scientist’s understanding of
the term representation is often very different from that of a graphic designer or
psychologist.

What this means in practice is that confusion, misunderstanding, and communication breakdowns can surface in a team. The various team members may have different ways of
talking about design and may use the same terms to mean quite different things. Other
problems can arise when a group of people who have not previously worked as a team are
thrown together. For example, Aruna Balakrishnan et al. (2011) found that integration
across different disciplines and expertise is difficult in many projects, especially when it
comes to agreeing on and sharing tasks. The more disparate the team members—in terms
of culture, background, and organizational structures—the more complex this is likely to be.

The User Experience


The user experience refers to how a product behaves and is used by people in the real
world.
Jakob Nielsen and Don Norman (2014) define it as encompassing
“all aspects of the end-user’s interaction with the company, its services, and its
products.”

As Jesse Garrett (2010, p. 10) puts it, "Every product that is used by someone has a user experience: newspapers, ketchup bottles, reclining armchairs, cardigan sweaters."
More specifically, it is about how people feel about a product and their pleasure and
satisfaction when using it, looking at it, holding it, and opening or closing it. It includes
their overall impression of how good it is to use, right down to the sensual effect small
details have on them, such as how smoothly a switch rotates or the sound of a click and
the touch of a button when pressing it. An important aspect is the quality of the
experience someone has, be it a quick one, such as taking a photo; a leisurely one,
such as playing with an interactive toy; or an integrated one, such as visiting a museum
(Law et al., 2009).
It is important to point out that one cannot design a user experience, only design for a
user experience. In particular, one cannot design a sensual experience, but only create
the design features that can evoke it. For example, the outside case of a smartphone
can be designed to be smooth, silky, and fit in the palm of a hand; when held, touched,
looked at, and interacted with, that can provoke a sensual and satisfying user
experience. Conversely, if it is designed to be heavy and awkward to hold, it is much
more likely to end up providing a poor user experience—one that is uncomfortable and
unpleasant.

Designers sometimes refer to UX as UXD. The addition of the D to UX is meant to encourage design thinking that focuses on the quality of the user experience rather than on the set of design methods to use (Allanwood and Beare, 2014).

Don Norman (2004) has stressed for many years, “It is not enough that we build
products that function, that are understandable and usable, we also need to build
joy and excitement, pleasure and fun, and yes, beauty to people’s lives.”

There are many aspects of the user experience that can be considered and many ways
of taking them into account when designing interactive products. Of central importance
are the usability, functionality, aesthetics, content, look and feel, and emotional
appeal.

Jack Carroll (2004) stresses other wide-reaching aspects, including fun, health, social
capital (the social resources that develop and are maintained through social
networks, shared values, goals, and norms), and cultural identity, such as age,
ethnicity, race, disability, family status, occupation, and education.

Accessibility and Inclusiveness


Accessibility refers to the extent to which an interactive product is accessible by as many people as possible. Companies like Google and Apple provide tools for their developers to promote this. The focus is on people with disabilities. For example, Android OS provides a range of tools for those with disabilities, from hearing aid compatibility to a built-in screen reader, while Apple's VoiceOver lets the user know what's happening on its devices, so they can easily navigate and even know who is in a selfie just taken, by listening to the phone. Inclusiveness means being fair, open, and equal to everyone. Inclusive design is an overarching approach where designers strive to make their products and services accommodate the widest possible number of people. An example is ensuring that smartphones are being designed for all and made available to everyone, regardless of their disability, education, age, or income.

Accessibility can be achieved in two ways: first, through the inclusive design of
technology, and second, through the design of assistive technology.

When designing for accessibility, it is essential to understand the types of impairments that can lead to disability, as they come in many forms. They are often classified by the type of impairment, for example:

• Sensory impairment (such as loss of vision or hearing)

• Physical impairment (having loss of functions to one or more parts of the body, for example, after a stroke or spinal cord injury)

• Cognitive (for instance, learning impairment or loss of memory/cognitive function due to old age or a condition such as Alzheimer's disease)

Within each type is a complex mix of people and capabilities. For example, a person
might have only peripheral vision, be color blind, or have no light perception (and be
registered blind). All are forms of visual impairment, and all require different design
approaches. Color blindness can be overcome by an inclusive design approach.
Designers can choose colors that will appear as separate colors to everyone. However,
peripheral vision loss or complete blindness will often need an assistive technology to
be designed.
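As one concrete illustration of such an inclusive choice, the sketch below (the category names are made up) draws colors from the widely cited Okabe-Ito color-universal-design palette, a published set of colors chosen to remain distinguishable to people with the most common forms of color blindness.

```python
# A minimal sketch: picking category colors from the Okabe-Ito palette
# instead of an arbitrary red/green pairing that many color-blind users
# cannot tell apart.
OKABE_ITO = {
    "black": "#000000",
    "orange": "#E69F00",
    "sky_blue": "#56B4E9",
    "bluish_green": "#009E73",
    "yellow": "#F0E442",
    "blue": "#0072B2",
    "vermillion": "#D55E00",
    "reddish_purple": "#CC79A7",
}

# Hypothetical chart categories mapped to distinguishable palette colors.
categories = ["Sales", "Returns", "Backorders"]
colors = dict(zip(categories, [OKABE_ITO["blue"],
                               OKABE_ITO["orange"],
                               OKABE_ITO["bluish_green"]]))
print(colors)
```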

Impairment can also be categorized as follows:


• Permanent (for example, long-term wheelchair user)
• Temporary (such as after an accident or illness)
• Situational (for instance, a noisy environment means a person can’t hear)
People with permanent disabilities often use assistive technology in their everyday life,
which they consider to be life-essential and an extension of their self (Holloway and Dawes,
2016). Examples include wheelchairs (people now refer to “wearing their wheels,” rather
than “using a wheelchair”) and augmented and alternative communication aids. Much
current HCI research into disability explores how new technologies, such as IoT,
wearables, and virtual reality, can be used to improve upon existing assistive
technologies.

Usability and User Experience Goals


Usability refers to ensuring that interactive products are easy to learn, effective to use,
and enjoyable from the user’s perspective. It involves optimizing the interactions people
have with interactive products to enable them to carry out their activities at work, at
school, and in their everyday lives. More specifically, usability is broken down into the
following six goals:

• Effective to use (effectiveness) - is a general goal, and it refers to how good a product is at doing what it is supposed to do.

• Efficient to use (efficiency) - refers to the way a product supports users in carrying out their tasks.

• Safe to use (safety) - involves protecting the user from dangerous conditions and undesirable situations.

• Having good utility (utility) - refers to the extent to which the product provides the right kind of functionality so that users can do what they need or want to do.

• Easy to learn (learnability) - refers to how easy a system is to learn to use.

• Easy to remember how to use (memorability) - refers to how easy a product is to remember how to use, once learned.

User Experience Goals


A diversity of user experience goals has been articulated in interaction design,
which covers a range of emotions and felt experiences. These include desirable
and undesirable ones, as shown in Table 1.1
Many of these are subjective qualities and are concerned with how a system feels to a
user. They differ from the more objective usability goals in that they are concerned with
how users experience an interactive product from their perspective, rather than
assessing how useful or productive a system is from its own perspective. Whereas the
terms used to describe usability goals comprise a small distinct set, many more terms
are used to describe the multifaceted nature of the user experience. They also overlap
with what they are referring to. In so doing, they offer subtly different options for
expressing the way an experience varies for the same activity over time, technology,
and place.

Design Principles

Design principles are used by interaction designers to aid their thinking when designing for
the user experience. These are generalizable abstractions intended to orient designers
toward thinking about different aspects of their designs. A well-known example is feedback:
Products should be designed to provide adequate feedback to the users that informs them
about what has already been done so that they know what to do next in the interface.
Another one that is important is findability (Morville, 2005). This refers to the degree to
which a particular object is easy to discover or locate—be it navigating a website, moving
through a building, or finding the delete image option on a digital camera. Related to this is
the principle of navigability: Is it obvious what to do and where to go in an interface?

Design principles are derived from a mix of theory-based knowledge, experience, and
common sense. They tend to be written in a prescriptive manner, suggesting to
designers what to provide and what to avoid at the interface—if you like, the dos and
don’ts of interaction design.

A number of design principles have been promoted. The best known are concerned with
how to determine what users should see and do when carrying out their tasks using an
interactive product. Here we briefly describe the most common ones: visibility, feedback,
constraints, consistency, and affordance.

Visibility - The more visible functions are, the more likely it is that users will be able to
know what to do next.

Feedback - Related to the concept of visibility is feedback. This is best illustrated by an analogy to what everyday life would be like without it.

Constraints - The design concept of constraining refers to determining ways of restricting the kinds of user interaction that can take place at a given moment.

Consistency - This refers to designing interfaces to have similar operations and use
similar elements for achieving similar tasks.

Affordance - This is a term used to refer to an attribute of an object that allows people to know how to use it. At a simple level, to afford means "to give a clue" (Norman, 1988). When the affordances of a physical object are perceptually obvious, it is easy to know how to interact with it.
