Human Computer Interaction Applications
Historical Evolution
From the early computers that performed batch processing to today's user-centric design,
the field passed through several milestones, which are mentioned below −
Heuristic Evaluation
Heuristic evaluation is a methodical procedure for checking a user interface for usability problems.
Once a usability problem is detected in the design, it is addressed as an integral part of the ongoing
design process. The heuristic evaluation method relies on a set of usability principles, such as
Nielsen's ten usability heuristics.
General Interaction
Guidelines for general interaction are comprehensive pieces of advice that focus on general instructions
such as −
Be consistent.
Offer significant feedback.
Ask for verification of any non-trivial critical action.
Permit easy reversal of most actions.
Lessen the amount of information that must be remembered between actions.
Seek efficiency in dialogue, motion and thought.
Forgive mistakes.
Classify activities by function and establish screen geography accordingly.
Deliver help services that are context sensitive.
Use simple action verbs or short verb phrases to name commands.
Information Display
Information provided by the HCI should not be incomplete or unclear, or else the application
will not meet the user's requirements. To provide a better display, the following guidelines
are prescribed −
Data Entry
The following guidelines focus on data entry, which is another important aspect of HCI −
Usability engineering thus refers to the usability aspects of the entire process of abstracting,
implementing and testing hardware and software products. Everything from the requirements-gathering
stage to installation, marketing and testing of products falls within this process.
Usability
Usability has three components − effectiveness, efficiency and satisfaction − with which users
accomplish their goals in particular environments. Let us look briefly at these components.
Usability Study
A usability study is the methodical study of the interaction between people, products, and environments,
based on experimental assessment. Examples: psychology, behavioural science, etc.
Usability Testing
The scientific evaluation of the stated usability parameters against the user's requirements,
competences, expectations, safety and satisfaction is known as usability testing.
Acceptance Testing
Acceptance testing, also known as User Acceptance Testing (UAT), is a testing procedure that
is performed by the users as a final checkpoint before signing off on a delivery from a vendor. Let us
take the example of a handheld barcode scanner.
Let us assume that a supermarket has bought barcode scanners from a vendor. The
supermarket gathers a team of counter employees and has them test the device in a mock
store setting. Through this procedure, the users determine whether the product is acceptable for
their needs. The user acceptance testing must "pass" before the supermarket receives the final
product from the vendor.
Software Tools
A software tool is a program used to create, maintain, or otherwise support other
programs and applications. Some of the commonly used software tools in HCI are as follows −
Specification Methods − The methods used to specify the GUI. Even though these methods are lengthy
and can be ambiguous, they are easy to understand.
Grammars − Written instructions or expressions that a program would understand. They
provide confirmation of completeness and correctness.
Transition Diagrams − Sets of nodes and links that can be displayed as text, link frequency,
state diagrams, etc. They are difficult to evaluate for usability, visibility, modularity and
synchronization.
State-Charts − Chart methods developed for simultaneous user activities and external actions.
They provide link-specification with interface-building tools.
Interface-Building Tools − Design methods that help in designing command languages, data-
entry structures, and widgets.
Interface Mock-up Tools − Tools to develop a quick sketch of a GUI. E.g., Microsoft Visio,
Visual Studio .NET, etc.
Software Engineering Tools − Extensive programming tools to provide a user interface
management system.
Evaluation Tools − Tools to evaluate the correctness and completeness of programs.
Let us see the following model in software engineering for interactive design.
The unidirectional movement of the waterfall model of software engineering shows that
every phase depends on the preceding phase and not vice versa. However, this model is not
suitable for interactive system design.
In interactive system design, every phase depends on the others to serve the purpose of
designing and product creation. It is a continuous process, as there is so much to learn and
users keep changing all the time. An interactive system designer should recognize this diversity.
Prototyping
Prototyping is another software engineering model; a prototype can cover the complete range of
functionality of the projected system.
In HCI, a prototype is a trial, partial design that helps users test design ideas without
building a complete system.
An example of a prototype is a sketch. Sketches of an interactive design can later be developed
into a graphical interface. See the following diagram.
The above diagram can be considered a Low-Fidelity Prototype, as it uses manual
procedures such as sketching on paper.
A Medium-Fidelity Prototype involves some, but not all, procedures of the system. E.g., the first
screen of a GUI.
Finally, a Hi-Fidelity Prototype simulates all the functionalities of the system in a design.
This prototype requires time, money and workforce.
UCD Drawbacks
Passive user involvement.
User’s perception about the new interface may be inappropriate.
Designers may ask incorrect questions to users.
It is important that everything in the GUI is arranged in a way that is recognizable and pleasing
to the eye, which reflects the aesthetic sense of the GUI designer. GUI aesthetics give a
character and identity to any product.
The profession has boomed in the last decade, even though usability has been a concern forever.
And since new products are developed frequently, the long-term prognosis for the profession also
looks great. As per one estimate of usability specialists, there are a mere 1,000 experts in India,
whereas the overall requirement is around 60,000. Of all the designers working in the
country, HCI designers account for approximately 2.77%.
HCI Analogy
Let us take a familiar analogy that everyone can understand. A film director is a person
who, with his/her experience, can work on script writing, acting, editing, and cinematography.
He/she can be considered the one person accountable for all the creative phases of the
film.
Similarly, the HCI designer can be considered the film director, whose job is part creative and part
technical. An HCI designer has a substantial understanding of all areas of designing. The
following diagram depicts the analogy −
HCI Design
HCI design is considered a problem-solving process with components such as planned usage,
target area, resources, cost, and viability. It decides on the product requirements and weighs
them against one another to balance trade-offs.
The following points are the four basic activities of interaction design −
Identifying requirements
Building alternative designs
Developing interactive versions of the designs
Evaluating designs
Activity Theory − This is an HCI method that describes the framework in which human-
computer interactions take place. Activity theory provides reasoning, analytical tools and
interaction designs.
User-Centred Design − It gives users the centre stage in designing, where they get the
opportunity to work with designers and technical practitioners.
Principles of User Interface Design − Tolerance, simplicity, visibility, affordance,
consistency, structure and feedback are the seven principles used in interface designing.
Value Sensitive Design − This method is used for developing technology and includes three
types of studies − conceptual, empirical and technical.
o Conceptual investigations work towards understanding the values of the stakeholders who
use the technology.
o Empirical investigations are qualitative or quantitative design research studies that
show the designer's understanding of the users' values.
o Technical investigations examine the use of technologies and designs in the conceptual
and empirical investigations.
Participatory Design
The participatory design process involves all stakeholders in the design process, so that the end
result meets the needs they desire. This approach is used in various areas such as software
design, architecture, landscape architecture, product design, sustainability, graphic design,
planning, urban design, and even medicine.
Participatory design is not a style, but a focus on the processes and procedures of designing. Some
see it as a way of moving design accountability and origination away from designers alone.
Task Analysis
Task analysis plays an important part in user requirements analysis.
Task analysis is the procedure of learning about the users and their abstract frameworks, the
patterns used in their workflows, and the chronological implementation of interaction with the GUI.
It analyses the ways in which the user partitions tasks and sequences them.
What is a TASK?
A human action that contributes to a useful objective, aimed at the system, is a task. Task
analysis defines the performance of users, not computers.
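For example, a hierarchical breakdown of a task into ordered sub-tasks can be captured as data. The following TypeScript sketch is only an illustration; the Task structure and the shopping task names are hypothetical and are not part of any standard task-analysis tool.

// A minimal sketch (hypothetical names) of how a hierarchical task analysis
// might be captured as data: each task is split into ordered sub-tasks.
interface Task {
  name: string;
  subTasks?: Task[]; // executed in the listed (chronological) order
}

const checkout: Task = {
  name: "Buy an item online",
  subTasks: [
    { name: "Search for the item" },
    { name: "Add the item to the cart" },
    {
      name: "Check out",
      subTasks: [
        { name: "Enter shipping address" },
        { name: "Enter payment details" },
        { name: "Confirm the order" },
      ],
    },
  ],
};

// Walk the hierarchy to list the leaf-level user actions in sequence.
function leafActions(task: Task): string[] {
  if (!task.subTasks || task.subTasks.length === 0) return [task.name];
  return task.subTasks.flatMap(leafActions);
}

console.log(leafActions(checkout));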
Dialog Representation
To represent dialogs, we need formal techniques that serve two purposes −
Introduction to Formalism
There are many formalism techniques that we can use to represent dialogs. In this chapter, we
will discuss three of these formalism techniques − State Transition Networks (STN), State-Charts,
and Petri Nets.
State Transition Networks (STN) are the most intuitive of these; they recognize that a dialog
fundamentally denotes a progression from one state of the system to the next. The syntax of an
STN consists of two entities −
Circles − A circle refers to a state of the system, which is labelled by giving a name to the
state.
Arcs − The circles are connected with arcs that refer to the action/event causing the
transition from the state where the arc originates to the state where it ends.
STN Diagram
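As a rough stand-in for the diagram, the following TypeScript sketch shows how an STN can be written down: the named states are the circles, and each arc maps a (state, event) pair to the next state. The drawing-tool states and events used here are hypothetical, chosen only to illustrate the idea.

// States are the "circles" of the STN; arcs map a (state, event) pair to the next state.
// All state and event names here are made up for illustration.
type State = "Menu" | "Drawing line" | "Drawing circle" | "Finish";

const arcs: Record<string, State> = {
  "Menu + select line": "Drawing line",
  "Menu + select circle": "Drawing circle",
  "Drawing line + click second point": "Finish",
  "Drawing circle + click circumference point": "Finish",
};

function nextState(current: State, event: string): State | undefined {
  return arcs[current + " + " + event]; // undefined means the event is illegal in this state
}

console.log(nextState("Menu", "select line")); // "Drawing line"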
State-Charts
State-Charts represent complex reactive systems; they extend Finite State Machines (FSM),
handle concurrency, and add memory to the FSM. They also simplify complex system
representations. State-Charts use the following states −
Illustration
For each basic state b, the super state containing b is called the ancestor state.
A super state is called an OR super state if exactly one of its sub-states is active whenever it is
active.
Let us see the state chart construction of a machine that dispenses bottles when coins are inserted.
The above diagram explains the entire procedure of a bottle dispensing machine. On pressing
the button after inserting a coin, the machine toggles between bottle filling and dispensing
modes. When the requested bottle is available, it dispenses the bottle. In the background,
another procedure runs in which any stuck bottle is cleared. The 'H' symbol in Step 4
indicates that the procedure is added to History for future access.
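To make this behaviour concrete, here is a minimal TypeScript sketch of the bottle-dispensing state chart. The structure and field names are assumptions made for illustration; it only captures the mode toggle, the concurrent clearing procedure, and the history ('H') marker.

// A rough sketch (hypothetical structure) of the bottle-dispensing state chart:
// the main fill/dispense cycle, a background "clear stuck bottle" procedure that
// runs concurrently, and a history marker for the last active mode.
type Mode = "filling" | "dispensing";

interface MachineState {
  coinsInserted: number;
  mode: Mode;
  history?: Mode;       // the 'H' marker: remember the last active mode
  stuckBottle: boolean;
}

function pressButton(s: MachineState): MachineState {
  if (s.coinsInserted === 0) return s;                  // no coin, no action
  const nextMode: Mode = s.mode === "filling" ? "dispensing" : "filling";
  return { ...s, mode: nextMode, history: s.mode };     // toggle and record history
}

function backgroundClear(s: MachineState): MachineState {
  // Runs concurrently with the main cycle and clears any stuck bottle.
  return s.stuckBottle ? { ...s, stuckBottle: false } : s;
}

let machine: MachineState = { coinsInserted: 1, mode: "filling", stuckBottle: false };
machine = pressButton(machine);     // toggles to "dispensing", history = "filling"
machine = backgroundClear(machine); // no-op here, nothing is stuck
console.log(machine);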
Petri Nets
A Petri Net is a simple model of active behaviour, with four behavioural elements − places,
transitions, arcs and tokens. Petri Nets provide a graphical notation that is easy to
understand.
A Petri Net is a graph model for the control behaviour of systems exhibiting concurrency
in their operation. The graph is bipartite, the two node types being places drawn as
circles, and transitions drawn as bars. The arcs of the graph are directed and run from
places to transitions or vice versa.
Place − This element is used to symbolize passive elements of the reactive system. A place is
represented by a circle.
Transition − This element is used to symbolize active elements of the reactive system.
Transitions are represented by squares/rectangles.
Arc − This element is used to represent causal relations. Arcs are represented by arrows.
Token − This element is subject to change. Tokens are represented by small filled circles.
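The token-flow rule behind Petri Nets can be sketched in a few lines of code. The following TypeScript sketch is an illustrative model under simplifying assumptions (unweighted arcs, one token consumed or produced per arc); the place and transition names are hypothetical.

// A minimal sketch of Petri-net firing semantics: a transition may fire when
// every input place holds a token; firing consumes those tokens and produces
// tokens in the output places.
type Marking = Record<string, number>; // tokens per place

interface Transition {
  name: string;
  inputs: string[];   // arcs from places to this transition
  outputs: string[];  // arcs from this transition to places
}

function canFire(t: Transition, m: Marking): boolean {
  return t.inputs.every(p => (m[p] ?? 0) > 0);
}

function fire(t: Transition, m: Marking): Marking {
  if (!canFire(t, m)) throw new Error(t.name + " is not enabled");
  const next: Marking = { ...m };
  t.inputs.forEach(p => (next[p] = next[p] - 1));          // consume tokens
  t.outputs.forEach(p => (next[p] = (next[p] ?? 0) + 1));  // produce tokens
  return next;
}

// Example: a user request moves from "waiting" to "served" when the system is idle.
const serve: Transition = { name: "serve", inputs: ["waiting", "idle"], outputs: ["served", "idle"] };
let marking: Marking = { waiting: 1, idle: 1, served: 0 };
marking = fire(serve, marking);
console.log(marking); // { waiting: 0, idle: 1, served: 1 }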
Visual Thinking
Visual materials have assisted the communication process for ages, in the form of paintings,
sketches, maps, diagrams, photographs, etc. In today's world, with the invention of technology
and its further growth, new possibilities are offered for visual information, such as thinking and
reasoning. As per studies, the potential of visual thinking in human-computer interaction
(HCI) design has still not been explored completely. So, let us learn the theories that support visual
thinking in sense-making activities in HCI design.
An initial terminology for talking about visual thinking was developed that included concepts
such as visual immediacy, visual impetus, visual impedance, and visual metaphors, analogies
and associations, in the context of information design for the web.
As such, this terminology became well suited as a logical and collaborative method during
the design process. Let us discuss the concepts individually in brief.
Visual Immediacy
Visual immediacy is a reasoning process that helps in the understanding of information in a visual
representation. The term is chosen to highlight its time-related quality, which also serves as an
indicator of how well the reasoning has been facilitated by the design.
Visual Impetus
Visual impetus is defined as a stimulus that aims to increase engagement with the
contextual aspects of the representation.
Visual Impedance
Visual impedance is perceived as the opposite of visual immediacy − the aspects of a representation
that slow down the interpretation of the information it carries.
Direct Manipulation Programming
"Directness" has been considered a phenomenon that contributes majorly to direct manipulation
programming. It has the following two aspects −
Distance
Direct Engagement
Distance
Distance describes the gulf between a user's goal and the level of description provided
by the system with which the user deals. These gulfs are referred to as the Gulf of
Execution and the Gulf of Evaluation.
Direct Engagement
Direct engagement is described as programming in which the design lets the user directly control
the objects presented, making the system less difficult to use.
The scrutiny of the execution and evaluation process illuminates the effort involved in using a
system. It also suggests ways to minimise the mental effort required to use a system.
Problems with Direct Manipulation
Even though the immediacy of response and the conversion of objectives into actions have made
some tasks easy, not every task is best done this way. For example, a repetitive operation is
probably best done via a script and not through direct manipulation.
Direct manipulation interfaces find it hard to manage variables, or to represent discrete
elements from a class of elements.
Direct manipulation interfaces may not be accurate, as the accuracy depends on the user rather
than on the system.
An important problem with direct manipulation interfaces is that they directly support only the
techniques the user thinks of.
The sequence of items in a menu or display can be ordered by −
Time
Numeric ordering
Physical properties
A designer must select one of the following possibilities when there is no task-related
arrangement −
Menu Layout
Menus should be organized using task semantics.
Broad-shallow should be preferred to narrow-deep.
Positions should be shown by graphics, numbers or titles.
Subtrees should use items as titles.
Items should be grouped meaningfully.
Items should be sequenced meaningfully.
Brief items should be used.
Consistent grammar, layout and technology should be used.
Type ahead, jump ahead, or other shortcuts should be allowed.
Jumps to previous and main menu should be allowed.
Online help should be considered.
Titles
Item placement
Instructions
Error messages
Status reports
Keyboards
Use of TAB key or mouse to move the cursor
Error correction methods
Field-label meanings
Permissible field contents
Use of the ENTER and/or RETURN key.
Interaction design is one of the most critical facets of user experience design. It makes the
product's interface respond to the user's actions, aiding human-to-computer interaction.
Key takeaways:
Interaction design is a multidisciplinary design field that focuses on the interaction
between users and digital products, systems, or interfaces.
It involves designing how users engage with and experience a product, with the goal of
making that interaction intuitive and efficient.
It’s one of the most challenging stages of the UX design process. UXPin’s code-based design tool
reduces those challenges by allowing designers to build functional prototypes with extreme
fidelity and interactivity.
Often shortened to IxD, interaction design uses appropriate interactive elements, such as
transitions, microinteractions, and animation, as well as text, color, visuals, and layout, to
influence users' feelings and behavior, allowing designers to craft interactions strategically to
elicit the appropriate response.
A good use of interaction design successfully leads to positive user experiences, including:
Faster learnability
The goal of interaction design is to create an engaging user experience that facilitates seamless
interaction with the technology. It encompasses understanding user needs, behaviors, and
expectations to design interfaces that are not only functional but also enjoyable to use.
By focusing on how users interact with technology, interaction design in HCI aims to enhance
usability, accessibility, and overall satisfaction.
User interface design focuses on visual design and aesthetics, including color, fonts,
iconography, layouts, etc. UI designers decide what a user interface must look like.
To summarize:
In smaller companies and startups, a UI designer is often responsible for both tasks, while the
roles are separate in larger organizations. Like anything in digital product design, the roles and
responsibilities can synergize. It all depends on the company, product, and organizational
structure.
Interaction design is a specialized discipline within UX design. Where UX looks at the entire
user experience and how everything ties together, interaction designers focus on user
interactions and motion.
User experience designers apply UX fundamentals like design thinking, human-centered
design, and user research to make decisions. They’re specifically concerned with a user’s
tasks, actions, and environment, while interaction designers focus on making the digital
product respond to user actions in an appropriate way. They tend to think about what happens
when a user clicks a button, types a phrase into a search bar or hovers over an image.
Visibility
With many features and limited space, prioritizing visibility is a significant design challenge.
Don Norman's theory is that the more visible something is, the more likely a user is to see and
interact with it. Interaction designers must balance visibility prioritization based on user needs
and business goals.
A typical example of visibility is prioritizing navigation links on mobile devices. What links
are visible via the app bar, and what do designers place in the navigation drawer behind
a hamburger menu?
Feedback
Feedback is how a digital product or system communicates with users. Interaction designers
have several ways to express this feedback, including motion or animation, tactile cues, audio,
copy, etc.
They must also consider accessibility and how products relay feedback to all types of users
and assistive technologies.
Constraints
Cluttered UIs with too many possibilities confuse users and create usability issues. Good
interaction design limits (or constrains) user actions to guide them through the product more
efficiently.
We see these constraints most commonly on landing pages. Designers strip away navigation,
links, and anything else that might tempt users to leave the page, leaving only a
prominent CTA button or form. Constraining users to a single action allows them to focus on
the content that leads to a conversion.
Mapping
Interaction designers must create a clear relationship between controls and their effect on a
digital product. The idea is to map these relationships to feel natural to users.
For example, the top button on an iPhone increases the volume while the lower one decreases.
This intuitive layout means users don’t have to think about which button performs which
action.
The more intuitive and obvious a product is to use, the easier and more enjoyable the
experience.
Consistency
Consistency is vital for interaction and UI design. Inconsistency can confuse users and create
usability issues. Designers not only have to design consistent UIs and interactions but also
consider consistency across multiple screen sizes and devices.
Affordance
Affordance tells users how to use something or perform an action. It’s an interaction designer’s
job to ensure that it’s obvious to users how to complete tasks using UI elements.
For example, a submit button’s disabled state tells users to complete a form’s required fields
before submitting. Using a different color and underline for links tells users which text they
can click.
Cognition
Interaction designers must have a basic understanding of cognitive psychology in UX design–
attention and perception, memory, problem-solving, and creative thinking. The aim is to
design products and experiences that don’t overload these mental processes.
Gestalt principles: how the human brain perceives visuals to create familiar structures.
Von Restorff effect: predicts that in a group of objects, the one that differs stands out
or is most likely to be remembered.
Hick’s Law: the more choices you give someone, the longer it’ll take them to make a
decision.
The Principle of Least Effort: users will make choices or take action requiring the
least amount of energy.
The Serial Position Effect: humans are most likely to remember the first (primacy
effect) and last (recency effect) items in a list, sentence, or piece of content.
The Principle of Perpetual Habit: people rely on familiar routines and habits–which
is why it’s crucial to use universal design patterns.
The Principle of Emotional Contagion: humans will mimic or empathize with the
emotions and behaviors of others, including animals and animations–which is why
designers use faces (even emojis) to emphasize feeling and emotion.
Fitts’s Law: the time required to move to a target area is a function of the distance
to the target and the target’s size (see the short sketch after this list).
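As a quick illustration of Fitts's Law, the snippet below uses the common Shannon formulation MT = a + b * log2(D / W + 1), where D is the distance to the target and W is its width. The constants a and b are made-up placeholder values; in practice they are fitted from measured user data.

// Predicted movement time for a pointing task, per Fitts's Law (Shannon form).
// The default constants a and b are placeholders for illustration only.
function movementTimeMs(distancePx: number, widthPx: number, a = 100, b = 150): number {
  const indexOfDifficulty = Math.log2(distancePx / widthPx + 1); // in bits
  return a + b * indexOfDifficulty;
}

// A large, nearby button is faster to hit than a small, distant one.
console.log(movementTimeMs(200, 100).toFixed(0)); // ~338 ms
console.log(movementTimeMs(800, 20).toFixed(0));  // ~904 ms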
We found this helpful interaction design checklist from the US Government’s Technology
Transformation Services website, usability.gov. The checklist includes several questions to
consider when designing interactions.
Define how users interact with the interface – click/tap, push, swipe, drag & drop,
keyboard controls, etc.
Give users clues about behaviour before they take action – correct labelling,
different colours for links, using consistency for clickable UI elements, etc.
Anticipate and mitigate errors – how do you prevent errors while providing helpful
messages to correct problems?
Consider system feedback and response time – what happens after users complete an
action, and how soon does that feedback appear?
Strategically think about each element – have you chosen the appropriate
element/pattern? Is there enough space between clickable elements to avoid errors?
Have you followed design psychology principles (mentioned above)? Scrutinize every
decision from a user’s perspective.
Simplify for learnability – make user interfaces and tasks as simple as possible, use
familiar patterns, and minimize cognitive-draining tasks and features to simplify the
user experience.
What Do Interaction Designers Do?
An interaction designer’s role focuses on how users interact with products, particularly digital
ones like websites, apps, or software interfaces. Their job is to ensure that these interactions
are intuitive, seamless, and enjoyable.
Interaction designers spend a lot of time researching who the users are and what they need.
This includes conducting user research and interviews, and analyzing data to figure out the
problems users face and how the product can solve them. Understanding these needs is crucial
to designing interactions that make sense for the user.
Once interaction designers know what users need, they design user flows, which are basically
maps that outline the steps a user takes to complete a task in the product. For example, in an
e-commerce app, the user flow might be from adding an item to their cart, through the checkout
process, to receiving a confirmation. The goal is to make these steps as easy and efficient as
possible.
This is where interaction designers focus on buttons, navigation, and forms: all the interactive
elements users click, tap, or swipe. They design these elements to be clear, functional, and
accessible. They’re always thinking about things like: “Does this button stand out? Will the
user know what happens when they click it?”
Interaction designers build prototypes, early models of the product, so they can test how
people actually use it. This stage is all about testing assumptions. They gather feedback from
users and stakeholders, see what’s working and what’s not, and refine the design based on that.
Interaction designers work closely with UX designers, developers, and product managers to
make sure the designs are feasible and meet business goals. Developers need to know exactly
how interactions should work (like what happens when you hover over a button), and the
interaction designer is there to clarify and iterate as needed.
6. Ensure Consistency
A big part of their role is making sure the design is consistent across the entire product. Users
should feel familiar as they move through different sections. That means sticking to the same
design patterns for similar tasks and interactions.
While their main focus is the user experience, interaction designers also need to align the
design with business objectives. For example, if the goal is to increase sign-ups, they might
design an interaction that nudges users towards the registration page without feeling pushy
or disrupting the user journey.
8. Stay Updated
Finally, interaction designers keep up with design trends, tools, and best practices. Interaction
design evolves quickly, and it’s important to stay ahead to ensure the product
remains competitive and user-friendly.
In essence, they’re there to make sure the product not only looks good but works in a way
that’s easy and satisfying for users. Their focus is always on improving the interaction between
the user and the product.
With traditional image-based tools, interaction designers must create multiple frames to replicate
basic code functionality, which takes considerable time and effort. With UXPin’s code-based design
tool, designers can achieve significantly better results with less effort. Here’s how:
States
UXPin enables designers to create multiple States for a single component. For example, you
can build a button with default, hover, active and disabled states, each with separate properties
and triggers.
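To illustrate the general idea of component states (this is a generic TypeScript sketch, not UXPin's actual API), a button's named states and a trigger that switches between them might be modelled like this:

// One button component with several named states, each carrying its own
// properties, and a trigger function that switches between them.
type ButtonState = "default" | "hover" | "active" | "disabled";

const buttonStyles: Record<ButtonState, { background: string; cursor: string }> = {
  default:  { background: "#0070f3", cursor: "pointer" },
  hover:    { background: "#0059c9", cursor: "pointer" },
  active:   { background: "#004aa8", cursor: "pointer" },
  disabled: { background: "#cccccc", cursor: "not-allowed" },
};

let current: ButtonState = "default";

function onTrigger(event: "mouseenter" | "mouseleave" | "mousedown" | "disable") {
  if (current === "disabled") return;            // a disabled button ignores other triggers
  if (event === "mouseenter") current = "hover";
  if (event === "mouseleave") current = "default";
  if (event === "mousedown") current = "active";
  if (event === "disable") current = "disabled";
  console.log(current, buttonStyles[current]);
}

onTrigger("mouseenter"); // hover { background: "#0059c9", cursor: "pointer" }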
Interactions
With UXPin Interactions, designers can build immersive, code-like experiences far beyond
the capabilities of image-based design tools. UXPin offers a wide range of triggers, actions,
and animations to create fully functional, animated prototypes.
Conditional Interactions allow designers to take prototypes a step further with JavaScript-like
“if-then” and “if-else” conditions to create dynamic user experiences.
Variables
In UXPin, form fields look and function like the final product. Variables allow designers to
capture user inputs and use that data elsewhere in the prototype–like a personalized welcome
message after completing an onboarding form.
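As a generic illustration of the variable concept (again a sketch, not UXPin's actual syntax), the snippet below captures a user's input and reuses it later in the prototype for a personalized welcome message; all names are hypothetical.

// Store what the user typed on one screen and reuse it on a later one.
const variables: Record<string, string> = {};

function captureInput(name: string, value: string): void {
  variables[name] = value; // e.g. from an onboarding form field
}

function welcomeMessage(): string {
  return "Welcome aboard, " + (variables["firstName"] ?? "there") + "!";
}

captureInput("firstName", "Ada");
console.log(welcomeMessage()); // "Welcome aboard, Ada!"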
Expressions
UXPin Expressions take prototyping to another level with code-like functionality, including
form validation and computational components (updating a shopping cart). When combined
with States, Interactions, and Variables, Expressions allow designers to build prototypes that
function like the final product.
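The kind of logic such expressions cover can be sketched generically (this is not UXPin's Expressions syntax, only an illustration): a simple form-validation check and a computed shopping-cart total.

// Expression-style logic a prototype might need: validate an email field
// and recompute a shopping-cart total from its items.
function isValidEmail(value: string): boolean {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value); // minimal email check
}

interface CartItem { name: string; price: number; quantity: number; }

function cartTotal(items: CartItem[]): number {
  return items.reduce((sum, item) => sum + item.price * item.quantity, 0);
}

console.log(isValidEmail("ada@example.com"));                      // true
console.log(cartTotal([{ name: "Mug", price: 12, quantity: 2 }])); // 24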
These powerful features mean interaction designers don’t have to learn code or rely on
engineers to build fully functioning prototypes for accurate testing. With UXPin, designers
can build, test, and iterate faster and achieve significantly better results.
In Human-Computer Interaction (HCI), there are various interaction styles that cater to
different user needs and technological capabilities. Here's an overview of these styles, along
with some emerging trends:
1. Question and Answer
Description: Users interact with the system by asking questions and receiving
responses. This can be through chat-bots, voice assistants, or text-based systems.
Example: Siri, Google Assistant, or automated customer service bots.
Emerging Trends: AI-driven systems are becoming more sophisticated with natural
language processing (NLP), making interactions more fluid and conversational.
2. Form-based
Description: Interaction is based on filling out forms or fields. Users input information
into predefined fields, and the system processes this data.
Example: Online surveys, registration forms, or data entry systems.
Emerging Trends: Dynamic forms that change based on user input (e.g., adaptive
forms, sketched below) or voice-enabled form filling.
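A tiny TypeScript sketch of the adaptive-form idea mentioned above (the field names and the country rule are hypothetical): which fields are shown depends on what the user has already entered.

// An adaptive form: field visibility depends on earlier input.
interface FormField { name: string; visible: boolean; }

function adaptForm(country: string): FormField[] {
  return [
    { name: "fullName", visible: true },
    { name: "country", visible: true },
    // A state/province field only appears for countries that use one.
    { name: "stateOrProvince", visible: country === "US" || country === "CA" },
    { name: "postalCode", visible: true },
  ];
}

console.log(adaptForm("US").filter(f => f.visible).map(f => f.name));
// ["fullName", "country", "stateOrProvince", "postalCode"]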
3. Command Language
Description: Users type commands in a specialized command language that the system
interprets directly. This offers speed and flexibility for expert users, but has a steeper
learning curve.
Example: Unix/Linux shell commands, SQL queries.
4. Menus
Description: The user selects options from a list of predefined choices. This is often
used in graphical user interfaces (GUIs).
Example: Software applications with dropdown menus, navigation bars, or right-click
context menus.
Emerging Trends: Context-aware menus that adapt to the user’s task or preferences,
and integration with AI to predict the most likely options.
5. Natural Language
Description: Users interact with systems using everyday language, either spoken or
written, making the system easier to use for non-experts.
Example: Chatbots, voice assistants like Alexa, or natural language search.
Emerging Trends: More sophisticated NLP capabilities allowing for better
understanding of context, sentiment, and even multi-turn conversations.
6. Direct Manipulation
Description: Users interact with objects on the screen as if they were real physical
objects, typically using a pointing device like a mouse or touchscreen.
Example: Drag-and-drop interfaces, resizing windows, manipulating graphical
elements on a screen (e.g., in design software).
Emerging Trends: Gesture-based manipulation using motion sensors, touchscreens,
and haptic feedback for a more immersive experience.
7. Augmented Reality (AR)
Description: AR overlays digital content onto the real world, often viewed through a
smartphone, tablet, or AR glasses.
Example: Pokémon Go, Google Lens, or AR-based navigation apps.
Emerging Trends: More sophisticated AR interfaces integrated with wearable tech like
AR glasses, real-time object recognition, and AR-enhanced shopping experiences.