HCI Chap 3
1. Introduction
In human-computer interaction (HCI), command languages and natural languages define how users
communicate with a system.
Command language: Users input specific keywords or phrases to execute tasks (e.g.,
mkdir to create a directory in a command-line interface).
Natural language: Allows users to interact with systems using conversational or
everyday language (e.g., "What's the weather tomorrow?").
Key Aspects:
o Group commands based on their functions (e.g., "File," "Edit," "View").
o Provide shortcuts for frequently used commands.
o Use clear feedback to indicate if the command was executed successfully.
Example: Command-line interfaces such as the Linux shell organize commands into categories
like file management, system operations, and process handling.
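As a sketch, a few standard Linux commands from each of these categories (all are common POSIX/GNU utilities; the directory and file names are made up for illustration):

```shell
# File management
mkdir demo_dir            # create a directory
touch demo_dir/notes.txt  # create an empty file inside it
ls demo_dir               # list the directory's contents

# System operations / process handling
ps | head -n 1            # show the header line of the process list
rm -r demo_dir            # remove the demo directory again
```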
Strategies:
o Use hierarchical menus for complex systems to guide users step by step.
o Offer modifiers (e.g., ls -l for a detailed list view in Linux).
o Allow chaining commands (e.g., using a pipeline | in Linux to combine multiple
commands).
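The modifier and chaining strategies above can be sketched with standard shell commands (paths are illustrative; any readable directory works):

```shell
# A modifier (flag) changes how a command behaves
ls -l /tmp                  # -l switches ls to a detailed, long-format listing

# A pipeline chains commands: each command's output feeds the next
ls /etc | wc -l             # count the entries in /etc
ls /etc | sort | head -n 3  # first three entries in sorted order
```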
Structure:
o Commands should follow consistent syntax (e.g., <action> <object> such as
delete file.txt).
Command names should be meaningful and memorable. When space or time is constrained,
abbreviated names are used (e.g., ls for list, rm for remove).
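A sketch of the <action> <object> pattern, using abbreviated command names common to Unix-like shells (the file names are hypothetical):

```shell
# <action> <object>: consistent verb-then-target syntax
touch report.txt          # create the object
cp report.txt backup.txt  # cp: abbreviation of "copy"
mv backup.txt old.txt     # mv: abbreviation of "move"
rm report.txt old.txt     # rm: abbreviation of "remove"
```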
2. Natural Language Interaction
Natural language processing (NLP) bridges the gap between humans and machines. Users can
interact in their everyday language without needing to learn commands.
Applications:
o Virtual assistants (e.g., Siri, Alexa).
o Chatbots for customer support.
o Translators and search engines.
Challenges:
o Ambiguity in language (e.g., "book" can be a noun or a verb).
o Multilingual support and slang recognition.
Interaction Devices
1. Introduction
Interaction devices are hardware tools or input mechanisms that enable users to interact with
computers. They are categorized based on input (e.g., keyboards, pointing devices) and output
(e.g., displays, audio).
2. Keyboards and Keypads
Keyboards and keypads are primary input devices for text entry and commands.
Keyboards:
o Standard QWERTY layout is the most common.
o Variations include ergonomic keyboards, mechanical keyboards, and on-screen
virtual keyboards.
Keypads:
o Found in devices like calculators, ATMs, and mobile phones.
o Compact and efficient for numeric entry.
Design Considerations:
o Key spacing, size, and tactile feedback for usability.
o Accessibility features like Braille keyboards for visually impaired users.
3. Pointing Devices
Pointing devices allow users to interact with graphical elements on the screen.
Types:
o Mouse: Common for desktop computers.
o Touchpads: Built into laptops, supporting gestures like swiping and pinching.
o Stylus and Pen: Used for drawing or precision input on tablets.
o Trackballs and Joysticks: Used in specialized applications like gaming or CAD
(Computer-Aided Design).
Factors:
o Precision, speed, and compatibility with different systems.
4. Speech and Auditory Interfaces
Speech and auditory interfaces enable interaction using voice or sound, enhancing accessibility
and hands-free control.
Speech Interfaces:
o Allow users to give verbal commands (e.g., "Call John").
o Require speech recognition systems to process input.
Auditory Interfaces:
o Provide feedback through sounds or alerts (e.g., beeping for errors).
Benefits:
o Accessibility for users with physical disabilities.
o Useful in environments where hands-free operation is required (e.g., driving).
5. Small and Large Displays
Small Displays:
o Found in mobile phones, smartwatches, or IoT devices.
o Require minimalist and scalable interfaces due to size constraints.
o Focus on clarity and touch responsiveness.
Large Displays:
o Used for presentations, control panels, or public information systems.
o Can show complex data or multiple windows simultaneously.
Key Considerations:
o Resolution and brightness for readability.
o Scalability to accommodate varying screen sizes.