IFT 121 - Notes
System Concept
1. Computer Hardware
Physical equipment used for input, output, and processing. The hardware structure
depends upon the type and size of the organization. It consists of input and output
devices, the processor, memory, and storage media. This also includes computer
peripheral devices.
2. Computer Software
The programs used to control and coordinate the hardware components and to
analyse and process data. These programs include sets of instructions used for
processing information. Software is further classified into three
types:
System Software
Application Software
Procedures
3. Databases
Data are raw, unorganized facts and figures that are later processed to generate
information. Software is used to organize and serve data to users and to manage
the physical storage media and virtual resources. Just as hardware cannot work
without software, software needs data to process. Data are managed using a
database management system (DBMS). Database software provides efficient access
to the required data and helps manage knowledge bases.
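The DBMS role described above can be sketched with Python's built-in sqlite3 module: raw rows (data) are stored, and a query organizes them into information. The `sales` table and its figures are invented purely for illustration.

```python
import sqlite3

# An in-memory database stands in for a full DBMS (illustrative sketch)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw facts and figures (data)
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("North", 120.0), ("South", 80.0), ("North", 60.0)])

# The DBMS turns raw data into information: total sales per region
cur.execute("SELECT region, SUM(amount) FROM sales "
            "GROUP BY region ORDER BY region")
print(cur.fetchall())  # [('North', 180.0), ('South', 80.0)]
conn.close()
```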
4. Network
5. Human Resources
This component refers to the manpower required to run and manage the system.
People are the end users of the information system; end users use the information
produced for their own purposes, and the main purpose of an information system is
to benefit its end users. End users can be accountants, engineers, salespersons,
customers, clerks, or managers. People are also responsible for developing and
operating information systems; they include systems analysts, computer operators,
programmers, and other clerical IS personnel.
Importance/Functions:
Information systems offer tools and processes to manage data effectively,
ensuring its accuracy, consistency, and accessibility.
Raw data is often meaningless on its own. Information systems transform this
data into useful information through various processing techniques.
They can perform calculations, sort and filter data, and generate reports that
provide insights into business operations, trends, and performance.
3. Decision-Making Support:
6. Competitive Advantage:
7. Knowledge Management:
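The calculations, sorting, filtering, and report generation mentioned above can be sketched in plain Python. The transaction records and the 6.0 reporting threshold below are hypothetical, chosen only to show raw data becoming information.

```python
# Hypothetical raw transaction records (made up for illustration)
transactions = [
    {"item": "pen", "qty": 10, "price": 0.5},
    {"item": "notebook", "qty": 3, "price": 2.0},
    {"item": "stapler", "qty": 1, "price": 7.5},
]

# Calculation: value of each line, then the grand total
for t in transactions:
    t["total"] = t["qty"] * t["price"]
grand_total = sum(t["total"] for t in transactions)

# Filtering and sorting: a simple report of lines worth 6.0 or more,
# largest first
report = sorted((t for t in transactions if t["total"] >= 6.0),
                key=lambda t: t["total"], reverse=True)

print(grand_total)                  # 18.5
print([t["item"] for t in report])  # ['stapler', 'notebook']
```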
1. Relevance:
The information provided by the system should be relevant to the needs of the
users and the tasks at hand. It should answer the specific questions being asked
and support the decision-making process.
2. Accuracy:
The data and information within the system must be accurate and reliable.
Errors and inconsistencies can lead to poor decisions and operational
inefficiencies.
3. Timeliness:
4. Completeness:
5. Conciseness:
While completeness is important, the information should also be concise and
easy to understand. Too much information can be overwhelming and lead to
confusion.
6. Accessibility:
7. Security:
8. Cost-effectiveness:
9. Flexibility:
10. User-friendliness:
The system should be easy to use and understand, even for non-technical
users. This means having a user-friendly interface and providing adequate
training.
11. Integration:
The system should be able to integrate with other systems within the
organization, allowing for seamless data flow and information sharing.
12. Scalability:
Networks: Local Area Networks (LANs), Wide Area Networks (WANs), wireless
networks, network protocols (like TCP/IP) enabling data transmission between
devices.
THE INTERNET
ARPANET: The internet's origins can be traced back to the late 1960s with
the creation of ARPANET (Advanced Research Projects Agency Network) by
the U.S. Department of Defense. It was designed as a decentralized network to
ensure communication could continue even in the event of a nuclear attack.
Packet Switching: ARPANET pioneered the concept of "packet switching,"
where data is broken down into smaller packets and sent across the network
independently, then reassembled at the destination. This made the network
more efficient and resilient.
TCP/IP: In the 1970s, Vint Cerf and Bob Kahn developed TCP/IP
(Transmission Control Protocol/Internet Protocol), a set of communication
protocols that became the standard for the internet. This allowed different
networks to connect and communicate with each other, leading to the
"network of networks" we know today.
NSFNET: In the 1980s, the National Science Foundation (NSF) in the U.S.
established NSFNET, a high-speed backbone network that further expanded
the internet's reach and accessibility.
World Wide Web: In 1989, Tim Berners-Lee invented the World Wide Web
(WWW) at CERN, introducing hypertext and the concept of web pages linked
together by hyperlinks. This made the internet much more user-friendly and
accessible to the general public.
Commercialization: The 1990s saw the commercialization of the internet,
with the development of web browsers like Mosaic and Netscape, and the rise
of internet service providers (ISPs). This led to an explosion in internet usage
and the dot-com boom.
Every device attempting to access the internet is initially linked either physically
through cables or wirelessly. For instance, a computer can establish a physical
connection to a modem using an Ethernet cable or connect wirelessly through Wi-
Fi or Bluetooth signals.
Each computer connected to the internet is also assigned a unique IP address that
enables the device to be recognized.
When one device sends a message to another, the data travels over the internet
in the form of packets, and each packet carries a port number that identifies
the application it should reach at its endpoint.
A packet that has both a unique IP address and port number can be translated from
alphabetic text into electronic signals by traveling through the layers of the Open
Systems Interconnection (OSI) model from the top application layer to the
bottom physical layer.
The message is then sent over the internet where it's received by the internet
service provider's (ISP) router.
The router examines the destination address assigned to each packet and
determines where to send it.
Eventually, the packet reaches the client and travels in reverse from the bottom
physical layer of the OSI model to the top application layer. During this process,
the routing data -- the port number and IP address -- is stripped from the packet,
thus enabling the data to be translated back into alphabetic text and completing
the transmission process.
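The walkthrough above can be sketched with Python's standard socket module on the local machine: a message travels as bytes to a unique (IP address, port) endpoint, and a reply comes back. The echo server and the "hello" message are illustrative, not part of the notes.

```python
import socket
import threading

# A server socket listening on a unique (IP address, port) endpoint
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # port 0 lets the OS pick a free port
server.listen(1)
host, port = server.getsockname()  # the endpoint's IP address and port number

def serve_one_client():
    conn, _addr = server.accept()
    data = conn.recv(1024)         # TCP has reassembled the packets into bytes
    conn.sendall(data.upper())     # send a transformed reply back
    conn.close()

t = threading.Thread(target=serve_one_client)
t.start()

client = socket.create_connection((host, port))
client.sendall(b"hello")           # text encoded to bytes for transmission
reply = client.recv(1024)
client.close()
t.join()
server.close()
print(reply)  # b'HELLO'
```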
Servers. Servers are the computers that provide services or share stored resources
with the client devices. Their main job is to comply with client requests by
providing the requested information or performing the requested tasks.
IP addresses. IP addresses are used to identify devices on the internet. These can
include IPv4 addresses such as 192.168.1.1, which is the default IP address
many router manufacturers use to access a router's interface. IPv4 addresses are
shorter than IPv6 addresses, which are designed to handle the increasing number
of connected devices.
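The IPv4/IPv6 distinction can be inspected with Python's standard ipaddress module. The 192.168.1.1 address comes from the text; 2001:db8::1 is an address from the IPv6 documentation range, chosen purely for illustration.

```python
import ipaddress

v4 = ipaddress.ip_address("192.168.1.1")  # the IPv4 address from the text
v6 = ipaddress.ip_address("2001:db8::1")  # a documentation-range IPv6 address

print(v4.version, v6.version)  # 4 6
print(v4.is_private)           # True: 192.168.0.0/16 is reserved for
                               # private networks
print(v4.max_prefixlen, v6.max_prefixlen)  # 32 128: IPv6's far larger
                                           # address space
```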
ISPs. ISPs are companies that provide users with internet connectivity. They
operate the infrastructure, including the cables and routers needed to connect
users to the global network.
Firewalls and security measures. Incoming and outgoing network traffic on the
internet is monitored and controlled by firewalls and other security measures.
Firewalls safeguard networks and devices against unauthorized internet access,
cyber threats and malicious activities.
Difference between the World Wide Web and the internet
The key difference between the internet and the World Wide Web (WWW or web) is
that the internet is a global connection of networks, while the web is a collection of
information or websites that can be accessed using the internet. In other words, the
internet is the infrastructure and the web is a service on top of it.
Here's a table highlighting the differences between the World Wide Web (WWW) and
the Internet:
The web is the most widely used part of the internet. Its outstanding feature is
hypertext, a method of instantly cross-referencing text. Used in blog posts, Hypertext
Markup Language (HTML) web pages, social media posts and online shopping
websites, hypertext appears in a different color than the rest of the text and is often
also underlined. When a user clicks on one of these words or phrases, they're
transferred to the related site or webpage. Buttons, images or portions of images are
also used as hyperlinks.
The web provides access to billions of pages of information. Web browsing is done
through a web browser, such as Chrome, Edge or Firefox. The appearance of a
particular website can vary slightly, depending on the browser used. Newer versions
of a particular browser can render more complex features, such as animation, virtual
reality, sound and music files.
Compatibility with other media types. Due to the standardized protocols and
formats that it offers, the internet facilitates compatibility with various media
types. This enables seamless integration and interaction across diverse multimedia
such as photos, videos and audio files.
Easy accessibility. Web browsers such as Chrome or Firefox are used to access
the internet. For end users and developers, these programs are simple to use,
easy to understand, and straightforward to develop for.
Some specific examples of how the internet is used include the following:
Email and other forms of communication, such as Internet Relay Chat, internet
telephony, instant messaging and video conferencing.
Use a virtual private network or, at least, a private browsing mode, such as
Google Chrome's Incognito window.
Use secure protocols, such as HTTPS, instead of HTTP for online transactions.
Deactivate autofill.
Use caution with spam emails and never open or download content from unknown
sources.
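The "use HTTPS, not HTTP" advice above can be checked programmatically. A minimal sketch using Python's standard urllib.parse; the helper name is_secure_url and the example URLs are hypothetical.

```python
from urllib.parse import urlparse

def is_secure_url(url: str) -> bool:
    """Hypothetical helper: accept a URL only if it uses the HTTPS scheme."""
    return urlparse(url).scheme == "https"

print(is_secure_url("https://example.com/checkout"))  # True
print(is_secure_url("http://example.com/checkout"))   # False
```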
Additionally, there's an element of the internet called the dark web. The dark web is
hidden and inaccessible through standard browsers. Instead, it is reached through
anonymity tools such as Tor and I2P, which let users remain largely anonymous.
While this anonymity can be
a great way to protect an online user's security and free speech, or for the government
to keep classified data hidden, the dark web also creates an environment that
facilitates cybercrime, the transfer of illegal goods and terrorism.
On the other side, people believe the internet increases civic engagement, sociability
and the intensity of relationships.
Whether the effects are good or bad, the internet has changed the way society
interacts and connects. People are constructing social relationships based on
individual interests, projects and values. Communities are being formed by like-
minded individuals not only offline and in person, but through the internet and the
multitude of online environments it creates and offers. Social networking sites -- like
Facebook and LinkedIn -- have become the preferred platforms for both businesses
and individuals looking to perform all kinds of tasks and communicate with others.
Allows users to save data and easily share files with cloud storage on cloud
computing platforms.
Enables users to monitor and control personal accounts instantly, such as bank
accounts or credit card bills.
What Is HCI?
The emergence of HCI dates back to the 1980s, when personal computing was on the
rise. It was when desktop computers started appearing in households and corporate
offices. HCI’s journey began with video games, word processors, and numerical units.
However, with the advent of the internet and the explosion of mobile and diversified
technologies such as voice-based and Internet of Things (IoT), computing became
omnipresent and omnipotent. Technological competence further led to the evolution
of user interactions. Consequently, the need for developing a tool that would make
such man-machine interactions more human-like grew significantly. This established
HCI as a technology, bringing different fields such as cognitive engineering,
linguistics, neuroscience, and others under its realm.
Today, HCI focuses on designing, implementing, and evaluating interactive
interfaces that enhance user experience using computing devices. This includes
user interface design, user-centered design, and user experience design.
Human-Computer Interaction
1. The user
The user component refers to an individual or a group of individuals that participate in
a common task. HCI studies users’ needs, goals, and interaction patterns. It analyzes
various parameters such as users’ cognitive capabilities, emotions, and experiences to
provide them with a seamless experience while interacting with computing systems.
2. The goal-oriented task
A user operates a computer system with an objective or goal in mind. The computer
provides a digital representation of objects to accomplish this goal. For example,
booking a flight to a destination could be a task on an airline website. In such
goal-oriented scenarios, one should consider the following aspects for a better user
experience:
3. The interface
The interface is a crucial HCI component that can enhance the overall user interaction
experience. Various interface-related aspects must be considered, such as interaction
type (touch, click, gesture, or voice), screen resolution, display size, or even color
contrast. Designers can adjust these depending on users' needs and requirements.
For example, consider a user visiting a website on a smartphone. In such a case, the
mobile version of the website should only display important information that allows
the user to navigate through the site easily. Moreover, the text size should be
appropriately adjusted so that the user is in a position to read it on the mobile device.
Such design optimization boosts user experience as it makes them feel comfortable
while accessing the site on a mobile phone.
4. The context
HCI is not only about providing better communication between users and computers
but also about factoring in the context and environment in which the system is
accessed. For example, while designing a smartphone app, designers need to evaluate
how the app will visually appear in different lighting conditions (during day or night)
or how it will perform when there is a poor network connection. Such aspects can
have a significant impact on the end-user experience.
Thus, HCI is a result of continuous testing and refinement of interface designs that
can affect the context of use for the users.
Importance of HCI
HCI is crucial in designing intuitive interfaces that people with different abilities and
expertise usually access. Most importantly, human-computer interaction is helpful for
communities lacking knowledge and formal training on interacting with specific
computing systems.
With efficient HCI designs, users need not consider the intricacies and complexities
of using the computing system. User-friendly interfaces ensure that user interactions
are clear, precise, and natural.
Today, technology has penetrated our routine lives and has impacted our daily
activities. To experience HCI technology, one need not own or use a smartphone or
computer. When people use an ATM, food dispensing machine, or snack vending
machine, they inevitably come in contact with HCI. This is because HCI plays a vital
role in designing the interfaces of such systems that make them usable and efficient.
2. Industry
Industries that use computing technology for day-to-day activities tend to consider
HCI a necessary business-driving force. Efficiently designed systems ensure that
employees are comfortable using the systems for their everyday work. With HCI,
systems are easy to handle, even for untrained staff.
HCI is critical for designing safety systems such as those used in air traffic control
(ATC) or power plants. The aim of HCI, in such cases, is to make sure that the system
is accessible to any non-expert individual who can handle safety-critical situations if
the need arises.
3. Accessibility for people with disabilities
The primary objective of HCI is to design systems that are accessible, usable,
efficient, and safe for anyone and everyone. This implies that people with a wide
range of capabilities, expertise, and knowledge can easily use HCI-designed systems.
It also encompasses people with disabilities. HCI tends to rely on user-centered
techniques and methods to make systems usable for people with disabilities.
4. An integral part of software success
HCI is integral to companies that develop software for end users. Such
companies use HCI techniques to make their software products usable. Since the
product is ultimately consumed by the end user, following HCI methods is crucial,
as the product's sales depend on its usability.
Today, user manuals for general computer systems are a rarity. Very few advanced
and complex computing systems provide user manuals. In general, users expect the
systems to be user-friendly and enable them to access the system within a few minutes
of interacting with it. Here, HCI is an effective tool that designers can use to
design easy-to-use interfaces. HCI principles also ensure that the systems have
obvious interfaces and do not require special training to be used. Hence, HCI makes
computing systems suitable for an untrained community.
Examples of HCI
Technological development has brought to light several tools, gadgets, and devices
such as wearable systems, voice assistants, health trackers, and smart TVs that have
advanced human-computer interaction technology.
Let’s look at some prominent examples of HCI that have accelerated its evolution.
1. IoT technology
IoT devices and applications have significantly impacted our daily lives. According to
a May 2022 report by IoT Analytics, global IoT endpoints are expected to reach 14.4
billion in 2022 and grow to 27 billion (approx.) by 2025. As users interact with such
devices, they tend to collect their data, which helps understand different user
interaction patterns. IoT companies can make critical business decisions that can
eventually drive their future revenues and profits.
2. Eye-tracking technology
Eye-tracking is about detecting where a person is looking based on the gaze point.
Eye-tracking devices use cameras to capture the user’s gaze along with some
embedded light sources for clarity. Moreover, these devices use machine learning
algorithms and image processing capabilities for accurate gaze detection.
Businesses can use such eye-tracking systems to monitor their personnel’s visual
attention. It can help companies manage distractions that tend to trouble their
employees, enhancing their focus on the task. In this manner, eye-tracking
technology, along with HCI-enabled interactions, can help industries monitor the
daily operations of their employees or workers.
Other applications include driver monitoring systems that ensure road security.
Moreover, in the future, HCI-enabled eye-tracking systems may allow users to scroll
through a computer screen just by rolling their eyeballs.
3. Speech recognition technology
Speech recognition technology interprets human language, derives meaning from it,
and performs the task for the user. Recently, this technology has gained significant
popularity with the emergence of chatbots and virtual assistants.
4. AR/VR technology
AR and VR are immersive technologies that allow humans to interact with the digital
world and increase the productivity of their daily tasks. For example, smart glasses
enable hands-free and seamless user interaction with computing systems. Consider an
example of a chef who intends to learn a new recipe. With smart glass technology, the
chef can learn and prepare the target dish simultaneously.
Moreover, the technology also reduces system downtime significantly. This implies
that as smart AR/VR glasses such as ‘Oculus Quest 2’ are supported by apps, the
faults or problems in the system can be resolved by maintenance teams in real-time.
This enhances user experience in a minimum time span. Also, the glasses can detect
the user’s response to the interface and further optimize the interaction based on the
user’s personality, needs, and preferences.
Thus, AR/VR technology with the blend of HCI ensures that the task is accomplished
with minimal errors and also achieves greater accuracy and quality. Currently, HCI
research is targeting other fields of study, such as brain-computer
interfaces and sentiment analysis, to boost the user’s AR/VR experience.
A recent development in this regard has been enabled via Dexta Haptic Gloves. These
VR gloves can sense and process touch parameters such as surface hardness, softness,
etc. These gloves can memorize a user’s finger movements by locking and unlocking
the finger joints as they interact in the VR environment. Later, the gloves can
replay the recorded touch sensations at varying intensities in real life.
5. Cloud computing
Today, companies across different fields are embracing remote task forces. According
to a ‘Breaking Barriers 2020’ survey by Fuze (An 8×8 Company), around 83% of
employees feel more productive working remotely. Considering the current trend,
conventional workplaces will witness a massive rejig and transform entirely in a
couple of decades. Thanks to cloud computing and human-computer interaction, such
flexible offices have become a reality.
Moreover, an employee can access data on the cloud from any physical location by
using cloud-based SaaS services. Such virtual settings streamline workflows and
support seamless collaboration with remote teams across industry verticals without
impacting productivity. Thus, with time, the idea of traditional offices may cease to
exist, mainly because of SaaS and HCI.
Goals of HCI
The principal objective of HCI is to develop functional systems that are usable, safe,
and efficient for end-users. The developer community can achieve this goal by
fulfilling the following criteria:
Design methods, techniques, and tools that allow users to access systems based
on their needs
Adjust, test, refine, validate, and ensure that users achieve effective
interaction with the system
Always give priority to end users and lay a robust foundation of HCI
1. Usability
Usability is key to HCI as it ensures that users of all types can quickly learn and use
computing systems. A practical and usable HCI system has the following
characteristics:
How to use it: The system should be easy to learn and remember for new and
infrequent users. For example, operating systems with a user-friendly graphical
interface are easier to understand than DOS operating systems that rely on
typed commands.
Safe: The system should be safe to use, even in risky situations. Users may make
mistakes and errors while using the system that can lead to severe consequences,
and HCI practices address this. For example, systems can be designed to prevent
users from making serious errors and to provide recovery plans once the user
commits a mistake, giving users the confidence to explore the system.
Efficient: An efficient system defines how good the system is and whether it
accomplishes the tasks that it is supposed to. It also illustrates how the
system supports users in completing their tasks.
Utility: Utility refers to the various functionalities and tools provided by the
system to complete the intended task.
Enjoyable: Users find the computing system enjoyable to use when the interface
is pleasant and the interaction feels natural.
2. User experience
User experience is a subjective trait that focuses on how users feel about the
computing system when interacting with it. Here, user feelings are studied
individually so that developers and support teams can target particular users to evoke
positive feelings while using the system.
HCI systems classify user interaction patterns into categories and further
refine the system based on the detected pattern.
INFORMATION MANAGEMENT
Information management is the process of acquiring, using, and sharing information
within an organization. It involves managing information to help a business meet its
goals.
Information technology (IT) has permeated virtually every aspect of modern life,
leading to a wide array of application domains. Here are some of the key areas where
IT plays a crucial role:
1. Healthcare:
2. Education:
6. Entertainment:
7. Security:
Information technology (IT) development and use introduce a wide range of security
issues. These issues can compromise data, systems, and even entire organizations.
Common Security Threats:
Generative AI, like that powering large language models (LLMs), is transforming
content creation and interactive experiences.
ML algorithms are becoming more sophisticated, leading to improved predictions
and personalized experiences.
Cybersecurity: With the rise of digital data and interconnected systems, cybersecurity is
paramount.
AI-driven threat detection and response systems are crucial for safeguarding digital
assets.
Emphasis on cybersecurity resilience, including securing remote work environments.
Cloud Computing:
5G and Connectivity:
Quantum Computing:
Quantum computing holds the potential to revolutionize computation, enabling
solutions to complex problems that are beyond the reach of classical computers.
Applications are emerging in areas like cryptography, drug discovery, and financial
modeling.