
COS 101: Introduction to Computing Sciences (3 Units C: LH 30; PH 45)

Learning Outcomes
At the end of the course, students should be able to:
1. explain basic components of computers and other computing devices;
2. describe the various applications of computers;
3. explain information processing and its roles in the society;
4. describe the Internet, its various applications and its impact;
5. explain the different areas of the computing discipline and their specializations; and
6. demonstrate practical skills in using computers and the Internet.
Course Contents
Brief history of computing. Description of the basic components of a computer/computing
device. Input/Output devices and peripherals. Hardware, software and human ware. Diverse
and growing computer/digital applications. Information processing and its roles in society.
The Internet, its applications and its impact on the world today. The different areas/programs
of the computing discipline. The job specializations for computing professionals. The future of
computing.
Lab Work: Practical demonstration of the basic parts of a computer. Illustration of different
operating systems of different computing devices including desktops, laptops, tablets, smart
boards and smart phones. Demonstration of commonly used applications such as word
processors, spreadsheets, presentation software and graphics. Illustration of input and output
devices including printers, scanners, projectors and smartboards. Practical demonstration of
the Internet and its various applications. Illustration of browsers and search engines. How to
access online resources.
Week 1: Introduction to Computing and a Brief History of Computing
• Overview of Computing: What is computing? A brief introduction to the role of
computing in society.
• The History of Computing:
o Early tools for computation (Abacus, the Antikythera mechanism).
o Key developments in computation: one of the earliest and most well-known
devices was the abacus. In 1822 Charles Babbage, often called the father of
computers, began developing what would become the first mechanical computer,
and in 1833 he designed the Analytical Engine, a general-purpose computer.
▪ Ada Lovelace’s contribution to programming.
▪ The Turing Machine and Alan Turing’s contributions.
o The evolution of computers:
▪ First Generation (Vacuum tubes).
▪ Second Generation (Transistors).
▪ Third Generation (Integrated Circuits).
▪ Fourth Generation (Microprocessors and Personal Computers).
▪ Fifth Generation (AI and Quantum Computing).
o Milestones: ENIAC, UNIVAC, IBM mainframes, personal computing
revolution.
Week 2: Basic Components of a Computer/Computing Device
• Overview of a Computer System:
o Definition of a computer as a programmable device for processing data.
o Central Processing Unit (CPU): The brain of the computer.
o Memory and Storage:
▪ Primary memory (RAM).
▪ Secondary storage (HDD, SSD).
▪ Cache and Virtual Memory.
o Motherboard: Connecting all components.
o Bus System: Communication channels between components.
o Power Supply: Ensures stable energy for the system.
Week 3: Input/Output Devices and Peripherals
• Input Devices:
o Keyboards, mice, touchpads, scanners, microphones, webcams.
o Emerging input technologies: speech recognition, gesture control.
• Output Devices:
o Monitors (CRT, LCD, LED).
o Printers (Inkjet, Laser, 3D printers).
o Speakers and headphones.
• Storage Devices as Peripherals:
o External hard drives, flash drives, cloud storage.
• Networking Devices: Routers, switches, modems.
• Human-Computer Interaction: Principles of designing effective user interfaces.
Week 4: Hardware, Software, and Human Ware
• Hardware: The physical components of a computer system.
o Explanation of different types of hardware: CPU, storage devices, networking
devices, etc.
• Software: The intangible components of a computer system.
o System Software: Operating Systems (Windows, Linux, macOS).
o Application Software: Programs that perform specific tasks (Word processors,
games, web browsers).
o Firmware: Embedded software in hardware devices.
• Human Ware: The human component in the computing system.
o The role of users, developers, engineers, and system administrators.
o Human-computer interaction and ergonomics.
Week 5: Diverse and Growing Computer/Digital Applications
• Application Domains:
o Business: Enterprise resource planning (ERP), customer relationship
management (CRM), e-commerce.
o Education: E-learning platforms, MOOCs, digital classrooms.
o Health Care: Telemedicine, Electronic Health Records (EHR), health
informatics.
o Entertainment: Video games, streaming services (Netflix, YouTube), digital
media production.
o Government and Society: E-Government, digital voting systems, public data
access.
o Social Media: Impact on communication, marketing, and society.
o Artificial Intelligence and Automation: Robotics, smart devices, machine
learning.
Week 6: Information Processing and Its Role in Society
• Information Processing:
o Definition: The transformation of data into meaningful information.
o Stages of Information Processing: Input, processing, storage, output.
o Data vs. Information: Understanding the difference between raw data and
processed information.
• Information and Decision-Making:
o Role of information in business decisions, government policy, and scientific
research.
• Ethical Issues in Information Processing:
o Privacy concerns, data security, and digital ethics.
o The rise of misinformation and its societal impacts.
Week 7: The Internet: Its Applications and Impact
• The Internet:
o Overview of the Internet: What it is, how it works (TCP/IP, DNS, HTTP,
routers).
o Evolution of the Internet: ARPANET, Web 1.0 to Web 3.0.
• Internet Applications:
o The World Wide Web (browsers, search engines, websites).
o Email, instant messaging, and social media platforms.
o E-commerce (Amazon, eBay), cloud services (Google Drive, Dropbox).
o The Internet of Things (IoT): Smart homes, connected devices.
• Impact of the Internet:
o Social: Communication, collaboration, and globalization.
o Economic: E-commerce, digital economy, job creation in tech.
o Political: Digital activism, online privacy, cyber security concerns.
Week 8: Areas of Computing Discipline and Programs
• Computer Science: Theoretical foundations, algorithms, programming languages.
• Software Engineering: Design, development, testing, and maintenance of software.
• Information Technology (IT): Managing IT infrastructure, networks, and systems.
• Cybersecurity: Protecting systems from hacking, data breaches, and cyber-attacks.
• Data Science: Analyzing large datasets to derive insights and inform decisions.
• Artificial Intelligence and Machine Learning: Development of intelligent systems and
automation.
• Human-Computer Interaction (HCI): User interface design, usability, and
accessibility.
• Computational Biology/Health Informatics: Using computing for biological research
and healthcare.
• Emerging Fields: Quantum computing, blockchain, and the Metaverse.
Week 9: Job Specializations for Computing Professionals
• Overview of Computing Careers:
o Software Developer, Systems Analyst, Network Engineer, Database
Administrator, AI Specialist.
• Specializations:
o Frontend vs. Backend Development.
o DevOps, Cloud Computing, IT Support.
o Cybersecurity Analyst, Ethical Hacker, Data Scientist.
o Game Developer, UX/UI Designer.
o Research and Academia: Computer Science Researcher, University Faculty.
• Skills Needed:
o Technical skills: Programming, systems administration, database
management, cloud services.
o Soft skills: Communication, teamwork, problem-solving, adaptability.
• Job Market Outlook:
o Current trends in tech job demand (AI, cybersecurity, cloud computing).
o High-growth fields and industries.
Week 10: The Future of Computing
• Emerging Technologies:
o Quantum Computing: Potential to revolutionize data processing power.
o Artificial Intelligence and Deep Learning: Impact on automation, decision-
making, and creativity.
o Blockchain: Distributed ledger technology and its impact on security and
finance.
o 5G and Edge Computing: Faster, more efficient networks, enabling new
applications.
• Future of Human-Computer Interaction:
o Augmented Reality (AR) and Virtual Reality (VR).
o Brain-computer interfaces (BCI).
• Ethical Considerations and Challenges:
o Privacy and surveillance in the digital age.
o The digital divide and ensuring equitable access to technology.
o The role of regulation and governance in the future of technology.
• The Role of Computing in Society:
o Sustainability and environmental impact of computing.
o The growing importance of data in driving decision-making.
o Opportunities and challenges in the digital transformation of industries.

A Brief History of Computers


Ancient Times
Early Man relied on counting on his fingers and toes (which, by the way, is the basis for our base-10 numbering system). He also used sticks and stones as markers. Later, notched sticks and knotted cords were used for counting. Finally came symbols written on hides, parchment, and later paper. Man invents the concept of number, then invents devices to help keep up with the numbers of his possessions.

Roman Empire

The ancient Romans developed an abacus, the first "machine" for calculating. While it predates the Chinese abacus, we do not know if it was the ancestor of that abacus. Counters in the lower groove count 1 × 10^n; those in the upper groove count 5 × 10^n.

Industrial Age - 1600

John Napier, a Scottish nobleman and politician, devoted much of his leisure time to the study of mathematics. He was especially interested in devising ways to aid computations. His greatest contribution was the invention of logarithms. He inscribed logarithmic measurements on a set of 10 wooden rods and thus was able to do multiplication and division by matching up numbers on the rods. These became known as Napier's Bones.

1621 - The Slide Rule

Napier invented logarithms, and Edmund Gunter invented the logarithmic scales (lines etched on metal or wood), but it was William Oughtred, in England, who invented the slide rule. Using the concept of Napier's Bones, he inscribed logarithms on strips of wood and invented the calculating "machine" that was used up until the mid-1970s, when the first hand-held calculators and microcomputers appeared.
1642 - Blaise Pascal (1623-1662)

Blaise Pascal, a French mathematical genius, at the age of 19 invented a machine, which he called the Pascaline, that could do addition and subtraction to help his father, who was also a mathematician. Pascal's machine consisted of a series of gears with 10 teeth each, representing the numbers 0 to 9. As each gear made one turn it would trip the next gear up to make 1/10 of a revolution. This principle remained the foundation of all mechanical adding machines for centuries after his death. The Pascal programming language was named in his honor.

1673 - Gottfried Wilhelm von Leibniz (1646-1716)

Gottfried Wilhelm von Leibniz invented differential and integral calculus independently of Sir Isaac Newton, who is usually given sole credit. He invented a calculating machine known as Leibniz's Wheel or the Step Reckoner. It could add and subtract, like Pascal's machine, but it could also multiply and divide. It did this by repeated additions or subtractions, the way mechanical adding machines of the mid to late 20th century did. Leibniz also invented something essential to modern computers: binary arithmetic.
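The Step Reckoner's trick of multiplying by repeated addition is easy to mimic in software. Here is a minimal sketch of the principle (the function and its non-negative-integer assumption are mine, purely for illustration):

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by repeated addition,
    the same principle the Step Reckoner applied mechanically."""
    total = 0
    for _ in range(b):   # add a to the running total, b times
        total += a
    return total

print(multiply(6, 7))    # 42
```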

1725 - The Bouchon Loom


Basile Bouchon, the son of an organ maker, worked in the textile industry. At this time fabrics with very intricate patterns woven into them were very much in vogue. Weaving a complex pattern, however, involved somewhat complicated manipulations of the threads in a loom, which frequently became tangled, broken, or out of place. Bouchon observed the paper rolls with punched holes that his father made to program his player organs and adapted the idea as a way of "programming" a loom. The paper passed over a section of the loom, and where the holes appeared certain threads were lifted. As a result, the pattern could be woven repeatedly. This was the first punched-paper stored program. Unfortunately the paper tore and was hard to advance, so Bouchon's loom never really caught on and eventually ended up in the back room collecting dust.

1728 - Falcon Loom

In 1728 Jean-Baptiste Falcon substituted a deck of punched cardboard cards for the paper roll of Bouchon's loom. This was much more durable, but the deck of cards tended to get shuffled and it was tedious to continuously switch cards. So Falcon's loom ended up collecting dust next to Bouchon's loom.
1804 - Joseph Marie Jacquard (1752-1834)

It took inventor Joseph M. Jacquard to bring together Bouchon's idea of a continuous punched roll and Falcon's idea of durable punched cards to produce a really workable programmable loom. Weaving operations were controlled by punched cards tied together to form a long loop, and you could add as many cards as you wanted. Each time a thread was woven in, the roll was clicked forward by one card. The results revolutionized the weaving industry and made a lot of money for Jacquard. This idea of punched data storage was later adapted for computer data input.

1822 – Charles Babbage (1791-1871) and Ada Augusta, The Countess of Lovelace

Charles Babbage is known as the Father of the modern computer (even though none of his computers worked or were even constructed in their entirety). He first designed plans to build what he called the Automatic Difference Engine. It was designed to help in the construction of mathematical tables for navigation. Unfortunately, engineering limitations of his time made it impossible for the computer to be built. His next project was much more ambitious.

While a professor of mathematics at Cambridge University (a post later held by Stephen Hawking), a position he never actually occupied, he proposed the construction of a machine he called the Analytical Engine. It was to have punched card input, a memory unit (called the store), an arithmetic unit (called the mill), automatic printout, sequential program control, and 20-place decimal accuracy. He had actually worked out a plan for a computer 100 years ahead of its time. Unfortunately it was never completed; it had to wait for manufacturing technology to catch up to his ideas.

During a nine-month period in 1842-1843, Ada Lovelace translated Italian mathematician Luigi Menabrea's memoir on Charles Babbage's Analytical Engine. To her translation she appended a set of notes which specified in complete detail a method for calculating Bernoulli numbers with the Engine. Historians now recognize this as the world's first computer program and honor her as the first programmer. Too bad she has such an ill-received programming language named after her.
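Lovelace's notes computed Bernoulli numbers; here is a modern sketch of the same mathematical task (this uses the standard recurrence over binomial coefficients, and is not a reconstruction of her actual program):

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list[Fraction]:
    """Return B_0..B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(6))  # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```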

1880s – Herman Hollerith (1860-1929)


The computer trail next takes us to, of all places, the U.S. Bureau of the Census. In 1880, taking the U.S. census proved to be a monumental task. By the time it was completed, it was almost time to start over for the 1890 census. To try to overcome this problem the Census Bureau hired Dr. Herman Hollerith. In 1887, using Jacquard's idea of punched card data storage, Hollerith developed a punched card tabulating system, which allowed the census takers to record all the information needed on punched cards, which were then placed in a special tabulating machine with a series of counters. When a lever was pulled, a number of pins came down on the card. Where there was a hole, the pin went through the card, made contact with a tiny pool of mercury below, and tripped one of the counters forward by one. With Hollerith's machine the 1890 census tabulation was completed in 1/8 the time. And they checked the count twice.

After the census, Hollerith turned to using his tabulating machines for business, and in 1896 organized the Tabulating Machine Company, which later merged with other companies to become IBM. His contribution to the computer, then, is the use of punched card data storage. BTW: the punched cards used in computers were made the same size as those of Hollerith's machine, and Hollerith chose that size because it was the same size as the one-dollar bill at the time, so he could find plenty of boxes just the right size to hold the cards.

1939-1942 – Dr. John Vincent Atanasoff (1903-1995) and Clifford Berry (1918-1963)

Dr. John Vincent Atanasoff and his graduate assistant, Clifford Berry, built the first truly electronic computer, called the Atanasoff-Berry Computer or ABC. Atanasoff said the idea came to him as he was sitting in a small roadside tavern in Illinois. This computer used a circuit with 45 vacuum tubes to perform the calculations, and capacitors for storage. It was also the first computer to use binary math.

1943 – Colossus I

The first really successful electronic computer was built at Bletchley Park, England. It was capable of performing only one function: code breaking during World War II. It could not be re-programmed.

1944 – Mark I - Howard Aiken (1900-1973) and Grace Hopper (1906-1992)

In 1944 Dr. Howard Aiken of Harvard finished the construction of the Automatic Sequence Controlled Calculator, popularly known as the Mark I. It contained over 3000 mechanical relays and was the first electro-mechanical computer capable of making logical decisions, like "if x == 3 then do this", not like "if it's raining outside I need to carry an umbrella". It could perform an addition in 3/10 of a second. Compare that with something on the order of a couple of nanoseconds (billionths of a second) today.

The important contribution of this machine was that it was programmed by means of a punched paper tape, and the instructions could be altered. In many ways, the Mark I was the realization of Babbage's dream.

One of the primary programmers for the Mark I was Grace Hopper. One day the Mark I was malfunctioning and not reading its paper tape input correctly. Ms. Hopper checked out the reader and found a dead moth in the mechanism, its wings blocking the reading of the holes in the paper tape. She removed the moth, taped it into her log book, and recorded: "Relay #70 Panel F (moth) in relay. First actual case of bug being found."

She had debugged the program, and while the word "bug" had been used to describe defects since at least 1889, she is credited with coining the word "debugging" to describe the work of eliminating program errors.

It was Howard Aiken who, in 1947, made the rather short-sighted comment to the effect that the computer is a wonderful machine, but he could see that six such machines would be enough to satisfy all the computing needs of the entire United States.

1946 – ENIAC - J. Presper Eckert (1919-1995) and John W. Mauchly (1907-1980)

The first all-electronic computer was the Electronic Numerical Integrator and Computer, known as ENIAC. It was designed by J. Presper Eckert and John W. Mauchly of the Moore School of Engineering at the University of Pennsylvania. ENIAC was the first multipurpose electronic computer, though very difficult to re-program. It was primarily used to compute aircraft courses and shell trajectories, and to break codes during World War II. ENIAC occupied a 20 x 40 foot room and used 18,000 vacuum tubes. It also could never be turned off; if it was, it blew too many tubes when turned back on. It had a very limited storage capacity, and it was programmed by jumper wires plugged into a large board.

1948 – The Transistor

In 1948 an event occurred that was to forever change the course of computers and electronics. Working at Bell Labs, three scientists, John Bardeen (1908-1991), Walter Brattain (1902-1987), and William Shockley (1910-1989), invented the transistor.

The changeover from vacuum tube circuits to transistor circuits occurred between 1956 and 1959. This brought in the second generation of computers, those based on transistors. The first generation had been mechanical and vacuum tube computers.

1951 – UNIVAC

The first practical electronic computer was built by Eckert and Mauchly (of ENIAC fame) and was known as UNIVAC (UNIVersal Automatic Computer). The first UNIVAC was used by the Bureau of the Census. The unique feature of the UNIVAC was that it was not a one-of-a-kind computer; it was mass produced.

1954 – IBM 650


In 1954 the first electronic computer for business was installed at General Electric's Appliance Park in Louisville, Kentucky. This year also saw the beginning of operation of the IBM 650 in Boston. This comparatively inexpensive computer gave IBM the lead in the computer market; over 1000 650s were sold.

1957-59 – IBM 704

From 1957 to 1959 the IBM 704 computer appeared, for which the Fortran language was developed. At this time the state of the art in computers allowed one component per chip, that is, individual transistors.

1958 - 1962 – Programming languages

From 1958 to 1962 many programming languages were developed:

FORTRAN (FORmula TRANslator)
COBOL (COmmon Business Oriented Language)
LISP (LISt Processor)
ALGOL (ALGOrithmic Language)
BASIC (Beginner's All-purpose Symbolic Instruction Code)

1964 – IBM System/360

In 1964 the beginning of the third generation of computers came with the introduction of the IBM System/360. Thanks to its new hybrid circuits, the state of the art in computer technology allowed for 10 components per chip.

1965 - PDP-8

In 1965 the first integrated circuit computer, the PDP-8 from Digital Equipment Corporation, appeared. (PDP stands for Programmed Data Processor.) After this, the real revolution in computer cost and size began.

1970 - Integrated Circuits


By the early 70s the state of the art in computer technology allowed for 1000 components per chip. To get an idea of just how much the size of electronic components had shrunk by this time, a photograph from the period shows a woman peering through a microscope at a 16K RAM memory integrated circuit; the stand her microscope sits on is a 16K vacuum tube memory circuit from about 20 years previous.

1971

The Intel Corporation produced the first microprocessor chip, a 4-bit chip. (Today's chips are 64-bit.) At approximately 1/16 x 1/8 inches in size, this chip contained 2,300 transistors and had all the computing power of ENIAC. It matched IBM computers of the early 60s that had a CPU the size of an office desk.

1975 – Altair 8800

The January 1975 issue of Popular Electronics carried the first article to describe the Altair 8800, the first low-cost microprocessor computer, which had just become commercially available.

Late 1970s to early 1980s – The Microcomputer Explosion

During this period many companies appeared and disappeared, manufacturing a variety of microcomputers (they were called "micro" to distinguish them from the mainframes, which some people referred to as real computers). There was Radio Shack's TRS-80, the Commodore 64, the Atari, but...

1977 - The Apple II


The most successful of the early microcomputers was the Apple II, designed and built by Steve Wozniak. With his fellow computer whiz and business-savvy friend Steve Jobs, he started Apple Computer in 1977 in Woz's garage. Less than three years later the company earned over $100 million. Not bad for a couple of college-dropout computer geeks.
1981
In 1981 IBM produced its first microcomputer. Then the clones started to appear. This microcomputer explosion fulfilled the slogan "computers by the millions for the millions." Compared to ENIAC, microcomputers of the early 80s:
Were 20 times faster (the Apple II ran at a speed of ¼ Megahertz).
Had a memory capacity as much as 16 times larger (the Apple had 64 K).
Were thousands of times more reliable.
Consumed the power of a light bulb instead of a locomotive.
Were 1/30,000 the size.
Cost 1/10,000 as much in comparable dollars.
(An Apple II with a full 64 K of RAM cost $1200 in 1979; that's the equivalent of about $8000 to $10,000 in today's dollars.)

1984-1989

In 1984 the Macintosh was introduced. This was the first mass-produced, commercially available computer with a Graphical User Interface. In 1985 Windows 1.0 was introduced for the PC. It was sort of Mac-like but greatly inferior; Macintosh owners were known to refer to it sarcastically as "AGAM-84": Almost as Good As Macintosh '84.

1990s
Compared to ENIAC, microcomputers of the 90s:
Were 36,000 times faster (450 Megahertz was the average speed).
Had a memory capacity 1000 to 5000 times larger (the average was between 4 and 20 Megabytes).
Were 1/30,000 the size.
Cost 1/30,000 as much in comparable dollars (a PC still cost around $1500, the equivalent of about $2500 in 2008 dollars).
Early 2000s
Compared to ENIAC, microcomputers of the early 2000s:
Are 180,000 times faster (2.5+ Gigahertz is the average speed).
Have a memory capacity 25,000 times larger (average 1+ Gigabytes of RAM).
Are 1/30,000 the size.
Cost 1/60,000 as much in comparable dollars (a PC can cost from $700 to $1500).
Data Storage
Data storage has also grown in capacity and shrunk in size as dramatically as computers themselves. Today a single data DVD will hold around 4.8 gigabytes; it would take 90,000,000 punch cards to hold the same amount of data. And there is talk of a new high-density video disk (HVD) that will be able to hold fifty times that much data, about 240 gigabytes.

Just how much data is that?

8 bits = 1 byte
1024 bytes = 1 kilobyte (KB)
1024 KB = 1 megabyte (MB) = 1,048,576 bytes
1024 MB = 1 gigabyte (GB) = 1,073,741,824 bytes
1024 GB = 1 terabyte (TB) = 1,099,511,627,776 bytes
1024 TB = 1 petabyte (PB) = 1,125,899,906,842,624 bytes
1024 PB = 1 exabyte (EB) = 1,152,921,504,606,846,976 bytes
1024 EB = 1 zettabyte (ZB) = 1,180,591,620,717,411,303,424 bytes
1024 ZB = 1 yottabyte (YB) = 1,208,925,819,614,629,174,706,176 bytes
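Since each unit is 1024 times the previous one, these byte counts are just successive powers of 2^10. A quick throwaway sketch to verify the table (not part of the original notes):

```python
# Each binary storage unit is 1024 (= 2**10) times the one before it.
units = ["kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]
for i, name in enumerate(units, start=1):
    print(f"1 {name} = {1024 ** i:,} bytes")
# e.g. 1 gigabyte = 1,073,741,824 bytes
```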

Details of Functional Components of a Digital Computer

Input Unit: The input unit consists of the input devices attached to the computer. These devices take input and convert it into binary language that the computer understands. Some of the common input devices are the keyboard, mouse, joystick, scanner, etc.
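For example, every character typed at the keyboard reaches the processor as a pattern of bits. A tiny illustration using Python's built-in ord and format (the example string is arbitrary):

```python
# Each character of input is represented inside the computer in binary.
for ch in "Hi":
    print(ch, "->", format(ord(ch), "08b"))   # H -> 01001000, i -> 01101001
```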
Central Processing Unit (CPU): Once the information is entered into the computer by an input device, the processor processes it. The CPU is called the brain of the computer because it is the control center of the computer. It first fetches instructions from memory and then interprets them so as to know what is to be done. If required, data is fetched from memory or from an input device. Thereafter the CPU executes or performs the required computation and then either stores the output or displays it on the output device. The CPU has three main components, which are responsible for different functions: the Arithmetic Logic Unit (ALU), the Control Unit (CU), and memory registers.
Arithmetic and Logic Unit (ALU): The ALU, as its name suggests, performs mathematical calculations and takes logical decisions. Arithmetic calculations include addition, subtraction, multiplication, and division. Logical decisions involve comparing two data items to see which one is larger, smaller, or equal.
Control Unit: The Control Unit coordinates and controls the flow of data in and out of the CPU, and also controls all the operations of the ALU, memory registers, and input/output units. It is also responsible for carrying out all the instructions stored in the program. It decodes the fetched instruction, interprets it, and sends control signals to input/output devices until the required operation is done properly by the ALU and memory.
Memory Registers: A register is a temporary unit of memory in the CPU. Registers are used to store the data that is directly used by the processor. Registers can be of different sizes (16-bit, 32-bit, 64-bit, and so on), and each register inside the CPU has a specific function, like storing data, storing an instruction, or storing the address of a location in memory. The user registers can be used by an assembly language programmer for storing operands, intermediate results, etc. The accumulator (ACC) is the main register in the ALU and contains one of the operands of an operation to be performed in the ALU.
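Putting the CPU, control unit, ALU, and registers together, below is a toy fetch-decode-execute loop. The three-instruction machine, its instruction names, and the memory layout are all invented for illustration; real CPUs implement this cycle in hardware.

```python
# A toy fetch-decode-execute cycle for an invented accumulator machine.
program = [("LOAD", 5), ("ADD", 3), ("PRINT", None), ("HALT", None)]

acc = 0   # accumulator register: holds the current operand/result
pc = 0    # program counter register: address of the next instruction

while True:
    opcode, operand = program[pc]   # fetch the instruction at address pc
    pc += 1
    if opcode == "LOAD":            # decode and execute
        acc = operand
    elif opcode == "ADD":
        acc += operand              # the ALU's job: arithmetic
    elif opcode == "PRINT":
        print(acc)                  # send the result to the output unit -> 8
    elif opcode == "HALT":
        break                       # the control unit stops the cycle
```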
Memory: Memory attached to the CPU is used for storage of data and instructions and is called internal memory. The internal memory is divided into many storage locations, each of which can store data or instructions. Each memory location is the same size and has an address. With the help of the address, the computer can read any memory location easily without having to search the entire memory. When a program is executed, its data is copied to the internal memory and is stored there until the end of the execution. The internal memory is also called primary memory or main memory. Because the time to access data is independent of its location in memory, this memory is also called Random Access Memory (RAM).
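The "random access" property is easy to demonstrate: reading or writing any address takes the same single step, wherever it sits in memory. A minimal sketch (the list-as-memory model is an illustration, not how RAM is physically built):

```python
# Model main memory as a list of cells, addressed by index.
memory = [0] * 1024               # 1024 addressable cells, all initially 0
memory[42] = 7                    # write the value 7 to address 42
memory[1000] = 9                  # writing far away costs exactly the same
print(memory[42], memory[1000])   # reads are one step each -> 7 9
```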
A general-purpose computer follows a set of software instructions, which describes essentially every computer. It can perform any data processing operation that the instructions in a program tell it to do. However, there are countless appliances, gadgets, toys, and devices with computer processors that follow only one set of instructions permanently built into the chip. Such computers are known as "microcontrollers." They are produced by the billions each year and are not user-programmable.
A computer includes some basic elements: hardware, software, programs, data, and connectivity. No computer can operate in the absence of these elements. Beyond this, the components of a computer system are the fundamental elements that make the functioning of electronic equipment smooth and fast. There are three basic components:
1. Input Unit
2. Output Unit
3. CPU (Central Processing Unit)
While there are additional components as well, these three are primarily responsible for making a computer function. Hence, they are also called the building blocks of a computer system. All types of computers follow the same basic logical structure:
1. Take Input -> The method of inserting data and instructions into the computer system.
2. Store Data -> Collecting data and instructions so that they are ready for processing as and when needed.
3. Process Data -> Performing arithmetic and logical operations on data to transform it into useful information.
4. Output Information -> Generating useful information or results for the user, such as a printed report or visual display.
5. Control the Workflow -> Managing the manner and sequence in which all of the preceding operations are performed.
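A minimal program that walks through this input -> process -> output cycle (the prompts and the addition task are arbitrary examples):

```python
# 1. Take input: read two numbers from the user.
a = float(input("First number: "))
b = float(input("Second number: "))

# 2-3. Store and process: the values sit in memory while the CPU adds them.
total = a + b

# 4. Output information: display the result for the user.
print(f"Sum: {total}")
```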
Input Unit
This part of the computer encapsulates devices with the help of which the user feeds data to the
computer. It creates an interface between the user and the computer. The input devices transform
the information into a form acceptable by the computer. Data can be in the form of numbers,
actions, words, directions, instructions, etc. Computers then practice their CPU to process this data
and deliver output.
For instance, a computer keyboard is an input unit that enters symbols, numbers, and characters.
Likewise, even a mouse works as an input unit for entering commands and directions. Other
examples of input devices include JoyStick, Optical Mark Reader (OMR), Light pen, Magnetic
Ink Card Reader (MICR), Track Ball, Graphic Tablet, Scanner, Microphone, Optical Character
Reader (OCR), Barcode Reader, etc.
Central Processing Unit (CPU)
The Central Processing Unit or CPU is also known as the brain of the computer. CPU executes all
types of data processing functions. It saves data/intermediate results/instructions (program) and
controls the operation of all parts of the computer.
Following are the points to remember for Central Processing Unit (CPU):
1. The CPU is taken as the brain of the computer.
2. CPU facilitates all types of data processing operations.
3. It saves data, intermediate results, and instructions (program).
4. It controls the operation of all parts of the computer.
The CPU itself has the following three components.
1. Memory or Storage Unit
2. Control Unit
3. ALU (Arithmetic Logic Unit)
Memory or Storage Unit
This part of the computer system works to store instructions, data, and intermediate results. This
unit passes data to other parts of the computer when required. It is also referred to as an internal
storage unit or most commonly, the main memory or the primary storage or Random Access
Memory (RAM).
It comes in various speeds, power, and capability. Primary memory and secondary memory are
two important types of memories used in the computer system. Responsibilities of the memory
unit are:
1. Works to store all the data and the instructions required for processing.
2. Works to store intermediate results of processing.
3. Works to store the final results of processing before these results are forwarded to an output
device.
4. All inputs and outputs are supplied through the main memory.
Control Unit
This unit manages the operations of all parts of the computer but does not carry out any calculations
or comparisons or actual data processing operations.
Responsibilities of this unit are:
1. It facilitates the transfer of data and instructions among other units of the system.
2. It manages and correlates all the units of the system.
3. It receives the instructions from the memory, interprets them, and directs the operation of
the system.
4. It interacts with Input/output units to transfer data/results from storage.
5. It does not perform processes or store data.
Arithmetic Logic Unit (ALU)
This unit consists of two subsections, namely:
1. Arithmetic Section: The responsibility of the arithmetic unit is to execute arithmetic operations like addition, subtraction, multiplication, and division. More complex operations are executed by making iterative use of these basic operations.
2. Logic Section: The responsibility of the logic unit is to execute logic operations like comparing, selecting, matching, and merging data.
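The two subsections map directly onto the arithmetic and comparison operators of any programming language; a small illustration (the values are chosen arbitrarily):

```python
x, y = 12, 5

# Arithmetic section: the four basic operations.
print(x + y, x - y, x * y, x / y)   # 17 7 60 2.4

# Logic section: comparisons that let the program select and decide.
print(x > y, x == y)                # True False
larger = x if x > y else y          # selecting the larger item
print(larger)                       # 12
```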
Output Unit
This part of the computer encapsulates the devices with the help of which the user receives information from the computer. Output devices transform the output from the computer into a form understandable by users. Thus, output units present the data formatted by the computer according to the users' needs.
Some of the output devices are: monitor, printer, projector, speakers, headphones, etc.
Interconnection between Functional Components

A computer consists of an input unit that takes input, a CPU that processes the input, and an output unit that produces output. All these devices communicate with each other through a common bus. A bus is a transmission path, made of a set of conducting wires, over which data or information in the form of electric signals is passed from one component to another in a computer. The bus can be of three types: the address bus, the data bus, and the control bus.

MOTHERBOARD
A motherboard (also called the main board or system board) is the basic foundation of a computer that connects all the crucial components or parts of a system. It performs the following significant functions:
• Distributing power from the power supply to all hardware components.
• Transferring of data and instructions between various hardware components.
• Providing various sockets and pads for mounting electronic components.
• Offering expansion slots to add other components, such as graphics cards, network cards, etc.
Older desktop computers had very few components integrated onto the motherboard; they needed a large number of adapter cards for interfacing video, hard disks, and floppy disks. As technology advanced, more and more interfaces were accommodated on the motherboard and fewer adapter cards were needed.
Nowadays, almost all of the electronic components, such as the CPU, RAM, expansion slots, heat sink/fan assembly, and BIOS chip, are integrated onto the motherboard of all personal computers (PCs). The motherboard also holds the expansion bus, Input/Output (I/O) interfaces, drive controllers, and system memory.
In this section, we will look at the different components of a computer motherboard, what they do, and where they are located on the motherboard.
Hardware Components of Computer Motherboard with Functions

A typical computer motherboard contains the following electronic components or parts:
• Chipsets
• CPU or processor sockets or slots
• Memory slots
• Expansion slots
• BIOS chip
• CMOS battery
• Power connectors
• Keyboard and mouse connectors
• Onboard disk drive connectors
• Peripheral ports and connectors
• Jumpers and DIP switches
• Case fan and Heat sink

Let’s understand each component of the motherboard in brief.


Chipsets
A chipset is a set of semiconductor chips (or circuits) on the motherboard that provides interfaces
for memory, expansion cards, and other peripheral devices. It is the foundation of the motherboard
and made up of one or several integrated circuit chips.
It works closely with the CPU processor to collectively control the memory, buses on the
motherboard, and some onboard peripheral devices. Therefore, a chipset on the motherboard must
be compatible with the processor that it serves.
The chipset and socket determine what type of processor a board can support, how fast it will run, how fast the buses will run, and the speed, type, and amount of memory we can have. The original manufacturers, such as Intel and AMD, usually give the name and model number to the chipsets.
We can divide the functions of a chipset into two main categories: Northbridge and Southbridge. Let's take a brief look at both.
Bus System: Communication channels between components.
A bus is a group of parallel wires along which data can flow. The system bus is made up of a number of such communication channels that connect the processor and other components, such as memory and input and output devices, together. A computer will normally have several buses that are used for specific purposes.
Power Supply: Ensures stable energy for the system.
Power supplies ensure that electronic circuits receive stable and consistent voltage and current levels. If the voltage isn't regulated, voltage spikes and fluctuations can cause serious damage.

Week 3: Lecture Notes: Input/Output Devices and Peripherals


1. Input Devices
Input devices allow users to interact with a computer by entering data, commands, or control
signals. Their functionality and diversity play a critical role in computer usability.
a. Common Input Devices
1. Keyboards
o Functionality: A keyboard is the primary tool for text-based input. It allows users
to type characters, issue commands, and navigate software through shortcuts.
o Components:
▪ Alphanumeric Keys: For typing letters and numbers.
▪ Function Keys (F1–F12): Perform specific tasks in software.
▪ Control Keys: Include Shift, Ctrl, and Alt for modifying other keys'
behavior.
▪ Navigation Keys: Arrow keys, Home, End, Page Up/Down for document
navigation.
o Types:
▪ QWERTY Keyboards: Standard layout widely used globally.
▪ Ergonomic Keyboards: Designed to minimize strain by providing a natural
typing posture.
▪ Mechanical Keyboards: Use physical switches for durability and tactile
feedback.
▪ Virtual Keyboards: On-screen keyboards for touch-enabled devices.
2. Mice
o Functionality: A mouse enables precise cursor movement and interaction with
graphical user interfaces (GUIs).
o Components:
▪ Buttons: Left and right buttons for selection and context menus.
▪ Scroll Wheel: Facilitates vertical or horizontal navigation.
▪ Sensor: Tracks movement, either via optical or laser technology.
o Types:
▪ Optical Mouse: Relies on light reflection for movement detection.
▪ Trackball Mouse: A stationary device with a rolling ball to control the
pointer.
▪ Wireless Mouse: Eliminates cables, using Bluetooth or RF communication.
3. Touchpads
o Overview: Found on laptops, touchpads replace mice by detecting finger gestures.
o Features:
▪ Gestures: Multi-touch support for zooming, rotating, and swiping.
▪ Click Zones: Integrated left and right-click functionality.
4. Scanners
o Purpose: Converts physical documents, images, or objects into digital formats.
o Types:
▪ Flatbed Scanners: Used for high-quality scans of photos or documents.
▪ Handheld Scanners: Portable and used for smaller scanning tasks.
▪ Sheet-Fed Scanners: Designed for high-volume document scanning.
o Applications: Digitization of records, image editing, and archiving.
5. Microphones
o Purpose: Captures audio for communication, recording, or processing.
o Types:
▪ Dynamic Microphones: Durable, suitable for live sound recording.
▪ Condenser Microphones: More sensitive, ideal for studio recordings.
o Uses:
▪ Voice recognition software.
▪ Video conferencing tools like Zoom or Teams.
▪ Gaming communication platforms.
6. Webcams
o Functionality: Captures live video, often used for communication, security, or
streaming.
o Types:
▪ Integrated Webcams: Built into laptops and tablets.
▪ Standalone Webcams: External USB-connected devices.
o Applications: Online meetings, streaming, and remote surveillance.
b. Emerging Input Technologies
1. Speech Recognition
o Definition: Converts spoken language into text or commands using natural
language processing (NLP).
o Mechanism: Microphones capture audio, which is processed by AI algorithms to
recognize words (a minimal code sketch follows this list).
o Applications:
▪ Virtual Assistants: Alexa, Siri, and Google Assistant.
▪ Accessibility Tools: Enabling hands-free interaction for differently-abled
users.
▪ Customer Support: Voice-driven menus and chatbot integration.
2. Gesture Control
o Definition: Interprets physical gestures, such as hand movements, for device
control.
o Technologies Used:
▪ Cameras and depth sensors for capturing movement.
▪ AI for gesture recognition and response.
o Applications:
▪ Gaming consoles like Microsoft Kinect.
▪ AR/VR systems for immersive interaction.
▪ Smart TVs and home automation.
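To make the speech recognition mechanism above concrete, here is a sketch using the third-party Python package SpeechRecognition; the package choice and the Google Web Speech backend are illustrative assumptions, not part of the course notes, and it needs a microphone plus an internet connection:

```python
# pip install SpeechRecognition pyaudio   (third-party packages, illustrative choice)
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:     # capture audio from the default microphone
    print("Say something...")
    audio = recognizer.listen(source)

try:
    # Send the captured audio to the free Google Web Speech API for transcription.
    print("You said:", recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Could not understand the audio.")
```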
2. Output Devices
Output devices present processed data to the user in formats such as text, audio, video, or printed
material.
a. Monitors
1. CRT (Cathode Ray Tube)
o Mechanism: Uses electron beams to light up phosphorescent screens.
o Advantages: High contrast and deep color rendering.
o Disadvantages: Bulky, heavy, and energy-inefficient.
2. LCD (Liquid Crystal Display)
o Mechanism: Utilizes liquid crystals illuminated by a backlight.
o Advantages: Slim, lightweight, and energy-efficient.
o Disadvantages: Limited viewing angles and color accuracy compared to modern
displays.
3. LED (Light Emitting Diode)
o Mechanism: Uses LEDs for backlighting.
o Advantages: Better brightness, color accuracy, and slim design.
o Applications: Widely used in high-definition TVs and monitors.
b. Printers
1. Inkjet Printers
o How It Works: Sprays liquid ink droplets onto paper.
o Advantages: Affordable, high-quality photo printing.
o Limitations: Slower printing speed and higher cost per page.
2. Laser Printers
o How It Works: Uses toner powder and heat to fuse text or images onto paper.
o Advantages: Fast, cost-effective for large print volumes.
o Applications: Offices and businesses.
3. 3D Printers
o How It Works: Builds objects layer by layer from materials like plastic, resin, or
metal.
o Applications: Manufacturing, medical implants, and prototyping.
c. Speakers and Headphones
1. Speakers
o Converts electrical audio signals into sound waves.
o Used for music, movies, and virtual meetings.
2. Headphones
o Offers private audio for individual users.
o Types:
▪ Over-Ear: Encloses the ears for immersive sound.
▪ In-Ear: Portable and lightweight.
▪ Noise-Canceling: Blocks external noise for a focused experience.
3. Storage Devices as Peripherals
a. External Hard Drives
• Large-capacity storage devices for data backups.
• Typically connected via USB or Thunderbolt.
b. Flash Drives
• Small, portable, and durable storage devices.
• Suitable for quick file transfers.
c. Cloud Storage
• Data stored on remote servers accessible via the internet.
• Advantages include scalability and remote access.
4. Networking Devices
a. Routers
• Direct network traffic and connect devices to the internet.
b. Switches
• Manage communication between devices in local networks.
c. Modems
• Convert signals for internet connectivity.
5. Human-Computer Interaction (HCI)
1. Usability Principles:
o Simplicity, consistency, and feedback mechanisms.
2. Trends:
o Voice control, gesture-based systems, and AR/VR.
Week 4: Hardware, Software, and Human Ware

Hardware: The Physical Components of a Computer System


Hardware refers to the tangible, physical parts of a computer system that can be seen and touched.
These components work together to process, store, and transmit data. The primary categories of
hardware include:
1. Central Processing Unit (CPU)
• Definition: The CPU, often called the "brain" of the computer, is responsible for executing
instructions and performing calculations.
• Components:
o Control Unit (CU): Directs operations within the computer, interpreting
instructions from software.
o Arithmetic Logic Unit (ALU): Performs mathematical calculations and logical
operations.
o Registers: Small, high-speed storage locations for temporary data storage during
processing.
• Types:
o Single-core CPUs
o Multi-core CPUs (dual-core, quad-core, etc.)
2. Storage Devices
• Definition: Devices used to store data, instructions, and information for short-term or long-
term use.
• Types:
o Primary Storage: Temporary memory used for immediate processing, e.g., RAM
(volatile).
o Secondary Storage: Permanent storage, e.g., HDD, SSD, USB drives.
o Tertiary Storage: For archival and backup purposes, e.g., magnetic tapes, optical
discs (DVDs, Blu-ray).
3. Input and Output Devices
• Input Devices: Enable users to provide data to the computer, e.g., keyboard, mouse,
scanner.
• Output Devices: Display or output data to users, e.g., monitor, printer, speakers.
4. Networking Devices
• Definition: Devices that facilitate data communication between computers and networks.
• Examples:
o Routers: Connect different networks and manage traffic between them.
o Switches: Connect devices within a local network.
o Modems: Convert data for transmission over telecommunication lines.
5. Peripheral Devices
• Definition: External hardware connected to the computer to enhance its functionality.
• Examples:
o External storage (portable SSDs)
o Multimedia devices (cameras, microphones)
o Specialized devices (graphic tablets, VR headsets)

Software: The Intangible Components of a Computer System


Software refers to the set of instructions and data that enable hardware to perform specific tasks.
Unlike hardware, software is intangible and exists in the form of code and data files.
1. System Software
• Definition: Software that manages hardware and provides a platform for other software.
• Examples:
o Operating Systems (OS): Act as intermediaries between hardware and application
software.
▪ Windows: Known for its graphical user interface and compatibility with
various hardware.
▪ Linux: Open-source OS with multiple distributions (Ubuntu, Fedora)
popular in servers.
▪ macOS: Exclusive to Apple devices, noted for its stability and design.
o Utility Programs: Software for maintenance tasks, e.g., antivirus software, disk
cleanup tools.
2. Application Software
• Definition: Software designed to perform specific user-oriented tasks.
• Examples:
o Productivity Tools: Word processors (Microsoft Word), spreadsheets (Excel).
o Entertainment: Video games, multimedia players (VLC).
o Web-based: Browsers (Google Chrome, Firefox).
3. Firmware
• Definition: Specialized software embedded in hardware devices to control their operations.
• Examples:
o BIOS/UEFI in computers for boot-up processes.
o Firmware in printers to manage printing operations.
o Embedded systems in appliances like washing machines and microwaves.

Human Ware: The Human Component in the Computing System


Human ware refers to the people involved in the operation, development, and maintenance of
computer systems.
1. Roles in Computing Systems
• Users: Individuals who interact with computer systems to achieve specific goals, e.g.,
students, professionals, and gamers.
• Developers: Programmers and software engineers who design and create software
applications.
• System Administrators: IT professionals responsible for managing and maintaining
computer systems and networks.
• Engineers: Hardware and software engineers who design and develop computing devices
and software.
Human-Computer Interaction (HCI)
Definition:
Human-Computer Interaction (HCI) is a multidisciplinary field focused on the design, evaluation,
and implementation of interactive computing systems for human use. It examines how people
interact with technology and aims to improve the user experience.
Key Goals of HCI
1. Enhancing Usability: Making systems intuitive, efficient, and easy to use.
2. Reducing Errors: Designing systems that minimize user mistakes and ensure recoverability.
3. Improving Accessibility: Ensuring systems are usable by people with diverse abilities,
including those with disabilities.
4. Increasing User Satisfaction: Creating enjoyable and fulfilling user experiences.
Principles of HCI
1. User-Centered Design (UCD):
o Focuses on the needs, preferences, and limitations of end-users during the design
process.
o Iterative design cycles involve user feedback and usability testing.
2. Consistency:
o Interfaces should follow standard design conventions to reduce the learning curve.
o For example, consistent button placement and color schemes across applications.
3. Feedback and Affordances:
o Feedback provides users with confirmation of their actions, e.g., a notification for
a completed file download.
o Affordances refer to design elements that suggest how they should be used, e.g., a
slider for volume control.
4. Simplicity:
o Avoid overcomplicating interfaces with unnecessary features or clutter.
o Example: Minimalist web design where content is clear and navigation is intuitive.
5. Error Prevention and Recovery:
o Prevent errors by guiding users (e.g., graying out unavailable options).
o Provide clear error messages and solutions for quick recovery (e.g., "Undo"
options).
6. Accessibility:
o Incorporate features like screen readers, voice commands, and adaptable font sizes
to make technology inclusive.
Applications of HCI
• Web and Mobile App Design: Ensuring responsive, intuitive, and engaging user interfaces.
• Virtual Reality (VR) and Augmented Reality (AR): Designing immersive and interactive
experiences.
• Healthcare Systems: Developing user-friendly electronic health records (EHRs) for
medical staff.
• Assistive Technology: Creating devices for individuals with disabilities, such as eye-
tracking systems or prosthetics interfaces.

Ergonomics
Definition:
Ergonomics, also known as "human factors engineering," is the science of designing work
environments and systems to optimize human well-being and overall system performance.
Goals of Ergonomics
1. Prevent Physical Strain and Injury: Reducing risks of repetitive strain injuries (RSIs) and
musculoskeletal disorders (MSDs).
2. Enhance Productivity: Designing comfortable and efficient workstations.
3. Improve User Comfort: Reducing fatigue and discomfort over extended use.
4. Ensure Safety: Designing systems and devices that minimize accidents.
Key Areas of Ergonomics
1. Workstation Design:
o Desk and Chair Positioning:
▪ The desk should be at elbow height, allowing arms to rest comfortably.
▪ Chairs should support the natural curve of the spine and allow feet to rest
flat on the floor.
o Monitor Placement:
▪ The monitor should be at eye level and at least 20-24 inches away from the
user to reduce eye strain.
2. Input Devices:
o Use ergonomic keyboards with a split layout to minimize wrist strain.
o Mice should fit comfortably in the hand, promoting a natural wrist position.
3. Lighting and Visual Ergonomics:
o Avoid glare on screens by positioning monitors away from windows or using anti-
glare screens.
o Use task lighting to illuminate work areas without causing shadows.
o Follow the 20-20-20 Rule: Every 20 minutes, look at something 20 feet away for
20 seconds to reduce eye strain.
4. Posture and Movement:
o Encourage frequent breaks and movement to avoid prolonged static postures.
o Consider standing desks or adjustable workstations to allow for a mix of sitting and
standing during the day.
5. Environmental Factors:
o Maintain a comfortable room temperature (68-72°F or 20-22°C).
o Reduce background noise to improve focus and reduce stress.
Ergonomic Tools and Accessories
• Footrests: For shorter users or high desks, footrests can improve posture.
• Wrist Supports: Cushioned wrist pads help reduce pressure during typing.
• Adjustable Chairs: Chairs with customizable height, armrests, and lumbar support.
• Standing Mats: For users of standing desks, mats reduce fatigue from standing for long
periods.

Conclusion
Integrating HCI and Ergonomics
HCI and ergonomics complement each other in designing systems that prioritize the user's physical
and cognitive comfort. By combining the principles of HCI with ergonomic considerations, we
can create technology that is not only functional but also enhances overall user well-being.
