ICT Chapter 1
INTRODUCTION TO ICT 1
Course Description
This course introduces BTLE students to the science, culture, and ethics of information technology, its various uses and applications, and its influence on culture and society. It aims to strike a balance between conceptual instruction and socially and culturally oriented discussions: it not only explains the basic concepts and key terms in IT but also features the major IT trends along with the issues and challenges these developments bring. This course will also give an overview of Technical Drafting, Illustration, and 2D Animation, and establishes the basics of Medical Transcription and web applications.
Time Frame
3 weeks
Key Terms
ENIAC – Electronic Numerical Integrator and Computer
Pre-test
Let us determine how much you already know about Information and Communication
Technology. Take this activity.
1. What is ICT?
Lesson Proper
IT vs. ICT
When do we use the term ICT and how does it differ from IT? ICT, or
information and communications technology, is often used in a more general
sense, and is described as using computers and other digital technologies to
assist individuals or institutions in handling or using information. ICT is technology
that supports activities involving information such as gathering, processing,
storing, and presenting data. Increasingly, these activities also involve
collaboration and communication.
Generations of Computers
In the second generation, transistors were used as the internal components of the computer. Transistors were much smaller, faster, and more dependable than the vacuum tubes of first-generation computers. They generated less heat and consumed less electricity but were very costly.
The integrated circuit (IC), invented by Jack Kilby in 1958, came to be used instead of transistors as the internal components of the computer. A single IC holds many transistors, resistors, and capacitors, so an entire circuit board of transistors could be replaced by one chip. This chip made computers smaller, more reliable, and more efficient. In this third generation, remote processing, time-sharing, and multiprogramming operating systems were used.
From 1971 to 1980, very large scale integration (VLSI) circuits were used to build computers. A VLSI circuit packs about 5,000 transistors and other circuit elements, together with their interconnections, on a single chip known as the microprocessor. Fourth-generation computers, such as personal computers, became more powerful, compact, reliable, and inexpensive. Microprocessor technology was also applied to pocket calculators, television sets, automotive devices, and audio and video appliances.
Fig. 1.5 Fourth-generation computer
In the fifth generation, VLSI technology has evolved into what is called ultra large-scale integration (ULSI) technology, with microprocessor chips manufactured with ten million electronic components. This generation involves computer intelligence, which is associated with artificial intelligence (AI), natural language processing, and expert systems, the means and practices of producing computers that think like human beings.
INPUT DEVICES
For example, using a keyboard we can type text into Notepad; the computer processes the entered data and then displays the output on the screen.
The data entered can be in the form of numbers, letters, images, and so on. We enter information using an input device, the processing unit converts it into a form the computer can understand, and the final output is presented in a form humans can understand.
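To make this input-process-output cycle concrete, here is a minimal sketch in Python. The program reads a number typed on the keyboard (input), doubles it (processing), and displays the result on the screen (output); the variable names and the doubling step are made up purely for illustration.

```python
# A minimal input-process-output illustration (example values only).
raw_text = input("Enter a number: ")    # input: data typed on the keyboard
number = float(raw_text)                # convert the text into a form the computer can work with
result = number * 2                     # process: the computer performs a calculation
print("Twice your number is:", result)  # output: the result is displayed on the screen
```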
•Keyboard – is the most common input device that accepts letters, numbers,
and commands from the user.
•Mouse – lets one select options from on-screen menus. A mouse is used by
moving it on a flat surface, pressing its two buttons (left and right), and scrolling
the wheel that is located between the buttons. There are also alternatives to
using a mouse. A trackball has a ball that can rotate using a finger or the palm
of a hand to move the pointer. A touchpad also called a trackpad is a touch-
sensitive pad that lets the user move the pointer by touching and dragging his
or her finger on the pad. Touchpads are commonly built-in on laptop
computers.
•Microphone – allows a user to speak into the computer to input data and
instructions. While there are available stand-alone microphones for computers,
most of the time, users buy a headset – a combination of microphone and
earphones – for the sake of practicality.
•Scanner – converts printed material (such as text and pictures) into a form the
computer can use. There are different types of scanners available; the most
common is the flatbed scanner. Scanners look like miniature printers with a flip-
up cover protecting the glass platen. They are often built into multi-function
printers (MFPs). Another type becoming common nowadays is the hand-held or portable scanner. Portable scanners can be small enough to fit inside one’s
pocket. Pen scanners are just a bit bigger than fountain pens and can scan the
text of a document line by line. They do not give high-resolution scans and are
more expensive than flatbed scanners.
•Digital camera – allows one to take pictures and then transfer the photographed images to the computer or printer instead of storing them on traditional film.
•PC video camera – is a digital video camera that enables users to create a movie or take still photographs electronically. With the PC video camera attached to the computer, users can see each other as they communicate via the computer.
SYSTEM UNIT
The system unit is the enclosure that houses the main elements of a computer used to process data. It can also be referred to as the computer case or tower. The circuit board of the system unit that contains the primary components of a computer, with connectors into which other circuit boards can be positioned, is known as the motherboard.
Fig. 1.7 Basic parts of a system unit
OUTPUT DEVICES
•Printer – produces text and graphics on a physical medium such as paper. The
two types of printer are the impact printer and the non-impact printer. An
impact printer makes contact with the paper by pressing an inked ribbon
against the paper using a hammer or pins. An example of an impact printer is
the dot-matrix printer. A non-impact printer does not use a striking device to
produce characters on the paper, and because it does not hammer against
the paper, the printer produces less noise. Examples of non-impact printers are
inkjet printers and laser printers.
Fig. 1.8 Inkjet printer
•Monitor – displays text, graphics, and videos on a screen. Many monitors look
similar to a television. The three types of monitor available in the market are the
following:
•Speed
In the system unit, operations are carried out through electronic circuits. When data, instructions, and information flow along these circuits, they travel at incredibly fast speeds. Most computers carry out billions of operations in a single second, and the world’s fastest computers can perform trillions of operations in one second.
•Accuracy
•Communication
Classifying Computers
•Tablet computers – are hand-held computers with a touch-sensitive screen for typing and navigation.
•Smart TVs – are the latest television sets that include applications present in computers. For example, videos can be streamed from the internet directly onto the TV. The TV can also be used as a computer monitor or gaming monitor.
Evolution of Media
The media has transformed itself based on two things: (1) how information is presented; and (2) how the connection is established. Woodcut
printing on cloth or on paper was used in the early 15th century. It was in 1436
when Johannes Gutenberg started working on a printing press which used relief
printing and a molding system. Now, the modern printing press delivers
messages in print, such as newspapers, textbooks, and magazines.
Print Media
Early news was presented to local populations through the print press.
While several colonies had printers and occasional newspapers, high literacy
rates combined with the desire for self-government made Boston a perfect
location for the creation of a newspaper, and the first continuous press was
started there in 1704.
Radio
Radio news made its appearance in the 1920s. The National Broadcasting
Company (NBC) and the Columbia Broadcasting System (CBS) began running
sponsored news programs and radio dramas. Comedy programs, such as Amos
’n’ Andy, The Adventures of Gracie, and Easy Aces, also became popular
during the 1930s, as listeners were trying to find humor during the Depression. Talk
shows, religious shows, and educational programs followed, and by the late
1930s, game shows and quiz shows were added to the airwaves. Almost 83
percent of households (in the United States) had a radio by 1940, and most
tuned in regularly.
Fig. 1.12 Radio station in the 1920s
Television
Television combined the best attributes of radio and pictures and
changed media forever. The first official broadcast in the United States was
President Franklin Roosevelt’s speech at the opening of the 1939 World’s Fair in
New York. The public did not immediately begin buying televisions, but
coverage of World War II changed their minds. CBS reported on war events and
included pictures and maps that enhanced the news for viewers. By the 1950s,
the price of television sets had dropped, more television stations were being
created, and advertisers were buying up spots.
Things are very different today. Now computers use the internet and
provide mass communication, or the exchange of information on a large scale.
Today, this means communication across the entire world at the speed of light.
When you turn on that desktop, laptop, tablet, or smartphone, you almost
certainly use it for one or more of the following:
To communicate with friends and family. Think: email, Twitter, Facebook, Skype, and so
on.
To gather information via the news, a Google search, and so on.
To learn something on Study.com or via a university video.
To work (perhaps remotely while sitting in your jammies at home) when you've called in
sick or on a business trip abroad.
To be entertained, whether it's on YouTube, Netflix, or somewhere else.
Technology trends tend to change as time goes by. Following Moore’s Law, computing hardware is expected to improve roughly every two years. In 1965, Gordon Moore predicted that the number of components on a chip, and with it computing power, would increase dramatically over time, while cost would proportionally go down. Nowadays, the two-year cycle has even shortened, with new releases happening in less than two years, if not yearly. Aside from cost, the size of hardware has also decreased, making devices such as smartphones more useful and dependable.
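To illustrate the doubling idea behind Moore’s Law, the short sketch below projects how a chip’s transistor count would grow if it doubled every two years. The starting count and time span are made-up example values, and the calculation is only a rough simplification of the trend, not a real forecast.

```python
# Rough illustration of "doubling every two years" (example values only).
start_count = 2_300        # assumed starting transistor count
years = 20                 # assumed projection period
doublings = years // 2     # one doubling every two years

projected_count = start_count * (2 ** doublings)
print(f"After {years} years: roughly {projected_count:,} transistors")
```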
A few years ago, people would have a cellular phone, a digital camera, a
portable music player, and a laptop with them. They would use their phones to
text and make calls, the digital camera to take pictures, and the music player to
listen to music while doing their work on their laptop. Now, smartphones are
capable of doing the aforementioned activities and a lot more through
applications or “apps” made available online. The term smartphone was coined because the cellular or mobile phone has become smarter: it can run apps which help people perform their day-to-day activities. Apps turn mobile devices
into miniature PCs capable of browsing the web, taking down notes, and
playing games, among others. By default, smartphones come installed with
basic apps such as making calls, sending text messages, playing music, and
managing schedules. As of 2017, Google Play has approximately 2.8 million
apps available for Android users, whereas Apple’s App Store is estimated to
have 2.2 million. These apps are not just limited to smartphones; they can also be used in other mobile devices, including tablets and the more recent trend of wearable devices.
The so-called Internet of Things (IoT) has been trending since 2016. The vision of
IoT has evolved due to the convergence of multiple technologies, including
pervasive wireless communication, data analytics, machine learning, and use of
hardware technology such as sensors, microprocessors, and microcontrollers.
Controlling home appliances through a mobile phone is now possible, such as switching the lights on and off, setting the timer for the washing machine, and setting the television to record shows that might be missed due to traffic.
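As a hedged sketch of how this kind of remote control can work, the example below sends an “on” command to a hypothetical smart-home hub over the network. The hub address, endpoint, and message format are invented for illustration; actual smart appliances come with their own apps and interfaces.

```python
# Hypothetical example: asking a smart-home hub to switch a light on.
# The hub address and the request format are made up for illustration.
import requests

HUB_URL = "http://192.168.1.50/api/devices/living-room-light"  # hypothetical endpoint

response = requests.post(HUB_URL, json={"power": "on"}, timeout=5)
if response.ok:
    print("Light switched on.")
else:
    print("The hub returned an error:", response.status_code)
```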
Having machines decide for you, also called artificial intelligence, may seem a daunting task and would probably make you think of robots and talking computers, similar to Iron Man’s Jarvis. However, with the recent release of
Apple’s iPhone X, it was not just the design that improved. The iOS’ intelligent
personal assistant, Siri, has also been upgraded with enhanced learning, making the use of the iPhone more efficient. Its latest features include the following:
Face ID – the owner’s face is the new password. It is a more secure way to unlock the phone and keep the data safe.
A11 Bionic – is claimed to be the “most powerful and smartest chip ever in a smartphone,” with a neural engine capable of up to 600 billion operations per second. A neural engine is hardware dedicated to artificial intelligence tasks, enabling the device to learn from observations.
Wireless charging – a cable is not needed to charge the device.
iOS 11 – in the latest operating system, one can scan documents in the notes
app, edit live photos, and in the U.S., even pay friends in Messages. Siri can also
translate languages.
Augmented Reality – is not just designed for fun. It is also meant for productivity
and efficiency. It allows one to navigate without looking at a map, visualize industrial equipment, and much more.
4. Automation
5. Big data
Big data is a term that describes large and complex volumes of data. But it is not how much data an organization has, it is what they do with it that matters.
Big data can be analyzed for insights which can help management make
better decisions and come up with more effective strategic plans. Waze is an example of an app which uses big data. The app helps users determine traffic conditions, directions, and estimated routes, and notifies users of accident alerts and road hazards, among others.
7. Everything on demand
POSITIVE EFFECTS
There are new ways of learning, such as the use of learning management systems (LMS), which implement educational enhancements such as distance learning, online tutorials, virtual reality, and interactive multimedia.
Security
With the advancement that ICT brings, individuals and organizations can address many security problems. Examples of security measures in applications are: (1) the use of encryption to keep and protect data from malicious software; (2) the use of passwords for personal information protection; and (3) the development of physical security systems such as biometrics in the form of fingerprint, facial recognition, iris (eye) recognition, and voice recognition.
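To give a concrete sense of measure (2), here is a minimal sketch showing how a password can be stored as a salted hash instead of plain text, using Python’s standard library. The password shown is an example value, and this is only an illustration of the idea, not a complete security system.

```python
# Minimal illustration: store a salted hash of the password, never the password itself.
import hashlib
import os

password = "my secret password"   # example value only
salt = os.urandom(16)             # random salt, saved alongside the hash
stored_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# To check a later login attempt, hash it with the same salt and compare.
attempt = "my secret password"
attempt_hash = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 100_000)
print("Password correct?", attempt_hash == stored_hash)
```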
NEGATIVE EFFECTS
ICT brings not only improvement but also threats to security. Data or files must always be kept secure and safe. The internet connection must always be safeguarded from different attacks such as data modification, identity/IP address spoofing, password-based attacks, denial-of-service attacks, etc. Computers should also be protected from the various forms of viruses and malware which are released almost every day.
While some people apply ethical principles to the use of ICT, others simply
do not, hence the proliferation of cyber malpractices.
1. Plagiarism
2. Exploitation
3. Libel
1. Student
2. Future Educator
3. Individual
References
https://searchcio.techtarget.com/definition/ICT-information-and-communications-technology-or-technologies
https://opentextbc.ca/computerstudies/chapter/classification-of-generations-of-computers/
https://courses.lumenlearning.com/atd-baycollege-americangovernment/chapter/the-evolution-of-the-media/
https://study.com/academy/lesson/the-computer-as-a-mass-communication-tool.html