Module 1
Bachelor of Science
in
Computer Science
Prepared by:
Dether John Gorre
Vision
The Computer Science Department at the Lanao School of Science envisions graduates who are
globally recognized as innovative and well-prepared computing professionals.
Mission
Objectives
Introduction
All computing is based on the coordinated use of computer devices, called hardware, and
the computer programs that drive them, called software, and all software applications are built
using data and process specifications, called data structures and algorithms. These fundamentals
have remained remarkably stable over the history of computing, in spite of the continual advance
of the hardware and software technologies, and the continual development of new paradigms for
data and process specifications. This chapter defines the notion of computing, discusses the
concepts of hardware and software, and concludes with an introduction to the development of
software, called computer programming. The remainder of the text focuses on the
development of computer software, providing a detailed discussion of the principles of software
as well as a snapshot of the current culture of the software development field.
1. Nearly all of the most exciting and important technologies, arts, and sciences of today
and tomorrow are driven by computing.
2. Understanding computing illuminates deep insights into, and questions about, the nature
of our minds, our culture, and our universe.
Computing
Computing is any activity that uses computers to manage, process, and communicate
information. It includes the development of both hardware and software. Computing is a critical,
integral component of modern industrial technology. Major computing disciplines include
computer engineering, software engineering, computer science, information systems, and
information technology.
History of Computing
Historically, computers were human clerks who calculated in accordance with effective
methods. These human computers did the sorts of calculation nowadays carried out by electronic
computers, and many thousands of them were employed in commerce, government, and research
establishments. The term computing machine, used increasingly from the 1920s, refers to any
machine that does the work of a human computer, i.e., any machine that calculates in accordance
with effective methods. During the late 1940s and early 1950s, with the advent of electronic
computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’,
initially usually with the prefix ‘electronic’ or ‘digital’.
14th C. - Abacus - an instrument for performing calculations by sliding counters along rods
or in grooves
17th C. - Slide rule - a manual calculating device that consists, in its simple form, of a ruler
and a movable middle piece, both graduated with similar logarithmic scales
1804 - Jacquard loom - a loom programmed with punched cards invented by Joseph Marie
Jacquard
ca 1850 - Difference Engine, Analytical Engine - Charles Babbage and Ada Byron. Babbage's
description, in 1837, of the Analytical Engine, a hand-cranked, mechanical digital computer,
anticipated virtually every aspect of present-day computers. It wasn't until over 100 years
later that another all-purpose computer was conceived. A sketch of the Engine and notes
were left by Ada Byron King, Countess of Lovelace.
1939 -1942 - Atanasoff-Berry Computer - built at Iowa State by Prof. John V. Atanasoff and
graduate student Clifford Berry. Represented several "firsts" in computing, including a
binary system of arithmetic, parallel processing, regenerative memory, separation of
memory and computing functions, and more. Weighed 750 lbs. and had a memory storage of
3,000 bits (0.4K). Recorded numbers by scorching marks into cards as it worked through a
problem.
1940s - Colossus - a vacuum-tube computing machine that helped break German ciphers
during WW II and turn the tide of the war. In the summer of 1939, a small group of scholars
became codebreakers, working at Bletchley Park in England. This group of pioneering
codebreakers, Turing among them, helped shorten the war and changed the course of history.
1946 - ENIAC - World's first electronic, large-scale, general-purpose computer, built by
Mauchly and Eckert, and activated at the University of Pennsylvania in 1946. ENIAC was
later recreated on a modern computer chip ("ENIAC on a Chip", Moore School of Electrical
Engineering, University of Pennsylvania). The ENIAC was a 30-ton machine that measured
50 by 30 feet.
1950s -1960s - UNIVAC - "punch card technology." The first commercially successful
computer, introduced in 1951 by Remington Rand. Over 40 systems were sold. Its memory
was made of mercury-filled acoustic delay lines that held 1,000 12-digit numbers. It used
magnetic tapes that stored 1 MB of data at a density of 128 cpi. For a while, UNIVAC
became synonymous with "computer."
1960-1968 - transistor based technology. It almost completely replaced the vacuum tube
because of its reduced cost, weight, and power consumption and its higher reliability. The
transistor can be switched between a conducting state and an insulating state.
1969 - The Internet, originally the ARPAnet (Advanced Research Projects Agency network),
began as a military computer network.
1976 - CRAY 1 - The world's first commercially successful vector supercomputer, designed
by Seymour Cray. A 75 MHz, 64-bit machine with a peak speed of 160 megaflops (millions
of floating-point operations per second), it had the world's fastest processor at that time.
1976 - Apples/MACs - The first Apple computer was designed by Steve Wozniak and Steve
Jobs. Like modern computers, later Apple models had a peripheral keyboard and mouse, and
a floppy drive that held 3.5" disks. The Macintosh eventually succeeded the Apple II as
Apple's flagship machine.
1978 to 1986 - large-scale integration (LSI); the Xerox Alto - an early workstation with a
mouse; the Apple, designed by Steve Wozniak and Steve Jobs, which was among the first
popular machines to offer a "windows"-type graphical interface and a computer mouse. The
PC and clone market began to expand, creating the first mass market for desktop computers.
1986 to today - the age of networked computing, the Internet, and the WWW.
1990 - Tim Berners-Lee invented the networked hypertext system called the World Wide
Web.
1992 - Bill Gates' Microsoft Corp. released Windows 3.1, an operating system that made
IBM and IBM-compatible PCs more user-friendly by integrating a graphical user interface
into the software. In replacing the old MS-DOS command-line system, however, Microsoft
created an environment similar to the Macintosh operating system. Apple sued for copyright
infringement, but Microsoft prevailed. Windows 3.1 gave way to Windows 95, then Windows
98, then Windows XP. There are other operating systems, of course, but Windows is the
dominant OS today.
1995 - large commercial Internet service providers (ISPs), such as MCI, Sprint, AOL and
UUNET, began offering service to large numbers of customers.
1996 - Personal digital assistants (such as the Palm Pilot) became available to consumers.
They can do numeric calculations, play games and music, and download information from
the Internet.
Forms of Computing
Parallel Computing
Your computer has a processor (CPU) that runs all your programs. Let's say you ask your
computer to compute (1 + 2) * (10 - 7). If you have only one processor, it will first calculate
1 + 2, then 10 - 7, and then multiply the two results. But imagine you had two processors.
Now you can ask the first processor to compute 1 + 2 while, at the same time, the second
processor calculates 10 - 7; the two results are then multiplied to give you the answer. Which
case do you think will give you a faster result? Of course, the case where you had two
processors.
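The same idea can be sketched in a few lines of Python (a hypothetical illustration: the worker
functions and the two-process pool below are assumptions for demonstration, not part of the
example above):

from multiprocessing import Pool

def add(a, b):
    # First processor's job: compute 1 + 2.
    return a + b

def subtract(a, b):
    # Second processor's job: compute 10 - 7, at the same time.
    return a - b

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        left = pool.apply_async(add, (1, 2))         # runs in parallel...
        right = pool.apply_async(subtract, (10, 7))  # ...with this one
        # Combine the two partial results, as in the example above.
        print(left.get() * right.get())  # prints 9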
Distributed Computing
Let's say you are the controller of examinations (COE) and have 100 answer sheets to be
evaluated. You ask the course instructor (CI) to evaluate and return them in a day. Obviously, the
CI cannot check them all in a day, so she decides to distribute them among her teaching
assistants (TAs). Now she gets all the work done in a very short time and returns the sheets to the
COE. This is what distributed computing means.
In simple words, distributed computing is a network of computer systems that all work
together to give the feeling that the whole job has been done by one single system. To the COE
in our example, it looks as if the CI did all the work, because the COE is unaware of what
happened after handing over the answer sheets; the required work still got done.
The major goal of distributed computing is to give users easy access to a wide variety
(a heterogeneous collection) of computer resources.
Each TA here is like an individual computer that has its own memory and processors,
unlike parallel computing, which shares memory among multiple processors.
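A rough single-machine simulation of the COE example in Python follows. Real distributed
systems run on separate, networked computers; the grading function, scores, and worker count
here are invented purely for illustration:

from concurrent.futures import ProcessPoolExecutor

def grade(sheet_id):
    # Each "TA" is a worker process with its own memory, grading one sheet.
    # The scoring rule below is a made-up placeholder.
    return sheet_id, 75 + sheet_id % 25

if __name__ == "__main__":
    sheets = range(1, 101)  # the COE's 100 answer sheets
    with ProcessPoolExecutor(max_workers=4) as tas:  # four "TAs" share the pile
        results = dict(tas.map(grade, sheets))
    # To the COE, it looks as though one person did all the work.
    print(len(results), "sheets graded")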
Types of Computing Environment
In the personal computing environment, there is a single computer system. All the system
processes are available on the computer and executed there. The different devices that constitute
a personal computing environment are laptops, mobile phones, printers, computer systems,
scanners, etc.
The time-sharing computing environment allows multiple users to share the system
simultaneously. Each user is provided a time slice and the processor switches rapidly among the
users according to it. Because of this, each user believes that they are the only ones using the
system.
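A small round-robin sketch in Python illustrates the time slicing (the user names, job lengths,
and slice size are invented for demonstration):

from collections import deque

# Each (user, remaining) pair is a job needing that many units of CPU time.
jobs = deque([("alice", 3), ("bob", 2), ("carol", 4)])
TIME_SLICE = 1  # the processor gives each user one time unit per turn

while jobs:
    user, remaining = jobs.popleft()
    print(f"CPU runs {user} for {TIME_SLICE} time unit")
    remaining -= TIME_SLICE
    if remaining > 0:
        jobs.append((user, remaining))  # back of the queue; switch to the next user

Because the switching is rapid, each user feels like the sole user of the system.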
In client server computing, the client requests a resource and the server provides that
resource. A server may serve multiple clients at the same time while a client is in contact with
only one server. Both the client and server usually communicate via a computer network but
sometimes they may reside in the same system.
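A minimal request/response sketch with Python sockets is shown below; the port number, the
request string, and the serve-one-request server are simplifying assumptions, not a production
design:

import socket
import threading
import time

def server():
    # The server waits for a client, reads its request, and provides
    # the requested "resource" (here, just a labeled reply).
    with socket.create_server(("localhost", 50007)) as srv:
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)
            conn.sendall(b"resource for: " + request)

threading.Thread(target=server, daemon=True).start()
time.sleep(0.5)  # crude wait for the server to start listening (simplification)

# The client requests a resource; the server provides it over the network.
with socket.create_connection(("localhost", 50007)) as cli:
    cli.sendall(b"report.txt")
    print(cli.recv(1024).decode())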
A distributed computing environment contains multiple nodes that are physically separate
but linked together using the network. All the nodes in this system communicate with each other
and handle processes in tandem. Each of these nodes contains a small part of the distributed
operating system software.
In the cloud computing environment, computing moves away from individual computer
systems to a cloud of computers. Cloud users see only the service being provided, not the
internal details of how the service is provided. This is done by pooling all the computer resources
and then managing them with software.
Information Systems
a. Hardware
Information systems hardware is the part of an information system you can touch
– the physical components of the technology. Computers, keyboards, disk drives,
iPads, and flash drives are all examples of information systems hardware.
b. Software
Software is a set of instructions that tells the hardware what to do. Software is not
tangible – it cannot be touched. When programmers create software programs, what
they are really doing is simply typing out lists of instructions that tell the hardware
what to do. There are several categories of software, with the two main categories
being operating-system software, which makes the hardware usable, and application
software, which does something useful. Examples of operating systems include
Microsoft Windows on a personal computer and Google’s Android on a mobile
phone. Examples of application software are Microsoft Excel and Angry Birds.
c. Data
The third component is data. You can think of data as a collection of facts. For
example, your street address, the city you live in, and your phone number are all
pieces of data. Like software, data is also intangible. By themselves, pieces of data
are not really very useful. But aggregated, indexed, and organized together into a
database, data can become a powerful tool for businesses. In fact, all of the
definitions presented at the beginning of this chapter focused on how information
systems manage data. Organizations collect all kinds of data and use it to make
decisions. These decisions can then be analyzed as to their effectiveness and the
organization can be improved.
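To see how aggregating and organizing facts makes them useful, here is a small sketch using
Python's built-in sqlite3 module; the table name and the sample rows are invented for
illustration only:

import sqlite3

# Individual facts (a name, a city, a phone number) mean little alone;
# organized into a queryable table, they become useful for decisions.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (name TEXT, city TEXT, phone TEXT)")
con.executemany(
    "INSERT INTO contacts VALUES (?, ?, ?)",
    [("Ana", "Iligan", "0917-000-0001"),
     ("Ben", "Marawi", "0917-000-0002")],
)
for row in con.execute("SELECT name, phone FROM contacts WHERE city = ?",
                       ("Iligan",)):
    print(row)  # ('Ana', '0917-000-0001')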
Now that we have explored the different components of information systems, we need to
turn our attention to the role that information systems play in an organization. In fact, we might
say that one of the roles of information systems is to take data and turn it into information, and
then transform that into organizational knowledge. To get a full appreciation of the role
information systems play, we will review how they have changed over the years.
From the late 1950s through the 1960s, computers were seen as a way to do calculations
more efficiently. These first business computers were room-sized monsters, with several
refrigerator-sized machines linked together. Software for materials requirement planning (MRP),
running on a mainframe computer, gave companies the ability to manage the manufacturing
process, making it more efficient. From tracking inventory to creating bills of materials to
scheduling production, MRP systems gave more businesses a reason to want to integrate
computing into their processes.
IBM became the dominant mainframe company. Continued improvement in software and the
availability of cheaper hardware eventually brought mainframe computers into most large
businesses.
The PC Revolution
The immediate popularity of the first personal computers sparked the imagination of
entrepreneurs everywhere, and there were quickly dozens of companies making these machines.
Though at first just a niche product for computer hobbyists, improvements in usability and the
availability of practical software led to growing sales. The most prominent of these early
personal computer makers was a little company known as Apple Computer, headed by Steve
Jobs and Steve Wozniak, with the hugely successful Apple II. Not wanting to be left out of the
revolution, in 1981 IBM hurriedly released its own version of the personal computer, simply
called the PC. Businesses, which had used IBM mainframes for years to run their operations,
finally had the permission they needed to bring personal computers into their companies, and the
IBM PC took off. During the
1980s, many new computer companies sprang up, offering less expensive versions of the PC.
Client-Server
In the mid-1980s, businesses began to see the need to connect their computers together as
a way to collaborate and share resources. Software companies began developing applications that
allowed multiple users to access the same data at the same time.
Computers were now seen as tools to collaborate internally, within an
organization. In fact, these networks of computers were becoming so powerful that they
were replacing many of the functions previously performed by the larger mainframe
computers at a fraction of the cost. It was during this era that the first Enterprise Resource
Planning systems were developed and run on the client-server architecture.
First invented in 1969, the Internet was confined to use by universities, government
agencies, and researchers for many years. One exception to this was the ability to expand
electronic mail outside the confines of a single organization. Companies began connecting their
internal networks to the Internet in order to allow communication between their employees and
employees at other companies.
In 1989, Tim Berners-Lee developed a simpler way for researchers to share information
over the network at CERN laboratories, a concept he called the World Wide Web. This invention
became the launching point of the growth of the Internet as a way for businesses to share
information about themselves. As web browsers and Internet connections became the norm,
companies rushed to grab domain names and create websites.
In 1991, the National Science Foundation, which governed how the Internet was used,
lifted restrictions on its commercial use. A mad rush of investment in Internet-based businesses
led to the dot-com boom through the late 1990s, and then the dot-com bust in 2000. While much
can be learned from the speculation and crazy economic theories espoused during that
bubble, one important outcome for businesses was that thousands of miles of Internet
connections were laid around the world during that time. A whole new industry of computer and
Internet security arose.
Web 2.0
This new type of interactive website, where you did not have to know how to create a
web page or do any programming in order to put information online, became known as web
2.0. Web 2.0 is exemplified by blogging, social networking, and interactive comments being
available on many websites. This new web-2.0 world, in which online interaction became
expected, had a big impact on many businesses and even whole industries. Some, such as video
rental chains and travel agencies, simply began going out of business as they were replaced by
online technologies.
Name: Date:
Quiz (5 pts each):
1. Give the 2 forms of computing.
2. Give the 5 types of computing environment.
3. Give the 3 components of an information system.