
Fundamentals of

Computational Intelligence
COMP1002 & COMP8802

Lecture 1
An Introduction to Computer
Science & Algorithms

Denis Kalkofen

Slides based on material provided by Cengage and Paulo Santos


Fundamentals of Computational Intelligence

• Discuss the Fundamentals of Computer Science


Computer Science
• Definition by Norman Gibbs and Allen Tucker: Computer scientists design
and develop algorithms to solve problems, which includes
– Studying the behavior of algorithms to determine whether they are correct
and efficient (study their formal and mathematical properties)
– Designing and building computer systems that are able to execute algorithms
(study their hardware realizations)
– Designing programming languages and translating algorithms into these
languages so that they can be executed by the hardware (study their
linguistic realizations)
– Identifying important problems and designing correct and efficient software
packages to solve these problems (study their applications)
Summary

Computer science is the study of algorithms,


including:

1. Their formal and mathematical properties

2. Their hardware realizations

3. Their linguistic realizations

4. Their applications
Organization of the Textbook into a six-layer hierarchy

Study of algorithms => Levels of the text:
1. Their formal and mathematical properties => Level 1: The Algorithmic Foundations of Computer Science
2. Their hardware realizations => Level 2: The Hardware World; Level 3: The Virtual Machine
3. Their linguistic realizations => Level 4: The Software World
4. Their applications => Level 5: Applications; Level 6: Social Issues
Schedule
Week 1 (February 27): An Introduction to Computer Science - Algorithms & History | Level 1
Week 2 (March 6): Algorithms continued | Level 1
Week 3: No Lecture (Monday Public Holiday)
Week 4 (March 20): The Building Blocks / Computer Organization | Level 2
Week 5 (March 27): Languages / Compiler / Python | Level 3 & 4
Week 6 (April 03): Python continued
Mid-Sem Break
Week 7 (April 24): Computational Models | Level 4 & 5
Week 8 (May 01): Artificial Intelligence / Ethics | Level 5 & 6
Week 9 (May 08): Databases / Data Analysis | Level 5
Week 10 (May 15): Networking / Security | Level 3
Week 11 (May 22): Graphics / Human-Machine Interaction | Level 5
Week 12 (May 29): Guest Lectures: Applications at Flinders University 1 | Level 5
Week 13 (June 05): Guest Lectures: Applications at Flinders University 2 | Level 5
Assessment

• 3 Quizzes (in addition to unmarked weekly quizzes for practice)


• 2 Tests => hurdle task
• Practical test (Python)
• COMP8802: Multimedia presentation on research topic => hurdle task
– Groups of up to 6 students
– Poster + Video production
– Literature review
Assessment - Schedule
Week 1 (February 27): Computer Science / Algorithms / History
Week 2 (March 6): Algorithms cont.
Week 3: No Lecture
Week 4 (March 20): Building Blocks / Computer Organization
Week 5 (March 27): Compiler / Languages / Python | Quiz 1
Week 6 (April 03): Python continued | Midterm test
Mid-Sem Break
Week 7 (April 24): Computational Models | Practical test (Python)
Week 8 (May 01): Artificial Intelligence
Week 9 (May 08): Databases / Data Analysis
Week 10 (May 15): Networking / Security / Ethics | Quiz 2
Week 11 (May 22): Graphics / Human-Machine Interaction / Applications at Flinders University 1 | Multimedia Assignment
Week 12 (May 29): Guest Lectures: Applications at Flinders University 2 | Quiz 3
Week 13 (June 05): Guest Lectures: Applications at Flinders University 3 | Final Test
Contact Time & Team
• 1x lecture per week = Monday 12 - 2pm
• 1x laboratory per week = several slots
• Email => Subject [COMP1002] or [COMP8802]
• Matthew Stephenson
• Denis Kalkofen
• Jayshween Kumar
• Laura Savaglia
• Robert Wright
• Adam Wilden
• Maëlic Neau
• Evan Sahlos
Information Portal
• Canvas

• Lecture slides
• Lecture recording
• Contact information
• Contact times
• Announcements
• Assignments
• …
Lecture 1: Organization & Introduction
Week Week Beginning Material Book
1 February 27 An Introduction to Computer Science – Algorithms & History Level 1

2 March 6 Algorithms continued Level 1

3 No Lecture Monday Public Holiday

4 March 20 The Building Blocks / Computer Organization Level 2

5 March 27 Languages / Compiler / Python Level 3 & 4


6 April 03 Python continued

Mid-Sem Break

7 April 24 Computational Models Level 4 & 5


Learning Objectives

• Understand the definition of computer science

• Understand the definition of the term algorithm


• Write down everyday algorithms
• Determine if an algorithm is ambiguous or not effectively computable

• Understand the roots of modern computer science in mathematics and


mechanical machines
• Summarize the key points in the historical development of modern
electronic computers
Informal Definition of an Algorithm
• An ordered sequence of instructions that is guaranteed to solve
a specific problem.

• For example:
Step 1: Do something
Step 2: Do something
Step 3: Do something
Operations used to construct algorithms

• Sequential operations
– Carry out a single well-defined task
– Usually expressed as simple declarative sentences
– Examples
▪ Add 1 cup of butter
▪ Set the value of x to 1
▪ Subtract 100 from the current account balance
• Conditional operations
– Ask a question; the next operation is then selected on the basis of the
answer to that question
– Examples
▪ If the amount of the check is less than or equal to the current account balance,
then cash the check; otherwise, report that there are insufficient funds.
▪ If x is not equal to 0, then set z equal to 1/x.
• Iterative operations
– Looping instructions that tell us not to go on but to go back and repeat the
execution of a previous block of instructions
– Examples
▪ While there are still more checks to be processed, do the following five steps.
▪ Repeat steps 1, 2, and 3 until the value of y is equal to +1.
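The three operation types can be sketched in Python using the slides' check-cashing scenario (a hypothetical snippet; the function and variable names are illustrative, not from the slides):

```python
# Sketch of the three building-block operation types applied to the
# check-cashing example. Names (process_checks, balance) are illustrative.
def process_checks(balance, checks):
    results = []
    for amount in checks:          # iterative: repeat for each check
        if amount <= balance:      # conditional: select the next operation
            balance -= amount      # sequential: one well-defined task
            results.append(f"cashed {amount}")
        else:
            results.append(f"insufficient funds for {amount}")
    return results, balance

print(process_checks(300, [100, 150, 200]))
```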
Example: Algorithm for adding two m-digit numbers

• Algorithms are everywhere!
• The evolution of computer science began before the development of the first
computer system.
• Example: adding two m-digit numbers
Example: Algorithm for adding two m-digit numbers

Trace of the algorithm on 47 + 25 (m = 2):

Input & Initialize:
• m=2
• a1=4, a0=7
• b1=2, b0=5
• c1=0, c0=0
• carry=0
• i=0

Processing:
• if (i <= m-1): 0 <= 1 => true, repeat steps 4-6
• c0 = a0 + b0 + carry = 7 + 5 + 0 = 12
• if (c0 >= 10): true => c0 = 12 - 10 = 2, carry = 1
• i = 0 + 1 = 1
• if (i <= m-1): 1 <= 1 => true, repeat steps 4-6
• c1 = a1 + b1 + carry = 4 + 2 + 1 = 7
• if (c1 >= 10): (7 >= 10) => false => carry = 0
• i = 1 + 1 = 2
• if (i <= m-1): 2 <= 1 => false, go to step 7
• c2 = carry = 0
• print out: c2 c1 c0 = 072

Final memory:
• m=2
• a1=4, a0=7
• b1=2, b0=5
• c2=0, c1=7, c0=2
• carry=0
• i=2

Result: 072
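The trace above translates directly into a short Python implementation (a sketch; digit lists are stored least-significant digit first, matching a0, b0 in the trace):

```python
# A sketch of the m-digit addition algorithm. Digits are stored
# least-significant first, so [7, 4] represents 47 (a0=7, a1=4).
def add_m_digits(a, b):
    m = len(a)
    c = [0] * (m + 1)
    carry = 0
    i = 0
    while i <= m - 1:                  # iterative: repeat steps 4-6
        c[i] = a[i] + b[i] + carry     # sequential: add one digit column
        if c[i] >= 10:                 # conditional: handle the carry
            c[i] -= 10
            carry = 1
        else:
            carry = 0
        i += 1
    c[m] = carry                       # step 7: the final carry is the top digit
    # print most-significant digit first, as in the trace
    return "".join(str(d) for d in reversed(c))

print(add_m_digits([7, 4], [5, 2]))  # 47 + 25 -> "072"
```

Note the leading zero in the output: the algorithm always reserves a digit c2 for the final carry, just as the trace does.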
An example of an algorithm
- Adding two m-digit numbers -

The addition algorithm uses all three operation types:
• Sequential operations (carry out a single well-defined task)
• Conditional operations (ask a question; the next operation is selected on
the basis of the answer)
• Iterative operations (looping instructions that go back and repeat the
execution of a previous block of instructions)
Another Example - Programming a DVR -

Like the addition algorithm, programming a DVR combines:
• Conditional operations (ask a question; the next operation is selected on
the basis of the answer)
• Iterative operations (looping instructions that go back and repeat the
execution of a previous block of instructions)
• Sequential operations (carry out a single well-defined task)
Computer Science and Algorithms

• Why are formal algorithms so important in computer science?


– If we can specify an algorithm to solve a problem, then we can
automate its solution
• Computing agent
– Machine, robot, person, or thing carrying out the steps of the
algorithm
• Unsolved problems
– some problems are unsolvable,
– some solutions are too slow, and
– some solutions are not yet known
Formal Definition of an Algorithm
• A well-ordered collection of unambiguous and effectively computable
operations that, when executed, produces a result and halts in a finite amount
of time
• Well-ordered collection
– Upon completion of an operation, we always know which operation to do next
• Unambiguous operation (or primitive)
– Can be understood by the computing agent without having to be further
defined or simplified
– When an operation is unambiguous, we call it a primitive
– Ambiguous statements
▪ Example: Go back and do it again (Do what again?)
▪ Example: Start over (From where?)
Example: Ambiguous Algorithm

• Why is it not a good algorithm?


– Lather => What?
– Repeat => Which part?
– Repeat => How many times?
Formal Definition of an Algorithm
• A well-ordered collection of unambiguous and effectively computable
operations that, when executed, produces a result and halts in a finite amount
of time
• Effectively computable operations = doable!
– Operations must also be doable (effectively computable) by the computing agent
– Examples of operations that are not effectively computable:
▪ Divide x by 0
▪ Add 1 to the current value of x, without x having been given a value
▪ Generate a list L of all the prime numbers
Formal Definition of an Algorithm
• A well-ordered collection of unambiguous and effectively computable
operations that, when executed, produces a result and halts in a finite amount
of time
• Produces a result and halts in a finite amount of time
– To know whether a solution is correct, an algorithm must produce a result that is
observable to a user:
▪ A numerical answer
▪ A new object
▪ A change in the environment
– Example of an algorithm that will never stop:
▪ An infinite loop runs forever (usually a mistake)
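The halting requirement can be illustrated with a small Python loop (a hypothetical example, not from the slides): the loop below halts only because its state changes on every pass.

```python
# Halting: this loop produces an observable result in finite time because
# n is updated on every pass, so the loop condition eventually becomes
# false. Removing the "n -= 1" line would turn it into an infinite loop.
def countdown(n):
    result = []
    while n > 0:
        result.append(n)
        n -= 1              # without this update the loop never halts
    return result           # observable result, produced in finite time

print(countdown(3))  # [3, 2, 1]
```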
An algorithmic solution to the shampooing problem

• If it meets all the criteria of the formal definition of an algorithm
=> it can be automated
An Algorithm to the shampooing problem

• A well-ordered collection of (operations)


• unambiguous (operations) and
• effectively computable operations
• that, when executed, produces a result and halts in a finite amount of time
Another Algorithm to the shampooing problem

• A well-ordered collection of
• unambiguous and
• effectively computable operations
• that, when executed, produces a result and halts in a finite amount of time
Algorithms to the shampooing problem

• Which is better?

More general
Fewer steps
Less memory
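The shampooing algorithm itself appears only as a figure in the slides; a minimal Python sketch of the counting-loop version (assuming the textbook's WashCount counter and step numbering) might look like:

```python
# A sketch of the counting-loop shampooing algorithm. The explicit
# wash_count counter makes "repeat" unambiguous and guarantees the loop
# halts after exactly two washes. (Step numbering is an assumption.)
def shampoo():
    steps = ["Wet your hair"]     # Step 1
    wash_count = 0                # Step 2: set WashCount to 0
    while wash_count < 2:         # Step 3: repeat until WashCount equals 2
        steps.append("Lather")    # Step 4
        steps.append("Rinse")     # Step 5
        wash_count += 1           # Step 6: add 1 to WashCount
    steps.append("Stop")          # Step 7
    return steps

print(shampoo())
```

Every operation is unambiguous and doable, and the counter ensures the procedure produces a result and halts, so it meets all four criteria of the formal definition.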
Summary
• Computer science is the study of algorithms.

• An algorithm is a well-ordered collection of unambiguous and


effectively computable operations that, when executed,
produces a result and halts in a finite amount of time.
• If we can specify an algorithm to solve a problem, then we can
automate its solution.
Algorithms & Computer Science

• Development of algorithms and the evolution of computer science began


before the development of the first computer system.
• Algorithmic Problem Solving was important to
– “Industrial revolution” of the 19th century
▪ Mechanized and automated repetitive physical tasks

– “Computer revolution” of the 20th and 21st centuries


▪ Mechanized and automated repetitive mental tasks
▪ Used algorithms and computer hardware
A Brief History of Computing

[Images: early computing devices. Sources: INTERFOTO/Alamy; the Collections
of the University of Pennsylvania Archives; © Bettmann/CORBIS (U.S. Army photo)]
The Early Period: Up to 1940 - 17th Century Devices

• 17th century: automation/simplification of arithmetic for scientific research:


The Early Period: Up to 1940 - 17th Century Devices
• Example of arithmetic simplification:
– John Napier invented logarithms to simplify difficult mathematical computations (1614)
– Replaces a multiplication with additions and a table look-up
– Logarithm table (10^b = a):

a    => b
1.00 => 0.0000000
2.00 => 0.3010300
3.00 => 0.4771213
4.00 => 0.6020600
5.00 => 0.6989700
6.00 => 0.7781513
7.00 => 0.8450980
8.00 => 0.9030900
9.00 => 0.9542425

– Examples:
2 x 3 => 0.301 + 0.477 = 0.778 => 6
2 x 4 => 0.301 + 0.602 = 0.903 => 8
3 x 3 => 0.477 + 0.477 = 0.954 => 9
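The table-based procedure is easy to mimic in Python (a sketch; `math.log10` stands in for Napier's printed table):

```python
import math

# A sketch of Napier's idea: replace a multiplication with an addition of
# logarithms plus a conversion back (10**b = a), as in the table above.
def multiply_via_logs(x, y):
    b = math.log10(x) + math.log10(y)  # look up and add the logarithms
    return round(10 ** b)              # convert the summed log back

print(multiply_via_logs(2, 3))  # 0.301 + 0.477 = 0.778 -> 6
print(multiply_via_logs(3, 3))  # 0.477 + 0.477 = 0.954 -> 9
```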
The Early Period: Up to 1940 - 17th Century Devices
• 17th century: automation/simplification of arithmetic for scientific research:
– John Napier invented logarithms as a way to simplify difficult mathematical
computations (1614).
• Mechanical devices to automate arithmetic operations:
– The first slide rule appeared around 1622.
– Pascal designed and built a mechanical calculator named the Pascaline
(1642).

[Image: the Pascaline. Source: INTERFOTO/Alamy]
The Early Period: Up to 1940 - 17th Century Devices

• 17th century devices:


– Could represent numbers
– Could perform arithmetic operations on numbers

– Did not have a memory to store information


– Were not programmable (a user could not provide a sequence of actions to
be executed by the device)
The Early Period: Up to 1940 - 19th Century Devices

• 19th-century devices:
– Joseph Jacquard designed an automated loom that used punched cards to
create patterns (1801)
▪ programmable => considered the first computing device
The Early Period: Up to 1940 - 19th Century Devices

• 19th-century devices:
– Joseph Jacquard designed an automated loom that used punched cards to
create patterns (1801)

– Pattern controls if needles lift a thread or not


The Jacquard loom
Source:©Bettmann/CORBIS
The Early Period: Up to 1940 - 19th Century Devices

• 19th-century devices:
– Herman Hollerith (1880s onward)
▪ Designed and built programmable card-processing machines to read, tally, and
sort data on punched cards for the U.S. Census Bureau 1890
▪ Punch code on a card was used to store personal data
▪ Machine was used to generate statistics
▪ Reduced evaluation of census data from 10 years to 2 years

[Image: a punched census card recording fields such as State, Gender, and
Disease (e.g., Chickenpox, Rubella)]
The Early Period: Up to 1940 - 19th Century Devices

• 19th-century devices:
– Mechanical (not electrical)
– Had many features of modern computers:
▪ Representation of numbers and other data
▪ Operations to manipulate the data
▪ Memory to store values in a machine-readable form
▪ Programmable: sequences of instructions could be predesigned for complex
operations
The Birth of Computers: 1940–1950

• 1940-1950 devices: improved performance


– Mark I (1944) designed by Howard Aiken
▪ Electro-mechanical computer
▪ Used a mix of relays, magnets, and gears to process and store data

▪ Improved speed
o 0.3 seconds per addition
o 6 seconds per multiplication
o 15 seconds per division
The Birth of Computers: 1940–1950

• 1940-1950 devices: improved performance


– ENIAC (Electronic Numerical Integrator and Calculator) (1946)
▪ Avoids slow mechanical components
▪ First publicly known fully electronic computer
▪ Further improved performance
o Add two 10-digit numbers in 0.2ms
The Birth of Computers: 1940–1950

• Problem: fast computation but slow to re-program (required physical
reconfiguration)
The Birth of Computers: 1940–1950

• 1940-1950 devices: improved performance


– John Von Neumann
▪ Proposed a radically different computer design based on a model called the
stored-program computer (1945)
▪ Stores data and instructions in the same memory
=> enabled fast re-programming
The Birth of Computers: 1940–1950

• 1940-1950 devices: improved performance


– 1949: a research group at the University of Pennsylvania built one of
the first stored-program computers, called EDVAC
(Electronic Discrete Variable Automatic Computer)

– 1951: a version of EDVAC, the UNIVAC I (Universal Automatic Computer),
became the first commercially sold computer; it was sold to the
U.S. Bureau of the Census

– Nearly all modern computers use the Von Neumann architecture!!!


The Modern Era: 1950 to the Present
• 1950–1957 devices: characteristics of First generation of
computing
– Similar to EDVAC
– Vacuum tubes for processing and storage
– Large (room-sized), expensive, and delicate
– Required trained users and special environments
The Modern Era: 1950 to the Present
• Characteristics of Second generation (1957–1965)
– Transistors and magnetic cores instead of vacuum tubes
=> smaller devices
– Era of FORTRAN and COBOL: some of the first high-level
programming languages
The Modern Era: 1950 to the Present
• Characteristics of Third generation (1965–1975)
– Integrated circuit (components integrated in silicon)
=> even smaller devices
– Birth of the first minicomputer: desk-sized, not room-sized
computer
– Birth of the software industry
▪ Microsoft 1975
The Modern Era: 1950 to the Present
• Characteristics of Fourth generation (1975–1985)
– Advanced integrated circuit technology enabled miniaturization
– The first microcomputer: desktop machine (similar to size of typewriter)

– Development of widespread computer networks

– Electronic mail, graphical user interfaces, and embedded systems


▪ 1983 Apple Lisa, the first desktop PC with a graphical user interface
The Modern Era: 1950 to the Present
• Characteristics of Fifth generation (1985–present)
– Massively parallel processors capable of quadrillions (10^15) of computations
per second
– Handheld digital devices
– Powerful multimedia user interfaces incorporating sound, voice recognition,
video, and television
– Wireless communications
– Massive cloud storage devices
– Ubiquitous computing
– Ultra-high-resolution graphics and virtual reality
Summary
• Computer science is the study of algorithms.
• An algorithm is a well-ordered collection of unambiguous and
effectively computable operations that, when executed,
produces a result and halts in a finite amount of time.
• If we can specify an algorithm to solve a problem, then we can
automate its solution.
• Computers developed from mechanical calculating devices to
modern electronic marvels of miniaturization.
