Chapter 1: Introduction
1 Introduction
Computer science is one of the most important modern sciences. Its subject is computation, in the broadest
sense of the term: the processing of any type of information that can be represented by a series of numbers,
such as text, DNA, images, or sounds.
The aim of this chapter is to introduce a few basic concepts to get you started. The chapter begins with the
concept of computing, followed by a brief history of the most important stages it has gone through, and
finally, some basic definitions of algorithms and their characteristics.
2 Computer science
2.1 Definition
Computer science, or informatics (a combination of the two words information and automatic), is the
science of processing information automatically using a machine.
Processing: the set of instructions (commands) or operations that the machine executes.
Information: everything a machine can handle and manipulate, such as text, numbers, images,
video, etc. It can be divided into data and instructions.
Machine: the device that carries out these instructions, such as a calculator, computer, telephone, game
console, television, receiver (decoder), and any system bearing the word digital.
2.2 Computer
A computer is any programmable device into which information can be input, and in which it can be processed, stored, or output.
The computer consists of :
Input units: the devices used to enter information into the computer, such as the keyboard for
entering numbers and text, the mouse for entering movements and clicks, the scanner for entering images,
the microphone for entering audio, and the camera for video.
Processing unit: made up of memory, the best known of which is RAM, and the processor. The
memory holds the instructions and data; the processor executes the instructions (logical and
arithmetic operations) on the data and stores the results back in memory.
Storage units: used to store and retrieve information, such as the hard disk, floppy disk, CD
or DVD, flash drive, and memory card.
Output units: the screen for viewing images and videos, the printer for outputting images and text on
paper, and headphones and loudspeakers for sound.
[Diagram: information flows from the input units (e.g. keyboard) to the processing unit (processor + memory), which exchanges data with the storage units and sends results to the output units (e.g. screen).]
3 A brief history of computing
Computing has passed through several eras: the beginnings, when computers ran on vacuum tubes; then the era of transistors and integrated circuits;
then the era of the Internet and the Web; and finally the current era, the age of mobility
and data sharing. The following list summarizes the most important inventions, theories, and events in
computer science.
The beginnings of computing
o 3000 BC The Babylonian abacus.
o 780 Al-Khwarizmi and algebra.
o 1645 Pascaline: Pascal invents the arithmetic machine.
o 1703 Binary arithmetic by Leibniz.
o 1801 Jacquard synthesizes the work of his predecessors and invents the first programmable machine,
used for knitting and weaving.
o 1822 Babbage invents the first mechanical calculator designed to calculate polynomials.
o 1847 Boolean algebra for binary arithmetic and logic.
o 1890 First use of Jacquard punched cards outside the textile industry, in statistical studies.
Pioneering times
o 1920 Quevedo invents an electromechanical arithmetic device that is controlled by a typewriter and
prints out the results.
o 1928 Von Neumann's minimax theorem.
o 1936 Alan Turing publishes an article in which he presents the Turing machine as a theoretical model
for the computer.
o 1937 Atanasoff designs the first electronic calculator.
o 1939 The ABC, the first electronic computer (not programmable).
o 1942 Alan Turing (England) builds a machine for decoding the German Enigma cipher, which
contributed to the Allied victory in the Second World War.
o 1944 Howard Aiken's machine used perforated paper tape and electromechanical relays to solve
calculation problems; it was the first programmable device in America.
o 1946 The first large-scale electronic digital computer was launched under the name ENIAC.
o 1947 The invention of the transistor.
o 1947 Invention of assembly language, a low-level programming language.
o 1948 Invention of the first machine based on the von Neumann architecture
(instructions are stored with data in memory); the first generation of computers.
o 1950 The Turing test.
o 1953 The first high-level programming language.
o 1956 IBM's first hard disk.
o 1958 invention of integrated circuits.
o 1958 The second generation of computers appeared after the invention of the transistor.
o 1960 The first computer with multiple processors and parallel tasks.
o 1962 Coining of the word informatique (informatics).
o 1963 invention of the mouse.
o 1964 IBM introduces the System/360 family of computers (third generation of computers).
o 1964 BASIC programming language.
o 1965 Moore's Law, popularly stated as "a CPU doubles in speed every 18 months" (more precisely, the number of transistors on a chip doubles roughly every two years).
o 1967 marketing of floppy disks by IBM.
o 1969 Unix operating system development.
o 1970 Pascal language
The beginnings of microcomputing
o 1971 Intel 4004, the first microprocessor (4th generation).
o 1973 The Alto, the first personal computer, is produced in Xerox's laboratories; it uses icons, windows,
graphics, and a mouse.
4 Introduction to algorithms
A computer can be compared to a human. Humans receive information through their senses: for example, 5
+ 3 is information received through hearing if spoken, or through sight if written. The human stores it
in memory, and then the brain (the reasoning part) processes it and calculates the result. In this
example the result is 8, which is stored in memory and can be communicated through speech or pointing,
for example. However, a human with senses, a brain, memory, and language who is asked
to calculate 25 × 13 or to solve a linear equation will not be able to do it unless we first teach them
how. The same applies to a computer: on its own, it can do nothing unless we
provide it with a solving method, known as an algorithm or software.
4.1 Definitions
Algorithm: in mathematics and computer science, an algorithm is a finite set of sequential, detailed
steps required to solve a problem and produce a result from elementary data. In other words,
it is the solution method.
The word algorithm comes from the name of the scientist Abu Ja'far Muhammad ibn Musa al-Khwarizmi, who in the 9th
century wrote the first systematic work presenting solutions to linear (first-degree) and quadratic (second-
degree) equations.
An algorithm is based on three components, illustrated in the sketch after this list:
Sequence: an algorithm is a set of sequential instructions executed by the computer in a specific
order.
Selection: an algorithm may need to test certain conditions. If a condition is true, it follows one path
of sequential instructions; if it is false, it follows another path.
Loop: sometimes the same sequence of steps needs to be repeated multiple times.
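To make these three components concrete, here is a minimal sketch in C (the language used later in this chapter); the variable names and the threshold 10 are illustrative choices only, not part of any particular algorithm:

#include <stdio.h>

/* A minimal sketch combining the three components of an algorithm. */
int main(void)
{
    int n = 5;                      /* sequence: instructions run one after another */
    int sum = 0;

    for (int i = 1; i <= n; i++) {  /* loop: the same step is repeated n times */
        sum = sum + i;
    }

    if (sum > 10) {                 /* selection: the condition picks one of two paths */
        printf("sum = %d, greater than 10\n", sum);
    } else {
        printf("sum = %d, not greater than 10\n", sum);
    }
    return 0;
}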
Note: An algorithm is not a programming language but rather a set of analysis and thinking methods that a
programmer must follow in order to write code correctly. It is considered the most challenging part of
programming, but once you learn it properly, you can learn any programming language.
Data Structure: It is a means of storing and organizing data to facilitate their use and modification.
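As a small illustration (the Student type and its field names are hypothetical), a C structure is one such means: it groups related data under a single name so the data can be used and modified as a unit:

/* Hypothetical example: grouping a student's data into one structure. */
struct Student {
    char  name[50];   /* the student's name */
    int   marks[3];   /* marks in three subjects */
    float average;    /* computed from the marks */
};

Arrays, structures, lists, and files are common examples of data structures.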
Program: an algorithm written in a programming language; it can be typed on a computer using any
text editor. The computer cannot execute it directly until it is translated.
Application: a program that has been translated into machine language (0s and 1s) and is ready to be
executed by the processor. Sometimes it is also simply called a program.
Examples
Cooking recipe, changing a car wheel, recitation method, chess game...
Calculation of the greatest common divisor, method for solving a quadratic equation, calculation of
the derivative...
Calculation of students' average, employees' salary, electricity bill...
[Diagram: problem → algorithm → program → application]
Example 2
The problem: Calculate the sum of the squares of two numbers.
I. Analysis (to obtain the algorithm)
The problem can be divided into three subproblems:
• Calculate the square of the first number.
• Calculate the square of the second number.
• Calculate the sum of the two squares.
The first subproblem:
• Identify the input: a of integer type.
• Identify the output: x of integer type.
• Relation: x = a * a.
The second subproblem:
• Identify the input: b of integer type.
• Identify the output: y of integer type.
• Relation: y = b * b.
The third subproblem:
• Identify the inputs: x and y of integer type.
• Identify the output: z of integer type.
• Relation: z = x + y.
Algorithm
Data:
Inputs: a and b, of integer type.
Output: z, of integer type.
Intermediates: x and y, of integer type.
Instructions:
x = a * a
y = b * b
z = x + y
II. Coding (to obtain the program)
After obtaining the algorithm, which is usually written in human language, the programmer chooses a
programming language, such as C, and translates the data and instructions into that language. The
resulting program is called the source code. It is a text file that can be read by a person, with a file extension specific
to the language used: for example, .c for C or .cpp for C++.
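As an illustration, one possible C source code for the Example 2 algorithm is shown below; the printf/scanf statements and the file name sum_squares.c are additions chosen so the program can actually be tested:

#include <stdio.h>

/* sum_squares.c -- illustrative translation of the Example 2 algorithm. */
int main(void)
{
    int a, b;   /* inputs */
    int x, y;   /* intermediates */
    int z;      /* output */

    printf("Enter two integers: ");
    scanf("%d %d", &a, &b);

    x = a * a;  /* square of the first number */
    y = b * b;  /* square of the second number */
    z = x + y;  /* sum of the two squares */

    printf("Sum of squares = %d\n", z);
    return 0;
}

Compiling this file, the next step, would produce the executable application.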
III. Compilation (to obtain the application)
The program is translated and converted into code that the computer can understand and execute,
that is, binary language (0s and 1s), which varies depending on the machine (processor and
operating system). This process is done automatically. The result is a binary file (unreadable by humans),
usually with the .exe extension in the Windows environment. This process includes checking the program
for syntax errors (mistakes in how the code is written).
IV. Execution
The processor loads the program into memory and executes it one instruction after another. In a
Windows environment, this is done by double-clicking the application (.exe). This stage involves
testing and correcting semantic errors (errors in the result).
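For example, the following hypothetical C program compiles without any error message, yet contains a semantic error that only testing the output reveals:

#include <stdio.h>

/* Hypothetical example of a semantic error: the program is syntactically
   correct and compiles, but computes a difference instead of the intended
   sum of squares, so the result is wrong. */
int main(void)
{
    int x = 3 * 3;
    int y = 4 * 4;
    int z = x - y;            /* intended: z = x + y */
    printf("z = %d\n", z);    /* prints z = -7 instead of the expected 25 */
    return 0;
}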
Note: In the case of interpreted languages, the process of translation and execution happens simultaneously.