DAA Unit-I: Introduction
UNIT - I
ALGORITHM
An Algorithm is a finite sequence of instructions, each of which has a clear meaning and
can be performed with a finite amount of effort in a finite length of time.
An Algorithm is any well-defined computational procedure that takes some value or set of values as input and produces some value or set of values as output; an algorithm terminates after executing a finite number of instructions.
In addition every algorithm must satisfy the following criteria:
Input: zero or more quantities are externally supplied.
Output: at least one quantity is produced.
Definiteness: each instruction must be clear and unambiguous.
Finiteness: the algorithm terminates after a finite number of steps.
If we trace out the instructions of an algorithm, then for all cases the algorithm will
terminate after a finite number of steps.
Every statement must be defined clearly and unambiguously, and no unnecessary statements are allowed.
Algorithm vs. Program:

Algorithm:
• It is a well-defined, step-by-step, logical procedure for solving a given problem.
• Design phase: an algorithm is the design produced before the software is constructed.
• To write an algorithm, a person needs knowledge of the respective problem domain.
• Algorithms are fully independent of hardware and operating system.
• It can be written in any language, such as Hindi, French, Chinese or plain English, and can be understood by people from a non-programming background.

Program:
• It refers to a set of instructions for a computer to follow. A program can be an implementation of many algorithms.
• Implementation phase: after the design is ready, we write the code that builds the software.
• To write programs we need programmers.
• Programs are fully dependent on hardware and operating system.
• It can be written in any programming language such as Python, Java, C++ or JavaScript, depending on the task the program is being designed for.
The study of algorithms includes many important and active areas of research.
How to devise algorithms?
Creating an algorithm is an art which may never be fully automated.
Example Sorting Approaches
Insertion sort, bubble sort, quick sort, heap sort,
block sort, merge sort, radix sort, tree sort, smooth sort, etc…
Example Searching Approaches
Binary search, linear search, interpolation search, exponential search, etc….
Profiling is the process of executing a correct program on data sets and measuring the
time and space it takes to compute the results.
How to analyze algorithms:
As an algorithm is executed, it uses the computer's central processing unit (CPU) to
perform operations and its memory to hold the program and data.
Analysis of algorithms, or performance analysis, refers to the task of determining how much computing time and storage an algorithm requires.
We analyze an algorithm based on its time complexity and space complexity.
The amount of time needed to run the algorithm is called its time complexity.
The amount of memory needed to run the algorithm is called its space complexity.
1. Theoretical analysis:
All possible inputs.
Independent of hardware / software implementation
2. Experimental Study:
Some typical inputs.
Depends on hardware / software implementation.
Algorithm Specification:
An algorithm can be described in three ways:
1. Natural language, such as English.
2. Graphic representation, called a flowchart.
3. Pseudo-code method.
Pseudo-code Method:
This method describes an algorithm as a program that resembles languages like Pascal and ALGOL. Some of its conventions are:
An identifier begins with a letter. The data types of variables are not explicitly declared.
For Loop:
for variable := value-1 to value-2 step step do
{
<statement-1>
.
.
<statement-n>
}
repeat-until:
repeat
<statement-1>
.
.
<statement-n>
until<condition>
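As a point of reference, these two pseudocode constructs map directly onto C's for and do-while loops; the short sketch below (whose loop bodies just print the counter and are purely illustrative) shows the correspondence:

    #include <stdio.h>

    int main(void)
    {
        /* for variable := 1 to 5 step 1 do ... */
        for (int i = 1; i <= 5; i++)
            printf("for loop, i = %d\n", i);

        /* repeat ... until (j > 5)  corresponds to  do { ... } while (!(j > 5)) */
        int j = 1;
        do {
            printf("repeat-until, j = %d\n", j);
            j++;
        } while (!(j > 5));

        return 0;
    }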
Input and output are done using the instructions read and write.
There is only one type of procedure: Algorithm. The heading takes the form
Algorithm Name(parameter list)
Examples:
Algorithm to find the maximum of n numbers:
Algorithm Max(A, n)  // A is an array of size n
{
Result := A[1];
for i := 2 to n do
if A[i] > Result then
Result := A[i];
return Result;
}
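A minimal C version of the same algorithm is sketched below (the only changes are 0-based array indexing and the assumption that n >= 1):

    #include <stdio.h>

    /* Returns the largest element of A[0..n-1]; assumes n >= 1. */
    int Max(int A[], int n)
    {
        int Result = A[0];
        for (int i = 1; i < n; i++)
            if (A[i] > Result)
                Result = A[i];
        return Result;
    }

    int main(void)
    {
        int A[] = {12, 5, 40, 7};
        printf("%d\n", Max(A, 4));   /* prints 40 */
        return 0;
    }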
Algorithm for Selection Sort:
Algorithm SelectionSort(a, n)
// Sort the array a[1:n] into non-decreasing order
{
for i:=1 to n do
{
j:=i;
for k:=i+1 to n do
if (a[k]<a[j]) then j:=k;
t:=a[i];
a[i]:=a[j];
a[j]:=t;
}
}
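The same selection sort, written as a runnable C sketch with 0-based indexing (the array contents in main are only an example):

    #include <stdio.h>

    /* Sorts a[0..n-1] into non-decreasing order by selection sort. */
    void selection_sort(int a[], int n)
    {
        for (int i = 0; i < n; i++) {
            int j = i;                            /* index of the current minimum */
            for (int k = i + 1; k < n; k++)
                if (a[k] < a[j])
                    j = k;
            int t = a[i]; a[i] = a[j]; a[j] = t;  /* swap a[i] and a[j] */
        }
    }

    int main(void)
    {
        int a[] = {5, 2, 9, 1, 7};
        selection_sort(a, 5);
        for (int i = 0; i < 5; i++)
            printf("%d ", a[i]);                  /* prints 1 2 5 7 9 */
        return 0;
    }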
Recursive Algorithms:
A Recursive function is a function that is defined in terms of itself.
Similarly, an algorithm is said to be recursive if the same algorithm is invoked in the
body. An algorithm that calls itself is Direct Recursive.
Algorithm 'A' is said to be indirect recursive if it calls another algorithm which in turn calls 'A'.
Recursive mechanisms are extremely powerful, but even more importantly, many times they can express an otherwise complex process very clearly. For these reasons we introduce recursion here.
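A small C sketch of the two kinds of recursion (the function names fact, is_even and is_odd are only illustrative):

    #include <stdio.h>

    /* Direct recursion: fact calls itself. */
    int fact(int n)
    {
        if (n <= 1) return 1;
        return n * fact(n - 1);
    }

    /* Indirect recursion: is_even calls is_odd, which in turn calls is_even. */
    int is_odd(int n);

    int is_even(int n)
    {
        if (n == 0) return 1;
        return is_odd(n - 1);
    }

    int is_odd(int n)
    {
        if (n == 0) return 0;
        return is_even(n - 1);
    }

    int main(void)
    {
        printf("%d %d\n", fact(5), is_even(10));   /* prints 120 1 */
        return 0;
    }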
The following 2 examples show how to develop recursive algorithms.
In the first, we consider the Towers of Hanoi problem, and in the second, we generate
all possible permutations of a list of characters.
1. Towers of Hanoi:
Towers of Hanoi is a problem in which there are 3 platinum towers and a number of golden disks of decreasing size, the largest at the bottom and the smallest on top, stacked on the source tower. Of the other two towers, one acts as the destination tower and the other as the intermediate tower (auxiliary storage). In this problem we have to move the disks from the source tower to the destination tower, moving only one disk at a time and never placing a larger disk on top of a smaller one.
The steps involved during moving the disks from source tower A to Destination Tower C using
intermediate tower (Auxiliary Memory) B are as follows.
Step 1: Move the smallest disk, at the top of tower A, to tower C.
Step 2: Move the next smallest disk, now at the top of tower A, to tower B.
Step 3: Move the smallest disk from tower C to tower B.
Step 4: Move the largest disk from tower A to tower C.
Step 5: Move the smallest disk, at the top of tower B, to tower A.
Step 6: Move the middle disk from tower B to tower C.
Step 7: Move the smallest disk from tower A to tower C.
In this way disks are moved from source tower to destination tower.
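The same process for any number of disks can be written as a short recursive C sketch (tower names are passed as characters; for n = 3 it prints exactly the seven moves listed above):

    #include <stdio.h>

    /* Moves n disks from tower 'from' to tower 'to', using tower 'aux' as the intermediate. */
    void hanoi(int n, char from, char to, char aux)
    {
        if (n == 0) return;
        hanoi(n - 1, from, aux, to);                        /* move n-1 smaller disks out of the way */
        printf("Move disk %d from %c to %c\n", n, from, to);
        hanoi(n - 1, aux, to, from);                        /* move them back on top of the largest disk */
    }

    int main(void)
    {
        hanoi(3, 'A', 'C', 'B');
        return 0;
    }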
Performance Analysis:
The performance of a program is the amount of computer memory and time needed to
run a program. We use two approaches to determine the performance of a program. One is
analytical, and the other experimental. In performance analysis we use analytical methods, while
in performance measurement we conduct experiments.
1. Space Complexity:
The space complexity of an algorithm is the amount of memory it needs to run to completion.
2. Time Complexity:
The time complexity of an algorithm is the amount of computer time it needs to run to completion.
Space Complexity:
Space complexity is the amount of memory used by the algorithm (including the input
values to the algorithm) to execute and produce the result.
Auxiliary space is the extra or temporary space used by the algorithm during its execution.
Example 1:
Algorithm Sum(a, n)
{
    s := 0.0;
    for i := 1 to n do
        s := s + a[i];
    return s;
}
In the above algorithm n, s and i occupy one word each, and the array a[] needs n words, so S(P) >= n + 3.
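A direct C translation of this algorithm, with the space remark repeated as a comment (0-based indexing is the only structural change):

    #include <stdio.h>

    /* Returns the sum of a[0..n-1]. Besides the n words for the array,
       only s, i and n need one word each, so S(P) >= n + 3. */
    float sum(float a[], int n)
    {
        float s = 0.0f;
        for (int i = 0; i < n; i++)
            s = s + a[i];
        return s;
    }

    int main(void)
    {
        float a[] = {1.0f, 2.0f, 3.0f};
        printf("%f\n", sum(a, 3));   /* prints 6.000000 */
        return 0;
    }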
Example 2:
Algorithm for sum of numbers using recursion:
Algorithm RSum (a, n)
{
if(n<=0) then
return 0.0;
else
return RSum(a,n-1)+a[n];
}
Space complexity of the above algorithm:
In the above recursive algorithm, space is needed for the value of n, the return address and a pointer to the array. The depth of recursion is (n + 1), and each recursive call requires space for these three items, so the total space occupied by the algorithm is S(P) >= 3(n + 1).
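For comparison, a C sketch of the recursive version; each of the (n + 1) activation records on the call stack holds the value of n, the return address and the pointer to the array, which is where the 3(n + 1) figure comes from:

    #include <stdio.h>

    /* Returns the sum of a[0..n-1] recursively; the depth of recursion is n + 1. */
    float rsum(float a[], int n)
    {
        if (n <= 0)
            return 0.0f;
        return rsum(a, n - 1) + a[n - 1];
    }

    int main(void)
    {
        float a[] = {1.0f, 2.0f, 3.0f};
        printf("%f\n", rsum(a, 3));   /* prints 6.000000 */
        return 0;
    }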
Time Complexity:
The time T(p) taken by a program P is the sum of the compile time and the run time
(execution time)
The compile time does not depend on the instance characteristics. Also we may assume
that a compiled program will be run several times without recompilation. This run time is
denoted by Tp (instance characteristics).
The number of steps any problem statement is assigned depends on the kind of
statement.
For example:
Comments and braces: 0 steps
Assignment statements: 1 step
Expression statements: 1 step
Iterative statements such as for, while and repeat-until: the step count of the control part of the statement.
We can determine the number of steps needed by a program to solve a particular problem
instance in two ways.
We introduce a variable, count, into the program, with an initial value of 0. Statements to increment count by the appropriate amount are introduced into the program.
This is done so that each time a statement in the original program is executed, count is incremented by the step count of that statement.
1. Count method:
The count variable described above is incremented as the program runs; its final value gives the total step count of the program on the given input (a small C sketch of this method is given below).
2. Table (step-count) method:
In this method we determine, for each statement, how many steps it contributes per execution and how many times it is executed; the total step count is the sum of these products over all statements.
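A possible C sketch of the count method applied to the Sum algorithm above (the placement of the count increments follows the step counts listed earlier; for an array of n elements the final value of count is 2n + 3):

    #include <stdio.h>

    int count = 0;   /* global step counter */

    float sum_with_count(float a[], int n)
    {
        float s = 0.0f;
        count++;                 /* assignment s := 0.0 */
        for (int i = 0; i < n; i++) {
            count++;             /* one step for each test of the loop condition */
            s = s + a[i];
            count++;             /* assignment inside the loop */
        }
        count++;                 /* final (failing) test of the loop condition */
        count++;                 /* return statement */
        return s;
    }

    int main(void)
    {
        float a[] = {1.0f, 2.0f, 3.0f};
        sum_with_count(a, 3);
        printf("count = %d\n", count);   /* prints count = 9, i.e. 2n + 3 for n = 3 */
        return 0;
    }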
Asymptotic Notations:
The quality of an algorithm is measured by its efficiency, and the efficiency of an algorithm is measured by computing its time complexity. Asymptotic notations are used to express the time complexity of an algorithm.
Asymptotic notations give the fastest possible time, the slowest possible time and the average time of the algorithm.
The basic asymptotic notations are
1. Big Oh (O)
2. Big Omega (Ω)
3. Theta (Θ) Notation
4. Small oh (o)
5. Small Omega (ω)
1. Big-Oh (O) Notation:
It is used to find the upper bound on the running time of an algorithm, that is, the maximum time taken by the algorithm.
Definition:
Let f(n) and g(n) be two non-negative functions. If there exist two positive constants c and n0, with c > 0, such that f(n) <= c*g(n) for all n >= n0, then f(n) = O(g(n)).
Example:
Here f(n) and g(n) are two non-negative functions, c is a constant and n0, n are positive integers.
Prove that f(n) = O(g(n)) for the functions f(n) = 2n+2 and g(n) = n^2.
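One possible proof (this particular choice of constants is ours, not the only one):
f(n) = 2n+2 and g(n) = n^2. For all n >= 2, 2n+2 <= 2n+n = 3n <= 3n^2.
So with c = 3 and n0 = 2 we have f(n) <= c*g(n) for all n >= n0, and hence f(n) = O(n^2).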
2. Big-Omega (Ω) Notation:
It is used to find the lower bound on the running time of an algorithm, that is, the minimum time taken by the algorithm.
Definition: Let f(n) and g(n) be two non-negative functions. If there exist two positive constants c and n0, with c > 0, such that for all n >= n0
f(n) >= c*g(n), then we say that f(n) = Ω(g(n)).
Example: Let f(n) = 2n+3 and g(n) = n, and take c = 2, so that c*g(n) = 2n.
Then f(n) >= c*g(n) becomes 2n+3 >= 2n, which holds for every n >= 1; for instance, at n = 3 we get 2(3)+3 >= 2(3), i.e., 9 >= 6 (true).
So f(n) >= c*g(n) with c = 2 and n0 = 1, hence f(n) = Ω(g(n)).
3. Theta (Θ) Notation:
It is used to find the running time that lies between the lower bound and the upper bound of an algorithm.
Definition:
Let f(n) and g(n) be two non-negative functions. If there exist positive constants c1, c2 and n0, with c1 > 0 and c2 > 0, such that for all n >= n0
c1*g(n) <= f(n) <= c2*g(n), then f(n) = Θ(g(n)).
Example :
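As one possible illustration, take f(n) = 2n+3 and g(n) = n. For every n >= 1,
2n <= 2n+3 <= 5n,
so with c1 = 2, c2 = 5 and n0 = 1 we have c1*g(n) <= f(n) <= c2*g(n) for all n >= n0, and hence f(n) = Θ(n).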
4. Little-oh (o) Notation:
f(n) = o(g(n)) if and only if lim (n→∞) f(n)/g(n) = 0.
Example: Show that f(n) = 2n+3 is o(n^2).
Sol:
lim (n→∞) f(n)/g(n) = lim (n→∞) (2n+3)/n^2
                    = lim (n→∞) n(2 + 3/n)/n^2
                    = lim (n→∞) (2 + 3/n)/n
                    = 0
So, f(n) = o(n^2).
RANDOMIZED ALGORITHMS:
In a randomized algorithm, the output or the running time are functions of the input and of the random bits chosen.
Probability theory has the goal of characterizing the outcomes of natural or conceptual "experiments." Examples of such experiments include tossing a coin ten times, rolling a die three times, playing a lottery, gambling, and picking a ball from an urn containing white and red balls.
Each possible outcome of an experiment is called a sample point, and the set of all possible outcomes is known as the sample space S. In this text we assume that S is finite (such a sample space is called a discrete sample space). An event E is a subset of the sample space S. If the sample space consists of n sample points, then there are 2^n possible events.
Example : [Tossing three coins]
When a coin is tossed, there are two possible outcomes: heads (H) and tails (T).Consider
the experiment of throwing three coins. There are eight possible outcomes: HHH,HHT, HTH,
HTT, THH, THT, TTH, and TTT. Each such outcome is a sample point. The sets {HHT, HTT, TTT}, {HHH, TTT}, and { } are three possible events. The third event has no sample points and is the empty set.
For this experiment there are 2^8 = 256 possible events.
Quick sort is one of the basic sorting algorithms normally encountered in a data structures course. The idea is simple:
• Choose a pivot element p
• Divide the elements into those less than p, those equal to p, and those greater than p
• Recursively sort the “less than” and “greater than” groups
• Assemble the results into a single list
The algorithm as written here is underspecified because we haven't said how to choose the pivot. The choice turns out to be important! Any pivot will have some number k of elements less than it, and thus at most n - k - 1 elements greater than it. The non-recursive parts of the algorithm pretty clearly take O(n) time. We thus get a recurrence of the form T(n) = T(k) + T(n - k - 1) + O(n).
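A compact C sketch of quick sort with a randomly chosen pivot (a two-way partition is used here for brevity instead of the three-way split described above, and rand() is only one possible source of randomness):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

    /* Sorts a[lo..hi] in place, choosing the pivot uniformly at random. */
    void quick_sort(int a[], int lo, int hi)
    {
        if (lo >= hi) return;
        int p = lo + rand() % (hi - lo + 1);   /* random pivot position */
        swap(&a[p], &a[hi]);                   /* move the pivot to the end */
        int pivot = a[hi], i = lo;
        for (int k = lo; k < hi; k++)          /* partition around the pivot */
            if (a[k] < pivot)
                swap(&a[k], &a[i++]);
        swap(&a[i], &a[hi]);                   /* place the pivot in its final position */
        quick_sort(a, lo, i - 1);              /* recursively sort the "less than" group */
        quick_sort(a, i + 1, hi);              /* recursively sort the "greater than" group */
    }

    int main(void)
    {
        srand((unsigned)time(NULL));
        int a[] = {9, 3, 7, 1, 8, 2};
        quick_sort(a, 0, 5);
        for (int i = 0; i < 6; i++)
            printf("%d ", a[i]);               /* prints 1 2 3 7 8 9 */
        return 0;
    }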