INTRODUCTION
Learning Outcomes of Module-1
Students will be able to:
✔ Represent real-world problems in algorithmic notation.
✔ Analyze the performance of an algorithm.
✔ Identify important problem types.
✔ Use fundamental data structures.
What is an algorithm?
Algorithmics: The Spirit of Computing – David Harel.
A recipe, process, method, technique, procedure, routine, … with the following requirements:
1. Finiteness: terminates after a finite number of steps.
2. Definiteness: rigorously and unambiguously specified.
3. Clearly specified input: valid inputs are clearly specified.
4. Clearly specified/expected output: can be proved to produce the correct output given a valid input.
5. Effectiveness: steps are sufficiently simple and basic.
An algorithm is a sequence of unambiguous instructions
for solving a problem, i.e., for obtaining a required
output for any legitimate input in a finite amount of
time.
• Practical importance: studying algorithms is directly useful for writing correct and efficient programs.
Euclid's Algorithm
Problem: Find gcd(m, n), the greatest common divisor of two nonnegative integers m and n, not both zero.
while n ≠ 0 do
    r ← m mod n
    m ← n
    n ← r
return m
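A direct Python rendering of the pseudocode above (a minimal sketch; the function name gcd_euclid is our own) could look like this:

def gcd_euclid(m, n):
    # Repeat: r <- m mod n; m <- n; n <- r, until n becomes 0.
    while n != 0:
        m, n = n, m % n
    # When n is 0, m holds the greatest common divisor.
    return m

# Example: gcd_euclid(60, 24) == 12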
Other methods for gcd(m, n) [cont.]
Middle-school procedure
Step 1: Find the prime factorization of m.
Step 2: Find the prime factorization of n.
Step 3: Find all the common prime factors.
Step 4: Compute the product of all the common prime factors and return it as gcd(m, n).
Is this an algorithm?
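As written, Steps 1 and 2 are not fully specified (finding a prime factorization is itself a nontrivial procedure), which is why the question is usually answered "not yet". Purely as an illustration, here is one possible Python sketch that makes the steps concrete using trial-division factorization (the helper names are our own):

from collections import Counter

def prime_factorization(x):
    # Trial division: repeatedly divide out the smallest remaining factor.
    factors = Counter()
    d = 2
    while d * d <= x:
        while x % d == 0:
            factors[d] += 1
            x //= d
        d += 1
    if x > 1:
        factors[x] += 1
    return factors

def gcd_middle_school(m, n):
    fm, fn = prime_factorization(m), prime_factorization(n)
    common = fm & fn            # common primes, each taken the minimum number of times
    product = 1
    for prime, count in common.items():
        product *= prime ** count
    return product

# Example: gcd_middle_school(60, 24) == 12   (60 = 2^2 * 3 * 5, 24 = 2^3 * 3)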
Operation Count
The operation count method is a technique used to analyze the time complexity of
algorithms by counting the number of basic operations performed as a function of
the input size.
1. Identify the Basic Operations: Begin by identifying the basic operations that the
algorithm performs. These operations can be simple arithmetic operations (e.g.,
addition, subtraction, multiplication), comparisons (e.g., less than, equal to),
assignments, or any other fundamental operations that are executed repeatedly.
2. Count the Operations: For each basic operation, determine how many times it is
executed based on the input size (n). To do this, you may need to examine the
algorithm's loops, recursive calls, and conditional statements. Keep in mind that the
number of operations might vary depending on the specific input data and any early
termination conditions.
Example
def array_sum(arr):
    sum = 0                  # initialize the running total
    for element in arr:      # visit each of the n elements once
        sum += element       # basic operation: one addition per element
    return sum
Now, applying the operation count method: the basic operation is the addition sum += element, which is executed once for each of the n elements of the array. Hence the operation count is
T(n) ≈ n
Step Count
Here we attempt to account for the time spent in all parts of the program, not just the basic operation.
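As a sketch of how a step-count table is usually built (the steps-per-execution values and frequencies below are our annotation, following one common counting convention, not from the original notes), the array_sum function can be annotated statement by statement:

def array_sum(arr):          # steps/execution   frequency     total
    sum = 0                  #        1              1            1
    for element in arr:      #        1            n + 1        n + 1   (loop test)
        sum += element       #        1              n            n
    return sum               #        1              1            1
                             # total step count: T(n) = 2n + 3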
Asymptotic Analysis of Algorithms (Growth of Functions)
The resources required by an algorithm are usually expressed as a function of the input size. Often this function is messy and complicated to work with, so to study its growth efficiently we reduce it to its most important part.
Let f(n) = an² + bn + c.
In this function the n² term dominates once n gets sufficiently large. The dominant term is what we are interested in when reducing a function: we ignore all constants and coefficients and look at the highest-order term in n.
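For instance, with purely illustrative coefficients a = 2, b = 100, c = 1000 (these values are our own, not from the notes), a short computation shows how the n² term comes to dominate:

a, b, c = 2, 100, 1000            # illustrative coefficients only
f = lambda n: a * n**2 + b * n + c
for n in (10, 100, 1000, 10000):
    # Fraction of f(n) contributed by the quadratic term.
    print(n, round(a * n**2 / f(n), 3))
# The fraction approaches 1, so f(n) grows like n^2 up to a constant factor.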
Asymptotic notation:
The word asymptotic means approaching a value or curve arbitrarily closely (i.e., as some sort of limit is taken). Asymptotic analysis studies functions of a parameter n as n becomes larger and larger without bound.
Here we are concerned with how the running time of an algorithm increases with the size of the input.
Asymptotic notations are used to express bounds on the fastest and slowest possible running times of an algorithm, often associated with the best-case and worst-case scenarios respectively.
In asymptotic notation we express the complexity in terms of the size of the input (for example, in terms of n). These notations are important because they let us estimate the complexity of an algorithm without computing the exact cost of running it.
Asymptotic Notations:
Asymptotic notation is a way of comparing functions that ignores constant factors and small input sizes. Three notations are used to express the running-time complexity of an algorithm:
1. Big-oh notation: Big-oh is the formal way of expressing an upper bound on an algorithm's running time; it measures the longest amount of time the algorithm can take. The function f(n) = O(g(n)) [read as "f of n is big-oh of g of n"] if and only if there exist positive constants c and n0 such that
f(n) ≤ c·g(n) for all n ≥ n0.
Hence g(n) is an upper bound for f(n): g(n) grows at least as fast as f(n), up to a constant factor.
For example:
1. 3n + 2 = O(n), since 3n + 2 ≤ 4n for all n ≥ 2.
2. 3n + 3 = O(n), since 3n + 3 ≤ 4n for all n ≥ 3.
Hence, the complexity of f(n) can be represented as O(g(n)).
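A quick numeric spot-check of the first example (f(n) = 3n + 2, g(n) = n, with the witnesses c = 4 and n0 = 2), written as a small sketch:

# Check f(n) <= c * g(n) for a range of n >= n0, with c = 4 and n0 = 2.
f = lambda n: 3 * n + 2
g = lambda n: n
assert all(f(n) <= 4 * g(n) for n in range(2, 10000))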
2. Omega (Ω) notation: The function f(n) = Ω(g(n)) [read as "f of n is omega of g of n"] if and only if there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.
For example:
f(n) = 8n² + 2n - 3 ≥ 8n² - 3 = 7n² + (n² - 3) ≥ 7n² for all n ≥ 2, so with c = 7 and n0 = 2 we have f(n) = Ω(n²).
Hence, the complexity of f(n) can be represented as Ω(g(n)).
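This lower bound can also be spot-checked numerically (again only a sketch, not part of the notes):

# Check f(n) >= c * g(n) for a range of n >= n0, with c = 7 and n0 = 2.
f = lambda n: 8 * n**2 + 2 * n - 3
assert all(f(n) >= 7 * n**2 for n in range(2, 10000))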
3. Theta (θ) notation: The function f(n) = θ(g(n)) [read as "f of n is theta of g of n"] if and only if there exist positive constants k1, k2 and n0 such that
k1·g(n) ≤ f(n) ≤ k2·g(n) for all n ≥ n0.
For example:
3n + 2 = θ(n), since 3n + 2 ≥ 3n and 3n + 2 ≤ 4n for all n ≥ 2; here k1 = 3, k2 = 4 and n0 = 2.
Hence, the complexity of f(n) can be represented as θ(g(n)).
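The two-sided Theta bound for 3n + 2 can be spot-checked in the same way (a sketch only):

# Check k1 * g(n) <= f(n) <= k2 * g(n) for n >= n0, with k1 = 3, k2 = 4, n0 = 2.
f = lambda n: 3 * n + 2
assert all(3 * n <= f(n) <= 4 * n for n in range(2, 10000))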
Computing time functions
Values of some important functions as n → ∞
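As an illustrative sketch (not a reproduction of the original table), the following computes representative values of the common growth functions for a few input sizes:

import math

for n in (10, 100, 1000):
    print(f"n={n:<5} log2 n={math.log2(n):6.1f}  n log2 n={n * math.log2(n):10.0f}  "
          f"n^2={n**2:<10} n^3={n**3:<12} 2^n={2**n if n <= 30 else 'astronomical'}")
# Even at n = 100, 2^n dwarfs n^3, which is why exponential-time algorithms are
# considered impractical for all but very small inputs.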
Order of growth
• Most important: the order of growth within a constant multiple as n → ∞.
• Example: how much faster will the algorithm run on a computer that is twice as fast? Only by a constant factor of two; the order of growth of its running time does not change.
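As a hedged illustration of why the order of growth matters (not part of the original slides), this sketch shows how the running-time estimate changes when the input size doubles, for several common growth rates:

import math

growth = {
    "log2 n": math.log2,
    "n": lambda n: n,
    "n log2 n": lambda n: n * math.log2(n),
    "n^2": lambda n: n**2,
    "2^n": lambda n: 2**n,
}
n = 20
for name, f in growth.items():
    # Ratio f(2n)/f(n): how much longer the algorithm takes when the input doubles.
    print(f"{name:>9}: {f(2 * n) / f(n):.2f}")
# A machine that is twice as fast only divides each of these times by 2,
# so the growth rate, not the constant factor, determines scalability.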
Best-case, average-case, worst-case
For some algorithms, efficiency depends on the form of the input, so the worst-case, best-case and average-case efficiencies are analyzed separately.
Recurrence Relations
The running time of a recursive algorithm is often expressed as a recurrence; a typical example is
T(n) = 2T(n/2) + θ(n) if n > 1.
There are four methods for solving a recurrence:
1. Substitution Method
2. Iteration Method
3. Recursion Tree Method
4. Master Method
Iteration Method
Here we expand the recurrence and express it as a summation of terms depending on n and the initial condition.
Example 1: Consider the recurrence
T(n) = 1 if n = 1
T(n) = 2T(n - 1) if n > 1
Expanding by iteration: T(n) = 2T(n - 1) = 2²T(n - 2) = ... = 2^(n-1) T(1) = 2^(n-1).
Example 2: Consider the recurrence
T(n) = T(n - 1) + 1 with T(1) = θ(1).
Expanding by iteration: T(n) = T(n - 1) + 1 = T(n - 2) + 2 = ... = T(1) + (n - 1) = θ(n).
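The closed forms obtained by the expansions above can be spot-checked with a direct recursive computation (a sketch only; θ(1) is taken to be 1):

def t1(n):
    # Example 1: T(1) = 1, T(n) = 2 * T(n - 1) for n > 1.
    return 1 if n == 1 else 2 * t1(n - 1)

def t2(n):
    # Example 2: T(1) = 1, T(n) = T(n - 1) + 1 for n > 1.
    return 1 if n == 1 else t2(n - 1) + 1

assert all(t1(n) == 2 ** (n - 1) for n in range(1, 20))  # T(n) = 2^(n - 1)
assert all(t2(n) == n for n in range(1, 20))             # T(n) = n, i.e. θ(n)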