DAA Lecture1
ALGORITHM
Prepared by
Murari Kumar Singh
Assistant Professor
CSE SET, Greater Noida
Why study algorithms?
• Their impact is broad and far-reaching.
• Speed is fun!
• For intellectual stimulation.

“For me, great algorithms are the poetry of computation. Just like verse, they can be
terse, allusive, dense, and even mysterious. But once unlocked, they cast a brilliant
new light on some aspect of computing.” — Francis Sullivan

“I will, in fact, claim that the difference between a bad programmer and a good one is
whether he considers his code or his data structures more important. Bad programmers
worry about the code. Good programmers worry about data structures and their
relationships.” — Linus Torvalds (creator of Linux)

“Algorithms: a common language for nature, human, and computer.” — Avi Wigderson
Algorithm
• An algorithm is any well-defined computational procedure that takes some value, or
set of values, as input and produces some value, or set of values, as output.
• It is thus a sequence of computational steps that transform the input into the
output.
Some broad classes of algorithms:
• PROBABILISTIC ALGORITHMS
• HEURISTIC ALGORITHMS
• APPROXIMATION ALGORITHMS
Experimental analysis of running time
• The running time of an algorithm typically grows with the input size.
• Run the program with inputs of varying size and composition.
• Use a method like System.currentTimeMillis() to get an accurate measure of the
actual running time.
• Plot the results.

[Figure: plot of running time in ms (0–8000) against input size (0–100).]
Limitations of Experimental approach
A problem we all know how to solve:
Integer Multiplication

The grade-school method, e.g. for 13 × 43:

      1 3
    × 4 3
    -----
      3 9    (13 × 3)
    5 2      (13 × 4, shifted one place)
    -----
    5 5 9

Larger instances (132 × 432, 1234 × 4321) work the same way, digit by digit.
Asymptotic Notation
• Mathematical tools that allow us to analyse an algorithm's running time by
identifying its behaviour as the input size increases.
Big-Oh, O()
• f(n) is O(g(n)) if there exist positive real constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for every input size n ≥ n0.

O(g(n)) = { f(n) : there exist positive constants c and n0 such that
            0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

[Figure: growth-rate plot showing f(n) bounded above by c·g(n) beyond n0.]

Example:
1) f(n) = 2n² + 3n + 6 ≤ 2n² + 4n for n ≥ 6 (since 3n + 6 ≤ 4n when n ≥ 6),
and 2n² + 4n ≤ 6n² for n ≥ 1; so f(n) = O(n²) with c = 6, n0 = 6.
Theta, Θ()
f(n) is theta g(n), f(n) = Θ(g(n)), if for sufficiently large n, f(n) is bounded from
both above and below by constant multiples of g(n): there exist positive constants
c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.

[Figure: growth-rate plot with f(n) sandwiched between c1·g(n) and c2·g(n).]
Little–Oh, o(): We say that f(n) is o(g(n)) (or f(n) ∈ o(g(n))) if for every real
constant c > 0, there exists an integer constant n0 ≥ 1 such that f(n) < c · g(n)
for every integer n ≥ n0.
Little–Omega, ω(): Let f(n) and g(n) be functions that map positive integers to
positive real numbers. We say that f(n) is ω(g(n)) (or f(n) ∈ ω(g(n))) if for every
real constant c > 0, there exists an integer constant n0 ≥ 1 such that
f(n) > c · g(n) for every integer n ≥ n0.
[Figure: plot of growth rate against input n, visualizing the relationships between
these notations: curves in o(f(n)) grow strictly slower than f(n), those in O(f(n))
no faster, those in Ω(f(n)) no slower, and those in ω(f(n)) strictly faster.]
Relationship between asymptotic notations
[Figure: Venn diagram of O (Big-Oh), Ω (Big-Omega), Θ, o (little-oh), and
ω (little-omega).]
• o (little-oh) ∪ Θ = O (Big-Oh)
• ω (little-omega) ∪ Θ = Ω (Big-Omega)
• O (Big-Oh) ∩ Ω (Big-Omega) = Θ
Asymptotic Growth

Linear loop: the body runs n−1 times, so T(n) = Θ(n).

for (i = 1; i < n; i++)             /* n−1 iterations */
{
    printf("Hello");
}

Dependent quadratic loop: the inner loop runs n−1, n−2, …, 1 times (about (n−1)/2
on average), so T(n) = (outer iterations) × (average inner iterations) = Θ(n²).

for (i = 1; i < n; i++)             /* n−1 iterations */
{
    for (j = i; j < n; j++)         /* about (n−1)/2 iterations on average */
    {
        printf("Hello");
    }
}

Doubling inner loop: doubling j means the inner loop runs about log₂(n) times,
so T(n) = Θ(n log n).

for (i = 1; i < n; i++)             /* n−1 iterations */
{
    for (j = 1; j < n; j = j * 2)   /* about log₂(n) iterations */
    {
        printf("hello");
    }
}

Triple loop: T(n) = Θ(n² log n).

for (i = n; i > 1; i--)             /* about n iterations */
{
    for (j = 1; j < n; j++)         /* n−1 iterations */
    {
        for (k = n; k > 1; k = k / 2)   /* about log₂(n) iterations */
        {
            printf("Hello");
        }
    }
}

Halving inner loop: T(n) = Θ(n log n).

for (j = 1; j < n; j++)             /* n−1 iterations */
{
    for (k = n; k > 1; k = k / 2)   /* about log₂(n) iterations */
    {
        printf("Hello");
    }
}
Asymptotic Growth

N        N^(1/2)   log₂(N)
1        1         0
2        1.41      1
3        1.73      1.58   (1 < log₂3 < 2)
4        2         2
…        …         …
16       4         4
…        …         …
64       8         6
1024     32        10
2048     45.25     11
Arrange the following (asymptotically) increasing order of growth rate
DESIGN AND ANALYSIS OF ALGORITHM
Solving Recurrences: Substitution, Iteration, Master Method
• Many algorithms, particularly divide-and-conquer algorithms, have time
complexities that are naturally modelled by recurrence relations.

Substitution method:
1. Guess the form of the solution.
2. Verify by induction.
3. Solve for the constants.

• The recursion-tree method is good for generating guesses for the substitution
method.
• The recursion-tree method can be unreliable, just like any method that uses
ellipses (…).
Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n²:

Expanding the tree level by level (each node's cost is shown; its children replace
the T(·) terms):

Level 0:  n²                                       cost n²
Level 1:  (n/4)²  (n/2)²                           cost (5/16) n²
Level 2:  (n/16)² (n/8)² (n/8)² (n/4)²             cost (25/256) n² = (5/16)² n²
…
Leaves:   Θ(1)
Conclusion: the level costs form a geometric series with ratio 5/16, so
T(n) = n² (1 + 5/16 + (5/16)² + …) ≤ n² · 1/(1 − 5/16) = (16/11) n², giving
T(n) = Θ(n²).
Solving recurrences by the master method
The master method (generalization)
Akra–Bazzi: for recurrences of the form T(n) = Σᵢ aᵢ T(n/bᵢ) + f(n), let p be the
unique real number satisfying

    Σ_{i=1}^{k} aᵢ / bᵢ^p = 1.

Then the answers are the same as for the master method, but with n^p instead of
n^(log_b a). (Akra and Bazzi also prove an even more general result.)
Idea of master theorem
Recursion tree for T(n) = a T(n/b) + f(n):

Level 0:  f(n)                      cost f(n)
Level 1:  a nodes of f(n/b)         cost a f(n/b)
Level 2:  a² nodes of f(n/b²)       cost a² f(n/b²)
…
Height h = log_b n; #leaves = a^h = a^(log_b n) = n^(log_b a), each costing Θ(1),
so the leaf level costs Θ(n^(log_b a)).

CASE 1: The weight increases geometrically from the root to the leaves. The leaves
hold a constant fraction of the total weight. T(n) = Θ(n^(log_b a)).

CASE 2: (k = 0) The weight is approximately the same on each of the log_b n levels.
T(n) = Θ(n^(log_b a) lg n).

CASE 3: The weight decreases geometrically from the root to the leaves. The root
holds a constant fraction of the total weight. T(n) = Θ(f(n)).
Let's revise
The Master Theorem
• Given: a divide and conquer algorithm
• An algorithm that divides the problem of size n into a subproblems, each of size n/b
• Let the cost of each stage (i.e., the work to divide the problem + combine solved subproblems)
be described by the function f(n)
• Then, the Master Theorem gives us a cookbook for the algorithm’s running time:
• If T(n) = a T(n/b) + f(n) with constants a ≥ 1 and b > 1, then:

CASE 1: if f(n) = O(n^(log_b a − ε)) for some constant ε > 0,
        then T(n) = Θ(n^(log_b a)).

CASE 2: if f(n) = Θ(n^(log_b a)),
        then T(n) = Θ(n^(log_b a) · lg n).

CASE 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, AND
        a · f(n/b) ≤ c · f(n) for some constant c < 1 and all sufficiently large n,
        then T(n) = Θ(f(n)).
Using The Master Method
• T(n) = 9T(n/3) + n
• a = 9, b = 3, f(n) = n
• n^(log_b a) = n^(log₃ 9) = Θ(n²)
• Since f(n) = n = O(n^(log₃ 9 − ε)) with ε = 1, case 1 applies: T(n) = Θ(n²).