
DATA STRUCTURES AND ALGORITHMS

Time Complexity Basics

INSTRUCTOR: Sulaman Ahmad Naz
Time Complexity of Algorithms
■ When we talk about the time complexity of an algorithm, what do we actually mean?
■ It is not the actual running time, because running time is not a standard measure:
– It varies from machine to machine and compiler to compiler.
– It even varies on the same machine, depending on the availability of resources.
Time Complexity of Algorithms
■ We calculate an estimate, not the actual running time.
■ Complexity can be viewed as the maximum number of primitive operations that a program may execute.
■ Regular operations are:
– Addition
– Multiplication
– Assignment
– Accessing an array element, etc.
■ We may leave some operations uncounted and concentrate on those that are performed the largest number of times.
Time Complexity of Algorithms
■ We define complexity as a numerical function T(n): time versus the input size n.
■ We want to define the time taken by an algorithm without depending on the implementation details.
■ T_algorithm#1(n) = n(n-1) = n² - n   //quadratic time
■ T_algorithm#2(n) = 2(n-1) = 2n - 2   //linear time

Time Complexity of Algorithms
■ We have already talked about how to get the time complexity function of any algorithm.
■ We consider the basic operations (computational steps) only:
– Arithmetic operations
– Logical operations
– Relational operations
– Accessing an element of an array
– Assignment operations
■ We do not consider all the operations; we consider only the most dominant ones, as we only need an approximation.
Time Complexity of Iterations
■ What if a set of instructions is repeated many times in an algorithm?
■ Usually a block of code is repeated using one of two techniques:
– Loops
■ For loop
■ While loop
– Recursion
Iterative vs. Recursive Time

Iterative:
A()
{
  For j = 1 to n
    Print "Hello"
}

Recursive:
A(n)
{
  If (-----)
    A(n/2)
}
Time Complexity (Iterations)

For i = 1 to n
  <computational instruction>

T(n) = n = O(n)
-----------------------------------------------------------
For i = 1 to n
  For j = 1 to n
    <computational instruction>

T(n) = n·n = n² = O(n²)

i | 1  2  3  4  5  6  7  …  n
j | n  n  n  n  n  n  n  …  n
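A minimal sketch of the counting above: instrumenting the single and double loops with a counter confirms T(n) = n and T(n) = n²:

```python
def count_single(n):
    # For i = 1 to n: one computational instruction per iteration.
    count = 0
    for i in range(1, n + 1):
        count += 1
    return count  # = n

def count_double(n):
    # For i = 1 to n: For j = 1 to n: the inner body runs n times per i.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            count += 1
    return count  # = n * n
```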
Time Complexity (Iterations)

For i = 1 to n
  For j = 1 to n
    For k = 1 to n
      <computational instruction>

T(n) = n·n·n = n³ = O(n³)
----------------------------------------------------------------
For i = 1 to n
  For j = 1 to n
    For k = 1 to n
      For l = 1 to n
        <computational instruction>

T(n) = n·n·n·n = n⁴ = O(n⁴)
Time Complexity (Iterations)

For i = 1; i² <= n; i++
  <computational instruction>

The loop runs while i² ≤ n, i.e. while i ≤ √n.
T(n) = O(√n)
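To check the O(√n) bound, a counter over the same loop shape matches the integer square root of n (math.isqrt):

```python
import math

def count_sqrt_loop(n):
    # For i = 1; i*i <= n; i++  -- runs while i <= sqrt(n).
    count = 0
    i = 1
    while i * i <= n:
        count += 1
        i += 1
    return count  # = floor(sqrt(n))
```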
Time Complexity (Iterations)

For i = 1 to n
  For j = 1 to i
    For k = 1 to 100
      <computational instruction>

T(n) = 100 + 200 + … + n·100
     = 100(1 + 2 + … + n)
     = 100·n(n+1)/2
     = O(n²)

i | 1            2            3            4            …  n
j | 1 time       2 times      3 times      4 times      …  n times
k | 1×100 times  2×100 times  3×100 times  4×100 times  …  n×100 times
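The closed form 100·n(n+1)/2 can be sanity-checked by brute-force counting (a sketch; the function name is my own):

```python
def count_i_j_100(n):
    # For i = 1 to n: For j = 1 to i: For k = 1 to 100
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(100):
                count += 1
    return count  # = 100 * n(n+1)/2
```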
Things to remember
Sum of the squares of the first m natural numbers = m(m+1)(2m+1)/6
Time Complexity (Iterations)

For i = 1 to n
  For j = 1 to i²
    For k = 1 to n/2
      <computational instruction>

T(n) = 1·(n/2) + 4·(n/2) + 9·(n/2) + … + n²·(n/2)
     = (n/2)(1 + 4 + 9 + … + n²)
     = (n/2)(1² + 2² + 3² + … + n²)
     = (n/2)·n(n+1)(2n+1)/6
     = O(n⁴)

i | 1              2              3              4               …  n
j | 1 time         4 times        9 times        16 times        …  n² times
k | (n/2)×1 times  (n/2)×4 times  (n/2)×9 times  (n/2)×16 times  …  (n/2)×n² times
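Again, the closed form (n/2)·n(n+1)(2n+1)/6 can be checked against a direct count (a sketch; n/2 is taken as integer division):

```python
def count_i_isq_halfn(n):
    # For i = 1 to n: For j = 1 to i^2: For k = 1 to n/2
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            for k in range(1, n // 2 + 1):
                count += 1
    return count  # = (n//2) * sum of squares 1..n
```

For n = 6 the sum of squares 1² + … + 6² is 91, so the count is 3 × 91 = 273.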
Time Complexity (Iterations)

i | 1  2  4  8  …  n

For i = 1 to n & i = i*2
  <computational instruction>
---------------------------
For i = 1 to n & i = i*3
  <computational instruction>
---------------------------
For i = 1 to n & i = i*m
  <computational instruction>
Time Complexity (Iterations)

i | 1=2⁰  2=2¹  4=2²  8=2³  …  n=2ᵏ

For i = 1 to n & i = i*2
  <computational instruction>

The loop stops when i reaches n, i.e. n = 2ᵏ:
log₂n = log₂2ᵏ
log₂n = k·log₂2
log₂n = k, or k = log₂n

T(n) = k + 1 = log₂n + 1 = O(log₂n)
Time Complexity (Iterations)

For i = 1 to n & i = i*2
  <computational instruction>
T(n) = O(log₂n)
---------------------------
For i = 1 to n & i = i*3
  <computational instruction>
T(n) = O(log₃n)
---------------------------
For i = 1 to n & i = i*m
  <computational instruction>
T(n) = O(logₘn)
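The ⌊log_m n⌋ + 1 iteration count for any multiplicative step can be sketched with one counter function parameterized by the factor m:

```python
def count_multiplicative(n, m=2):
    # For i = 1 to n with i = i*m: i takes values m^0, m^1, ..., m^k <= n.
    count = 0
    i = 1
    while i <= n:
        count += 1
        i *= m
    return count  # = floor(log_m(n)) + 1
```

For example, n = 16 with m = 2 gives the 5 values 1, 2, 4, 8, 16, and n = 27 with m = 3 gives the 4 values 1, 3, 9, 27.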
Time Complexity (Iterations)

For i = n/2 to n
  For j = 1 to n/2
    For k = 1 to n & k = k*2
      <computational instruction>

T(n) = (n/2 + 1)·(n/2)·O(lg n) = O(n²·lg n)
-----------------------------------------------------------
For i = n/2 to n
  For j = 1 to n & j = 2*j
    For k = 1 to n & k = k*2
      <computational instruction>

T(n) = (n/2 + 1)·O(lg n)·O(lg n) = O(n·(lg n)²)
Time Complexity (Iterations)

While (n > 1)
  n = n/2
  <computational instruction>

T(n) = O(log₂n)

i = 1 to n with i = i*2  ←same→  i = n to 1 with i = i/2
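The halving while-loop behaves symmetrically to the doubling for-loop, as a counter shows (a sketch, using integer division for n/2):

```python
def count_halving(n):
    # While (n > 1): n = n/2  -- the doubling loop run backwards.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count  # about floor(log2(n)) iterations
```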

Time Complexity (Iterations)

For i = 1 to n
  For j = 1 to n & j = j+i
    <computational instruction>

T(n) = n + n/2 + n/3 + … + n/n
     = n(1 + 1/2 + 1/3 + … + 1/n)
     ≈ n·Hₙ    (Hₙ ≈ ln n + γ, where γ = 0.5772156649… is the Euler–Mascheroni constant)
     = n·O(ln n)
     = O(n·ln n)

i | 1        2          3          …  n
j | n times  n/2 times  n/3 times  …  n/n times
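A sketch verifying the harmonic count: the inner loop with step i runs roughly n/i times (exactly ⌊(n-1)/i⌋ + 1), and the total over all i matches that sum:

```python
def count_harmonic(n):
    # For i = 1 to n: For j = 1 to n with j = j + i
    count = 0
    for i in range(1, n + 1):
        j = 1
        while j <= n:
            count += 1
            j += i
    return count  # = sum over i of floor((n-1)/i) + 1, roughly n * H_n
```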
Iterative vs. Recursive Time

Iterative:
A()
{
  For j = 1 to n
    Print "Hello"
}

Recursive:
A(n)
{
  If (-----)
    A(n/2)
}
Recursions
■ A function calls itself repeatedly.
■ QUESTION: For how long will a function call itself?
■ ANSWER: The parameters are slightly modified at each call…
■ PURPOSE: …so that the function eventually reaches the ANCHOR condition.
■ ANCHOR CONDITION: The point at which the recursive calls come to an end.
Example
Algo(n)
{
  //some lines of code
  If n = 1
    return 1
  else
    return Algo(n-1)
}
-------------------------------------------------------------------------
T(n) = c + T(n-1) for n > 1; otherwise T(n) = c for n = 1
-------------------------------------------------------------------------
Here, "c" is the time complexity of "//some lines of code" and the comparison.
And, T(n-1) is the time complexity of the next recursive call.
Example

Algo(n) → Algo(n-1) → Algo(n-2) → … → Algo(1)
   c          c           c              c

T(n) = c + c + … + c   (n times)
     = n·c
     = O(n)
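A sketch of Algo(n) instrumented to count its own invocations: each call does constant work and recurses on n-1, so exactly n calls are made, giving O(n):

```python
def algo_call_count(n):
    # Mirrors Algo(n): anchor at n == 1, otherwise recurse on n - 1,
    # returning the total number of invocations instead of doing real work.
    if n == 1:
        return 1          # the anchor call itself
    return 1 + algo_call_count(n - 1)
```

(For very large n, Python's recursion limit would be hit; the point here is only the call count.)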
Recursion Tree Method
■ Let's consider another example:
T(n) = T(n-1) + n, for n > 1
T(1) = n  ← This is the Anchor Condition
-------------------------------------------------------------------------
Note that here, at each step, only one branch is drawn, decreasing the input size by 1 at each step. At each node, work equal to the current input size is done.
T(n)          → work n
T(n-1)        → work n-1
T(n-2)        → work n-2
T(n-3)        → work n-3
…
T(n-k) = T(1) → work n

T(n) = n + (n-1) + (n-2) + … + 2 + n
     = n + (n-1) + (n-2) + … + 2 + 1 - 1 + n
     = n(n+1)/2 - 1 + n
     = O(n²)
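The level-by-level sum can be checked directly (a sketch keeping the slide's anchor T(1) = n, which contributes n at the last level):

```python
def total_work(n):
    # Levels contribute n, n-1, ..., 2; the anchor T(1) = n contributes n.
    return sum(range(2, n + 1)) + n  # = n(n+1)/2 - 1 + n
```

For n = 10 this gives (55 - 1) + 10 = 64, matching the closed form.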
Methods of Solution

Recursive Relations:
– Back Substitution
– Recursion Tree Method
– Master Theorem
Master Theorem
■ The master theorem provides a solution in asymptotic terms (using Θ notation) for recurrence relations of the types that occur in the analysis of many divide-and-conquer algorithms.
■ The function must match a specific pattern.
■ Not all problems can be solved by the Master Theorem.
■ It has various versions; we will use an easy one.
Master Theorem
■ Function Pattern:
– T(n) = a·T(n/b) + Θ(n^c · log^d n),  T(1) = Θ(1)
– Where a, b, c, d are constants
– Constraints:
■ a ≥ 1
■ b > 1
■ c ≥ 0
■ d is any real number
Master Theorem
■ Now, we have three possibilities:
1. If a > b^c, then T(n) = Θ(n^(log_b a))
2. If a = b^c, then:
– If d > -1, then T(n) = Θ(n^(log_b a) · log^(d+1) n)
– If d = -1, then T(n) = Θ(n^(log_b a) · log log n)
– If d < -1, then T(n) = Θ(n^(log_b a))
3. If a < b^c, then T(n) = Θ(n^c · log^d n)
Master Theorem
Let's consider an example:
T(n) = 3T(n/2) + n²
Here, a = 3, b = 2, c = 2, and d = 0.
Now, b^c = 2² = 4.
Here a < b^c, so the third possibility is our selection.
As d = 0, the solution is:
T(n) = Θ(n² · log⁰ n) = Θ(n²)
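The easy version of the theorem stated above is mechanical enough to code. This is a sketch (the function name and the Theta-string output format are my own), covering the three cases for T(n) = a·T(n/b) + Θ(n^c · log^d n):

```python
import math

def master_theorem(a, b, c, d=0):
    # Simplified master theorem: assumes a >= 1, b > 1, c >= 0.
    # Returns the asymptotic solution as a "Theta(...)" string.
    bc = b ** c
    if a > bc:                              # case 1
        e = round(math.log(a, b), 6)        # exponent log_b(a)
        return f"Theta(n^{e:g})"
    if a == bc:                             # case 2
        if d > -1:
            return f"Theta(n^{c:g} * log^{d + 1:g} n)"
        if d == -1:
            return f"Theta(n^{c:g} * log log n)"
        return f"Theta(n^{c:g})"
    # case 3: a < b^c
    if d == 0:
        return f"Theta(n^{c:g})"
    return f"Theta(n^{c:g} * log^{d:g} n)"
```

For the example above, master_theorem(3, 2, 2, 0) reduces Θ(n² · log⁰ n) to Θ(n²).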
Class Tasks
■ Here are some practice questions.
■ Do solve them by yourselves and consult me in case of any problem.
1. T(n) = 4T(n/2) + n²
2. T(n) = T(n/2) + n²
3. T(n) = 2ⁿT(n/2) + nⁿ
4. T(n) = 16T(n/4) + n
5. T(n) = 2T(n/2) + n·log n
6. T(n) = 2T(n/2) + n/log n
7. T(n) = 2T(n/4) + n^0.51
End of Lecture

THANK YOU
