Lecture 18: Order of Growth

Order of Growth of Algorithms

1
Efficiency of Algorithm
• Suppose X is an algorithm and n is the size of its input data.
• The time and space used by algorithm X are the two main factors that decide the efficiency of X.
• The measure of time efficiency is called time complexity; likewise, the measure of space efficiency is called space complexity.
• Time complexity depends on
  – Input size
  – Basic operation
2
Concept of Basic Operation
• Time efficiency is analyzed by determining the
number of repetitions of the basic operation as a
function of input size.

• Basic operation: the operation that contributes most towards the running time of the algorithm.
  – As a rule, the basic operation is located in the algorithm's inner-most loop.
• What is the basic operation in searching?
3
Time Complexity
• The time complexity of an algorithm represents the amount of time required by the algorithm to run to completion.
• Time requirements can be defined as a numerical function T(n), where T(n) can be measured as the number of steps, provided each step consumes constant time.
4
Space Complexity
• The space complexity of an algorithm represents the amount of memory space required by the algorithm during its life cycle.

5
Every case Time Complexity
• For a given algorithm, T(n) is the every-case time complexity if the algorithm repeats its basic operation the same number of times for every input of size n. Determining T(n) is called every-case time complexity analysis.

6
Every case time complexity(examples)

• Sum of elements of array


Algorithm sum_array(A, n)
  sum = 0
  For i = 1 to n
    sum += A[i]
  Return sum

7
Every case time complexity(examples)

• Basic operation: addition of array elements
• Repeated how many times? Once per element, i.e. n times
• Complexity: T(n) = n

8
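As a cross-check, here is a minimal runnable Python version of the pseudocode above (the function and variable names are illustrative, not from the lecture); a counter confirms that the basic operation executes exactly n times:

def sum_array(a):
    # Sum the elements of a, counting the basic operation (addition of an element).
    total = 0
    additions = 0
    for x in a:
        total += x
        additions += 1      # basic operation performed once per element
    return total, additions

print(sum_array([3, 1, 4, 1, 5]))   # -> (14, 5): n = 5 additions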
Every case time complexity(examples)

• Exchange Sort
Algorithm exchange_sort(A, n)
  For i = 1 to n-1
    For j = i+1 to n
      If A[i] > A[j]
        exchange A[i] and A[j]

9
Every case time complexity(examples)

• Basic operation: comparison of array elements
• Repeated how many times? The inner loop runs n-1, n-2, ..., 1 times.
• Complexity: T(n) = (n-1) + (n-2) + ... + 1 = n(n-1)/2

10
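A minimal Python sketch of exchange sort (illustrative names, not from the lecture); the counter confirms the comparison runs exactly n(n-1)/2 times regardless of the input's initial order:

def exchange_sort(a):
    # Sort a in place, counting the basic operation (element comparison).
    n = len(a)
    comparisons = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            comparisons += 1            # basic operation
            if a[i] > a[j]:
                a[i], a[j] = a[j], a[i] # exchange A[i] and A[j]
    return comparisons

data = [5, 2, 4, 1, 3]
print(exchange_sort(data), data)   # -> 10 [1, 2, 3, 4, 5]; 10 = 5*4/2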
Best Case Time Complexity

• For a given algorithm, B(n) is the best-case time complexity if the algorithm repeats its basic operation the minimum number of times over all inputs of size n. Determining B(n) is called best-case time complexity analysis.

11
Best Case Time Complexity (Example)

Algorithm sequential_search(A, n, key)
  i = 0
  While i < n && A[i] != key
    i = i + 1
  If i < n
    Return i
  Else
    Return -1
12
Best Case Time Complexity (Example)

• Input size: the number of elements in the array, i.e. n
• Basic operation: comparison of the key with array elements
• Best case: the first element is the required key
• Complexity: B(n) = 1
13
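A minimal runnable Python version of the pseudocode (illustrative names, not from the lecture), with a comparison counter so the best and worst cases can be observed directly:

def sequential_search(a, key):
    # Return (index of key or -1, number of key comparisons).
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1    # basic operation: compare key with a[i]
        if x == key:
            return i, comparisons
    return -1, comparisons

a = [7, 3, 9, 4, 2]
print(sequential_search(a, 7))   # best case:  (0, 1)  -> 1 comparison
print(sequential_search(a, 2))   # worst case: (4, 5)  -> n comparisons
print(sequential_search(a, 8))   # key absent: (-1, 5) -> n comparisons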
Worst Case Time Complexity
• For a given algorithm, W(n) is the worst-case time complexity if the algorithm repeats its basic operation the maximum number of times over all inputs of size n. Determining W(n) is called worst-case time complexity analysis.

14
Sequential Search
• Input size: the number of elements in the array, i.e. n
• Basic operation: comparison of the key with array elements
• Worst case: the last element is the required key, or the key is not present in the array at all
• Complexity: W(n) = n

15
Average Case Time Complexity
• For a given algorithm, A(n) is the average-case time complexity if it gives the average number of times the algorithm repeats its basic operation over all inputs of size n. Determining A(n) is called average-case time complexity analysis.

16
Sequential Search

• Input size: the number of elements in the array, i.e. n
• Basic operation: comparison of the key with array elements
• The average-case time complexity is determined by weighting the number of basic operations by the probability of each possible outcome.
17
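As a worked example (under a standard assumption not stated on the slide): if the key is always present and is equally likely to be at any of the n positions, a search ending at position i costs i comparisons, so

A(n) = (1 + 2 + ... + n) / n = (n(n+1)/2) / n = (n+1)/2

i.e. on average about half of the array is examined.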
Order Of Growth

18
What is Order of Growth?
 When we want to compare two algorithms with respect to their behavior for large input sizes, a useful measure is the so-called order of growth.
 The order of growth can be estimated by taking into account the dominant term of the running time expression.

19
Dominant Term
 In the running time expression, when n becomes large, one term becomes significantly larger than the others: this is the so-called dominant term.

T1(n) = a·n + b              Dominant term: a·n
T2(n) = a·log n + b          Dominant term: a·log n
T3(n) = a·n^2 + b·n + c      Dominant term: a·n^2
T4(n) = a^n + b·n + c        Dominant term: a^n  (a > 1)
20
Growth Rate
[Chart: "Growth Rate of Different Functions" — function value versus data size for lg n, n lg n, n^2, n^3, and 2^n.]
21
Growth Rates
    n   lg n   n lg n      n^2         n^3           2^n
    0      –        –        0           0             1
    1      0        0        1           1             2
    2      1        2        4           8             4
    4      2        8       16          64            16
    8      3       24       64         512           256
   16      4       64      256        4096         65536
   32      5      160     1024       32768    4294967296
   64      6      384     4096      262144   1.84467E+19
  128      7      896    16384     2097152   3.40282E+38
  256      8     2048    65536    16777216   1.15792E+77
  512      9     4608   262144   134217728   1.3408E+154
 1024     10    10240  1048576  1073741824
 2048     11    22528  4194304  8589934592

22
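A short Python sketch to reproduce the table above (the formatting choices are mine, not from the slide); Decimal is used because 2^1024 and 2^2048 overflow ordinary floats:

from decimal import Decimal
import math

print(f"{'n':>5} {'lg n':>5} {'n lg n':>8} {'n^2':>10} {'n^3':>12} {'2^n':>12}")
for n in [1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048]:
    lg = int(math.log2(n))   # exact for powers of two
    print(f"{n:>5} {lg:>5} {n*lg:>8} {n**2:>10} {n**3:>12} {Decimal(2)**n:>12.4E}")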
How should the order of growth be interpreted?

 Of two algorithms, the one having the smaller order of growth is considered more efficient.
 However, this is true only for large enough input sizes.

Example. Let us consider
  T1(n) = 10n + 10  (linear order of growth)
  T2(n) = n^2       (quadratic order of growth)

If n <= 10 then T1(n) > T2(n).
Thus the order of growth is relevant only for n > 10.
23
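A quick numeric check of the crossover point (a sketch, not from the slide):

for n in [1, 5, 10, 11, 20, 100]:
    t1, t2 = 10*n + 10, n*n
    # T1 exceeds T2 up to n = 10; beyond that the quadratic term dominates.
    print(n, t1, t2, "T1 > T2" if t1 > t2 else "T1 <= T2")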
Constant Factors
• The growth rate is not affected by
– constant factors or
– lower-order terms
• Examples
– 10^2·n + 10^5 is a linear function
– 10^5·n^2 + 10^8·n is a quadratic function

24
Asymptotic Notations

25
Asymptotic Notations
• Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation.
• For example, the running time of one operation may be computed as f(n) = n, while for another operation it is computed as g(n) = n^2.
• This means that the first operation's running time increases linearly with n, while the second's increases quadratically as n increases.

26
Asymptotic Notations

• Following are the asymptotic notations commonly used in calculating the running time complexity of an algorithm:
• Ο notation (Big Oh)
• Ω notation (Omega)
• θ notation (Theta)

27
Big Oh Notation, Ο

• The notation Ο(g(n)) is the formal way to express the upper bound of an algorithm's running time. It measures the worst-case time complexity, i.e. the longest amount of time the algorithm can possibly take to complete.

28
Big Oh Notation, Ο
There may be a situation such as the one sketched below:

[Figure: curves f(n) and g(n) against n; from n0 onward, f(n) stays at or below g(n), i.e. f(n) <= g(n) for all n >= n0, which is f(n) <= c·g(n) with c = 1.]

f(n) = O(g(n)) if there exist two positive constants c and n0 such that
  f(n) <= c·g(n) for all n >= n0
g(n) is an asymptotic upper bound on f(n).
29
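A worked example (not on the slide): f(n) = 10n + 10 is O(n), since taking c = 20 and n0 = 1 gives 10n + 10 <= 20n for all n >= 1 (because 10 <= 10n there).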
Omega Notation, Ω

• The notation Ω(g(n)) is the formal way to express the lower bound of an algorithm's running time. It measures the best-case time complexity, i.e. the minimum amount of time the algorithm can possibly take to complete.

30
Omega Notation, Ω
Asymptotic lower bound: f(n) = Ω(g(n))
if there exist positive constants c and n0 such that
  f(n) >= c·g(n) for all n >= n0

[Figure: curves f(n) and c·g(n) against n; from n0 onward, f(n) stays at or above c·g(n).]
31
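Continuing the worked example (not on the slide): f(n) = 10n + 10 is Ω(n), since taking c = 10 and n0 = 1 gives 10n + 10 >= 10n for all n >= 1.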
Theta Notation, θ

• The notation θ(g(n)) is the formal way to express both the lower and the upper bound of an algorithm's running time.
• This means that the best and worst cases require the same amount of time, to within a constant factor.

32
Theta Notation, θ
Asymptotically tight bound: f(n) = θ(g(n))
iff there exist positive constants c1, c2 and n0 such that
  c1·g(n) <= f(n) <= c2·g(n) for all n >= n0

[Figure: f(n) lies between c1·g(n) and c2·g(n) for all n >= n0.]

33
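Combining the two worked examples above (not on the slide): f(n) = 10n + 10 is θ(n), with c1 = 10, c2 = 20 and n0 = 1, since 10n <= 10n + 10 <= 20n for all n >= 1.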
Big-Oh and Growth Rate
• The big-Oh notation gives an upper bound on the
growth rate of a function
• The statement “f(n) is O(g(n))” means that the growth
rate of f(n) is no more than the growth rate of g(n)
• We can use the big-Oh notation to rank functions
according to their growth rate

                   f(n) is O(g(n))   g(n) is O(f(n))
g(n) grows more          Yes               No
f(n) grows more          No                Yes
Same growth              Yes               Yes
36
Intuition for Asymptotic Notation
 Big-Oh
  f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)

 Big-Omega
  f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)

 Big-Theta
  f(n) is θ(g(n)) if f(n) is asymptotically equal to g(n)

37
Common Asymptotic Notations

constant    − Ο(1)
logarithmic − Ο(log n)
linear      − Ο(n)
n log n     − Ο(n log n)
quadratic   − Ο(n^2)
cubic       − Ο(n^3)
polynomial  − n^Ο(1)
exponential − 2^Ο(n)
38
