
Growth of Functions

Lecture # 2

Advanced Analysis of Algorithms by: Khalid Mahmood
Contents

 Analyzing Running Time
 Asymptotic Analysis
 Asymptotic Notations Properties
 Rate of Growth ≡ Asymptotic Analysis
 Rate of Growth
 Classification of Growth
 Asymptotic Upper Bound (Big Oh, O)
 Big-Omega Notation (Ω)
 Theta Notation (Θ)

Analyzing Running Time

T(n), the running time of a particular algorithm on an input of size n, is taken to be
the number of times the instructions in the algorithm are executed. The following pseudocode
illustrates the calculation of the mean (average) of a set of n numbers:

1. n = read input from user


2. sum = 0
3. i = 0
4. while i < n
5. number = read input from user
6. sum = sum + number
7. i = i + 1
8. mean = sum / n
The computing time for this algorithm in terms of input size n is: T(n) = 4n + 5.
Statement Number of times executed
1 1
2 1
3 1
4 n+1
5 n
6 n
7 n
8 1
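
As a concrete illustration, here is a minimal C version of the pseudocode above (an assumed translation, not part of the original slides); the comments repeat the execution counts from the cost table:

#include <stdio.h>

int main(void) {
    int n;
    if (scanf("%d", &n) != 1 || n <= 0)  /* statement 1: executed once */
        return 1;
    double sum = 0.0;                    /* statement 2: executed once */
    int i = 0;                           /* statement 3: executed once */
    while (i < n) {                      /* statement 4: n + 1 tests   */
        double number;
        if (scanf("%lf", &number) != 1)  /* statement 5: n times       */
            return 1;
        sum = sum + number;              /* statement 6: n times       */
        i = i + 1;                       /* statement 7: n times       */
    }
    double mean = sum / n;               /* statement 8: executed once */
    printf("mean = %f\n", mean);
    return 0;
}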

Asymptotic Analysis

 The very word “asymptotic” scares people because it


sounds complicated.

 The definition doesn't serve to alleviate that fear.


Something that's asymptotic relates to an asymptote,
which is defined as “A line whose distance to a given
curve tends toward zero”.

 That's damn near worthless, so let's say that something


asymptotic refers to a limiting behavior based on a
single variable and a desired measure.

Asymptotic Notations Properties

 Measures algorithm efficiency


 Categorize algorithms based on asymptotic growth rate

e.g. linear, quadratic, polynomial, exponential


 Ignore constant factors and small inputs

 Estimate upper and lower bounds on the growth rate
of the time complexity function

 Describes the running time of an algorithm as n grows to ∞.

 Describes the behavior of a function in the limit.

Limitations
 not always useful for analysis on fixed-size inputs.

 All results are for sufficiently large inputs.

Asymptotic Analysis

 The simplest example: when considering a
function f(n), there is a need to describe its
properties when n becomes very large.

 Thus, if f(n) = n^2 + 3n, the term 3n becomes
insignificant compared to n^2 when n is very large.

 The function f(n) is said to be "asymptotically
equivalent to n^2 as n → ∞", and this is written
symbolically as f(n) ~ n^2.

Rate of Growth ≡ Asymptotic Analysis

 A way of comparing functions that ignores constant


factors and small input sizes
 Using rate of growth as a measure to compare different
functions implies comparing them asymptotically.

 If f(x) grows faster than g(x), then f(x) eventually
becomes and stays larger than g(x) (for large
enough values of x).

Rate of Growth

 The rate of growth of a function determines how
fast the function value increases when the input
increases.

Rate of Growth

 If algorithm A does x^3 operations on an input of size x
and algorithm B does x^2 operations, algorithm B is
more efficient.
 Because of the relative growth rate, we will consider
x^3 + x^2 + x equivalent to x^3.

Example

 Suppose you are designing a web site to process user


data (e.g., financial records).

 Suppose program A takes fA(n) = 30n + 8 microseconds
to process any n records, while program B takes
fB(n) = n^2 + 1 microseconds to process the n records.

 Which program would you choose, knowing you’ll want


to support millions of users?
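
For instance (a quick check of the two cost functions at a few sizes, not from the slides): at n = 30, fA(30) = 908 and fB(30) = 901, so B is still slightly cheaper; at n = 100, fA(100) = 3,008 versus fB(100) = 10,001; and at n = 1,000,000, fA takes about 3 x 10^7 microseconds (roughly 30 seconds) while fB takes about 10^12 microseconds (over 11 days). The linear program A is the only sensible choice at scale.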

Visualizing Orders of Growth

On a graph, as you go to the right (increasing n), the faster
growing function eventually becomes larger.

[Figure: values of fA(n) = 30n + 8 and fB(n) = n^2 + 1 plotted against n;
the faster-growing fB(n) eventually exceeds fA(n).]

Classification of Growth

• Asymptotic upper bound (Big Oh, O-notation)

• Asymptotic lower bound (Big Omega, Ω-notation)

• Asymptotic tight bound (Theta, Θ-notation)

Asymptotic Upper Bound (Big Oh, O)

• Big O notation is used in Computer Science to describe the


performance or complexity of an algorithm.

• Big O specifically describes the worst-case scenario, and can be


used to describe the execution time required or the space used
(e.g. in memory or on disk) by an algorithm.

• In 1892, P. Bachmann invented a notation for characterizing the


asymptotic behavior of functions. His invention has come to be
known as big oh notation:

Big O notation: asymptotic “less than”:


f(n)=O(g(n)) implies: f(n) “≤” g(n)

Asymptotic Upper Bound (Big Oh: less than or equal to, "≤")

For a given function g(n), we denote O(g(n)) as the
set of functions:

O(g(n)) = { f(n) : there exist positive constants c and n0
            such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

f(n) = O(g(n)) means the function g(n) is an asymptotic
upper bound for f(n).
We may write f(n) = O(g(n)) OR f(n) ∈ O(g(n)).
Either way, f(n) = O(g(n)) implies: f(n) "≤" g(n).

Intuitively:
the set of all functions whose rate of growth is the same as or lower
than that of g(n).
Big-Oh Notation

f(n) ∈ O(g(n)) ⟺ ∃ c > 0, ∃ n0 ≥ 0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n)

g(n) is an asymptotic upper bound for f(n): although f(n) may start above c·g(n) in
the figure, eventually it falls beneath c·g(n) and stays there.

Examples

Example 1: Prove that 2n^2 ∈ O(n^3)

Proof:
Assume that f(n) = 2n^2, and g(n) = n^3
Is f(n) ∈ O(g(n))?
Now we have to find the existence of c and n0 such that

f(n) ≤ c·g(n)  ⟺  2n^2 ≤ c·n^3  ⟺  2 ≤ c·n

If we take c = 1 and n0 = 2, OR
c = 2 and n0 = 1, then
2n^2 ≤ c·n^3 for all n ≥ n0.
Hence f(n) ∈ O(g(n)), with c = 1 and n0 = 2.
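
As a quick numerical sanity check (a small C sketch, not a proof and not from the slides; it assumes the witnesses c = 1 and n0 = 2 from the example):

#include <stdio.h>

int main(void) {
    /* Compare f(n) = 2n^2 against c*g(n) = 1*n^3 for a few n >= n0 = 2. */
    for (long n = 2; n <= 32; n *= 2) {
        long f  = 2 * n * n;      /* f(n) = 2n^2         */
        long cg = n * n * n;      /* c*g(n) = n^3, c = 1 */
        printf("n=%2ld  f(n)=%6ld  c*g(n)=%6ld  f<=c*g: %s\n",
               n, f, cg, f <= cg ? "yes" : "no");
    }
    return 0;
}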

Examples

Example 2: Prove that n^2 ∈ O(n^2)

Proof:
Assume that f(n) = n^2, and g(n) = n^2
Now we have to show that f(n) ∈ O(g(n))

Since
f(n) ≤ c·g(n)  ⟺  n^2 ≤ c·n^2  ⟺  1 ≤ c, take c = 1, n0 = 1

Then
n^2 ≤ c·n^2 for c = 1 and n ≥ 1
Hence, n^2 ∈ O(n^2), where c = 1 and n0 = 1.

Examples

• How about n^2 - 3n + 10?

– It is O(n^2) if there exist c and n0 such that
c·n^2 ≥ n^2 - 3n + 10 for all n ≥ n0
• We see (fig.) that 3n^2 ≥ n^2 - 3n + 10 for all n ≥ 2
• So c = 3, n0 = 2
– More (c, n0) pairs could be found, but finding just one is enough to
prove that n^2 - 3n + 10 is O(n^2)

Properties of Big-Oh

• Ignore low-order terms
– e.g., O(n^3 + 4n^2 + 3n) = O(n^3)
• Ignore multiplicative constants
– e.g., O(5n^3) = O(n^3)
• n^4 + 100n^2 + 10n + 50 is of the order of n^4, i.e., O(n^4)
• 10n^3 + 2n^2 is O(n^3)
• n^3 - n^2 is O(n^3)
• 10 is O(1)
• 1273 is O(1)
• Combine growth-rate functions
– O(f(n)) + O(g(n)) = O(f(n) + g(n))
– e.g., O(n^2) + O(n log2 n) = O(n^2 + n log2 n)
• Then, O(n^2 + n log2 n) = O(n^2)
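
To connect the combination rule to code, here is a minimal C sketch (an illustration assumed for these notes, not from the slides): an O(n^2) phase followed by an O(n) phase, so the whole function is O(n^2 + n) = O(n^2).

long pairwise_then_scan(const int *a, int n) {
    long total = 0;
    for (int i = 0; i < n; i++)        /* O(n^2) phase: all pairs     */
        for (int j = 0; j < n; j++)
            total += (long)a[i] * a[j];
    for (int i = 0; i < n; i++)        /* O(n) phase: single scan     */
        total += a[i];
    return total;                      /* overall O(n^2 + n) = O(n^2) */
}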

Big-O Visualization

[Figure: Big-O visualization diagram.]

Properties of Big-Oh

if (i < j)
    for (i = 0; i < N; i++)    /* O(N) */
        X = X + i;
else
    X = 0;                     /* O(1) */

Max( O(N), O(1) ) = O(N)

Big-Omega Notation (Ω)

• Asymptotic lower bound

• Ω(g(n)) represents a set of functions such that:

Ω(g(n)) = { f(n) : there exist positive constants c and n0
            such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

We may write f(n) = Ω(g(n)) OR f(n) ∈ Ω(g(n))

Intuitively:
the set of all functions whose rate of growth is the same as or higher
than that of g(n).

Big-Omega Notation (Ω)

This is almost the same definition as Big Oh, except that
here we require "f(n) ≥ c·g(n)"; this makes g(n) a lower
bound function instead of an upper bound function.

It describes the best that can happen for a given data size.

Big-Omega Notation

f(n) ∈ Ω(g(n)) ⟺ ∃ c > 0, ∃ n0 ≥ 0 such that ∀ n ≥ n0, f(n) ≥ c·g(n)

g(n) is an asymptotic lower bound for f(n).
Examples

Example 1: Prove that 5n^2 ∈ Ω(n)

Proof:
Assume that f(n) = 5n^2, and g(n) = n
Is f(n) ∈ Ω(g(n))?
We have to find the existence of c and n0 such that
c·g(n) ≤ f(n) for all n ≥ n0
c·n ≤ 5n^2  ⟺  c ≤ 5n
If we take c = 5 and n0 = 1, then
c·n ≤ 5n^2 for all n ≥ n0,
and hence f(n) ∈ Ω(g(n)) for c = 5 and n0 = 1.

Examples

Example 2: Prove that 5n + 10 ∈ Ω(n)

Proof:
Assume that f(n) = 5n + 10, and g(n) = n
Is f(n) ∈ Ω(g(n))?
We have to find the existence of c and n0 such that
c·g(n) ≤ f(n) for all n ≥ n0
Since 5n + 10 ≥ 5n for all n ≥ 0, we can take c = 5 and n0 = 1; then
c·n = 5n ≤ 5n + 10 for all n ≥ n0,
and hence f(n) ∈ Ω(g(n)) for c = 5 and n0 = 1.

Theta Notation (Θ)

Asymptotic tight bound

Θ(g(n)) represents a set of functions such that:

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0
            such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

We may write f(n) = Θ(g(n)) OR f(n) ∈ Θ(g(n))

Intuitively: the set of all functions that have the same rate of growth as g(n).
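
As a worked illustration (an example added here, not from the slides): 3n^2 + 10n ∈ Θ(n^2). For the lower bound, 3n^2 ≤ 3n^2 + 10n for all n ≥ 0, so c1 = 3 works. For the upper bound, 3n^2 + 10n ≤ 4n^2 whenever 10n ≤ n^2, i.e. for n ≥ 10, so c2 = 4 and n0 = 10 work. Hence c1·n^2 ≤ 3n^2 + 10n ≤ c2·n^2 for all n ≥ 10.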

Theta Notation (Θ)

 Theta means that f is bounded above and below by g;
Big Theta implies the "best fit".
 This is denoted as "f(n) = Θ(g(n))".

 This is basically saying that the function f(n) is bounded both
from the top and the bottom by the same function, g(n).

 f(n) does not have to be linear itself in order to be of linear
growth; it just has to be between two linear functions.

 We will use Theta whenever we have enough information to
show that g(n) is both an upper and a lower bound for f(n). Theta
is a "stronger" statement than Big-Oh or Big-Omega.

Theta Notation

f(n) ∈ Θ(g(n)) ⟺ ∃ c1 > 0, ∃ c2 > 0, ∃ n0 ≥ 0 such that ∀ n ≥ n0, c1·g(n) ≤ f(n) ≤ c2·g(n)

We say that g(n) is an asymptotically tight bound for f(n).
Common growth rates

Q: What does it mean to say f1(n) = Θ(1)?

A: f1(n) = Θ(1) means after a few n, f1 is bounded above
& below by a constant.

Q: What does it mean to say f2(n) = Θ(n lg n)?

A: f2(n) = Θ(n lg n) means that after a few n, f2 is bounded
above and below by a constant times n lg n. In other
words, f2 is the same order of magnitude as n lg n.

Common growth rates (Big O)

Complexity    Name                             Example
O(1)          constant (perfect scalability)   Adding to the front of a linked list
O(log n)      logarithmic                      Finding an entry in a sorted array
O(n log n)    n-log-n                          Sorting n items by divide-and-conquer
O(n)          linear                           Finding an entry in an unsorted array
O(n^2)        quadratic                        Shortest path between two nodes in a graph
O(n^3)        cubic                            Simultaneous linear equations
O(2^n)        exponential (evil)               The Towers of Hanoi problem
O(n!)         factorial (pure evil)            Recursive problems
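
To make the differences concrete (a rough worked comparison added here, not from the slides): for n = 1,000,000, log2 n is about 20, n log2 n is about 2 x 10^7, and n^2 is 10^12. So an O(n log n) sort does on the order of tens of millions of steps where an O(n^2) method would need on the order of a trillion.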

Common growth rates

BETTER   O(1) constant time
 O(log n) log time
 O(n) linear time
 O(n log n) log linear time
 O(n^2) quadratic time
 O(n^3) cubic time
 O(2^n) exponential time
 O(n!) factorial time
WORSE

Number of comparisons for common Big O notations

 Differences among the growth-rate functions grow with n
 See the differences growing on the diagram on the next page
 The bigger n is, the bigger the differences -
that's why algorithm efficiency is a "concern for large problems
only"
Comparison Graph for Common Big O Notations

[Figure: comparison graph of the common Big O growth rates.]

Examples

1. statement;
This is constant: the running time of the statement
does not depend on N, i.e. O(1).

2. for ( i = 0; i < N; i++ )
       statement;
This is linear: the running time of the loop is directly
proportional to N, i.e. O(N).

Examples

3. for ( i = 0; i < N; i++ )
   {
       for ( j = 0; j < N; j++ )
           statement;
   }
This is quadratic: the running time of the two nested loops is
proportional to the square of N. When N doubles,
the running time increases fourfold.
O(N*N) = O(N^2)

Examples

4. while ( low <= high ) {
       mid = ( low + high ) / 2;
       if ( target < list[mid] )
           high = mid - 1;
       else if ( target > list[mid] )
           low = mid + 1;
       else
           break;
   }
This is logarithmic: the running time of the algorithm is
proportional to the number of times N can be divided by 2,
because the algorithm divides the working area in half
with each iteration.
O(log N)
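
For example (a worked trace added here, not from the slides): searching a sorted list of N = 16 items examines a region of size 16, then 8, then 4, then 2, then 1, i.e. about log2 16 = 4 halvings, so roughly 4 comparisons in the worst case rather than 16.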

Examples (Another Way)

Algorithm 3                               Cost
sum = 0;                                  c1
for (i = 0; i < N; i++)                   c2
    for (j = 0; j < N; j++)               c2
        sum += arr[i][j];                 c3
------------
Total: c1 + c2·(N+1) + c2·N·(N+1) + c3·N^2 = O(N^2)

Conclusion

 In general, doing something with every item in one


dimension is linear,

 Doing something with every item in two dimensions is


quadratic,

 Dividing the working area in half is logarithmic.

 Doing something with every item in three dimensions (loop
nesting up to three levels) is cubic, as sketched below.
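
A minimal C sketch of the cubic case (an illustration assumed for these notes, not from the slides): three nested loops over the same dimension N touch N*N*N combinations, so the work grows as O(N^3).

long count_triples(int N) {
    long count = 0;
    for (int i = 0; i < N; i++)           /* outer loop: N iterations        */
        for (int j = 0; j < N; j++)       /* middle loop: N per outer step   */
            for (int k = 0; k < N; k++)   /* inner loop: N per middle step   */
                count++;                  /* executed N*N*N times -> O(N^3)  */
    return count;
}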

Logarithms and properties

 In algorithm analysis we often use the notation "log n"
without specifying the base

Notation:
Binary logarithm:   lg n = log2 n
Natural logarithm:  ln n = loge n
lg^k n = (lg n)^k
lg lg n = lg(lg n)

Properties:
log(x^y) = y log x
log(xy) = log x + log y
log(x/y) = log x - log y
a^(logb x) = x^(logb a)
logb x = (loga x) / (loga b)
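
A quick worked example (added here, not from the slides): using the product rule, log2(8 * 16) = log2 8 + log2 16 = 3 + 4 = 7, and indeed 2^7 = 128 = 8 * 16. Using the change-of-base rule, log2 100 = (ln 100) / (ln 2) ≈ 4.605 / 0.693 ≈ 6.64.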

Common Summations

 Arithmetic series:
   Σ (k = 1 to n) k = 1 + 2 + ... + n = n(n + 1) / 2

 Geometric series (x ≠ 1):
   Σ (k = 0 to n) x^k = 1 + x + x^2 + ... + x^n = (x^(n+1) - 1) / (x - 1)
   ◦ Special case, |x| < 1:
     Σ (k = 0 to ∞) x^k = 1 / (1 - x)

 Harmonic series:
   Σ (k = 1 to n) 1/k = 1 + 1/2 + ... + 1/n ≈ ln n

 Other important formulas:
   Σ (k = 1 to n) lg k ≈ n lg n

   Σ (k = 1 to n) k^p = 1^p + 2^p + ... + n^p ≈ n^(p+1) / (p + 1)

Self Practice

Self Practice Examples & Topics

Practice Examples (Big O)

Example 3: Prove that 1000n^2 + 1000n ∈ O(n^2)

Proof:
Assume that f(n) = 1000n^2 + 1000n, and g(n) = n^2
We have to find the existence of c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
Take c = 1001; then c·n^2 = 1001n^2, and
1000n^2 + 1000n ≤ 1001n^2
⟺ 1000n ≤ n^2  ⟺  n^2 - 1000n ≥ 0
⟺ n(n - 1000) ≥ 0, which is true for n ≥ 1000

So f(n) ≤ c·g(n) for all n ≥ n0 with c = 1001 and n0 = 1000.
Hence f(n) ∈ O(g(n)) for c = 1001 and n0 = 1000.

[Further Big O practice example slides; their content is not included in this transcription.]
Practice Examples (Big Omega)

[Practice example slides; their content is not included in this transcription.]
Little-Oh Notation (o)

o-notation is used to denote an upper bound that is not
asymptotically tight.

For a given function g(n) ≥ 0, o(g(n)) denotes the set of functions:

o(g(n)) = { f(n) : for any positive constant c > 0, there exists a
            constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }

f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim (n → ∞) f(n) / g(n) = 0

e.g., 2n ∈ o(n^2) but 2n^2 ∉ o(n^2).

g(n) is an upper bound for f(n), but not an asymptotically tight one.
Examples

Example 1: Prove that 2n^2 ∈ o(n^3)

Proof:
Assume that f(n) = 2n^2, and g(n) = n^3
Is f(n) ∈ o(g(n))?

Now, for any c > 0, we have to find an n0 such that

f(n) < c·g(n)  ⟺  2n^2 < c·n^3  ⟺  2 < c·n

This is true for any c, because for any arbitrary c we
can choose n0 > 2/c so that the inequality holds for all n ≥ n0.
Hence f(n) ∈ o(g(n)).

Examples

Example 2: Prove that n^2 ∉ o(n^2)

Proof:
Assume that f(n) = n^2, and g(n) = n^2
Now we have to show that f(n) ∉ o(g(n))

Since
f(n) < c·g(n)  ⟺  n^2 < c·n^2  ⟺  1 < c,

the inequality fails for any c ≤ 1 (for example c = 1), whereas the
definition of little-o requires it to hold for every positive c.
Hence, n^2 ∉ o(n^2).

Examples

Example 3: Prove that 1000n^2 + 1000n ∉ o(n^2)

Proof:
Assume that f(n) = 1000n^2 + 1000n, and g(n) = n^2
We have to show that f(n) ∉ o(g(n)), i.e. it is not the case that
for every c > 0 there exists an n0 such that
0 ≤ f(n) < c·g(n) for all n ≥ n0.
Take c = 1000; then f(n) < c·g(n) would require
1000n^2 + 1000n < 1000n^2  ⟺  1000n < 0,
which is false for every positive n.
Hence f(n) ∉ o(g(n)): the definition fails for c = 1000.

Little-Omega Notation (ω)

Little-ω notation is used to denote a lower bound
that is not asymptotically tight.

For a given function g(n), ω(g(n)) denotes the set of all functions:

ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a
            constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }

f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:

lim (n → ∞) f(n) / g(n) = ∞

e.g., n^2/2 ∈ ω(n) but n^2/2 ∉ ω(n^2).
Examples

Example 1: Prove that 5n^2 ∈ ω(n)

Proof:
Assume that f(n) = 5n^2, and g(n) = n
Is f(n) ∈ ω(g(n))?
We have to prove that for any c > 0 there exists an n0 such that
c·g(n) < f(n) for all n ≥ n0
c·n < 5n^2  ⟺  c < 5n
This is true for any c, because for any arbitrary c, e.g.
c = 1,000,000, we can choose n0 = 1,000,000/5 + 1 = 200,001,
and the inequality then holds for all n ≥ n0.
And hence f(n) ∈ ω(g(n)).

Examples

Example 2: Prove that 5n + 10 ∉ ω(n)

Proof:
Assume that f(n) = 5n + 10, and g(n) = n
Is f(n) ∈ ω(g(n))?

For that we would have to find, for any c > 0, an n0 such that
c·g(n) < f(n) for all n ≥ n0.
c·n < 5n + 10: if we take c = 16, then
16n < 5n + 10  ⟺  11n < 10, which is not true for any
positive integer n.
Hence f(n) ∉ ω(g(n)).

Examples

Example 3: Prove that 100n ∉ ω(n^2)

Proof:
Let f(n) = 100n, and g(n) = n^2
Assume, for contradiction, that f(n) ∈ ω(g(n)).
Then for any c > 0 there would exist an n0 such that
c·g(n) < f(n) for all n ≥ n0, i.e.
c·n^2 < 100n  ⟺  c·n < 100
If we take c = 100, this requires n < 1, which is not possible
for n ≥ n0 ≥ 1.
Hence f(n) ∉ ω(g(n)), i.e. 100n ∉ ω(n^2).
