02 CS251 Ch3 Functions Growth

This document discusses algorithms and asymptotic analysis. It begins by defining an algorithm and analyzing algorithms in terms of running time. It introduces asymptotic notation like Big-O to analyze how running time increases with input size. Common functions like linear, quadratic, and logarithmic time are examined. The document provides examples of analyzing simple algorithms and calculating time complexity using summation formulas. Overall, it provides an overview of analyzing algorithms as problems get larger and defining the rate of growth using asymptotic notation.


Algorithms

Growth of Functions
Computer Science Dept.
Instructor: Ameera Jaradat
Outline
 Analysis of Algorithms
 Asymptotic notation
 Standard notations and common functions
Analysis of Algorithms
 An algorithm is a finite set of precise instructions for
solving a problem.
 What is the goal of analysis of algorithms?
 To compare algorithms mainly in terms of running time.
 also in terms of other factors (e.g., memory requirements,
programmer's effort etc.)
 What do we mean by running time analysis?
 Determine how running time increases as the size of the
problem increases.
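As an illustration (not from the slides), "running time as a function of input size n" can be made concrete by counting basic operations rather than measuring seconds; a minimal Python sketch:

```python
def linear_ops(n):
    """Basic-operation count of one pass over n items: f(n) = n."""
    ops = 0
    for _ in range(n):
        ops += 1              # one basic operation per element
    return ops

def quadratic_ops(n):
    """Basic-operation count of a doubly nested pass: f(n) = n^2."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1          # one basic operation per pair of elements
    return ops

# Doubling n doubles the linear count but quadruples the quadratic count.
assert linear_ops(20) == 2 * linear_ops(10)
assert quadratic_ops(20) == 4 * quadratic_ops(10)
```

Counting operations abstracts away machine speed, which is exactly why the analysis is stated as f(n) rather than in seconds.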
Input Size
 Input size (number of elements in the input)
 size of an array
 # of elements in a matrix
 # of bits in the binary representation of the input
 Vertices (nodes) and edges in a graph

Types of Analysis
 Worst case
 Provides an upper bound on running time
 An absolute guarantee that the algorithm would not run longer, no matter
what the inputs are
 Best case
 Provides a lower bound on running time
 Input is the one for which the algorithm runs the fastest

Lower Bound ≤ Running Time ≤ Upper Bound


 Average case
 Provides a prediction about the running time
 Assumes that the input is random
How do we compare algorithms?
 Express running time as a function of the input size n
(i.e., f(n)).
 Compare different functions corresponding to running
times.

Asymptotic Analysis
 To compare two algorithms with running times
f(n) and g(n), we need a rough measure that
characterizes how fast each function grows.
 Hint: use rate of growth
 Compare functions in the limit, that is,
asymptotically!
(i.e., for large values of n)

Which running time is faster?

n2 grows faster as n (the input size) grows, so an algorithm with an n2 running time is the slower one


Rate of Growth
 The low-order terms in a function are relatively insignificant for large n:
n4 + 100n2 + 10n + 50 ~ n4
i.e., we say that n4 + 100n2 + 10n + 50 and n4 have the same rate of growth
 Asymptotic notation allows us to ignore small input sizes, constant factors, lower-order terms in polynomials, and so forth.
Growth rate of functions
Asymptotic Notation

 O notation: asymptotic “less than”:
 f(n) = O(g(n)) implies: f(n) “≤” g(n)

 Ω notation: asymptotic “greater than”:
 f(n) = Ω(g(n)) implies: f(n) “≥” g(n)

 Θ notation: asymptotic “equality”:
 f(n) = Θ(g(n)) implies: f(n) “=” g(n)
Asymptotic notations
 O-notation
O(g(n)) = { f(n) : there exist positive constants c and n0
such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
Big-O Visualization
O(g(n)) is the set of functions with smaller or same order of growth as g(n)
Examples

 2n2 = O(n3): 2n2 ≤ c·n3 ⇒ 2 ≤ c·n ⇒ holds for c = 1 and n0 = 2

 n2 = O(n2): n2 ≤ c·n2 ⇒ c ≥ 1 ⇒ holds for c = 1 and n0 = 1

 k1n2 + k2n + k3 ∈ O(n3),
since k1n2 + k2n + k3 ≤ (k1 + |k2| + |k3|)·n3 for n ≥ 1 (an upper bound)
No Uniqueness
 There is no unique set of values for n0 and c in proving the asymptotic
bounds
 Prove that 100n + 5 = O(n2)
 100n + 5 ≤ 100n + n = 101n ≤ 101n2 for all n ≥ 5
⇒ n0 = 5 and c = 101 is a solution
 100n + 5 ≤ 100n + 5n = 105n ≤ 105n2 for all n ≥ 1
⇒ n0 = 1 and c = 105 is also a solution
We must only find SOME constants c and n0 that satisfy the asymptotic notation relation
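Both witness pairs can be checked numerically; a small sketch (the helper f is just for illustration):

```python
def f(n):
    """The function being bounded: f(n) = 100n + 5."""
    return 100 * n + 5

# Witness 1: c = 101, n0 = 5
assert all(f(n) <= 101 * n * n for n in range(5, 10_000))
# Witness 2: c = 105, n0 = 1
assert all(f(n) <= 105 * n * n for n in range(1, 10_000))
# Below its n0 the first witness fails: f(1) = 105 > 101 * 1^2.
assert f(1) > 101
```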
Asymptotic notations (cont.)
 Ω-notation

Ω(g(n)) is the set of functions with larger or same order of growth as g(n)
Asymptotic notations (cont.)
 Θ-notation
Θ(g(n)) is the set of functions with the same order of growth as g(n)
Relations Between Different Sets
 asymptotic growth rate, asymptotic order, or order of
functions
 The sets:
 big oh O(g), big theta Θ(g), big omega Ω(g)
 Ω(g): functions that grow at least as fast as g
 Θ(g): functions that grow at the same rate as g
 O(g): functions that grow no faster than g

Other Asymptotic Notations
 Little o
 A function f(n) is o(g(n)) if for every positive constant c there exists n0 > 0 such that
f(n) < c·g(n) for all n ≥ n0
 Little omega
 A function f(n) is ω(g(n)) if for every positive constant c there exists n0 > 0 such that
c·g(n) < f(n) for all n ≥ n0

Analogy with the comparison of two numbers a and b:
 f(n) = O(g(n))  ≈  a ≤ b  f grows no faster than g
 f(n) = Ω(g(n))  ≈  a ≥ b  f grows no slower than g
 f(n) = Θ(g(n))  ≈  a = b  f grows about as fast as g
 f(n) = o(g(n))  ≈  a < b  f grows slower than g
 f(n) = ω(g(n))  ≈  a > b  f grows faster than g
Common orders of magnitude

Common orders of magnitude (cont.)
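The comparison tables shown on these slides did not survive extraction; a sketch that regenerates the usual table for a few input sizes:

```python
import math

# Tabulate common growth functions for a few values of n.
print(f"{'n':>6} {'lg n':>6} {'n lg n':>8} {'n^2':>8} {'n^3':>10} {'2^n':>12}")
for n in (2, 16, 64, 256):
    print(f"{n:>6} {math.log2(n):>6.0f} {n * math.log2(n):>8.0f} "
          f"{n**2:>8} {n**3:>10} {2**n:>12.3e}")
```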
 f(n) is called polylogarithmically bounded if f(n) = O(lg^k n) for some constant k
 lg^b n = o(n^a) for any constant a > 0: every polylogarithm grows more slowly than every positive power of n
Factorials

• Stirling’s approximation: n! = √(2πn) · (n/e)^n · (1 + Θ(1/n)),
where e is the base of the natural logarithm

• One can prove: n! = o(n^n), n! = ω(2^n), lg(n!) = Θ(n lg n)
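The approximation can be checked numerically; the relative error shrinks roughly like 1/(12n):

```python
import math

def stirling(n):
    """Stirling's approximation: sqrt(2*pi*n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio n! / stirling(n) approaches 1 as n grows.
assert abs(math.factorial(5) / stirling(5) - 1) < 0.02
assert abs(math.factorial(100) / stirling(100) - 1) < 0.001
```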
Some facts about hierarchy of asymptotic classes
Logarithms and properties

 In algorithm analysis we often use the notation “log n” without specifying the base

Binary logarithm: lg n = log2 n
Natural logarithm: ln n = loge n
lg^k n = (lg n)^k
lg lg n = lg(lg n)

log x^y = y·log x
log(x·y) = log x + log y
log(x/y) = log x − log y
a^(logb x) = x^(logb a)
logb x = loga x / loga b
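These identities can be spot-checked with Python's math module (the values of x, y, a, b are arbitrary):

```python
import math

x, y, a, b = 3.0, 7.0, 2.0, 5.0

assert math.isclose(math.log(x ** y), y * math.log(x))            # log x^y = y log x
assert math.isclose(math.log(x * y), math.log(x) + math.log(y))   # log xy = log x + log y
assert math.isclose(math.log(x / y), math.log(x) - math.log(y))   # log x/y = log x - log y
assert math.isclose(a ** math.log(x, b), x ** math.log(a, b))     # a^(log_b x) = x^(log_b a)
assert math.isclose(math.log(x, b), math.log(x, a) / math.log(b, a))  # change of base
```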
Common Summations
 Arithmetic series:
Σk=1..n k = 1 + 2 + … + n = n(n+1)/2

 Geometric series:
Σk=0..n x^k = 1 + x + x^2 + … + x^n = (x^(n+1) − 1)/(x − 1), for x ≠ 1
 Special case |x| < 1: Σk=0..∞ x^k = 1/(1 − x)

 Harmonic series:
Σk=1..n 1/k = 1 + 1/2 + … + 1/n ≈ ln n

 Other important formulas:
Σk=1..n lg k ≈ n·lg n
Σk=1..n k^p = 1^p + 2^p + … + n^p ≈ n^(p+1)/(p+1)
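A quick numeric sanity check of these formulas (the values of n, x, p are arbitrary):

```python
import math

n, x, p = 50, 0.5, 3

# Arithmetic series
assert sum(range(1, n + 1)) == n * (n + 1) // 2
# Geometric series (x != 1)
assert math.isclose(sum(x ** k for k in range(n + 1)),
                    (x ** (n + 1) - 1) / (x - 1))
# Harmonic series: close to ln n (the gap tends to Euler's constant, about 0.577)
assert abs(sum(1 / k for k in range(1, n + 1)) - math.log(n)) < 0.7
# Sum of k^p is about n^(p+1) / (p+1)
assert 0.9 < sum(k ** p for k in range(1, n + 1)) / (n ** (p + 1) / (p + 1)) < 1.1
```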
How to estimate the running time
Counting the number of iterations
Algorithm COUNT1
Input: n = 2^k, for some positive integer k.
Output: count = number of times Step 4 is executed.
1. count ← 0
2. while n ≥ 1
3.   for j ← 1 to n
4.     count ← count + 1
5.   end for
6.   n ← n/2
7. end while
8. return count

Observations:
The while loop is executed k + 1 times, where k = log n.
The for loop is executed n times, then n/2, n/4, …, 1 times.
Σj=0..k (n/2^j) = n · Σj=0..k (1/2^j) = n(2 − 1/2^k) = 2n − 1 = Θ(n)
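COUNT1 translates directly to Python, and the closed form 2n − 1 can be verified for powers of two:

```python
def count1(n):
    """COUNT1 from the slide; assumes n = 2^k for some positive integer k."""
    count = 0
    while n >= 1:
        for _ in range(n):    # for j <- 1 to n
            count += 1        # Step 4
        n //= 2               # n <- n/2
    return count

# Step 4 executes n + n/2 + ... + 1 = 2n - 1 times, i.e. Theta(n).
for k in range(7):
    n = 2 ** k
    assert count1(n) == 2 * n - 1
```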
Algorithm COUNT2

Input: A positive integer n.
Output: count = number of times Step 5 is executed.
1. count ← 0
2. for i ← 1 to n
3.   m ← ⌊n/i⌋
4.   for j ← 1 to m
5.     count ← count + 1
6.   end for
7. end for
8. return count

Observations:
Algorithm COUNT2 consists of two nested loops and a variable count.
The inner for loop is executed repeatedly for the following values of m:
⌊n/1⌋, ⌊n/2⌋, ⌊n/3⌋, …, ⌊n/n⌋
Thus, the number of times Step 5 is executed is
Σi=1..n ⌊n/i⌋ = Θ(n log n)
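Likewise for COUNT2; its count equals the divisor sum Σ⌊n/i⌋, which grows like n ln n:

```python
import math

def count2(n):
    """COUNT2 from the slide: nested loops with m = floor(n/i)."""
    count = 0
    for i in range(1, n + 1):
        m = n // i            # m <- floor(n/i)
        for _ in range(m):
            count += 1        # Step 5
    return count

# floor(4/1) + floor(4/2) + floor(4/3) + floor(4/4) = 4 + 2 + 1 + 1
assert count2(4) == 8
# The count stays within a constant factor of n ln n.
n = 1000
assert 0.8 < count2(n) / (n * math.log(n)) < 1.2
```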
Analysis of BINARY-SEARCH
BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)                            ▹ constant time
    return FALSE
  mid ← ⌊(lo + hi)/2⌋                     ▹ constant time
  if x = A[mid]                           ▹ constant time
    return TRUE
  if (x < A[mid])
    BINARY-SEARCH (A, lo, mid−1, x)       ▹ same problem of size n/2
  if (x > A[mid])
    BINARY-SEARCH (A, mid+1, hi, x)       ▹ same problem of size n/2

 T(n) = c + T(n/2)
 T(n) – running time for an array of size n
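The pseudocode maps directly to Python; note that the recursive calls need an explicit `return` (implicit on the slide) for the result to propagate:

```python
def binary_search(A, lo, hi, x):
    """Return True iff x occurs in the sorted array A[lo..hi]."""
    if lo > hi:                                   # constant time
        return False
    mid = (lo + hi) // 2                          # constant time
    if x == A[mid]:                               # constant time
        return True
    if x < A[mid]:                                # same problem of size n/2
        return binary_search(A, lo, mid - 1, x)
    return binary_search(A, mid + 1, hi, x)       # same problem of size n/2

A = [1, 3, 5, 7, 9, 11]
assert binary_search(A, 0, len(A) - 1, 7)
assert not binary_search(A, 0, len(A) - 1, 4)
```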
Recurrences and Running Time
 Recurrences arise when an algorithm contains recursive
calls to itself
 What is the actual running time of the algorithm?
 Need to solve the recurrence
 Find an explicit formula of the expression (the generic term of
the sequence)

Example Recurrences
 T(n) = T(n-1) + n → Θ(n2)
 Recursive algorithm that loops through the input to eliminate one item
 T(n) = T(n/2) + c → Θ(lg n)
 Recursive algorithm that halves the input in one step
 T(n) = T(n/2) + n → Θ(n)
 Recursive algorithm that halves the input but must examine every item in the input
 T(n) = 2T(n/2) + 1 → Θ(n)
 Recursive algorithm that splits the input into 2 halves and does a constant amount of other work
Methods for Solving Recurrences

 Iteration method

 Substitution method

 Recursion tree method

 Master method
The Iteration Method
T(n) = c + T(n/2)

T(n) = c + T(n/2)             T(n/2) = c + T(n/4)
     = c + c + T(n/4)         T(n/4) = c + T(n/8)
     = c + c + c + T(n/8)
Assume n = 2^k
T(n) = c + c + … + c + T(1)   (k times)
     = c·lg n + T(1)
     = Θ(lg n)
Iteration Method – Example
T(n) = n + 2T(n/2)            Assume: n = 2^k
T(n) = n + 2T(n/2)            T(n/2) = n/2 + 2T(n/4)
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     … = i·n + 2^i · T(n/2^i)
     = k·n + 2^k · T(1)
     = n·lg n + n·T(1) = Θ(n lg n)
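The closed form n lg n + n·T(1) can be checked by evaluating the recurrence directly (taking T(1) = 1 as an assumed base case):

```python
def T(n):
    """Evaluate T(n) = n + 2*T(n/2) with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return n + 2 * T(n // 2)

# T(n) = n lg n + n * T(1) for n = 2^k
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n      # k = lg n
```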
Iteration Method – Example
T(n) = n + T(n-1)

T(n) = n + T(n-1)
= n + (n-1) + T(n-2)
= n + (n-1) + (n-2) + T(n-3)
… = n + (n-1) + (n-2) + … + 2 + T(1)
= n(n+1)/2 − 1 + T(1)
= Θ(n2)
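Evaluating the recurrence confirms the quadratic closed form (again taking T(1) = 1):

```python
def T(n):
    """Evaluate T(n) = n + T(n-1) with T(1) = 1, iteratively."""
    total = 1                     # T(1)
    for i in range(2, n + 1):
        total += i
    return total

# T(n) = n + (n-1) + ... + 2 + T(1) = n(n+1)/2, which is Theta(n^2).
assert T(10) == 10 * 11 // 2
assert T(100) == 100 * 101 // 2
```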
The substitution method

1. Guess a solution

2. Use induction to prove that the solution works


Example:
T(n) = T(n – 1) + 1, T(0) = 0
 Guess: T(n) = n, so T(n) = O(n)
 Inductive step: assume T(n) = n, prove T(n+1) = n + 1
 T(n+1) = T((n+1) − 1) + 1
        = T(n) + 1
        = n + 1
Example: Binary Search
T(n) = 1 + T(n/2)
 Guess: T(n) = O(log2 n); inductive hypothesis: T(n) = log2 n + 1
 Prove that T(2n) = log2(2n) + 1:
 T(2n) = T((2n)/2) + 1
       = T(n) + 1
       = log2 n + 1 + 1
       = log2 n + log2 2 + 1
       = log2(2n) + 1
The recursion-tree method

Convert the recurrence into a tree:


 Each node represents the cost incurred at that level of recursion
 Sum up the costs of all levels
The master method- version1
 The master method applies to recurrences of the form:
T(n) = aT(n/b) + f (n) ,
 where a ≥ 1, b > 1, and f is asymptotically positive.

         Θ(n^(logb a))            if f(n) = O(n^(logb a − ε)) for some ε > 0

T(n) =   Θ(n^(logb a) · lg n)     if f(n) = Θ(n^(logb a))

         Θ(f(n))                  if f(n) = Ω(n^(logb a + ε)) for some ε > 0,
                                  and a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n
The master method – version2
 The master method applies to recurrences of the form:
T(n) = aT(n/b) + f (n) ,

If T(n) = aT(n/b) + n^c, with a ≥ 1, b > 1, and c > 0, then:

         Θ(n^(logb a))      if a > b^c
T(n) =   Θ(n^c · logb n)    if a = b^c
         Θ(n^c)             if a < b^c
Application of Master Theorem
 T(n) = 9T(n/3) + n
 a = 9, b = 3, f(n) = n
 n^(logb a) = n^(log3 9) = Θ(n2)
 f(n) = O(n^(2−ε)), so by case 1, T(n) = Θ(n2)

 T(n) = T(2n/3) + 1
 a = 1, b = 3/2, f(n) = 1
 n^(logb a) = n^(log3/2 1) = n^0 = Θ(1)
 f(n) = Θ(1), so by case 2, T(n) = Θ(lg n)
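The version-2 classification can be automated; `master_v2` below is an illustrative helper (not from the slides) that compares a against b^c:

```python
import math

def master_v2(a, b, c):
    """Classify T(n) = a*T(n/b) + n^c by the simplified master method.
    Assumes a >= 1 and b > 1; c = 0 covers a constant f(n)."""
    if math.isclose(a, b ** c):
        return f"Theta(n^{c} * log n)"
    if a > b ** c:
        return f"Theta(n^{math.log(a, b):.3g})"   # exponent log_b a
    return f"Theta(n^{c})"

assert master_v2(9, 3, 1) == "Theta(n^2)"              # T(n) = 9T(n/3) + n
assert master_v2(1, 3 / 2, 0) == "Theta(n^0 * log n)"  # T(n) = T(2n/3) + 1 -> Theta(log n)
assert master_v2(4, 2, 3) == "Theta(n^3)"              # a = 4 < b^c = 8
```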
Application of Master Theorem
 T(n) = 3T(n/4) + n·lg n
 a = 3, b = 4, f(n) = n·lg n
 n^(logb a) = n^(log4 3) = Θ(n^0.793)
 f(n) = Ω(n^(0.793+ε)), and a·f(n/b) ≤ c·f(n) holds for c = 3/4,
 so by case 3, T(n) = Θ(f(n)) = Θ(n lg n)
