
Lecture 3: Asymptotic Analysis
Dr Victor Odumuyiwa
Learning Outcome
At the end of this class, you should be able to:

• Determine the running time of an algorithm.


Guiding Principles for the Analysis of Algorithms
1. Focus on worst-case analysis
2. Won’t pay much attention to constant factors and lower-order terms
3. Do asymptotic analysis
#1: Worst-case analysis
• Gives an upper bound on the running time for any input of size n, providing a guarantee of performance
• This bound holds for every input of length n
• As opposed to average-case analysis
• For average-case analysis, (input) domain knowledge is required
• For some algorithms, the worst case occurs fairly often (e.g. searching through a database for a particular piece of information; see the sketch after this list)
• The “average case” is often roughly as bad as the worst case (e.g. insertion sort)
• Easier to analyse
• Mathematically much more tractable
• Appropriate for “general-purpose” routines
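A minimal sketch (in Python; my illustration, not part of the original slides) of the search example above: linear search examines all n records in its worst case (key absent or in the last position), which is exactly the guarantee that worst-case analysis reports.

def linear_search(records, key):
    """Return the index of key in records, or -1 if it is absent."""
    for i, record in enumerate(records):  # worst case: all n records are examined
        if record == key:
            return i                      # best case: found at the first position
    return -1                             # key absent: exactly n comparisons made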
#2: Less attention to constant factors
• Won’t pay much attention to constant factors and lower-order terms
• Easier way to analyse
• Constant factors depend on the processor, the compiler, the programmer
• As the input size grows larger, it is the highest-order term that dominates (illustrated after this list)
• Lose very little predictive power
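As an illustration (my example, in Python; not on the original slide), take f(n) = 10n² + 100n + 1000 and watch the n² term take over as n grows:

for n in (10, 100, 1000, 10000):
    quadratic = 10 * n**2
    lower_order = 100 * n + 1000
    total = quadratic + lower_order
    # share of the estimate contributed by the highest-order term
    print(n, total, f"{quadratic / total:.1%} from the 10n^2 term")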
#3: Asymptotic analysis
• Focus on running time for large input sizes n
• “Only big problems are interesting!”
• Look for the rate of growth or the order of growth of the running time
Fast Algorithm?

Fast algorithm ≈ worst-case running time grows slowly with input size

An algorithm is efficient if its running time is a low-order polynomial (e.g. quadratic)
Asymptotic Analysis
• Vocabulary for the design and analysis of algorithms
• Focuses on what is important by abstracting away low-order terms and constant factors
• Is a way to compare the “sizes” of functions

O ≈ ≤
Ω ≈ ≥
Θ ≈ =
o ≈ <
ω ≈ >
Computational Complexity
A single problem can usually be solved by many algorithms (algorithm 1, algorithm 2, ..., algorithm k). Which one is best? We need measures. The measure used is the running time, expressed as a function f(n), where n is the size of the input.
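A minimal sketch (in Python; my illustration, not from the slides) of how such an f(n) can be obtained by counting the basic operations of a simple loop:

def sum_list(a):
    total = 0        # 1 assignment
    for x in a:      # the loop body runs n times for an input of size n
        total += x   # 1 addition and 1 assignment per iteration
    return total     # 1 return

# Counting basic operations gives roughly f(n) = 2n + 2,
# i.e. the running time grows linearly with the input size n.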


O-notation
• f(n) is O(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have f(n) ≤ c·g(n).
• O(g(n)) = {f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}.
• g(n) is an asymptotic upper bound for f(n)
O-notation examples
Examples of functions in O(n²)

2n² = O(n³), with c = 1 and n₀ = 2
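A minimal sketch (in Python; my addition, not from the slides) that spot-checks the witness constants c = 1 and n₀ = 2 for this example over a finite range of n:

def is_bounded_above(f, g, c, n0, n_max=1000):
    """Finite spot-check (not a proof) that f(n) <= c * g(n) for n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 2n^2 = O(n^3) with c = 1 and n0 = 2
print(is_bounded_above(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2))  # True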


Computational Complexity
Big-Oh Form                    Name
O(1)                           Constant
O(log₂ n)                      Logarithmic
O(n)                           Linear
O(n log₂ n)                    n log₂ n (log-linear)
O(n²)                          Quadratic
O(n³)                          Cubic
O(nᵐ), m = 0, 1, 2, 3, ...     Polynomial
O(cⁿ), c > 1                   Exponential
O(n!)                          Factorial

An algorithm is efficient if its running time is a low-order polynomial (e.g. quadratic)
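A small sketch (in Python; my addition, not from the slides) that prints a few of these growth rates side by side, showing how quickly the higher-order forms dominate:

import math

print(f"{'n':>6} {'log2 n':>8} {'n log2 n':>10} {'n^2':>10} {'2^n':>25}")
for n in (10, 20, 40, 80):
    print(f"{n:>6} {math.log2(n):>8.1f} {n * math.log2(n):>10.1f} "
          f"{n**2:>10} {2**n:>25}")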
Ω-notation
• f(n) is Ω(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have f(n) ≥ c·g(n).
• Ω(g(n)) = {f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀}.
• g(n) is an asymptotic lower bound for f(n)
Ω-notation examples
Examples of functions in Ω(n²)
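One illustrative bound (my addition; the slide’s own examples are not reproduced here): n³ = Ω(n²), with c = 1 and n₀ = 1, since n³ ≥ n² for all n ≥ 1.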
Θ-notation
 f(n) is (g(n)) if f(n) is both O(g(n))
and (g(n)).
 O(g(n)) = {f(n) : there exists
positive constants c1, c2 and n0 such
that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all
n  n0 }.
 g(n) is an asymptotic tight bound for
f(n)
Θ-notation example

n²/2 − 2n = Θ(n²), with c₁ = 1/4, c₂ = 1/2, and n₀ = 8.
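To see where these constants come from (a brief check added here, not on the original slide): the upper bound holds with c₂ = 1/2 for all n ≥ 0, since n²/2 − 2n ≤ n²/2; the lower bound n²/2 − 2n ≥ (1/4)n² rearranges to (1/4)n² ≥ 2n, i.e. n ≥ 8, which gives n₀ = 8.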
o-notation
• f(n) is o(g(n)) if for every constant c > 0 there exists a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀.
• An upper bound that is not asymptotically tight, e.g. 2n = o(n²) but 2n² ≠ o(n²).
ω-notation
• f(n) is ω(g(n)) if for every constant c > 0 there exists a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀.
• A lower bound that is not asymptotically tight, e.g. n²/2 = ω(n) but n²/2 ≠ ω(n²).
Problem 3
List all the possible asymptotic bounds for the function below:
f(n) = 32n² + 17n + 32.
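A sketch of the answer (my addition, not part of the original problem): since 32n² ≤ f(n) ≤ 81n² for all n ≥ 1, we have f(n) = Θ(n²), and therefore also f(n) = O(n²) and f(n) = Ω(n²). Looser bounds such as f(n) = O(n³), f(n) = o(n³), f(n) = Ω(n), and f(n) = ω(n) also hold, but f(n) ≠ o(n²) and f(n) ≠ ω(n²).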
