CS 372
Asymptotic notations
Algorithm Analysis
• The amount of resources used by the algorithm
– Space
– Computational time
• Running time:
– The number of primitive operations (steps) executed
before termination
• Order of growth
– The leading term of a formula
– Expresses the behavior of a function toward infinity
Asymptotic Notations
Logarithms
• In algorithm analysis we often use the notation “log n”
without specifying the base
– Since log_a n = (log_b n) / (log_b a), logarithms of different
bases differ only by a constant factor, which asymptotic
notation absorbs
• A useful identity: a^(log_b x) = x^(log_b a)
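A quick check of the identity (added; not on the original slide): taking log_b of
both sides of a^(log_b x) = x^(log_b a) gives (log_b x)(log_b a) on both sides, so the
two expressions are equal. For instance, 4^(log_2 8) = 4^3 = 64 and 8^(log_2 4) = 8^2 = 64.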
Asymptotic notations
• O-notation (asymptotic upper bound)
– f(n) = O(g(n)): ∃ c, n0 such that
0 ≤ f(n) ≤ cg(n), ∀ n ≥ n0
Examples
– 1000n^2 + 1000n = O(n^2):
1000n^2 + 1000n ≤ 1000n^2 + 1000n^2 = 2000n^2 (∀ n ≥ 1)
⇒ c = 2000 and n0 = 1
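A minimal numerical sanity check of this bound (a sketch, not from the slides; the
constants c = 2000 and n0 = 1 come from the proof above). A finite sample cannot
prove the inequality, but it can catch a wrong choice of constants:

def f(n):
    # The left-hand side of the claimed bound.
    return 1000 * n**2 + 1000 * n

c, n0 = 2000, 1
# Spot-check f(n) <= c * n^2 on a spread of sample points n >= n0.
assert all(f(n) <= c * n**2 for n in range(n0, 10**6, 997))
print("bound holds on all sampled n")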
Asymptotic notations (cont.)
• Ω-notation (asymptotic lower bound)
– f(n) = Ω(g(n)): ∃ c, n0 such that
0 ≤ cg(n) ≤ f(n), ∀ n ≥ n0
Examples
– 5n^2 = Ω(n)
∃ c, n0 such that: 0 ≤ cn ≤ 5n^2 ⇒ c = 1 and n0 = 1
– 100n + 5 ≠ Ω(n^2)
Assume ∃ c, n0 such that: 0 ≤ cn^2 ≤ 100n + 5
100n + 5 ≤ 100n + 5n = 105n (∀ n ≥ 1)
cn^2 ≤ 105n ⇒ n(cn – 105) ≤ 0
Since n is positive ⇒ cn – 105 ≤ 0 ⇒ n ≤ 105/c
⇒ contradiction: n cannot be bounded above by a constant
– n = Ω(2n), n^3 = Ω(n^2), n = Ω(log n)
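A worked instance of these claims (added for illustration; 2n is read as the linear
function 2·n): n ≥ ½ · 2n for all n ≥ 1, so c = ½ and n0 = 1 witness n = Ω(2n);
likewise 0 ≤ n^2 ≤ n^3 for all n ≥ 1 gives n^3 = Ω(n^2) with c = 1, n0 = 1.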
Asymptotic notations (cont.)
• Θ-notation (asymptotically tight bound)
– f(n) = Θ(g(n)): ∃ c1, c2, n0 such that
0 ≤ c1g(n) ≤ f(n) ≤ c2g(n), ∀ n ≥ n0
Examples
– n^2/2 – n/2 = Θ(n^2)
• ½n^2 – ½n ≤ ½n^2, ∀ n ≥ 0 ⇒ c2 = ½
• ½n^2 – ½n ≥ ½n^2 – ½n · ½n = ¼n^2, ∀ n ≥ 2 ⇒ c1 = ¼
– n ≠ Θ(log n): n ≤ c2 log n would require
c2 ≥ n/log n, ∀ n ≥ n0 – impossible
Comparisons of Functions
• Theorem:
f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
– Same for O and Ω
• Reflexivity:
– f(n) = Θ(f(n))
– Same for O and Ω
• Symmetry:
– f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
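A concrete instance of transpose symmetry (added for illustration): 0 ≤ n ≤ n^2 for
all n ≥ 1 shows n = O(n^2) with c = 1, n0 = 1, and the same inequality read from the
other side shows n^2 = Ω(n).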
Asymptotic Notations in Equations
• On the right-hand side
– Θ(n^2) stands for some anonymous function in Θ(n^2)
2n^2 + 3n + 1 = 2n^2 + Θ(n) means:
There exists a function f(n) ∈ Θ(n) such that
2n^2 + 3n + 1 = 2n^2 + f(n)
• On the left-hand side
2n^2 + Θ(n) = Θ(n^2)
No matter how the anonymous function is chosen on
the left-hand side, there is a way to choose the
anonymous function on the right-hand side to make
the equation valid.
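A concrete choice (for illustration): taking f(n) = 3n + 1 ∈ Θ(n) on the left-hand
side of 2n^2 + Θ(n) = Θ(n^2) yields 2n^2 + 3n + 1, which is indeed in Θ(n^2); the
same works for any f(n) ∈ Θ(n), since the 2n^2 term dominates.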
Practical Complexity
[Figure: plots of f(n) = log(n), n, n log(n), n^2, n^3, and 2^n
for n = 1, …, 20, with the vertical axis clipped at 250, 500,
1000, and 5000. At every scale, 2^n overtakes even n^3 by about
n = 10 and then dwarfs all the polynomial curves.]
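Since the plots themselves do not survive text extraction, here is a minimal Python
sketch (added; the slides do not specify the log base, so base 2 is assumed) that
tabulates the same six functions over the plotted range:

import math

# Tabulate the six growth functions from the plots for n = 1..20.
# Assumption: "log" means log base 2; any other base only shifts
# the logarithmic curves by a constant factor.
funcs = {
    "log(n)":   lambda n: math.log2(n),
    "n":        lambda n: float(n),
    "n log(n)": lambda n: n * math.log2(n),
    "n^2":      lambda n: float(n**2),
    "n^3":      lambda n: float(n**3),
    "2^n":      lambda n: float(2**n),
}

print("n".rjust(4) + "".join(name.rjust(12) for name in funcs))
for n in range(1, 21):
    print(str(n).rjust(4) + "".join(f"{f(n):12.1f}" for f in funcs.values()))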
Other Asymptotic Notations
• A function f(n) is o(g(n)) if for any positive
constant c, there exists n0 such that
f(n) < cg(n), ∀ n ≥ n0
or lim (f(n)/g(n)) = 0 as n → ∞
• A function f(n) is ω(g(n)) if for any positive
constant c, there exists n0 such that
cg(n) < f(n), ∀ n ≥ n0
or lim (f(n)/g(n)) = ∞ as n → ∞
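Typical examples (added for illustration): 2n = o(n^2), since lim (2n/n^2) =
lim (2/n) = 0; and n^2/2 = ω(n), since lim ((n^2/2)/n) = lim (n/2) = ∞. Note that
2n^2 = O(n^2) but 2n^2 ≠ o(n^2) – o is a strict upper bound, O is not.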
Intuitions
• Intuitively, comparing two functions asymptotically is like
comparing two numbers a and b:
– f(n) = O(g(n)) ≈ a ≤ b
– f(n) = Ω(g(n)) ≈ a ≥ b
– f(n) = Θ(g(n)) ≈ a = b
– f(n) = o(g(n)) ≈ a < b
– f(n) = ω(g(n)) ≈ a > b
Typical Running Time Functions
• Θ(1) (constant running time):
– Instructions are executed once or a few times
• Θ(log N) (logarithmic)
– A big problem is solved by cutting its size down by a
constant fraction at each step
• Θ(N) (linear)
– A small amount of processing is done on each input element
• Θ(N log N)
– A problem is solved by dividing it into smaller problems, solving them
independently and combining the solutions
• Θ(N^2) (quadratic)
– Processing of pairs of data (double nested loops)
• Θ(N^3) (cubic)
– Processing of triples of data (triple nested loops)
• Θ(N^K) (polynomial)
• Θ(2^N) (exponential)
– Few exponential algorithms are appropriate for practical use
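A minimal sketch (illustrative; not from the slides) of loop structures that give
three of these running-time classes:

def linear_sum(a):
    # Θ(N): constant work per input element.
    total = 0
    for x in a:
        total += x
    return total

def equal_pair_count(a):
    # Θ(N^2): double nested loop over all pairs of elements.
    count = 0
    for i in range(len(a)):
        for j in range(i + 1, len(a)):
            if a[i] == a[j]:
                count += 1
    return count

def binary_search(a, key):
    # Θ(log N): the search range is halved at each step
    # (a must be sorted in ascending order).
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid
    return lo  # index of the first element >= key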
• A useful summation:
∑(k=1..n) k^p = 1^p + 2^p + … + n^p ≈ n^(p+1)/(p+1)
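A quick numerical check of this approximation (a sketch; the values of p and n are
chosen arbitrarily):

# Compare the exact power sum with the approximation n^(p+1)/(p+1).
n = 1000
for p in (1, 2, 3):
    exact = sum(k**p for k in range(1, n + 1))
    approx = n**(p + 1) / (p + 1)
    print(f"p={p}: exact={exact}  approx={approx:.0f}  ratio={exact/approx:.4f}")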