
Running Time Evaluation


Worst-case and average-case performance

Lecturer: Georgy Gimel’farb

COMPSCI 220 Algorithms and Data Structures


1 Time complexity

2 Time growth

3 Worst-case

4 Average-case


Algorithm’s Efficiency

Efficiency of an algorithm is generally measured in terms of:

• the running time (i.e. the time it takes to terminate);
• the amount of memory it requires; and
• other required computational resources.

Today, “other resources” often include communication bandwidth.
• An algorithm that downloads, rather than stores, whatever data
it requires needs little memory but does need bandwidth!
• Examples: Google Earth and Google Maps.

Memory is relatively cheap at present, so the time complexity of
an algorithm is of primary importance.


A Brief Quiz on Asymptotic Time Complexity

1 Executing an algorithm with time complexity O(n²) exactly 10
times gives an algorithm with Big-Oh time complexity O(n²).
2 Executing an algorithm with time complexity O(n) and then
an algorithm with time complexity O(n log n) gives an
algorithm with Big-Oh time complexity O(n log n).
3 Consider an algorithm consisting of three nested for-loops that
each iterate n times, and a body for the innermost loop that
executes in O(n log n). What is the total Big-Oh complexity
of the algorithm?
• The total Big-Oh complexity of the algorithm is O(n⁴ log n);
a sketch of such a loop nest follows below.
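A minimal sketch of the loop nest from question 3 (an illustration, not the lecture's code; the constant operation count standing in for the O(n log n) body is hypothetical):

```python
import math

def count_ops(n):
    """Count 'elementary operations' of the hypothetical loop nest."""
    ops = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                # stand-in for any O(n log n) innermost body,
                # e.g. sorting an n-element list
                ops += int(n * math.log2(n))
    return ops

for n in (4, 8, 16):
    # the ratio to n^4 * lg n stays near 1, matching O(n^4 log n)
    print(n, count_ops(n) / (n ** 4 * math.log2(n)))
```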


A Brief Quiz on Asymptotic Time Complexity

1 If an algorithm A always executes in the same number of
steps or fewer than an algorithm B with time complexity O(n²),
then what can you say about the algorithm A?
• The algorithm A is also O(n²) (though its actual bound may
be even tighter).
2 An algorithm A has time complexity Θ(n log n), and an
algorithm B always runs in 5% of the time that the algorithm
A takes. What are the Big-Oh, Big-Omega, and Big-Theta
time complexities of the algorithm B?
• The algorithm B is O(n log n), Ω(n log n), and Θ(n log n):
a constant factor does not change the asymptotic class.
3 Executing an algorithm A with time complexity Θ(n) exactly
⌊n/10⌋ times gives an algorithm with Big-Theta time
complexity Θ(n²); see the sketch below.
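A minimal sketch of item 3 (hypothetical code, not from the lecture): repeating a Θ(n) pass ⌊n/10⌋ times performs about n²/10 elementary steps, which is Θ(n²):

```python
def repeated_linear_pass(data):
    """Run a Theta(n) pass floor(n/10) times; count elementary steps."""
    n = len(data)
    steps = 0
    for _ in range(n // 10):   # floor(n/10) repetitions
        for x in data:         # one Theta(n) pass
            steps += 1
    return steps

for n in (100, 1000, 4000):
    # steps / n^2 converges to the constant 0.1, i.e. Theta(n^2)
    print(n, repeated_linear_pass(list(range(n))) / n ** 2)
```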


Informal Definition

Measuring time complexity

A function f(n) such that the running time T(n) of a given
algorithm is Θ(f(n)) measures the time complexity of the
algorithm.

• A polynomial-time algorithm: T(n) is O(nᵏ), where k is a
fixed positive integer.
• An intractable computational problem: one for which no
deterministic algorithm with polynomial time complexity exists.
• Many problems are classed as intractable only because a
polynomial solution is unknown.
• It is a very challenging task to find such a solution for
any one of them.


Relative Growth of the Running Time T(n)

Let T(n) = c·f(n), where f(n) is some function of n and c is a
fixed constant.

Then the relative growth of the running time is

    τ(n) = T(n)/T(n₀) = c·f(n)/(c·f(n₀)) = f(n)/f(n₀)

where n₀ is the reference input size.

For n₀ = 8:

    f(n)               1   lg n       lg² n       n     n lg n        n²      n³       2ⁿ
    f(8)               1   lg 8 = 3   9           8     24            64      512      2⁸ = 256
    τ(n) = f(n)/f(8)   1   (lg n)/3   (lg² n)/9   n/8   (n lg n)/24   n²/64   n³/512   2ⁿ/2⁸ = 2ⁿ⁻⁸

For simplicity: lg n ≡ log₂ n
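The ratios above (and in the table on the next slide) can be reproduced with a few lines of code; this is an illustration, not part of the lecture, and it assumes T(n) = c·f(n) so that the constant c cancels:

```python
import math

# Reproduces the tau(n) = f(n)/f(8) columns; n = 1024 is omitted
# because 2^1016 exceeds the floating-point range.
GROWTH = {
    "1":      lambda n: 1,
    "lg n":   lambda n: math.log2(n),
    "lg^2 n": lambda n: math.log2(n) ** 2,
    "n":      lambda n: n,
    "n lg n": lambda n: n * math.log2(n),
    "n^2":    lambda n: n ** 2,
    "n^3":    lambda n: n ** 3,
    "2^n":    lambda n: 2 ** n,
}

def tau(f, n, n0=8):
    """Relative growth T(n)/T(n0) = f(n)/f(n0): the constant c cancels."""
    return f(n) / f(n0)

for name, f in GROWTH.items():
    print(f"{name:7s}", [round(tau(f, n), 2) for n in (8, 32, 128)])
```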


Relative Growth of the Running Time T(n)

τ(n) = f(n)/f(8) when the input size increases from n = 8 to n = 1024:

    Time complexity             Input size n                     τ(n)
    Function f     Notation     8     32      128     1024
    Constant       1            1     1       1       1           1
    Logarithmic    lg n         1     1.67    2.33    3.33        (lg n)/3
    Log-squared    lg² n        1     2.78    5.44    11.1        (lg² n)/9
    Linear         n            1     4       16      128         n/8
    Linearithmic   n lg n       1     6.67    37.3    427         (n lg n)/24
    Quadratic      n²           1     16      256     16384       n²/64
    Cubic          n³           1     64      4096    2.1·10⁶     n³/512
    Exponential    2ⁿ           1     2²⁴     2¹²⁰    2¹⁰¹⁶       2ⁿ/2⁸ = 2ⁿ⁻⁸

For simplicity: lg n ≡ log₂ n



Time Complexity vs. Problem Size

The largest data sizes n that can be processed by an algorithm with time
complexity f(n), provided that T(10) = 1 minute:

    f(n)     1 min   1 hour   1 day    1 week    1 year     1 decade
    n        10      600      14,400   100,800   5.26·10⁶   5.26·10⁷
    n lg n   10      250      3,997    23,100    883,895    7.64·10⁶
    n^1.5    10      153      1,275    4,666     65,128     302,409
    n²       10      77       379      1,003     7,249      22,932
    n³       10      39       112      216       807        1,738
    2ⁿ       10      15       20       23        29         32

In most applications, practicable algorithms have at most
linearithmic, n log n, complexity.
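A sketch of how such a table can be computed (a reconstruction, not the lecture's code): the condition T(10) = 1 minute fixes the hidden constant, so T(n) = f(n)/f(10) minutes and the largest feasible n satisfies f(n) ≤ budget · f(10):

```python
import math

def largest_n(f, budget_minutes):
    """Largest n with f(n)/f(10) minutes <= the given time budget."""
    target = budget_minutes * f(10)
    n = 10
    while f(n + 1) <= target:   # a linear scan is fine at this scale
        n += 1
    return n

fs = {
    "n":      lambda n: n,
    "n lg n": lambda n: n * math.log2(n),
    "n^2":    lambda n: n ** 2,
    "2^n":    lambda n: 2 ** n,
}
for name, f in fs.items():
    # budgets: 1 minute, 1 hour, 1 day (in minutes)
    print(name, [largest_n(f, b) for b in (1, 60, 1440)])
# matches the table: n -> [10, 600, 14400]; 2^n -> [10, 15, 20]
```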


Time Complexity vs. Problem Size

The largest data sizes n that can be processed by an algorithm with time
complexity f(n), provided that T(10) = 1 microsecond:

    f(n)     1 sec      1 min      1 hour     1 day      1 year     1 decade
    n        1.0·10⁷    6.0·10⁸    3.6·10¹⁰   8.6·10¹¹   3.2·10¹⁴   3.2·10¹⁵
    n lg n   1.6·10⁶    7.6·10⁷    3.8·10⁹    7.9·10¹⁰   2.4·10¹³   2.2·10¹⁴
    n^1.5    1.0·10⁵    1.5·10⁶    2.3·10⁷    2.0·10⁸    1.0·10¹⁰   4.6·10¹⁰
    n²       10,000     77,460     6.0·10⁵    2.9·10⁶    5.6·10⁷    1.8·10⁸
    n³       1,000      3,915      15,326     4.4·10⁴    3.2·10⁵    6.8·10⁵
    2ⁿ       30         36         42         46         55         58

In most applications, practicable algorithms have at most
linearithmic, n log n, complexity.


Check the hidden constants!

Order relations can be drastically misleading.
g(n) ∈ O(f(n)) means that g(n) ≤ c·f(n) for all n > n₀, where
• c is the fixed amount of computation per data item and
• n₀ is the lowest data size after which the relation holds.

Check whether the constant c is sufficiently small.

Practical rule:
Roughly estimate the computation volume per data item after the time
complexities of the algorithms have been compared in a “Big-Oh” sense.

• “Big-Oh”: the linear algorithm A is better than the linearithmic one B.
• Estimated computation volumes: g_A(n) = 4n and g_B(n) = 0.1·n·lg n.
• A outperforms B only when 4n < 0.1·n·lg n, i.e. lg n > 40,
i.e. n > 2⁴⁰ ≈ 10¹² items (checked in the sketch below).
• Therefore, in practice the algorithm B is mostly better.
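The crossover can be verified directly (an illustrative sketch using the constants quoted above):

```python
import math

def cost_A(n):                      # linear, but with a large constant
    return 4 * n

def cost_B(n):                      # linearithmic, with a small constant
    return 0.1 * n * math.log2(n)

for n in (10 ** 3, 10 ** 6, 10 ** 9, 2 ** 40, 2 ** 41):
    faster = "A" if cost_A(n) < cost_B(n) else "B"
    print(f"n = {n:>14d}: faster = {faster}")
# B wins for every n up to 2^40 (about 10^12 items), despite its
# worse Big-Oh class.
```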

Asymptotic Bounds on Running Time

Asymptotic notation measures the running time of an algorithm in
terms of elementary operations, i.e. asymptotic bounds are
independent of inputs and implementation:

    Upper bound “Big-Oh”:      g(n) ∈ O(f(n)) → g(n) ≤ c·f(n)
    Lower bound “Big-Omega”:   g(n) ∈ Ω(f(n)) → g(n) ≥ c·f(n)
    Tight bound “Big-Theta”:   g(n) ∈ Θ(f(n)) → c₁·f(n) ≥ g(n) ≥ c₂·f(n)

(each holding for all n > n₀ and some positive constants c, c₁, c₂).
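As a numeric illustration of the upper-bound definition (a toy example, not from the slides): g(n) = 3n² + 10n is O(n²), witnessed by the hypothetical constants c = 4 and n₀ = 10:

```python
def g(n):                 # measured cost
    return 3 * n ** 2 + 10 * n

def f(n):                 # candidate bounding function
    return n ** 2

c, n0 = 4, 10             # witnesses: 3n^2 + 10n <= 4n^2 once n >= 10
assert all(g(n) <= c * f(n) for n in range(n0, 100_000))
print("g(n) <= c*f(n) holds for all tested n >=", n0)
```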

In general, the running time varies not only with the size of the
input, but also with the input itself.
• Some sorting algorithms take almost no time if the input is
already sorted in the desired order, but much longer if it is not.

The most common performance measures over all inputs are
the worst-case running time and the average-case running time.


The Worst-case Running Time

Advantages:
• An upper bound on the worst-case running time is usually
fairly easy to find.
• It is essential for so-called “mission-critical” applications.

Drawbacks:
• It may be too pessimistic: actually encountered inputs may
lead to much lower running times than the upper bound.
• The most popular fast sorting algorithm, quicksort, is Θ(n²)
in the worst case, but Θ(n log n) for “random” inputs, which
are the most frequent in practice (see the demo below).
• The worst-case input might be unlikely to be met in practice.
• In many cases it is difficult to specify the worst-case input.
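A demo of the quicksort claim (a toy implementation of the author's own devising, not the lecture's code; it uses a first-element pivot, for which already-sorted input is a worst case):

```python
import random

def quicksort_comparisons(seq):
    """Comparisons made by a toy quicksort with a first-element pivot."""
    a = list(seq)
    comps = 0
    stack = [(0, len(a) - 1)]            # iterative to avoid deep recursion
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot, i = a[lo], lo
        for j in range(lo + 1, hi + 1):  # Lomuto-style partition
            comps += 1
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[lo], a[i] = a[i], a[lo]
        stack += [(lo, i - 1), (i + 1, hi)]
    return comps

n = 2000
print("sorted input:", quicksort_comparisons(range(n)))   # n(n-1)/2 = 1,999,000
print("random input:", quicksort_comparisons(random.sample(range(n), n)))
# random input needs only about 1.39 * n * lg n ~ 30,000 comparisons
```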


The Average-case Running Time

By contrast, the average-case time is not as easy to define.

• A probability distribution on the inputs has to be specified.
• Simple assumption: all inputs of size n are equally likely
(see the sketch below).
• Sometimes this assumption may not hold for the real inputs.
• The analysis might be a difficult mathematical challenge.
• Even if the average-case complexity is good, the worst-case
one may be very bad.
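A small sketch of the "all inputs equally likely" assumption (an illustration, not from the lecture): a Monte Carlo estimate of sequential search's average cost converges to the analytic value (n + 1)/2, even though the worst case is n:

```python
import random

def sequential_search_steps(data, target):
    """Number of elements examined before target is found."""
    for steps, x in enumerate(data, start=1):
        if x == target:
            return steps
    return len(data)

n, trials = 1000, 5000
data = list(range(n))
avg = sum(sequential_search_steps(data, random.randrange(n))
          for _ in range(trials)) / trials
print(round(avg, 1), "vs. analytic (n + 1) / 2 =", (n + 1) / 2)
```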

Asymptotic optimality of an algorithm with running time T(n):

It is asymptotically optimal for the given problem if
(i) T(n) ∈ O(f(n)) for some function f, AND
(ii) there is no algorithm with running time g(n), for any function g that
grows more slowly than f when n → ∞.
