Lecture Four: Algorithm and Problem Solving

The lecture focuses on characterizing the running times of algorithms using asymptotic notation, including O-notation, Ω-notation, and Θ-notation. It explains how to describe the upper, lower, and tight bounds of functions, with examples such as insertion sort to illustrate these concepts. The importance of using the correct asymptotic notation to specify running times accurately is emphasized.


Algorithms and Problem Solving

Course Number: ITCC103

Lecture 4: Characterizing Running Times

Rawand Al-Foqah’a

COLLEGE OF INFORMATION TECHNOLOGY - LUSAIL UNIVERSITY


Guiding Textbook
These slides are formatted and structured based on the content of the following
reference book: Cormen et al., Introduction to Algorithms (full citation in the
References slide).
OVERVIEW
Goals
• A way to describe the behavior of functions in the limit: we’re studying
asymptotic efficiency.
• Describe growth of functions.
• Focus on what’s important by abstracting away low-order terms and
constant factors.
• How we indicate running times of algorithms.

O-notation
O-notation characterizes an upper bound on the asymptotic behavior
of a function: it says that a function grows no faster than a certain
rate. This rate is based on the highest-order term.
For example:
𝑓(𝑛) = 7𝑛3 + 100𝑛2 − 20𝑛 + 6 is 𝑂(𝑛3), since the highest-order
term is 7𝑛3, and therefore the function grows no faster than 𝑛3.
The function 𝑓(𝑛) is also 𝑂(𝑛5), 𝑂(𝑛6), and 𝑂(𝑛𝑐 ) for any constant
𝑐 ≥ 3.
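This containment can be checked numerically. A minimal sketch, assuming hand-picked (hypothetical) witness constants 𝑐 = 8 and 𝑛0 = 100, which work because 100𝑛2 − 20𝑛 + 6 ≤ 𝑛3 once 𝑛 ≥ 100:

```python
def f(n):
    """The example polynomial from the slide: 7n^3 + 100n^2 - 20n + 6."""
    return 7 * n**3 + 100 * n**2 - 20 * n + 6

# O(n^3) witness constants (hand-picked for illustration): c = 8, n0 = 100.
c, n0 = 8, 100
assert all(f(n) <= c * n**3 for n in range(n0, 2000))
```

Any constant 𝑐 > 7 works here, provided 𝑛0 is chosen large enough to absorb the lower-order terms.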

Ω-notation
Ω-notation characterizes a lower bound on the asymptotic behavior of a
function: it says that a function grows at least as fast as a certain rate,
again based on the highest-order term.
For example:
𝑓(𝑛) = 7𝑛3 + 100𝑛2 − 20𝑛 + 6 is Ω(𝑛3), since the highest-order term,
7𝑛3, grows at least as fast as 𝑛3.
The function 𝑓(𝑛) is also Ω(𝑛2), Ω(𝑛), and Ω(𝑛𝑐) for any constant 𝑐 ≤ 3.

Θ-notation
Θ-notation characterizes a tight bound on the asymptotic behavior of a
function: it says that a function grows precisely at a certain rate, again
based on the highest-order term.
If a function is both 𝑂(𝑓(𝑛)) and Ω(𝑓(𝑛)), then it is Θ(𝑓(𝑛)).

EXAMPLE: INSERTION-SORT
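The INSERTION-SORT procedure under analysis can be sketched in Python as follows (a minimal rendering of the textbook's pseudocode, using 0-based indexing rather than the book's 1-based indexing):

```python
def insertion_sort(A):
    """Sort list A in place, mirroring the INSERTION-SORT pseudocode from CLRS."""
    for i in range(1, len(A)):            # outer loop: n - 1 iterations
        key = A[i]
        j = i - 1
        # Shift values larger than key one position to the right.
        while j >= 0 and A[j] > key:      # inner loop: at most i iterations here
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key                    # insert key into its sorted position
    return A
```

With 0-based indexing the inner loop runs at most 𝑖 times, matching the slide's "at most 𝑖 − 1 times" for the book's 1-based 𝑖.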

EXAMPLE: INSERTION-SORT (continued)
First, show that INSERTION-SORT runs in 𝑂(𝑛2) time, regardless of
the input:
• The outer for loop runs 𝑛 − 1 times regardless of the values being sorted.
• The inner while loop iterates at most 𝑖 − 1 times.
• The exact number of iterations the while loop makes depends on the values it
iterates over, but it will definitely iterate between 0 and 𝑖 − 1 times.
• Since 𝑖 is at most 𝑛, the total number of iterations of the inner loop is at most
(𝑛 − 1)(𝑛 − 1), which is less than 𝑛2.
Each inner loop iteration takes constant time, for a total of at most
c𝑛2 for some constant 𝑐, or 𝑂(𝑛2).
EXAMPLE: INSERTION-SORT (continued)

Now show that INSERTION-SORT has a worst-case running time of Ω(𝑛2):
• Observe that for a value to end up 𝑘 positions to the right of where it
started, the line 𝐴[𝑗 + 1] = 𝐴[𝑗] must have been executed 𝑘 times.
• Assume that 𝑛 is a multiple of 3, so that we can divide the array 𝐴 into
three groups of 𝑛/3 positions, and consider an input in which the 𝑛/3
largest values occupy the first 𝑛/3 positions: each of them must pass
through at least the middle 𝑛/3 positions to reach the last third.

EXAMPLE: INSERTION-SORT (continued)
Because at least 𝑛/3 values must pass through at least 𝑛/3 positions,
the line 𝐴[𝑗 + 1] = 𝐴[𝑗] executes at least (𝑛/3)(𝑛/3) = 𝑛2/9 times,
which is Ω(𝑛2). For this input, INSERTION-SORT takes time Ω(𝑛2).
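The lower-bound count can be checked by instrumenting the shift line. The sketch below uses one illustrative bad input in the spirit of the thirds argument: the 𝑛/3 largest values are placed in the first 𝑛/3 positions, so every remaining value must shift all of them:

```python
def count_shifts(A):
    """Run insertion sort on a copy of A, counting executions of A[j+1] = A[j]."""
    A = list(A)
    shifts = 0
    for i in range(1, len(A)):
        key = A[i]
        j = i - 1
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]                # the shift line the argument counts
            shifts += 1
            j -= 1
        A[j + 1] = key
    return shifts

n = 99                                      # a multiple of 3, as the slide assumes
# Illustrative input: the n/3 largest values first, then the rest in order.
A = list(range(2 * n // 3 + 1, n + 1)) + list(range(1, 2 * n // 3 + 1))
assert count_shifts(A) >= (n // 3) ** 2     # at least n^2/9 shifts, i.e. Ω(n^2)
```

On this input the shift line executes exactly (2𝑛/3)(𝑛/3) times, comfortably above the 𝑛2/9 bound.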

Since we have shown that INSERTION-SORT runs in 𝑂(𝑛2) time in all
cases and that there is an input that makes it take Ω(𝑛2) time, we can
conclude that the worst-case running time of INSERTION-SORT is Θ(𝑛2).

The constant factors for the upper and lower bounds may differ. That
doesn’t matter.
O-notation

Formally, 𝑂(𝑔(𝑛)) is the set of functions 𝑓(𝑛) for which positive
constants 𝑐 and 𝑛0 exist such that 0 ≤ 𝑓(𝑛) ≤ 𝑐 𝑔(𝑛) for all 𝑛 ≥ 𝑛0.

Ω-notation

Formally, Ω(𝑔(𝑛)) is the set of functions 𝑓(𝑛) for which positive
constants 𝑐 and 𝑛0 exist such that 0 ≤ 𝑐 𝑔(𝑛) ≤ 𝑓(𝑛) for all 𝑛 ≥ 𝑛0.

Θ-notation

Formally, Θ(𝑔(𝑛)) is the set of functions 𝑓(𝑛) for which positive
constants 𝑐1, 𝑐2, and 𝑛0 exist such that 0 ≤ 𝑐1 𝑔(𝑛) ≤ 𝑓(𝑛) ≤ 𝑐2 𝑔(𝑛)
for all 𝑛 ≥ 𝑛0.
ASYMPTOTIC NOTATION AND
RUNNING TIMES
Need to be careful to use asymptotic notation correctly when
characterizing a running time. Asymptotic notation describes functions,
which in turn describe running times. Must be careful to specify which
running time.
For example:
The worst-case running time for insertion sort is 𝑂(𝑛2), Ω(𝑛2), and
Θ(𝑛2); all are correct. Prefer to use Θ(𝑛2) here, since it’s the most
precise.
The best-case running time for insertion sort is 𝑂(𝑛), Ω(𝑛), and Θ(𝑛);
prefer Θ(𝑛).
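These case distinctions can be observed by counting inner-loop iterations, a rough proxy for running time (input shapes chosen here for illustration):

```python
def inner_iterations(A):
    """Run insertion sort on a copy of A and count inner while-loop iterations."""
    A = list(A)
    count = 0
    for i in range(1, len(A)):
        key = A[i]
        j = i - 1
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            count += 1
            j -= 1
        A[j + 1] = key
    return count

n = 1000
assert inner_iterations(range(n)) == 0                        # sorted: best case, Θ(n) total work
assert inner_iterations(range(n, 0, -1)) == n * (n - 1) // 2  # reversed: worst case, Θ(n^2)
```

A sorted input never enters the inner loop, so only the linear-time outer loop remains; a reversed input forces the maximum number of iterations at every step.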
ASYMPTOTIC NOTATION AND
RUNNING TIMES (continued)
But cannot say that the running time for insertion sort is Θ(𝑛2), with
“worst-case” omitted. Omitting the case means making a blanket
statement that covers all cases, and insertion sort does not run in Θ(𝑛2)
time in all cases.
Can make the blanket statement that the running time for insertion sort
is 𝑂(𝑛2), or that it’s Ω(𝑛), because these asymptotic running times are
true for all cases.
For merge sort, its running time is Θ(𝑛 lg 𝑛) in all cases, so it’s OK to
omit which case.

ASYMPTOTIC NOTATION AND
RUNNING TIMES (continued)
Common error: conflating O-notation with Θ-notation by using O-
notation to indicate an asymptotically tight bound. O-notation gives
only an asymptotic upper bound. Saying “an 𝑂(𝑛 lg 𝑛)-time algorithm
runs faster than an 𝑂(𝑛2)-time algorithm” is not necessarily true. An
algorithm that runs in Θ(𝑛) time also runs in 𝑂(𝑛2) time. If you really
mean an asymptotically tight bound, then use Θ-notation.
Use the simplest and most precise asymptotic notation that applies.
Suppose that an algorithm’s running time is 3𝑛2 + 20𝑛. Best to say that
it’s Θ(𝑛2). Could say that it’s 𝑂(𝑛3), but that’s less precise. Could say
that it’s Θ(3𝑛2 + 20𝑛) but that obscures the order of growth.
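The Θ(𝑛2) claim for 3𝑛2 + 20𝑛 can be verified with explicit witness constants (hand-picked here for illustration): 𝑐1 = 3, 𝑐2 = 4, and 𝑛0 = 20 work, since 20𝑛 ≤ 𝑛2 whenever 𝑛 ≥ 20.

```python
def T(n):
    """The running time from the slide: 3n^2 + 20n."""
    return 3 * n**2 + 20 * n

# Θ(n^2) witness constants (hand-picked): c1 = 3, c2 = 4, n0 = 20.
c1, c2, n0 = 3, 4, 20
assert all(c1 * n**2 <= T(n) <= c2 * n**2 for n in range(n0, 2000))
```

The lower bound holds for every 𝑛 (drop the 20𝑛 term); the upper bound kicks in once 𝑛 ≥ 20, with equality exactly at 𝑛 = 20.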
End Of the Lecture



References
❑Thomas H. CORMEN, et al. Introduction to Algorithms. The MIT
Press, Cambridge, Massachusetts London, England, ISBN
9780262046305, 2022.

