
COMP90038 Algorithms and Complexity

Analysis of Algorithms

Michael Kirley

Lecture 4

Semester 1, 2017



Establishing Growth Rate

In the last lecture we proved t(n) ∈ O(g(n)) for some cases of t and
g, using the definition of O directly:

n > n0 ⇒ t(n) < c · g(n)

for some c and n0. A more common approach uses limits:



lim_{n→∞} t(n)/g(n) = 0      implies t grows asymptotically slower than g
lim_{n→∞} t(n)/g(n) = c > 0  implies t and g have the same order of growth
lim_{n→∞} t(n)/g(n) = ∞      implies t grows asymptotically faster than g

Use this to show that 1000n = O(n²). ✎
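For instance, taking t(n) = 1000n and g(n) = n²:

lim_{n→∞} 1000n/n² = lim_{n→∞} 1000/n = 0

so 1000n grows asymptotically slower than n², and in particular 1000n ∈ O(n²).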


L’Hôpital’s Rule

Often it is helpful to use L’Hôpital’s rule:

lim_{n→∞} t(n)/g(n) = lim_{n→∞} t′(n)/g′(n)

where t′ and g′ are the derivatives of t and g.



For example, we can show that log2 n grows slower than √n:

lim_{n→∞} (log2 n)/√n = lim_{n→∞} ((log2 e)/n) / (1/(2√n)) = 2 log2 e · lim_{n→∞} 1/√n = 0
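Limits like this can also be checked mechanically; a minimal sketch using Python's sympy library (assuming sympy is available):

    import sympy

    n = sympy.symbols('n', positive=True)
    # Evaluate lim_{n -> oo} log2(n) / sqrt(n); sympy returns 0.
    print(sympy.limit(sympy.log(n, 2) / sympy.sqrt(n), n, sympy.oo))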



Example: Finding the Largest Element in a List

function MaxElement(A[0..n − 1])
    max ← A[0]
    for i ← 1 to n − 1 do
        if A[i] > max then
            max ← A[i]
    return max

We count the number of comparisons executed for a list of size n:


C(n) = Σ_{i=1}^{n−1} 1 = n − 1 = Θ(n)
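As a concrete rendering, a Python sketch of the pseudocode (illustrative):

    def max_element(a):
        # Scan left to right, keeping the largest element seen so far.
        max_val = a[0]
        for i in range(1, len(a)):   # n - 1 iterations, one comparison each
            if a[i] > max_val:
                max_val = a[i]
        return max_val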



Example: Selection Sort
function SelSort(A[0..n − 1])
    for i ← 0 to n − 2 do
        min ← i
        for j ← i + 1 to n − 1 do
            if A[j] < A[min] then
                min ← j
        swap A[i] and A[min]

We count the number of comparisons executed for a list of size n:


C(n) = Σ_{i=0}^{n−2} Σ_{j=i+1}^{n−1} 1 = Σ_{i=0}^{n−2} (n − 1 − i) = Σ_{i=0}^{n−2} (n − 1) − Σ_{i=0}^{n−2} i

     = (n − 1)² − (n − 2)(n − 1)/2 = n(n − 1)/2 = Θ(n²)
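A Python sketch of the same algorithm (illustrative rendering of the pseudocode):

    def selection_sort(a):
        # Repeatedly select the minimum of the unsorted suffix and swap it into place.
        n = len(a)
        for i in range(n - 1):
            min_idx = i
            for j in range(i + 1, n):      # n - 1 - i comparisons on this pass
                if a[j] < a[min_idx]:
                    min_idx = j
            a[i], a[min_idx] = a[min_idx], a[i]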
Example: Matrix Multiplication
function MatrixMult(A[0..n − 1, 0..n − 1], B[0..n − 1, 0..n − 1])
    for i ← 0 to n − 1 do
        for j ← 0 to n − 1 do
            C[i, j] ← 0.0
            for k ← 0 to n − 1 do
                C[i, j] ← C[i, j] + A[i, k] · B[k, j]
    return C

We count the number of multiplications executed for two n × n matrices:

M(n) = Σ_{i=0}^{n−1} Σ_{j=0}^{n−1} Σ_{k=0}^{n−1} 1 = n³ = Θ(n³)
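In Python, the same triple loop might look as follows (a sketch; lists of lists stand in for the matrices):

    def matrix_mult(A, B):
        # Naive product of two n x n matrices: n³ multiplications in total.
        n = len(A)
        C = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    C[i][j] += A[i][k] * B[k][j]
        return C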


Analysing Recursive Algorithms

Let us start with a simple example, a recursive function computing the factorial n!:


function F(n)
    if n = 0 then return 1
    else return F(n − 1) · n

The basic operation here is the multiplication.

We express the cost recursively as well:

M(0) = 0
M(n) = M(n − 1) + 1 for n > 0

To find a closed form, that is, one without recursion, we usually try
“telescoping”, or “backward substitutions” in the recursive part.
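One way to see the recurrence concretely is to count the multiplications in code; a small Python sketch (the counter is illustrative):

    def f(n):
        # Returns (n!, number of multiplications performed).
        if n == 0:
            return 1, 0
        value, mults = f(n - 1)
        return value * n, mults + 1   # one multiplication per level: M(n) = M(n-1) + 1

    # f(5) returns (120, 5), matching the closed form M(n) = n derived by the telescoping below.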



Telescoping

The recursive equation was:

M(n) = M(n − 1) + 1 (for n > 0)

Use the fact M(n − 1) = M(n − 2) + 1 to expand the right-hand side:

M(n) = [M(n − 2) + 1] + 1 = M(n − 2) + 2

and keep going:

. . . = [M(n − 3) + 1] + 2 = M(n − 3) + 3 = . . . = M(n − n) + n = n

where we used the base case M(0) = 0 to finish.



A Second Example: Binary Search in Sorted Array
function BinSearch(A[], lo, hi, key)
    if lo > hi then return −1
    mid ← lo + ⌊(hi − lo)/2⌋
    if A[mid] = key then return mid
    else
        if A[mid] > key then
            return BinSearch(A, lo, mid − 1, key)
        else return BinSearch(A, mid + 1, hi, key)

The basic operation is the key comparison. The cost, recursively, in
the worst case:

C(0) = 0
C(n) = C(⌊n/2⌋) + 1 for n > 0
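A Python sketch of the pseudocode (recursion kept as on the slide):

    def bin_search(a, lo, hi, key):
        # Search sorted list a[lo..hi] for key; return its index or -1.
        if lo > hi:
            return -1
        mid = lo + (hi - lo) // 2
        if a[mid] == key:
            return mid
        if a[mid] > key:
            return bin_search(a, lo, mid - 1, key)   # discard the upper half
        return bin_search(a, mid + 1, hi, key)       # discard the lower half

    # Usage: bin_search([1, 3, 5, 7, 9], 0, 4, 7) returns 3.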



Telescoping
A smoothness rule allows us to assume that n is a power of 2.
The recursive equation was:

C(n) = C(n/2) + 1 (for n > 0)

Use the fact C(n/2) = C(n/4) + 1 to expand, and keep going:

C(n) = C(n/2) + 1
     = [C(n/4) + 1] + 1
     = [[C(n/8) + 1] + 1] + 1
     ...
     = [[. . . [[C(0) + 1] + 1] + · · · + 1] + 1]   (1 + log2 n ones in total)
     = 1 + log2 n

Hence C(n) = Θ(log n).


Logarithmic Functions Have Same Rate of Growth
In O-expressions we can just write “log” for any logarithmic function,
no matter what its base is.

Asymptotically, all logarithmic behaviour is the same, since

log_a x = (log_a b) · (log_b x)

So, for example, if ln is the natural logarithm then

log2 n = O(ln n)
ln n = O(log2 n)

Also note that since log(n^c) = c · log n, we have, for all constants c,

log(n^c) = O(log n)
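The base-change identity is easy to confirm numerically; a minimal sketch with Python's standard library:

    import math

    # log2(x) / ln(x) equals the constant log2(e) = 1/ln(2) for every x > 0.
    for x in (10.0, 1e6, 1e12):
        print(math.log(x, 2) / math.log(x))   # prints ~1.4427 each time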



Summarising Reasoning with Big-Oh

O(f(n)) + O(g(n)) = O(max{f(n), g(n)})

c · O(f(n)) = O(f(n))

O(f(n)) · O(g(n)) = O(f(n) · g(n))

The first equation justifies throwing smaller summands away.

The second says that constants can be thrown away too.

The third may be used with some nested loops. Suppose we have a
loop which is executed O(f(n)) times, and each execution takes time
O(g(n)). Then the execution of the loop takes time O(f(n) · g(n)).
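As an illustration of the third rule, a hypothetical doubly nested loop (the function and its name are invented for illustration):

    def count_close_pairs(a, width):
        # Outer loop runs O(n) times; each run does O(n) work, so O(n²) overall.
        n = len(a)
        count = 0
        for i in range(n):
            for j in range(n):
                if abs(a[i] - a[j]) <= width:
                    count += 1
        return count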



Some Useful Formulas

From Stirling’s formula:

n! = O(n^(n + 1/2))

Some useful sums:

Σ_{i=0}^{n} i² = (n/3) · (n + 1/2) · (n + 1)
Σ_{i=0}^{n} (2i + 1) = (n + 1)²
Σ_{i=1}^{n} 1/i = O(log n)
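A quick numeric spot-check of the two exact sums (illustrative, using n = 10):

    n = 10
    assert sum(i * i for i in range(n + 1)) == n * (n + 0.5) * (n + 1) / 3   # 385
    assert sum(2 * i + 1 for i in range(n + 1)) == (n + 1) ** 2              # 121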

See also Levitin’s Appendix A.

Levitin’s Appendix B is a tutorial on recurrence relations.



The Road Ahead

You will become much more familiar with asymptotic analysis as we
use it on algorithms that we meet.

We shall begin the study of algorithms by looking at brute force
approaches.
