
Design & Analysis of Algorithms

Lecture#04
Asymptotic Notations,
Growth Rate

Mian Ahmed Shafiq


Lecture Contents
 Algorithm Efficiency
 Measurement of Efficiency
 Time & Space Complexity
 Time / Space Tradeoff
 Tips for Execution Friendly Development
 Execution Time Function
 Execution Time Comparison for Different Time Functions
Lecture Contents
 Best, Worst & Average Cases
 Asymptotic Analysis
 Complexity Functions (c, log n, n, n log n, n², nᵏ, 2ⁿ)
 𝜃 Notation
 Big-O Notation
 Ω Notation
 Small-o Notation
 Big-O is not Enough
Algorithm Efficiency
“Algorithm efficiency is the quality of an algorithm that describes the
computational resources required to run it”

Computational resources include:

 Execution Time

 Space occupancy

 Memory Requirements

[Chart: running time T(n) versus input size n]
n     2    4    7    11   16   29   60
T(n)  1    10   21   48   82   178  290
Algorithm Efficiency
The fewer resources an algorithm utilizes, the more efficient it is

Time & space often go in opposite directions

Priority of time and/or space can dictate the choice of algorithm

In today's world, time is mostly given higher priority than space

[Chart: running time T(n) versus input size n, same data as the previous slide]
Measuring Algorithm Efficiency
Two different ways:
1. Measure & Compare Execution Time
2. Measure & Compare Running time (Asymptotic Analysis)

Note: Running time mostly depends upon input size (n) and is denoted by T(n)
Measuring Algorithm Efficiency… Measuring Execution Time
 Dependent upon hardware configuration
 Dependent upon IO operations
 Dependent upon memory in system
 Dependent upon other applications installed on system
 May differ for the same machine at different times
 Does not really help to predict the effect on execution time when input size is
significantly increased / decreased
 May differ greatly for parallel infrastructure
 Involves function call overhead
Measuring Algorithm Efficiency… Measuring Execution Time
Multiple threads trying to access a common resource can increase execution time
significantly
If the program is running on a server with multiple disks, some particular RAID
configuration might work best for it
Choice of language can increase / decrease the execution time, e.g. C is
faster than Java
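As an illustration of why measured execution time is machine- and run-dependent, here is a minimal Python sketch that times a worst-case linear search with time.perf_counter; the function name and the chosen input size are only for demonstration, and the printed time will differ between machines and even between runs on the same machine.

import time

def linear_search(data, key):
    # Scan every element; the worst case touches all n items
    for index, value in enumerate(data):
        if value == key:
            return index
    return -1

n = 1_000_000
data = list(range(n))
start = time.perf_counter()
linear_search(data, -1)              # worst case: the key is absent
elapsed = time.perf_counter() - start
print(f"n = {n}, elapsed = {elapsed:.4f} s")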
Measuring Algorithm Efficiency … Measuring Running Time
 Independent of hardware configuration
 Independent of IO operations
 Independent of memory in system
 Independent of other applications installed on system
 Always yields the same result
 Can predict the effect on execution time when input size is significantly increased
/ decreased
Time & Space Complexity
 Time Complexity
Time required to execute an algorithm
 Space Complexity
Total memory taken by an algorithm during its execution
 Time & Space Tradeoffs
Time & Space Tradeoffs … Example
 Application data may be stored in arrays, linked lists, trees, graphs, etc.
 For banking / financial transactions, time may be compromised (a bit), but
accuracy is a must
 For audio/video stream based problems, accuracy may be compromised,
preferring solutions with low execution time
 We want Google to respond to our queries promptly, even if a few irrelevant links
creep in
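A minimal Python sketch of the time/space tradeoff (the Fibonacci example is illustrative, not from the slides): the first version uses almost no extra memory but recomputes values exponentially many times, while the second spends O(n) memory on a cache to bring the time down to O(n).

def fib_slow(n):
    # Tiny memory footprint, but exponential running time
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

def fib_memo(n, cache=None):
    # Extra O(n) space for the cache buys O(n) running time
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]

print(fib_memo(60))   # answers instantly; fib_slow(60) would take far too long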
Execution Friendly Development … Tips
 Move statements out of loop structures that do not belong there
 Reduce IO operations as much as possible
 Code in a way that produces efficient compiled code
 Choose best suitable algorithm (before implementation)
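A small Python sketch of the first tip (the helper compute_limit, the settings dictionary and the sample values are hypothetical): a statement whose result does not change inside the loop is hoisted out so it executes once instead of n times.

def compute_limit(settings):
    # Hypothetical helper; pretend this is expensive to evaluate
    return settings["base"] * 2

settings = {"base": 10}
values = [5, 25, 12, 40, 3]

# Inefficient: the loop-invariant call runs on every iteration
total = 0
for x in values:
    if x < compute_limit(settings):
        total += x

# Better: the invariant value is computed once, outside the loop
limit = compute_limit(settings)
total = 0
for x in values:
    if x < limit:
        total += x
print(total)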
Execution Time Function
Algorithm statements are considered to be executed in equal logical units of
time
These logical units capture the relationship between input size (n) and execution
time; the resulting function is T(n)
 Execution time function will only show significant difference for large (n)
 For small input size running time differences do not matter
Running Time Functions …. Example
Say a problem has n as input size & there are two algorithms a1 and a2 with T(n)
respectively as:
 For a1: T(n) = 5000 n
 For a2 : T(n) = n2 + 2

n           T(n) for a1         T(n) for a2
10          50,000              102
100         500,000             10,002
1,000       5,000,000           1,000,002
10,000      50,000,000          100,000,002
1,000,000   5,000,000,000       1,000,000,000,002
Execution Time Functions ….
As per the previous example, algorithm a2 cannot be used for large input
sizes
Algorithm a1 initially looked costly, but for large inputs it remains feasible
 Growth of the complexity function matters a lot
Only the growth function provides the necessary abstraction and sound
mathematical grounds to compare algorithms
Execution Time Functions (Running Time)
Functions

Constant
log n
n
n log n
n²
n³
2ⁿ
nⁿ

Running time functions are listed above in ascending order


Execution Time Functions (Running Time)
 Constants are ignored
 Most dominant element is counted
 Elements with small growth values are also ignored
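For example, applying these rules to an illustrative expression (not taken from the slides):
T(n) = 3n³ + 20n² + 5 ≤ 3n³ + 20n³ + 5n³ = 28n³ for all n ≥ 1,
so the constants and the lower-order terms are dropped and T(n) = O(n³).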
Execution Time Functions ….
Execution Time Function    Example Problems
n²                         Bubble Sort, Insertion Sort, Selection Sort
n log(n)                   Merge Sort, Quick Sort, Heap Sort, Huffman Encoding
n + k                      Bucket Sort
nk                         Radix Sort
n³                         Matrix Multiplication
log n                      Binary Search (sorted data)
n                          Linear Search
Best, Average & Worst Cases
 For the same algorithm, not all inputs take the same time to execute
 There are input values for which execution time is least (best cases).
 Examples:
 In the case of sorting, the data is already sorted
 You are looking for a key (linear search) and the first element is your required key
 You are looking for a key (binary search) and the middle value is your required key
Best, Average & Worst Cases
 There are input values for which execution time is maximum (worst cases).
 Examples:
 We want to sort data in ascending order while it is already in descending order
 You are looking for a key (linear search) and it is not present in the array
 You are looking for a key (binary search) and it is not present in the array
Best, Average & Worst Cases
 Average case is present when no prediction is possible about data
 A data value can exist anywhere in the available list of data
 It is more difficult to measure than the best case or the worst case
 Example:
 Taking n random values and trying to sort them
Asymptotic Analysis
 Independent of hardware, platform & software
Expresses complexity of an algorithm in terms of a known function related to
input size
 Analysis describes the growth of running time with reference to input size
e.g. when n grows, T(n) will grow on the order of n log n
 Notation for this expression is: T(n) = O(f(n))
Asymptotic Analysis … Definition
T(n) = O(f(n)) if and only if there exist two constants c0 and n0
such that T(n) ≤ c0·f(n) for all n ≥ n0

Example: T(n) = 3n² + 7n + 10. Find O(f(n)), c0 & n0
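One possible worked solution (many other constant choices are equally valid): for all n ≥ 1,
T(n) = 3n² + 7n + 10 ≤ 3n² + 7n² + 10n² = 20n²,
so T(n) = O(n²) with f(n) = n², c0 = 20 and n0 = 1.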


Asymptotic Analysis … Constant Growth O(c)
No growth at all
 The runtime does not grow at all as a function of n (constant)
Basically, it is any operation that does not depend on the value of n to do its
job
 Has the slowest growth pattern (none!)

Examples:
1. Accessing an element of an array
2. Accessing the maximum value from a MAX HEAP
3. Accessing the header node of a linked list
4. Accessing the root node of a tree
5. Hashing
Asymptotic Analysis … Logarithmic Growth O(log n)
Logarithmic Growth
 The runtime growth is proportional to the base 2 logarithm (log) of n

Examples:
1. Binary Search
2. Max/Min value from a complete binary tree
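A minimal Python sketch of binary search (assuming the input list is already sorted; names are illustrative): each iteration halves the remaining range, so at most about log2(n) iterations are needed.

def binary_search(a, key):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # each step halves the search range
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                      # key not present

print(binary_search([2, 4, 7, 11, 16, 29, 60], 16))   # prints 4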
Asymptotic Analysis … Linear Growth O(n)
Linear Growth
 Runtime grows proportional to the value of n

Examples:
1. Linear Search
2. Max/Min value from an array
3. Sum of values in an array
4. Linked list traversal
Asymptotic Analysis … O(n log n)
(n log n) Growth
Comparison-based sorting algorithms can do no better than Θ(n log n);
divide-and-conquer sorts such as merge sort achieve this O(n log n) bound

Examples:
1. Merge Sort
2. Quick Sort
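A compact Python sketch of merge sort (illustrative, not the course's reference implementation): the array is halved about log2(n) times, and each level does O(n) merging work, giving O(n log n).

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # O(n) merge per level
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([29, 2, 60, 11, 4, 16, 7]))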
Asymptotic Analysis … O(n²)
(n²) Growth
 Running Time grows rapidly
 Slow sorting algorithms

Examples:
1. Bubble Sort
2. Insertion Sort
3. Selection Sort
4. Quick Sort (Worst Case)
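A short Python sketch of bubble sort (illustrative): the two nested passes perform roughly n·(n-1)/2 comparisons, which is where the O(n²) behaviour comes from.

def bubble_sort(a):
    n = len(a)
    for i in range(n - 1):            # n-1 passes ...
        for j in range(n - 1 - i):    # ... each scanning the unsorted prefix
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([29, 2, 60, 11, 4, 16, 7]))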
Asymptotic Analysis … Polynomial Growth O(nᵏ)
(nᵏ) Growth
 Running Time grows rapidly
 Suitable for small n

Examples:
1. Matrix multiplication
2. Maximum matching for bipartite graph
3. Multiplying n-digit numbers by simple
algorithm
Asymptotic Analysis … Exponential Growth O(2ⁿ)
(2ⁿ) Growth
 Running Time grows extremely rapidly
 Suitable only for very small n

Examples:
1. Exact solution for travelling salesman
problem
2. Brute force search problems
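A hedged Python sketch of a brute-force search (the function best_subset_sum and its inputs are purely illustrative): it examines every one of the 2ⁿ subsets, so the running time roughly doubles each time n grows by one.

from itertools import combinations

def best_subset_sum(values, target):
    # Try all 2^n subsets and keep the best sum not exceeding target
    best = 0
    for r in range(len(values) + 1):
        for subset in combinations(values, r):
            s = sum(subset)
            if s <= target and s > best:
                best = s
    return best

print(best_subset_sum([7, 11, 16, 29], 28))   # prints 27 (11 + 16)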
Asymptotic Analysis … Graph for n (1-15)
[Chart: log(n) vs n, plotted for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n vs n log(n), plotted for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n log(n) vs n² vs n³, plotted for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n³ vs 2ⁿ, plotted for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-100)
[Chart: log n, n, n log n, n², n³ and 2ⁿ compared]

Asymptotic Notations & Complexity
Asymptotic Time Complexity
The limiting behavior of the execution time of an algorithm when the size of
the problem goes to infinity

Asymptotic Space Complexity


 The limiting behavior of the use of memory space of an algorithm when the
size of the problem goes to infinity
Asymptotic Notations (Theta Notation Θ)
 For non-negative functions, f(n) and g(n), f(n) is theta of g(n) if and only if
f(n) = O(g(n)) and f(n) = Ω(g(n)). This is denoted as "f(n) = Θ(g(n))"
This is basically saying that the function, f(n) is bounded both from the top
and bottom by the same function, g(n)
A function f(n) is Θ(g(n)) if there exist positive constants c1, c2, and n0 such
that: 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
Asymptotic Notations (Theta Notation Θ)
Function f(n) belongs to Θ(g(n)) if there exist positive constants c1 and c2 such
that its value lies between c1.g(n) and c2.g(n), for all large n
Example:
Given function: 4n+3 belongs to Θ(n) for c1 = 4, c2 = 5 and n0 = 3, since
4n+3 is always >= 4n for n >= 3 and
4n+3 is always <= 5n for n >= 3
Hence 4n+3 = Θ(n)

Note: Θ notation is also called asymptotic tight bound


Asymptotic Notations (Big O Notation)
The O (pronounced big-oh) is the formal method of expressing the upper
bound of an algorithm's running time
 It's a measure of the longest amount of time it could possibly take for the
algorithm to complete the task
 It can be assumed that it represents the "worst case scenario" of a program
Asymptotic Notations (Big O Notation)
More formally, f(n) = O(g(n)), iff there exist two constants c and n0 such that
f(n) ≤ c.g(n) for all n >= n0
 For example, say f(n) = 2n+5, then f(n) is O(n) for c = 3 and n0=5

Note: g(n) = n, c = 3 & n0 = 5 in above example


Asymptotic Notations (Big O Notation)
 Big O is said to be an upper bound function
 If f(n) = O(g(n)), then it can be said that f can grow at most as fast as g
If f(n) = O(n) then also
f(n) = O(n log n)
f(n) = O(n²) and so on
 So we have to find the smallest g(n) for which f(n) = O(g(n))
Big O Notation …………. Few Rules
For any polynomial function f(n) = aₖnᵏ + aₖ₋₁nᵏ⁻¹ + … + a₁n + a₀, where
a₀, a₁, …, aₖ are real numbers, f(n) = O(nᵏ)
If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then (f1 + f2)(x) is
O(max(g1(x), g2(x)))
 If f1(x) is O(g(x)) and f2(x) is O(g(x)), then (f1 + f2)(x) is O(g(x))
If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then (f1f2)(x) is
O(g1(x)g2(x))

Note: Discuss examples here
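For example (an illustrative pair of functions, not from the slides): if f1(n) = 3n² + 5, which is O(n²), and f2(n) = 8n log n, which is O(n log n), then
(f1 + f2)(n) = 3n² + 8n log n + 5 is O(max(n², n log n)) = O(n²), and
(f1 · f2)(n) = (3n² + 5)(8n log n) is O(n² · n log n) = O(n³ log n).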


Asymptotic Notations (Big Ω Notation)
For non-negative functions, f(n) and g(n), if there exists an integer n0
and a constant c > 0 such that for all integers n≥n0, f(n) ≥ cg(n),
then f(n) is omega of g(n)
 This is denoted as "f(n) = Ω(g(n))"
 This is almost the same definition as Big Oh, except that
"f(n) ≥ cg(n)", this makes g(n) a lower bound function, instead of an
upper bound function
 It describes the best that can happen for a given data size
Asymptotic Notations (Big Ω Notation)
 f(n) is Ω(g(n)) if there exist positive constants c and n0 such that 0 ≤ c⋅g(n) ≤
f(n) for all n ≥ n0

Note: f(n) is Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n))


Asymptotic Notations (Little o Notation)
 For non-negative functions, f(n) and g(n), f(n) is little o of g(n) if and only if
f(n) = O(g(n)), but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = o(g(n))".
This represents a loose bounding version of Big O: g(n) bounds f(n) from the top,
but it does not bound it from the bottom
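For example, n = o(n²), since n is O(n²) but n is not Θ(n²); on the other hand, 2n + 5 is O(n) and also Θ(n), so 2n + 5 is not o(n).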
Asymptotic Notations (Little ω Notation)
• For non-negative functions, f(n) and g(n), f(n) is little omega of g(n)
if and only if f(n) = Ω(g(n)), but f(n) ≠ Θ(g(n)). This is denoted as
"f(n) = ω(g(n))".
• Much like Little Oh, this is the equivalent for Big Omega. g(n) is a
loose lower boundary of the function f(n); it bounds from the bottom,
but not from the top.
Asymptotic Notations ………. Example
function find-min(array a[1..n])
    let j := ∞        // running minimum, starts above every possible value
    for i := 1 to n:
        j := min(j, a[i])
    next
    return j
end

O(n)?
Asymptotic Notations ………. Example
function find-min-plus-max(array a[1..n])
    let j := ∞ ;       // First, find the smallest element in the array
    for i := 1 to n:
        j := min(j, a[i])
    repeat
    let minim := j     // Now, find the biggest element
    j := -∞ ;
    for i := 1 to n:
        j := max(j, a[i])
    repeat
    let maxim := j
    return minim + maxim;   // return the sum of the two
end

O(n)?
Asymptotic Notations ………. Example
function max-diff(array a[1..n])    // Max difference problem
    m := 0
    for i := 1 to n-1
        for j := i + 1 to n
            if |a[i] – a[j]| > m then
                m := |a[i] – a[j]|
    return m
end proc

O(n)?
Asymptotic Notations ………. Example
function max-diff(array a[1..n])    // Max difference problem (Another Algorithm)
    min := a[1]
    max := a[1]
    for i := 2 to n
        if a[i] < min then
            min := a[i]
        else if a[i] > max then
            max := a[i]
    m := max – min
    return m
end proc

O(n)?
Big Oh is not the complete story
Given two algorithms A and B with the same asymptotic performance, why select one
over the other? They're both the same, right?
 They may not be the same. There is this small matter of the constant of
proportionality.
Suppose that A does ten operations for each data item, but algorithm B only
does three
It is reasonable to expect B to be faster than A even though both have the
same asymptotic performance. The reason is that asymptotic analysis
ignores constants of proportionality

Note: Discuss examples here
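A small illustrative Python sketch (both functions are made-up stand-ins, not real algorithms): A and B are both O(n), but A performs about ten units of work per item and B about three, so B is noticeably faster even though asymptotic analysis treats them as equal.

def algorithm_a(data):
    acc = 0
    for x in data:
        for _ in range(10):   # ~10 units of work per data item
            acc += x
    return acc

def algorithm_b(data):
    acc = 0
    for x in data:
        for _ in range(3):    # ~3 units of work per data item
            acc += x
    return acc

# Both are O(n); the hidden constant makes B roughly three times faster in practice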
