
Lecture 01

Analysis of algorithms
CSE373: Design and Analysis of Algorithms
Course Objective
The theoretical study of design and analysis of computer
algorithms

Analysis: predict the cost of an algorithm in terms of resources and performance
Design: design algorithms which minimize the cost
Definition
In simple terms, an algorithm is a series of instructions to
solve a problem (complete a task)

Problems can be in any form


Business
Get a part from Dhaka to Sylhet by morning
Allocate manpower to maximize profit

Life
I am hungry. How do I order pizza?
Explain how to tie shoelaces to a five-year-old child
Definition
An algorithm is a finite set of precise instructions for
performing a computation or for solving a problem.
It must produce the correct result
It must finish in some finite time
You can represent an algorithm using pseudocode, flowchart, or
even actual code
Algorithm Description
The Problem of Sorting

Input: sequence a1, a2, …, an of numbers.

Output: permutation a'1, a'2, …, a'n such that a'1 ≤ a'2 ≤ … ≤ a'n.

Example:
Input: 8 2 4 9 3 6
Output: 2 3 4 6 8 9
Insertion Sort

INSERTION-SORT (A, n)    ⊳ A[1 . . n]    ("pseudocode")
    for j ← 2 to n
        key ← A[j]
        i ← j − 1
        while i > 0 and A[i] > key
            A[i+1] ← A[i]
            i ← i − 1
        A[i+1] ← key

[Figure: array A with A[1 . . j−1] sorted; key = A[j] is inserted into the sorted part.]
Example

8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6    (9 is already in place)
2 3 4 8 9 6
2 3 4 6 8 9    done
How can you modify this algorithm so that it sorts numbers in reverse order?
Analysis of Algorithms
What does it mean to analyze an algorithm?
To have an estimate about how much time an algorithm may
take to finish, or in other words, to analyze its running time
Sometimes, instead of running time, we are interested in how
much memory the algorithm may consume while it runs
It enables us to compare between two algorithms
What do we mean by running time analysis?
Also referred to as time-complexity analysis
To determine how running time increases as the size of the
problem increases.
Analysis of Algorithms
Size of the problem can be a range of things, including
size of an array
polynomial degree of an equation
number of elements in a matrix
number of bits in the binary representation of the input
and so on…
How Do We Analyze Running Time?
We need to define an objective measure.
(1) Compare execution times?
Not good: times are specific to a particular computer!
How Do We Analyze Running Time?
We need to define an objective measure.
(2) Count the number of statements executed?
Associate a "cost" with each statement.
Find the "total cost" by multiplying each statement's cost by the number of times it executes, and summing.
Not good: the number of statements varies with the programming language as
well as the style of the individual programmer.

Algorithm 1                      Cost
arr[0] = 0;                      c1
arr[1] = 0;                      c1
arr[2] = 0;                      c1
...
arr[N-1] = 0;                    c1
----------------------
Total: c1 + c1 + ... + c1 = c1·N

Algorithm 2                      Cost
for(i=0; i<N; i++)               c1 + c2(N+1) + c3·N
    arr[i] = 0;                  c1·N
-----------------------------
Total: c1 + c2(N+1) + c3·N + c1·N = (c1+c2+c3)·N + (c1+c2)
Insertion Sort

INSERTION-SORT (A, n)    ⊳ A[1 . . n]
    for j ← 2 to n
        key ← A[j]
        i ← j − 1
        while i > 0 and A[i] > key
            A[i+1] ← A[i]
            i ← i − 1
        A[i+1] ← key

What is the estimated running time?
It depends on the arrangement of numbers in the input array. We are typically
interested in the runtime of an algorithm in the worst-case scenario,
because it gives a guarantee that the algorithm won't take any longer than
this for any input.

How can you arrange the input numbers so that this
algorithm becomes most inefficient (worst case)?
Insertion Sort: Running Time

INSERTION-SORT (A, n)  ⊳ A[1 . . n]        Cost    Times
    for j ← 2 to n                          c1      n
        key ← A[j]                          c2      n − 1
        i ← j − 1                           c3      n − 1
        while i > 0 and A[i] > key          c4      Σ_{j=2..n} t_j
            A[i+1] ← A[i]                   c5      Σ_{j=2..n} (t_j − 1)
            i ← i − 1                       c6      Σ_{j=2..n} (t_j − 1)
        A[i+1] ← key                        c7      n − 1

T(n) = c1·n + c2(n−1) + c3(n−1) + c4 Σ_{j=2..n} t_j + c5 Σ_{j=2..n} (t_j − 1) + c6 Σ_{j=2..n} (t_j − 1) + c7(n−1)

Here t_j = the number of times the condition of the while loop is tested for the current value of j.
In the worst case (when the input is reverse-sorted), each iteration of the for loop right-shifts all
j − 1 sorted elements and inserts the key in front of them, so the while condition is tested for
i = j−1, …, 1 and once more when i = 0, i.e., t_j = j.
Putting this in the above equation, we get: T(n) = An² + Bn + C, where A, B, C are constants.
What is T(n) in the best case (when the input numbers are already sorted)?
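For reference, a sketch of the best case: when the input is already sorted the while test fails immediately, so every t_j = 1 and the c5, c6 terms vanish:

```latex
T(n) = c_1 n + c_2(n-1) + c_3(n-1) + c_4(n-1) + c_7(n-1)
     = (c_1 + c_2 + c_3 + c_4 + c_7)\,n - (c_2 + c_3 + c_4 + c_7)
```

i.e., the best-case running time is linear in n.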
Ideal Solution
Express running time as a function of the input size n (i.e.,
f(n)).
Compare different functions of running times in an
asymptotic manner.
Such an analysis is independent of machine type,
programming style, etc.
Asymptotic Analysis
To compare two algorithms with running times f(n) and
g(n), we need a rough measure that characterizes how fast
each function grows.
Hint: use rate of growth
Compare functions in the limit, that is, asymptotically!
(i.e., for large values of n)
Rate of Growth
Consider the example of buying elephants and goldfish:
Cost: cost_of_elephants + cost_of_goldfish
Cost ~ cost_of_elephants (approximation)
The low-order terms, as well as constants, in a function are
relatively insignificant for large n
6n + 4 ~ n
n⁴ + 100n² + 10n + 50 ~ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth
Big-O Notation
We say f(n) = 30000 is in the order of 1, or O(1)
The growth rate of 30000 is constant, that is, it does not depend on the
problem size.
f(n) = 30n + 8 is in the order of n, or O(n)
The growth rate of 30n + 8 is roughly proportional to the growth rate of n.
f(n) = n² + 1 is in the order of n², or O(n²)
The growth rate of n² + 1 is roughly proportional to the growth rate of n².
In general, a function that grows as n² is faster-growing than one that
grows as n: for large n, a Θ(n²) algorithm runs a lot slower than a Θ(n)
algorithm.
Visualizing Orders of Growth
On a graph, as you go to the right, a faster growing function
eventually becomes larger.

[Graph: running time vs. n for fA(n) = 30n + 8 and fB(n) = n² + 1; the quadratic fB eventually overtakes the linear fA as n increases.]
Growth of Functions
Complexity Graphs

[Graph: log(n) vs. n log(n)]
[Graph: n log(n), n^2, n^3, n^10]
[Graph (log scale): 1.1^n, n^10, n^20, 2^n, 3^n, n^n]
Asymptotic Notations
O notation: asymptotic "upper bound": f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀

Ω notation: asymptotic "lower bound": f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀

Θ notation: asymptotic "tight bound": f(n) = Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n))
Asymptotic Notations
• O-notation

O(g(n)) is the set of functions with smaller or same order of growth as g(n)

Examples:
T(n) = 3n² + 10n lg n + 8 is O(n²), O(n² lg n), O(n³), O(n⁴), …
T'(n) = 52n² + 3n² lg n + 8 is O(n² lg n), O(n³), O(n⁴), …
Asymptotic Notations
• Ω-notation

Ω(g(n)) is the set of functions with larger or same order of growth as g(n)

Examples:
T(n) = 3n² + 10n lg n + 8 is Ω(n²), Ω(n lg n), Ω(n), Ω(lg n), Ω(1)
T'(n) = 52n² + 3n² lg n + 8 is Ω(n² lg n), Ω(n²), Ω(n), …
Asymptotic Notations
Θ-notation

Θ(g(n)) is the set of functions with the same order of growth as g(n)

* f(n) is both O(g(n)) and Ω(g(n)) ↔ f(n) is Θ(g(n))

Examples:
T(n) = 3n² + 10n lg n + 8 is Θ(n²)
T'(n) = 52n² + 3n² lg n + 8 is Θ(n² lg n)
Some Examples
Determine the time complexity for the following algorithm.
count = 0;
for(i=0; i<10000; i++)
count++; 𝑶(𝟏)
Some Examples
Determine the time complexity for the following algorithm.
count = 0;
for(i=0; i<n; i++)
count++; 𝑶(𝒏)
Some Examples
Determine the time complexity for the following algorithm.
sum = 0;
for(i=0; i<n; i++)
    for(j=0; j<n; j++)
        sum += arr[i][j];   O(n²)
Some Examples
Determine the time complexity for the following algorithm.
count = 0;
for(i=1; i<=n; i=i*2)
    count++;   O(lg n)
Some Examples
Determine the time complexity for the following algorithm.
sum = 0;
for(i=1; i<n; i=i*2)
    for(j=0; j<n; j++)
        sum += i*j;   O(n lg n)
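One way to see this bound: the outer loop variable i takes the values 1, 2, 4, …, so the outer loop runs about lg n times, and the inner loop does n work on each of those passes:

```latex
T(n) \;\approx\; \sum_{k=0}^{\lceil \lg n \rceil - 1} n \;=\; n \lceil \lg n \rceil \;=\; \Theta(n \lg n)
```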
Some Examples
Determine the time complexity for the following algorithm.
sum = 0;
for(i=1; i<=n; i=i*4)
    for(j=1; j<=n; j*=2)
        sum += i*j;

(Note: j must start at 1; starting at 0, j*=2 would leave j at 0 and the inner loop would never terminate.)

Asymptotic Tight Bound: Θ(lg² n) WHY?
(Hint: the outer loop runs ⌊log₄ n⌋ + 1 times and the inner loop ⌊lg n⌋ + 1 times.)
Some Examples
Determine the time complexity for the following algorithm.
sum = 0;
for(i=1; i<n; i=i*2)
    for(j=0; j<i; j++)
        sum += i*j;

Loose Upper Bound: O(n lg n)
Tight Upper Bound: O(n) WHY?
Asymptotic Tight Bound: Θ(n)
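The Θ(n) bound follows because the inner loop runs i times and i doubles on each outer pass, so the total work is a geometric series:

```latex
\sum_{k=0}^{\lceil \lg n \rceil - 1} 2^k \;=\; 2^{\lceil \lg n \rceil} - 1 \;<\; 2n \;=\; \Theta(n)
```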
Some Examples
Determine the time complexity for the following algorithm.
char someString[10];
gets(someString);
for(i=0; i<strlen(someString); i++)
    someString[i] -= 32;   O(n²)
Types of Analysis
Is input size everything that matters?
int find_a(char *str)
{
    int i;
    for (i = 0; str[i]; i++)
    {
        if (str[i] == 'a')
            return i;
    }
    return -1;
}
Time complexity: O(n)

Consider two inputs: “alibi” and “never”


Types of Analysis
So how does the running time vary with respect to various inputs?

Three scenarios (let X_n be the set of all inputs of size n):
Best case:    best_find_a(n) = min_{x ∈ X_n} runtime_find_a(x)
Worst case:   worst_find_a(n) = max_{x ∈ X_n} runtime_find_a(x)
Average case: avg_find_a(n) = (1 / |X_n|) · Σ_{x ∈ X_n} runtime_find_a(x)
Types of Analysis
Worst-case: (usually done)
upper bound on running time
maximum running time of algorithm on any input of size n
Average-case: (sometimes done)
take all possible inputs and calculate the computing time for each
sum all the calculated values and divide the sum by the total number of inputs
we must know (or predict) the distribution of cases
Best-case: (bogus)
lower bound on running time of an algorithm
minimum running time of algorithm on any input of size n
