Analysis of Algorithms

This document discusses analyzing algorithms to determine their time complexity or running time. It explains that algorithms should be analyzed asymptotically based on how their running time increases with problem size. Common time complexities include constant O(1), linear O(n), quadratic O(n^2), and logarithmic O(log n). Examples are provided to demonstrate determining the time complexity of simple algorithms using for loops. Big O notation provides an upper bound, while Big Omega and Big Theta notations also consider lower bounds.

Analysis of Algorithms

CSE225: Data Structures and Algorithms


Analysis of Algorithms
• An algorithm is a finite set of precise instructions for performing a
computation or for solving a problem.
• It must produce the correct result
• It must finish in some finite time
• You can represent an algorithm using pseudocode, a flowchart, or even actual
code
Analysis of Algorithms
• What does it mean to analyze an algorithm?
• To have an estimate about how much time an algorithm may take to finish,
or in other words, to analyze its running time
• Sometimes, instead of running time, we are interested in how much memory
the algorithm may consume while it runs
• It enables us to compare between two algorithms
• What do we mean by running time analysis?
• Also referred to as time-complexity analysis
• To determine how running time increases as the size of the problem
increases.
Analysis of Algorithms
• Size of the problem can be a range of things, including
• size of an array
• polynomial degree of an equation
• number of elements in a matrix
• number of bits in the binary representation of the input
• and so on…
How Do We Analyze Running Time?
• We need to define an objective measure.
• (1) Compare execution times?
• Not good: times are specific to a particular computer!
How Do We Analyze Running Time?
• We need to define an objective measure.
• (2) Count the number of statements executed?
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.
• Not good: the number of statements varies with the programming language as well as the
style of the individual programmer.

Algorithm 1:
arr[0] = 0;        cost c1
arr[1] = 0;        cost c1
arr[2] = 0;        cost c1
...
arr[N-1] = 0;      cost c1
----------------------
Total: c1 + c1 + ... + c1 = c1·N

Algorithm 2:
for(i=0; i<N; i++)   cost c1 + c2(N+1) + c3·N
    arr[i] = 0;      cost c1·N
-----------------------------
Total: c1 + c2(N+1) + c3·N + c1·N = (c1+c2+c3)·N + (c1+c2)
Ideal Solution
• Express running time as a function of the input size n (i.e., f(n)).
• Compare different functions of running times in an asymptotic
manner.
• Such an analysis is independent of machine type, programming style,
etc.
Visualizing Orders of Growth
• On a graph, as you go to the right, a faster-growing function
eventually becomes larger.

[Figure: running time vs. increasing n; fB(n) = n² + 1 eventually overtakes fA(n) = 30n + 8.]
Growth of Functions
Complexity Graphs
[Figure: growth of log(n)]
Complexity Graphs
[Figure: n log(n) vs. log(n)]
Complexity Graphs
[Figure: n^10, n^3, n^2, and n log(n)]
Complexity Graphs (log scale)
[Figure: n^n, 3^n, 2^n, n^20, n^10, and 1.1^n]
Types of Analysis
• Is input size everything that matters?
int find_a(char *str)
{
int i;
for (i = 0; str[i]; i++) {
if (str[i] == 'a') return i;
}
return -1;
}
• Time complexity: 𝑂(𝑛)
• Consider two inputs: “alibi” and “never”
Types of Analysis
• So how does the running time vary with respect to various input?

• Three scenarios
• Best case
• best_find_a(n) = min over x ∈ X_n of runtime_find_a(x)
• Worst case
• worst_find_a(n) = max over x ∈ X_n of runtime_find_a(x)
• Average case
• avg_find_a(n) = (1 / |X_n|) · Σ over x ∈ X_n of runtime_find_a(x)
(where X_n denotes the set of all inputs of size n)
Types of Analysis
• Worst-case: (usually done)
• upper bound on running time
• maximum running time of algorithm on any input of size n
• Average-case: (sometimes done)
• we take all possible inputs and calculate computing time
for all of the inputs
• sum all the calculated values and divide the sum by total
number of inputs
• we must know (or predict) distribution of cases
• Best-case: (bogus)
• lower bound on running time of an algorithm
• minimum running time of algorithm on any input of size n
Asymptotic Upper Bound
• Asymptotically bounds the growth of a running time for large values
of n to within constant factors above, or in other words, “the running
time grows at most this much, but it could grow more slowly”
• To compare two algorithms with running times f(n) and g(n), we
establish a rough measure that characterizes how fast each function
grows.
• Compare functions in the limit, that is, asymptotically!
Asymptotic Upper Bound
• O-notation: f(n) is O(g(n)) if there exist constants C > 0 and k such that
f(n) ≤ C·g(n) for all n > k.
Asymptotic Upper Bound
• Example:
• Show that f(x) = x² + 2x + 1 is O(x²).
• For x > 1 we have:
• x² + 2x + 1 ≤ x² + 2x² + x²
• x² + 2x + 1 ≤ 4x²
• Therefore, for C = 4 and k = 1:
• f(x) ≤ Cx² whenever x > k.
• Hence f(x) is O(x²).
Asymptotic Upper Bound
• We say 𝑓(𝑛) = 30000 is in the order of 1, or 𝑶(𝟏)
• The growth rate of 30000 is constant, that is, it is not dependent on problem size.
• 𝑓(𝑛) = 30𝑛 + 8 is in the order of 𝑛, or 𝑶(𝒏)
• The growth rate of 30𝑛 + 8 is roughly proportional to the growth rate of 𝑛.
• 𝑓(𝑛) = 𝑛² + 1 is in the order of 𝑛², or 𝑶(𝒏²)
• The growth rate of 𝑛² + 1 is roughly proportional to the growth rate of 𝑛².
• In general, any 𝑂(𝑛²) function is faster-growing than any 𝑂(𝑛)
function.
• For large 𝑛, an 𝑂(𝑛²) algorithm runs a lot slower than an 𝑂(𝑛) algorithm.
Asymptotic Upper Bound
• Consider the example of buying elephants and goldfish:
Cost: cost_of_elephants + cost_of_goldfish
Cost ~ cost_of_elephants (approximation)
• Easier way: Discard the low order terms, as well as constants in a
function since they are relatively insignificant for large n
6n + 4 ~ n
n4 + 100n2 + 10n + 50 ~ n4
i.e., we say that n4 + 100n2 + 10n + 50 and n4 have the same rate of
growth
Some Examples
• Determine the time complexity for the following algorithm.
count = 0;
for(i=0; i<10000; i++)
count++;
Some Examples
• Determine the time complexity for the following algorithm.
count = 0; 𝑶(𝟏)
for(i=0; i<10000; i++)
count++;
Some Examples
• Determine the time complexity for the following algorithm.
count = 0;
for(i=0; i<N; i++)
count++;
Some Examples
• Determine the time complexity for the following algorithm.
count = 0; 𝑶(𝒏)
for(i=0; i<N; i++)
count++;
Some Examples
• Determine the time complexity for the following algorithm.
sum = 0;
for(i=0; i<N; i++)
for(j=0; j<N; j++)
sum += arr[i][j];
Some Examples
• Determine the time complexity for the following algorithm.
sum = 0; 𝑶(𝒏𝟐)
for(i=0; i<N; i++)
for(j=0; j<N; j++)
sum += arr[i][j];
Some Examples
• Determine the time complexity for the following algorithm.
count = 0;
for(i=1; i<N; i=i*2)
count++;
Some Examples
• Determine the time complexity for the following algorithm.
count = 0; 𝑶(log 𝒏)
for(i=1; i<N; i=i*2)
count++;
Some Examples
• Determine the time complexity for the following algorithm.
char someString[10];
gets(someString);
for(i=0; i< strlen(someString); i++)
someString[i] -= 32;

int n = strlen(someString);
for(i=0; i< n; i++)
someString[i] -= 32;
Some Examples

• Determine the time complexity for the following algorithm.


char someString[10]; 𝑶(𝒏𝟐)
gets(someString);
for(i=0; i<strlen(someString); i++)
someString[i] -= 32;

int n = strlen(someString);
for(i=0; i< n; i++) 𝑶(𝒏)
someString[i] -= 32;
Big Oh (O) Notation
The notation O(n) is the formal way to express the upper bound of an algorithm's running time. It measures the
worst case time complexity or the longest amount of time an algorithm can possibly take to complete.
Big Omega (Ω) Notation
The notation Ω(n) is the formal way to express the lower bound of an algorithm's running time. It measures the
best case time complexity or the best amount of time an algorithm can possibly take to complete.
Big Theta (Θ) Notation
The notation θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm's
running time.