Analysis of Algorithms
Prof. Rajesh Mukherjee
Dept. of CSE
Haldia Institute of Technology
Analysis of Algorithms
[Diagram: Input → Algorithm → Output]
An algorithm is a step-by-step procedure for
solving a problem in a finite amount of time.
Running Time
• The running time of an algorithm typically grows with the input size.
• Average-case time is often difficult to determine.
• We focus on the worst-case running time.
  – Easier to analyze
  – Crucial to applications such as games, finance and robotics
[Figure: best-case, average-case, and worst-case running time vs. input size]
Experimental Studies
• Write a program implementing the algorithm
• Run the program with inputs of varying size and composition
• Use a method like System.currentTimeMillis() to get an accurate measure of the actual running time
• Plot the results (a timing sketch follows below)
[Figure: measured running time (ms) vs. input size]
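A minimal sketch of such an experiment in Java, assuming we time java.util.Arrays.sort as the algorithm under study (any algorithm could be substituted); the input sizes and the random seed are arbitrary choices for illustration:

import java.util.Arrays;
import java.util.Random;

public class TimingExperiment {
    public static void main(String[] args) {
        Random rnd = new Random(42);                          // fixed seed for repeatability
        for (int n = 100_000; n <= 1_000_000; n += 100_000) {
            int[] input = rnd.ints(n).toArray();              // random input of size n
            long start = System.currentTimeMillis();          // wall-clock time before the run
            Arrays.sort(input);                               // algorithm under study
            long elapsed = System.currentTimeMillis() - start;
            System.out.println(n + "\t" + elapsed + " ms");   // one (input size, time) point to plot
        }
    }
}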
Limitations of Experiments
• It is necessary to implement the algorithm,
which may be difficult
• Results may not be indicative of the running
time on other inputs not included in the
experiment.
• In order to compare two algorithms, the
same hardware and software environments
must be used
Theoretical Analysis
• Uses a high-level description of the
algorithm instead of an implementation
• Characterizes running time as a function of
the input size, n.
• Takes into account all possible inputs
• Allows us to evaluate the speed of an
algorithm independent of the
hardware/software environment
Pseudocode
• High-level description of an algorithm
• More structured than English prose
• Less detailed than a program
• Preferred notation for describing algorithms
• Hides program design issues

Example: find the max element of an array

Algorithm arrayMax(A, n)
  Input array A of n integers
  Output maximum element of A
  currentMax ← A[0]
  for i ← 1 to n − 1 do
    if A[i] > currentMax then
      currentMax ← A[i]
  return currentMax
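For comparison only, a direct Java rendering of this pseudocode might look as follows (an illustrative sketch; the pseudocode above remains the preferred notation for describing the algorithm):

// Java translation of the arrayMax pseudocode
public static int arrayMax(int[] A) {
    int currentMax = A[0];                  // currentMax ← A[0]
    for (int i = 1; i <= A.length - 1; i++) // for i ← 1 to n − 1 do
        if (A[i] > currentMax)              //   if A[i] > currentMax then
            currentMax = A[i];              //     currentMax ← A[i]
    return currentMax;                      // return currentMax
}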
Pseudocode Details
• Control flow
  – if … then … [else …]
  – while … do …
  – repeat … until …
  – for … do …
  – Indentation replaces braces
• Method declaration
  Algorithm method (arg [, arg…])
    Input …
    Output …
• Method call
  var.method (arg [, arg…])
• Return value
  return expression
• Expressions
  – ← assignment (like = in Java)
  – = equality testing (like == in Java)
  – n² superscripts and other mathematical formatting allowed
The Random Access Memory (RAM) Model
• A potentially unbounded bank of memory cells, each of which can hold an arbitrary number or character
• Memory cells are numbered, and accessing any cell in memory takes unit time.
[Figure: memory cells numbered 0, 1, 2, …]
Primitive Operations
• Basic computations performed by an algorithm
• Identifiable in pseudocode
• Largely independent from the programming language
• Exact definition not important
• Assumed to take a constant amount of time in the RAM model
• Examples:
  – Evaluating an expression
  – Assigning a value to a variable
  – Indexing into an array
  – Calling a method
  – Returning from a method
Counting Primitive Operations
• By inspecting the pseudocode, we can determine the maximum number of primitive operations executed by an algorithm, as a function of the input size

Algorithm arrayMax(A, n)              # operations
  currentMax ← A[0]                    2
  for i ← 1 to n − 1 do                2n
    if A[i] > currentMax then          2(n − 1)
      currentMax ← A[i]                2(n − 1)
    { increment counter i }            2(n − 1)
  return currentMax                    1
                             Total     8n − 2
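As an illustrative sketch (not part of the original analysis), instrumenting a Java version of arrayMax with a simple counter shows that the number of element comparisons is exactly n − 1, so the operation count grows linearly with n:

// Illustrative: count the element comparisons made by arrayMax
static long comparisons = 0;

static int arrayMax(int[] A) {
    int currentMax = A[0];
    for (int i = 1; i < A.length; i++) {
        comparisons++;                 // one A[i] vs. currentMax comparison per iteration
        if (A[i] > currentMax)
            currentMax = A[i];
    }
    return currentMax;                 // after the call, comparisons == A.length - 1
}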
Estimating Running Time
• Algorithm arrayMax executes 8n − 2 primitive operations in the worst case. Define:
  a = time taken by the fastest primitive operation
  b = time taken by the slowest primitive operation
• Let T(n) be the worst-case time of arrayMax. Then
  a(8n − 2) ≤ T(n) ≤ b(8n − 2)
• Hence, the running time T(n) is bounded by two linear functions
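A purely hypothetical numeric illustration of these bounds: if the fastest primitive operation takes a = 1 ns and the slowest takes b = 10 ns, then for n = 1000 there are 8n − 2 = 7998 operations, so T(1000) lies between roughly 7998 ns ≈ 8 μs and 79,980 ns ≈ 80 μs. Both bounds grow linearly in n.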
Frequency Count Method (Counting Frequency)

Algorithm arrayMax(A, n)              Frequency
  currentMax ← A[0]                    1
  for i ← 1 to n − 1 do                n
    if A[i] > currentMax then          (n − 1)
      currentMax ← A[i]                (n − 1)
    { increment counter i }            (n − 1)
  return currentMax                    1
                             Total     4n − 2
Growth Rate of Running Time
• Changing the hardware/ software
environment
– Affects T(n) by a constant factor, but
– Does not alter the growth rate of T(n)
• The linear growth rate of the running time
T(n) is an intrinsic property of algorithm
arrayMax
Seven Important Functions
Seven functions that often appear in algorithm analysis:
• Constant: 1
• Logarithmic: log n
• Linear: n
• N-Log-N: n log n
• Quadratic: n²
• Cubic: n³
• Exponential: 2ⁿ

In a log-log chart, the slope of the line corresponds to the growth rate of the function.
[Figure: log-log plot of T(n) vs. n for the cubic, quadratic, and linear functions]
Constant Factors
• The growth rate is not affected by constant factors or lower-order terms
• Examples
  – 10n + 10 is a linear function
  – 10⁵n² + 10⁸n is a quadratic function
[Figure: log-log plot showing that constant factors shift the quadratic and linear curves without changing their slopes]
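To see why the lower-order term does not matter in the second example: 10⁵n² ≥ 10⁸n exactly when n ≥ 10³, so for all n beyond a thousand the quadratic term alone dominates 10⁵n² + 10⁸n, and the function grows like n² up to a constant factor.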
Big-Oh Notation
• Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that
  f(n) ≤ c·g(n) for n ≥ n0
• Example: 2n + 10 is O(n)
  – 2n + 10 ≤ cn
  – (c − 2)n ≥ 10
  – n ≥ 10/(c − 2)
  – Pick c = 3 and n0 = 10
[Figure: log-log plot of n, 2n + 10, and 3n; 3n dominates 2n + 10 for n ≥ 10]
More Big-Oh Examples
• 7n − 2 is O(n)
  need c > 0 and n0 ≥ 1 such that 7n − 2 ≤ c·n for n ≥ n0
  this is true for c = 7 and n0 = 1
• 3n³ + 20n² + 5 is O(n³)
  need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0
  this is true for c = 4 and n0 = 21
• 3 log n + 5 is O(log n)
  need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c·log n for n ≥ n0
  this is true for c = 8 and n0 = 2
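As a quick check of the middle claim (added for clarity): 3n³ + 20n² + 5 ≤ 4n³ holds exactly when 20n² + 5 ≤ n³; at n = 21 the left side is 20·441 + 5 = 8825 while n³ = 9261, and the gap only widens for larger n, so c = 4 and n0 = 21 work.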
Big-Oh and Growth Rate
• The big-Oh notation gives an upper bound on the growth rate of a function
• The statement “f(n) is O(g(n))” means that the growth rate of f(n) is no more than the growth rate of g(n)
• We can use the big-Oh notation to rank functions according to their growth rate

                        f(n) is O(g(n))    g(n) is O(f(n))
  g(n) grows more       Yes                No
  f(n) grows more       No                 Yes
  Same growth           Yes                Yes
Big-Oh Rules
• If f(n) is a polynomial of degree d, then f(n) is O(nᵈ), i.e.,
  1. Drop lower-order terms
  2. Drop constant factors
• Use the smallest possible class of functions
  – Say “2n is O(n)” instead of “2n is O(n²)”
• Use the simplest expression of the class
  – Say “3n + 5 is O(n)” instead of “3n + 5 is O(3n)”
Asymptotic Algorithm Analysis
• The asymptotic analysis of an algorithm determines the
running time in big-Oh notation
• To perform the asymptotic analysis
– We find the worst-case number of primitive operations
executed as a function of the input size
– We express this function with big-Oh notation
• Example:
– We determine that algorithm arrayMax executes at most 8n − 2 primitive operations
– We say that algorithm arrayMax “runs in O(n) time”
• Since constant factors and lower-order terms are eventually dropped anyhow, we can disregard them when counting primitive operations
O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:
  O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊂ O(g(n)).
Examples
  O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
• Any linear function an + b is in O(n²). How?
• Show that 3n³ = O(n⁴) for appropriate c and n0.
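A worked sketch of the second question (added for illustration): pick c = 3 and n0 = 1; since n³ ≤ n⁴ for all n ≥ 1, we get 3n³ ≤ 3n⁴, so 3n³ = O(n⁴). Similarly, for an + b with a, b ≥ 0, taking c = a + b and n0 = 1 gives an + b ≤ an² + bn² = (a + b)n² for n ≥ 1, so an + b is in O(n²).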
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
  Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊂ Ω(g(n)).
Example
  Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
• n = Ω(lg n). Choose c and n0.
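A worked choice (added for illustration): take c = 1 and n0 = 1; since lg n ≤ n for all n ≥ 1, we have 0 ≤ 1·lg n ≤ n for n ≥ n0, so n = Ω(lg n).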
Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
  Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)). I’ll accept either.
f(n) and g(n) are nonnegative for large n.
Example
  Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
• 10n² − 3n = Θ(n²)
• What constants for n0, c1, and c2 will work? (see the worked choice below)
• Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
• To compare orders of growth, look at the leading term.
• Exercise: prove that n²/2 − 3n = Θ(n²)
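For 10n² − 3n, one concrete choice (added for illustration) is c1 = 9, c2 = 10, n0 = 3: the upper bound 10n² − 3n ≤ 10n² holds for all n ≥ 1, and the lower bound 9n² ≤ 10n² − 3n is equivalent to n² ≥ 3n, i.e. n ≥ 3.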
Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n),
  f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
• I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
• In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
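As a small application of the theorem (added for illustration): 3n³ + 20n² + 5 was shown above to be O(n³), and it is also Ω(n³) (take c = 3 and n0 = 1, since the remaining terms are nonnegative), so by the theorem it is Θ(n³).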
Thank You