Algorithm Analysis
Analysis
Data Structures
Problem Solving
Main Steps:
1. Problem definition
2. Algorithm design / algorithm specification
3. Algorithm analysis
4. Implementation
5. Testing
6. [Maintenance]
1. Problem Definition
What is the task to be accomplished?
  E.g., calculate the average of the grades for a given student (sketched below).
What are the time / space / performance requirements?
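For the grade-averaging task above, a minimal sketch in Python (the function name and the list-of-numbers input format are assumptions, not part of the original task statement):

def average_grade(grades):
    # Sum all grades and divide by their count; assumes a non-empty list of numbers.
    total = 0
    for g in grades:
        total = total + g
    return total / len(grades)

# Example: average of one student's grades
print(average_grade([80, 92, 71]))   # 81.0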
2. Algorithm Design / Specifications
Algorithm: a finite set of instructions that, if followed, accomplishes a particular task.
Describe it in natural language, pseudo-code, diagrams, etc.
Criteria to follow (illustrated by the sketch after this list):
  Input: zero or more quantities
  Output: one or more quantities
  Definiteness: clarity and precision of each instruction
  Finiteness: the algorithm has to stop after a finite (possibly very large) number of steps
  Effectiveness: each instruction has to be basic enough and feasible
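As a small illustration of these criteria, here is a sketch of a maximum-finding routine (a hypothetical example, not taken from the slides; the comments map each criterion to the code):

def maximum(values):
    # Input: a non-empty list of numbers (the "quantities" given to the algorithm).
    # Output: one quantity, the largest element of the list.
    best = values[0]
    for v in values[1:]:          # Finiteness: the loop visits each element once, then stops.
        if v > best:              # Definiteness: each comparison is precise and unambiguous.
            best = v              # Effectiveness: every step is a basic, feasible operation.
    return best

print(maximum([3, 7, 2]))   # 7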
Computer Algorithm
A procedure (a finite set of well-defined instructions) for accomplishing some task which, given an initial state, terminates in a defined end-state.
Suppose the program includes an if-then statement that may or may not execute: the running time varies with the input (see the sketch below).
[Figure: running times of 1–3 ms for inputs A–G, showing best-case, average-case, and worst-case running times]
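A small sketch of how an early exit makes the running time depend on the input; linear search is an assumed example, not taken from the slide:

def linear_search(items, target):
    # Running time depends on where (or whether) target appears.
    for i, x in enumerate(items):
        if x == target:      # may stop early
            return i         # best case: target is items[0]
    return -1                # worst case: target absent, all n elements examined

data = ["A", "B", "C", "D", "E", "F", "G"]
print(linear_search(data, "A"))   # best case: found at index 0 immediately
print(linear_search(data, "Z"))   # worst case: scans all 7 elements, returns -1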
Running Time
The running time grows with respect to the increase of the input size.
The average running time is difficult to determine.
We focus on the worst-case running time:
  Easier to analyze.
[Chart: running time (0–80) vs. input size (1000–4000)]
Algorithm Analysis: Experimental Approach
Write a program to implement the algorithm.
Run the program with inputs of varying size and composition.
Get an accurate measure of the actual running time (e.g., via a system call such as date).
Plot the results.
Problems?
[Chart: time (ms, 0–9000) vs. input size (0–100)]
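A minimal sketch of this experimental approach in Python, assuming we time the worst case of a linear search; time.perf_counter stands in for the system date call mentioned above, and the input sizes are arbitrary:

import time

def linear_search(items, target):
    for x in items:
        if x == target:
            return True
    return False

for n in [1000, 2000, 3000, 4000]:
    data = list(range(n))
    start = time.perf_counter()
    linear_search(data, -1)          # worst case: target not present
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(n, round(elapsed_ms, 3))   # plot these (input size, time) pairs

The measured times depend on the hardware, the software environment, and the particular inputs chosen, which hints at the "Problems?" question above.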
Limitations of the Experimental Approach
[Figure: value of fA(n) = 30n + 8 vs. fB(n) = n² + 1 as n increases; moving to the right, the faster-growing function eventually becomes larger]
Constant Factors
The growth rate is not affected by constant factors or lower-order terms.
Examples:
  10²n + 10⁵ is a linear function
  10³n² + 10⁵n is a quadratic function
Understanding Rate of Growth
Consider the example of buying elephants and a fish: the cost of the fish is negligible next to the cost of the elephants, so the total cost is roughly the cost of the elephants alone.
In the same way, n⁴ + 100n² + 10n + 50 ≈ n⁴: the lower-order terms are negligible for large n.
Characterize an algorithm as a function of the “problem size”.
E.g.:
  Input data = array → problem size is N (the length of the array)
  Input data = matrix → problem size is N x M
Asymptotic Notation
A way to describe the behavior of functions.
Abstracts away lower-order terms and constant factors.
It is how we indicate the running times of algorithms.
Describes the running time of an algorithm as n grows to ∞.
Asymptotic Notation
Goal: to simplify the analysis by getting rid of unneeded information.
We want to say, in a formal way, that 3n² ≈ n².
The “Big-Oh” Notation:
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if and only if there are positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.
Graphic Illustration
f(n) = 2n + 6.
According to the definition, we need to find a function g(n) and constants c and n₀ such that f(n) ≤ c·g(n) when n ≥ n₀.
Take g(n) = n and c = 4, so that c·g(n) = 4n; then n₀ = 3 works.
Hence f(n) is O(n): the order of f(n) is n.
Derivation of n₀:
  2n + 6 ≤ cn
  cn − 2n ≥ 6
  n(c − 2) ≥ 6
  n ≥ 6/(c − 2)
With c = 4 this gives n ≥ 3, so n₀ = 3.
[Figure: f(n) = 2n + 6, g(n) = n, and c·g(n) = 4n plotted against n; 4n stays above f(n) for n ≥ n₀]
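A quick numeric check of this choice of constants (a sketch only; a finite check illustrates the bound, it does not prove it):

# f(n) = 2n + 6 and c*g(n) = 4n, with c = 4 and n0 = 3
for n in range(1, 11):
    f = 2 * n + 6
    cg = 4 * n
    print(n, f, cg, f <= cg)   # False for n = 1, 2; True from n = 3 onward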
More examples
O(g(n)) = the set of functions with a smaller or the same order of growth as g(n).
50n³ + 20n + 4 is O(n³).
  Would it be correct to say it is O(n³ + n)?
    • Not useful: n³ exceeds n by far for large values of n.
  Would it be correct to say it is O(n⁵)?
    • OK, but g(n) should be as close as possible to f(n).
3·log(n) + log(log(n)) = O( ? )
  • Simple rule: drop lower-order terms and constant factors.
Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the growth rate of a function.
The statement “f(n) is O(g(n))” means that the growth rate of f(n) is no more than the growth rate of g(n).
We can use the big-Oh notation to rank functions according to their growth rate.
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(nᵈ), i.e.:
1. Drop lower-order terms
2. Drop constant factors
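A short sketch of why the rule works, using an assumed polynomial f(n) = 3n³ + 20n² + 5 (degree 3, so O(n³)): as n grows, the ratio f(n)/n³ approaches the leading constant, so the lower-order terms contribute less and less.

def f(n):
    # hypothetical polynomial of degree 3
    return 3 * n**3 + 20 * n**2 + 5

for n in [10, 100, 1000, 10000]:
    print(n, f(n) / n**3)   # ratio tends toward the leading constant 3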