Chapter 1
Analysis of Algorithms
Design and Analysis of Computer
Algorithms
What is an Algorithm?
❑ An algorithm is a definite procedure for solving a problem in
a finite number of steps
❑ An algorithm is a well-defined computational procedure that
takes some value(s) as input and produces some value(s) as
output
❑ An algorithm is a finite sequence of computational statements
that transform the input into the output
Major Factors in Algorithms Design
1. Correctness
• An algorithm is said to be correct if
• for every input, it halts with the correct output.
• An incorrect algorithm might not halt at all, OR
• it might halt with an answer other than the desired one.
• A correct algorithm solves the given computational problem.
2. Algorithm Efficiency
• Measuring the efficiency of an algorithm:
• analyze it, i.e. determine its growth rate.
• Compare efficiencies of different algorithms for the same problem.
Complexity Analysis of Algorithms
• Analyze the running time as a function of n (# of input
elements).
– E.g. F(n) = 2n+1
• If the input size is 20 then the time for the algorithm is 2(20)+1=41
• Efficient Algorithms
– Consume fewer resources while solving
a problem of size n
• Memory
• Time
Usually time is our biggest concern.
Complexity Analysis
• Algorithm analysis means predicting the resources an algorithm requires, such as
• computational time
• memory
• Worst case analysis
• Provides an upper bound on running time
• An absolute guarantee
• Average case analysis
• Provides the expected running time
• Very useful, but treat with care: what is “average”?
• Random (equally likely) inputs
• Real-life inputs
• e.g. a successful linear search examines about n/2 elements on average
• Best case
• the case that causes the minimum number of operations to be
executed
Theoretical Analysis
➢ Uses a high-level description of the algorithm
instead of an implementation
➢ Characterizes running time as a function of the
input size, n.
➢ Takes into account all possible inputs
➢ Allows us to evaluate the speed of an algorithm
independent of the hardware/software
environment
Pseudocode
➢ High-level description of an algorithm
➢ More structured than English prose
➢ Less detailed than a program
➢ Preferred notation for describing algorithms
➢ Hides program design issues
Pseudocode Details
❑ Control flow
◼ if … then … [else …]
◼ while … do …
◼ repeat … until …
◼ for … do …
◼ Indentation replaces braces
❑ Method declaration
Algorithm method (arg [, arg…])
Input …
Output …
❑ Method call
method (arg [, arg…])
❑ Return value
return expression
❑ Expressions:
← Assignment
= Equality testing
n² Superscripts and other mathematical formatting allowed
Asymptotic Analysis
• Asymptotic efficiency of algorithms
• How does the running time of an algorithm increase as
the input size (n) increases infinitely n→∞
• Asymptotic notation (“the order of”)
• Define sets of functions that satisfy certain criteria
and use these to characterize time and space
complexity of algorithms
Asymptotic Analysis
• Categorize algorithms based on asymptotic
growth rate
• e.g. linear, quadratic, exponential
• Ignore small constant and small inputs
• Estimate upper bound and lower bound on growth rate
of time complexity function
• Describe the running time of an algorithm as n grows to ∞
F(n) = 2n³ + 7n² + 10n + 5 becomes O(n³)
• Limitations
• not always useful for analysis of fixed-size inputs
• all results hold only for sufficiently large inputs
Asymptotic Notations {O, Ω, Θ, o, ω}
Define sets of functions that are used in practice to
compare the growth of two functions.
• Θ (big theta) ➔ “order exactly”, tight bound (=)
• O (big O) ➔ “order at most”, upper bound – worst case (≤)
• Ω (big omega) ➔ “order at least”, lower bound – best case (≥)
• o (little o) ➔ strict (non-tight) upper bound (<)
• ω (little omega) ➔ strict (non-tight) lower bound (>)
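The set descriptions above correspond to the standard formal definitions (stated here for reference; they are not spelled out on the slide):

```latex
O(g(n))      = \{ f(n) : \exists\, c > 0,\ n_0 > 0 \text{ s.t. } 0 \le f(n) \le c\,g(n)\ \forall n \ge n_0 \}
\Omega(g(n)) = \{ f(n) : \exists\, c > 0,\ n_0 > 0 \text{ s.t. } 0 \le c\,g(n) \le f(n)\ \forall n \ge n_0 \}
\Theta(g(n)) = \{ f(n) : \exists\, c_1, c_2 > 0,\ n_0 > 0 \text{ s.t. } 0 \le c_1 g(n) \le f(n) \le c_2 g(n)\ \forall n \ge n_0 \}
o(g(n))      = \{ f(n) : \forall c > 0\ \exists\, n_0 > 0 \text{ s.t. } 0 \le f(n) < c\,g(n)\ \forall n \ge n_0 \}
\omega(g(n)) = \{ f(n) : \forall c > 0\ \exists\, n_0 > 0 \text{ s.t. } 0 \le c\,g(n) < f(n)\ \forall n \ge n_0 \}
```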
O-notation - Seven Important Functions
Intuition: concentrate on the leading term, ignore
constants
19n³ + 17n² − 3n becomes O(n³)
2n lg n + 5n^1.1 − 5 becomes O(n^1.1)
Complexity      Term
O(1)            constant
O(log n)        logarithmic
O(n)            linear
O(n lg n)       n log n, “linear-logarithmic”
O(n^b), b > 1   polynomial (n²: square or quadratic, n³: cubic)
O(b^n), b > 1   exponential
O(n!)           factorial
Complexity categories
[Figure: growth rates of some common complexity functions]
Space Complexity
• Number of memory cells (or words) needed to carry out the
computational steps required to solve an instance of the
problem excluding the space allocated to hold the input.
– Only the work space.
• All previous asymptotic notation definitions are also applied
to space complexity.
• Naturally, in many problems there is a time-space tradeoff:
The more space we allocate for the algorithm the faster it
runs, and vice versa
Simple Example - 1
// Input: int A[N], array of N integers
// Output: Sum of all numbers in array A
int Sum(int A[], int N) {
    int s = 0;          // 1
    int i = 0;          // 1
    while (i < N) {     // N+1
        s = s + A[i];   // N
        i = i + 1;      // N
    }
    return s;           // 1
}
Total = F(N) = 3N + 4
Asymptotic notation ➔ O(N), linear
Simple Example - 2
// Input: int data[N], array of N integers
// Output: the maximum number in array data
[The slide's code, a find-maximum loop, appears as an image; only the step counts survive:]
Step 3: 1
Step 4: n
Step 5: n
Step 6: 0 → n (executed between 0 and n times)
Step 7: 1
Total = F(N) = 2N+2 (best case) : 3N+2 (worst case)
➔ O(N), linear in both cases
Simple Example - 3
“Sequential search” or Linear
// Input: int A[N], array of N elements and a target X
// Output: index i (location) if X is found in the array , 0 if not found
Linear-Search[A, n, x]
1  for i ← 1 to n
2      if A[i] = x
3          return i
4      else i ← i + 1
5  return 0
Step 1: n
Step 2: n
Step 3: 1
Step 4: n
Step 5: 1
Total = F(N) = 3N+2 (worst case: unsuccessful search)
➔ O(N), linear
Note: the best case is O(1) (target at the first position), but we do not rely on the best case
Note: the average case is O(N), the same order as the worst case