Algorithm Design - Slide
E.g.
– Input data = array: problem size is N (the length of the array)
– Input data = matrix: problem size is N x M
Algorithm Analysis Cont…
• We only analyze correct algorithms
• An algorithm is correct
– If, for every input instance, it halts with the correct output
• Incorrect algorithms
– Might not halt at all on some input instances
– Might halt with other than the desired answer
• Analyzing an algorithm
– Predicting the resources that the algorithm requires
– Resources include
• Memory
• Communication bandwidth
• Computational time (usually most important)
Algorithm Analysis Cont…
• Factors affecting the running time
– computer
– compiler
– algorithm used
– input to the algorithm
• The content of the input affects the running time
• Typically, the input size (the number of items in the input) is the main consideration
– E.g. sorting problem: the number of items to be sorted
– E.g. multiplying two matrices: the total number of elements in the two matrices
Running time
• The running time depends on the input: an already sorted sequence
is easier to sort.
Line i    Cost
1         1
2         2N + 2
3         4N
4         1
Total: 6N + 4
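Read each row as: statement number, then the number of basic steps that statement contributes. A hypothetical four-statement function whose per-line costs match the table (the function itself is an assumed illustration; the counting convention is the standard one used for the tables later in this section):

// Hypothetical function whose line costs match the table above.
int sumExample(int N) {
    int partialSum;
    partialSum = 0;                    // line 1: 1 step
    for (int i = 1; i <= N; i++)       // line 2: 1 init + (N+1) tests + N increments = 2N+2
        partialSum += i * i * i;       // line 3: 4 operations per iteration, N times = 4N
    return partialSum;                 // line 4: 1 step
}
// Total: 1 + (2N+2) + 4N + 1 = 6N + 4 steps, i.e. O(N).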
Average-case: (sometimes)
• T(n) = expected time of algorithm over all inputs of size n.
• Need assumption of statistical distribution of inputs.
Best-case: (NEVER)
• Cheat with a slow algorithm that works fast on some input.
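To make these cases concrete, here is a minimal insertion-sort sketch (illustrative only): a quadratic sort in the worst case that nevertheless runs in about n steps when the input is already sorted, which is also the sense in which an already sorted sequence is easier to sort.

#include <vector>
using std::vector;

// Insertion sort: worst case about n^2 steps (reverse-sorted input),
// best case about n steps (already-sorted input: the while loop never runs).
void insertionSort(vector<int> & a) {
    for (int i = 1; i < (int) a.size(); i++) {
        int tmp = a[i];                    // next element to insert
        int j = i;
        while (j > 0 && a[j - 1] > tmp) {  // shift larger elements one slot right
            a[j] = a[j - 1];
            --j;
        }
        a[j] = tmp;                        // drop it into place
    }
}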
Analysis of Algorithms or Performance Analysis
• Space complexity: the space S(P) required by a program P is the sum of a fixed part c, which is independent of the problem instance, and a variable part Sp that depends on the instance
• S(P) = c + Sp(instance)
Time Complexity
• Time Complexity of an algorithm is the amount of CPU time it
needs to run to completion.
• The time T(P) required to run a program P also consists of two components:
– A fixed part c: the compile time, which is independent of the problem instance.
– A variable part tp(instance): the run time, which depends on the problem instance.
• T(P) = c + tp(instance)
• The time T(P) taken by a program P is the sum of its compile time and its run time.
• The compile time, like the fixed space component, does not depend upon the instance characteristics.
• The two tables below show the step counts of an iterative and a recursive summation, respectively
STATEMENTS                                 S/E   FREQUENCY   TOTAL STEPS
float sum (float list [ ], int n)           0        0            0
{                                           0        0            0
  float tempsum = 0;                        1        1            1
  int i;                                    0        0            0
  for (i = 0; i < n; i++)                   1      n+1          n+1
    tempsum += list[i];                     1        n            n
  return tempsum;                           1        1            1
}                                           0        0            0
TOTAL                                       -        -          2n+3
STATEMENTS                                 S/E   FREQUENCY   TOTAL STEPS
float rsum (float list [ ], int n)          0        0            0
{                                           0        0            0
  if (n)                                    1      n+1          n+1
    return rsum (list, n-1) + list[n-1];    1        n            n
  return 0;                                 1        1            1
}                                           0        0            0
TOTAL                                       -        -          2n+2
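A runnable version of both summations, with the per-statement counts from the tables written as comments (the driver and its sample data are illustrative only):

#include <iostream>

// Iterative sum: tp(n) = 2n+3 steps (first table).
float sum(float list[], int n) {
    float tempsum = 0;                              // 1 step
    for (int i = 0; i < n; i++)                     // n+1 steps (the loop test runs n+1 times)
        tempsum += list[i];                         // n steps
    return tempsum;                                 // 1 step
}

// Recursive sum: tp(n) = 2n+2 steps (second table).
float rsum(float list[], int n) {
    if (n)                                          // n+1 steps over all calls
        return rsum(list, n - 1) + list[n - 1];     // n steps
    return 0;                                       // 1 step
}

int main() {
    float a[] = {1, 2, 3, 4};                               // arbitrary sample data
    std::cout << sum(a, 4) << " " << rsum(a, 4) << "\n";    // prints 10 10
    return 0;
}

With these counts, T(P) = c + tp(n), e.g. c + (2n + 3) for the iterative version.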
Asymptotic Notation
• Here N = 0, 1, 2, ... is the problem size
• Saying f(N) = O(g(N)) means:
– The growth rate of f(N) is less than or equal to the growth rate of g(N)
– g(N) is an upper bound on f(N)
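The examples that follow also use Ω (a lower bound) and Θ (a tight bound). For reference, the standard definitions, which match the ones restated later in this section, are:

\[
\begin{aligned}
f(N) = O(g(N)) \;&\Longleftrightarrow\; \exists\, c > 0,\; n_0 \ge 1 \text{ such that } f(N) \le c \cdot g(N) \text{ for all } N \ge n_0 \\
f(N) = \Omega(g(N)) \;&\Longleftrightarrow\; \exists\, c > 0,\; n_0 \ge 1 \text{ such that } f(N) \ge c \cdot g(N) \text{ for all } N \ge n_0 \\
f(N) = \Theta(g(N)) \;&\Longleftrightarrow\; f(N) = O(g(N)) \text{ and } f(N) = \Omega(g(N))
\end{aligned}
\]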
Big-Oh: example 1
• Let f(N) = 2N². Then
– f(N) = O(N⁴)
– f(N) = O(N³)
– f(N) = O(N²) (best answer, asymptotically tight)
Example 2.
3n² + 2n + 5 = Θ(n²) (e.g. n² ≤ 3n² + 2n + 5 ≤ 4n² for all n ≥ 5)
Because:
3n² + 2n + 5 = Ω(n²)
3n² + 2n + 5 = O(n²)
Example 3.
3n² - 100n + 6 = O(n²)   since for c = 3, 3n² > 3n² - 100n + 6
3n² - 100n + 6 = O(n³)   since for c = 1, n³ > 3n² - 100n + 6 when n > 3
3n² - 100n + 6 ≠ O(n)    since for any c, cn < 3n² when n > c
3n² - 100n + 6 = Ω(n²)   since for c = 2, 2n² < 3n² - 100n + 6 when n > 100
3n² - 100n + 6 ≠ Ω(n³)   since for c = 3, 3n² - 100n + 6 < n³ when n > 3
3n² - 100n + 6 = Ω(n)    since for any c, cn < 3n² - 100n + 6 when n > 100
3n² - 100n + 6 = Θ(n²)   since both O and Ω apply
3n² - 100n + 6 ≠ Θ(n³)   since only O applies
3n² - 100n + 6 ≠ Θ(n)    since only Ω applies
Example: Relatives of Big-Oh
5n² is Ω(n²)
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1
such that f(n) ≥ c·g(n) for n ≥ n0
let c = 5 and n0 = 1
5n² is Ω(n)
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1
such that f(n) ≥ c·g(n) for n ≥ n0
let c = 1 and n0 = 1
5n² is Θ(n²)
f(n) is Θ(g(n)) if it is Ω(n²) and O(n²). We have already seen the former;
for the latter, recall that f(n) is O(g(n)) if there is a constant c > 0 and
an integer constant n0 ≥ 1 such that f(n) ≤ c·g(n) for n ≥ n0
Let c = 5 and n0 = 1
Algorithm 1
#include <vector>
using namespace std;

// Brute force: try every subsequence a[i..j] and sum it from scratch.
// Running time: O(n^3) because of the three nested loops.
int MaxSubSum1(const vector<int> & a) {
    int maxSum = 0;
    for (int i = 0; i < (int) a.size(); i++)
        for (int j = i; j < (int) a.size(); j++) {
            int thisSum = 0;
            for (int k = i; k <= j; k++)
                thisSum += a[k];
            if (thisSum > maxSum)
                maxSum = thisSum;
        }
    return maxSum;
}
Algorithm 2
// Improved brute force: reuse the running sum of a[i..j-1] when extending to j.
// Running time: O(n^2).
int MaxSubSum2(const vector<int> & a) {
    int maxSum = 0;
    for (int i = 0; i < (int) a.size(); i++) {
        int thisSum = 0;
        for (int j = i; j < (int) a.size(); j++) {
            thisSum += a[j];
            if (thisSum > maxSum)
                maxSum = thisSum;
        }
    }
    return maxSum;
}
Algorithm 3
// Linear scan: keep extending the current subsequence, and restart once the
// running sum goes negative.  Running time: O(n).
int MaxSubSum4(const vector<int> & a) {
    int maxSum = 0, thisSum = 0;
    for (int j = 0; j < (int) a.size(); j++) {
        thisSum += a[j];
        if (thisSum > maxSum)
            maxSum = thisSum;
        else if (thisSum < 0)
            thisSum = 0;
    }
    return maxSum;
}
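If the three functions above are placed in one file (reusing the #include <vector> and using directive from Algorithm 1), a small driver can check that they agree; the input values here are arbitrary sample data:

#include <iostream>

int main() {
    vector<int> a = {4, -3, 5, -2, -1, 2, 6, -2};   // sample input (illustrative)
    std::cout << MaxSubSum1(a) << " "
              << MaxSubSum2(a) << " "
              << MaxSubSum4(a) << "\n";             // all three print 11
    return 0;
}

The point of the comparison is that all three compute the same answer; only their growth rates O(n³), O(n²), and O(n) differ.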
Comparison of running times
Searches
– Linear: n steps
– Binary: log₂ n steps
– Binary search (on already-sorted data) is about as fast as you can get
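A minimal binary-search sketch over a sorted vector (the name and signature are illustrative): each iteration halves the remaining range, which is where the log₂ n bound comes from.

#include <vector>
using std::vector;

// Binary search on sorted data: at most about log2(n) + 1 probes.
int binarySearch(const vector<int> & a, int x) {
    int low = 0, high = (int) a.size() - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;   // midpoint without overflow
        if (a[mid] < x)
            low = mid + 1;                  // discard the left half
        else if (a[mid] > x)
            high = mid - 1;                 // discard the right half
        else
            return mid;                     // found: return the index
    }
    return -1;                              // not present
}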
Sorts
– Bubble: n² steps
– Insertion: n² steps
– There are other, more efficient, sorting techniques
• In principle, the fastest are heap sort, quick sort, and merge
sort
• These each take n * log₂ n steps
• In practice, quick sort is the fastest, followed by merge sort
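As one example of an n * log₂ n sort, here is a compact merge-sort sketch (an assumed, standard formulation): the array is halved log₂ n times, and each level does about n work merging.

#include <vector>
using std::vector;

// Merge sort: split in half, sort each half recursively, then merge.
// Roughly n * log2(n) steps overall.
void mergeSort(vector<int> & a) {
    if (a.size() <= 1) return;                        // 0 or 1 element: already sorted
    size_t mid = a.size() / 2;
    vector<int> left(a.begin(), a.begin() + mid);     // copy the two halves
    vector<int> right(a.begin() + mid, a.end());
    mergeSort(left);
    mergeSort(right);
    size_t i = 0, j = 0, k = 0;
    while (i < left.size() && j < right.size())       // merge the sorted halves back into a
        a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
    while (i < left.size())  a[k++] = left[i++];      // copy any leftover elements
    while (j < right.size()) a[k++] = right[j++];
}

In practice one would usually just call std::sort, which is typically implemented as a quicksort-based hybrid.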
Complexity Time