Lecture 2: Algorithm Analysis
• Algorithm efficiency describes how much of various types of resources (e.g. run/CPU time, memory) an algorithm consumes.
• Machine dependent – an algorithm may run differently depending on:
  • the hardware platform (PC, Cray, Sun, Smartphone)
  • the programming language (C, Java, C++)
  • the programmer (you, me, Bill Gates)

• Problem: We need to multiply two positive integers a and b.
  – Algorithm 1: Multiply a and b.
  – Algorithm 2: V = a, W = b
        While W > 1
            V ← V + a; W ← W − 1
        Output V
• The first algorithm uses 1 multiplication. The second uses b additions and subtractions.
• For some computer architectures, 1 multiplication is more expensive than b additions and subtractions.
• Ideally, we would like to program all the choices, run all of them on the machine we are going to use, and see which is most efficient – but this is very difficult.
• The efficiency of an algorithm is measured in terms of the number of basic operations it performs.
• We assume that every basic operation takes constant time.
  – Examples of basic operations:
    • Single arithmetic operation (addition, subtraction, multiplication)
    • Memory access
    • Assignment operation
    • Single input/output operation
    • Single Boolean operation
    • Function return
• Revisiting the multiplication of two positive integers a and b (Algorithm 1: multiply a and b directly; Algorithm 2: repeated addition as above):
  – Algorithm 1 uses 1 basic operation.
  – Algorithm 2 uses b basic operations.
  – Algorithm 1 is more efficient.
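The comparison can be made concrete with a short C++ sketch (an illustration, not part of the original slides; the function names are mine):

#include <iostream>

// Algorithm 1: a single multiplication (1 basic operation).
int multiply_direct(int a, int b) {
    return a * b;
}

// Algorithm 2: repeated addition (roughly b additions and b subtractions).
int multiply_by_addition(int a, int b) {
    int V = a, W = b;
    while (W > 1) {
        V = V + a;   // one addition per iteration
        W = W - 1;   // one subtraction per iteration
    }
    return V;        // V = a * b for any b >= 1
}

int main() {
    std::cout << multiply_direct(6, 7) << " "
              << multiply_by_addition(6, 7) << "\n";   // prints: 42 42
}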
Examples: Count of Basic Operations T(n)

Sample Code

#include <iostream>
using namespace std;

void func()
{
    int x = 0;
    int i = 0;
    int j = 1;
    int n;
    cout << "Enter an Integer value";
    cin >> n;
    while (i < n) {
        x++;
        i++;
    }
    while (j < n) {
        j++;
    }
}

Count of Basic Operations (Time Units)
1 for the first assignment statement: x = 0;
1 for the second assignment statement: i = 0;
1 for the third assignment statement: j = 1;
1 for the output statement.
1 for the input statement.
In the first while loop:
    n+1 tests
    n loops of 2 units for the two increment (addition) operations
In the second while loop:
    n tests
    n−1 increments
T(n) = 1+1+1+1+1+(n+1)+2n+n+(n−1) = 5n+5
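As a rough cross-check of the tally above, here is a small C++ sketch (my own illustration, not from the slides) that counts the basic operations at run time; the input and output statements are replaced by counted stand-ins so the program is self-contained:

#include <iostream>

int main() {
    long long ops = 0;        // running count of basic operations
    int n = 5;                // stand-in for cin >> n (a sample input)
    int x = 0; ops++;         // first assignment
    int i = 0; ops++;         // second assignment
    int j = 1; ops++;         // third assignment
    ops++;                    // the output statement counts as 1 unit
    ops++;                    // the input statement counts as 1 unit
    while (ops++, i < n) {    // counts every test, including the final failing one: n+1 tests
        x++; ops++;
        i++; ops++;           // 2 units per iteration, n iterations
    }
    while (ops++, j < n) {    // n tests
        j++; ops++;           // n-1 increments
    }
    std::cout << ops << " vs 5n+5 = " << 5 * n + 5 << "\n";   // both are 30 for n = 5
}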
Examples: Count of Basic Operations, T(n)

Sample Code

int sum (int n)
{
    int partial_sum = 0;
    for (int i = 1; i <= n; i++)
        partial_sum = partial_sum + (i * i * i);
    return partial_sum;
}

Count of Basic Operations (Time Units)
1 for the assignment: partial_sum = 0;
In the for loop: 1 assignment, n+1 tests, and n increments.
n loops of 4 units for an assignment, an addition, and two multiplications.
1 for the return statement.
T(n) = 1 + (1 + (n+1) + n) + 4n + 1 = 6n + 4
Simplified Rules to Compute Time Units

for ( int i = 1; i <= N; i++) {
    sum = sum + i;
}

Time units: \sum_{i=1}^{N} 2 = 2N

for ( int i = 1; i <= N; i++) {
    for ( int j = 1; j <= M; j++) {
        sum = sum + i + j;
    }
}

Time units: \sum_{i=1}^{N} \sum_{j=1}^{M} 3 = \sum_{i=1}^{N} 3M = 3MN
Simplified Rules to Compute Time Units

For an if/else statement, the time is that of the more expensive branch:

if (condition) {
    for ( int i = 1; i <= N; i++) {
        sum = sum + i;
    }
} else {
    for ( int i = 1; i <= N; i++) {
        for ( int j = 1; j <= N; j++) {
            sum = sum + i + j;
        }
    }
}

Time units: max(2N, 3N²) = 3N²
Example: Computation of Run-time

• Suppose we have hardware capable of executing 10⁶ instructions per second. How long would it take to execute an algorithm whose time unit function is T(n) = 2n² on an input size of n = 10⁸?
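A worked answer (the slides pose the question without showing the computation): T(10⁸) = 2 × (10⁸)² = 2 × 10¹⁶ time units; at 10⁶ instructions per second this takes 2 × 10¹⁶ / 10⁶ = 2 × 10¹⁰ seconds, roughly 634 years.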
• Suppose an algorithm for processing a retail store’s inventory takes:
  – 10,000 milliseconds to read the initial inventory from disk, and then
  – 10 milliseconds to process each transaction (items acquired or sold).
  – Processing n transactions takes (10,000 + 10n) milliseconds.
• Even though 10,000 >> 10, the "10n" term will be more important if the number of transactions is very large.
• We also know that these coefficients will change if we buy a faster computer or disk drive, or use a different language or compiler.
• We want a way to express the speed of an algorithm independently of a specific implementation on a specific machine; specifically, we want to ignore constant factors (which get smaller and smaller as technology improves).

• Measuring the efficiency of an algorithm in terms of the number of basic operations it performs is good for all large input sizes.
• In fact, we will not worry about the exact values, but will look at "broad classes" of values.
• Let there be n inputs.
  – If an algorithm needs n basic operations and another needs 2n basic operations, we will consider them to be in the same efficiency category.
  – However, we distinguish between exp(n), n, and log(n).
• We worry about the speed of our algorithms for large input sizes.
• Asymptotic analysis is concerned with how the running time of an algorithm increases as the size of the input increases without bound.
  – bounds on running time or memory
[Figure: growth-rate curves, e.g. exp(n) versus log n]

• The Big-Oh notation is a way of comparing algorithms and is used for computing the complexity of algorithms, i.e., the amount of time that it takes for a computer program to run.
  – It is only concerned with what happens for very large values of n.
Big-Oh Notation

– Let n be the size of a program’s input (in bits or data words or whatever).
– Let T(n) be a function. For now, T(n) is precisely equal to the algorithm’s running time, given an input of size n (usually a complicated expression).
– Let f(n) be another function, preferably a simple function like f(n) = n.
– We say that T(n) is in O(f(n)) IF AND ONLY IF T(n) <= c·f(n) WHENEVER n IS BIG, FOR SOME LARGE CONSTANT c.
  • HOW BIG IS "BIG"? Big enough to make T(n) fit under c·f(n).
  • HOW LARGE IS c? Large enough to make T(n) fit under c·f(n).

EXAMPLE: Inventory

• Let’s consider the function T(n) = 10,000 + 10n from our previous example.
• Let’s try f(n) = n, because it’s simple. We can choose c as large as we want, and we’re trying to make T(n) fit underneath c·f(n), so pick c = 20.
• As these functions extend forever to the right, their graphs will never cross again.
• For large n (any n bigger than 1,000, in fact), T(n) <= c·f(n).
• THEREFORE, T(n) is in O(f(n)).
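A quick numerical check of this claim, as a hedged C++ sketch (my illustration; the values and the constant c = 20 come from the example above):

#include <iostream>

int main() {
    const double c = 20.0;                        // constant chosen in the example
    for (double n : {1000.0, 1e4, 1e6, 1e9}) {    // sizes at and beyond the crossover n = 1000
        double T = 10000.0 + 10.0 * n;            // T(n) = 10,000 + 10n
        double bound = c * n;                     // c * f(n) with f(n) = n
        std::cout << "n=" << n << "  T(n)=" << T
                  << "  c*f(n)=" << bound
                  << "  T(n) <= c*f(n): " << (T <= bound ? "yes" : "no") << "\n";
    }
}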
• The following points are facts that you can use for Big-Oh problems:
  – 1 <= n for all n >= 1
  – n <= n² for all n >= 1
  – 2ⁿ <= n! for all n >= 4
  – log₂n <= n for all n >= 2
  – n <= n·log₂n for all n >= 2
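If you want to sanity-check these facts numerically, a small C++ sketch (an illustration, not from the slides) can verify them for small n:

#include <cmath>
#include <iostream>

// n! computed as a double; adequate for the small n checked here.
double fact(int n) {
    double f = 1.0;
    for (int i = 2; i <= n; i++) f *= i;
    return f;
}

int main() {
    for (int n = 1; n <= 20; n++) {
        bool ok = (1 <= n)
               && (n <= n * n)
               && (n < 4 || std::pow(2.0, n) <= fact(n))
               && (n < 2 || std::log2((double)n) <= n)
               && (n < 2 || n <= n * std::log2((double)n));
        std::cout << "n = " << n << ": all facts hold = " << (ok ? "true" : "false") << "\n";
    }
}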
Example: Big-Oh Functions

1. f(n) = 10n + 5 and g(n) = n. Show that f(n) is in O(g(n)).
  – To show that f(n) is O(g(n)), we must find constants c and k such that f(n) <= c·g(n) for all n >= k.
  – That is, 10n + 5 <= c·n for all n >= k.
  – Try c = 15. Then we need to show that 10n + 5 <= 15n.
  – Solving for n we get 5 <= 5n, or 1 <= n.
  – So f(n) = 10n + 5 <= 15·g(n) for all n >= 1.
  – (c = 15, k = 1)
2. f(n) = 3n² + 4n + 1. Show that f(n) = O(n²).
  – 4n <= 4n² for all n >= 1, and 1 <= n² for all n >= 1.
  – So 3n² + 4n + 1 <= 3n² + 4n² + n² = 8n² for all n >= 1.
  – We have shown that f(n) <= 8n² for all n >= 1.
  – Therefore, f(n) is O(n²). (c = 8, k = 1)
• Prove that T(n) = a₀ + a₁n + a₂n² + a₃n³ is O(n³).
  – We need constants c and k such that c·n³ >= T(n) for all n >= k.
  – Clearly T(n) <= |a₀| + |a₁|n + |a₂|n² + |a₃|n³.
  – For n >= 1, each term |aᵢ|nⁱ is at most |aᵢ|n³, so T(n) <= c·n³ where c = |a₀| + |a₁| + |a₂| + |a₃|.
  – Therefore T(n) is O(n³) (with k = 1).
Example: Big-Oh Functions

• Give as good (i.e. as small) a big-O estimate as possible for the function T(n) = (n² + 3n + 8)(n + 1).
  – Answer: T(n) = (n² + 3n + 8)(n + 1) = n³ + 3n² + 8n + n² + 3n + 8 = n³ + 4n² + 11n + 8 = O(n³)

Typical Big-Oh Orders

N     O(1)   O(log n)   O(n)   O(n log n)   O(n²)    O(n³)
1     1      1          1      1            1        1
2     1      1          2      2            4        8
4     1      2          4      8            16       64
8     1      3          8      24           64       512
16    1      4          16     64           256      4,096
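The table entries can be reproduced with a short C++ sketch (my illustration; it uses log base 2 and, following the table, lists 1 in the log column for N = 1):

#include <iostream>

int main() {
    std::cout << "N\tO(1)\tO(log n)\tO(n)\tO(n log n)\tO(n^2)\tO(n^3)\n";
    for (long long n : {1LL, 2LL, 4LL, 8LL, 16LL}) {
        long long lg = 0;
        for (long long t = n; t > 1; t >>= 1) lg++;   // floor(log2 n)
        if (n == 1) lg = 1;                           // the table shows 1, not 0, for N = 1
        std::cout << n << "\t1\t" << lg << "\t" << n << "\t" << n * lg
                  << "\t" << n * n << "\t" << n * n * n << "\n";
    }
}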