Lect 2: Algorithm Analysis
Analysis of different types of algorithms for solving a problem
Uploaded by Yohans Brhanu

Algorithm Analysis

Fitsum Admasu
Department of Computer Science
Addis Ababa University

• In order to solve a problem, there are many possible algorithms.
  – How do we choose the best algorithm for the problem?
• What metric should be used to judge algorithms?
  – Length of the program (lines of code)
  – Ease of programming (bugs, maintenance)
  – Memory required
  – Running time
• Running time is the dominant standard.
  – Quantifiable and easy to compare
  – Often the critical bottleneck
• To analyze an algorithm is to determine the amount of resources (such as time and storage) necessary to execute it.

Efficiency of an Algorithm

• Algorithm efficiency describes how much of various types of resources (e.g., run/CPU time, memory) an algorithm consumes.
• Efficiency is machine dependent; an algorithm may run differently depending on:
  – the hardware platform (PC, Cray, Sun, smartphone)
  – the programming language (C, Java, C++)
  – the programmer (you, me, Bill Gates)
• Problem: we need to multiply two positive integers a and b.
  – Algorithm 1: multiply a and b directly.
  – Algorithm 2: V <- a; W <- b
      While W > 1
        V <- V + a; W <- W - 1
      Output V
• The first algorithm performs 1 multiplication; the second performs roughly b additions and subtractions.
• For some computer architectures, 1 multiplication is more expensive than b additions and subtractions.
• Ideally, we would like to program all choices, run all of them on the machine we are going to use, and find which is most efficient, but that is very difficult.

Machine Independent Analysis

• The efficiency of an algorithm is measured in terms of the number of basic operations it performs.
• We assume that every basic operation takes constant time.
  – Examples of basic operations:
    • Single arithmetic operation (addition, subtraction, multiplication)
    • Memory access
    • Assignment operation
    • Single input/output operation
    • Single Boolean operation
    • Function return
• We do not distinguish between the basic operations.
• Examples of non-basic operations: sorting, searching.

• Revisiting the multiplication of two positive integers a and b:
  – Algorithm 1 (multiply a and b) uses 1 basic operation.
  – Algorithm 2 (repeated addition) uses roughly b basic operations.
  – Algorithm 1 is therefore more efficient.
Examples: Count of Basic Operations T(n)

Sample code:

int count()
{
  int k = 0;
  cout << "Enter an integer";
  cin >> n;
  for (i = 0; i < n; i++)
    k = k + 1;
  return 0;
}

Count of basic operations (time units):
  1 for the assignment statement int k = 0;
  1 for the output statement.
  1 for the input statement.
  In the for loop: 1 assignment, n+1 tests, and n increments.
  n loops of 2 units each for an assignment and an addition.
  1 for the return statement.

T(n) = 1 + 1 + 1 + (1 + n + 1 + n) + 2n + 1 = 4n + 6

Examples: Count of Basic Operations T(n)

Sample code:

int total(int n)
{
  int sum = 0;
  for (int i = 1; i <= n; i++)
    sum = sum + 1;
  return sum;
}

Count of basic operations (time units):
  1 for the assignment statement int sum = 0;
  In the for loop: 1 assignment, n+1 tests, and n increments.
  n loops of 2 units each for an assignment and an addition.
  1 for the return statement.

T(n) = 1 + (1 + n + 1 + n) + 2n + 1 = 4n + 4

Examples: Count of Basic Operations T(n)

Sample code:

void func()
{
  int x = 0;
  int i = 0;
  int j = 1;
  cout << "Enter an Integer value";
  cin >> n;
  while (i < n) {
    x++;
    i++;
  }
  while (j < n) {
    j++;
  }
}

Count of basic operations (time units):
  1 for the first assignment statement: x = 0;
  1 for the second assignment statement: i = 0;
  1 for the third assignment statement: j = 1;
  1 for the output statement.
  1 for the input statement.
  In the first while loop: n+1 tests; n loops of 2 units each for the two increment (addition) operations.
  In the second while loop: n tests and n-1 increments.

T(n) = 1 + 1 + 1 + 1 + 1 + (n + 1) + 2n + n + (n - 1) = 5n + 5
Examples: Count of Basic Operations, T(n)

Sample code:

int sum(int n)
{
  int partial_sum = 0;
  for (int i = 1; i <= n; i++)
    partial_sum = partial_sum + (i * i * i);
  return partial_sum;
}

Count of basic operations (time units):
  1 for the assignment.
  In the for loop: 1 assignment, n+1 tests, and n increments.
  n loops of 4 units each for an assignment, an addition, and two multiplications.
  1 for the return statement.

T(n) = 1 + (1 + n + 1 + n) + 4n + 1 = 6n + 4

Simplified Rules to Compute Time Units

• for loops:
  – In general, a for loop translates to a summation. The index and bounds of the summation are the same as the index and bounds of the for loop.

    for (int i = 1; i <= N; i++) {
      sum = sum + i;
    }

    Sum_{i=1}^{N} 2 = 2N

• Nested loops:

    for (int i = 1; i <= N; i++) {
      for (int j = 1; j <= M; j++) {
        sum = sum + i + j;
      }
    }

    Sum_{i=1}^{N} Sum_{j=1}^{M} 3 = Sum_{i=1}^{N} 3M = 3MN

Simplified Rules to Compute Time Units

• Consecutive statements: add the running times.

    for (int i = 1; i <= N; i++) {
      sum = sum + i;
    }
    for (int i = 1; i <= N; i++) {
      for (int j = 1; j <= N; j++) {
        sum = sum + i + j;
      }
    }

    [Sum_{i=1}^{N} 2] + [Sum_{i=1}^{N} Sum_{j=1}^{N} 3] = 2N + 3N²

• Conditionals:
  – if (test) s1 else s2: compute the maximum of the running times of s1 and s2.

    if (test == 1) {
      for (int i = 1; i <= N; i++) {
        sum = sum + i;
      }
    } else {
      for (int i = 1; i <= N; i++) {
        for (int j = 1; j <= N; j++) {
          sum = sum + i + j;
        }
      }
    }

    max(Sum_{i=1}^{N} 2, Sum_{i=1}^{N} Sum_{j=1}^{N} 3) = max(2N, 3N²) = 3N²
Example: Computation of Run-time

• Suppose we have hardware capable of executing 10^6 instructions per second. How long would it take to execute an algorithm whose time-unit function is T(n) = 2n² on an input of size n = 10^8?

  The total number of operations to be performed would be T(10^8):
    T(10^8) = 2 * (10^8)² = 2 * 10^16
  The required number of seconds is given by T(10^8) / 10^6, so:
    Running time = 2 * 10^16 / 10^6 = 2 * 10^10 seconds
  The number of seconds per day is 86,400, so this is about 231,481 days (~634 years).

Example

• Suppose an algorithm for processing a retail store's inventory takes:
  – 10,000 milliseconds to read the initial inventory from disk, and then
  – 10 milliseconds to process each transaction (items acquired or sold).
  – Processing n transactions takes (10,000 + 10n) milliseconds.
• Even though 10,000 >> 10, the "10n" term will be more important if the number of transactions is very large.
• We also know that these coefficients will change if we buy a faster computer or disk drive, or use a different language or compiler.
• We want a way to express the speed of an algorithm independently of a specific implementation on a specific machine. Specifically, we want to ignore constant factors (which get smaller and smaller as technology improves).

Machine Independent Analysis

• Measuring the efficiency of an algorithm in terms of the number of basic operations it performs works well for all large input sizes.
• In fact, we will not worry about exact values, but will look at "broad classes" of values.
• Let there be n inputs.
  – If one algorithm needs n basic operations and another needs 2n basic operations, we consider them to be in the same efficiency category.
  – However, we distinguish between exp(n), n, and log(n).

Order of Increase & Asymptotic Analysis

• We worry about the speed of our algorithms for large input sizes.
• Asymptotic analysis is concerned with how the running time of an algorithm increases as the size of the input increases without bound.
  – It gives bounds on running time or memory.

[Figure: growth curves for functions such as exp(n) and log n]

Big-Oh Notation

• The Big-Oh notation is a way of comparing algorithms and is used for computing the complexity of algorithms, i.e., the amount of time that it takes for a computer program to run.
  – It is only concerned with what happens for very large values of n.
Big-Oh Notation

– Let n be the size of a program's input (in bits or data words or whatever).
– Let T(n) be a function. For now, T(n) is precisely equal to the algorithm's running time, given an input of size n (usually a complicated expression).
– Let f(n) be another function, preferably a simple one like f(n) = n.

We say that T(n) is in O(f(n)) if and only if T(n) <= c*f(n) whenever n is big, for some large constant c.
  – How big is "big"? Big enough to make T(n) fit under c*f(n).
  – How large is c? Large enough to make T(n) fit under c*f(n).

EXAMPLE: Inventory

• Let's consider the function T(n) = 10,000 + 10n from our previous example.
• Let's try out f(n) = n, because it's simple. We can choose c as large as we want, and we're trying to make T(n) fit underneath c*f(n), so pick c = 20.
• As these functions extend forever to the right, their asymptotes will never cross again.
• For large n (any n bigger than 1000, in fact), T(n) <= c*f(n).
• Therefore, T(n) is in O(f(n)).

[Figure: T(n) = 10,000 + 10n plotted against c*f(n) = 20n, crossing at n = 1000]

EXAMPLE: Inventory (continued)

• Pay close attention to c and N. In the graph above, c = 20 and N = 1000.
• Think of it this way: if you're trying to prove that one function is asymptotically bounded by another [f(n) = O(g(n))], you're allowed to multiply them by positive constants in an attempt to stuff one underneath the other.
• You're also allowed to move the vertical line (N) as far to the right as you like (to get all the crossings onto the left side). We're only interested in how the functions behave as they shoot off toward infinity.

Big-Oh Formally

• Formal definition:

  T(n) = O(f(n)) if there exist c, k ∊ ℛ+ such that for all n >= k, T(n) <= c*f(n).

• Demonstrating that a function T(n) is in big-O of a function f(n) requires that we find specific constants c and k for which the inequality holds.

Big-Oh Facts

• The following facts are useful for Big-Oh problems:
  – 1 <= n for all n >= 1
  – n <= n² for all n >= 1
  – 2^n <= n! for all n >= 4
  – log2 n <= n for all n >= 2
  – n <= n log2 n for all n >= 2
Example: Big-Oh Functions

1. f(n) = 10n + 5 and g(n) = n. Show that f(n) is in O(g(n)).
   – To show that f(n) is O(g(n)) we must find constants c and k such that
       f(n) <= c*g(n) for all n >= k,
     i.e., 10n + 5 <= c*n for all n >= k.
   – Try c = 15. Then we need to show that 10n + 5 <= 15n.
   – Solving for n we get: 5 <= 5n, i.e., 1 <= n.
   – So f(n) = 10n + 5 <= 15*g(n) for all n >= 1.
   – (c = 15, k = 1).

Example: Big-Oh Functions

2. f(n) = 3n² + 4n + 1. Show that f(n) = O(n²).
   – 4n <= 4n² for all n >= 1, and 1 <= n² for all n >= 1.
   – So 3n² + 4n + 1 <= 3n² + 4n² + n² = 8n² for all n >= 1.
   – We have shown that f(n) <= 8n² for all n >= 1.
   – Therefore, f(n) is O(n²) (c = 8, k = 1).

Example: Big-Oh Functions

• Prove that T(n) = a0 + a1*n + a2*n² + a3*n³ is O(n³).
  – Find constants c and k such that T(n) <= c*n³ for all n >= k.
  – Clearly T(n) <= |a0| + |a1|n + |a2|n² + |a3|n³.
  – Thus if n >= 1, then T(n) <= c*n³ where c = |a0| + |a1| + |a2| + |a3|, so T(n) is O(n³).
Example: Big-Oh Functions

• Give as good (i.e., small) a big-O estimate as possible for the function T(n) = (n² + 3n + 8)(n + 1).
  – Answer: T(n) = (n² + 3n + 8)(n + 1) = n³ + 3n² + 8n + n² + 3n + 8 = n³ + 4n² + 11n + 8 = O(n³)

Typical Big-Oh Orders

  n     | O(1) | O(log n) | O(n)  | O(n log n) | O(n²)     | O(n³)
  ------+------+----------+-------+------------+-----------+--------------
  1     | 1    | 1        | 1     | 1          | 1         | 1
  2     | 1    | 1        | 2     | 2          | 4         | 8
  4     | 1    | 2        | 4     | 8          | 16        | 64
  8     | 1    | 3        | 8     | 24         | 64        | 512
  16    | 1    | 4        | 16    | 64         | 256       | 4,096
  1024  | 1    | 10       | 1,024 | 10,240     | 1,048,576 | 1,073,741,824

Orders of Common Algorithms

  Notation  | Name        | Example
  ----------+-------------+-------------------------------------------------------
  O(1)      | Constant    | Basic operations
  O(log n)  | Logarithmic | Finding an item in a sorted array with a binary search
  O(n)      | Linear      | Finding an item in an unsorted list; adding two n-digit numbers
  O(n²)     | Quadratic   | Multiplying two n-digit numbers by a simple algorithm; adding two n×n matrices

Implication of Big-Oh Notation

• We use Big-Oh notation to say how slowly code might run as its input grows.
• Suppose we know that our algorithm uses at most O(f(n)) basic steps for any n inputs, and n is sufficiently large. Then we know that our algorithm will terminate after executing at most a constant times f(n) basic steps.
• We know that a basic step takes constant time on a given machine.
• Hence, our algorithm will terminate in at most a constant times f(n) units of time, for all large n.
