
(Algorithm) Analysis

1
Algorithm Efficiency
There are often many approaches (algorithms) to solve a problem. How do we choose between them?

At the heart of computer program design are two (sometimes conflicting) goals:
1. To design an algorithm that is easy to understand, code, and debug.
2. To design an algorithm that makes efficient use of the computer’s resources.

2
Algorithm Efficiency (cont)
Goal (1) is the concern of Software Engineering.

Goal (2) is the concern of data structures and algorithm analysis.

When goal (2) is important, how do we measure an algorithm’s cost?

3
How to Measure Efficiency?
1. Empirical comparison (run programs)
2. Asymptotic Algorithm Analysis

4
How to Measure Efficiency?
• Empirical comparison (run programs) is not
satisfactory. Why?
• First, there is the effort involved in programming and
testing two algorithms when at best you want to keep only
one.
• Second, when empirically comparing two algorithms there
is always the chance that one of the programs was “better
written”.
• Third, the choice of empirical test cases might unfairly
favor one algorithm.
• Fourth, you could find that even the better of the two
algorithms does not fall within your resource budget.

5
How to Measure Efficiency?
1. Empirical comparison (run programs)
2. Asymptotic Algorithm Analysis

So we need to consider the second option.

6
How to Measure Efficiency?
Asymptotic Algorithm Analysis

Critical resources: running time and space.

For most algorithms, running time depends on the “size” of the input.

We estimate the number of “basic operations” required by the algorithm to process an input of a certain “size”.
7
Examples of Growth Rate
Example 1.
/** @return Position of largest value in "A“ */
static int largest(int[] A) {
int currlarge = 0; // Position of largest
for (int i=1; i<A.length; i++)
if (A[currlarge] < A[i])
currlarge = i; // Remember pos
return currlarge; // Return largest pos
}

8
Examples of Growth Rate
Example 1.
/** @return Position of largest value in "A“ */
static int largest(int[] A) {
int currlarge = 0; // Position of largest
for (int i=1; i<A.length; i++)
if (A[currlarge] < A[i])
currlarge = i; // Remember pos
return currlarge; // Return largest pos
}

We can deduce: T(n) = cn, where c is a constant in the sense that it does not depend on n. The loop body does a constant amount of work c and executes once for each array element after the first, so the total cost grows linearly with n.
9
Examples (cont)
Example 2: Assignment statement.

We can say: T(n) = c_1

Example 3:

sum = 0;
for (i=1; i<=n; i++)
  for (j=1; j<n; j++)
    sum++;

• We can ignore initializations, or assume they are included within c_2.
• Increment/comparison of loop counters can also be assumed to be included in c_2.

We can deduce: T(n) = c_2 n^2, since the inner loop body runs n-1 times for each of the n iterations of the outer loop, and n(n-1) ≈ n^2.
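The fragment above is not a complete program; a minimal runnable version (the class, method, and sample input sizes are added here for illustration, not taken from the slide) shows that sum ends up equal to n(n-1), the number of times the inner statement executed:

// Runnable wrapper for Example 3 (wrapper code is illustrative, not from the slide).
public class NestedLoops {
    // Returns how many times the innermost statement executes for input size n.
    static long countOps(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)      // outer loop: n iterations
            for (int j = 1; j < n; j++)   // inner loop: n-1 iterations each time
                sum++;
        return sum;                       // equals n * (n - 1), which grows as n^2
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1000})
            System.out.println("n = " + n + ", basic operations = " + countOps(n));
    }
}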
10
Growth Rate
• In essence, we are interested in the growth rate of an algorithm.
• The growth rate of an algorithm is the rate at which the cost of the algorithm grows as the size of its input grows.

11
Growth Rate Graph

12
Growth Rate Graph

13
Growth Rate Table
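The graphs and table on these slides did not survive extraction. A small sketch that prints a typical growth-rate comparison follows; the particular functions and input sizes are an assumption about what the slide showed, chosen from the cost functions used elsewhere in this deck.

// Prints a growth-rate table; the functions shown are an assumption about the missing slide.
public class GrowthRates {
    public static void main(String[] args) {
        System.out.printf("%8s %10s %10s %12s %14s %16s%n",
                          "n", "log n", "n", "n log n", "n^2", "2^n");
        for (int n : new int[] {16, 64, 256}) {
            double log = Math.log(n) / Math.log(2);   // log base 2
            System.out.printf("%8d %10.0f %10d %12.0f %14d %16.3e%n",
                              n, log, n, n * log, (long) n * n, Math.pow(2, n));
        }
    }
}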

14
Best, Worst, Average Cases
• largest() always examines every array value.
• This algorithm works on many inputs of a given size n.
• No matter what array of size n the algorithm looks at, its cost will always be the same.

/** @return Position of largest value in "A" */
static int largest(int[] A) {
  int currlarge = 0;              // Position of largest
  for (int i=1; i<A.length; i++)
    if (A[currlarge] < A[i])
      currlarge = i;              // Remember pos
  return currlarge;               // Return largest pos
}

15
Best, Worst, Average Cases
Consider a slightly different problem, but a similar algorithm:
• Search for K in an array of n integers.
• Sequential Search Algorithm: Begin at the first element of the array and look at each element in turn until K is found (assume K appears only once, for now).
• Now, not all inputs of a given size take the same time to run, unlike largest().
• So the question becomes: what are the best, worst, and average cases?
16
Best, Worst, Average Cases
• Sequential Search Algorithm: Begin at the first element of the array and look at each element in turn until K is found (assume K appears only once, for now). A sketch of this algorithm is given below.

• Best case: T(n) = c, when the first element is in fact K.

• Worst case: T(n) = cn, when K is the last element or is absent.

• Average case: T(n) = c n/2.
Run it many times on many different arrays of size n, or search for many different values of K within the same array. If K is equally likely to be at any of the n positions, the algorithm examines (1 + 2 + ... + n)/n = (n+1)/2 elements on average,
=> On average, the algorithm examines about n/2 values.
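A minimal sketch of the sequential search being analyzed (the method name and the -1 "not found" convention are choices made here, not taken from the slides):

/** Sequential search: return the position of K in A, or -1 if K is absent. */
static int sequentialSearch(int[] A, int K) {
  for (int i = 0; i < A.length; i++)   // examine each element in turn
    if (A[i] == K)
      return i;                        // best case: K is the first element
  return -1;                           // worst case: examined all n elements
}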

17
Which Analysis to Use?
While the average-case time appears to be the fairest measure, it may be difficult to determine.

For real-time applications, the worst-case time is the most important.

18
Faster Computer or Faster Algorithm
If the algorithm’s growth rate is
• Old PC can do 10K basic operations/s greater than cn, such as c1n^2, then
you will not be able to do a problem
• New PC is 10x faster (i.e., 100K basic ops/s) ten times the size in the same
amount of time on a machine that is
• N (N’)=max input size can be handled in a sec in old ten times faster.
(new) PC
f(n) Calculate N N Calculate N’ N’ N’/N
10n 10n = 10000 1000 10n = 100000 10000 10 N’  10N
20n 20n=10000 500 20n = 100000 5000 10 Constant factors never affect the
5n log n 5n log n=10000 250 5n log n = 100000 1842 7.368 relative improvement gained by a
faster computer; it only affect the
2n^2 2n^2=10000 70.71068 2n^2 =100000 223.6068 3.162278 absolute size.
3n^2 3n^2=10000 57.73503 3n^2 =100000 182.5742 3.162278
4n^2 4n^2=10000 50 4n^2 =100000 158.1139 3.162278 N’  10N

4n^3 4n^3=10000 13.57209 4n^3 =100000 29.24018 2.154435 N’  10^{1/3}N For a machine 100x faster
2^n 2^n=10000 13.28771 2^n = 100000 16.60964 1.25 N’  N + log_2 10 N’  N + log_2 100 = 19.93
3^n 3^n=10000 8.383613 3^n = 100000 10.47952 1.25 N’  N + log_3 10 N’  N + log_3 100 = 12.58
4^n 4^n=10000 6.643856 4^n = 100000 8.30482 1.25 N’  N + log_4 10 N’  N + log_4 100 = 9.97
19
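The N and N’ columns above can be reproduced mechanically. Below is a small sketch, not taken from the slide, that finds the largest n with f(n) within each machine’s one-second budget; the bisection helper and the specific budgets (10,000 and 100,000 operations) are choices made here for illustration.

// Reproduces the N and N' columns of the table (sketch; not part of the original slides).
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.DoubleUnaryOperator;

public class FasterComputer {
    // Largest x such that f(x) <= budget, found by bisection on [1, 1e9].
    static double maxInput(DoubleUnaryOperator f, double budget) {
        double lo = 1, hi = 1e9;
        for (int i = 0; i < 200; i++) {            // 200 halvings give ample precision
            double mid = (lo + hi) / 2;
            if (f.applyAsDouble(mid) <= budget) lo = mid; else hi = mid;
        }
        return lo;
    }

    public static void main(String[] args) {
        Map<String, DoubleUnaryOperator> fs = new LinkedHashMap<>();
        fs.put("10n",      n -> 10 * n);
        fs.put("20n",      n -> 20 * n);
        fs.put("5n log n", n -> 5 * n * (Math.log(n) / Math.log(2)));
        fs.put("2n^2",     n -> 2 * n * n);
        fs.put("2^n",      n -> Math.pow(2, n));

        for (Map.Entry<String, DoubleUnaryOperator> e : fs.entrySet()) {
            double N      = maxInput(e.getValue(), 10_000);   // old PC: 10K ops/sec
            double Nprime = maxInput(e.getValue(), 100_000);  // new PC: 100K ops/sec
            System.out.printf("%-9s N=%10.2f  N'=%10.2f  N'/N=%.3f%n",
                              e.getKey(), N, Nprime, Nprime / N);
        }
    }
}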
