
Introduction to Algorithms

Analysis of Algorithms
• Insertion sort
• Asymptotic analysis
• Merge sort
• Recurrences
Analysis of algorithms
The theoretical study of computer-program performance and resource usage.

What's more important than performance?
• modularity
• correctness
• maintainability
• functionality
• robustness
• user-friendliness
• programmer time
• simplicity
• extensibility
• reliability
Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what
is feasible and what is impossible.
• Algorithmic mathematics provides a language
for talking about program behavior.
• Performance is the currency of computing.
• The lessons of program performance generalize
to other computing resources.
• Speed is fun!
The problem of sorting

Input: sequence ⟨a1, a2, …, an⟩ of numbers.

Output: permutation ⟨a′1, a′2, …, a′n⟩ such that a′1 ≤ a′2 ≤ … ≤ a′n.

Example:
Input: 8 2 4 9 3 6
Output: 2 3 4 6 8 9
Insertion sort

INSERTION-SORT (A, n)    ⊳ A[1 . . n]
    for j ← 2 to n
        do key ← A[j]
           i ← j − 1
           while i > 0 and A[i] > key
               do A[i+1] ← A[i]
                  i ← i − 1
           A[i+1] ← key
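For readers who want to run it, here is a direct transcription into Python — a minimal sketch, with the caveat that the pseudocode is 1-based while Python lists are 0-based, so the loop bounds shift by one:

def insertion_sort(a):
    """In-place insertion sort, mirroring INSERTION-SORT(A, n)."""
    for j in range(1, len(a)):      # pseudocode's j = 2 .. n, shifted to 0-based
        key = a[j]
        i = j - 1
        # Shift elements of the sorted prefix a[0 .. j-1] that exceed key.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key              # drop key into its final slot

a = [8, 2, 4, 9, 3, 6]
insertion_sort(a)
print(a)                            # [2, 3, 4, 6, 8, 9]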
[Figure: array A[1 . . n] during one iteration — the prefix A[1 . . j−1] is already sorted, and key = A[j] is being inserted into it.]
Example of insertion sort
8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6
2 3 4 8 9 6
2 3 4 6 8 9 done
Running time

• The running time depends on the input: an already sorted sequence is easier to sort.
• Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones.
• Generally, we seek upper bounds on the running time, because everybody likes a guarantee.
Kinds of analyses
Worst-case: (usually)
• T(n) = maximum time of algorithm
on any input of size n.
Average-case: (sometimes)
• T(n) = expected time of algorithm
over all inputs of size n.
• Need assumption of statistical
distribution of inputs.
Best-case: (bogus)
• Cheat with a slow algorithm that
works fast on some input.
Machine-independent time

What is insertion sort's worst-case time?
• It depends on the speed of our computer:
  • relative speed (on the same machine),
  • absolute speed (on different machines).

BIG IDEA:
• Ignore machine-dependent constants.
• Look at the growth of T(n) as n → ∞.

"Asymptotic Analysis"
Θ-notation

Math:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0
            such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }

Engineering:
• Drop low-order terms; ignore leading constants.
• Example: 3n³ + 90n² − 5n + 6046 = Θ(n³)
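To make the definition concrete, a quick numeric sanity check for that example (the witnesses c1 = 1, c2 = 4, n0 = 100 below are just one choice that happens to work, not canonical values):

# One set of Theta-definition witnesses for f(n) = 3n^3 + 90n^2 - 5n + 6046.
def f(n):
    return 3 * n**3 + 90 * n**2 - 5 * n + 6046

def g(n):
    return n**3

c1, c2, n0 = 1, 4, 100   # assumed witnesses; many other choices also work
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("c1*g(n) <= f(n) <= c2*g(n) holds for all tested n >= n0")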
Asymptotic performance

When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm.
• We shouldn't ignore asymptotically slower algorithms, however.
• Real-world design situations often call for a careful balancing of engineering objectives.
• Asymptotic analysis is a useful tool to help to structure our thinking.

[Figure: T(n) versus n for both algorithms; the curves cross at n = n0.]
Insertion sort analysis
Worst case: Input reverse sorted.
T(n) = Σ_{j=2..n} Θ(j) = Θ(n²)   [arithmetic series]

Average case: All permutations equally likely.

T(n) = Σ_{j=2..n} Θ(j/2) = Θ(n²)
Is insertion sort a fast sorting algorithm?
• Moderately so, for small n.
• Not at all, for large n.
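The quadratic worst case is easy to observe directly by counting element shifts on reverse-sorted input (a rough sketch, not from the slides; the shift count n(n−1)/2 roughly quadruples each time n doubles):

def insertion_sort_shifts(a):
    """Insertion sort that also counts inner-loop element shifts."""
    shifts = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts

for n in (100, 200, 400):
    print(n, insertion_sort_shifts(list(range(n, 0, -1))))
# prints 4950, 19900, 79800 -- quadrupling as n doubles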
Merge sort

MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[ 1 . . ⌈n/2⌉ ] and A[ ⌈n/2⌉+1 . . n ].
3. "Merge" the 2 sorted lists.

Key subroutine: MERGE
Merging two sorted arrays

[Figure: step-by-step merge of the sorted arrays (2, 7, 13, 20) and (1, 9, 11, 12) — at each step the smaller of the two front elements is removed and appended to the output, producing 1 2 7 9 11 12.]

Time = Θ(n) to merge a total of n elements (linear time).
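A compact Python sketch of both routines — the two-finger MERGE that scans each input once (hence Θ(n)) and the MERGE-SORT that splits at ⌈n/2⌉. This version returns new lists rather than sorting in place, a simplification of what the slides describe:

def merge(left, right):
    """Merge two sorted lists in Theta(n) time, n = len(left) + len(right)."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # take the smaller front element
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                 # one side is exhausted; flush the other
    out.extend(right[j:])
    return out

def merge_sort(a):
    """MERGE-SORT: if n = 1 done; else sort the halves recursively and merge."""
    if len(a) <= 1:
        return a
    mid = (len(a) + 1) // 2              # ceil(n/2), matching the slides' split
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([8, 2, 4, 9, 3, 6]))    # [2, 3, 4, 6, 8, 9]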
Analyzing merge sort

T(n)       MERGE-SORT A[1 . . n]
Θ(1)       1. If n = 1, done.
2T(n/2)    2. Recursively sort A[ 1 . . ⌈n/2⌉ ] and A[ ⌈n/2⌉+1 . . n ].
Θ(n)       3. "Merge" the 2 sorted lists.

Sloppiness (an abuse of notation): step 2 should cost T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.
Recurrence for merge sort

T(n) = Θ(1)            if n = 1;
       2T(n/2) + Θ(n)  if n > 1.

• We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n, but only when it has no effect on the asymptotic solution to the recurrence.
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
[Figure: recursion tree — the root costs cn, its two children cost cn/2 each, the four grandchildren cn/4 each, and so on down to Θ(1) leaves. The height is h = lg n; each level sums to cn; #leaves = n, contributing Θ(n).]

Total = Θ(n lg n)
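The level-by-level sum can be checked numerically by unrolling the recurrence (a sketch under simplifying assumptions: n a power of 2, and the Θ(1) base cost taken to be exactly c):

import math

def T(n, c=1):
    """Unroll T(n) = 2*T(n/2) + c*n with T(1) = c, for n a power of 2."""
    if n == 1:
        return c
    return 2 * T(n // 2, c) + c * n

for k in (4, 8, 12):
    n = 2 ** k
    # The tree predicts c*n*lg(n) from the internal levels plus c*n at the leaves.
    print(n, T(n), n * math.log2(n) + n)   # the two values agree exactly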
Conclusions

• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically beats insertion sort in the worst case.
• In practice, merge sort beats insertion sort for n > 30 or so.
• Go test it out for yourself!
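In that spirit, a rough timing harness reusing the insertion_sort and merge_sort sketches above (illustrative only — the n ≈ 30 crossover assumes the compiled implementations the slides have in mind; interpreted Python has different constant factors):

import random
import timeit

for n in (10, 50, 500, 2000):
    data = [random.random() for _ in range(n)]
    # data[:] hands each run a fresh unsorted copy so in-place sorting
    # doesn't skew later repetitions.
    t_ins = timeit.timeit(lambda: insertion_sort(data[:]), number=10)
    t_mrg = timeit.timeit(lambda: merge_sort(data[:]), number=10)
    print(f"n={n:5d}  insertion {t_ins:.4f}s  merge {t_mrg:.4f}s")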
