Data Structures and Algorithms - L2
(CS F211)
[Slide: comparing Algorithm 1 and Algorithm 2 in terms of time and space]
Experimental Studies
Use a method like System.currentTimeMillis() or Clock functions to get an accurate measure of the actual running time.
Plot the results.
[Plot: measured running time in ms versus input size]
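A minimal sketch of such an experiment in Java (the workload method doWork and the input sizes are illustrative assumptions, not part of the slides):

public class TimingExperiment {
    // Hypothetical workload whose running time we want to measure.
    static long doWork(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 1000000; n *= 10) {
            long start = System.currentTimeMillis();   // timestamp before the run
            doWork(n);
            long end = System.currentTimeMillis();     // timestamp after the run
            System.out.println("n = " + n + "  elapsed = " + (end - start) + " ms");
        }
    }
}

Plotting the printed (n, elapsed) pairs gives a curve like the one sketched above.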
Limitations of Experiments
Results may not be indicative of the running time on other inputs, since not all inputs can be included in the experiment.
O(n)
o(n)
Ω(n)
ω(n)
Θ(n)
Edmund Landau
• 1877~1938
• Inventor of the asymptotic notation
Donald E. Knuth
• 1938 ~
• Turing Award, 1974.
• Father of the analysis of algorithms
• Popularized the asymptotic notation
Theoretical Analysis
Algorithm arrayMax(A, n)                          # operations
  currentMax ← A[0]                               2
  for (i = 1; i < n; i++)                         2n
    (i = 1 once, i < n n times, i++ (n - 1) times)
    if A[i] > currentMax then                     2(n - 1)
      currentMax ← A[i]                           2(n - 1)
  return currentMax                               1
  Total                                           6n
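The same pseudocode written as a runnable Java sketch (class and method names are illustrative):

public class ArrayMax {
    // Returns the largest element of A[0..n-1]; assumes n >= 1.
    static int arrayMax(int[] A, int n) {
        int currentMax = A[0];              // 2 primitive operations
        for (int i = 1; i < n; i++) {       // about 2n operations in total
            if (A[i] > currentMax) {        // 2(n - 1) comparisons
                currentMax = A[i];          // at most 2(n - 1) assignments
            }
        }
        return currentMax;                  // 1 operation
    }

    public static void main(String[] args) {
        int[] A = {3, 7, 2, 9, 4};
        System.out.println(arrayMax(A, A.length));  // prints 9
    }
}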
Estimating Running Time
Define:
a = Time taken by the fastest primitive operation
b = Time taken by the slowest primitive operation
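Then (a standard step that follows from these definitions and the operation count above) the actual worst-case running time T(n) of arrayMax is bracketed by a·6n ≤ T(n) ≤ b·6n, so changing the hardware only changes the constants a and b, while the growth rate stays linear in n.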
Running Time
We focus on the worst-case running time.
  Easier to analyze, and the safest bound to bet on.
  Crucial to applications such as games, finance and robotics.
[Plot: running time versus input size]
The Growth Rate of the Six Popular Functions

n     log n   n     n log n   n²      n³      2ⁿ
4     2       4     8         16      64      16
8     3       8     24        64      512     256
16    4       16    64        256     4,096   65,536

[Plot: the six functions on log-log axes for n from 1 to 65,536]
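A small Java sketch (an illustrative addition, not from the slides) that reproduces the rows of the table:

public class GrowthTable {
    public static void main(String[] args) {
        int[] ns = {4, 8, 16};
        System.out.println("n\tlog n\tn\tn log n\tn^2\tn^3\t2^n");
        for (int n : ns) {
            int logn = Integer.numberOfTrailingZeros(n);  // log2(n), exact for powers of two
            long nlogn = (long) n * logn;
            long n2 = (long) n * n;
            long n3 = (long) n * n * n;
            long twoN = 1L << n;                          // 2^n
            System.out.println(n + "\t" + logn + "\t" + n + "\t" + nlogn
                    + "\t" + n2 + "\t" + n3 + "\t" + twoN);
        }
    }
}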
Asymptotic Dominance in Action
Implications of Dominance
Asymptotic Dominance in Action
10⁹ instructions/second
Faster Computer vs. Better Algorithm
Recall: f(n) is O(g(n)) if there are positive constants c and n0 ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n0.
Example: 2n + 10 is O(n)
– 2n + 10 ≤ cn
– (c - 2)n ≥ 10
– n ≥ 10/(c - 2)
– Pick c = 3 and n0 = 10
[Plot: n, 2n + 10, and 3n on log-log axes; 3n bounds 2n + 10 from above for n ≥ 10]
More Big-Oh Examples
7n - 2
  7n - 2 is O(n)
  need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ c·n for n ≥ n0
  this is true for c = 7 and n0 = 1
3n³ + 20n² + 5
  3n³ + 20n² + 5 is O(n³)
  need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0
  this is true for c = 4 and n0 = 21
3 log n + 5
  3 log n + 5 is O(log n)
  need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c·log n for n ≥ n0
  this is true for c = 8 and n0 = 2
Big-Oh Rules
O(n)
o(n)
Ω(n)
ω(n)
Θ(n)
Relatives of Big-Oh (Ω(n))
big-Omega
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0
Relatives of Big-Oh (o(n))
little-oh
  Alternatively, o(g(n)) = { f(n) : lim n→∞ f(n)/g(n) = 0 }
  Example
    o(n²) = { 10n + 25, 150, … }
Relatives of Big-Oh (ω(n))
little-omega
  Alternatively, ω(g(n)) = { f(n) : lim n→∞ f(n)/g(n) = ∞ }
  Example
    ω(n²) = { 8n³ + 25, … }
Example Uses of the Relatives of Big-Oh
3 log n + log log n is Ω(log n)
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0
  let c = 3 and n0 = 2
3 log n + log log n is Θ(log n)
  f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
12n² + 6n is o(n³)
  f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
12n² + 6n is ω(n)
  f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)
Some more Examples
3n + 2 is Ω(n)
  As 3n + 2 ≥ 3n for n ≥ 1; although the inequality also holds for n ≥ 0, the definition requires n0 > 0
3n + 2 is Θ(n)
  As 3n + 2 ≥ 3n for all n ≥ 2 and 3n + 2 ≤ 4n for all n ≥ 2, take c1 = 3, c2 = 4 and n0 = 2
3n + 2 is o(n²)
Useful Facts about Big O
Sums of functions:
  If f ∈ O(g) and h ∈ O(g), then f + h ∈ O(g).
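Why this holds (a standard one-line argument, not on the slide): if f(n) ≤ c1·g(n) for n ≥ n1 and h(n) ≤ c2·g(n) for n ≥ n2, then for all n ≥ max(n1, n2) we get f(n) + h(n) ≤ (c1 + c2)·g(n), so the constant c1 + c2 and the threshold max(n1, n2) witness f + h ∈ O(g).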
[Venn diagram: o(g) ⊂ O(g), ω(g) ⊂ Ω(g), and Θ(g) = O(g) ∩ Ω(g), with the function g itself lying in Θ(g)]
Why o(f) ⊂ O(f) - Ω(f)
Ω(g) = { f | g ∈ O(f) }
  "The functions that are at least order g."
Devise an algorithm that finds the sum of all the integers in a list.
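One possible sketch in Java (representing the list as an int array is an assumption):

public class ListSum {
    // Returns the sum of all integers in the list: Θ(n) time, O(1) extra space.
    static long sum(int[] list) {
        long total = 0;
        for (int x : list) {
            total += x;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] list = {4, 8, 15, 16, 23, 42};
        System.out.println(sum(list));  // prints 108
    }
}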
Work out the computational complexity of the following piece of code assuming
that n = 2^m:
for( int i = n; i > 0; i-- ) {
    for( int j = 1; j < n; j *= 2 ) {
        for( int k = 0; k < j; k++ ) {
            for( int m = 0; m < 10000; m++ )
                sum = sum + m;
        }
    }
}
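One way to reason about it (a sketch of the intended analysis): for a fixed i, the j loop visits the powers of two 1, 2, 4, …, n/2, and for each such j the k loop runs j times, so the two inner loops together run 1 + 2 + 4 + … + n/2 = n - 1 times; the innermost m loop contributes only a constant factor of 10,000. The outer i loop repeats this n times, giving roughly 10,000·n·(n - 1) additions, i.e. Θ(n²).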
Search Algorithm #1: Linear Search
Basic idea: examine each element of the list in turn until the desired element is found or the list is exhausted.
Search Algorithm #2: Binary Search
Basic idea: On each step, look at the middle element of the remaining list to eliminate half of it, and quickly zero in on the desired element.
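A minimal Java sketch of that idea (assumes the list is a sorted int array; the method name is illustrative):

public class BinarySearch {
    // Returns an index of key in the sorted array a, or -1 if key is absent.
    static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;    // middle element of the remaining list
            if (a[mid] == key) return mid;
            if (a[mid] < key) lo = mid + 1;  // eliminate the left half
            else hi = mid - 1;               // eliminate the right half
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
        System.out.println(binarySearch(a, 23));  // prints 5
        System.out.println(binarySearch(a, 7));   // prints -1
    }
}

Each step halves the remaining list, so the search takes O(log n) comparisons, versus O(n) for linear search.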