Lecture 12-13: Algorithms
► Queues
Algorithm
Algorithm Properties
► It must be correct
► It must terminate
Questions to be addressed!
Efficiency
▪ Complexity types
► Time (Time Complexity)
► Space (Space Complexity)
Measuring Efficiency
Running time: Empirical Method
Empirical Method: Example
Empirical Method: Example
Empirical Method: Example
▪ Computer A
► Linear growth rate
► The program's run time is directly proportional to its input size
▪ Computer B
► Logarithmic growth rate
► Doubling the input size only increases the run time by a constant amount (e.g., 25,000 ns)
► Even though Computer A is the faster machine, Computer B will inevitably surpass Computer A in run time, because it is running an algorithm with a much slower growth rate
Limitations of Empirical Method
Limitations of Empirical Method – Algorithm Properties
RAM Model
A Simple Example
// Input: int A[N], array of N integers
// Output: Sum of all numbers in array A
int Sum(int A[], int N)
{
    int s = 0;                   // op 1
    for (int i = 0; i < N; i++)  // ops 2, 3, 7
        s = s + A[i];            // ops 4, 5, 6
    return s;                    // op 8
}
A Simple Example: Analysis of Sum
A Simple Example: Analysis of Sum
// Input: int A[N], array of N integers
// Output: Sum of all numbers in array A
Operations 1, 2, 8: executed once each
Operations 3, 4, 5, 6, 7: executed once per iteration of the for loop (N iterations)
Total: 5N + 3
The complexity function of the algorithm is f(N) = 5N + 3
How does 5N + 3 grow?
N = 10        → 5(10) + 3 = 53 steps
N = 100       → 5(100) + 3 = 503 steps
N = 1,000     → 5(1000) + 3 = 5,003 steps
N = 1,000,000 → 5(1000000) + 3 = 5,000,003 steps
Which Term Dominates?
Asymptotic Complexity:
▪ As N gets large, concentrate on the highest-order term
▪ Drop lower-order terms such as +3
▪ Drop the constant coefficient of the highest-order term (i.e., 5)
Asymptotic Complexity
Growth Rates
▪ Growth rates of functions:
▪ Linear ≈ n
▪ Quadratic ≈ n²
▪ Cubic ≈ n³
Constant Factors
▪ Examples
▪ 10²n + 10⁵ is a linear function
▪ 10⁵n² + 10⁸n is a quadratic function
Machine Independent Time
“Asymptotic Analysis”
Orders of Growth
Summary
Recap
Asymptotic complexity
Asymptotic complexity – Growth rate
Asymptotic Analysis
Big-O notation
Definition
▪ Given two positive-valued functions f and g:
f(n) is O(g(n))
if there exist positive numbers c and N such that
f(n) ≤ c·g(n) for all n ≥ N
f is big-O of g if there is a positive number c such that f is not larger than c·g for sufficiently large n; that is, for all n larger than some number N
Relationship b/w f and g
[figure: plot of f(n) and c·g(n) against n; for all n ≥ N, f(n) lies below c·g(n)]
Calculating c and g
Calculating c and g
Calculating c and g
Practical Significance
Practical Significance
g(n) vs c
N can be chosen as a point where the functions c·g(n) and f(n) intersect; beyond that crossover point, c·g(n) stays above f(n)
Claim: T(n) = n³ + 20n + 1 is O(n³)
Proof:
▪ By the Big-O definition, T(n) is O(n³) if T(n) ≤ c·n³ for all n ≥ N
▪ Check the condition: n³ + 20n + 1 ≤ c·n³, or equivalently 1 + 20/n² + 1/n³ ≤ c
▪ Therefore, the Big-O condition holds for n ≥ N = 1 and c ≥ 22 (= 1 + 20 + 1)
▪ Larger values of N result in smaller factors c (e.g., for N = 10, c ≥ 1.201, and so on), but in any case the above statement is valid
Big-O Examples
Proof:
▪ By the Big-O definition, T(n) is O(n²) if T(n) ≤ c·n² for all n ≥ N
Big-O Examples
T(n) examples
▪ 3n + 4 → O(n) Linear
▪ 4n² + 17n + 5344 → O(n²) Quadratic
▪ 6n³ + 3n² + 5n + 57 → O(n³) Cubic
▪ 15·log₂(n) + 37 → O(log n) Logarithmic
▪ 15·2ⁿ + 4n⁵⁷ + 16 → O(2ⁿ) Exponential
▪ 37 → O(1) Constant time
Typical functions applied in Big-O estimates
Execution times of typical functions
Big-O Notation
▪ Important
► Big-O is not a function!
▪ Examples
► 5n + 3 = O(n)
▪ Transitive
If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
▪ Additive
If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n))
▪ Power rule
a·nᵏ is O(nᵏ) (any c ≥ a works)
▪ Another power rule
nᵏ is O(nᵏ⁺ʲ) for j > 0
Other Notations
► Theta Notation