Chapter 3 - Analysis of Algorithms


Design and Analysis of Algorithms

(503040)
ANALYSIS OF ALGORITHMS

Lecturer: Dr. Bùi Thanh Hùng

Head of the Data Analytics and Artificial Intelligence Lab
Director of the Information Systems Program
Thủ Dầu Một University
Email: [email protected]
Website: https://sites.google.com/site/hungthanhbui1980/
Why study algorithms?
Theoretical importance
- The cornerstone of computer science

Practical importance
- A practitioner's toolkit of known algorithms
- Frameworks for designing and analyzing algorithms for new problems
Major Algorithm Design Techniques/Strategies

• Brute force
• Decrease and conquer
• Divide and conquer
• Transform and conquer
• Space-time tradeoff
• Greedy approach
• Dynamic programming
• Iterative improvement
• Backtracking
• Branch and Bound
Analysis of Algorithms
Difficulties with comparing programs instead of algorithms:
• How are the algorithms coded?
• Which compiler is used?
• What computer should you use?
• What data should the programs use?

Algorithm analysis should be independent of:
• Specific implementations
• Compilers and their optimizers
• Computers
• Data
Analysis of Algorithms
• How good is the algorithm?
– correctness (accuracy for approximation algorithms)
– time efficiency
– space efficiency
– optimality

• Approaches:
– empirical (experimental) analysis
– theoretical (mathematical) analysis
Theoretical analysis of time efficiency

Time efficiency is analyzed by determining the number of times the algorithm's basic operation is executed as a function of input size.
- Input size: the number of input items or, if it matters, their size
- Basic operation: the operation contributing the most toward the running time of the algorithm
Big O notation
• Given a function f(n), we say g(n) is an (asymptotic) upper bound of f(n), denoted f(n) = O(g(n)), if there exist a constant c > 0 and a positive integer n0 such that f(n) ≤ c*g(n) for all n ≥ n0.
• f(n) is said to be bounded from above by g(n).
• O() is called the "big O" notation.

[Figure: f(n) lies below c*g(n) for all n ≥ n0]
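As a small worked example (not on the original slide), the constants in the definition can be exhibited explicitly. Take f(n) = 3n + 5: choosing c = 4 and n0 = 5, we have 3n + 5 <= 3n + n = 4n = c*n for all n >= 5, so f(n) = O(n).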
Growth Terms
The most common growth terms can be ordered as follows (note: many others are not shown):
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < …

Note:
• "log" = log base 2, or log2; "log10" = log base 10; "ln" = log base e. In big O notation, all of these log functions are the same, since they differ only by a constant factor.
Order-of-Magnitude Analysis and Big O Notation

Figure - Comparison of growth-rate functions in tabular form

Order-of-Magnitude Analysis and Big O Notation

Figure - Comparison of growth-rate functions in graphical form

Some rules of thumb and examples
• Basically just count the number of statements executed.
• If there are only a small number of simple statements in a program
– O(1)
• If there is a ‘for’ loop dictated by a loop index that goes up to n
– O(n)
• If there is a nested ‘for’ loop with outer one controlled by n and
the inner one controlled by m – O(n*m)
• For a loop over a range of n values, where each iteration reduces the range by a constant fraction (e.g., 1/2)
– O(log n)
• For a recursive method, each call is usually O(1). So
– if n calls are made – O(n)
– if n log n calls are made – O(n log n)

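As a short illustrative sketch (added here, not part of the original slides; the function names are invented for illustration), the counting rules above look like this in Python:

def nested_loops(n, m):
    # O(n*m): the outer loop runs n times and the inner loop m times
    count = 0
    for i in range(n):
        for j in range(m):
            count += 1
    return count

def halving_loop(n):
    # O(log n): the remaining range is cut in half on every iteration
    count = 0
    i = n
    while i > 1:
        i //= 2
        count += 1
    return count

print(nested_loops(10, 20))   # 200
print(halving_loop(1024))     # 10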
Example
• Imagine that the number of calculations is S(n)
• S(n) = 1 + 2 + ... + n

• What is the complexity?
Example
• S(n) = 1 + 2 + ... + n < n + n + ... + n = n^2
• So S(n) = O(n^2)
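As a side note (added, not on the original slide), the sum also has an exact closed form that gives the same bound: S(n) = 1 + 2 + ... + n = n(n + 1)/2 <= n^2, hence S(n) = O(n^2).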
[A sequence of worked-example slides here was image-only; their content was not extracted.]
Example
• f(n) = n log(n!) + (3 + 2n) log n
Logarithms
[The content of this slide was an image and was not extracted.]
Example
• f(n) = n log(n!) + (3 + 2n) log n

• log(n!) = O(n log n)
• n log(n!) = O(n^2 log n)
• (3 + 2n) = O(n)
• (3 + 2n) log n = O(n log n)

• Therefore f(n) = O(n^2 log n)
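A one-line justification (added here, not on the original slide) for the estimate of log(n!): log(n!) = log 1 + log 2 + ... + log n <= n log n, since each of the n terms is at most log n; hence log(n!) = O(n log n).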
Example
• f(n) = (n + 3) log(n^2 + 4) + 5n^2
Example
• f(n) = (n + 3) log(n^2 + 4) + 5n^2

• n + 3 = O(n)
• log(n^2 + 4) = O(log n): for n > 2, log(n^2 + 4) < log(2n^2) = log 2 + log n^2 = log 2 + 2 log n < 3 log n
• (n + 3) log(n^2 + 4) = O(n log n)
• 5n^2 = O(n^2)
• f(n) = O(max{n log n, n^2}) = O(n^2)
Example
• f(x) = 2^x + 23
Example
• f(x) = 2^x + 23

• For x > 5 we have 23 < 2^x, so f(x) < 2 × 2^x
• Hence f(x) = O(2^x)
• Moreover, 2^x < f(x) for all x > 0
• Therefore O(2^x) is the tightest estimate for f(x) (in other words, 2^x is of the same order as f(x))
Example

int sum = 0;
for (int i=1; i<n; i=i*2)
{
sum++;
}

Example
int sum = 0;
for (int i=1; i<n; i=i*2) {
sum++;
}
• It is clear that sum is incremented only when i = 1, 2, 4, 8, …, 2^k where k ≈ log2 n
• There are k+1 iterations, so the complexity is O(k), i.e., O(log n)

Note:
• In Computer Science, log n means log2 n.
• When 2 is replaced by 10 in the 'for' loop, the complexity is O(log10 n), which is the same as O(log2 n).
• log10 n = log2 n / log2 10
Example

Let's assume that n is some power of 3.

int sum = 0;
for (int i=1; i<n; i=i*3)
{
for (int j=1; j<=i; j++) {
sum++;
}
}
Example
Let's assume that n is some power of 3.
int sum = 0;
for (int i=1; i<n; i=i*3) {
for (int j=1; j<=i; j++) {
sum++;
}
}
• f(n) = 1 + 3 + 9 + 27 + … + 3^(log3 n)
= 1 + 3 + … + n/9 + n/3 + n
= n + n/3 + n/9 + … + 3 + 1 (reversing the terms in the previous step)
= n * (1 + 1/3 + 1/9 + …)
≤ n * (3/2)
= 3n/2
= O(n)
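For reference (added, not on the original slide), the bound on the bracketed series comes from the infinite geometric sum: 1 + 1/3 + 1/9 + … = 1 / (1 - 1/3) = 3/2.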
Example
def BinarySearch(el, a, l=0, r=None):
    # recursive binary search on a sorted list a
    if r is None:
        r = len(a) - 1
    if l > r:
        return -1                # el is not in a
    m = (l + r) // 2
    if el < a[m]:
        return BinarySearch(el, a, l, m - 1)
    elif el > a[m]:
        return BinarySearch(el, a, m + 1, r)
    else:
        return m

Input:
a = [4, 5, 7, 1, 3, 9, 12]
a = sorted(a)
BinarySearch(9, a)

Comparisons? Best case? Worst case? -> O( )
Example
Consider the complexity of the algorithm, assuming there are n = 2^k elements.
def BinarySearch(x, a):
    first = 0
    last = len(a) - 1
    found = False
    while (first <= last and not found):
        index = (first + last) // 2
        if (x == a[index]): found = True
        elif (x < a[index]): last = index - 1
        else: first = index + 1
    if (not found): index = -1
    return index

a = [4, 5, 7, 1, 3, 9, 12]
a = sorted(a)
print(BinarySearch(9, a))
Example
Consider the complexity of the algorithm, assuming there are n = 2^k elements.
def BinarySearch(x, a):
    first = 0
    last = len(a) - 1
    found = False
    while (first <= last and not found):
        index = (first + last) // 2
        if (x == a[index]): found = True
        elif (x < a[index]): last = index - 1
        else: first = index + 1
    if (not found): index = -1
    return index

a = [4, 5, 7, 1, 3, 9, 12]
a = sorted(a)
print(BinarySearch(9, a))

• The maximum number of comparisons is 2k + 1 = 2 log2 n + 1.
• Hence the complexity is O(log n), i.e., logarithmic complexity.
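Equivalently (an added remark, not from the slides): each pass of the while loop halves the remaining search range, so the number of comparisons satisfies T(n) = T(n/2) + O(1), which solves to T(n) = O(log n).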
Example
public static int USCLN(int a, int b) {
    int x = a;
    int y = b;
    while (y > 0) {
        int r = x % y;
        x = y;
        y = r;
    }
    return x;
}

Example
public static int USCLN(int a, int b) {
    int x = a;
    int y = b;
    while (y > 0) {
        int r = x % y;
        x = y;
        y = r;
    }
    return x;
}

Lamé's Theorem:
Let a and b be positive integers with a >= b. The number of divisions needed to find USCLN(a, b) (the greatest common divisor) is less than or equal to five times the number of decimal digits of b; in other words, it is O(log b), i.e., O(log n).
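As a quick, hedged illustration (added, not from the slides), one can count the division steps of Euclid's algorithm and compare them with Lamé's bound; the helper name count_divisions is invented for this sketch:

def count_divisions(a, b):
    # count the modulo (division) steps performed by Euclid's algorithm
    steps = 0
    x, y = a, b
    while y > 0:
        x, y = y, x % y
        steps += 1
    return steps

# Consecutive Fibonacci numbers are the worst case for Euclid's algorithm:
print(count_divisions(610, 377))   # 13 steps; b = 377 has 3 digits, so Lamé's bound is 5*3 = 15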
Example
Work out the computational complexity of the
following piece of code.
for ( i=1; i < n; i *= 2 ) {
for ( j = n; j > 0; j /= 2 ) {
for ( k = j; k < n; k += 2 ) {
sum += (i + j * k );
}
}
}
Example
Work out the computational complexity of the following piece of
code.
for ( i=1; i < n; i *= 2 ) {
for ( j = n; j > 0; j /= 2 ) {
for ( k = j; k < n; k += 2 ) {
sum += (i + j * k );
}
}
}
The running times of the inner, middle, and outer loops are proportional to n, log n, and log n, respectively. Thus the overall Big-Oh complexity is O(n (log n)^2).
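A small sanity check (added, not from the slides): the iterations of the innermost statement can be counted empirically and compared against n (log n)^2.

import math

def count_iterations(n):
    # count how many times the innermost statement executes
    count = 0
    i = 1
    while i < n:
        j = n
        while j > 0:
            k = j
            while k < n:
                count += 1
                k += 2
            j //= 2
        i *= 2
    return count

for n in (2**8, 2**10, 2**12):
    print(n, count_iterations(n), round(n * math.log2(n) ** 2))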
Analysis of Different Cases
Worst-Case Analysis
– Interested in the worst-case behaviour.
– A determination of the maximum amount of time that an algorithm
requires to solve problems of size n

Best-Case Analysis
– Interested in the best-case behaviour
– Usually not very informative on its own

Average-Case Analysis
– A determination of the average amount of time that an algorithm requires
to solve problems of size n
– Have to know the probability distribution
– The hardest
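As an illustration (added, not from the slides), sequential search exhibits all three cases; for the average case assume the key is present and equally likely to be at any position:

def sequential_search(a, key):
    # best case: key at position 0 -> 1 comparison, O(1)
    # worst case: key absent -> n comparisons, O(n)
    # average case (key present, uniform positions): about (n + 1) / 2 comparisons
    for i, value in enumerate(a):
        if value == key:
            return i
    return -1

print(sequential_search([4, 5, 7, 1, 3, 9, 12], 9))   # 5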
Element uniqueness problem

• Input size
• Basic operation
• Best case
• Worst case – summation for C(n)
Element uniqueness problem
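A hedged sketch (not the slide's own code) of the standard brute-force algorithm usually analyzed here; the comparison x[i] == x[j] is the basic operation, and in the worst case it executes C(n) = n(n-1)/2 = O(n^2) times:

def unique_elements(x):
    # brute-force element uniqueness check
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if x[i] == x[j]:         # basic operation: comparison
                return False         # best case: an early duplicate is found
    return True                      # worst case: all n(n-1)/2 comparisons are made

print(unique_elements([4, 5, 7, 1, 3, 9, 12]))   # True
print(unique_elements([4, 5, 4]))                # False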
Matrix multiplication

• Input size
• Basic operation
• Best case
• Worst case – summation for C(n)
numpy
import numpy as np
x = np.array([[1, 1, 1], [1, 4, 6]])               # 2x3 matrix
y = np.array([[1, 1, 1], [1, 1, 1], [1, 1, 1]])    # 3x3 matrix
print(x @ y)              # matrix product (a 2x3 result)
print(np.matmul(x, y))    # the same matrix product
Matrix multiplication
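A hedged sketch (added, not from the slides) of the classical algorithm whose analysis the slide refers to: three nested loops with the scalar multiplication as the basic operation, executed n^3 times for two n x n matrices:

def matrix_multiply(A, B):
    # classical O(n^3) multiplication of two n x n matrices
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]   # basic operation: multiplication
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_multiply(A, B))   # [[19, 22], [43, 50]]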
Gaussian elimination

• Input size
• Basic operation
• Best case
• Worst case – summation for C(n)
Gaussian elimination
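A hedged sketch (added, not from the slides) of forward elimination, the part of Gaussian elimination usually analyzed: the innermost multiplication is the basic operation and it executes on the order of n^3 times (about n^3/3 multiplications):

def forward_elimination(A):
    # reduce an n x (n+1) augmented matrix A to upper-triangular form in place
    n = len(A)
    for i in range(n - 1):
        for j in range(i + 1, n):
            factor = A[j][i] / A[i][i]          # assumes nonzero pivots (no pivoting in this sketch)
            for k in range(i, n + 1):
                A[j][k] -= factor * A[i][k]     # basic operation: multiplication
    return A

# Example: the system 2x + y = 5, x + 3y = 10 as an augmented matrix
A = [[2.0, 1.0, 5.0],
     [1.0, 3.0, 10.0]]
print(forward_elimination(A))   # [[2.0, 1.0, 5.0], [0.0, 2.5, 7.5]]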
