
Correction of Worksheet -1-

Exercise 1:
To answer this question, you must know some basic comparison scales. For example, as $n \to \infty$, for constants $a > 1$, $b > 1$, and any real $d$, we have:

$\log^d n \;<\; n \;<\; n\log n \;<\; n^a \;<\; b^n \;<\; n! \;<\; n^n$
You also have some allowed operations, for example:

• $\log x^y = y \log x$
• $\log(xy) = \log x + \log y$
• $\sqrt{n} = n^{1/2}$
• Any polynomial dominates any logarithm: in general, constant powers of $n$ (e.g. $n^a$) come after expressions with only log factors. Even a small constant power of $n$ grows faster than an expression with only logs, so $\sqrt{n} = n^{1/2}$ grows faster than $\log^3 n$.
• If $a > 1$ is a real constant and $1 < n < n'$, then $a^n < a^{n'}$: in general, expressions with some function of $n$ in the exponent come last. These are called "exponentials"; just compare the functions in the exponents. For example, since $\sqrt{n}$ grows faster than $\log n$, $2^{\sqrt{n}}$ grows faster than $2^{\log n}$.

It is obvious that $n! < (n+1)!$.


$3^{n+1} = 3 \times 3^n = O(3^n)$ (removing constant factors)
Since we have $\log n < n$:
by taking the log of both sides, we get $\log\log n < \log n$.
Replacing $n$ by $\sqrt{n}$ in $\log n < n$ gives $\log\sqrt{n} < \sqrt{n}$; note that $\log\sqrt{n} = \log n^{1/2} = \frac{1}{2}\log n = O(\log n)$.
$\log(n^2 + 2n^3) \approx \log(2n^3) = \log 2 + 3\log n = O(\log n)$ (removing constant factors and lower-order terms)

Functions by increasing asymptotic growth rate:


1) $\log\log n = O(\log\log n)$
2) $\log\sqrt{n} = O(\log n)$
3) $\log(n^2 + 2n^3) = O(\log n)$
4) $\sqrt{n} = O(n^{1/2})$
5) $n^2 + n(\log n)^3 = O(n^2)$
6) $4n^2 + 2n^3 = O(n^3)$
7) $3^{n+1} = O(3^n)$
8) $n! = O(n!)$
9) $(n+1)! = O((n+1)!)$
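As a quick numeric sanity check of this ranking (a sketch of mine, not part of the original worksheet), the snippet below evaluates $\ln f(n)$ for each function at $n = 1000$. Comparing in log-space preserves the ordering (since $\ln$ is increasing) and keeps $3^{n+1}$, $n!$ and $(n+1)!$ from overflowing:

```python
import math

def log_growth_values(n):
    """Return ln(f(n)) for each function, in the claimed increasing order."""
    ln = math.log
    return [
        ("loglog n",         ln(ln(ln(n)))),
        ("log sqrt(n)",      ln(0.5 * ln(n))),
        ("log(n^2 + 2n^3)",  ln(ln(n**2 + 2 * n**3))),
        ("sqrt(n)",          0.5 * ln(n)),
        ("n^2 + n(log n)^3", ln(n**2 + n * ln(n)**3)),
        ("4n^2 + 2n^3",      ln(4 * n**2 + 2 * n**3)),
        ("3^(n+1)",          (n + 1) * ln(3)),
        ("n!",               math.lgamma(n + 1)),   # lgamma(n+1) = ln(n!)
        ("(n+1)!",           math.lgamma(n + 2)),
    ]

n = 1000
vals = log_growth_values(n)
for name, v in vals:
    print(f"{name:>18}: ln(f({n})) = {v:.2f}")
# The values come out strictly increasing, matching the ranking above.
assert all(a[1] < b[1] for a, b in zip(vals, vals[1:]))
```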

Exercise 2:

• $f_1(n) = O(n^2)$ and $f_2(n) = O(n)$, then $f_1(f_2(n)) = O(n^3)$ ➔ True.
1) We have $f_1(n) = O(n^2)$, so by the formal definition of big-O notation there are a real constant $c_1 > 0$ and an integer constant $n_{01} \ge 1$ such that $f_1(n) \le c_1 n^2$ for $n \ge n_{01}$.
2) Similarly, we have $f_2(n) = O(n)$, so there are a real constant $c_2 > 0$ and an integer constant $n_{02} \ge 1$ such that $f_2(n) \le c_2 n$ for $n \ge n_{02}$.
Applying $f_1$ to both sides (assuming $f_1$ is non-decreasing): $f_1(f_2(n)) \le f_1(c_2 n)$, for $n \ge n_{02}$.
Based on 1), we can write $f_1(f_2(n)) \le f_1(c_2 n) \le c_1 (c_2 n)^2$, for $n \ge n_{02}$ and $c_2 n \ge n_{01}$, i.e. $n \ge \frac{n_{01}}{c_2}$.
➔ $f_1(f_2(n)) \le c_1 c_2^2 n^2$; we can choose $c = c_1 c_2^2 > 0$ and $n_0 = \max\left(n_{02}, \left\lceil \frac{n_{01}}{c_2} \right\rceil\right)$.
Finally, by the formal definition of big-O notation, $f_1(f_2(n)) = O(n^2)$, and since any function that is $O(n^2)$ is also $O(n^3)$, we get $f_1(f_2(n)) = O(n^3)$, although $O(n^3)$ is not a tight bound.
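To make the constants concrete, here is a small check with functions of my own choosing (hypothetical, not from the worksheet): $f_1(n) = 2n^2 + 3$ satisfies 1) with $c_1 = 5$, $n_{01} = 1$, and $f_2(n) = 5n$ satisfies 2) with $c_2 = 5$, $n_{02} = 1$, so the proof predicts $f_1(f_2(n)) \le 125\,n^2$ for all $n \ge 1$:

```python
# Hypothetical concrete functions illustrating the proof's constants
# (these f1, f2 are my own choices, not from the worksheet).
def f1(n):                # f1(n) = O(n^2): f1(n) <= c1*n^2 with c1=5, n01=1
    return 2 * n * n + 3

def f2(n):                # f2(n) = O(n): f2(n) <= c2*n with c2=5, n02=1
    return 5 * n

c1, c2 = 5, 5
c = c1 * c2 ** 2          # c = c1 * c2^2 = 125, as derived in the proof
n0 = 1                    # n0 = max(n02, ceil(n01 / c2)) = 1 here

# f1(f2(n)) <= c * n^2 should hold for every n >= n0.
assert all(f1(f2(n)) <= c * n * n for n in range(n0, 10_000))
print("f1(f2(n)) <= 125 * n^2 verified for n in [1, 10000)")
```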

• $f_1(n) f_2(n) = O(\max(f_1(n), f_2(n)))$ ➔ False.

Counterexample: for $f_1(n) = n$ and $f_2(n) = n^2$,
we have $f_1(n) f_2(n) = n^3$ and $O(\max(f_1(n), f_2(n))) = O(n^2)$.
It is obvious that $n^3$ is not $O(n^2)$, since $n^3$ dominates $n^2$.
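A quick numeric illustration of why no constant can work (my own sketch, under the counterexample above): for any candidate constant $c$, we already have $n^3 > c\,n^2$ as soon as $n > c$, so no pair $(c, n_0)$ can satisfy the big-O definition:

```python
# For any candidate constant c, n^3 > c * n^2 as soon as n > c,
# so no (c, n0) pair witnesses n^3 = O(n^2).
for c in (1, 10, 1_000, 1_000_000):
    n = c + 1
    assert n**3 > c * n**2
    print(f"c = {c:>9}: n^3 exceeds c*n^2 already at n = {n}")
```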

Exercise 3:
Exercise 4:
Initially, [5, 3, 8, 1, 9, 2, 11, 6]
For i = 1, [3, 5, 8, 1, 9, 2, 11, 6]
For i = 2, [3, 5, 8, 1, 9, 2, 11, 6]
For i = 3, [1, 3, 5, 8, 9, 2, 11, 6]
For i = 4, [1, 3, 5, 8, 9, 2, 11, 6]
For i = 5, [1, 2, 3, 5, 8, 9, 11, 6]
For i = 6, [1, 2, 3, 5, 8, 9, 11, 6]
For i = 7, [1, 2, 3, 5, 6, 8, 9, 11]
The worst-case time complexity is $O(n^2)$.
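The worksheet's code for Exercise 4 is not reproduced above, but the pass-by-pass states match insertion sort (at pass $i$, the element at index $i$ is inserted into the already-sorted prefix). A minimal sketch under that assumption, which reprints the exact trace:

```python
def sort_with_trace(a):
    """Insertion sort, printing the array after each pass.

    Assumes the worksheet's algorithm is insertion sort: the states
    printed here reproduce the trace above exactly.
    """
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements of the sorted prefix a[0..i-1] that are
        # greater than key one position to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
        print(f"For i = {i}, {a}")
    return a

sort_with_trace([5, 3, 8, 1, 9, 2, 11, 6])
# Worst case (reverse-sorted input): the inner while loop runs i times
# for each i, giving 1 + 2 + ... + (n-1) = O(n^2) comparisons.
```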

Exercise 5:
It is easier to find an element in a sorted array, so it is recommended to start with question 2).
Here I provide two possible solutions.
A better algorithm with time complexity $O(n)$, proposed by Ons Kharrat (Sophomore 2):
Exercise 6:
In the worst case, we have the following time complexities in big-O notation:
Linear search ➔ $O(n)$
Binary search ➔ $O(\log n)$
Bubble sort ➔ $O(n^2)$
1) Sorting the array with bubble sort and then doing the searches using binary search:
We will sort the array once using bubble sort, then perform the binary search 100 times. With $n = 10^6$ elements, so that $\log_2 n \approx 20$:
$f_1(n) = O(n^2) + 100 \times O(\log n) \approx (10^6)^2 + 10^2 \times 20 = 10^{12} + 2 \times 10^3$

2) Just doing the searches with linear search (without sorting):

$f_2(n) = 100 \times O(n) \approx 10^2 \times 10^6 = 10^8$

➔ $f_2(n) \ll f_1(n)$

Finally, it is clear that the second possibility (linear search without sorting) requires less work and time than the first one.
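A small sketch of this back-of-the-envelope comparison (assuming, as in the estimate above, $n = 10^6$ elements and 100 searches; the constants in real running times would of course differ):

```python
import math

n = 10**6        # array size assumed by the estimate above
searches = 100   # number of lookups to perform

# Option 1: one bubble sort (~n^2 steps), then binary searches (~log2 n each).
cost_sort_then_binary = n**2 + searches * math.log2(n)

# Option 2: linear searches only (~n steps each), no sorting.
cost_linear_only = searches * n

print(f"sort + binary search: ~{cost_sort_then_binary:.3g} steps")  # ~1e12
print(f"linear search only:   ~{cost_linear_only:.3g} steps")       # ~1e8
# The bubble sort term dominates: option 2 wins by a factor of ~10^4.
```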
