Lecture 1
CS3610
1. Introduction

🡪 The Fibonacci sequence is defined by F(0) = 0, F(1) = 1, and
F(n) = F(n-1) + F(n-2) for n ≥ 2.
🡪 Example: computing the n-th Fibonacci number, implemented below in three
ways: iteratively, with an array, and recursively.
2. Time complexity
🡪 Approach 1 – iterative:

int fibonacci_iterative(int n)
{
    if (n <= 1) {
        return n;
    }
    int a = 0;
    int b = 1;
    int c = 0;
    for (int i = 2; i <= n; i++)
    {
        c = a + b;   // next term from the previous two
        a = b;
        b = c;
    }
    return c;
}
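🡪 For example, fibonacci_iterative(5) returns 5, since the sequence begins
0, 1, 1, 2, 3, 5.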
🡪 Approach 2 – array:

int fibonacci_array(int n)           // function name assumed; the slide shows only the body
{
    if (n <= 1) return n;            // guard keeps the writes below in bounds
    std::vector<int> f(n + 1);       // needs #include <vector>; declaration not on the slide
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++)
        f[i] = f[i - 1] + f[i - 2];
    return f[n];
}
🡪 Approach 3 – recursive:

int fibonacci_recursive(int n)
{
    if (n <= 1) {
        return n;
    } else {
        return fibonacci_recursive(n - 1) + fibonacci_recursive(n - 2);
    }
}
🡪 The recursive algorithm is slow since it keeps recomputing the same
subproblems over and over again!
🡪 Example for n = 5: the call tree computes F(3) twice, F(2) three times, and
F(1) five times, for 15 calls in total.
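🡪 A minimal sketch (not from the slides) that instruments fibonacci_recursive
with a call counter, making the recomputation for n = 5 visible:

#include <iostream>

int calls[6] = {0};   // calls[k] counts how often F(k) is computed (illustrative only)

int fibonacci_recursive(int n)
{
    calls[n]++;        // record this invocation
    if (n <= 1) {
        return n;
    } else {
        return fibonacci_recursive(n - 1) + fibonacci_recursive(n - 2);
    }
}

int main()
{
    fibonacci_recursive(5);
    for (int i = 0; i <= 5; i++)   // prints counts 3, 5, 3, 2, 1, 1 for F(0)..F(5)
        std::cout << "F(" << i << ") computed " << calls[i] << " times\n";
    return 0;
}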
3. Space complexity

🡪 Auxiliary space is the extra or temporary space used by the algorithm
during its execution.
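🡪 For example, the iterative version below needs only three extra int
variables, while the array version allocates n + 1 entries.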
Space complexity for Fibonacci – approach 1
int fibonacci_iterative(int n)
{
    if (n <= 1) {
        return n;
    }
    int a = 0;   // only three extra int variables:
    int b = 1;   // constant auxiliary space
    int c = 0;
    for (int i = 2; i <= n; i++)
    {
        c = a + b;
        a = b;
        b = c;
    }
    return c;
}
Space complexity for Fibonacci – approach 2

int fibonacci_array(int n)           // function name assumed, as above
{
    if (n <= 1) return n;
    std::vector<int> f(n + 1);       // n + 1 entries: linear auxiliary space
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++)
        f[i] = f[i - 1] + f[i - 2];
    return f[n];
}
Space complexity for Fibonacci – approach 3

int fibonacci_recursive(int n)
{
    if (n <= 1) {
        return n;
    } else {
        // each pending call keeps a frame on the call stack
        return fibonacci_recursive(n - 1) + fibonacci_recursive(n - 2);
    }
}
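🡪 Although the recursive version allocates no array, the chain of pending
calls keeps up to n frames on the stack at once, which is why its memory count
in the table below is still linear.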
Approach     Time       Space
Array        9n + 6     4n + 12
Recursive    ~2^n       4n + 4

Time complexity                            Space complexity
Amount of time needed                      Amount of memory needed
Counts time for all statements             Counts memory for all variables (even the input)
More important for solution optimization   Less important with modern hardware
4. Asymptotic notation

LinearSearch(A, key)
    i ← 1
    while i ≤ n and A[i] != key
        do i++
    if i ≤ n
        then return true
        else return false
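🡪 A direct C++ rendering of the pseudocode above (function and parameter
names are assumed):

#include <cstddef>
#include <vector>

// Returns true if key occurs in A, false otherwise; mirrors LinearSearch.
bool linearSearch(const std::vector<int>& A, int key)
{
    for (std::size_t i = 0; i < A.size(); i++) {
        if (A[i] == key) {
            return true;    // best case: found at the first position
        }
    }
    return false;           // worst case: all n elements were examined
}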
🡪 Worst-case complexity
🡪 The maximum number of steps the algorithm takes during execution
🡪 Average-case complexity
🡪 The average number of steps the algorithm takes, taken over all possible inputs
🡪 Best-case complexity
🡪 The minimum number of steps the algorithm takes during execution
🡪 Time taken to execute the algorithm in the worst case: T(n) = O(n)
🡪 Let f(n) and g(n) be two nonnegative functions indicating the running time of
two algorithms. We say
g(n) is an upper bound of f(n), written f(n) = O(g(n)),
🡪 if there exist positive constants c and n0
such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.
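🡪 For example, f(n) = 3n + 2 is O(n): choosing c = 4 and n0 = 2 gives
3n + 2 ≤ 4n for all n ≥ 2.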
🡪 Time taken to execute the algorithm in the best case: T(n) = Ω(1)
🡪 Let f(n) and g(n) be two nonnegative functions indicating the running time of
two algorithms. We say
g(n) is a lower bound of f(n), written f(n) = Ω(g(n)),
🡪 if there exist positive constants c and n0
such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0.
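🡪 For example, f(n) = 3n + 2 is Ω(n): choosing c = 1 and n0 = 1 gives
n ≤ 3n + 2 for all n ≥ 1.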
🡪 This notation is denoted by 'Θ' and is pronounced "Big Theta". Big Theta
notation defines a tight bound for the algorithm: its running time can be
neither asymptotically smaller nor asymptotically larger than this bound. We
use it to express the average-case complexity.
🡪 Time taken to execute the algorithm in the average case: T(n) = Θ(n)
🡪 Let f(n) and g(n) be two nonnegative functions indicating the
running time of two algorithms. We say
g(n) is a tight bound of f(n), written f(n) = Θ(g(n)),
🡪 if there exist positive constants c1, c2, and n0
such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
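🡪 For example, f(n) = 3n + 2 is Θ(n): with c1 = 1, c2 = 4, and n0 = 2,
we get n ≤ 3n + 2 ≤ 4n for all n ≥ 2.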
5. Common growth rates

🡪 O(1) means that the execution time does not depend on the input size.
🡪 Examples:
🡪 Accessing an index of an array
🡪 Inserting/deleting a value at the head of a linked list (see the sketch below)
🡪 Looping a constant number of times
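🡪 A sketch of the linked-list example (the node type and function name are
assumed): inserting at the head touches a fixed number of pointers, so the
length of the list does not matter:

struct Node {
    int value;
    Node* next;
};

Node* insertAtHead(Node* head, int value)
{
    return new Node{value, head};   // one allocation and one link: O(1) work
}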
🡪 O(n) means that the execution time increases at the same rate as the input.
🡪 Examples:
🡪 Traversing an array, e.g., the LinearSearch algorithm in Section 4
🡪 O(n^2) means that the execution time grows with the square of the input size.
🡪 Examples:
🡪 Selection sort (see the sketch below)
🡪 Building a multiplication table
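🡪 A minimal selection sort sketch (not from the slides): the nested loops
perform about n^2 / 2 comparisons, which is where the O(n^2) bound comes from:

#include <cstddef>
#include <utility>
#include <vector>

// Repeatedly select the smallest element of the unsorted suffix a[i..n-1]
// and swap it into position i.
void selectionSort(std::vector<int>& a)
{
    for (std::size_t i = 0; i + 1 < a.size(); i++) {
        std::size_t minIndex = i;                    // position of the smallest
        for (std::size_t j = i + 1; j < a.size(); j++) {
            if (a[j] < a[minIndex]) {
                minIndex = j;                        // element seen so far
            }
        }
        std::swap(a[i], a[minIndex]);                // move it into place
    }
}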