Recurrence and non-recurrence relations are two ways of describing the relationship between terms in a sequence. These concepts are often
encountered in various fields of mathematics, computer science, and engineering,
particularly in the study of sequences, algorithms, and dynamic processes.
For example, the Fibonacci sequence is defined by the recurrence relation
F(0) = 0
F(1) = 1
F(n) = F(n - 1) + F(n - 2) for n >= 2
In this example, each term is the sum of the two previous terms.
A non-recurrence (closed-form) relation instead computes a term directly. For example, the n-th term of an arithmetic sequence with first term a(1) and common difference d is
a(n) = a(1) + (n - 1) * d
Recurrence relations are often used when there's a natural progression or dependency
among the terms, making them suitable for describing processes that evolve over time.
Non-recurrence relations, on the other hand, are more appropriate when the terms don't
rely on previous terms and can be computed directly.
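As a concrete illustration, here is a minimal Python sketch (the function names are my own, not taken from the text): fibonacci builds each term from the two previous terms, following the recurrence, while arithmetic_term evaluates the non-recurrence formula a(n) = a(1) + (n - 1) * d directly.

def fibonacci(n):
    # Recurrence relation: each term is built from the two previous terms.
    if n < 2:
        return n                         # base cases F(0) = 0, F(1) = 1
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr   # F(i) = F(i-1) + F(i-2)
    return curr

def arithmetic_term(a1, d, n):
    # Non-recurrence relation: the n-th term is computed directly,
    # with no reference to earlier terms.
    return a1 + (n - 1) * d

print(fibonacci(10))                     # 55
print(arithmetic_term(3, 4, 10))         # 3 + 9 * 4 = 39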
Both types of relations have their applications in various areas. Recurrence relations are
commonly used in algorithm analysis, dynamic programming, and solving difference
equations, while non-recurrence relations are often seen in arithmetic and geometric
sequences, as well as in formulas for calculating interest, growth, and other non-iterative
processes.
Mathematical analysis of recursive and non-recursive algorithms involves evaluating their
time complexity and space complexity in terms of mathematical functions and expressions.
This analysis helps us understand how the algorithms perform in terms of time and memory
usage as the input size grows.
Common notations used for time complexity analysis include O-notation (big O), Ω-
notation (big Omega), and Θ-notation (big Theta), which provide upper, lower, and tight
bounds on the algorithm's growth rate, respectively.
Remember that analyzing both time and space complexity involves counting the dominant
operations or memory usage, which have the most significant impact on the overall
performance.
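For instance, in the following small Python sketch (my own illustration, not code from the text), the dominant operation of linear_search is the comparison inside its single loop, giving O(n) time, while has_duplicate performs on the order of n^2 comparisons in its nested loops, giving O(n^2) time; both use O(1) extra space.

def linear_search(items, target):
    # Dominant operation: the comparison in the single loop, executed at
    # most n times -> O(n) time, O(1) extra space.
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def has_duplicate(items):
    # Dominant operation: the comparison in the nested loops, executed about
    # n*(n-1)/2 times -> O(n^2) time, O(1) extra space.
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False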
The above algorithm divides the problem into a subproblems, each of size n/b, and solves them recursively to compute the overall solution. The extra work done for the problem is given by f(n), i.e., the time to create the subproblems and combine their results in the above procedure.
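The shape being described can be illustrated with a small, self-contained Python sketch (this is my own example, not the procedure referred to above): finding the maximum of a list splits the problem into a = 2 subproblems of size n/2 and combines their results with constant extra work, so f(n) = O(1) and the recurrence is T(n) = 2T(n/2) + O(1).

def max_by_divide_and_conquer(items, lo=0, hi=None):
    # a = 2 subproblems of size n/2, O(1) work to split and combine,
    # so f(n) = O(1) and the recurrence is T(n) = 2T(n/2) + O(1).
    if hi is None:
        hi = len(items)
    if hi - lo == 1:                     # base case: a single element
        return items[lo]
    mid = (lo + hi) // 2                 # divide into two subproblems
    left_max = max_by_divide_and_conquer(items, lo, mid)
    right_max = max_by_divide_and_conquer(items, mid, hi)
    return left_max if left_max >= right_max else right_max   # combine: O(1)

print(max_by_divide_and_conquer([3, 9, 1, 7, 5]))   # 9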
So, according to the Master Theorem, the runtime of the above algorithm can be expressed as follows. For a recurrence of the form T(n) = aT(n/b) + Θ(n^k log^p n), where a >= 1, b > 1, k >= 0 are constants and p is a real number:
1. if a > b^k, then T(n) = Θ(n^(log_b a))
2. if a = b^k, then
   (a) if p > -1, then T(n) = Θ(n^(log_b a) log^(p+1) n)
   (b) if p = -1, then T(n) = Θ(n^(log_b a) log log n)
   (c) if p < -1, then T(n) = Θ(n^(log_b a))
3. if a < b^k, then
   (a) if p >= 0, then T(n) = Θ(n^k log^p n)
   (b) if p < 0, then T(n) = Θ(n^k)
Example-1: T(n) = T(n/2) + O(1)
a = 1, b = 2, k = 0, p = 0
b^k = 1. So, a = b^k and p > -1 [Case 2.(a)]
T(n) = Θ(n^(log_b a) log^(p+1) n)
T(n) = Θ(log n)
Example-2: Merge Sort – T(n) = 2T(n/2) + O(n)
a = 2, b = 2, k = 1, p = 0
b^k = 2. So, a = b^k and p > -1 [Case 2.(a)]
T(n) = Θ(n^(log_b a) log^(p+1) n)
T(n) = Θ(n log n)
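As a sketch of where this recurrence comes from, here is a minimal Python version of merge sort (my own code, not taken from the text): it makes a = 2 recursive calls on halves of size n/2, and the merge step contributes the O(n) extra work f(n).

def merge_sort(items):
    # Two recursive calls on halves of size n/2, plus an O(n) merge:
    # T(n) = 2T(n/2) + O(n), which the Master Theorem resolves to Θ(n log n).
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):   # merge step: the O(n) work f(n)
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # [1, 2, 2, 3, 4, 5, 6, 7]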
Example-3: T(n) = 3T(n/2) + n^2
a = 3, b = 2, k = 2, p = 0
b^k = 4. So, a < b^k and p = 0 [Case 3.(a)]
T(n) = Θ(n^k log^p n)
T(n) = Θ(n^2)
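The case analysis used in these examples can be mechanized. The small Python sketch below (the function name and output format are my own choices, not from the text) takes the parameters a, b, k, p of a recurrence T(n) = aT(n/b) + Θ(n^k log^p n) and returns the bound predicted by the theorem; applied to the examples above it reproduces Θ(n log n) for Merge Sort and Θ(n^2) for Example-3.

import math

def advanced_master_theorem(a, b, k, p):
    # Returns a string describing Theta(T(n)) for T(n) = a*T(n/b) + Theta(n^k * log^p n).
    crit = math.log(a, b)              # log_b a, the critical exponent
    if a > b ** k:                     # Case 1
        return f"Theta(n^{crit:.3f})"
    if a == b ** k:                    # Case 2 (here n^(log_b a) = n^k)
        if p > -1:
            return f"Theta(n^{k} * log^{p + 1} n)"
        if p == -1:
            return f"Theta(n^{k} * log log n)"
        return f"Theta(n^{k})"
    if p >= 0:                         # Case 3
        return f"Theta(n^{k} * log^{p} n)"
    return f"Theta(n^{k})"

print(advanced_master_theorem(2, 2, 1, 0))   # Merge Sort: Theta(n^1 * log^1 n) = Theta(n log n)
print(advanced_master_theorem(3, 2, 2, 0))   # Example-3:  Theta(n^2 * log^0 n) = Theta(n^2)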
Here are some important points to keep in mind regarding the Master
Theorem:
1. Divide-and-conquer recurrences: The Master Theorem is specifically
designed to solve recurrence relations that arise in the analysis of divide-
and-conquer algorithms.
2. Form of the recurrence: The Master Theorem applies to recurrence
relations of the form T(n) = aT(n/b) + f(n), where a >= 1 and b > 1 are
constants, f(n) is an asymptotically positive function, and n is the size of the problem.
3. Time complexity: The Master Theorem gives a tight asymptotic (Θ) bound
on the solution of the recurrence by comparing the non-recursive work f(n)
with n^(log_b a); the dominant of the two determines the bound directly
from a, b, and f(n).
4. Advanced version: The advanced version of the Master Theorem provides
a more general form of the theorem that can handle recurrence relations
that are more complex than the basic form.
5. Limitations: The Master Theorem is not applicable to all recurrence
relations, and it may not always provide an exact solution to a given
recurrence.
6. Useful tool: Despite its limitations, the Master Theorem is a useful tool for
analyzing the time complexity of divide-and-conquer algorithms and
provides a good starting point for solving more complex recurrences.
7. Supplemented with other techniques: In some cases, the Master Theorem
may need to be supplemented with other techniques, such as the
substitution method or the iteration method, to completely solve a given
recurrence relation; a brief illustration of the substitution method follows below.
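As a brief illustration of the substitution method mentioned in point 7 (this is a standard textbook argument, not a derivation taken from the text above), consider again the Merge Sort recurrence T(n) = 2T(n/2) + n and guess that T(n) <= c * n * log n for some constant c > 0 and all sufficiently large n. Substituting the guess into the right-hand side gives:
T(n) <= 2 * (c * (n/2) * log(n/2)) + n
     = c * n * log n - c * n * log 2 + n
     <= c * n * log n, provided c >= 1 / log 2.
The guess is therefore consistent with the recurrence, confirming T(n) = O(n log n), the same bound the Master Theorem gives in Example-2.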