Recurrence Relations | A Complete Guide
Last Updated: 23 Jul, 2025
Have you ever wondered how to calculate the time complexity of algorithms like the Fibonacci series or Merge Sort, where the problem is solved by dividing it into subproblems? This is done by analyzing the recurrence relations of these algorithms. In this article, we will learn the basics of recurrence relations and how to analyze them.
What is a Recurrence Relation?
A recurrence relation is a mathematical expression that defines a sequence in terms of its previous terms. In the context of algorithmic analysis, it is often used to model the time complexity of recursive algorithms.
General form of a Recurrence Relation:
a_{n} = f(a_{n-1}, a_{n-2}, ..., a_{n-k})
where f is a function that defines the relationship between the current term and the previous terms.
Significance of Recurrence Relations in DSA:
Recurrence relations play a significant role in analyzing and optimizing the complexity of algorithms. A strong understanding of recurrence relations also goes a long way in developing problem-solving skills. Some of the common uses of recurrence relations are:
- Time Complexity Analysis
- Generalizing Divide and Conquer Algorithms
- Analyzing Recursive Algorithms
- Defining State and Transitions for Dynamic Programming.
Common Examples of Recurrence Relations:
- Fibonacci Numbers: F(n) = F(n-1) + F(n-2)
- Factorial: f(n) = n * f(n-1)
- Binary Search: T(n) = T(n/2) + c
- Merge Sort: T(n) = 2T(n/2) + cn
- Tower of Hanoi: T(n) = 2T(n-1) + 1
Types of Recurrence Relations:
Various types of Recurrence Relations are:
- Linear Recurrence Relations
- Divide and Conquer Recurrences
- Substitution Recurrences
- Homogeneous Recurrences
- Non-Homogeneous Recurrences
1. Linear Recurrence Relations:
The following is an example of a linear recurrence relation:
T(n) = T(n-1) + n for n > 0 and T(0) = 1
These types of recurrence relations can be easily solved using the substitution method.
For example,
T(n) = T(n-1) + n
     = T(n-2) + (n-1) + n
     = T(n-k) + (n-k+1) + ... + (n-1) + n
Substituting k = n, we get
T(n) = T(0) + 1 + 2 + ... + n = 1 + n(n+1)/2 = O(n^2)
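As a quick sanity check, here is a minimal Python sketch (illustrative, not part of the original article) that evaluates the recurrence directly and compares it with the closed form 1 + n(n+1)/2 derived above.

```python
# Minimal sketch: evaluate T(n) = T(n-1) + n with T(0) = 1 directly
# and compare with the closed form 1 + n(n+1)/2 derived above.

def T(n):
    # Iterative evaluation of the recurrence (avoids deep recursion).
    total = 1            # T(0) = 1
    for i in range(1, n + 1):
        total += i       # the "+ n" term added at each step
    return total

def closed_form(n):
    return 1 + n * (n + 1) // 2

for n in (0, 1, 5, 10, 1000):
    assert T(n) == closed_form(n)
print("closed form matches the recurrence for the sampled values")
```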
2. Divide and conquer recurrence relations:
The following are some examples of divide and conquer recurrence relations:
T(n) = 2T(n/2) + cn
T(n) = 2T(n/2) + √n
These types of recurrence relations can be easily solved using the Master Method.
For the recurrence relation T(n) = 2T(n/2) + cn, the values are a = 2, b = 2 and k = 1. Here log_b(a) = log_2(2) = 1 = k. Therefore, the complexity will be Θ(n log n).
Similarly, for the recurrence relation T(n) = 2T(n/2) + √n, the values are a = 2, b = 2 and k = 1/2. Here log_b(a) = log_2(2) = 1 > k. Therefore, the complexity will be Θ(n).
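These Master Method results can also be checked numerically. Below is a rough sketch (the function names T1 and T2 are illustrative) that evaluates both recurrences for powers of two and divides by the predicted bound; the ratios flatten out as n grows.

```python
# Sketch: evaluate the two divide-and-conquer recurrences numerically and
# compare their growth against the bounds given by the Master Method.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T1(n):            # T(n) = 2T(n/2) + n
    if n <= 1:
        return 1
    return 2 * T1(n // 2) + n

@lru_cache(maxsize=None)
def T2(n):            # T(n) = 2T(n/2) + sqrt(n)
    if n <= 1:
        return 1
    return 2 * T2(n // 2) + math.sqrt(n)

for n in (2**10, 2**15, 2**20):
    print(n, T1(n) / (n * math.log2(n)), T2(n) / n)
# The first ratio stays roughly constant, consistent with Θ(n log n);
# the second also settles near a constant, consistent with Θ(n).
```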
3. Substitution Recurrences:
Sometimes, a recurrence relation cannot be solved directly using techniques like the substitution, recurrence tree, or master method. In such cases, we need to convert the recurrence relation into a suitable form before solving it. For example,
T(n) = T(√n) + 1
To solve this type of recurrence, substitute n = 2^m:
T(2^m) = T(2^(m/2)) + 1
Let T(2^m) = S(m). Then S(m) = S(m/2) + 1
Solving by the master method, we get
S(m) = Θ(log m)
As n = 2^m, i.e., m = log_2(n),
T(n) = T(2^m) = S(m) = Θ(log m) = Θ(log log n)
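To see the Θ(log log n) behavior concretely, here is a small illustrative sketch (not from the original article) that repeatedly takes the integer square root, counts the steps, and compares the count with log2(log2(n)).

```python
# Sketch: count how many times we can take a square root before n drops
# below 2, and compare against log2(log2(n)).
import math

def T(n):
    steps = 0
    while n >= 2:
        n = math.isqrt(n)   # integer square root
        steps += 1
    return steps + 1        # +1 for the base case, matching T(n) = T(sqrt(n)) + 1

for n in (2**16, 2**64, 2**256):
    print(n.bit_length() - 1, T(n), math.log2(math.log2(n)))
# The step count grows like log log n, as derived above.
```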
4. Homogeneous Recurrence Relations:
A homogeneous recurrence relation is one in which every term depends only on the previous terms of the sequence, with no additional term that depends only on n; equivalently, when all terms are moved to one side, the equation equals zero. A homogeneous recurrence relation of order k is represented as:
a_{n} = f(a_{n-1}, a_{n-2}, ..., a_{n-k})
where f contains no extra term independent of the previous terms.
Example: a_{n} = 2*a_{n-1} - a_{n-2}
5. Non-Homogeneous Recurrence Relations:
A non-homogeneous recurrence relation is one in which the right-hand side is not equal to zero. It can be expressed as:
a_{n} = f(a_{n-1}, a_{n-2},...,a_{n-k}) + g(n)
where g(n) is a function that introduces a term not dependent on the previous terms. The presence of g(n) makes the recurrence non-homogeneous.
Example: a_{n} = 2*a_{n-1} - a_{n-2} + 3^n
In this case, the term 3^n on the right-hand side makes the recurrence non-homogeneous.
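For intuition, the following sketch generates the first few terms of both example recurrences; the initial values a_0 = 1, a_1 = 2 are arbitrary choices made here for illustration only. It shows how the extra g(n) = 3^n term changes the growth.

```python
# Sketch: generate terms of the homogeneous and non-homogeneous examples
# above from arbitrary initial values (a_0 = 1, a_1 = 2).

def homogeneous(n_terms, a0=1, a1=2):
    a = [a0, a1]
    for n in range(2, n_terms):
        a.append(2 * a[n - 1] - a[n - 2])          # a_n = 2a_{n-1} - a_{n-2}
    return a

def non_homogeneous(n_terms, a0=1, a1=2):
    a = [a0, a1]
    for n in range(2, n_terms):
        a.append(2 * a[n - 1] - a[n - 2] + 3**n)   # extra g(n) = 3^n term
    return a

print(homogeneous(8))      # grows linearly with these initial values
print(non_homogeneous(8))  # quickly dominated by the 3^n term
```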
Ways to Solve Recurrence Relations:
Here are the general steps to analyze the complexity of a recurrence relation:
- Substitute the input size into the recurrence relation to obtain a sequence of terms.
- Identify a pattern in the sequence of terms, if any, and simplify the recurrence relation to obtain a closed-form expression for the number of operations performed by the algorithm.
- Determine the order of growth of the closed-form expression by using techniques such as the Master Theorem, or by finding the dominant term and ignoring lower-order terms.
- Use the order of growth to determine the asymptotic upper bound on the running time of the algorithm, which can be expressed in terms of big O notation.
It’s important to note that the above steps are just a general outline and that the specific details of how to analyze the complexity of a recurrence relation can vary greatly depending on the specific recurrence relation being analyzed.
We have already discussed the analysis of loops. Many algorithms are recursive; when we analyze them, we get a recurrence relation for the time complexity: the running time on an input of size n is expressed in terms of n and the running time on inputs of smaller sizes. For example, in Merge Sort, to sort a given array we divide it into two halves, recursively sort each half, and finally merge the results. The time complexity of Merge Sort can therefore be written as T(n) = 2T(n/2) + cn. Many other algorithms, such as Binary Search and Tower of Hanoi, lead to similar recurrences.
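As a concrete illustration (a standard textbook-style implementation, not code from this article), here is a short Merge Sort sketch annotated with where each term of T(n) = 2T(n/2) + cn comes from.

```python
# Sketch: the recursive structure of Merge Sort, annotated with the
# terms of the recurrence T(n) = 2T(n/2) + cn.

def merge_sort(a):
    if len(a) <= 1:                      # base case: constant work
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])           # T(n/2)
    right = merge_sort(a[mid:])          # T(n/2)
    return merge(left, right)            # cn: merging two halves is linear

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 7]))       # [1, 2, 5, 7, 9]
```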
Overall, solving recurrences plays a crucial role in the analysis, design, and optimization of algorithms, and is an important topic in computer science.
There are mainly three ways of solving recurrences:
- Substitution Method
- Recurrence Tree Method
- Master Method
1. Substitution Method:
We make a guess for the solution and then use mathematical induction to prove whether the guess is correct.
For example, consider the recurrence T(n) = 2T(n/2) + n
We guess the solution to be T(n) = O(n log n). Now we use induction to prove our guess.
We need to prove that T(n) <= c·n·log n for some constant c > 0. We assume that the bound holds for all values smaller than n.
T(n) = 2T(n/2) + n
     <= 2·(c·(n/2)·log(n/2)) + n
     = c·n·log(n/2) + n
     = c·n·log n − c·n·log 2 + n
     = c·n·log n − c·n + n
     <= c·n·log n        (for any c >= 1, taking log base 2)
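The guess can also be checked numerically. Below is an illustrative sketch that evaluates T(n) = 2T(n/2) + n for powers of two and verifies T(n) <= c·n·log2(n) with c = 2; the particular constant is a choice made here, not taken from the article.

```python
# Sketch: empirically check the guessed bound T(n) <= c * n * log2(n)
# for T(n) = 2T(n/2) + n with T(1) = 1, using c = 2.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

c = 2
for n in (2**k for k in range(1, 20)):
    assert T(n) <= c * n * math.log2(n), n
print("the bound T(n) <= c*n*log2(n) holds for the sampled sizes")
```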
2. Recurrence Tree Method:
In this method, we draw a recurrence tree and calculate the time taken by every level of the tree. Finally, we sum the work done at all levels. To draw the recurrence tree, we start from the given recurrence and keep expanding until we find a pattern among the levels. The pattern is typically an arithmetic or geometric series.
Consider the recurrence relation, T(n) = T(n/4) + T(n/2) + cn^2
                 cn^2
               /      \
          T(n/4)      T(n/2)
If we further break down the expressions T(n/4) and T(n/2), we get the following recursion tree.
                      cn^2
                   /        \
           c(n^2)/16        c(n^2)/4
            /     \          /     \
       T(n/16)   T(n/8)  T(n/8)   T(n/4)
Breaking it down further gives us the following tree.
                            cn^2
                        /          \
                c(n^2)/16          c(n^2)/4
                /      \            /      \
      c(n^2)/256   c(n^2)/64   c(n^2)/64   c(n^2)/16
        /   \        /   \       /   \        /   \
      ...   ...    ...   ...   ...   ...    ...   ...
To know the value of T(n), we need to calculate the sum of the tree nodes level by level. Summing the above tree level by level, we get the series T(n) = cn^2 (1 + 5/16 + 25/256 + ...).
The above series is a geometric progression with ratio 5/16.
To get an upper bound, we can sum the infinite series. We get the sum as cn^2 / (1 − 5/16) = (16/11)·cn^2, which is O(n^2).
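The geometric-series bound can be checked with a small illustrative sketch: evaluate the recurrence directly (with c = 1, a choice made here) and observe that T(n)/n^2 stays below and close to 1/(1 − 5/16) = 16/11.

```python
# Sketch: evaluate T(n) = T(n/4) + T(n/2) + c*n^2 directly and check that
# T(n)/n^2 stays below the constant c / (1 - 5/16) = 16c/11 derived above.
from functools import lru_cache

c = 1.0

@lru_cache(maxsize=None)
def T(n):
    if n < 1:
        return 0
    return T(n // 4) + T(n // 2) + c * n * n

for n in (2**8, 2**12, 2**16, 2**20):
    print(n, T(n) / (n * n))   # ratios approach 16/11 ≈ 1.4545 from below
```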
3. Master Method:
Master Method is a direct way to get the solution. The master method works only for the following type of recurrences or for recurrences that can be transformed into the following type.
T(n) = aT(n/b) + f(n) where a >= 1 and b > 1
There are the following three cases:
- If f(n) = O(n^c) where c < log_b(a), then T(n) = Θ(n^(log_b(a)))
- If f(n) = Θ(n^c) where c = log_b(a), then T(n) = Θ(n^c · log n)
- If f(n) = Ω(n^c) where c > log_b(a), then T(n) = Θ(f(n))
How does this work?
The master method is mainly derived from the recurrence tree method. If we draw the recurrence tree of T(n) = aT(n/b) + f(n), we can see that the work done at the root is f(n) and the work done at all the leaves is Θ(n^(log_b(a))). The height of the recurrence tree is log_b(n).

In the recurrence tree method, we calculate the total work done. If the work done at the leaves is polynomially more, then the leaves are the dominant part, and the result is the work done at the leaves (Case 1). If the work done at the leaves and at the root is asymptotically the same, then the result is the height multiplied by the work done at any level (Case 2). If the work done at the root is asymptotically more, then the result is the work done at the root (Case 3).
Examples of some standard algorithms whose time complexity can be evaluated using the Master Method:
- Merge Sort: T(n) = 2T(n/2) + Θ(n). It falls in Case 2, as c = 1 and log_b(a) = 1. So the solution is Θ(n log n).
- Binary Search: T(n) = T(n/2) + Θ(1). It also falls in Case 2, as c = 0 and log_b(a) = 0. So the solution is Θ(log n).
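To make the three cases concrete, here is a small illustrative helper (not a standard library function) that, given a, b, and the exponent c in f(n) = Θ(n^c), reports which case applies. It deliberately ignores the regularity condition that a full statement of Case 3 requires.

```python
# Sketch: apply the three Master Method cases for T(n) = a*T(n/b) + Theta(n^c).
import math

def master_method(a, b, c):
    crit = math.log(a, b)                     # the critical exponent log_b(a)
    if math.isclose(c, crit):
        return f"Theta(n^{c:g} * log n)"      # Case 2: all levels contribute equally
    if c < crit:
        return f"Theta(n^{crit:g})"           # Case 1: leaves dominate
    return f"Theta(n^{c:g})"                  # Case 3: root dominates

print(master_method(2, 2, 1))    # Merge Sort      -> Theta(n^1 * log n)
print(master_method(1, 2, 0))    # Binary Search   -> Theta(n^0 * log n) = Theta(log n)
print(master_method(2, 2, 0.5))  # 2T(n/2)+sqrt(n) -> Theta(n^1)
```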