
Recursion

Data Structures and Algorithms


Fall24
Sophomore/HEC
Recursion

• Many natural and artificial phenomena share a fundamental
characteristic: they are self-similar across different scales
and can be described by a procedure that specifies a
repeated operation for producing the details.

11/4/2024 CS341: Recursion 2


Recursion
• Recursion: The definition of an operation in terms of itself.
• Solving a problem using recursion depends on solving smaller
occurrences of the same problem.

• Recursive programming: Writing methods that call themselves to solve problems


recursively.

• An equally powerful substitute for iteration (loops)


• Particularly well-suited to solving certain types of problems



WHY???
"Cultural experience" – think
differently about problems

Solves some problems more
naturally than iteration

Can lead to elegant, simple,
short code (when used well)

Many programming languages
("functional" languages such as
Scheme, ML, and Haskell) use
recursion exclusively (no loops)



Getting down stairs
Need to know two things:
• Getting down one stair
• Recognizing the bottom

Most code will look like:


if (simplest case) {
    compute and return solution
} else {
    divide into similar subproblem(s)
    solve each subproblem recursively
    assemble the overall solution
}



Recursion and cases
• Every recursive algorithm involves at least 2 cases:
• base case: A simple occurrence that can be answered directly.

• recursive case: A more complex occurrence of the problem that
cannot be directly answered but can instead be described in terms
of smaller occurrences of the same problem.
=> Some recursive algorithms have more than one base or recursive
case, but all have at least one of each.

A crucial part of recursive programming is identifying these cases.

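As a tiny illustration of the two cases (a made-up example, not from the slides), a method that sums the integers 1..n has one base case and one recursive case:

```java
public class Sum {
    // base case: n == 0 can be answered directly;
    // recursive case: sum(n) is described in terms of
    // the smaller occurrence sum(n - 1)
    public static int sum(int n) {
        if (n == 0) {
            return 0;               // base case
        } else {
            return n + sum(n - 1);  // recursive case
        }
    }

    public static void main(String[] args) {
        System.out.println(sum(5)); // 15
    }
}
```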


Exercise
• Write a recursive method reverseLines that accepts a file
Scanner and prints the lines of the file in reverse order.
• Example input file:    Expected console output:
  I have eaten           the icebox
  the plums              that were in
  that were in           the plums
  the icebox             I have eaten

• What are the cases to consider?


• How can we solve a small part of the problem at a time?
• What is a file that is very easy to reverse?



Tracing our algorithm
• call stack: The method invocations currently running

public static void reverseLines(Scanner input) {
    if (input.hasNextLine()) {
        String line = input.nextLine();
        reverseLines(input);       // recurse on the rest of the file first
        System.out.println(line);  // then print this line
    }
}

Initial call:
reverseLines(new Scanner("poem.txt"));

Each call reads one line ("I have eaten", "the plums", "that were in",
"the icebox") and pushes a new frame onto the call stack; the recursion
stops when hasNextLine() returns false, and the lines are printed in
reverse order as the calls return.

Input file:      Output:
I have eaten     the icebox
the plums        that were in
that were in     the plums
the icebox       I have eaten
Key Components of a Recursive Algorithm Design

1. What is the smaller, identical problem (or problems)?


Decomposition
2. How are the answers to smaller problems combined to form the
answer to the larger problem?
Composition
3. Which is the smallest problem that can be solved easily (without
further decomposition)?
Base/stopping case



Factorial (N!)

• N! = (N-1)! * N [for N > 1]


• 1! = 1
• 3!
= 2! * 3
= (1! * 2) * 3
= 1 * 2 * 3
• Recursive design:
• Decomposition: (N-1)!
• Composition: * N
• Base case: 1!

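The recursive design above maps directly onto code; a sketch (the analysis section later in this deck uses an equivalent method):

```java
public class Factorial {
    // base case: 1! (and 0!) = 1
    // decomposition: (n - 1)!   composition: * n
    public static long factorial(int n) {
        if (n <= 1) {
            return 1;
        }
        return factorial(n - 1) * n;
    }

    public static void main(String[] args) {
        System.out.println(factorial(3)); // 2! * 3 = 6
    }
}
```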


Fibonacci Numbers
• The Nth Fibonacci number is the sum of the previous two Fibonacci
numbers
• 0, 1, 1, 2, 3, 5, 8, 13, …
• Recursive Design:
• Decomposition & Composition
• fibonacci(n) = fibonacci(n-1) + fibonacci(n-2)
• Base case:
• fibonacci(1) = 0
• fibonacci(2) = 1



fibonacci Method
public static int fibonacci(int n)
{
int fib;
if (n > 2)
fib = fibonacci(n-1) + fibonacci(n-2);
else if (n == 2)
fib = 1;
else
fib = 0;
return fib;
}



Execution Trace (decomposition)
fibonacci(4)

fibonacci(3) fibonacci(2)



Execution Trace (decomposition)
fibonacci(4)

fibonacci(3) fibonacci(2)

fibonacci(2) fibonacci(1)



Execution Trace (composition)
fibonacci(4)
+

fibonacci(3) fibonacci(2)
+

fibonacci(2)->1 fibonacci(1)->0



Execution Trace (composition)
fibonacci(4)
+

fibonacci(3)->1 fibonacci(2)->1



Execution Trace (composition)
fibonacci(4)->2



Remember:
Key to Successful Recursion
• if-else statement (or some other branching
statement)
• Some branches: recursive call
• "smaller" arguments or solve "smaller" versions of
the same task (decomposition)
• Combine the results (composition) [if necessary]
• Other branches: no recursive calls
• stopping cases or base cases



Template
… method(…)
{
if ( … )// base case
{
}
else // decomposition & composition
{
}
return … ; // if not void method
}



Template (only one base case)
… method(…)
{
… result = … ; // base case

if ( … ) // not base case
{ // decomposition & composition
    result = …
}

return result;
}

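A concrete instance of this one-base-case template (an illustrative example, not from the slides): computing x^n for n >= 0:

```java
public class Power {
    // Follows the template: initialize result with the base-case
    // answer, then overwrite it in the non-base case via
    // decomposition (n - 1) and composition (* x).
    public static double power(double x, int n) {
        double result = 1.0;                   // base case: x^0 = 1
        if (n > 0) {                           // not base case
            result = x * power(x, n - 1);      // decomposition & composition
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(power(2.0, 10)); // 1024.0
    }
}
```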


Recursive Versus Iterative Methods
All recursive algorithms/methods
can be rewritten without recursion.

• Iterative methods use loops instead of recursion

• Iterative methods generally run faster and use less memory:
there is less overhead in keeping track of method calls

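As an illustration (example mine, not from the slides), a recursive string reversal and its iterative rewrite produce the same result, but the iterative version needs no extra call-stack frames:

```java
public class ReverseString {
    // recursive: base case is the empty string; otherwise append the
    // first character to the end of the reversed remainder
    public static String recursive(String s) {
        if (s.isEmpty()) {
            return s;
        }
        return recursive(s.substring(1)) + s.charAt(0);
    }

    // iterative rewrite: same result using a loop
    public static String iterative(String s) {
        StringBuilder sb = new StringBuilder();
        for (int i = s.length() - 1; i >= 0; i--) {
            sb.append(s.charAt(i));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(recursive("stressed")); // desserts
        System.out.println(iterative("stressed")); // desserts
    }
}
```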


So When Should You Use Recursion?
• Solutions/algorithms for some problems are inherently recursive
• iterative implementation could be more complicated
• When efficiency is less important
• it might make the code easier to understand
• Bottom line is about:
• Algorithm design
• Tradeoff between readability and efficiency



Pitfalls with recursion

public static double harmonic(int n) {
    return harmonic(n-1) + 1.0/n;
}
Missing Base Case



Pitfalls with recursion

public static double harmonic(int n) {
    if (n == 1) return 1.0;
    return harmonic(n) + 1.0/n;
}
No guarantee of convergence



Pitfalls with recursion

public static double harmonic(int n) {
    if (n == 0) return 0.0;
    return harmonic(n-1) + 1.0/n;
}
Excessive memory
requirements.



Pitfalls with recursion

// Warning: spectacularly inefficient.
public static long fibonacci(int n) {
    if (n == 0) return 0;
    if (n == 1) return 1;
    return fibonacci(n-1) + fibonacci(n-2);
}
Excessive recomputation

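One standard remedy for this pitfall, not shown on the slide, is memoization: cache each computed value so fibonacci(n) is evaluated only once per n, reducing the running time from exponential to linear. A sketch (class name Fib is illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class Fib {
    private static final Map<Integer, Long> memo = new HashMap<>();

    // Same recurrence as above, but each result is cached on first
    // computation, so every subproblem is solved exactly once.
    public static long fibonacci(int n) {
        if (n == 0) return 0;
        if (n == 1) return 1;
        Long cached = memo.get(n);
        if (cached != null) return cached;
        long result = fibonacci(n - 1) + fibonacci(n - 2);
        memo.put(n, result);
        return result;
    }

    public static void main(String[] args) {
        // fast; the naive version would take far longer for n = 50
        System.out.println(fibonacci(50));
    }
}
```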


Merge sort
• merge sort: Repeatedly divides the data in half, sorts each
half, and combines the sorted halves into a sorted whole.
The algorithm:
• Divide the list into two roughly equal halves.
• Sort the left half.
• Sort the right half.
• Merge the two sorted halves into one sorted list.

• An example of a "divide and conquer" algorithm.


• Invented by John von Neumann in 1945



Merge sort example
index 0 1 2 3 4 5 6 7
value 22 18 12 -4 58 7 31 42
split

22 18 12 -4 58 7 31 42
split split

22 18 12 -4 58 7 31 42
split split split split

22 18 12 -4 58 7 31 42
merge merge merge merge
18 22 -4 12 7 58 31 42
merge merge
-4 12 18 22 7 31 42 58
merge
-4 7 12 18 22 31 42 58
Merging sorted halves





Merge halves code
// Merges the left/right elements into a sorted result.
// Precondition: left/right are sorted
public static void merge(int[] result, int[] left,
int[] right) {
int i1 = 0; // index into left array
int i2 = 0; // index into right array
for (int i = 0; i < result.length; i++) {
if (i2 >= right.length ||
(i1 < left.length && left[i1] <= right[i2])) {
result[i] = left[i1]; // take from left
i1++;
        } else {
            result[i] = right[i2]; // take from right
            i2++;
        }
    }
}



Merge sort code
// Rearranges the elements of a into sorted order using
// the merge sort algorithm.
public static void mergeSort(int[] a) {
// split array into two halves
int[] left = Arrays.copyOfRange(a, 0, a.length/2);
int[] right = Arrays.copyOfRange(a, a.length/2, a.length);

// sort the two halves


...

// merge the sorted halves into a sorted whole


merge(a, left, right);
}



Merge sort code 2
// Rearranges the elements of a into sorted order using
// the merge sort algorithm (recursive).
public static void mergeSort(int[] a) {
if (a.length >= 2) {
// split array into two halves
int[] left = Arrays.copyOfRange(a, 0, a.length/2);
int[] right = Arrays.copyOfRange(a, a.length/2, a.length);

// sort the two halves


mergeSort(left);
mergeSort(right);

// merge the sorted halves into a sorted whole


merge(a, left, right);
}
}
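Putting the two methods above into one compilable sketch (the same merge and recursive mergeSort code as on the slides, gathered into a class with a small demo):

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Merges the sorted left/right arrays into result.
    public static void merge(int[] result, int[] left, int[] right) {
        int i1 = 0, i2 = 0;
        for (int i = 0; i < result.length; i++) {
            if (i2 >= right.length || (i1 < left.length && left[i1] <= right[i2])) {
                result[i] = left[i1++];   // take from left
            } else {
                result[i] = right[i2++];  // take from right
            }
        }
    }

    // Recursive merge sort, as on the slide.
    public static void mergeSort(int[] a) {
        if (a.length >= 2) {
            int[] left = Arrays.copyOfRange(a, 0, a.length / 2);
            int[] right = Arrays.copyOfRange(a, a.length / 2, a.length);
            mergeSort(left);
            mergeSort(right);
            merge(a, left, right);
        }
    }

    public static void main(String[] args) {
        int[] data = {22, 18, 12, -4, 58, 7, 31, 42}; // the slide's example
        mergeSort(data);
        System.out.println(Arrays.toString(data)); // [-4, 7, 12, 18, 22, 31, 42, 58]
    }
}
```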
Merge sort runtime
• What is the complexity class (Big-Oh) of merge sort?

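A sketch of the answer using the iteration method covered later in this deck, assuming the split and merge steps do linear work (constants $a$, $b$ as in the other analyses):

```latex
T(1) = a, \qquad T(n) = 2T(n/2) + bn
\\
T(n) = 2\left[2T(n/4) + b\tfrac{n}{2}\right] + bn = 4T(n/4) + 2bn
     = \cdots = 2^k T(n/2^k) + kbn
\\
\text{base case: } n/2^k = 1 \;\Rightarrow\; k = \log_2 n
\\
T(n) = an + bn\log_2 n \;\in\; O(n \log n)
```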


Binary recursion
• In binary recursion, the main function or algorithm makes two
recursive calls with modified or reduced arguments.
• These calls work on the two smaller subproblems created from the
original problem
• Base Case: Binary recursion, like any recursive approach, has one or
more base cases.
• Binary recursion is a powerful and efficient technique for solving a
wide range of problems, especially those with inherent binary
structures or characteristics.
• Binary search, merge sort, Fibonacci sequence



Divide-and-Conquer

• Divide the problem into a number of sub-problems


• Similar sub-problems of smaller size

• Conquer the sub-problems


• Solve the sub-problems recursively

• Sub-problem size small enough → solve the problems in a straightforward manner

• Combine the solutions of the sub-problems


• Obtain the solution for the original problem

Big ‘O’ of Recursion
1. Recurrence Relations:
1. Define a recurrence relation that represents the time complexity in terms of the input size.
2. Solve the recurrence relation to obtain a closed-form expression for the time complexity.
3. Common techniques for solving recurrence relations include substitution, recursion trees, and the
master theorem.
2. Recursion Trees:
1. Create a recursion tree that visualizes the recursive calls made by the algorithm.
2. Analyze the depth of the tree (number of recursive levels) and the number of nodes at each level.
3. Sum the work done at each level of the tree to determine the overall time complexity.
3. Master Theorem:
1. The master theorem is a specific method for analyzing the time complexity of divide-and-conquer
recursive algorithms.
2. It provides a general framework for identifying the time complexity in terms of big O notation based
on the structure of the recurrence relation.

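For reference, a common statement of the master theorem (supplied here for completeness, not taken from the slide), for recurrences of the divide-and-conquer form:

```latex
T(n) = aT(n/b) + f(n), \qquad a \ge 1,\; b > 1
\\
\text{1. If } f(n) = O\!\left(n^{\log_b a - \varepsilon}\right) \text{ for some } \varepsilon > 0:
\quad T(n) = \Theta\!\left(n^{\log_b a}\right)
\\
\text{2. If } f(n) = \Theta\!\left(n^{\log_b a}\right):
\quad T(n) = \Theta\!\left(n^{\log_b a} \log n\right)
\\
\text{3. If } f(n) = \Omega\!\left(n^{\log_b a + \varepsilon}\right) \text{ and }
a\,f(n/b) \le c\,f(n) \text{ for some } c < 1:
\quad T(n) = \Theta(f(n))
```

For example, merge sort's recurrence $T(n) = 2T(n/2) + \Theta(n)$ falls under case 2, giving $\Theta(n \log n)$.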
Solving Recurrence Relations - Iteration method
• Steps:
▪ Expand the recurrence
▪ Express the expansion as a summation by plugging the
recurrence back into itself until you see a pattern.
▪ Evaluate the summation
• In evaluating the summation, one or more of the following
summation formulae may be used:
• Arithmetic series: 1 + 2 + ... + n = n(n + 1)/2
• Special case of the geometric series (x = 2): 2^0 + 2^1 + ... + 2^k = 2^(k+1) - 1
• Geometric series: x^0 + x^1 + ... + x^k = (x^(k+1) - 1)/(x - 1), for x ≠ 1
Solving Recurrence Relations - Iteration method

• Harmonic series: 1 + 1/2 + 1/3 + ... + 1/n ≈ ln n



Analysis Of Recursive Factorial method
• Example1: Form and solve the recurrence relation
for the running time of factorial method and hence
determine its big-O complexity
long factorial (int n) {
if (n == 0)
return 1;
else
return n * factorial (n – 1);
}
Number of steps T(0) = c (1)
T(n) = b + T(n - 1) (2)
= b + b + T(n - 2) by substituting T(n – 1) in (2)
= b +b +b + T(n - 3) by substituting T(n – 2) in (2)

= kb + T(n - k)
The base case is reached when n – k = 0 → k = n, we then have:
T(n) = nb + T(n - n)
= bn + T(0)
= bn + c
Therefore the method factorial is O(n)
Analysis Of Recursive Binary Search
public int binarySearch (int target, int[] array,
int low, int high) {
if (low > high)
return -1;
else {
int middle = (low + high)/2;
if (array[middle] == target)
return middle;
else if(array[middle] < target)
return binarySearch(target, array, middle + 1, high);
else
return binarySearch(target, array, low, middle - 1);
}
}

• The recurrence relation for the running time of the method is:
T(1) = a if n = 1 (one-element array)
T(n) = T(n / 2) + b if n > 1
Analysis Of Recursive Binary Search
Without loss of generality, assume n, the problem size, is a power of 2, i.e., n = 2^k
Expanding:
T(1) = a (1)
T(n) = T(n / 2) + b (2)
= [T(n / 2^2) + b] + b = T(n / 2^2) + 2b by substituting T(n / 2) in (2)
= [T(n / 2^3) + b] + 2b = T(n / 2^3) + 3b by substituting T(n / 2^2) in (2)
= ...
= T(n / 2^k) + kb

The base case is reached when n / 2^k = 1 → n = 2^k → k = log2 n, we then have:

T(n) = T(1) + b log2 n
= a + b log2 n

Therefore, recursive binary search is O(log n)



Analysis Of Recursive Towers of Hanoi Algorithm

public static void hanoi(int n, char from, char to, char temp){
if (n == 1)
System.out.println(from + " --------> " + to);
else{
hanoi(n - 1, from, temp, to);
System.out.println(from + " --------> " + to);
hanoi(n - 1, temp, to, from);
}
}

• The recurrence relation for the running time of the method hanoi is:
T(n) = a if n = 1
T(n) = 2T(n - 1) + b if n > 1
Analysis Of Recursive Towers of Hanoi Algorithm (Cont’d)
Expanding:
T(1) = a (1)
T(n) = 2T(n – 1) + b if n > 1 (2)
= 2[2T(n – 2) + b] + b = 2^2 T(n – 2) + 2b + b by substituting T(n – 1) in (2)
= 2^2 [2T(n – 3) + b] + 2b + b = 2^3 T(n – 3) + 2^2 b + 2b + b by substituting T(n – 2) in (2)
= 2^3 [2T(n – 4) + b] + 2^2 b + 2b + b = 2^4 T(n – 4) + 2^3 b + 2^2 b + 2^1 b + 2^0 b by substituting T(n – 3) in (2)
= ...
= 2^k T(n – k) + b[2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0]

The base case is reached when n – k = 1 → k = n – 1, we then have:

T(n) = 2^(n-1) T(1) + b(2^(n-1) – 1)
= (a + b) 2^(n-1) – b

Therefore, the method hanoi is O(2^n)


Analysis Of Recursive Fibonacci
long fibonacci (int n) { // Recursively calculates Fibonacci number
if( n == 1 || n == 2)
return 1;
else
return fibonacci(n – 1) + fibonacci(n – 2);
}
T(n) = c if n = 1 or n = 2 (1)
T(n) = T(n – 1) + T(n – 2) + b if n > 2 (2)
We determine a lower bound on T(n):
Expanding: T(n) = T(n – 1) + T(n – 2) + b
≥ T(n – 2) + T(n – 2) + b
= 2T(n – 2) + b
= 2[T(n – 3) + T(n – 4) + b] + b by substituting T(n – 2) in (2)
≥ 2[T(n – 4) + T(n – 4) + b] + b
= 2^2 T(n – 4) + 2b + b
= 2^2 [T(n – 5) + T(n – 6) + b] + 2b + b by substituting T(n – 4) in (2)
≥ 2^3 T(n – 6) + (2^2 + 2^1 + 2^0)b
...
≥ 2^k T(n – 2k) + (2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0)b
= 2^k T(n – 2k) + (2^k – 1)b
The base case is reached when n – 2k = 2 → k = (n – 2)/2
Hence T(n) ≥ 2^((n – 2)/2) T(2) + [2^((n – 2)/2) – 1]b
= (b + c) 2^((n – 2)/2) – b
= [(b + c)/2] 2^(n/2) – b → recursive Fibonacci is exponential


Quicksort A[p…q] ≤ A[q+1…r]

• Sort an array A[p…r]


• Divide
• Partition the array A into 2 subarrays A[p..q] and A[q+1..r], such that each
element of A[p..q] is smaller than or equal to each element in A[q+1..r]
• Need to find index q to partition the array

Quicksort A[p…q] ≤ A[q+1…r]

• Conquer
• Recursively sort A[p..q] and A[q+1..r] using Quicksort
• Combine
• Trivial: the arrays are sorted in place
• No additional work is required to combine them
• The entire array is now sorted

Partition Implementation (Java)
static int Partition(int[] a, int left, int right) {
    int p = a[left];              // pivot: first element of the subarray
    int l = left + 1, r = right;
    while (l <= r) {              // l <= r so each scan runs at least
                                  // once, even on a two-element subarray
        while (l <= r && a[l] < p) l++;   // find an element >= pivot
        while (l <= r && a[r] >= p) r--;  // find an element < pivot
        if (l < r) {
            int temp = a[l]; a[l] = a[r]; a[r] = temp;
        }
    }
    a[left] = a[r];               // move pivot into its final position
    a[r] = p;
    return r;
}



Quicksort Implementation (Java)

static void Quicksort(int[] array, int left, int right) {
    if (left < right) {
        int p = Partition(array, left, right);
        Quicksort(array, left, p - 1);
        Quicksort(array, p + 1, right);
    }
}

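A self-contained usage sketch combining the two methods (the scan guards here use l <= r so that a two-element subarray is partitioned correctly, a small adjustment to the slide's version):

```java
import java.util.Arrays;

public class QuicksortDemo {
    // Partitions a[left..right] around the pivot a[left];
    // returns the pivot's final index.
    static int partition(int[] a, int left, int right) {
        int p = a[left];
        int l = left + 1, r = right;
        while (l <= r) {
            while (l <= r && a[l] < p) l++;   // element >= pivot from the left
            while (l <= r && a[r] >= p) r--;  // element < pivot from the right
            if (l < r) {
                int temp = a[l]; a[l] = a[r]; a[r] = temp;
            }
        }
        a[left] = a[r];  // pivot into its final position r
        a[r] = p;
        return r;
    }

    static void quicksort(int[] a, int left, int right) {
        if (left < right) {
            int q = partition(a, left, right);
            quicksort(a, left, q - 1);
            quicksort(a, q + 1, right);
        }
    }

    public static void main(String[] args) {
        int[] data = {22, 18, 12, -4, 58, 7, 31, 42};
        quicksort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [-4, 7, 12, 18, 22, 31, 42, 58]
    }
}
```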


Best case of Quick Sort
• The best case occurs when the pivot always splits the array into two
(roughly) equal halves, i.e., when the pivot is the median. So here
T(N) = 2 * T(N / 2) + N * constant
T(N) = 2 * (2 * T(N / 4) + (N / 2) * constant) + N * constant
T(N) = 4 * T(N / 4) + 2 * constant * N
...
T(N) = 2^k * T(N / 2^k) + k * constant * N; the base case is reached when 2^k = N => k = log2 N
So T(N) = N * T(1) + constant * N * log2 N. Therefore, the time complexity is O(N * log N).



Worst case of Quick Sort
• The worst case occurs when the array gets divided into two parts,
one consisting of N – 1 elements and the other empty (for example,
an already-sorted array with the first element as pivot). So,
T(N) = T(N – 1) + N * constant
T(N) = T(N – 2) + (N – 1) * constant + N * constant = T(N – 2) + 2 * N * constant – constant
T(N) = T(N – 3) + 3 * N * constant – (2 + 1) * constant
...
T(N) = T(N – k) + k * N * constant – (1 + 2 + ... + (k – 1)) * constant
T(N) = T(N – k) + k * N * constant – constant * k * (k – 1)/2
If we put k = N in the above equation, then

T(N) = T(0) + N * N * constant – constant * N * (N – 1)/2
     = constant * (N^2 / 2 + N / 2)

Therefore, the worst-case time complexity is O(N^2).



Thank you

