
Chapter 9 : Recursion

Recursion

• Recursion is the general term for the practice of defining an object in terms of itself, or of part of itself.
• This may seem circular, but it isn’t necessarily.
• An inductive proof establishes the truth of P(n+1) recursively in terms of P(n).
• There are also recursive algorithms, definitions, functions, sequences, sets, and other structures.
Recursively Defined Functions
• Simplest case: one way to define a function f: ℕ→S (for any set S), or a series a_n = f(n), is to:
  • Define f(0).
  • For n>0, define f(n) in terms of f(0), …, f(n−1).
• Ex. Define the series a_n :≡ 2^n recursively:
  • Let a_0 :≡ 1.
  • For n>0, let a_n :≡ 2·a_{n−1}.
Another Example

• Suppose we define f(n) for all n∈ℕ recursively by:
  • Let f(0) = 3.
  • For all n∈ℕ, let f(n+1) = 2f(n) + 3.
• What are the values of the following?
  • f(1) = 9,  f(2) = 21,  f(3) = 45,  f(4) = 93
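A minimal Python sketch of this definition (illustrative only, not part of the original slides); it reproduces the values above:

  def f(n):
      # f(0) = 3; f(n+1) = 2*f(n) + 3
      if n == 0:
          return 3
      return 2 * f(n - 1) + 3

  print([f(n) for n in range(1, 5)])   # [9, 21, 45, 93]
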
Recursive Definition of Factorial

• Give an inductive (recursive) definition of the factorial function,
  F(n) :≡ n! :≡ ∏_{1≤i≤n} i = 1·2·…·n.
• Base case: F(0) :≡ 1
• Recursive part: F(n) :≡ n · F(n−1).
  • F(1) = 1
  • F(2) = 2
  • F(3) = 6
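As an illustration (a Python sketch, not from the original slides), the base case and recursive part translate directly:

  def factorial(n):
      # Base case: F(0) = 1
      if n == 0:
          return 1
      # Recursive part: F(n) = n * F(n-1)
      return n * factorial(n - 1)

  print(factorial(1), factorial(2), factorial(3))   # 1 2 6
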
More Easy Examples

• Write down recursive definitions for:
  • i+n (i integer, n natural), using only s(i) = i+1.
  • a·n (a real, n natural), using only addition.
  • a^n (a real, n natural), using only multiplication.
  • ∑_{0≤i≤n} a_i (for an arbitrary series of numbers {a_i}).
  • ∏_{0≤i≤n} a_i (for an arbitrary series of numbers {a_i}).
  • ∩_{0≤i≤n} S_i (for an arbitrary series of sets {S_i}).


The Fibonacci Series

• The Fibonacci series f_{n≥0} is a famous series defined by:
  f_0 :≡ 0,  f_1 :≡ 1,  f_{n≥2} :≡ f_{n−1} + f_{n−2}
• The first few terms: 0, 1, 1, 2, 3, 5, 8, 13, …

[Portrait: Leonardo Fibonacci, 1170–1250]
Inductive Proof about the Fibonacci Series

• Thm. f_n < 2^n (implicitly, for all n∈ℕ).
  Proof. By induction.
  Base cases: f_0 = 0 < 2^0 = 1, and f_1 = 1 < 2^1 = 2. (Note the use of both base cases of the recursive definition.)
  Inductive step: Use the 2nd principle of induction (strong induction). Assume ∀k<n, f_k < 2^k.
  Then f_n = f_{n−1} + f_{n−2} < 2^{n−1} + 2^{n−2} < 2^{n−1} + 2^{n−1} = 2^n. ■
A Lower Bound on the Fibonacci Series

• Thm. For all integers n ≥ 3, f_n > α^{n−2}, where α = (1+√5)/2 ≈ 1.61803.
  Proof. (Using strong induction.)
  • Let P(n) :≡ (f_n > α^{n−2}).
  • Base cases:
    For n=3, α^{3−2} = α ≈ 1.61803 < 2 = f_3.
    For n=4, α^{4−2} = α^2 = (1+2√5+5)/4 = (3+√5)/2 ≈ 2.61803 < 3 = f_4.
  • Inductive step:
    For k≥4, assume P(j) for 3≤j≤k; prove P(k+1).
    Note α^2 = α+1, so α^{(k+1)−2} = α^{k−1} = α^2·α^{k−3} = (α+1)·α^{k−3} = α^{k−2} + α^{k−3}.
    By the inductive hypothesis, f_{k−1} > α^{k−3} and f_k > α^{k−2}. So,
    α^{(k+1)−2} = α^{k−2} + α^{k−3} < f_k + f_{k−1} = f_{k+1}. Thus P(k+1). ■

Lamé’s Theorem

 Thm. a,bℤ+, a ≥b,


the number of steps in Euclid’s
algorithm to find gcd(a,b) is
≤ 5k,
where k = log10 b+1 is the
number of decimal digits in b.
 Thus, Euclid’s algorithm is linear-
time in the number of digits in b.
 Proof: Gabriel Lamé
 Uses the Fibonacci sequence! 1795-1870
 See next 2 slides.
Proof of Lamé’s Theorem

• Consider the sequence of division-algorithm equations used in Euclid’s algorithm, where a = r_0, b = r_1, and gcd(a,b) = r_n:
  • r_0 = r_1·q_1 + r_2   with 0 ≤ r_2 < r_1
  • r_1 = r_2·q_2 + r_3   with 0 ≤ r_3 < r_2
  • …
  • r_{n−2} = r_{n−1}·q_{n−1} + r_n   with 0 ≤ r_n < r_{n−1}
  • r_{n−1} = r_n·q_n + r_{n+1}   with r_{n+1} = 0 (terminate)
• The number of divisions (iterations) is n.

Continued on next slide…


Lamé Proof, continued
• Since r_0 ≥ r_1 > r_2 > … > r_n, each quotient q_i = ⌊r_{i−1}/r_i⌋ ≥ 1, for i = 1, …, n−1.
• Since r_{n−1} = r_n·q_n and r_{n−1} > r_n, we have q_n ≥ 2.
• So we have the following relations between the r’s and the f’s:
  • r_n ≥ 1 = f_2
  • r_{n−1} ≥ 2r_n ≥ 2 = f_3
  • r_{n−2} ≥ r_{n−1} + r_n ≥ f_2 + f_3 = f_4
  • …
  • r_2 ≥ r_3 + r_4 ≥ f_{n−1} + f_{n−2} = f_n
  • b = r_1 ≥ r_2 + r_3 ≥ f_n + f_{n−1} = f_{n+1}.
• Thus, if n > 2 divisions are used, then b ≥ f_{n+1} > α^{n−1}.
• Thus, log_10 b > log_10(α^{n−1}) = (n−1)·log_10 α ≈ (n−1)·0.208 > (n−1)/5.
• If b has k decimal digits, then log_10 b < k, so n−1 < 5k, and hence n ≤ 5k. ■
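To see the bound concretely, here is a small Python check (an illustrative sketch, not from the original slides) that counts the divisions Euclid’s algorithm performs and compares the count against 5k:

  def euclid_divisions(a, b):
      # Count the division steps Euclid's algorithm uses to compute gcd(a, b), a >= b.
      n = 0
      while b != 0:
          a, b = b, a % b
          n += 1
      return n

  # The worst case comes from consecutive Fibonacci numbers, e.g. a = 89, b = 55:
  a, b = 89, 55
  k = len(str(b))                              # number of decimal digits in b
  print(euclid_divisions(a, b), "<=", 5 * k)   # prints: 9 <= 10
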
Recursively Defined Sets

• An infinite set S may be defined recursively, by giving:
  i) A small finite set of base elements of S.
  ii) A rule for constructing new elements of S from previously-established elements.
  iii) Implicitly, S has no other elements but these.
• Ex. Let 3∈S, and let x+y∈S if x,y∈S. What is S?
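A quick way to build intuition (an illustrative Python sketch, not part of the original slides) is to generate the elements of S below some bound; the result is exactly the positive multiples of 3, which is the claim proved in the structural-induction example later in this chapter:

  def generate_S(bound):
      # Start from the base element 3 and repeatedly close under addition,
      # keeping only elements below the bound so the process terminates.
      S = {3}
      while True:
          new = {x + y for x in S for y in S if x + y < bound} - S
          if not new:
              return S
          S |= new

  print(sorted(generate_S(20)))   # [3, 6, 9, 12, 15, 18]
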
The Set of All Strings
• Def. Given an alphabet Σ, the set Σ* of all strings over Σ can be recursively defined by:
  • λ ∈ Σ*  (λ :≡ “”, the empty string)
  • w ∈ Σ* ∧ x ∈ Σ → wx ∈ Σ*
• Exercise: Prove that this definition is equivalent to our old one:
  Σ* :≡ ∪_{n∈ℕ} Σ^n
Other Easy String Examples

• Give recursive definitions for:
  • The concatenation of strings w_1·w_2.
  • The length ℓ(w) of a string w.
  • Well-formed formulae of propositional logic involving T, F, propositional variables, and operators in {¬, ∧, ∨, →, ↔}.
  • Well-formed arithmetic formulae involving variables, numerals, and operations in {+, −, *, ↑}.
Rooted Trees

• Briefly, a tree is a graph in which there is exactly one undirected path between each pair of nodes.
• An undirected graph can be represented as a set of unordered pairs (called edges) of objects called nodes.
• Definition of the set of rooted trees:
  i) Any single node r is a rooted tree.
  ii) If T_1, …, T_n are disjoint rooted trees with respective roots r_1, …, r_n, and r is a node not in any of the T_i’s, then another rooted tree is {{r,r_1}, …, {r,r_n}} ∪ T_1 ∪ … ∪ T_n.
  iii) That is all.
Illustrating Rooted Tree Def’n

• How rooted trees can be combined to form a new rooted tree…

[Figure: a new root r with edges to the roots r_1, r_2, …, r_n of the disjoint subtrees T_1, T_2, …, T_n]

Draw some examples…


Extended Binary Trees
• A special case of rooted trees.
• Def. Extended binary trees (EBTs):
  i) The empty set ∅ is an EBT.
  ii) If T_1, T_2 are disjoint EBTs and r is a new node, then e_1 ∪ e_2 ∪ T_1 ∪ T_2 is an EBT, where e_1 = ∅ if T_1 = ∅, and e_1 = {(r,r_1)} if T_1 ≠ ∅ and has root r_1, and similarly for e_2.
  iii) That is all.

Draw some examples…
Full Binary Trees
• A special case of extended binary trees.
• Def. Recursive definition of full binary trees (FBTs):
  i) A single node r is an FBT.
    • Note this is different from the EBT base case.
  ii) If T_1, T_2 are disjoint FBTs and r is a new node, then e_1 ∪ e_2 ∪ T_1 ∪ T_2 is an FBT, where e_1 = ∅ if T_1 = ∅, and e_1 = {(r,r_1)} if T_1 ≠ ∅ and has root r_1, and similarly for e_2.
    • Note this is the same as the EBT recursive case!
    • Can simplify it to “If T_1, T_2 are disjoint FBTs with roots r_1 and r_2, then {(r,r_1),(r,r_2)} ∪ T_1 ∪ T_2 is an FBT.”
  iii) That is all.

Draw some examples…


Structural Induction

• Proving something about a recursively defined object using an inductive proof whose structure mirrors the object’s definition.
Example
• Thm. Let 3∈S, and let x+y∈S if x,y∈S, and that is all.
  Let A = {n∈ℤ+ | 3|n}.
  Then A = S.
  Proof. We show that A⊆S and S⊆A.
• To show A⊆S, show [n∈ℤ+ ∧ (3|n)] → n∈S.
  • Inductive proof. Let P(n) :≡ n∈S. Induction over positive multiples of 3.
    Base case: n = 3, and 3∈S by def’n of S.
    Inductive step: Given P(n), prove P(n+3). By the inductive hyp., n∈S, and 3∈S, so by def’n of S, n+3∈S.
• To show S⊆A: let n∈S, show n∈A.
  • Structural inductive proof. Let P(n) :≡ n∈A.
    Base case: 3∈S, and since 3|3, 3∈A.
    Recursive step: Suppose x,y∈S, n = x+y ∈ S, and x,y∈A. Since 3|x and 3|y, we have 3|(x+y), thus x+y = n ∈ A. ■
Recursive Algorithms (§3.5)

• Recursive definitions can be used to describe algorithms as well as functions and sets.
• Ex. A procedure to compute a^n:

  procedure power(a≠0: real, n∈ℕ)
    if n = 0 then return 1
    else return a · power(a, n−1)
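The same procedure in runnable form (a Python sketch, offered only as an illustration of the pseudocode above):

  def power(a, n):
      # Computes a**n for real a != 0 and natural n, by recursion on n.
      if n == 0:
          return 1
      return a * power(a, n - 1)

  print(power(2, 10))   # 1024
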
Efficiency of Recursive Algorithms

• The time complexity of a recursive algorithm may depend critically on the number of recursive calls it makes.
• Ex. Modular exponentiation to a power n can take log(n) time if done right, but linear time if done slightly differently.
• Task: Compute b^n mod m, where m≥2, n≥0, and 1≤b<m.
Modular Exponentiation Alg. #1

• Uses the fact that b^n = b·b^{n−1} and that (x·y) mod m = (x·(y mod m)) mod m. (Prove the latter theorem at home.)

  procedure mpower(b≥1, n≥0, m>b ∈ ℕ)
  {Returns b^n mod m.}
    if n = 0
    then return 1
    else return (b·mpower(b,n−1,m)) mod m

• Note this algorithm takes Θ(n) steps!

Modular Exponentiation Alg. #2

• Uses the fact that b^{2k} = b^{k·2} = (b^k)^2.

  procedure mpower(b,n,m)  {same signature}
    if n = 0 then return 1
    else if 2|n
    then return mpower(b,n/2,m)^2 mod m
    else return (mpower(b,n−1,m)·b) mod m

• What is its time complexity? Θ(log n) steps.


A Slight Variation

• Nearly identical, but takes Θ(n) time instead!

  procedure mpower(b,n,m)  {same signature}
    if n = 0 then return 1
    else if 2|n
    then return (mpower(b,n/2,m)·mpower(b,n/2,m)) mod m
    else return (mpower(b,n−1,m)·b) mod m

• The number of recursive calls made is critical!
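A Python sketch of algorithm #2 alongside the slight variation (illustrative only, not from the original slides), instrumented to count recursive calls; it makes the difference concrete:

  def mpower_fast(b, n, m):
      # Algorithm #2: square the single recursive result when n is even.
      calls[0] += 1
      if n == 0:
          return 1
      if n % 2 == 0:
          return mpower_fast(b, n // 2, m) ** 2 % m
      return (mpower_fast(b, n - 1, m) * b) % m

  def mpower_slow(b, n, m):
      # Slight variation: two recursive calls when n is even, so Theta(n) calls overall.
      calls[0] += 1
      if n == 0:
          return 1
      if n % 2 == 0:
          return (mpower_slow(b, n // 2, m) * mpower_slow(b, n // 2, m)) % m
      return (mpower_slow(b, n - 1, m) * b) % m

  calls = [0]; print(mpower_fast(3, 1000, 7), calls[0])   # 4, with ~16 calls (logarithmic in n)
  calls = [0]; print(mpower_slow(3, 1000, 7), calls[0])   # 4, with ~2000 calls (linear in n)
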
Recursive Euclid’s Algorithm

  procedure gcd(a,b ∈ ℕ)
    if a = 0
    then return b
    else return gcd(b mod a, a)

• Note recursive algorithms are often simpler to code than iterative ones…
• However, they can consume more stack space, if your compiler is not smart enough.
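In Python (a sketch for illustration; note that CPython does not eliminate tail calls, so the stack-space caveat above applies literally here):

  def gcd(a, b):
      # Recursive Euclid: gcd(a, b) = gcd(b mod a, a), with gcd(0, b) = b.
      if a == 0:
          return b
      return gcd(b % a, a)

  print(gcd(55, 89))   # 1
  print(gcd(24, 36))   # 12
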
Recursive Linear Search

• Note there is no real advantage to using recursion here, rather than just looping for loc := i to j…

  procedure search(a: series; i, j: integer; x: item to be found)
  {Find x in series a at a location ≥i and ≤j}
    if a_i = x
    then return i                  {At the right item? Return it!}
    if i = j
    then return 0                  {No locations left in range? Failure!}
    return search(a, i+1, j, x)    {Try the rest of the range}
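The same idea in Python (an illustrative sketch; as noted above, a plain loop would do just as well):

  def search(a, i, j, x):
      # Find x in a at a location >= i and <= j; return 0 on failure.
      if a[i] == x:
          return i                      # at the right item: return it
      if i == j:
          return 0                      # no locations left in range: failure
      return search(a, i + 1, j, x)     # try the rest of the range

  a = [7, 4, 9, 2]
  print(search(a, 0, len(a) - 1, 9))    # 2
  print(search(a, 0, len(a) - 1, 5))    # 0 (not found)
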
Recursive Binary Search
  procedure binarySearch(a, x, i, j)  {same sig}
  {Find location of x in a, ≥i and ≤j}
    m := ⌊(i + j)/2⌋          {Go to the halfway point.}
    if x = a_m
    then return m             {Did we luck out?}
    if x < a_m ∧ i < m        {If it’s to the left,}
    then return binarySearch(a,x,i,m−1)    {check that half.}
    else if a_m < x ∧ m < j   {If it’s to the right,}
    then return binarySearch(a,x,m+1,j)    {check that half.}
    else return 0             {No more items; failure.}
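A runnable Python version of this procedure (an illustrative sketch; it follows the closed-interval convention above and returns 0 on failure, matching the slides rather than the usual Python convention of returning −1):

  def binary_search(a, x, i, j):
      # Find a location of x in a[i..j] (inclusive); return 0 if x is absent.
      m = (i + j) // 2                           # halfway point
      if x == a[m]:
          return m
      if x < a[m] and i < m:
          return binary_search(a, x, i, m - 1)   # search the left half
      elif a[m] < x and m < j:
          return binary_search(a, x, m + 1, j)   # search the right half
      return 0                                   # no more items: failure

  a = [2, 3, 5, 7, 11, 13, 17]
  print(binary_search(a, 11, 0, len(a) - 1))   # 4
  print(binary_search(a, 6, 0, len(a) - 1))    # 0 (not found)
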
Recursive Fibonacci Algorithm
  procedure fibonacci(n ∈ ℕ)
    if n = 0
    then return 0
    if n = 1
    then return 1
    return fibonacci(n−1) + fibonacci(n−2)

• Is this an efficient algorithm?
• Is it polynomial-time in n?
• How many additions are performed?
Analysis of the Fibonacci Procedure

• Thm. The preceding procedure for fibonacci(n) performs f_{n+1} − 1 addition operations.
  Proof. By strong induction over n, following the procedure’s own recursive definition.
  • Base cases: fibonacci(0) performs 0 additions, and f_{0+1} − 1 = f_1 − 1 = 1 − 1 = 0. Likewise, fibonacci(1) performs 0 additions, and f_{1+1} − 1 = f_2 − 1 = 1 − 1 = 0.
  • Inductive step: For n>1, by the strong inductive hypothesis, fibonacci(n−1) and fibonacci(n−2) do f_n − 1 and f_{n−1} − 1 additions respectively, and fibonacci(n) adds 1 more, for a total of (f_n − 1) + (f_{n−1} − 1) + 1 = f_n + f_{n−1} − 1 = f_{n+1} − 1. ■
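A small Python check of this count (illustrative only, not from the original slides): it runs the naive procedure while tallying additions, and the tally matches f_{n+1} − 1.

  def fib_count(n):
      # Returns (fibonacci(n), number of additions performed), following the
      # naive recursive procedure above.
      if n == 0:
          return 0, 0
      if n == 1:
          return 1, 0
      a, ca = fib_count(n - 1)
      b, cb = fib_count(n - 2)
      return a + b, ca + cb + 1     # one extra addition at this level

  for n in range(2, 8):
      value, adds = fib_count(n)
      print(n, value, adds)         # adds equals f_{n+1} - 1 in every row
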
Recursive Merge Sort

  procedure sort(L = ℓ_1, …, ℓ_n)
    if n > 1
    then {
      m := ⌊n/2⌋          {this is the rough halfway point}
      L := merge(sort(ℓ_1, …, ℓ_m), sort(ℓ_{m+1}, …, ℓ_n))
    }
    return L

• The merge (next slide) takes Θ(n) steps, and merge-sort takes Θ(n log n).
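For concreteness, a short Python rendering of merge-sort with its helper merge (a sketch assuming the usual two-pointer merge; the slides’ own merge procedure is on the following slide, which is not included here):

  def merge(left, right):
      # Merge two sorted lists into one sorted list in Theta(len(left)+len(right)) steps.
      out, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              out.append(left[i]); i += 1
          else:
              out.append(right[j]); j += 1
      return out + left[i:] + right[j:]

  def sort(L):
      # Recursive merge sort: split at the halfway point, sort each half, merge.
      if len(L) <= 1:
          return L
      m = len(L) // 2
      return merge(sort(L[:m]), sort(L[m:]))

  print(sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]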
