
Foundations of Computer Science: Lecture 2
Recursion and Complexity

9th October 2023


Anil Madhavapeddy
The Practical Classes
https://www.cl.cam.ac.uk/teaching/2324/OCaml/

• Executed online on the hub.cl.cam.ac.uk server

• There are 5 ticks, each of which has a deadline for submission 10 days after it is issued (except the last tick, which goes into Lent term).
Tick 1: released 2023-10-06 due 2023-10-16
Tick 2: released 2023-10-13 due 2023-10-23
Tick 3: released 2023-10-20 due 2023-10-30
Tick 4: released 2023-10-27 due 2023-11-06
Tick 5: released 2023-11-03 due 2024-01-19
Expression Evaluation
E0 → E1 → … → En → v
Focus on expressions; ignore side-effects for now.

This discipline of separating expressions from effects is often known as functional programming.

We will return to side-effects later in the course to make useful programs!
Expression Evaluation
E0 → E1 → … → En → v

# let rec power x n =
    if n = 1 then x
    else if even n then power (x *. x) (n / 2)
    else x *. power (x *. x) (n / 2)
power(2, 12) ⇒ power(4, 6)
            ⇒ power(16, 3)
            ⇒ 16 × power(256, 1)
            ⇒ 16 × 256
            ⇒ 4096
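
The even test used in power is not part of the OCaml standard library; a minimal runnable version of the slide's code, with the helper defined explicitly (the definition is an assumption, not shown on the slide), might look like:

(* even is assumed: the slide uses it without defining it *)
let even n = n mod 2 = 0

let rec power x n =
  if n = 1 then x
  else if even n then power (x *. x) (n / 2)
  else x *. power (x *. x) (n / 2)

(* power 2.0 12 evaluates to 4096., matching the trace above *)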
Summing first n integers

# let rec nsum n =
    if n = 0 then 0
    else n + nsum (n - 1)

nsum 3 ⇒ 3 + (nsum 2)
       ⇒ 3 + (2 + (nsum 1))
       ⇒ 3 + (2 + (1 + nsum 0))
       ⇒ 3 + (2 + (1 + 0))

Nothing can progress until the final expression is calculated!

Intermediate results are stored in the program stack, which is usually of limited size.

Two types of storage:
• heap: a global area where the memory storing values bound to names is tracked
• stack: a list where function call arguments are pushed and return values popped
Iteratively summing

# let rec summing n total =
    if n = 0 then total
    else summing (n - 1) (n + total)

summing 3 0 ⇒ summing 2 3
            ⇒ summing 1 5
            ⇒ summing 0 6
            ⇒ 6

Compare with nsum 3 ⇒ 3 + (nsum 2) ⇒ 3 + (2 + (nsum 1)) ⇒ 3 + (2 + (1 + nsum 0)) ⇒ 3 + (2 + (1 + 0)).

The extra argument total acts as the accumulator, keeping track of the running sum explicitly instead of using the stack.

Algorithms like this are known as iterative or tail recursive.
Recursion vs iteration

• Why two terms, iterative and tail recursive?
• “Iterative” normally refers to a loop: e.g. coded using while.
• “Tail-recursion” involves the recursive function call being the last thing that the expression does.
• Tail-recursion is efficient only if the compiler detects it.
• Mainly it saves space, though iterative code can run faster (see the sketch below).
• Do not make programs iterative unless you determine the gain is significant.
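
As a rough, self-contained sketch of the space point (the cut-off figure is an assumption, not from the slides), the accumulating version runs in constant stack space while the naive version pushes one frame per pending addition:

let rec nsum n =
  if n = 0 then 0
  else n + nsum (n - 1)              (* one stack frame per pending addition *)

let rec summing n total =
  if n = 0 then total
  else summing (n - 1) (n + total)   (* tail call: constant stack space *)

let _ = summing 1_000_000 0          (* fine *)
(* nsum 1_000_000 may instead overflow the stack, depending on its limit *)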
How can we analyse our programs for efficiency?
Silly summing first n integers

# let rec sillySum n =
    if n = 0 then 0
    else n + (sillySum (n-1) + sillySum (n-1)) / 2

Recursively calls itself twice for every invocation.

Should assign the result to a local variable to prevent evaluating it twice, as in:

# let x = 2.0 in
  let y = Float.pow x 20.0 in
  y *. (x /. y)
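
Applying that advice to sillySum, one possible rewrite (a sketch, not from the slides) binds the recursive result once, so each invocation now makes a single recursive call:

let rec sillySum n =
  if n = 0 then 0
  else
    let s = sillySum (n - 1) in      (* evaluated once, used twice *)
    n + (s + s) / 2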
Asymptotic complexity refers to how program costs grow with increasing inputs

Usually space or time, with the latter usually being larger than the former.

Question: if we double our processing power, how much does our computation capability increase?
Time Complexity
Comparing Algorithms with O(n)

Formally, define f(n) = O(g(n)) provided that |f(n)| ≤ c |g(n)| for some constant c and all sufficiently large n.

Intuitively, consider the most significant term and ignore constant or smaller factors.

E.g. simplify 3n² + 34n + 433 → n²
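
As a quick worked check of the definition (the constants are my own choice, not from the slides): taking c = 4, we have 3n² + 34n + 433 ≤ 4n² for all n ≥ 44, so 3n² + 34n + 433 = O(n²).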
Facts about O notation

O(2g(n)) is the same as O(g(n))
O(log₁₀ n) is the same as O(ln n)
O(n² + 50n + 36) is the same as O(n²)
O(n²) is contained in O(n³)
O(2ⁿ) is contained in O(3ⁿ)
O(log n) is contained in O(√n)
Common complexity classes
Sample costs in O-notation
Simple recurrence relations
Mapping this to OCaml

# let rec nsum n =
    if n = 0 then 0
    else n + nsum (n - 1)

Given (n+1), nsum does a constant amount of work, then calls itself with n.

Therefore, the recurrence relations are:
T(0) = 1
T(n + 1) = T(n) + 1

so nsum is O(n).
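
Unfolding the recurrence confirms this: T(n) = T(n−1) + 1 = T(n−2) + 2 = … = T(0) + n = n + 1, which is O(n).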
Mapping this to OCaml

# let rec nsumsum n =
    if n = 0 then 0
    else nsum n + nsumsum (n - 1)

Calls nsum, which takes O(n), and calls itself recursively once.

Therefore, the recurrence relations are:
T(0) = 1
T(n + 1) = T(n) + n

so nsumsum is O(n²).
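
Unfolding the recurrence: T(n) = T(n−1) + (n−1) = … = T(0) + (1 + 2 + … + (n−1)) = 1 + n(n−1)/2, which is O(n²).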
Mapping this to OCaml

# let rec power x n =
    if n = 1 then x
    else if even n then power (x *. x) (n / 2)
    else x *. power (x *. x) (n / 2)

Calls itself recursively once, and always divides the iteration count by 2.

Therefore, the recurrence relations are:
T(0) = 1
T(n) = T(n/2) + 1

so power is O(log n).
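
Unfolding the recurrence: T(n) = T(n/2) + 1 = T(n/4) + 2 = …; n can only be halved about log₂ n times before the base case is reached, so T(n) is roughly log₂ n, which is O(log n).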
