Lecture 11
UNDERSTANDING PROGRAM
EFFICIENCY: PART 2
(download slides and .py files and follow along!)
6.0001 LECTURE 11
TODAY
Classes of complexity
Examples characteristic of each class
WHY WE WANT TO
UNDERSTAND EFFICIENCY
OF PROGRAMS
how can we reason about an algorithm in order to
predict the amount of time it will need to solve a
problem of a particular size?
how can we relate choices in algorithm design to the
time efficiency of the resulting algorithm?
◦are there fundamental limits on the amount of time
we will need to solve a particular problem?
ORDERS OF GROWTH:
RECAP
Goals:
want to evaluate program’s efficiency when input is very big
want to express the growth of program’s run time as input
size grows
want to put an upper bound on growth – as tight as
possible
do not need to be precise: “order of” not “exact” growth
we will look at largest factors in run time (which section of
the program will take the longest to run?)
thus, generally we want tight upper bound on growth, as
function of size of input, in worst case
COMPLEXITY CLASSES:
RECAP
O(1) denotes constant running time
O(log n) denotes logarithmic running time
O(n) denotes linear running time
O(n log n) denotes log-linear running time
O(n^c) denotes polynomial running time (c is
a constant)
O(c^n) denotes exponential running time (c is a
constant being raised to a power based on size of
input)
COMPLEXITY
CLASSES ORDERED
LOW TO HIGH
O(1) : constant
O(log n) : logarithmic
O(n) : linear
O(n log n) : log-linear
O(n^c) : polynomial
O(c^n) : exponential
COMPLEXITY GROWTH
CLASS      n=10   n=100   n=1000   n=1000000
O(1)       1      1       1        1
O(log n)   1      2       3        6
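The O(log n) row of the table appears to use base-10 logarithms (n = 10, 100, 1000, 1000000 give 1, 2, 3, 6); a one-line check:

```python
import math

# The O(log n) row of the table, assuming base-10 logarithms:
# n = 10, 100, 1000, 1000000 give 1, 2, 3, 6.
row = [round(math.log10(n)) for n in [10, 100, 1000, 1000000]]
print(row)  # [1, 2, 3, 6]
```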
CONSTANT COMPLEXITY
complexity independent of inputs
very few interesting algorithms in this class, but can
often have pieces that fit this class
can have loops or recursive calls, but ONLY IF number
of iterations or calls independent of size of input
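A minimal sketch of constant-time pieces (the function names here are illustrative, not from the lecture): the work done does not change as the list grows, even when a loop is present, because the number of iterations is fixed.

```python
# Constant-time sketches (illustrative names): the work done is
# independent of len(L).
def first_element(L):
    return L[0]              # indexing is O(1)

def sum_first_three(L):
    total = 0
    for i in range(3):       # loop runs exactly 3 times, regardless of len(L)
        total += L[i]
    return total

print(first_element([7, 8, 9]))       # 7
print(sum_first_three([1, 2, 3, 4]))  # 6
```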
LOGARITHMIC
COMPLEXITY
complexity grows as log of size of one of its inputs
example:
◦bisection search
◦binary search of a list
BISECTION
SEARCH
suppose we want to know if a particular element is
present in a list
saw last time that we could just “walk down” the list,
checking each element
complexity was linear in length of the list
suppose we know that the list is ordered from
smallest to largest
◦saw that sequential search was still linear in
complexity
◦can we do better?
BISECTION
SEARCH
1. pick an index, i, that divides list in half
2. ask if L[i] == e
3. if not, ask if L[i] is larger or smaller than e
4. depending on answer, search left or right half of L
for e
BISECTION SEARCH
COMPLEXITY ANALYSIS
finish looking through list when 1 = n/2^i
… so i = log n
… complexity of recursion is O(log n) – where n is len(L)
BISECTION SEARCH
IMPLEMENTATION 1
def bisect_search1(L, e):
    if L == []:
        return False
    elif len(L) == 1:
        return L[0] == e
    else:
        half = len(L)//2
        if L[half] > e:
            return bisect_search1(L[:half], e)
        else:
            return bisect_search1(L[half:], e)
COMPLEXITY OF FIRST
BISECTION SEARCH
METHOD
implementation 1 – bisect_search1
• O(log n) bisection search calls
• On each recursive call, size of range to be searched is cut in half
• If original range is of size n, in worst case down to range of
size 1 when n/(2^k) = 1; or when k = log n
• O(n) for each bisection search call to copy list
• This is the cost to set up each call, so do this for each
level of recursion
• O(log n) * O(n) → O(n log n)
• if we are really careful, note that length of list to
be copied is also halved on each recursive call
• turns out that total cost to copy is O(n) and this dominates
the log n cost due to the recursive calls
BISECTION SEARCH:
ALTERNATIVE
still reduce size of problem by factor of two on each step
but just keep track of low and high portion of list to be searched
avoid copying the list
complexity of recursion is again O(log n) – where n is len(L)
BISECTION SEARCH
IMPLEMENTATION 2
def bisect_search2(L, e):
    def bisect_search_helper(L, e, low, high):
        if high == low:
            return L[low] == e
        mid = (low + high)//2
        if L[mid] == e:
            return True
        elif L[mid] > e:
            if low == mid: # nothing left to search
                return False
            else:
                return bisect_search_helper(L, e, low, mid - 1)
        else:
            return bisect_search_helper(L, e, mid + 1, high)
    if len(L) == 0:
        return False
    else:
        return bisect_search_helper(L, e, 0, len(L) - 1)
COMPLEXITY OF
SECOND BISECTION
SEARCH METHOD
implementation 2 – bisect_search2 and its helper
• O(log n) bisection search calls
• On each recursive call, size of range to be searched is cut in half
• If original range is of size n, in worst case down to range of
size 1 when n/(2^k) = 1; or when k = log n
• pass list and indices as parameters
• list never copied, just re-passed as a pointer
• thus O(1) work on each recursive call
• O(log n) * O(1) → O(log n)
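One way to see the O(log n) growth empirically: instrument bisect_search2 with a call counter (the counter and the _counted name are additions for illustration) and search for an absent element, the worst case.

```python
# bisect_search2 from the slide, instrumented with a call counter
# (the counter is added here for illustration).
def bisect_search2_counted(L, e):
    calls = [0]
    def helper(L, e, low, high):
        calls[0] += 1
        if high == low:
            return L[low] == e
        mid = (low + high)//2
        if L[mid] == e:
            return True
        elif L[mid] > e:
            if low == mid:   # nothing left to search
                return False
            return helper(L, e, low, mid - 1)
        else:
            return helper(L, e, mid + 1, high)
    if len(L) == 0:
        return False, 0
    return helper(L, e, 0, len(L) - 1), calls[0]

for size in [10, 100, 1000, 1000000]:
    found, n_calls = bisect_search2_counted(list(range(size)), -1)
    print(size, n_calls)   # n_calls grows roughly like log2(size)
```

Even for a million elements, the search makes only about 20 recursive calls.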
LOGARITHMIC
COMPLEXITY
def intToStr(i):
    digits = '0123456789'
    if i == 0:
        return '0'
    result = ''
    while i > 0:
        result = digits[i%10] + result
        i = i//10
    return result
LOGARITHMIC
COMPLEXITY
def intToStr(i):
    digits = '0123456789'
    if i == 0:
        return '0'
    res = ''
    while i > 0:
        res = digits[i%10] + res
        i = i//10
    return res
only have to look at loop as no function calls
within while loop, constant number of steps
how many times through loop?
◦ how many times can one divide i by 10?
◦ O(log(i))
LINEAR COMPLEXITY
saw this last time
◦ searching a list in sequence to see if an
element is present
◦ iterative loops
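The sequential search mentioned above can be sketched as follows (a minimal version; the name linear_search is ours):

```python
# Sequential ("walk down the list") search: O(n) in the length of L.
def linear_search(L, e):
    for item in L:       # up to len(L) comparisons in the worst case
        if item == e:
            return True
    return False

print(linear_search([3, 1, 4, 1, 5], 4))  # True
print(linear_search([3, 1, 4, 1, 5], 9))  # False
```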
O() FOR ITERATIVE
FACTORIAL
complexity can depend on number of iterative calls
def fact_iter(n):
    prod = 1
    for i in range(1, n+1):
        prod *= i
    return prod
overall O(n) – n times round loop, constant cost each time
O() FOR RECURSIVE
FACTORIAL
def fact_recur(n):
    """ assume n >= 0 """
    if n <= 1:
        return 1
    else:
        return n*fact_recur(n - 1)
computes factorial recursively
if you time it, may notice that it runs a bit slower than
iterative version due to function calls
still O(n) because the number of function calls is linear
in n, and constant effort to set up call
iterative and recursive factorial implementations are
the same order of growth
LOG-LINEAR
COMPLEXITY
many practical algorithms are log-linear
very commonly used log-linear algorithm is merge sort
will return to this next lecture
POLYNOMIAL
COMPLEXITY
most common polynomial algorithms are quadratic,
i.e., complexity grows with square of size of input
commonly occurs when we have nested loops or
recursive function calls
saw this last time
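A sketch of the nested-loop pattern (has_pair_sum is a made-up example, not from the lecture): two loops over the same input examine on the order of n^2 pairs.

```python
# A typical quadratic pattern: nested loops over the same input.
# has_pair_sum is an illustrative example; it tests whether any two
# distinct elements sum to a target by checking all pairs -- O(n^2).
def has_pair_sum(L, target):
    for i in range(len(L)):
        for j in range(i + 1, len(L)):   # ~n^2/2 pairs examined in worst case
            if L[i] + L[j] == target:
                return True
    return False

print(has_pair_sum([1, 3, 5, 7], 8))   # True (1 + 7, or 3 + 5)
print(has_pair_sum([1, 3, 5, 7], 2))   # False
```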
EXPONENTIAL
COMPLEXITY
recursive functions where more than one recursive
call for each size of problem
◦Towers of Hanoi
many important problems are inherently
exponential
◦unfortunate, as cost can be high
◦will lead us to consider approximate solutions as
may provide reasonable answer more quickly
COMPLEXITY OF
TOWERS OF HANOI
Let t_n denote time to solve tower of size n
t_n = 2t_{n-1} + 1
    = 2(2t_{n-2} + 1) + 1
    = 4t_{n-2} + 2 + 1
    = 4(2t_{n-3} + 1) + 2 + 1
    = 8t_{n-3} + 4 + 2 + 1
    = 2^k t_{n-k} + 2^{k-1} + … + 4 + 2 + 1
    = 2^{n-1} + 2^{n-2} + ... + 4 + 2 + 1
    = 2^n – 1
Geometric growth:
a = 2^{n-1} + … + 2 + 1
2a = 2^n + 2^{n-1} + ... + 2
so a = 2^n – 1
so order of growth is O(2^n)
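The recurrence can be checked directly with the standard recursive solution (a sketch; the move-recording list is our addition, used here only to count moves):

```python
# Recursive Towers of Hanoi; counting moves confirms t_n = 2^n - 1.
def hanoi(n, frm, to, spare, moves):
    if n == 1:
        moves.append((frm, to))
    else:
        hanoi(n - 1, frm, spare, to, moves)   # move n-1 discs out of the way
        moves.append((frm, to))               # move the largest disc
        hanoi(n - 1, spare, to, frm, moves)   # move n-1 discs back on top

for n in range(1, 8):
    moves = []
    hanoi(n, 'A', 'C', 'B', moves)
    print(n, len(moves))   # len(moves) == 2**n - 1
```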
EXPONENTIAL
COMPLEXITY
given a set of integers (with no repeats), want to
generate the collection of all possible subsets – called
the power set
{1, 2, 3, 4} would generate
◦{}, {1}, {2}, {3}, {4}, {1, 2}, {1, 3}, {1, 4}, {2, 3}, {2, 4}, {3, 4},
{1, 2, 3}, {1, 2, 4}, {1, 3, 4}, {2, 3, 4}, {1, 2, 3, 4}
order doesn’t matter
◦{}, {1}, {2}, {1, 2}, {3}, {1, 3}, {2, 3}, {1, 2, 3}, {4}, {1, 4},
{2, 4}, {1, 2, 4}, {3, 4}, {1, 3, 4}, {2, 3, 4}, {1, 2, 3, 4}
POWER SET – CONCEPT
we want to generate the power set of integers from 1 to n
assume we can generate power set of integers from 1 to
n-1
then all of those subsets belong to the bigger power set
(choosing not to include n); and all of those subsets with n
added to each of them also belong to the bigger power set
(choosing to include n)
{}, {1}, {2}, {1, 2}, {3}, {1, 3}, {2, 3}, {1, 2, 3},{4}, {1, 4}, {2, 4}, {1, 2,
4}, {3, 4}, {1, 3, 4}, {2, 3, 4}, {1, 2, 3, 4}
EXPONENTIAL
COMPLEXITY
def genSubsets(L):
    res = []
    if len(L) == 0:
        return [[]]              # list of just empty list
    smaller = genSubsets(L[:-1]) # all subsets without last element
    extra = L[-1:]               # create a list of just last element
    new = []
    for small in smaller:
        new.append(small+extra)  # for all smaller solutions, add one with last element
    return smaller+new           # combine those with last element and those without
EXPONENTIAL
COMPLEXITY
def genSubsets(L):
    res = []
    if len(L) == 0:
        return [[]]
    smaller = genSubsets(L[:-1])
    extra = L[-1:]
    new = []
    for small in smaller:
        new.append(small+extra)
    return smaller+new
assuming append is constant time
time includes time to solve smaller problem, plus time needed
to make a copy of all elements in smaller problem
EXPONENTIAL
COMPLEXITY
def genSubsets(L):
    res = []
    if len(L) == 0:
        return [[]]
    smaller = genSubsets(L[:-1])
    extra = L[-1:]
    new = []
    for small in smaller:
        new.append(small+extra)
    return smaller+new
but important to think about size of smaller
know that for a set of size k there are 2^k cases
how can we deduce overall complexity?
EXPONENTIAL
COMPLEXITY
let t_n denote time to solve problem of size n
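The 2^n growth in output size can be confirmed by counting subsets (genSubsets as on the earlier slide, repeated here so the snippet is self-contained; the unused res list is omitted):

```python
# genSubsets from the earlier slide: the power set of a k-element
# list has 2**k subsets.
def genSubsets(L):
    if len(L) == 0:
        return [[]]
    smaller = genSubsets(L[:-1])   # all subsets without the last element
    extra = L[-1:]                 # list containing just the last element
    new = []
    for small in smaller:
        new.append(small + extra)  # each smaller subset plus the last element
    return smaller + new

for k in range(6):
    print(k, len(genSubsets(list(range(k)))))   # 2**k subsets
```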
SOME MORE EXAMPLES
OF ANALYZING
COMPLEXITY
COMPLEXITY OF
ITERATIVE
FIBONACCI
def fib_iter(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        fib_i = 0
        fib_ii = 1
        for i in range(n-1):
            tmp = fib_i
            fib_i = fib_ii
            fib_ii = tmp + fib_ii
        return fib_ii
Best case: O(1)
Worst case: O(1) + O(n) + O(1) → O(n)
COMPLEXITY OF
RECURSIVE
FIBONACCI
def fib_recur(n):
    """ assumes n an int >= 0 """
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fib_recur(n-1) + fib_recur(n-2)
Worst case: O(2^n)
COMPLEXITY OF
RECURSIVE FIBONACCI
[call tree for fib(5): fib(5) calls fib(4) and fib(3); fib(4) calls
fib(3) and fib(2); and so on – the same subproblems are recomputed
many times, which is what drives the exponential growth]
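The blow-up in the call tree can be made concrete by counting calls (an instrumented variant of fib_recur; the counter argument is our addition):

```python
# fib_recur instrumented with a call counter to show the exponential
# growth in the number of calls (counter added for illustration).
def fib_calls(n, counter):
    counter[0] += 1
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fib_calls(n - 1, counter) + fib_calls(n - 2, counter)

for n in [5, 10, 15, 20]:
    counter = [0]
    fib_calls(n, counter)
    print(n, counter[0])   # the call count itself grows like the Fibonacci numbers
```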
BIG OH SUMMARY
compare efficiency of algorithms
• notation that describes growth
• lower order of growth is better
• independent of machine or specific implementation
use Big Oh
• describe order of growth
• asymptotic notation
• upper bound
• worst case analysis
COMPLEXITY OF
COMMON PYTHON
FUNCTIONS
Lists: n is len(L)
• index O(1)
• store O(1)
• length O(1)
• append O(1)
• == O(n)
• remove O(n)
• copy O(n)
• reverse O(n)
• iteration O(n)
• in list O(n)
Dictionaries: n is len(d)
worst case:
• index O(n)
• store O(n)
• length O(n)
• delete O(n)
• iteration O(n)
average case:
• index O(1)
• store O(1)
• delete O(1)
• iteration O(n)
MIT OpenCourseWare
https://ocw.mit.edu
For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.