Module 5 Complexity 2
Discrete mathematics slides
Uploaded by ccchengchuuu

Module #7 - Complexity

Module #5:
Algorithmic Complexity

113/10/17 1
What is complexity?

• The word complexity has a variety of technical meanings in different fields.
• There is a field of complex systems, which studies complicated, difficult-to-analyze, non-linear and chaotic natural & artificial systems.
• Another concept, informational complexity: the amount of information needed to completely describe an object. (An active research field.)
• We will study algorithmic complexity.


§2.2: Algorithmic Complexity

• The algorithmic complexity of a computation is some measure of how difficult it is to perform the computation.
• It measures some aspect of the cost of computation (in a general sense of cost).
• Common complexity measures:
  – “Time” complexity: # of ops or steps required
  – “Space” complexity: # of memory bits req’d


Complexity Depends on Input

• Most algorithms have different complexities for inputs of different sizes. (E.g. searching a long list takes more time than searching a short one.)
• Therefore, complexity is usually expressed as a function of input length.
• This function usually gives the complexity for the worst-case input of any given length.


Complexity & Orders of Growth

• Suppose algorithm A has worst-case time complexity (w.c.t.c., or just time) f(n) for inputs of length n, while algorithm B (for the same task) takes time g(n).
• Suppose that f ∈ o(g), also written f ≺ g.
• Which algorithm will be fastest on all sufficiently-large worst-case inputs?


Example 1: Max algorithm

• Problem: Find the simplest form of the exact order of growth (Θ) of the worst-case time complexity (w.c.t.c.) of the max algorithm, assuming that each line of code takes some constant time every time it is executed (with possibly different times for different lines of code).


Complexity analysis of max

procedure max(a1, a2, …, an: integers)
  v := a1                         t1
  for i := 2 to n                 t2
    if ai > v then v := ai        t3
  return v                        t4
(t1, …, t4: times for each execution of each line.)

What’s an expression for the exact total worst-case time? (Not its order of growth.)

Complexity analysis, cont.

procedure max(a1, a2, …, an: integers)
  v := a1                         t1
  for i := 2 to n                 t2
    if ai > v then v := ai        t3
  return v                        t4
(t1, …, t4: times for each execution of each line.)

w.c.t.c.:  t(n) = t1 + [∑_{i=2}^{n} (t2 + t3)] + t4


Complexity analysis, cont.

Now, what is the simplest form of the exact (Θ) order of growth of t(n)?

t(n) = t1 + [∑_{i=2}^{n} (t2 + t3)] + t4
     = Θ(1) + [∑_{i=2}^{n} Θ(1)] + Θ(1) = Θ(1) + (n − 1)·Θ(1)
     = Θ(1) + Θ(n)·Θ(1) = Θ(1) + Θ(n) = Θ(n)

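As a sanity check on this analysis, here is a small Python sketch (my illustration, not part of the slides) that instruments max and confirms that the loop-body lines execute n − 1 times:

```python
def max_with_counts(a):
    """The slides' max algorithm, instrumented to count how many
    times each line (t1..t4) executes."""
    counts = {"t1": 0, "t2": 0, "t3": 0, "t4": 0}
    v = a[0]; counts["t1"] += 1          # v := a1
    for i in range(1, len(a)):           # for i := 2 to n
        counts["t2"] += 1
        counts["t3"] += 1                # if ai > v then v := ai
        if a[i] > v:
            v = a[i]
    counts["t4"] += 1                    # return v
    return v, counts

v, c = max_with_counts([3, 1, 4, 1, 5, 9, 2, 6])
# t2 and t3 each run n - 1 = 7 times here, matching
# t(n) = t1 + sum_{i=2..n} (t2 + t3) + t4, i.e. Theta(n).
```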

Example 2: Linear Search

procedure linear search (x: integer,
    a1, a2, …, an: distinct integers)
  i := 1                                 t1
  while (i ≤ n ∧ x ≠ ai)                 t2
    i := i + 1                           t3
  if i ≤ n then location := i            t4
  else location := 0                     t5
  return location                        t6

Linear search analysis

• Worst case time complexity order:
  t(n) = t1 + [∑_{i=1}^{n} (t2 + t3)] + t4 + t5 + t6 = Θ(n)
• Best case:
  t(n) = t1 + t2 + t4 + t6 = Θ(1)
• Average case, if item is present:
  t(n) = t1 + [∑_{i=1}^{n/2} (t2 + t3)] + t4 + t5 + t6 = Θ(n)

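A runnable Python version of this analysis (my sketch, not from the slides) counts loop iterations for the best and worst cases:

```python
def linear_search(x, a):
    """The slides' linear search; returns (location, loop_iterations).
    Locations are 1-based; 0 means x is not in the list."""
    i, iters = 1, 0
    while i <= len(a) and x != a[i - 1]:
        i += 1
        iters += 1
    location = i if i <= len(a) else 0
    return location, iters

a = [2, 4, 6, 8, 10]
best = linear_search(2, a)    # first element: 0 iterations, Theta(1)
worst = linear_search(99, a)  # absent: n iterations, Theta(n)
```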

Review §2.2: Complexity

• Algorithmic complexity = cost of computation.
• Focus on time complexity (space & energy are also important).
• Characterize complexity as a function of input size: worst-case, best-case, average-case.
• Use orders-of-growth notation to concisely summarize the growth properties of complexity functions.

Example 3: Binary Search

procedure binary search (x: integer, a1, a2, …,
    an: distinct integers)
  i := 1
  j := n                                     Θ(1)
  while i < j begin
    m := ⌊(i + j)/2⌋
    if x > am then i := m + 1
    else j := m                              Θ(1)
  end
  if x = ai then location := i else location := 0
  return location                            Θ(1)

Key question: How many loop iterations?

Binary search analysis

• Suppose n = 2^k.
• The original range from i = 1 to j = n contains n elems.
• Each iteration: the size j − i + 1 of the range is cut in half.
• The loop terminates when the size of the range is 1 = 2^0 (i = j).
• Therefore, the number of iterations is k = log2 n, which is Θ(log2 n) = Θ(log n).
• Even for n ≠ 2^k (not an integral power of 2), the time complexity is still Θ(log2 n) = Θ(log n).

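The iteration count is easy to confirm in Python (a sketch of mine, not from the slides): for n = 2^k the loop below runs exactly k times, whatever x is, because the range size halves every iteration.

```python
def binary_search(x, a):
    """The slides' binary search on a sorted 1-based list;
    returns (location, loop_iterations), location 0 = not found."""
    i, j = 1, len(a)
    iters = 0
    while i < j:
        iters += 1
        m = (i + j) // 2          # m := floor((i + j) / 2)
        if x > a[m - 1]:
            i = m + 1
        else:
            j = m
    location = i if x == a[i - 1] else 0
    return location, iters

n = 2 ** 10                        # n = 2^k with k = 10
a = list(range(1, n + 1))
loc, iters = binary_search(777, a)
# iters == 10 == log2(n), regardless of which x we search for.
```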
Names for some orders of growth

• Θ(1)            Constant
• Θ(log_c n)      Logarithmic (same order for all constants c)
• Θ((log n)^c)    Polylogarithmic (with c a constant)
• Θ(n)            Linear
• Θ(n^c)          Polynomial
• Θ(c^n), c > 1   Exponential
• Θ(n!)           Factorial
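To make the hierarchy concrete, this small Python check (my addition, not part of the slides) evaluates one representative of each named order at n = 20 and verifies that each dominates the previous:

```python
import math

n = 20
growth = [
    1,                     # Theta(1)          constant
    math.log2(n),          # Theta(log n)      logarithmic
    math.log2(n) ** 2,     # Theta((log n)^2)  polylogarithmic
    n,                     # Theta(n)          linear
    n ** 2,                # Theta(n^2)        polynomial
    2 ** n,                # Theta(2^n)        exponential
    math.factorial(n),     # Theta(n!)         factorial
]
# At this n (and every larger n) the list is strictly increasing.
```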

Problem Complexity

• The complexity of a computational problem or task is (the order of growth of) the complexity of the algorithm with the lowest order of growth of complexity for solving that problem or performing that task.
• E.g. the problem of searching an ordered list has at most logarithmic time complexity. (Its complexity is O(log n).)


What computers can or cannot do

• From computational theory, problems divide into:
  – Solvable problems
    – Tractable problems (deterministic polynomial algorithm ⇒ P)
    – Intractable problems (non-deterministic polynomial ⇒ NP)
  – Unsolvable problems


Unsolvable problems

• Turing discovered in the 1930’s that there are problems unsolvable by any algorithm.
  – Or equivalently, there are undecidable yes/no questions, and uncomputable functions.
• Example: the halting problem.
  – Given an arbitrary algorithm and its input, will that algorithm eventually halt, or will it continue forever in an “infinite loop”?


The Halting Problem (Turing ’36)

• The halting problem was the first mathematical function proven to have no algorithm that computes it!
  – We say it is uncomputable.
• The desired function is Halts(P, I) :≡ the truth value of this statement:
  – “Program P, given input I, eventually terminates.”
• Theorem: Halts is uncomputable!   [Alan Turing, 1912-1954]
  – I.e., there does not exist any algorithm A that computes Halts correctly for all possible inputs.
• Its proof is thus a non-existence proof.
• Corollary: general impossibility of predictive analysis of arbitrary computer programs.

The Proof

• Given any arbitrary program H(P, I), consider the algorithm Breaker, defined as:

procedure Breaker(P: a program)
  halts := H(P, P)
  if halts then while T begin end

• Note that Breaker(Breaker) halts iff H(Breaker, Breaker) = F.
• So H does not compute the function Halts!
(Breaker makes a liar out of H, by doing the opposite of whatever H predicts.)

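The diagonal argument can be acted out in Python (my illustration; the real Breaker's infinite loop is replaced by a marker so the demo itself terminates). Whatever a candidate oracle H predicts about Breaker(Breaker), Breaker does the opposite:

```python
def make_breaker(h):
    """Build the slides' Breaker from a candidate halting oracle h.
    A real Breaker would actually loop forever; here we return the
    marker 'LOOPS' instead, so the demonstration halts."""
    def breaker(p):
        if h(p, p):           # H claims P(P) halts...
            return "LOOPS"    # ...so Breaker (would) loop forever
        return "HALTS"        # ...otherwise Breaker halts at once
    return breaker

# No oracle can be right about (Breaker, Breaker); try two extremes:
results = []
for h in (lambda p, i: True, lambda p, i: False):
    breaker = make_breaker(h)
    predicted = h(breaker, breaker)           # H's prediction
    actual = breaker(breaker) == "HALTS"      # what Breaker does
    results.append(predicted != actual)       # H is always wrong
```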

Tractable vs. intractable

• A problem or algorithm with at most polynomial time complexity is considered tractable (or feasible). P is the set of all tractable problems.
• A problem or algorithm that has more than polynomial complexity is considered intractable (or infeasible).
• Note that n^1,000,000 is technically tractable, but really impossible, while n^(log log log n) is technically intractable, but easy. Such cases are rare, though.


P vs. NP

• NP is the set of problems for which there exists a tractable algorithm for checking solutions to see if they are correct.
• We know P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper (i.e., that P ≠ NP rather than P = NP).
• Whoever first proves it will be famous!


Computer Time Examples

Assume time = 1 ns (10^-9 second) per op; problem size = n bits; #ops a function of n as shown.

#ops(n)      n = 10 (1.25 bytes)    n = 10^6 (125 kB)
log2 n       3.3 ns                 19.9 ns
n            10 ns                  1 ms
n log2 n     33 ns                  19.9 ms
n^2          100 ns                 16 m 40 s
2^n          1.024 μs               10^301,004.5 Gyr
n!           3.63 ms                Ouch!
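The table entries follow directly from the 1-ns-per-op assumption; this Python sketch (my addition, not part of the slides) recomputes several of them:

```python
import math

n_small, n_big = 10, 10 ** 6          # the two problem sizes above
# With 1 ns per op, the time in ns equals the op count:
t_log  = math.log2(n_small)           # ~3.3 ns
t_lin  = n_big                        # 10^6 ns = 1 ms
t_exp  = 2 ** n_small                 # 1024 ns ~ 1.024 microseconds
t_fact = math.factorial(n_small)      # 3,628,800 ns ~ 3.63 ms

# 2^(10^6) ns expressed in gigayears, via base-10 logs
# (the number itself is far too large for a float):
log10_ns_per_gyr = math.log10(1e9 * 365.25 * 24 * 3600 * 1e9)
log10_gyr = n_big * math.log10(2) - log10_ns_per_gyr   # ~301004.5
```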

Things to Know

• Definitions of algorithmic complexity, time complexity, worst-case complexity; names of orders of growth of complexity.
• How to analyze the worst case, best case, or average case order of growth of time complexity for simple algorithms.


Exercise

• For the following procedure A, analyze and provide the worst-case time complexity. We will use the following assumptions:
• Suppose A and B are both procedures. B takes Θ(n) time to compute. Function A is a recursive one, defined as follows.
• (You can assume m is 2^k.)

procedure A(x, a1, a2, …, am)   /* a1, a2, …, am is a sorted list */
1. if (m ≤ 4) then return
2. else { call A(a1, a2, …, a⌊m/2⌋);
3.        call A(a⌊m/2⌋+1, …, am); }
4. call B(x, a2, a4, a6, …, a⌊m/2⌋·2)   /* ⌊m/2⌋·2 is the closest even integer ≤ m */
5. return

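One plausible reading of this exercise (my interpretation, to be checked against the intended answer) is the recurrence T(m) = 2·T(m/2) + Θ(m): two half-size recursive calls plus one call to B. The sketch below tallies that cost model and shows T(m)/(m·log2 m) settling toward a constant, consistent with Θ(m log m):

```python
# Hypothetical cost model: count only B's work, charging m/2 "ops"
# for the call on the even-indexed half of an m-element list.
def T(m):
    if m <= 4:
        return 1                      # base case: constant work
    return 2 * T(m // 2) + m // 2     # two half-size calls + B

# For m = 2^k, T(m) / (m * log2 m) approaches a constant (here 1/2),
# which is consistent with T(m) = Theta(m log m).
ratios = [T(2 ** k) / (2 ** k * k) for k in range(5, 15)]
```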
