CS112 Week14

The document discusses algorithmic complexity, defining it as a measure of the difficulty of performing computations, focusing on time and space complexity. It explains how complexity varies with input size and introduces concepts like tractable vs. intractable problems, the halting problem, and the P vs. NP conjecture. The document emphasizes the importance of understanding orders of growth in complexity for algorithm analysis.


Spring 2025

Prof. Dr. Hasan F. ATEŞ


Özyeğin University, İstanbul, Turkey
[email protected]
Module #7 - Complexity

Algorithmic Complexity

Rosen 5th ed., §2.3

(c) 2001-2003, Michael P. Frank



What is complexity?
• The word complexity has a variety of different
technical meanings in different research fields.
• There is a field of complex systems, which studies
complicated, difficult-to-analyze non-linear and
chaotic natural & artificial systems.
• Another concept, informational or descriptional
complexity: the amount of information needed to
completely describe an object.
– As studied by Kolmogorov, Chaitin, Bennett, others…
• In this course, we will study algorithmic or
computational complexity.


§2.3: Algorithmic Complexity


• The algorithmic complexity of a computation is,
most generally, a measure of how difficult it is to
perform the computation.
• That is, it measures some aspect of the cost of
computation (in a general sense of “cost”).
– Amount of resources required to do a computation.
• Some of the most common complexity measures:
– “Time” complexity: # of operations or steps required
– “Space” complexity: # of memory bits req’d
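As a rough illustration (not from the slides), the "number of operations" measure can be made concrete by instrumenting a function with an explicit step counter. The function sum_list and its counting convention below are hypothetical examples, assuming each assignment or addition counts as one step.

def sum_list(a):
    """Add up a list while counting the basic operations performed."""
    steps = 0
    total = 0                  # one assignment
    steps += 1
    for x in a:                # loop body executes len(a) times
        total += x             # one addition/assignment per element
        steps += 1
    return total, steps

value, steps = sum_list([3, 1, 4, 1, 5])
print(value, steps)            # steps grows linearly with input length: "time" cost of about n operations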


An interesting aside...
• Another, increasingly important measure of
complexity for computing is energy complexity –
– How much total physical energy is used up (rendered
unavailable) as a result of performing the computation?
• Motivations:
– Battery life, electricity cost, computer overheating!
– Computer performance within power constraints.


Complexity Depends on Input


• Most algorithms have different complexities
for inputs of different sizes.
– E.g. searching a long list typically takes more
time than searching a short one.
• Therefore, complexity is usually expressed
as a function of the input length.
– This function usually gives the complexity for
the worst-case input of any given length.


Complexity & Orders of Growth


• Suppose algorithm A has worst-case time
complexity (w.c.t.c., or just time) f(n) for
inputs of length n, while algorithm B (for
the same task) takes time g(n).
• Suppose that f ∈ o(g), i.e., f grows strictly more slowly than g; this is also written f ≺ g.
• Which algorithm will be fastest on all
sufficiently-large, worst-case inputs?
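A quick numerical check helps here (illustrative only; the worst-case time functions f and g below are made-up examples, not taken from the slides):

def f(n):                      # hypothetical worst-case time of algorithm A
    return 1000 * n

def g(n):                      # hypothetical worst-case time of algorithm B
    return n ** 2

for n in (10, 100, 1000, 10_000, 100_000):
    winner = "A" if f(n) < g(n) else "B"
    print(f"n={n:>6}: f(n)={f(n):>12}, g(n)={g(n):>12} -> {winner} is faster")
# Since f = o(g), algorithm A wins on every sufficiently large worst-case input (here, once n > 1000),
# even though algorithm B is faster for small inputs.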


Example 1: Max algorithm


• Problem: Find the simplest form of the
exact order of growth (Θ) of the worst-case
time complexity (w.c.t.c.) of the max
algorithm, assuming that each line of code
takes some constant time every time it is
executed (with possibly different times for
different lines of code).


Complexity analysis of max


procedure max(a1, a2, …, an: integers)
  v := a1                           (time t1)
  for i := 2 to n                   (time t2, each iteration)
    if ai > v then v := ai          (time t3, each iteration)
  return v                          (time t4)
(t1, …, t4 are the constant times for each execution of each line.)

First, what’s an expression for the exact total
worst-case time? (Not its order of growth.)

Complexity analysis, cont.


procedure max(a1, a2, …, an: integers)
  v := a1                           (time t1)
  for i := 2 to n                   (time t2, each iteration)
    if ai > v then v := ai          (time t3, each iteration)
  return v                          (time t4)

w.c.t.c.:
  t(n) = t1 + Σ_{i=2..n} (t2 + t3) + t4

Complexity analysis, cont.


Now, what is the simplest form of the exact
(Θ) order of growth of t(n)?

  t(n) = t1 + Σ_{i=2..n} (t2 + t3) + t4
       = Θ(1) + Σ_{i=2..n} Θ(1) + Θ(1)
       = Θ(1) + (n - 1)·Θ(1)
       = Θ(1) + Θ(n)
       = Θ(n)

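For concreteness, the max procedure above can be transcribed directly into Python; the per-line cost roles from the analysis are noted in comments (this rendering is mine, not part of the original slides):

def find_max(a):
    """Return the maximum of a non-empty list a (procedure max from the slides)."""
    v = a[0]                        # v := a1                 -- cost t1, once
    for i in range(1, len(a)):      # for i := 2 to n         -- cost t2 per iteration
        if a[i] > v:                # if ai > v then v := ai  -- cost t3 per iteration
            v = a[i]
    return v                        # return v                -- cost t4, once

# Total: t(n) = t1 + (n - 1)(t2 + t3) + t4, i.e. Θ(n), matching the derivation above.
print(find_max([31, 41, 59, 26]))   # 59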

Example 2: Linear Search


procedure linear search (x: integer, a1, a2, …, an: distinct integers)
  i := 1                                     (time t1)
  while (i ≤ n ∧ x ≠ ai)                     (time t2, each test)
    i := i + 1                               (time t3, each iteration)
  if i ≤ n then location := i                (time t4)
  else location := 0                         (time t5)
  return location                            (time t6)

Linear search analysis


• Worst-case time complexity order:
  t(n) = t1 + Σ_{i=1..n} (t2 + t3) + t4 + t5 + t6 = Θ(n)
• Best case:
  t(n) = t1 + t2 + t4 + t6 = Θ(1)
• Average case, if item is present:
  t(n) = t1 + Σ_{i=1..n/2} (t2 + t3) + t4 + t5 + t6 = Θ(n)

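A direct Python transcription of the linear-search pseudocode (mine, for illustration), with the best and worst cases noted:

def linear_search(x, a):
    """Return the 1-based location of x in list a, or 0 if x is absent."""
    i = 1
    while i <= len(a) and x != a[i - 1]:   # worst case: about n tests -> Θ(n)
        i += 1
    return i if i <= len(a) else 0         # best case: x == a1, loop never runs -> Θ(1)

print(linear_search(7, [4, 7, 9]))   # 2
print(linear_search(5, [4, 7, 9]))   # 0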

Review §2.3: Complexity


• Algorithmic complexity = cost of computation.
• Focus on time complexity for our course.
– Although space & energy are also important.
• Characterize complexity as a function of input
size: Worst-case, best-case, or average-case.
• Use orders-of-growth notation to concisely
summarize the growth properties of complexity
functions.


Example 3: Binary Search


procedure binary search (x: integer, a1, a2, …, an: distinct integers, sorted smallest to largest)
  i := 1                                               Θ(1)
  j := n
  while i < j begin
    m := ⌊(i + j)/2⌋                                   Θ(1) per iteration
    if x > am then i := m + 1 else j := m
  end
  if x = ai then location := i else location := 0      Θ(1)
  return location

Key question: How many loop iterations?

Binary search analysis


• Suppose that n is a power of 2, i.e., ∃k: n = 2^k.
• The original range from i=1 to j=n contains n items.
• Each iteration: the size j − i + 1 of the range is cut in ~half.
• The loop terminates when the size of the range is 1 = 2^0 (i.e., i = j).
• Therefore, the number of iterations is:
  k = log_2 n, so the complexity is Θ(log_2 n) = Θ(log n).
• Even for n ≠ 2^k (not an integral power of 2),
  the time complexity is still Θ(log_2 n) = Θ(log n).

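The halving argument can be checked directly; this Python sketch follows the pseudocode above, with an iteration counter added purely for illustration:

import math

def binary_search(x, a):
    """1-based location of x in sorted list a (0 if absent), plus the loop-iteration count."""
    i, j = 1, len(a)
    iterations = 0
    while i < j:
        iterations += 1
        m = (i + j) // 2                  # m := floor((i + j) / 2)
        if x > a[m - 1]:
            i = m + 1
        else:
            j = m
    location = i if a and x == a[i - 1] else 0
    return location, iterations

n = 2 ** 10                               # n = 1024, so k = log2(n) = 10
a = list(range(1, n + 1))
print(binary_search(n, a), math.log2(n))  # (1024, 10) 10.0 -- iterations = log2(n)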

Names for some orders of growth


• (1) Constant
• (logc n) Logarithmic (same order c)
• (logc n) Polylogarithmic (With c
a constant.)
• (n) Linear
• (nc) Polynomial (for any c)
• (cn) Exponential (for c>1)
• (n!) Factorial

Problem Complexity
• The complexity of a computational problem
or task is (the order of growth of) the
complexity of the algorithm with the lowest
order of growth of complexity for solving
that problem or performing that task.
• E.g. the problem of searching an ordered
list has at most logarithmic time
complexity. (Complexity is O(log n).)

Tractable vs. intractable


• A problem or algorithm with at most polynomial
time complexity is considered tractable (or
feasible). P is the set of all tractable problems.
• A problem or algorithm that has complexity
greater than polynomial is considered intractable
(or infeasible).
• Note that n^1,000,000 is technically tractable, but really
very hard. n^(log log log n) is technically intractable, but
easy. Such cases are rare, though.


Computer Time Examples


Assume time = 1 ns (10^-9 second) per op, problem size = n bits, and #ops is a function of n, as shown.

  #ops(n)      n = 10 (1.25 bytes)    n = 10^6 (125 kB)
  log_2 n      3.3 ns                 19.9 ns
  n            10 ns                  1 ms
  n log_2 n    33 ns                  19.9 ms
  n^2          100 ns                 16 m 40 s
  2^n          1.024 μs               10^301,004.5 Gyr
  n!           3.63 ms                Ouch!

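The polynomial rows of this table can be reproduced (approximately) with a short script; the formatting helper fmt below is just an illustrative convenience, not part of the slides:

import math

def fmt(seconds):
    """Very rough human-readable duration, for illustration only."""
    if seconds < 1e-6:
        return f"{seconds * 1e9:.1f} ns"
    if seconds < 1:
        return f"{seconds * 1e3:.2f} ms"
    return f"{seconds:.0f} s"

ops = {
    "log2 n":   lambda n: math.log2(n),
    "n":        lambda n: float(n),
    "n log2 n": lambda n: n * math.log2(n),
    "n^2":      lambda n: float(n) ** 2,
}
for name, f in ops.items():
    for n in (10, 10 ** 6):
        print(f"{name:10s} n={n:<8d} {fmt(f(n) * 1e-9)}")   # assuming 1 ns per operation
# 2^n and n! at n = 10^6 overflow ordinary floats -- hence the astronomical entries in the table.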

Unsolvable problems
• Turing discovered in the 1930’s that there
are problems unsolvable by any algorithm.
– Or equivalently, there are undecidable yes/no
questions, and uncomputable functions.
• Classic example: the halting problem.
– Given an arbitrary algorithm and its input, will
that algorithm eventually halt, or will it
continue forever in an “infinite loop?”


The Halting Problem (Turing ’36)


• The halting problem was the first mathematical
function proven to have no algorithm that
computes it!
– We say, it is uncomputable.
• The desired function is Halts(P,I) :≡ the truth
value of this statement:
– “Program P, given input I, eventually terminates.”
• Theorem: Halts is uncomputable!
– I.e., there does not exist any algorithm A that
computes Halts correctly for all possible inputs.
• Its proof is thus a non-existence proof.
(Pictured on the original slide: Alan Turing, 1912-1954)
• Corollary: General impossibility of predictive analysis of
arbitrary computer programs.


Proving the Theorem of the Undecidability of the Halting Problem
• Given any arbitrary program H(P,I),
• Consider algorithm Foiler, defined as:

  procedure Foiler(P: a program)
    halts := H(P, P)
    if halts then loop forever

  (Foiler makes a liar out of H, by simply doing the opposite of whatever H predicts it will do!)
• Note that Foiler(Foiler) halts iff H(Foiler, Foiler) = F.
• So H does not compute the function Halts!
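The same diagonal argument can be written as a (necessarily hypothetical) Python sketch; halts below stands for the assumed decider H, which the theorem says cannot exist:

def halts(program, program_input):
    """Assumed decider H(P, I): True iff program(program_input) eventually terminates."""
    raise NotImplementedError("no such algorithm exists -- that is the theorem")

def foiler(program):
    """Do the opposite of whatever halts predicts the program would do on itself."""
    if halts(program, program):
        while True:          # halts says it halts, so loop forever
            pass
    # halts says it loops forever, so halt immediately

# foiler(foiler) halts  <=>  halts(foiler, foiler) is False,
# contradicting the assumption that halts answers correctly on every input.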

P vs. NP
• NP is the set of problems for which there
exists a tractable algorithm for checking a
proposed solution to tell if it is correct.
• We know that P ⊆ NP, but the most famous
unproven conjecture in computer science is
that this inclusion is proper.
– i.e., that P ≠ NP rather than P = NP.
• Whoever first proves (or disproves) this will be famous!
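As an illustration of "a tractable algorithm for checking a proposed solution" (this subset-sum example is mine, not from the slides): finding a subset of numbers that sums to a target may require examining exponentially many subsets, but verifying a proposed subset takes only polynomial time:

from collections import Counter

def verify_subset_sum(numbers, target, proposed):
    """Polynomial-time check of a proposed certificate for subset-sum, a problem in NP."""
    uses_available = not (Counter(proposed) - Counter(numbers))  # proposed drawn from numbers
    return uses_available and sum(proposed) == target

print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [4, 5]))   # True: easy to *check*,
# even though *finding* such a subset may take exponential time in the worst case.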

Key Things to Know


• Definitions of algorithmic complexity, time
complexity, worst-case time complexity.
• Names of specific orders of growth of
complexity.
• How to analyze the worst case, best case, or
average case order of growth of time
complexity for simple algorithms.
