CS 201 Lecture 16 - Algorithm Complexity

Dr. Kholoud Nairoukh
Department of Computer Science
German-Jordanian University
Lecture 16
• Ω(g) = {f | g∈O(f)}
  "The functions that are at least order g."

• o(g) = {f | ∀c>0 ∃k ∀x>k : |f(x)| < |cg(x)|}
  "The functions that are strictly lower order than g." o(g) ⊆ O(g) − Ω(g).

• ω(g) = {f | ∀c>0 ∃k ∀x>k : |cg(x)| < |f(x)|}
  "The functions that are strictly higher order than g." ω(g) ⊆ Ω(g) − Θ(g).



Definitions of order-of-growth sets, for g: ℝ→ℝ:

• O(g) = {f | ∃c>0 ∃k ∀x>k : |f(x)| < |cg(x)|}
• o(g) = {f | ∀c>0 ∃k ∀x>k : |f(x)| < |cg(x)|}
• Ω(g) = {f | g∈O(f)}
• ω(g) = {f | g∈o(f)}
• Θ(g) = O(g) ∩ Ω(g)



• Informational or descriptional complexity: the amount of information needed to completely describe an object.

• In this course, we will study algorithmic or computational complexity.



• The algorithmic complexity of a computation is, most generally, a measure of how difficult it is to perform the computation.

• That is, it measures some aspect of the cost of computation (in a general sense of "cost").

• Some of the most common complexity measures:
  ➢ "Time" complexity: # of operations or steps required
  ➢ "Space" complexity: # of memory bits required



• Most algorithms have different
complexities for inputs of different sizes.
Example: searching a long list typically
takes more time than searching a short
one.
• Therefore, complexity is usually expressed
as a function of the input length.
– This function usually gives the complexity
for the worst-case input of any given
length.
• Suppose algorithm A has worst-case time complexity (w.c.t.c., or just time) f(n) for inputs of length n, while algorithm B (for the same task) takes time g(n).

• Suppose that f∈o(g), also written f ≺ g.

• Which algorithm will be fastest on all sufficiently large, worst-case inputs? (A, since f is of strictly lower order than g.)



• Problem: Find the simplest form of the
exact order of growth () of the worst-case
time complexity (w.c.t.c.) of the max number
algorithm, assuming that each line of code
takes some constant time every time it is
executed (with possibly different times for
different lines of code).



procedure max(a1, a2, …, an: integers)
  v := a1                    t1
  for i := 2 to n            t2
    if ai > v then v := ai   t3
  return v                   t4
(t1–t4: the constant time to execute each line once)

First, what's an expression for the exact total worst-case time? (Not its order of growth.)



w.c.t.c. (for the procedure above, with the same per-line times):

  t(n) = t1 + [ Σ_{i=2}^{n} (t2 + t3) ] + t4
Now, what is the simplest form of the exact (Θ) order of growth of t(n)?

  t(n) = t1 + [ Σ_{i=2}^{n} (t2 + t3) ] + t4
       = Θ(1) + [ Σ_{i=2}^{n} Θ(1) ] + Θ(1) = Θ(1) + (n − 1)·Θ(1)
       = Θ(1) + Θ(n)·Θ(1) = Θ(1) + Θ(n) = Θ(n)



procedure linear_search(x: integer, a1, a2, …, an: distinct integers)
  i := 1                          t1
  while (i ≤ n ∧ x ≠ ai)          t2
    i := i + 1                    t3
  if i ≤ n then location := i     t4
  else location := 0              t5
  return location                 t6



• Worst-case time complexity order:
  t(n) = t1 + [ Σ_{i=1}^{n} (t2 + t3) ] + t4 + t5 + t6 = Θ(n)
• Best case:
  t(n) = t1 + t2 + t4 + t6 = Θ(1)
• Average case, if the item is present:
  t(n) = t1 + [ Σ_{i=1}^{n/2} (t2 + t3) ] + t4 + t5 + t6 = Θ(n)
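To make the best-versus-worst gap concrete, here is a sketch (my own code, using 0-based indexing internally but returning the slide's 1-based location) that counts loop iterations:

```python
# Sketch: linear search with an iteration counter. Best case (item first)
# does 0 loop iterations (Theta(1)); worst case (item absent) does n
# (Theta(n)).

def linear_search(x, a):
    i = 0
    iters = 0
    while i < len(a) and a[i] != x:
        iters += 1
        i += 1
    location = i + 1 if i < len(a) else 0   # 1-based; 0 means "not found"
    return location, iters

print(linear_search(1, [1, 2, 3, 4]))   # (1, 0)  best case
print(linear_search(9, [1, 2, 3, 4]))   # (0, 4)  worst case
```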



procedure binary_search(x: integer, a1, a2, …, an: distinct integers, sorted smallest to largest)
  i := 1
  j := n                                    Θ(1)
  while i < j begin                         ← How many loop iterations?
    m := ⌊(i+j)/2⌋
    if x > am then i := m+1 else j := m     Θ(1)
  end
  if x = ai then location := i else location := 0
  return location                           Θ(1)



• Suppose that n is a power of 2, i.e., ∃k: n = 2^k.
• The original range from i=1 to j=n contains n items.
• Each iteration: the size j−i+1 of the range is cut in ~half.
• The loop terminates when the size of the range is 1 = 2^0 (i=j).
• Therefore, the number of iterations is:
  k = log2 n, which is Θ(log2 n) = Θ(log n).
• Even for n ≠ 2^k (not an integral power of 2), the time complexity is still Θ(log2 n) = Θ(log n).
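The "exactly k iterations when n = 2^k" claim can be checked directly. This sketch (my 0-indexed rendering of the slide's pseudocode, with an added counter) searches for the last element, which forces the loop to run its full course:

```python
# Sketch: binary search with an iteration counter; for n = 2^k the while
# loop runs exactly k = log2(n) times.
import math

def binary_search(x, a):
    i, j = 0, len(a) - 1
    iters = 0
    while i < j:
        iters += 1
        m = (i + j) // 2
        if x > a[m]:
            i = m + 1
        else:
            j = m
    location = i + 1 if a[i] == x else 0   # 1-based; 0 means "not found"
    return location, iters

for k in (4, 10):
    n = 2 ** k
    _, iters = binary_search(n, list(range(1, n + 1)))
    print(iters == k == math.log2(n))  # True for both k values
```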



• Algorithmic complexity = cost of
computation.
• Focus on time complexity for our course.
• Characterize complexity as a function of
input size: Worst-case, best-case, or
average-case.
• Use orders-of-growth notation to concisely
summarize the growth properties of
complexity functions.



• (1) Constant
• (logc n) Logarithmic (same order c)
• (logc n) Polylogarithmic (With c
constant)
• (n) Linear
• (nc) Polynomial (for any c)
• (cn) Exponential (for c>1)
• (n!) Factorial



• The complexity of a computational problem or task is the order of growth of the complexity of the best algorithm for solving that problem or performing that task, i.e., the algorithm with the lowest order of growth.

• Example: the problem of searching an ordered list has at most logarithmic time complexity. (Its complexity is O(log n).)



• A problem or algorithm with at most polynomial time complexity is considered tractable (or feasible).
• A problem or algorithm whose complexity is greater than polynomial is considered intractable (or infeasible).
• Note that n^1,000,000 is technically tractable, but really very hard, while n^(log log log n) is technically intractable, but easy. Such cases are rare, though.
Assume time = 1 ns (10^−9 second) per operation (op), problem size = n bits, and #ops is a function of n, as shown.

  #ops(n)     n=10       n=10^6
  log2 n      3.3 ns     19.9 ns
  n           10 ns      1 ms
  n log2 n    33 ns      19.9 ms
  n^2         100 ns     16 m 40 s
  2^n         1.024 μs   10^301,004.5 Gyr
  n!          3.63 ms    Ouch!
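A few of the table's entries can be reproduced directly: at 1 ns per operation, the running time in seconds is simply #ops(n) × 10^−9. A small sketch (names are my own):

```python
# Sketch: convert operation counts to running times at 1 ns per op.
import math

NS = 1e-9  # seconds per operation

def time_seconds(ops, n):
    """Running time in seconds for an algorithm doing ops(n) operations."""
    return ops(n) * NS

print(time_seconds(lambda n: math.log2(n), 10))       # ~3.3e-9 s  (3.3 ns)
print(time_seconds(lambda n: n, 10**6))               # 1e-3 s     (1 ms)
print(time_seconds(lambda n: n * math.log2(n), 10))   # ~3.3e-8 s  (33 ns)
print(time_seconds(lambda n: math.factorial(n), 10))  # ~3.6e-3 s  (3.63 ms)
```

The 2^n entry for n = 10 works out the same way: 2^10 = 1024 ops × 1 ns ≈ 1.024 μs.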



• Alan Turing discovered that there are problems unsolvable by any algorithm (undecidable yes/no questions, and uncomputable functions).

• Example: the halting problem. Given an arbitrary algorithm and its input, will that algorithm eventually halt, or will it continue forever in an "infinite loop"?



• The halting problem was the first mathematical function proven to have no algorithm that computes it; i.e., it is uncomputable.

• The desired function is Halts(P, I) :≡ the truth value of the statement "Program P, given input I, eventually terminates."

• Theorem: Halts is uncomputable!
  – I.e., there does not exist any algorithm A that computes Halts correctly for all possible inputs.

• Its proof is thus a non-existence proof.

• Corollary: predictive analysis of arbitrary computer programs is impossible in general.
• Given any arbitrary program H(P, I) proposed as a computer of Halts,
• consider the algorithm Foiler, defined as:

procedure Foiler(P: a program)
  halts := H(P, P)
  if halts then loop forever

• Note that Foiler(Foiler) halts iff H(Foiler, Foiler) = F.
• So H does not compute the function Halts! Foiler makes a liar out of H, by simply doing the opposite of whatever H predicts it will do.
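The diagonal construction can be mimicked in code. In this sketch (entirely hypothetical: `H` here is a stand-in candidate decider, not a real halting analyzer), Foiler's actual behaviour is derived from its construction rather than by running it, since running it could genuinely loop forever:

```python
# Illustrative sketch of the diagonalization. Any candidate decider H is
# wrong about the input (Foiler, Foiler), because Foiler is built to do
# the opposite of whatever H predicts.

def make_foiler(H):
    """Build Foiler from a candidate halting decider H(P, I) -> bool."""
    def foiler(P):
        if H(P, P):        # H claims P, run on itself, halts...
            while True:    # ...so loop forever (never actually invoked here)
                pass
        # otherwise: halt immediately
    return foiler

def check_halting_candidate(H):
    """Compare H's prediction about Foiler(Foiler) with the actual behaviour."""
    foiler = make_foiler(H)
    prediction = H(foiler, foiler)
    actually_halts = not prediction   # follows from Foiler's construction
    return prediction, actually_halts

# A candidate H that always answers "halts" is wrong on Foiler:
print(check_halting_candidate(lambda P, I: True))   # (True, False)
```

Whichever boolean the candidate returns, the prediction and the actual behaviour disagree, which is exactly the contradiction in the proof.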
• P is the set of all tractable problems.
• NP is the set of problems for which there exists a tractable algorithm for checking a proposed solution to tell if it is correct.
• We know that P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper,
  – i.e., that P ≠ NP rather than P = NP.
• Whoever first proves this (or disproves it!) will be famous!
• Definitions of algorithmic complexity, time complexity, worst-case time complexity.

• Names of specific orders of growth of complexity.

• How to analyze the worst-case, best-case, or average-case order of growth of time complexity for simple algorithms.

