COMPLEXITY OF ALGORITHMS

CONTENTS

- Introduction
- Complexity Classes
- Complexity of Algorithms
- Time Complexity
- Algorithm Analysis
- Asymptotic Notations
- Complexity of Problems

INTRODUCTION

- Complexity
  - The state of being formed of many parts; the state of being difficult to understand (Oxford Advanced Learner's Dictionary)

COMPLEXITY CLASSES

- A complexity class is a set of problems of related complexity.
- Complexity classes help researchers group problems based on how much time and space they require to solve the problems and to verify the solutions.
- Complexity theory is the branch of the theory of computation that deals with the resources required to solve a problem.

TYPES OF COMPLEXITY CLASSES

- P class
- NP class
- Co-NP class
- NP-hard
- NP-complete

P CLASS

- P stands for Polynomial Time. It is the class of decision problems (problems with a "yes" or "no" answer) that can be solved by a deterministic machine in polynomial time.
- Features:
  - Solutions to P problems are easy to find.
  - P is often described as the class of computational problems that are solvable and tractable.
  - Tractable: problems that can be solved in theory as well as in practice.
  - Intractable: problems that can be solved in theory but not in practice.
- Examples:
  - Calculating the greatest common divisor, finding a maximum matching, linear programming and merge sort (a sketch of the first example follows below).
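A hedged illustration of the first example above (not part of the original slides): Euclid's algorithm computes the greatest common divisor with a number of division steps that grows only polynomially in the number of digits of the input, so the problem is in P.

```python
# A minimal sketch: Euclid's algorithm for the greatest common divisor.
def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b   # replace (a, b) with (b, a mod b) until the remainder is 0
    return a

print(gcd(1071, 462))  # 21
```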

NP CLASS

- NP stands for Non-deterministic Polynomial Time. It is the class of decision problems that can be solved by a non-deterministic machine in polynomial time.
- Features:
  - Solutions to NP problems may be hard to find, since they are produced by a non-deterministic machine, but they are easy to verify.
  - Problems in NP can be verified by a deterministic Turing machine in polynomial time.
- Examples:
  - The Boolean satisfiability problem (SAT), the Hamiltonian path problem, the traveling salesman problem and graph coloring (a sketch of a SAT verifier follows below).
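To illustrate the "easy to verify" feature, here is a hedged sketch of a polynomial-time SAT verifier. The clause encoding (a positive integer k for variable k, a negative integer for its negation) is an assumption made for this example only.

```python
# Hedged sketch: verifying a SAT certificate in time linear in the formula size.
# Assumed encoding: a formula is a list of clauses; each clause is a list of
# integers, where k means "variable k" and -k means "NOT variable k".
def verify_sat(clauses, assignment):
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    # Every clause must contain at least one true literal.
    return all(any(literal_true(lit) for lit in clause) for clause in clauses)

# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
formula = [[1, 2], [-1, 3], [-2, -3]]
print(verify_sat(formula, {1: True, 2: False, 3: True}))  # True
print(verify_sat(formula, {1: True, 2: True, 3: True}))   # False
```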

CO-NP CLASS

- The complement of the NP class.
- If the answer to a problem in Co-NP is "no", then there is a proof of this that can be checked in polynomial time.
- Features:
  - If a problem X is in NP, then its complement X' is in Co-NP.
  - For an NP or Co-NP problem, there is no need to verify all the answers in polynomial time; it is enough to verify one particular "yes" or "no" answer in polynomial time.
- Examples:
  - Primality testing and integer factorization.

NP-HARD

- An NP-hard problem is at least as hard as the hardest problems in NP: every problem in NP reduces to it.
- Features:
  - Not all NP-hard problems are in NP.
  - They can take a long time to check: if a solution for an NP-hard problem is given, it may take a long time to check whether it is right or not.
  - A problem A is NP-hard if, for every problem L in NP, there exists a polynomial-time reduction from L to A.
- Examples:
  - The halting problem and the "no Hamiltonian cycle" problem.

NP-COMPLETE

- A problem is NP-complete if it is both in NP and NP-hard. NP-complete problems are the hardest problems in NP.
- Features:
  - Any problem in NP can be transformed (reduced) into an NP-complete problem in polynomial time.
  - If one could solve an NP-complete problem in polynomial time, then one could also solve any NP problem in polynomial time.
- Examples:
  - The knapsack problem, Hamiltonian cycle, satisfiability and vertex cover (a sketch contrasting solving and verifying follows below).
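A hedged sketch of the gap between solving and verifying, using subset sum, a special case of the knapsack problem listed above: the brute-force solver tries exponentially many subsets, while the verifier checks a proposed subset (a certificate) in polynomial time.

```python
# Hedged sketch: solving subset sum by brute force (exponential) versus
# verifying a proposed certificate (polynomial).
from itertools import combinations

def solve_subset_sum(values, target):
    """Try all 2^n subsets until one sums to the target."""
    for r in range(len(values) + 1):
        for subset in combinations(values, r):
            if sum(subset) == target:
                return subset
    return None

def verify_subset_sum(values, target, certificate):
    """Check a proposed subset in polynomial time."""
    remaining = list(values)
    for v in certificate:
        if v not in remaining:
            return False
        remaining.remove(v)
    return sum(certificate) == target

values, target = [3, 34, 4, 12, 5, 2], 9
certificate = solve_subset_sum(values, target)
print(certificate)                                     # (4, 5)
print(verify_subset_sum(values, target, certificate))  # True
```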

INTRODUCTION

- Complexity of Algorithms
- Complexity of Problems

COMPLEXITY OF ALGORITHMS

COMPLEXITY OF ALGORITHMS

- Algorithm – a finite sequence of instructions that solves a problem.
- The word is believed to derive from the name of the 9th-century mathematician al-Khwarizmi.
- Computer program – an implementation of an algorithm on a computer.
- Is one algorithm better than another?

COMPLEXITY OF ALGORITHMS

- Complexity:
  - Time complexity – the amount of time required to execute an algorithm.
  - Space complexity – the amount of memory required to execute an algorithm.
- Often there is a space/time trade-off.
- Here we consider time complexity.

TIME COMPLEXITY

- The absolute running time depends on:
  - how fast the computer is
  - the RAM capacity of the computer
  - the OS the computer uses
  - the quality of code generated by the compiler
  - etc.
- Absolute time is therefore not useful as a measure of an algorithm's performance.

TIME COMPLEXITY

- One way to compare algorithms would be to count the instructions that the algorithm requires to solve a problem.
- The number of instructions will vary depending on the input.
- So compute the number of instructions as a function of the input size.
- But do we need to count all instructions?

TIME COMPLEXITY

- Many algorithms consist of a basic loop which is executed once for each item of the input.
- As the input size grows, the number of instructions carried out in the loop far exceeds those before and after the loop.
- To simplify the calculations, do not count instructions outside the basic loop.
- Do we need to count all instructions inside the loop?

TIME COMPLEXITY

- The number of instructions inside the loop does not vary widely between any two algorithms that solve a specific problem!
- What does differ is the number of times the loop is executed.
- So, count the number of times the loop is executed.

TIME COMPLEXITY

- In a sorting algorithm, the loop compares the current element of the list with the item to be placed.
- In this case it is the number of comparisons that we count.
- The number of comparisons depends on the size of the list.
- So express the number of comparisons in terms of the list length.

TIME COMPLEXITY

- Consider two algorithms A and B that solve the same class of problems.
- The time complexity of A is 5,000n; that of B is 1.1^n, for an input with n elements.
- For n = 10, A requires 50,000 steps, but B only 3, so B seems to be superior to A.
- For n = 1,000, however, A requires 5,000,000 steps, while B requires about 2.5 × 10^41 steps (a quick numeric check follows below).

TIME COMPLEXITY

- What is important is the growth of the complexity functions.
- The growth of time and space complexity with increasing input size n is a suitable measure for the comparison of algorithms.

INSERTION SORT ALGORITHM

At each step, the first element of the unsorted portion of the list is "inserted" into its place in the sorted portion:

8 5 2 6 9 4 6
5 8 2 6 9 4 6
2 5 8 6 9 4 6
2 5 6 8 9 4 6
2 5 6 8 9 4 6   (9 is already in place)
2 4 5 6 8 9 6
2 4 5 6 6 8 9

We're done!
ALGORITHM ANALYSIS TIME!

INSERTION-SORT(A)                                             cost   times
1  for j ← 2 to length[A]                                     c1     n
2      do key ← A[j]                                          c2     n − 1
3         (Insert A[j] into the sorted sequence A[1 .. j − 1].)  0   n − 1
4         i ← j − 1                                           c4     n − 1
5         while i > 0 and A[i] > key                          c5     Σ_{j=2..n} t_j
6             do A[i + 1] ← A[i]                              c6     Σ_{j=2..n} (t_j − 1)
7                i ← i − 1                                    c7     Σ_{j=2..n} (t_j − 1)
8         A[i + 1] ← key                                      c8     n − 1
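A hedged, runnable transcription of the pseudocode into Python (0-based indexing; an addition, not part of the original slides). The `shifts` counter records how many times line 6 executes, so the best and worst cases analysed on the next slides can be observed directly.

```python
# Runnable sketch of INSERTION-SORT with a counter for element shifts (line 6).
def insertion_sort(a):
    shifts = 0
    for j in range(1, len(a)):            # line 1: for j <- 2 to length[A]
        key = a[j]                        # line 2
        i = j - 1                         # line 4
        while i >= 0 and a[i] > key:      # line 5
            a[i + 1] = a[i]               # line 6: shift the larger element right
            i -= 1                        # line 7
            shifts += 1
        a[i + 1] = key                    # line 8
    return a, shifts

print(insertion_sort([8, 5, 2, 6, 9, 4, 6]))   # ([2, 4, 5, 6, 6, 8, 9], 10)
print(insertion_sort(list(range(10))))         # already sorted: 0 shifts
print(insertion_sort(list(range(10, 0, -1))))  # reverse sorted: 45 = n(n-1)/2 shifts
```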
RUNNING TIME

T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·Σ_{j=2..n} t_j
       + c6·Σ_{j=2..n} (t_j − 1) + c7·Σ_{j=2..n} (t_j − 1) + c8·(n − 1),

where t_j is the number of times the while-loop test in line 5 is executed for that value of j.

The best case occurs when the array is already sorted, so t_j = 1 for every j, and the best-case running time is

T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·(n − 1) + c8·(n − 1)
     = (c1 + c2 + c4 + c5 + c8)·n − (c2 + c4 + c5 + c8)
     = a·n − b,

a linear function of n.

RUNNING TIME

The worst case occurs when the array is in reverse sorted order, so t_j = j for every j, and the worst-case running time is

T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·(n(n + 1)/2 − 1)
       + c6·(n(n − 1)/2) + c7·(n(n − 1)/2) + c8·(n − 1)
     = (c5/2 + c6/2 + c7/2)·n^2 + (c1 + c2 + c4 + c5/2 − c6/2 − c7/2 + c8)·n
       − (c2 + c4 + c5 + c8)
     = a·n^2 + b·n + c,

which is a quadratic function of n.
OTHER WAYS TO MEASURE TIME COMPLEXITY

- The average case – more difficult to compute because it requires some knowledge of what to expect on average, but it is a better measure of an algorithm. Bubble sort shares the same worst-case time complexity as insertion sort, but on average it is much worse.
- The best case – not a good measure of an algorithm's performance, because unless the best case is likely to occur repeatedly, comparisons between algorithms based on it are not very meaningful.

A QUICK LOOK AT SPACE COMPLEXITY

- Given the amount of memory in today's computers, space complexity is typically a secondary concern to time complexity, unless of course an algorithm's space requirements simply become too large.

WHY IS TIME COMPLEXITY IMPORTANT?

- It allows comparisons with other algorithms to determine which is more efficient.
- It is a way to determine whether or not something is going to take a reasonable amount of time to run. Time complexities like 2^n are no good: for n = 100 that would be 2^100 = 1,267,650,600,228,229,401,496,703,205,376 operations (which would take an extremely long time).

ASYMPTOTIC EFFICIENCY

- The order of growth of the running time of an algorithm gives a simple characterization of the algorithm's efficiency.
- It can be used to compare the relative performance of alternative algorithms.
- Although we can sometimes determine the exact running time of an algorithm, the extra precision is not usually worth the effort of computing it.

ASYMPTOTIC EFFICIENCY

- For large enough inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effects of the input size itself.
- We are concerned with how the running time of an algorithm increases with the size of the input in the limit (as the size of the input increases without bound).
- An algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.

ASYMPTOTIC NOTATIONS

- Used to describe the asymptotic running time of an algorithm.
- Defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}.
- Convenient for describing the worst-case running-time function T(n), which is usually defined only on integer input sizes.

ASYMPTOTIC NOTATIONS

- Θ-notation (big theta)
- O-notation (big oh)
- Ω-notation (big omega)
- o-notation (little oh)
- ω-notation (little omega)

BIG Θ NOTATION

Let f and g be functions from the integers or the real numbers to the real numbers.

- Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}.
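A worked instance of the definition (the constants below are one valid choice, not the only one):

```latex
% Example: \tfrac{1}{2}n^2 - 3n = \Theta(n^2),
% with c_1 = \tfrac{1}{14}, \; c_2 = \tfrac{1}{2}, \; n_0 = 7:
\tfrac{1}{14}\, n^2 \;\le\; \tfrac{1}{2}\, n^2 - 3n \;\le\; \tfrac{1}{2}\, n^2
\qquad \text{for all } n \ge 7 .
```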

BIG O NOTATION

- Provides an asymptotic upper bound.
- O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}.
- Used to describe the amount of time a given algorithm would take in the worst case, based on the input size n.
- For the sake of analysis, we ignore constant factors: O(c·f(n)) = O(f(n)), e.g. O(5n) = O(n).

NOTATIONAL ISSUES

Big-O notation is a way of comparing functions, but the notation is unconventional.

e.g.: 3x^3 + 5x^2 − 9 = O(x^3)

This doesn't mean "3x^3 + 5x^2 − 9 equals the function O(x^3)".
What it actually means is "3x^3 + 5x^2 − 9 is dominated by x^3".
Read it as: "3x^3 + 5x^2 − 9 is big-oh of x^3".

BIG O NOTATION

Example:
Show that f(x) = x^2 + 2x + 1 is O(x^2).

For x > 1 we have:

x^2 + 2x + 1 ≤ x^2 + 2x^2 + x^2
⇒ x^2 + 2x + 1 ≤ 4x^2

Therefore, for C = 4 and k = 1:
f(x) ≤ C·x^2 whenever x > k.

⇒ f(x) is O(x^2).
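A quick numeric spot-check of the bound derived above (a sketch only; the inequality chain on this slide is the actual proof):

```python
# Check f(x) = x^2 + 2x + 1 <= 4 * x^2 at a few sample points with x > 1.
for x in (2, 10, 100, 10_000):
    assert x**2 + 2*x + 1 <= 4 * x**2
print("f(x) <= 4*x^2 holds at all sampled points")
```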

INTUITIVE NOTION OF BIG-O

[Plots of y = 3x^3 + 5x^2 − 9 compared with y = x^3, y = x^2 and y = x, drawn over the domains [0, 2], [0, 5], [0, 10] and [0, 100].]
INTUITIVE NOTION OF BIG-O

In fact, 3x^3 + 5x^2 − 9 is smaller than 5x^3 for large enough values of x:

[Plot: y = 5x^3 and y = 3x^3 + 5x^2 − 9, with y = x^2 and y = x shown for comparison.]
BIG O NOTATION

Question: If f(x) is O(x^2), is it also O(x^3)?

Yes. x^3 grows faster than x^2, so x^3 also grows faster than f(x).

Therefore, we always have to find the smallest simple function g(x) for which f(x) is O(g(x)).

USEFUL RULES FOR BIG-O

- For any polynomial f(x) = a_n·x^n + a_(n−1)·x^(n−1) + … + a_0, where a_0, a_1, …, a_n are real numbers, f(x) is O(x^n).
- If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then (f1 + f2)(x) is O(max(g1(x), g2(x))).
- If f1(x) is O(g(x)) and f2(x) is O(g(x)), then (f1 + f2)(x) is O(g(x)).
- If f1(x) is O(g1(x)) and f2(x) is O(g2(x)), then (f1·f2)(x) is O(g1(x)·g2(x)).
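Applying these rules to the running examples from earlier slides (an illustrative combination, not from the original deck):

```latex
% With f_1(x) = 3x^3 + 5x^2 - 9 = O(x^3) and f_2(x) = x^2 + 2x + 1 = O(x^2):
(f_1 + f_2)(x) = O\bigl(\max(x^3, x^2)\bigr) = O(x^3), \qquad
(f_1 \cdot f_2)(x) = O(x^3 \cdot x^2) = O(x^5).
```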

BIG Ω NOTATION

- Provides an asymptotic lower bound.
- Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}.

BIG Ω AND BIG Θ

Big-Ω is the reverse of big-O, i.e.
f(x) = Ω(g(x)) ⟺ g(x) = O(f(x)),
so f(x) asymptotically dominates g(x).

Big-Θ is domination in both directions, i.e.
f(x) = Θ(g(x)) ⟺ f(x) = O(g(x)) and f(x) = Ω(g(x)).

A synonym for f = Θ(g) is "f is of order g".

LITTLE o NOTATION

- The asymptotic upper bound provided by O-notation may or may not be asymptotically tight.
- 2n^2 = O(n^2) is tight, but 2n = O(n^2) is not.
- o-notation is used to denote an upper bound that is not asymptotically tight.
- o(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0}.
- Example: 2n = o(n^2), but 2n^2 ≠ o(n^2).
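For nonnegative f and strictly positive g, the definition can also be read as a limit (a standard, equivalent formulation under those assumptions):

```latex
f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0,
\qquad \text{e.g. } \lim_{n \to \infty} \frac{2n}{n^2} = 0
\quad \text{but} \quad \lim_{n \to \infty} \frac{2n^2}{n^2} = 2 \ne 0 .
```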

LITTLE ω NOTATION

- ω-notation is to Ω-notation as o-notation is to O-notation.
- ω-notation is used to denote a lower bound that is not asymptotically tight.
- ω(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0}.
- Example: n^2/2 = ω(n), but n^2/2 ≠ ω(n^2).
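The corresponding limit reading for ω (again assuming positive functions):

```latex
f(n) = \omega(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty,
\qquad \text{e.g. } \lim_{n \to \infty} \frac{n^2/2}{n} = \infty
\quad \text{but} \quad \lim_{n \to \infty} \frac{n^2/2}{n^2} = \tfrac{1}{2} \ne \infty .
```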

TIME COMPLEXITY, THE BIGGER PICTURE

- One of the big open questions in computer science right now is whether an NP-complete problem can be solved in polynomial time.
- NP-complete problems are problems that cannot, to our knowledge, be solved in polynomial time, but whose answers can be verified in polynomial time.

BIG-O: A GRAIN OF SALT

- Big-O notation gives a good first guess for deciding which algorithms are faster.
- But the guess isn't always correct.
- Consider n^6 vs. 1000·n^5.9.
- Asymptotically, the second is better. But…

BIG-O: A GRAIN OF SALT

[Plot: running time in days versus input size n for T(n) = n^6 and T(n) = 1000·n^5.9, assuming each operation takes a nanosecond, i.e. the computer runs at 1 GHz.]

BIG-O: A GRAIN OF SALT

In fact, 1000·n^5.9 only catches up to n^6 when

1000·n^5.9 = n^6, i.e.:
1000 = n^0.1, i.e.:
n = 1000^10 = 10^30 operations
= 10^30/10^9 = 10^21 seconds ≈ 10^21/(3 × 10^7) years
≈ 3 × 10^13 years
≈ 3 × 10^13/(2 × 10^10) universe lifetimes
≈ 1,500 universe lifetimes!
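The slide's arithmetic can be reproduced directly (a sketch using the slide's own assumptions: one operation per nanosecond and a universe lifetime of roughly 2 × 10^10 years):

```python
# Crossover point: 1000 * n**5.9 = n**6  =>  n**0.1 = 1000  =>  n = 1000**10.
n = 1000 ** 10
print(n == 10 ** 30)                       # True
seconds = n / 1e9                          # one operation per nanosecond (1 GHz)
years = seconds / (3600 * 24 * 365)
lifetimes = years / 2e10                   # universe lifetime ~ 2e10 years (slide's figure)
print(f"{years:.1e} years, {lifetimes:.0f} universe lifetimes")
# ~3.2e13 years, roughly 1,500 universe lifetimes
```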

COMPLEXITY OF PROBLEMS

ALGORITHM VS. PROBLEM COMPLEXITY

- Algorithmic complexity is defined by analysis of an algorithm.
- Problem complexity is defined by:
  - an upper bound – defined by an algorithm
  - a lower bound – defined by a proof

THE UPPER BOUND

- Defined by an algorithm.
- It tells us that we know we can do at least this well; perhaps we can do better.
- It is lowered by a better algorithm.
- "For problem X, the best algorithm was O(N^3), but my new algorithm is O(N^2)."

THE LOWER BOUND

- Defined by a proof.
- It tells us that we know we can do no better than this; it may be worse.
- It is raised by a better proof.
- "For problem X, the strongest proof showed that it required O(N), but my new, stronger proof shows that it requires at least O(N^2)."

UPPER AND LOWER BOUNDS

- The upper bound is the best algorithmic solution that has been found for a problem.
  - "What's the best that we know we can do?"
- The lower bound is the best solution that is theoretically possible.
  - "What cost can we prove is necessary?"

CHANGING THE BOUNDS

Upper bound – lowered by a better algorithm
Lower bound – raised by a better proof

OPEN PROBLEMS

The upper and lower bounds differ, leaving an unknown gap between them.

Upper bound – lowered by a better algorithm
Lower bound – raised by a better proof

CLOSED PROBLEMS

The upper and lower bounds are identical.

Upper bound = lower bound

CLOSED PROBLEMS

- Better algorithms are still possible.
- Better algorithms will not provide an improvement detectable by "big O".
- Better algorithms can improve the constant costs hidden in "big O" characterizations.

TRACTABLE VS. INTRACTABLE

- Problems are tractable if the upper and lower bounds have only polynomial factors:
  - O(log N)
  - O(N)
  - O(N^K), where K is a constant
- Problems are intractable if the upper and lower bounds have an exponential factor:
  - O(N!)
  - O(N^N)
  - O(2^N)

REFERENCES

- Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest and Clifford Stein, Introduction to Algorithms, 2nd ed., McGraw-Hill, 2001.
- Ian Parberry, Lecture Notes on Algorithm Analysis and Computational Complexity, 2001.
- Peter Gács and László Lovász, Complexity of Algorithms (Lecture Notes), 1999.
