Automata and Complexity Theory
Operation on sets
• Let A and B be two sets. Then
A ⊆ B iff, for all x ∈ A, x ∈ B
A = B iff A ⊆ B and B ⊆ A
A ⊂ B iff A ⊆ B but A ≠ B
A ∪ B = {x : x ∈ A or x ∈ B}
A ∩ B = {x : x ∈ A and x ∈ B}
A × B = {(x, y) : x ∈ A and y ∈ B}
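These operations map directly onto Python's built-in set type; a quick sketch:

```python
# Set operations from the definitions above, using Python's built-in sets.
A = {1, 2}
B = {1, 2, 3}

print(A <= B)    # subset: A ⊆ B -> True
print(A == B)    # equality -> False
print(A < B)     # proper subset: A ⊂ B -> True
print(A | B)     # union
print(A & B)     # intersection

# Cartesian product A × B as a set of ordered pairs
product = {(x, y) for x in A for y in B}
print(len(product))  # |A| * |B| pairs
```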
Formal Language
• String:
– A string over Σ is a finite sequence of elements of Σ (possibly none) placed in order. So if Σ = {a, b} then strings over Σ can be a, ab, bbaa, abab and so on.
– Length of a String
It is the number of symbols present in a string (denoted by |S|). E.g.
• If S = ‘cabcad’, |S| = 6
• If |S| = 0, it is called an empty string (denoted by λ or ε)
Cont’d
Kleene Star
Definition: The set Σ* is the infinite set of all possible strings of all possible lengths over Σ, including λ.
Representation: Σ* = Σ^0 ∪ Σ^1 ∪ Σ^2 ∪ …
Example: If Σ = {a, b}, Σ* = {λ, a, b, aa, ab, ba, bb, …}
Kleene Closure / Plus
– Definition: The set Σ+ is the infinite set of all possible strings of all possible lengths over Σ, excluding λ.
– Representation: Σ+ = Σ^1 ∪ Σ^2 ∪ Σ^3 ∪ …, i.e. Σ+ = Σ* − {λ}
– Example: If Σ = {a, b}, Σ+ = {a, b, aa, ab, ba, bb, …}
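The finite slices Σ^0 ∪ Σ^1 ∪ … ∪ Σ^n of Σ* can be enumerated directly; a small sketch:

```python
from itertools import product

def sigma_star(alphabet, max_len):
    """All strings over `alphabet` of length 0..max_len (a finite slice of Σ*)."""
    return ["".join(p) for n in range(max_len + 1)
            for p in product(alphabet, repeat=n)]

words = sigma_star("ab", 2)
print(words)  # ['', 'a', 'b', 'aa', 'ab', 'ba', 'bb']

# Σ+ is the same set without the empty string λ:
sigma_plus = [w for w in words if w != ""]
print(sigma_plus)  # ['a', 'b', 'aa', 'ab', 'ba', 'bb']
```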
Cont’d
Transpose Operation
• For any string x ∈ Σ* and a ∈ Σ, (xa)^T = a(x)^T.
• A palindrome of even length can be obtained by the concatenation of a string and its transpose.
E.g. let Σ = {0, 1} and y = 01001. Then
y^T = (01001)^T
= 1(0100)^T
= 10(010)^T
= 100(01)^T
= 1001(0)^T
= 10010
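The recursive rule (xa)^T = a(x)^T translates directly into code; a minimal sketch:

```python
def transpose(x):
    """Recursive transpose using (xa)^T = a (x)^T; equivalent to x[::-1]."""
    if x == "":
        return ""
    return x[-1] + transpose(x[:-1])

print(transpose("01001"))  # 10010

# x + transpose(x) is always an even-length palindrome
x = "01"
p = x + transpose(x)
print(p, p == p[::-1])     # 0110 True
```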
Language
Example
Let a deterministic finite automaton be given by
• Q = {a, b, c},
• Σ = {0, 1},
• q0 = a,
• F = {c}, and
Cont’d
• Transition function δ as shown by the
following table:
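A DFA of this shape can be simulated directly. Since the transition table itself is not reproduced in these notes, the δ below is a hypothetical example over the same Q, Σ, q0 and F (it moves a → b → c as 1s are read):

```python
# Direct DFA simulator. `delta` is a hypothetical transition table, since
# the table from the notes is not reproduced here.
Q = {"a", "b", "c"}
q0 = "a"
F = {"c"}
delta = {
    ("a", "0"): "a", ("a", "1"): "b",
    ("b", "0"): "b", ("b", "1"): "c",
    ("c", "0"): "c", ("c", "1"): "c",
}

def accepts(w):
    state = q0
    for symbol in w:                 # one deterministic step per input symbol
        state = delta[(state, symbol)]
    return state in F                # accept iff we end in a final state

print(accepts("0110"))  # True: the two 1s reach state c
print(accepts("010"))   # False: only one 1
```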
Cont’d
• Step 4: Find the combination of states {Q0, Q1, ..., Qn} for each possible input symbol.
• Step 5: Each time we generate a new DFA state under the input-symbol columns, apply step 4 again; otherwise go to step 6.
• Step 6: The states which contain any of the final states of the NDFA are the final states of the equivalent DFA.
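Steps 4-6 above can be sketched as the usual subset construction. The NFA of the accompanying figure is not reproduced here, so the transition table below is a hypothetical example over {0, 1}:

```python
# Subset construction (steps 4-6). `nfa_delta` is a hypothetical example NFA.
nfa_delta = {
    ("q0", "0"): {"q0", "q1"},
    ("q0", "1"): {"q0"},
    ("q1", "1"): {"q2"},
}

def subset_construction(delta, start, nfa_finals, alphabet):
    start_state = frozenset({start})
    seen, todo, dfa = {start_state}, [start_state], {}
    while todo:                       # steps 4-5: expand every new DFA state
        S = todo.pop()
        for a in alphabet:
            T = frozenset(r for q in S for r in delta.get((q, a), set()))
            dfa[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    # step 6: DFA states containing an NFA final state are final
    finals = {S for S in seen if S & nfa_finals}
    return dfa, start_state, finals

dfa, s0, finals = subset_construction(nfa_delta, "q0", {"q2"}, "01")
print(len({S for (S, _) in dfa}))  # number of reachable DFA states
print(finals)
```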
Cont’d
Example 1
• Let us consider the NDFA shown in the figure below.
[Figure: the NDFA and its equivalent DFA (not reproduced)]
The difference between DFA & NFA

DFA:
• For each input symbol, there is exactly one state transition from every state.
• Cannot use empty-string (ε) transitions.
• Time needed for executing an input string is less.
• Requires more space.
• δ: Q × Σ → Q, i.e. the next possible state belongs to Q.

NFA:
• No need to specify how the NFA reacts to every symbol.
• Can use empty-string (ε) transitions.
• Time needed for executing an input string is more.
• Requires less space than a DFA.
• δ: Q × Σ → 2^Q, i.e. the next possible states belong to the power set of Q.
Cont’d
Exercise
1. Find a deterministic finite accepter that recognizes the set of all strings on Σ = {a, b} starting with the prefix ab.
2. Construct a grammar for L(M) = {x | x is a string of a’s and b’s and |x| ≥ 2}.
3. Construct an NFA for all the strings with exactly two a’s and more than two b’s.
4. Construct an equivalent DFA for a given NFA.
Formal Language and
Automata Theory
Chapter Two
Simachew A FLTA 1
Chomsky language hierarchy
Type - 1 Grammar
Type-1 grammars generate context-sensitive languages. The
productions must be in the form
αAβ→αγβ
where A ∈ N (Non-terminal)
and α, β, γ ∈ (T ∪ N)* (Strings of terminals and non-terminals)
The strings α and β may be empty, but γ must be non-empty.
The rule S → ε is allowed if S does not appear on the right side of any
rule. The languages generated by these grammars are recognized by a
linear bounded automaton.
AB → AbBc
A → bcA
B→b
Type - 2 Grammar
Type-2 grammars generate context-free languages.
The productions must be in the form A → γ
where A ∈ N (Non terminal)
and γ ∈ (T ∪ N)* (String of terminals and non-terminals).
The languages generated by these grammars are recognized by a non-deterministic pushdown automaton.
S→Xa
X→a
X → aX
X → abc
X→ε
Type - 3 Grammar
Type-3 grammars generate regular languages. Type-3 grammars
must have a single non-terminal on the left-hand side and a right-hand
side consisting of a single terminal or single terminal followed by a
single non-terminal.
The productions must be in the form X → a or X → aY
where X, Y ∈ N (Non terminal)
and a ∈ T (Terminal)
The rule S → ε is allowed if S does not appear on the right side of any
rule.
Example
X→ε
X → a | aY
Y→b
Context Free Languages
Definition
A grammar G = (V, T, S, P) is said to be context-free if all productions in P are of the form
A → x,
where A ∈ V and x ∈ (V ∪ T)*.
Leftmost and Rightmost Derivations
S → aAB,
A → bBb,
B → A|λ
Leftmost and Rightmost Derivations
Then
S ⇒ aAB ⇒ abBbB ⇒ abAbB ⇒ abbBbbB ⇒ abbbbB ⇒ abbbb (leftmost derivation)
S ⇒ aAB ⇒ aA ⇒ abBb ⇒ abAb ⇒ abbBbb ⇒ abbbb (rightmost derivation)
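The leftmost derivation above can be replayed mechanically: each step rewrites the leftmost variable (uppercase letter) in the sentential form. A small sketch:

```python
# Replaying the leftmost derivation for S -> aAB, A -> bBb, B -> A | λ.
def apply_leftmost(sentential, production):
    lhs, rhs = production
    for i, ch in enumerate(sentential):
        if ch.isupper():                 # leftmost variable
            assert ch == lhs, "production does not match leftmost variable"
            return sentential[:i] + rhs + sentential[i + 1:]
    raise ValueError("no variable left to rewrite")

steps = [("S", "aAB"), ("A", "bBb"), ("B", "A"),
         ("A", "bBb"), ("B", ""), ("B", "")]
form = "S"
for p in steps:
    form = apply_leftmost(form, p)
    print(form)   # prints each sentential form of the derivation
```

The final printed form is the derived string abbbb, matching the derivation above.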
Derivation Trees
A second way of showing a derivation, independent of the order in which productions are used, is by a derivation tree.
A derivation tree is an ordered tree in which nodes are labeled
with the left sides of productions and in which the children of a
node represent its corresponding right sides.
A → abABc.
Ordered Tree
Definition
Let G = (V, T, S, P) be a context-free grammar. An ordered tree is a derivation tree for G iff it has the following properties.
The root is labeled S.
Every leaf has a label from T ∪ {λ}.
Every interior vertex has a label from V.
If a vertex has label A ∈ V, and its children are labeled a1, a2, ..., an, then P must contain a production of the form A → a1a2...an.
Derivation Tree
S → aAB,
A → bBb
B → A|λ
Derivation Tree
S → aaB
A → bBb|λ
B → Aa
Show that the string aabbabba is not in the language generated
by this grammar.
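One way to attack this is a brute-force search over sentential forms with a length bound; treat a False answer as an experiment rather than a proof. (The actual proof: every string in L(G) has length divisible by 3, while |aabbabba| = 8.) A sketch:

```python
from collections import deque

# Bounded brute-force membership test for S -> aaB, A -> bBb | λ, B -> Aa.
productions = {"S": ["aaB"], "A": ["bBb", ""], "B": ["Aa"]}

def generates(target, start="S", max_len=10):
    seen = {start}
    queue = deque([start])
    while queue:
        form = queue.popleft()
        if form == target:
            return True
        for i, ch in enumerate(form):
            if ch in productions:
                for rhs in productions[ch]:
                    new = form[:i] + rhs + form[i + 1:]
                    # prune sentential forms that grow past the bound
                    if len(new) <= max_len and new not in seen:
                        seen.add(new)
                        queue.append(new)
                break  # expanding only the leftmost variable suffices for CFGs
    return False

print(generates("aaa"))       # True: S => aaB => aaAa => aaa
print(generates("aabbabba"))  # False within the bound
```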
Parsing and Ambiguity
S → SS|aSb|bSa|λ
Round 1.
S ⇒ SS
S ⇒ aSb
S ⇒ bSa
S⇒λ
Round 2. ...
Simple Grammar
Definition
A context-free grammar G = (V, T , S, P) is said to be a simple
grammar or s-grammar if all its productions are of the form
A → ax,
where A ∈ V, a ∈ T, x ∈ V*, and any pair (A, a) occurs at most once in P.
Definition
A context-free grammar G is said to be ambiguous if there
exists some w ∈ L(G) that has at least two distinct derivation
trees. Alternatively, ambiguity implies the existence of two or
more leftmost or rightmost derivations.
S → aSb|SS|λ
Context Free Grammar and Programming Language
E → T | E + T,
T → F | T ∗ F,
F → I | (E),
I → a | b | c.
Simplification of Context Free Grammars
A Useful Substitution Rule
B → abbA|b.
A ⇒ aaA ⇒ aaabbc in Ĝ
Removing Useless Productions
Example:
S → aSb|λ|A,
A → aA
Removing Useless Productions
Definition
Let G = (V, T, S, P) be a context-free grammar. A variable A ∈ V is said to be useful if and only if there exists at least one w ∈ L(G) such that
S ⇒* xAy ⇒* w,
with x, y ∈ (V ∪ T)*.
Example:
S → A,
A → aA|λ
B → bA.
Removing Useless Productions
Definition
Any production of a context-free grammar of the form
A → λ
is called a λ-production. A variable A for which A ⇒* λ is possible is called nullable.
S1 → aS1b|λ
Removing λ-Productions
Here S1 → λ is the λ-production to be removed.
Removing λ − Productions
B → A1A2...An.
A → x1x2 ...xm , m ≥ 1
Example:
A → BC,
B → b|λ
C → D|λ
D → d.
After removing the λ-productions, the grammar becomes:
A → B|C|BC,
B → b,
C → D,
D → d.
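The first step in removing λ-productions is computing the nullable variables; a fixed-point sketch for the grammar above:

```python
# Computing the nullable variables for A -> BC, B -> b | λ, C -> D | λ, D -> d.
productions = [("A", "BC"), ("B", "b"), ("B", ""),
               ("C", "D"), ("C", ""), ("D", "d")]

def nullable_variables(prods):
    nullable = set()
    changed = True
    while changed:                    # fixed-point iteration
        changed = False
        for lhs, rhs in prods:
            # lhs is nullable if rhs is λ or consists only of nullable symbols
            if lhs not in nullable and all(s in nullable for s in rhs):
                nullable.add(lhs)
                changed = True
    return nullable

print(nullable_variables(productions))  # A, B and C are nullable
```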
Removing Unit-Productions
Definition
Any production of a context-free grammar of the form
A → B,
where A, B ∈ V, is called a unit-production.
Bonga University
College of Enginering and Technology
Department of Computer Science
CoSc3011 – Automata and complexity Theory
Chapter 4 – Push Down Automata (PDA)
• Informally:
– A PDA is an NFA-ε with a stack.
– Transitions are modified to accommodate stack operations.
• Questions:
– What is a stack?
– How does a stack help?
A DFA can “remember” only a finite amount of information, whereas a PDA can “remember” an
infinite amount of (certain types of) information, in one memory-stack.
The tape is divided into finitely many cells. Each cell contains a symbol in an alphabet Σ.
The stack head always scans the top symbol of the stack. It performs two basic operations:
The head scans at a cell on the tape and can read a symbol on the cell. In each move, the head
can move to the right cell.
The finite control has finitely many states which form a set Q. For each move, the state is
changed according to the evaluation of a transition function
δ : Q × (Σ ∪ {ε}) × (Γ ∪ {ε}) → 2^(Q × (Γ ∪ {ε}))
• (p, u) ∈ δ(q, a, v) means that if the tape head reads a, the stack head reads v, and the finite control is in state q, then one possible move is: the next state is p, v is replaced by u on the stack, and the tape head moves one cell to the right.
• (p, u) ∈ δ(q, ε, v) means that this is an ε-move.
• (p, u) ∈ δ(q, a, ε) means that a push operation is performed on the stack.
• (p, ε) ∈ δ(q, a, v) means that a pop operation is performed on the stack.
• There are some special states: an initial state s and a final set F of final states.
• Initially, the PDA is in the initial state s and the head scans the leftmost cell. The tape
holds an input string. The stack is empty.
• When the head gets off the tape, the PDA stops. An input string x is accepted by the PDA when all the input is consumed and the last state is an accepting state; we do not care about the stack contents at the end of the accepting computation.
• If the automaton attempts to pop from an empty stack, then it halts and rejects the input.
The PDA can be represented by a tuple M = (Q, Σ, Γ, δ, s, F), where Σ is the alphabet of input symbols and Γ is the alphabet of stack symbols.
• The set of all strings accepted by a PDA M is denoted by L(M). We also say that the
language L(M) is accepted by M.
Types of PDA
1. NDPDA
2. DPDA
Allowed transitions: for each state, input symbol, and stack top, at most one move applies (deterministic choices).
Not allowed: two or more applicable moves for the same configuration, e.g. an ε-move competing with a symbol-reading move.
• Theorem: Given a CFG G, some pushdown automaton P recognizes L(G).
– To prove this, we must show that we can take any CFG and express it as a PDA. Then we must take a PDA and show we can construct an equivalent CFG.
– We’ll show the CFG → PDA process, but skip the PDA → CFG process.
• Proof idea:
– The PDA P will work by accepting its input w, if G generates that input, by
determining whether there is a derivation for w.
– Design rules for P such that the transitions match the production rules in the
grammar
• But the PDA can access only the top symbol on the stack, and that might be a terminal
– But if our PDA makes transitions only for variables, we won’t know what to do
• To get around this problem, we’ll use the non-determinism of the PDA to match terminal symbols on the stack with symbols in the input string before the first variable
– This “chomps” any leading terminals until we can process a variable
Example
• Consider the grammar
S → aTb | b
T → Ta | ε
• The PDA starts in qstart, pushes S$ (ε, ε → S$) and moves to qloop, which has the transitions
ε, S → aTb    ε, S → b    ε, T → Ta    ε, T → ε
a, a → ε    b, b → ε
and ε, $ → ε leading to qaccept.
• Given the string “aab”, derived by
S ⇒ aTb ⇒ aTab ⇒ aab
we have the corresponding moves:
(qs, aab, $) ├ (qloop, aab, S$) ├
(qloop, aab, aTb$) ├ (qloop, ab, Tb$) ├
(qloop, ab, Tab$) ├ (qloop, ab, ab$) ├
(qloop, b, b$) ├ (qloop, ε, $) ├
(qaccept, ε, ε)
• A DPDA is simply a pushdown automaton without non-determinism.
– This machine accepts a class of languages strictly between the regular languages and the context-free languages.
– For this reason, the DPDA is often skipped as a topic.
– In practice the DPDA can be useful, since determinism is much easier to implement.
• Parsers in programs such as YACC are actually implemented using a DPDA.
• Transitions read a symbol of the string and push a symbol onto or pop a symbol off of the stack.
• The stack alphabet is not necessarily the same as the alphabet for the language
– e.g., $ marks the bottom of the stack in the previous (0^n 1^n) example.
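The 0^n 1^n example mentioned above can be sketched as a deterministic stack recognizer, with $ marking the bottom of the stack:

```python
# Deterministic stack recognizer for {0^n 1^n : n >= 0}, with '$' as the
# bottom-of-stack marker.
def accepts(w):
    stack = ["$"]
    phase = "push"
    for ch in w:
        if ch == "0" and phase == "push":
            stack.append("0")         # push a marker for each 0
        elif ch == "1" and stack[-1] == "0":
            phase = "pop"
            stack.pop()               # match each 1 against a pushed 0
        else:
            return False              # 0 after a 1, or pop hits the $ marker
    return stack == ["$"]             # accept iff all 0s were matched

print(accepts("0011"))  # True
print(accepts("0101"))  # False
print(accepts(""))      # True
```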
Simplification of CFG
Unit-Productions
Example Grammar:
S → aA
A → a
A → B
B → A
B → bb
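Removing the unit productions A → B and B → A from this grammar can be sketched as follows: compute the chains of unit pairs, then lift the non-unit bodies along them.

```python
# Removing the unit productions from S -> aA, A -> a | B, B -> A | bb.
prods = {"S": {"aA"}, "A": {"a", "B"}, "B": {"A", "bb"}}

def remove_unit_productions(prods):
    # unit_pairs[X] = variables reachable from X via chains of unit productions
    unit_pairs = {X: {X} for X in prods}
    changed = True
    while changed:
        changed = False
        for X in prods:
            for Y in list(unit_pairs[X]):
                for rhs in prods[Y]:
                    # rhs is a unit production iff it is exactly one variable
                    if rhs in prods and rhs not in unit_pairs[X]:
                        unit_pairs[X].add(rhs)
                        changed = True
    # keep only non-unit bodies, lifted along the unit pairs
    return {X: {rhs for Y in unit_pairs[X] for rhs in prods[Y]
                if rhs not in prods}
            for X in prods}

cleaned = remove_unit_productions(prods)
print(cleaned)  # A and B both end up with bodies {a, bb}
```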
Normal forms
• It has been proven that every CFG can be converted into CNF.
Steps
• Step 1: get rid of useless symbols and remove all λ-productions and unit productions
– Example:
S → ASB | AB
A → aAS | aA | a
B → SbS | bS | Sb | b | aAS | aA | a | bb
• Step 2: get rid of productions whose bodies are mixes of terminals and variables, or consist of more than one terminal
S → ASB | AB
A → CAS | CA | a
B → SDS | DS | SD | b | CAS | CA | a | DD
C → a
D → b
A CFG is in Chomsky Normal Form if all its productions are of the form:
A → BC or
A → a
Examples of CNF
Example 1: S → AB
A → BC | CC | a
B → CB | b
C → c
Example 2: S → AB | BC | AC
A → BC | a
B → AC | b
C → AB | c
Is it true that all context-free grammars can be expressed in Chomsky Normal Form? The grammar below, for example, is not in CNF:
A → cA | a
B → ABC | b
C → c
A CFG is in Greibach normal form if each rule has one of these forms:
i. A → aA1A2…An
ii. A → a
iii. S → λ
CFG conventions
Work out:
1. S → S1 | S2
S1 → S1b | Ab
A → aAb | ab | ε
S2 → S2a | Ba
B → bBa | ba | ε
CoSc3101 – Automata and complexity Theory
Chapter 5 – Turing Machines
Introduction
Background
• We can easily write an algorithm (in C) recognizing whether a sequence of characters has the form a^n b^n c^n or not.
Def: A type-1 grammar G = (V, Σ, S, P) is context-sensitive if for every production rule α → β in P, |α| ≤ |β|.
S → abc | aAbc
Ab → bA
Ac → Bbcc
bB → Bb
aB → aa | aaA
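The "easy algorithm" alluded to above (written here in Python rather than C) just checks the three blocks have equal length:

```python
import re

# Recognizing a^n b^n c^n by splitting into the three blocks.
def is_anbncn(s):
    m = re.fullmatch(r"(a*)(b*)(c*)", s)
    return bool(m) and len(m.group(1)) == len(m.group(2)) == len(m.group(3))

print(is_anbncn("aabbcc"))  # True
print(is_anbncn("aabcc"))   # False
print(is_anbncn(""))        # True (n = 0)
```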
Theorem:
A language is context sensitive if and only if is accepted by a Linear-Bounded Automaton
Linear Bounded Automaton (LBA)
Linear Bounded Automata (LBAs) are the same as Turing Machines with one difference:
The input string's tape space is the only tape space allowed to be used
Type-0 grammars (phrase-structure grammars, unrestricted grammars) generate the recursively enumerable languages, which are accepted by Turing machines (deterministic, DTM, or non-deterministic, NDTM).
Church-Turing Thesis
Any function that can be computed by an effective procedure can be computed by a Turing machine. Therefore, C, C++, Prolog, Lisp, Smalltalk, and Java programs can be simulated on Turing machines.
Structure of Turing Machine
States & Transitions
Initially, the input string (finite-length string of symbols) is placed on the tape
All other tape cells, extending infinitely to left and right, hold blanks
Blank is a tape symbol, but not an input symbol
Initially, tape head points to the beginning of the input string
δ: the transition function
δ(q, X): takes a state q and a tape symbol X
δ(q, X) = (p, Y, D) where:
p is the next state, in Q
Y is the symbol written in the cell being scanned
D is a direction, either L or R
Tape head: always positioned at one of tape cells
A move (or say ‘a step’) may:
Read an input symbol
Change machine state
Write a tape symbol in the cell scanned
Move the tape head left or right
So simple, right? But…
The computational capabilities of all other known computational models (e.g. any automata) are less than or equivalent to the TM's
Their speeds may not be the same as that of the TM
Their computational capabilities are less than or equivalent to the TM's, i.e., no ‘more’ mathematical functions can be calculated
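The δ(q, X) = (p, Y, D) moves above can be simulated directly. The machine below is a hypothetical example: it flips 0s and 1s, then halts at the first blank.

```python
# A minimal Turing machine simulator for moves delta[(q, X)] = (p, Y, D).
# The machine itself is a hypothetical example.
BLANK = "_"
delta = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", BLANK): ("halt", BLANK, "R"),
}

def run(tape_input):
    tape = dict(enumerate(tape_input))   # infinite tape as a sparse dict
    state, head = "q0", 0
    while state != "halt":
        X = tape.get(head, BLANK)        # read the scanned cell
        state, Y, D = delta[(state, X)]
        tape[head] = Y                   # write, then move left or right
        head += 1 if D == "R" else -1
    return "".join(tape[i] for i in sorted(tape) if tape[i] != BLANK)

print(run("0110"))  # 1001
```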
• If a language L is accepted by a Turing machine M then we say that L is:
Turing Recognizable
Example 1
Infinite Loop Example
Because of the infinite loop, the input string is rejected.
Example 2
Basic Idea:
Match a’s with b’s:
Repeat:
replace leftmost a with x
find leftmost b and replace it with y
Until there are no more a’s or b’s
If there is a remaining a or b reject
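The matching loop above, sketched on a list standing in for the tape (with an explicit check that no a appears to the right of a b, which the TM performs while scanning):

```python
# The a/b matching algorithm for {a^n b^n}, on a list used as the tape.
def accepts_anbn(s):
    tape = list(s)
    if "a" in "".join(tape).partition("b")[2]:   # an a to the right of a b
        return False
    while "a" in tape and "b" in tape:           # repeat:
        tape[tape.index("a")] = "x"              #   replace leftmost a with x
        tape[tape.index("b")] = "y"              #   replace leftmost b with y
    return "a" not in tape and "b" not in tape   # leftover a or b => reject

print(accepts_anbn("aabb"))  # True
print(accepts_anbn("abab"))  # False
print(accepts_anbn("aab"))   # False
```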
For a TM for balanced brackets, one idea is to find the innermost matching pair of brackets, erase them,
and repeat the process. We use x to indicate an erased bracket.
Example 4: Even-length Palindromes
For even-length palindromes, we match the first and last symbols and erase them; then repeat. If we reach ε without a mismatch, then the string was a palindrome.
Summary
A Turing Machine (TM) is like an FA, but it has an infinite tape. The input starts on the tape surrounded by blank cells denoted ∆. The program of a TM is represented as a diagram: depending on the symbol under the head and the state, the machine writes a symbol, moves left or right or stays in place, and/or changes state. Once a TM enters the accept state it stops.
Chapter 6
2015 1
· Decision Problem: computational problem
with intended output of “yes” or “no”, 1 or 0
· Optimization Problem: computational
problem where we try to maximize or
minimize some value
· Introduce a parameter k and ask if the optimal value for the problem is at most or at least k. This turns optimization into decision.
· Deterministic in nature
· Solved by conventional computers in
polynomial time
◦ O(1) Constant
◦ O(log n) Sub-linear
◦ O(n) Linear
◦ O(n log n) Nearly linear
◦ O(n^2) Quadratic
· Polynomial upper and lower bounds
· Some problems are intractable:
as they grow large, we are unable to solve them
in reasonable time
· What constitutes reasonable time?
◦ Standard working definition: polynomial time
◦ On an input of size n the worst-case running time is
O(nk) for some constant k
◦ O(n2), O(n3), O(1), O(n lg n), O(2n), O(nn), O(n!)
◦ Polynomial time: O(n2), O(n3), O(1), O(n lg n)
◦ Not in polynomial time: O(2n), O(nn), O(n!)
2015 4
· Are some problems solvable in polynomial time?
◦ Of course: many algorithms we’ve studied provide
polynomial-time solutions to some problems
· Are all problems solvable in polynomial time?
◦ No: Turing’s “Halting Problem” is not solvable by any
computer, no matter how much time is given
· Most problems that do not yield polynomial-time algorithms are either optimization or decision problems.
· Optimization Problems
◦ An optimization problem is one which asks, “What is the
optimal solution to problem X?”
◦ Examples:
🞄0-1 Knapsack
🞄Fractional Knapsack
🞄Minimum Spanning Tree
· Decision Problems
◦ A decision problem is one with a yes/no answer
◦ Examples:
🞄Does a graph G have a MST of weight W?
· An optimization problem tries to find an optimal
solution
· A decision problem tries to answer a yes/no question
· Many problems will have decision and optimization
versions
◦ E.g.: Traveling salesman problem
🞄optimization: find a Hamiltonian cycle of minimum weight
🞄decision: is there a Hamiltonian cycle of weight at most k?
· Some problems are decidable, but intractable: as they grow large, we are unable to solve them in reasonable time
◦ Is there a polynomial-time algorithm that solves the problem?
P: the class of decision problems that have
polynomial-time deterministic algorithms.
◦ That is, they are solvable in O(p(n)), where p(n) is a polynomial in n
◦ A deterministic algorithm is (essentially) one that always
computes the correct answer
Why polynomial?
◦ if not, very inefficient
◦ nice closure properties
🞄 the sum and composition of two polynomials are always polynomials
too
· MST
· Sorting
· Others?
NP: the class of decision problems that are solvable
in polynomial time on a nondeterministic machine
(or with a nondeterministic algorithm)
· (A deterministic computer is what we know)
· A nondeterministic computer is one that can “guess”
the right answer or solution
◦ Think of a nondeterministic computer as a parallel machine
that can freely spawn an infinite number of processes
· Thus NP can also be thought of as the class of
problems
◦ whose solutions can be verified in polynomial time
· Note that NP stands for “Nondeterministic
Polynomial-time”
· MST
· Sorting
· Others?
◦ Hamiltonian Cycle (Traveling Salesman)
◦ Graph Coloring
◦ Satisfiability (SAT)
🞄the problem of deciding whether a given
Boolean formula is satisfiable
· Polynomial-Time Approximation Schemes
· Much faster, but not guaranteed to find the
best solution
· Come as close to the optimum value as
possible in a reasonable amount of time
· Take advantage of rescalability property of
some hard problems
· P = set of problems that can be solved in
polynomial time
◦ Examples: Fractional Knapsack, …
· NP = set of problems for which a solution can be
verified in polynomial time
◦ Examples: Fractional Knapsack,…, Hamiltonian Cycle, CNF
SAT, 3-CNF SAT
· Clearly P ⊆ NP
· Open question: Does P = NP?
◦ Most suspect not
◦ An August 2010 claim of a proof that P ≠ NP, by Vinay Deolalikar, researcher at HP Labs, Palo Alto, has flaws
· What if a problem has:
◦ An exponential upper bound
◦ A polynomial lower bound
· We have only found exponential algorithms,
so it appears to be intractable.
· But... we can’t prove that an exponential solution is needed, and we can’t prove that a polynomial algorithm cannot be developed, so we can’t say the problem is intractable...
· The upper bound suggests the
problem is intractable
· The lower bound suggests the
problem is tractable
· The lower bound is linear: O(N)
· They are all reducible to each other
◦ If we find a reasonable algorithm
(or prove intractability) for one,
then we can do it for all of them!
· With N teachers with certain hour restrictions and M classes to be scheduled, can we:
◦ Schedule all the classes
◦ Make sure that no two teachers teach the same class at the same time
◦ Ensure no teacher is scheduled to teach two classes at once
· With N students and K projects, where N
is even, can we:
◦ Assign pairs of students to each project
◦ Every student works on every project
◦ No student has the same partner more than once
· Is this an NP-complete problem?
· Graph isomorphism is in NP; is it NP-complete? (This is a long-standing open question: it is known neither to be NP-complete nor to be in P.)
· What is NP?
· NP is the set of all decision problems (questions with a yes-or-no answer) for which the 'yes'-answers can be verified in polynomial time (O(n^k), where n is the problem size and k is a constant) by a deterministic Turing machine. Polynomial time is sometimes used as the definition of fast or quickly.
· What is P?
· P is the set of all decision problems which can be solved in polynomial time by a deterministic Turing machine. Since such problems can be solved in polynomial time, their solutions can also be verified in polynomial time. Therefore P is a subset of NP.
· What is NP-Complete?
· A problem x that is in NP is also in NP-Complete if and only if every other problem in NP can be quickly (i.e. in polynomial time) transformed into x. In other words:
· x is in NP, and
· Every problem in NP is reducible to x
· So what makes NP-Complete so interesting is
that if any one of the NP-Complete problems
was to be solved quickly then all NP problems
can be solved quickly
· What is NP-Hard?
· NP-Hard problems are problems that are at least as hard as the hardest problems in NP. Note that NP-Complete problems are also NP-hard. However, not all NP-hard problems are in NP (or even decision problems), despite having 'NP' as a prefix. That is, the NP in NP-hard does not mean 'non-deterministic polynomial time'. Yes, this is confusing, but its usage is entrenched and unlikely to change.
· Nondeterministic algorithms produce an
answer by a series of “correct guesses”
“NP-Complete” comes from:
◦ Nondeterministic Polynomial
◦ Complete - “Solve one, Solve them all”
· Show that the problem is in NP (i.e. show that a certificate can be verified in polynomial time).
· Assume it is not NP-complete.
· Show how to convert an existing NPC problem into the problem that we are trying to show is NP-Complete (in polynomial time).
· Poly time algorithm: input size n (in some encoding), worst-case running time O(n^c) for some constant c.
· Three classes of problems
◦ P: problems solvable in poly time.
◦ NP: problems verifiable in poly time.
◦ NPC: problems in NP and as hard as any problem in
NP.
· If a problem is proved to be NPC, that is good evidence for its intractability (hardness).
· Do not waste time trying to find an efficient algorithm for it
· Instead, focus on designing an approximation algorithm or a solution for a special case of the problem
· Some problems look very easy on the surface, but are in fact hard (NPC).
· A decision problem D is NP-complete iff
1. D ∈ NP
2. every problem in NP is polynomial-time reducible to D
· A problem R can be reduced to another problem Q
if any instance of R can be rephrased to an
instance of Q, the solution to which provides a
solution to the instance of R
◦ This rephrasing is called a transformation
· Intuitively: If R reduces in polynomial time to Q, R
is “no harder to solve” than Q
· Example: lcm(m, n) = m * n / gcd(m, n),
lcm(m,n) problem is reduced to gcd(m, n)
problem
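The lcm-to-gcd reduction above, as code:

```python
from math import gcd

# Reduction: an lcm instance is rephrased as a gcd instance, whose
# solution yields the lcm answer.
def lcm(m, n):
    return m * n // gcd(m, n)   # transform, solve gcd, recover lcm

print(lcm(4, 6))    # 12
print(lcm(21, 6))   # 42
```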
View of Theoretical Computer Scientists on P, NP, NPC
[Diagram: P contained in NP, with the NPC problems the hardest problems in NP]