TOC Final PDF
Question#1
b)
The DK automaton in the figure shows that the grammar passes the test and is unambiguous.
Observe that every final state contains exactly one completed rule (having the dot at the
end) and no other rule with a terminal symbol (a or b) following a dot.
c)
The grammar generates the language as follows:
Question#2
(c)
Group 2
Question#1
Design a Turing machine M that decides A = {0^(2^n) | n ≥ 0}, the language consisting of all strings of 0s
whose length is a power of 2.
A = {0^(2^n) | n ≥ 0}
A = {0^(2^n) | n ≥ 0} (CONT….)
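The state diagram of the machine is given in the figures above. As a rough illustration of the algorithm the machine carries out (sweep the tape crossing off every other 0, reject if an odd number of 0s greater than one remains, accept when a single 0 is left), here is a small Python sketch; the function name is only illustrative.

def decides_power_of_two_zeros(w: str) -> bool:
    """Decide A = {0^(2^n) | n >= 0} by repeatedly crossing off every other 0,
    mirroring the usual sweep-and-halve Turing machine algorithm."""
    if w == "" or any(ch != "0" for ch in w):
        return False                 # the input must be a nonempty string of 0s
    count = w.count("0")
    while count > 1:
        if count % 2 == 1:           # an odd number of 0s (other than 1) is not a power of 2
            return False
        count //= 2                  # one sweep crosses off every other 0
    return True                      # exactly one 0 remains, so |w| was a power of 2

# Lengths 1, 2, 4, 8 are accepted; lengths 3, 5, 6 are rejected.
assert decides_power_of_two_zeros("0000") and not decides_power_of_two_zeros("000")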
Question#2
a) Write down the formal definition of a Turing machine.
Answer:
A Turing machine is a 7-tuple M = (Q, Σ, Γ, δ, q0, qaccept, qreject), where Q, Σ, and Γ are all finite sets and
1. Q is the set of states,
2. Σ is the input alphabet, not containing the blank symbol ␣,
3. Γ is the tape alphabet, where ␣ ∈ Γ and Σ ⊆ Γ,
4. δ: Q × Γ → Q × Γ × {L, R} is the transition function,
5. q0 ∈ Q is the start state,
6. qaccept ∈ Q is the accept state, and
7. qreject ∈ Q is the reject state, where qreject ≠ qaccept.
b) Design the state diagram of the Turing Machine that accepts the language: L = {w#w | w ∈ {a,b}*}
Answer:
If we take a string w = abaab, then w#w = abaab#abaab. The following would be the state diagram of the
Turing machine that accepts the strings belonging to this language.
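As a companion to the diagram, here is a small Python sketch of the zig-zag matching idea the machine uses: mark the leftmost unmarked symbol, cross over the #, and compare it with the corresponding symbol on the right. The function name and the marker symbol x are only illustrative choices.

def accepts_w_hash_w(s: str) -> bool:
    """Membership test for L = {w#w | w in {a,b}*}, in the spirit of the TM's
    zig-zag strategy: match and cross off corresponding symbols on both sides of '#'."""
    if any(ch not in "ab#" for ch in s) or s.count("#") != 1:
        return False
    tape = list(s)
    sep = tape.index("#")
    left, right = 0, sep + 1
    while left < sep:
        if right >= len(tape) or tape[left] != tape[right]:
            return False                   # the two copies of w disagree
        tape[left] = tape[right] = "x"     # cross off the matched pair
        left += 1
        right += 1
    return right == len(tape)              # both halves are used up at the same time

# For w = abaab, the string "abaab#abaab" is accepted.
assert accepts_w_hash_w("abaab#abaab") and not accepts_w_hash_w("ab#aa")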
Group 3
Question#1
Q1: Design a two-tape Turing machine for {0^n 1^n | n ≥ 0}. Give its formal description and
show how we can simulate it using a single-tape Turing machine.
Answer:
Two-Tape Turing Machine for {0^n 1^n | n ≥ 0}:
Formal Description:
M = (Q, Σ, Γ, δ, q1, qaccept, qreject), where
Q = {q1, q2, q3, q4, qaccept, qreject},
Σ = {0, 1},
Γ = {0, 1, x, ␣},
q1 = start state,
qaccept = accept state, and
the δ function is given below:
δ(q1, 0␣) = (q2, 0x, RR)
δ(q2, 0␣) = (q2, 00, RR)
δ(q2, 1␣) = (q3, 1␣, SL)
δ(q3, 10) = (q3, 10, RL)
δ(q3, 1x) = (q4, 1x, RS)
δ(q4, ␣x) = (qaccept, ␣x, LS)
qreject = reject state. To simplify the figure, we don’t show the reject state or the transitions going to it; those transitions occur implicitly whenever a state lacks an outgoing transition for a particular pair of symbols.
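As a quick sanity check of the transitions above, the following Python sketch simulates the two-tape machine directly from that δ table ('_' stands for the blank symbol ␣; all names are illustrative). Missing entries send the machine to the implicit reject state, exactly as described.

# Key = (state, symbol under head 1, symbol under head 2);
# value = (new state, write on tape 1, write on tape 2, move 1, move 2).
DELTA = {
    ("q1", "0", "_"): ("q2", "0", "x", "R", "R"),
    ("q2", "0", "_"): ("q2", "0", "0", "R", "R"),
    ("q2", "1", "_"): ("q3", "1", "_", "S", "L"),
    ("q3", "1", "0"): ("q3", "1", "0", "R", "L"),
    ("q3", "1", "x"): ("q4", "1", "x", "R", "S"),
    ("q4", "_", "x"): ("qaccept", "_", "x", "L", "S"),
}

def run_two_tape(w: str, max_steps: int = 10_000) -> bool:
    """Simulate the two-tape machine on input w; any missing transition goes to
    the implicit reject state, as noted above."""
    tape1 = list(w) + ["_"]
    tape2 = ["_"]
    h1 = h2 = 0
    state = "q1"
    for _ in range(max_steps):
        if state == "qaccept":
            return True
        key = (state, tape1[h1], tape2[h2])
        if key not in DELTA:
            return False                        # implicit move to qreject
        state, w1, w2, m1, m2 = DELTA[key]
        tape1[h1], tape2[h2] = w1, w2
        move = {"R": 1, "L": -1, "S": 0}
        h1 = max(h1 + move[m1], 0)              # heads never fall off the left end
        h2 = max(h2 + move[m2], 0)
        if h1 == len(tape1):
            tape1.append("_")                   # extend the tapes with blanks on demand
        if h2 == len(tape2):
            tape2.append("_")
    return False

# "0011" and "01" are accepted; "001" and "011" are rejected
# (with the delta exactly as given, the empty string is also rejected).
assert run_two_tape("0011") and run_two_tape("01") and not run_two_tape("001")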
Tape Condition:
1. First, S puts its single tape into the format that represents both tapes of M: the contents of the two tapes appear one after the other, delimited by #, with a dot marking the cell under each virtual head.
2. To simulate a single move, S scans its tape from the first #, which marks the left-hand end, to the third #, which
marks the right-hand end, in order to determine the symbols under the virtual heads.
3. Then S makes a second pass to update the tapes according to the way that M’s transition function dictates.
4. If at any point S moves one of the virtual heads to the right onto a #, this action signifies that M has moved the
corresponding head onto the previously unread blank portion of that tape. So, S writes a blank symbol on this
tape cell and shifts the tape contents, from this cell until the rightmost #, one unit to the right.
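A tiny Python sketch of the tape format this simulation uses may help: both tapes are laid out on one tape between # markers, and the symbol under each virtual head carries a dot (written here as a '.' prefix). The helper names are illustrative only.

def encode(tapes, heads):
    """Lay out several tapes on one tape as '# t1 # t2 # ... #'.  The symbol under
    each virtual head carries a dot, represented here by a '.' prefix (e.g. '.0')."""
    cells = ["#"]
    for tape, h in zip(tapes, heads):
        for i, sym in enumerate(tape):
            cells.append("." + sym if i == h else sym)
        cells.append("#")
    return cells

def symbols_under_virtual_heads(cells):
    """Step 2 of the simulation: scan between the outermost #s and pick out the
    dotted symbols, i.e. the symbols under the virtual heads."""
    return [c[1] for c in cells if c.startswith(".")]

# Two tapes of the machine above, with heads on cell 2 of tape 1 and cell 1 of tape 2.
single = encode([list("0011_"), list("x0_")], heads=[2, 1])
print(single)                              # ['#', '0', '0', '.1', '1', '_', '#', 'x', '.0', '_', '#']
print(symbols_under_virtual_heads(single)) # ['1', '0']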
Q2: Let A be the language consisting of all strings representing undirected graphs that
are connected. Give an example of an undirected graph and its encoding. Also give a
high-level description of a TM M that decides A.
A = {⟨G⟩ | G is a connected undirected graph}.
Answer:
Example of G :
Encoding of G:
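The example graph and its encoding appear in the figures above. The high-level decider for A works by marking: mark the first node of G, repeatedly mark any node joined by an edge to an already-marked node, and accept iff every node ends up marked. Below is a minimal Python sketch of that procedure on an adjacency-list style encoding; the encoding format and names are only illustrative.

def is_connected(nodes, edges):
    """High-level decider for A = {<G> | G is a connected undirected graph}."""
    if not nodes:
        return True
    marked = {next(iter(nodes))}           # 1. mark the first node of G
    changed = True
    while changed:                         # 2. repeat until no new node gets marked
        changed = False
        for u, v in edges:                 #    mark a node if it shares an edge with a marked node
            for a, b in ((u, v), (v, u)):
                if a in marked and b not in marked:
                    marked.add(b)
                    changed = True
    return marked == set(nodes)            # 3. accept iff all nodes are marked

# Graph with nodes {1,2,3,4} and edges {1-2, 2-3, 3-4} is connected.
print(is_connected({1, 2, 3, 4}, [(1, 2), (2, 3), (3, 4)]))   # True
print(is_connected({1, 2, 3, 4}, [(1, 2), (3, 4)]))           # False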
Group 4
Question#1
Prove that the language L = {a^n b^n | n ≥ 0} is a non-regular language and can be
recognized by a Turing machine.
Solution:
Since all regular languages can be modelled by finite state machines (FSMs), if the above language were
regular we should be able to construct a DFA or an NFA for it; in reality this is not possible. If we attempt
to find a DFA that recognizes L, we discover that the machine seems to need to remember how many a's
have been seen so far as it reads the input. Because the number of a's is not limited, the machine would
have to keep track of an unlimited number of possibilities, but it cannot do so with any finite number of
states. We can, however, recognize the strings of this language using a Turing machine, as sketched below.
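Here is a small Python sketch of the usual Turing-machine strategy for this language, crossing off one a and one matching b per pass; the function name and the marker symbol x are only illustrative.

def accepts_anbn(w: str) -> bool:
    """Recognize {a^n b^n | n >= 0} the way a single-tape TM typically does:
    repeatedly cross off the leftmost remaining 'a' together with a matching 'b'."""
    if any(c not in "ab" for c in w):
        return False
    tape = list(w)
    while True:
        # find the leftmost uncrossed symbol; if none remain, every a had a matching b
        try:
            i = next(k for k, c in enumerate(tape) if c in "ab")
        except StopIteration:
            return True
        if tape[i] != "a":
            return False                 # a 'b' appears before all a's are used up
        tape[i] = "x"
        # pair it with the leftmost uncrossed 'b' to the right
        j = next((k for k in range(i + 1, len(tape)) if tape[k] == "b"), None)
        if j is None or any(tape[k] == "a" for k in range(j + 1, len(tape))):
            return False                 # no matching b, or an 'a' appears after a 'b'
        tape[j] = "x"

assert accepts_anbn("aaabbb") and accepts_anbn("")
assert not accepts_anbn("aabbb") and not accepts_anbn("abab")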
Question#2
Question # 2: Construct a Turing machine for the given CFG to show whether the given grammar is
decidable, and also describe the relationship among classes of languages.
S → ABA
A → 0
B → 1B | ε
Solution:
S → ABA
A → 0
B → 1B | ε
Step 1: The language generated by the given CFG is 0 1* 0, where
V = {S, A, B} (the variables),
Σ = {0, 1} (the terminals),
R = the rules given above, and
S = the start variable.
Step 2: Turing Machine for L = 0 1* 0
This machine accepts all strings that are in the language and rejects all others, as sketched below.
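Since the machine's diagram is in the figure, here is a minimal Python check equivalent to what it computes; the function name is only illustrative.

def accepts_01star0(w: str) -> bool:
    """Membership test for L = 0 1* 0, the language of the grammar S -> ABA,
    A -> 0, B -> 1B | e: a '0', then any number of '1's, then a '0'."""
    return len(w) >= 2 and w[0] == "0" and w[-1] == "0" and all(c == "1" for c in w[1:-1])

# "00", "010", "01110" are in the language; "0" and "0101" are not.
assert accepts_01star0("00") and accepts_01star0("01110")
assert not accepts_01star0("0") and not accepts_01star0("0101")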
Part 2: Relationship among classes of languages
→ All regular languages are context-free, but not all context-free languages are regular. The regular
languages are a subset of the context-free languages.
→ A language is called decidable if there is a Turing machine that accepts it and halts on every input string w.
→ Turing-recognizable: A language is recognizable if there is a Turing machine that halts and accepts exactly
the strings in that language; for strings not in the language, the TM either rejects or does not halt at all.
Group 5
Question#1
We assume that ATM is decidable and obtain a contradiction. Suppose that H is a decider
for ATM. On input ⟨M, w⟩, where M is a TM and w is a string, H halts and accepts if M
accepts w. Furthermore, H halts and rejects if
M fails to accept w. In other words, we assume that H is a TM, where
H(⟨M, w⟩) = accept if M accepts w, and reject if M does not accept w.
Now we construct a new Turing machine D with H as a subroutine. This new TM calls H
to determine what M does when the input to M is its own description ⟨M⟩. Once D has
determined this information, it does the opposite. That is, it rejects if M accepts and
accepts if M does not accept. The following is a description of D.
D = “On input ⟨M⟩, where M is a TM:
1. Run H on input ⟨M,⟨M⟩⟩.
2. Output the opposite of what H outputs. That is, if H accepts, reject; and if H rejects,
accept.”
Don’t be confused by the notion of running a machine on its own description! That is
similar to running a program with itself as input, something that does occasionally occur in
practice. For example, a compiler is a program that translates other programs. A compiler
for the language Python may itself be written in Python, so running that program on itself
would make sense.
In summary,
D(⟨M⟩) = accept if M does not accept ⟨M⟩, and reject if M accepts ⟨M⟩.
What happens when we run D with its own description ⟨D⟩ as input? In that case, we get
D(⟨D⟩) = accept if D does not accept ⟨D⟩, and reject if D accepts ⟨D⟩.
No matter what D does, it is forced to do the opposite, which is a contradiction.
Let’s review the steps of this proof. Assume that a TM H decides ATM. Use H to build a
TM D that takes an input ⟨M⟩, where D accepts its input ⟨M⟩ exactly when M does not
accept its input ⟨M⟩. Finally, run D on itself. Thus, the machines take the following
actions, with the last line being the contradiction.
• H accepts ⟨M, w⟩ exactly when M accepts w.
• D rejects ⟨M⟩ exactly when M accepts ⟨M⟩.
• D rejects ⟨D⟩ exactly when D accepts ⟨D⟩.
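A purely illustrative code analogue of this argument, under the (impossible) assumption that a decider h for ATM could be written: d is built from h exactly as D is built from H, and feeding d its own source would force it to return the opposite of its own answer. Nothing here is an actual implementation; the stub only fixes the interface.

def h(machine_source: str, input_string: str) -> bool:
    """Hypothetical decider for A_TM: would return True iff the program described
    by machine_source accepts input_string.  The proof shows no such total
    function can exist; this stub merely fixes the interface."""
    raise NotImplementedError("A_TM has no decider")

def d(machine_source: str) -> bool:
    """The machine D: run H on <M, <M>> and output the opposite of what H outputs."""
    return not h(machine_source, machine_source)

# Running d on the source code of d would have to return the opposite of itself,
# which is exactly the contradiction derived above.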
Question#2
Q No. 2: Let B be the set of all infinite sequences over {0,1}. Show that B is
uncountable using a proof by diagonalization.
Each element in B is an infinite sequence (b1, b2, b3,…), where each bi ∈ {0,1}. Suppose
B is countable. Then we can define a correspondence f between N = {1,2,3,...} and B.
Specifically, for n ∈ N, let f(n) = (bn1, bn2, bn3, ...), where bni is the ith bit in the nth
sequence, i.e.,
Now define the infinite sequence c = (c1, c2, c3, c4, c5, ...) ∈ B, where ci = 1−bii for each i ∈ N. In other words, the
ith bit in c is the opposite of the ith bit in the ith sequence. For example, if
then we would define c = (1,1,0,0,...). Thus, for each n = 1,2,3,..., note that c ∈ B differs from the nth sequence in
the nth bit, so c does not equal f(n) for any n, which is a contradiction. Hence, B is uncountable.
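The table of example sequences referred to above is in the missing figure; the following Python sketch reproduces the diagonal construction on a small made-up table whose bits are placeholders chosen only so that the resulting c is (1, 1, 0, 0, ...), as in the text.

def diagonal_complement(rows):
    """Given the first n bits of the first n sequences (an n-by-n table of bits),
    build the prefix of c whose ith bit is 1 - b_ii, so c differs from the ith
    listed sequence in its ith position."""
    return [1 - rows[i][i] for i in range(len(rows))]

table = [
    [0, 0, 0, 0],   # first 4 bits of f(1)   (placeholder values)
    [1, 0, 1, 1],   # first 4 bits of f(2)
    [0, 1, 1, 0],   # first 4 bits of f(3)
    [1, 0, 0, 1],   # first 4 bits of f(4)
]
c = diagonal_complement(table)
print(c)                                            # [1, 1, 0, 0]
assert all(c[i] != table[i][i] for i in range(4))   # c differs from every listed sequence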
Group 6
Question#1
Q1: Prove that HALTTM is undecidable using reducibility.
Ans: Let’s assume for the purpose of obtaining a contradiction that TM “R” decides HALTTM. We construct
TM “S” to decide ATM, with S operating as follows.
S = “On input ⟨M, w⟩, an encoding of a TM M and a string w:
1. Run TM R on input ⟨M, w⟩.
2. If R rejects, reject.
3. If R accepts, simulate M on w until it halts.
4. If M has accepted, accept; if M has rejected, reject.”
Clearly, if R decides HALTTM, then S decides ATM. Because ATM is undecidable, HALTTM also must be
undecidable.
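Expressed as code, the construction of S from R is just function composition. In the sketch below, halts stands for the hypothetical decider R for HALTTM and simulate for a universal simulator that is only called once halting is guaranteed; both parameters are assumptions introduced for illustration, not real library calls.

def make_atm_decider(halts, simulate):
    """Build the machine S of the proof: if a decider halts(M, w) for HALT_TM
    existed, the returned function would decide A_TM."""
    def S(M, w):
        if not halts(M, w):        # steps 1-2: run R on <M, w>; if R rejects, reject
            return False
        return simulate(M, w)      # steps 3-4: the simulation is now guaranteed to halt
    return S

# Toy illustration, representing "machines" by ordinary terminating Python functions:
S = make_atm_decider(halts=lambda M, w: True, simulate=lambda M, w: M(w))
print(S(lambda w: w == "ab", "ab"))   # True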
Question#2
Q2: a) Let M be an LBA. Exactly how many distinct configurations does M have?
b) How can it be detected that an LBA is looping?
c) In the proof that ALLCFG is undecidable, why is the representation of computation histories
modified, and how is it modified?
Ans: a) Let M be an LBA with q states and g symbols in the tape alphabet. There are exactly q·n·g^n distinct
configurations of M for a tape of length n: q choices of state, n possible head positions, and g^n possible tape
contents. For example, with q = 5 states, g = 3 tape symbols, and n = 4, there are 5·4·3^4 = 1620 configurations.
b) The idea for detecting when M is looping is that as M computes on w, it goes from configuration to
configuration. If M ever repeats a configuration, it would go on to repeat this configuration over and over
again and thus be in a loop. Therefore, if M runs for more than q·n·g^n steps, it must have repeated some
configuration and is looping, so the simulating machine can reject.
c) Why: The reason is that when Ci is popped off the stack, it is in reverse order and not suitable for
comparison with Ci+1. That’s why the representation is modified.
How: Every other configuration appears in reverse order. The odd positions remain written in the forward
order, but the even positions are written backward. Thus, an accepting computation history would appear
as shown below:
# C1 # C2(Reverse) # C3 # C4(Reverse) # ... # Cl #
Group 7
Question#1
Group 8
Group 9
Question#1
Enumeration of all triples of vertices requires O(|V|^3) time. Checking whether or not all three edges belong
to E takes O(|E|) time. Thus, the overall time is O(|V|^3 · |E|), which is polynomial in the length of the
input ⟨G⟩. Therefore, TRIANGLE ∈ P.
Remark: Note that for TRIANGLE, we are looking for a clique of fixed size 3, so even
though the 3 is in the exponent of the time bound, the exponent is a constant, so the time
bound is polynomial. We could modify the above algorithm for TRIANGLE to work for
CLIQUE = {⟨G, k⟩ | G is an undirected graph with a k-clique} by enumerating all collections
of k vertices, where k is the size of the clique desired. But the number of such collections is
(|V| choose k), which is O(|V|^k), so the time bound is O(|V|^k · k · |E|), which is exponential in k. Because k is
part of the input ⟨G, k⟩, the time bound is no longer polynomial. Hence, we cannot use this
algorithm to show that CLIQUE ∈ P. Nor does it show that CLIQUE ∉ P, since we've only shown
that one particular algorithm doesn't have polynomial runtime; there might be another algorithm
for CLIQUE that does run in polynomial time. However, at this time it is unknown whether
CLIQUE ∈ P or CLIQUE ∉ P. Because CLIQUE is NP-complete, this question will be answered if
anyone solves the P vs. NP problem, which is still unresolved.
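A small Python version of the brute-force TRIANGLE test analysed above (here the edge lookup uses a set, so each check is O(1) rather than the O(|E|) scan charged in the analysis; names are illustrative):

from itertools import combinations

def has_triangle(vertices, edges):
    """Enumerate all O(|V|^3) triples of vertices and check whether the three
    connecting edges are all present in the graph."""
    edge_set = {frozenset(e) for e in edges}
    for a, b, c in combinations(vertices, 3):
        if {frozenset((a, b)), frozenset((b, c)), frozenset((a, c))} <= edge_set:
            return True
    return False

print(has_triangle([1, 2, 3, 4], [(1, 2), (2, 3), (1, 3), (3, 4)]))  # True
print(has_triangle([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)]))          # False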
Question#2
M2 = “On input w:
1. Run M1 with input w.
2. If M1 accepts, reject; otherwise, accept.”
b) Convert this Boolean formula into a graph, using the clique theorem.
SOLUTION:
Question#2
NP-hard: A problem is NP-hard if every problem in NP is polynomial-time reducible to it, i.e., it is at
least as hard as any problem in NP.
NP-complete vs NP-hard
Group 11
Question#1
Question 1: To prove that HAMPATH is an NP-complete problem, show that the given Boolean
formula is reducible to HAMPATH.
SOLUTION:
Question#2
Q2: Use following graph to prove that UHAMPATH is NP-complete.
UHAMPATH in G’ is : S, u2in, u2mid, u2out, u1in, u1mid, u1out, u3in, u3mid, u3out, u4in, u4mid, u4out, u5in,
u5mid, u5out, u6in, u6mid, u6out, T
Since the given directed graph G is reduced to the undirected graph G′ in this way (and UHAMPATH is easily seen to be in NP), UHAMPATH is NP-complete.
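The particular graph G is in the missing figure, but the node-splitting that produces G′ (each node u other than s and t becomes the chain u_in, u_mid, u_out, and each directed edge (u, v) becomes the undirected edge {u_out, v_in}) can be sketched in a few lines of Python; the function name and string labels are only illustrative.

def directed_to_undirected_hampath(nodes, edges, s, t):
    """Standard HAMPATH -> UHAMPATH reduction: split every node except s and t
    into an in/mid/out chain and turn each directed edge into one undirected edge."""
    out = lambda u: u if u == s else f"{u}_out"
    inn = lambda v: v if v == t else f"{v}_in"
    new_nodes, new_edges = [], []
    for u in nodes:
        if u in (s, t):
            new_nodes.append(u)
        else:
            new_nodes += [f"{u}_in", f"{u}_mid", f"{u}_out"]
            new_edges += [(f"{u}_in", f"{u}_mid"), (f"{u}_mid", f"{u}_out")]
    for u, v in edges:
        new_edges.append((out(u), inn(v)))
    return new_nodes, new_edges

# Tiny example: s -> u1 -> t becomes the chain s, u1_in, u1_mid, u1_out, t.
print(directed_to_undirected_hampath(["s", "u1", "t"], [("s", "u1"), ("u1", "t")], "s", "t"))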
Group 12
Question#1
What is the yieldability problem in Savitch's theorem, and how can we solve it?
ANSWER:
We are given two configurations of an NTM (nondeterministic Turing machine), c1 and c2, together with a
number t, and we test whether the NTM can get from c1 to c2 within t steps using only f(n) space. This
problem is called the yieldability problem.
c1 is the start configuration,
c2 is the accept configuration, and
t is the maximum number of steps the nondeterministic machine may take.
For this purpose, we define a recursive function CAN_YIELD(c1, c2, t) that checks whether c1 can yield c2
within t steps, as follows:
Function CAN_YIELD(c1, c2, t) {
1. If t = 1, test whether c1 = c2 or whether c1 yields c2 in one step using the rules of the NTM N.
Accept if either test succeeds; reject otherwise.
2. If t > 1, then for each configuration cm of N that uses at most f(n) space:
a. Run CAN_YIELD(c1, cm, t/2).
b. Run CAN_YIELD(cm, c2, t/2).
c. If both accept, accept.
3. If it has not accepted yet, reject.
}
We modify N a bit and define some terms:
• We modify N so that when it accepts, it clears its tape and moves the tape head to the leftmost cell.
We denote this configuration by c(accept).
• Let c(start) be the start configuration of N on w.
• Select a constant d such that N has at most 2^(d·f(n)) configurations (which is an upper bound on
N's running time).
Based on this new N, there exists a DTM M that simulates N as follows:
M = “On input w:
1. Output the result of CAN_YIELD(c(start), c(accept), 2^(d·f(n))).”
• When CAN_YIELD invokes itself recursively, it needs to store c1, c2, t, and the
configuration it is testing (so that these values can be restored upon return from the
recursive call)
• Each level of recursion thus uses O(f(n)) space
• Height of the recursion: log2(2^(d·f(n))) = d·f(n) = O(f(n)) levels
• Total space = O(f(n)) per level × O(f(n)) levels = O(f²(n))
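To make the recursion pattern concrete, here is a small Python sketch of CAN_YIELD run over an explicit, finite configuration graph (configs lists the configurations, and step(c) returns the configurations reachable from c in one move). In the real proof the loop ranges over every configuration of N using f(n) space; the toy graph below only illustrates the halved-t recursion, whose O(log t) = O(f(n)) depth is what gives the f²(n) space bound.

def can_yield(configs, step, c1, c2, t):
    """CAN_YIELD from the proof: can configuration c1 reach c2 within t steps?"""
    if t <= 1:
        return c1 == c2 or c2 in step(c1)
    for cm in configs:                      # try every possible midpoint configuration
        if can_yield(configs, step, c1, cm, t // 2) and can_yield(configs, step, cm, c2, t // 2):
            return True
    return False

# Toy configuration graph: 0 -> 1 -> 2 -> 3, one move per arrow.
configs = [0, 1, 2, 3]
step = lambda c: {c + 1} if c < 3 else set()
print(can_yield(configs, step, 0, 3, 4))   # True: 3 is reachable from 0 within 4 steps
print(can_yield(configs, step, 3, 0, 4))   # False: no moves lead backwards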
Question#2
Explain the relationship between P, NP, PSPACE, NPSPACE and EXPTIME with a hierarchy
diagram.
ANSWER:
P ⊆ NP ⊆ PSPACE = NPSPACE ⊆ EXPTIME. PSPACE = NPSPACE follows from Savitch's theorem, and
P ≠ EXPTIME follows from the time hierarchy theorem, so at least one of the intermediate containments
must be proper, although it is not known which. In the hierarchy diagram these classes are drawn as nested
regions, with P innermost and EXPTIME outermost.
Group 13
Question#1
Question#2
• 1. If φ contains no quantifiers, then it is an expression over constants only, so evaluate φ and accept
if the result is true; otherwise, reject.
• 2. If φ equals ∃x ψ, recursively call T on ψ, first with 0 substituted for x and then with 1 substituted
for x. If either result is accept, then accept; otherwise, reject.
• 3. If φ equals ∀x ψ, recursively call T on ψ, first with 0 substituted for x and then with 1 substituted
for x. If both results are accept, then accept; otherwise, reject.”
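A short Python sketch of this recursive evaluator may help; the tuple representation of formulas used here is only an illustrative encoding, not anything prescribed by the algorithm.

def evaluate(phi, assignment=None):
    """Recursive evaluator for fully quantified Boolean formulas, following the
    steps above.  Formulas are nested tuples such as ("forall", "x", ...)."""
    a = assignment or {}
    op = phi[0]
    if op == "const":
        return phi[1]
    if op == "var":
        return a[phi[1]]
    if op == "not":
        return not evaluate(phi[1], a)
    if op in ("and", "or"):
        left, right = evaluate(phi[1], a), evaluate(phi[2], a)
        return (left and right) if op == "and" else (left or right)
    if op in ("exists", "forall"):          # steps 2 and 3: substitute 0, then 1, for x
        results = [evaluate(phi[2], {**a, phi[1]: b}) for b in (False, True)]
        return any(results) if op == "exists" else all(results)
    raise ValueError(f"unknown operator {op!r}")

# forall x exists y: (x or y) and (not x or not y)  -- true, since y can be chosen as not x.
phi = ("forall", "x", ("exists", "y",
       ("and", ("or", ("var", "x"), ("var", "y")),
               ("or", ("not", ("var", "x")), ("not", ("var", "y"))))))
print(evaluate(phi))   # True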
Group 14
Question#1
Q 1 (a) Describe the Classes L and NL.
Answer: NL = coNL.
We show that the complement of PATH is in NL, and thereby establish that every problem in coNL is
also in NL, because PATH is NL-complete. The NL algorithm M that we present for the complement of
PATH must have an accepting computation whenever the input graph G does not contain a path from s to t.

First, let's tackle an easier problem. Let c be the number of nodes in G that are reachable from s. We
assume that c is provided as an input to M and show how to use c to solve the problem. Later we show how
to compute c.

Given G, s, t, and c, the machine M operates as follows. One by one, M goes through all the m nodes of G
and nondeterministically guesses whether each one is reachable from s. Whenever a node u is guessed to be
reachable, M attempts to verify this guess by guessing a path of length m or less from s to u. If a
computation branch fails to verify this guess, it rejects. In addition, if a branch guesses that t is reachable,
it rejects. Machine M counts the number of nodes that have been verified to be reachable. When a branch
has gone through all of G's nodes, it checks that the number of nodes it verified to be reachable from s
equals c, the number of nodes that actually are reachable, and rejects if not. Otherwise, this branch accepts.
In other words, if M nondeterministically selects exactly c nodes reachable from s, not including t, and
proves that each is reachable from s by guessing the path, M knows that the remaining nodes, including t,
are not reachable, so it can accept.

Next, we show how to calculate c, the number of nodes reachable from s. We describe a nondeterministic
log-space procedure whereby at least one computation branch has the correct value for c and all other
branches reject. For each i from 0 to m, we define Ai to be the collection of nodes that are at a distance of
i or less from s (i.e., that have a path of length at most i from s). So A0 = {s}, each Ai ⊆ Ai+1, and Am
contains all nodes that are reachable from s. Let ci be the number of nodes in Ai. We next describe a
procedure that calculates ci+1 from ci. Repeated application of this procedure yields the desired value of
c = cm.

We calculate ci+1 from ci using an idea similar to the one presented earlier in this proof sketch. The
algorithm goes through all the nodes of G, determines whether each is a member of Ai+1, and counts the
members. To determine whether a node v is in Ai+1, we use an inner loop to go through all the nodes of G
and guess whether each node is in Ai. Each positive guess is verified by guessing the path of length at most
i from s. For each node u verified to be in Ai, the algorithm tests whether (u, v) is an edge of G. If it is an
edge, v is in Ai+1. Additionally, the number of nodes verified to be in Ai is counted. At the completion of
the inner loop, if the total number of nodes verified to be in Ai is not ci, all of Ai has not been found, so
this computation branch rejects. If the count equals ci and v has not yet been shown to be in Ai+1, we
conclude that it isn't in Ai+1. Then we go on to the next v in the outer loop.
Formal description:
P1 = (Q, Σ, Γ, δ, q1, F), where
Q = {q1, q2, q3, q4, q5},
Σ = {a, b, c},
Γ = {x, $},
F = {q5},
and δ is given by the following transitions (all values of δ not listed are ∅):
δ(q1, ε, ε) = {(q2, $)}
δ(q2, a, ε) = {(q2, x)}
δ(q2, ε, ε) = {(q3, ε)}
δ(q3, b, ε) = {(q3, x)}
δ(q3, ε, ε) = {(q4, ε)}
δ(q4, c, x) = {(q4, ε)}
δ(q4, ε, $) = {(q5, ε)}
(ii)
State diagram for the PDA that recognizes D = {a^i b^j c^k | i, j, k ≥ 0, and i = j or j = k} is given
below:
The PDA has a nondeterministic branch at q1. If the string is a^i b^j c^k with i = j, then the PDA takes the
branch from q1 to q2. If the string is a^i b^j c^k with j = k, then the PDA takes the branch from q1 to q5.
Formal Description:
P2 = (Q, Σ, Γ, δ, q1, F), where
Q = {q1, q2, q3, q4, q5, q6, q7, q8},
Σ= {a, b, c},
Γ= {a, b, $},
F = {q4, q8},
The transition function δ is given by the following transitions (all values of δ not listed are ∅):
δ(q1, ε, ε) = {(q2, $), (q5, $)}
δ(q2, a, ε) = {(q2, a)}
δ(q2, ε, ε) = {(q3, ε)}
δ(q3, b, a) = {(q3, ε)}
δ(q3, ε, $) = {(q4, ε)}
δ(q4, c, ε) = {(q4, ε)}
δ(q5, a, ε) = {(q5, ε)}
δ(q5, ε, ε) = {(q6, ε)}
δ(q6, b, ε) = {(q6, b)}
δ(q6, ε, ε) = {(q7, ε)}
δ(q7, c, b) = {(q7, ε)}
δ(q7, ε, $) = {(q8, ε)}
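As a check on the transition function above, the following Python sketch simulates P2 by breadth-first search over its configurations (state, position in the input, stack). The dictionary below restates the δ listed above, with '' standing for ε; the code itself is only an illustration, and all names are ours.

from collections import deque

DELTA = {
    ("q1", "", ""): {("q2", "$"), ("q5", "$")},
    ("q2", "a", ""): {("q2", "a")},
    ("q2", "", ""): {("q3", "")},
    ("q3", "b", "a"): {("q3", "")},
    ("q3", "", "$"): {("q4", "")},
    ("q4", "c", ""): {("q4", "")},
    ("q5", "a", ""): {("q5", "")},
    ("q5", "", ""): {("q6", "")},
    ("q6", "b", ""): {("q6", "b")},
    ("q6", "", ""): {("q7", "")},
    ("q7", "c", "b"): {("q7", "")},
    ("q7", "", "$"): {("q8", "")},
}
ACCEPT = {"q4", "q8"}

def pda_accepts(w: str) -> bool:
    """Nondeterministic simulation of P2: explore all reachable configurations."""
    start = ("q1", 0, ())
    seen, frontier = {start}, deque([start])
    while frontier:
        state, pos, stack = frontier.popleft()
        if pos == len(w) and state in ACCEPT:
            return True                            # all input read and an accept state reached
        for (q, a, x), targets in DELTA.items():
            if q != state:
                continue
            if a and (pos >= len(w) or w[pos] != a):
                continue                           # required input symbol not available
            if x and (not stack or stack[-1] != x):
                continue                           # required stack top not present
            new_pos = pos + (1 if a else 0)
            popped = stack[:-1] if x else stack
            for r, push in targets:
                cfg = (r, new_pos, popped + ((push,) if push else ()))
                if cfg not in seen:
                    seen.add(cfg)
                    frontier.append(cfg)
    return False

# "aabbccc" has i = j, "abbcc" has j = k, and "aabccc" has neither.
assert pda_accepts("aabbccc") and pda_accepts("abbcc") and not pda_accepts("aabccc")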
Q2: (i) Prove the lemma: “If a pushdown automaton recognizes some language, then it is context free”.
(ii) Convert the following pushdown automaton to an equivalent CFG according to the procedure given
in the proof of the above lemma.
Answer: (i)
Proof: We have a PDA P, and we want to make a CFG G that generates all the strings that P accepts.
Say that P = (Q, Σ, Γ, δ, q0, {qaccept}) and construct G.
• First, we simplify our task by modifying P slightly to give it the following three features.
1. It has a single accept state, qaccept.
2. It empties its stack before accepting.
3. Each transition either pushes a symbol onto the stack (a push move) or pops one off the
stack (a pop move), but it does not do both at the same time.
• The variables of G are {Apq | p, q ∈ Q}.
• The start variable is Aq0,qaccept.
• Now we describe G's rules in three parts.
1. For each p, q, r, s ∈ Q, u ∈ Γ, and a, b ∈ Σε, if δ(p, a, ε) contains (r, u) and δ(s, b, u)
contains (q, ε), put the rule Apq → a Ars b in G.
2. For each p, q, r ∈ Q, put the rule Apq → Apr Arq in G.
3. Finally, for each p ∈ Q, put the rule App → ε in G.
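The three rule types translate almost literally into code. The Python sketch below generates them for a PDA already in the simplified form described above (single accept state, stack emptied before accepting, every move a pure push or pure pop); the representation of δ and all names are illustrative assumptions.

from itertools import product

def pda_to_cfg_rules(states, input_alpha, stack_alpha, delta, q0, q_accept):
    """delta maps (state, input symbol or '', popped stack symbol or '') to a set
    of (next state, pushed stack symbol or '') pairs.  Returns the start variable
    and a list of rules (A, body), where an empty body means the rule A -> ε."""
    A = lambda p, q: f"A_{p}{q}"
    rules = []
    # Type 1: A_pq -> a A_rs b whenever p pushes u to reach r on input a,
    # and s pops that same u to reach q on input b.
    for p, q, r, s in product(states, repeat=4):
        for u in stack_alpha:
            for a, b in product(list(input_alpha) + [""], repeat=2):
                if (r, u) in delta.get((p, a, ""), set()) and (q, "") in delta.get((s, b, u), set()):
                    rules.append((A(p, q), [a, A(r, s), b]))
    # Type 2: A_pq -> A_pr A_rq for every triple of states p, r, q.
    for p, r, q in product(states, repeat=3):
        rules.append((A(p, q), [A(p, r), A(r, q)]))
    # Type 3: A_pp -> ε for every state p.
    for p in states:
        rules.append((A(p, p), []))
    return A(q0, q_accept), rules

# Toy PDA in simplified form: push a 0 for every input 0 in state p, then pop one
# 0 for every input 1 while moving to (and staying in) the accept state q.
start, rules = pda_to_cfg_rules(
    states=["p", "q"], input_alpha="01", stack_alpha=["0"],
    delta={("p", "0", ""): {("p", "0")},
           ("p", "1", "0"): {("q", "")},
           ("q", "1", "0"): {("q", "")}},
    q0="p", q_accept="q")
print(start)                                                    # A_pq
print([r for r in rules if r[0] == "A_pq" and len(r[1]) == 3])  # the two type-1 rules generating 0^n 1^n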