CHAPTER 9
Turing Machines

The Turing machine exhibits many of the features commonly associated with a modern computer. This is no accident; the Turing machine predated the stored-program computer and provided a model for its design and development. Utilizing a sequence of elementary operations, a Turing machine may access and alter any memory position. A Turing machine, unlike a computer, has no limitation on the amount of time or memory available for a computation. The Turing machine is another step in the development of finite-state computing machines. In a sense to be made precise in Chapters 11 and 13, this class of machines represents the ultimate achievement in the design of abstract computing devices.

9.1 The Standard Turing Machine

A Turing machine is a finite-state machine in which a transition prints a symbol on the tape. The tape head may move in either direction, allowing the machine to read and manipulate the input as many times as desired. The structure of a Turing machine is similar to that of a finite automaton, with the transition function incorporating these additional features.

Definition 9.1.1
A Turing machine is a quintuple M = (Q, Σ, Γ, δ, q0) where Q is a finite set of states, Γ is a finite set called the tape alphabet, Γ contains a special symbol B that represents a blank, Σ is a subset of Γ − {B} called the input alphabet, δ is a partial function from Q × Γ to Q × Γ × {L, R} called the transition function, and q0 ∈ Q is a distinguished state called the start state.

The tape of a Turing machine extends indefinitely in one direction. The tape positions are numbered by the natural numbers, with the leftmost position numbered zero.

[Figure: a one-way tape with positions 0, 1, 2, ... and the tape head scanning position zero.]

A computation begins with the machine in state q0 and the tape head scanning the leftmost position. The input, a string from Σ*, is written on the tape beginning at position one. Position zero and the remainder of the tape are assumed to be blank. The tape alphabet provides additional symbols that may be used during a computation.

A transition consists of three actions: changing the state, writing a symbol on the square scanned by the tape head, and moving the tape head. The direction of the movement is specified by the final component of the transition. An L indicates a move of one tape position to the left and R one position to the right.

[Figure: a machine configuration before and after the transition δ(qi, x) = [qj, y, L].]

The transition changed the state from qi to qj, replaced the tape symbol x with y, and moved the tape head one square to the left. The ability of the machine to move in both directions and process blanks introduces the possibility of a computation continuing indefinitely.

A Turing machine halts when it encounters a state, symbol pair for which no transition is defined. A transition from tape position zero may specify a move to the left of the boundary of the tape. When this occurs, the computation is said to terminate abnormally. When we say that a computation halts, we mean that it terminates in a normal fashion.

Turing machines are designed to perform computations on strings from the input alphabet. A computation begins with the tape head scanning the leftmost tape square and the input string beginning at position one. All tape squares to the right of the input string are assumed to be blank. The Turing machine defined in Definition 9.1.1, with initial conditions as described above, is referred to as the standard Turing machine.
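Definition 9.1.1 maps directly onto a small program. The sketch below is not taken from the text; it is a minimal Python rendering of a standard Turing machine in which the transition function is a dictionary keyed by (state, symbol) and the tape is a dictionary from positions to symbols. The sample transition table is our transcription of the a/b-interchanging machine of Example 9.1.1, which follows.

```python
BLANK = "B"

def run(delta, q0, w, max_steps=10_000):
    """Execute a standard Turing machine (Definition 9.1.1) on input w.

    delta maps (state, tape symbol) -> (new state, written symbol, 'L' or 'R');
    a missing entry means the machine halts.  Returns (state, tape, normal),
    where normal is False if the head moved left of position zero.  The step
    cap only guards this sketch against nonterminating computations.
    """
    tape = {i + 1: s for i, s in enumerate(w)}   # input starts at position one
    state, head = q0, 0                          # head starts at position zero
    for _ in range(max_steps):
        key = (state, tape.get(head, BLANK))
        if key not in delta:
            return state, tape, True             # normal halt
        new_state, written, move = delta[key]
        tape[head] = written
        state = new_state
        head += 1 if move == "R" else -1
        if head < 0:
            return state, tape, False            # abnormal termination
    raise RuntimeError("step cap reached (sketch only)")

# Our transcription of the machine in Example 9.1.1: it interchanges a's
# and b's and then returns the head to position zero.
delta = {
    ("q0", BLANK): ("q1", BLANK, "R"),
    ("q1", "a"): ("q1", "b", "R"),
    ("q1", "b"): ("q1", "a", "R"),
    ("q1", BLANK): ("q2", BLANK, "L"),
    ("q2", "a"): ("q2", "a", "L"),
    ("q2", "b"): ("q2", "b", "L"),
}

state, tape, normal = run(delta, "q0", "abab")
print(state, "".join(tape.get(i, BLANK) for i in range(max(tape) + 1)), normal)
# prints: q2 BbabaB True
```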
Example 9.1.1
The transition function of a standard Turing machine with input alphabet {a, b} is given below. The transition from state q0 moves the tape head to position one to read the input. The transitions in state q1 read the input string and interchange the symbols a and b. The transitions in q2 return the machine to the initial position.

δ      B            a            b
q0     q1, B, R
q1     q2, B, L     q1, b, R     q1, a, R
q2                  q2, a, L     q2, b, L

A Turing machine can be graphically represented by a state diagram. The transition δ(qi, x) = [qj, y, d], d ∈ {L, R}, is depicted by an arc from qi to qj labeled x/y d. The state diagram

[State diagram: an arc B/B R from q0 to q1; loops a/b R and b/a R on q1; an arc B/B L from q1 to q2; loops a/a L and b/b L on q2.]

represents the Turing machine defined above.

A machine configuration consists of the state, the tape, and the position of the tape head. At any step in a computation of a standard Turing machine, only a finite segment of the tape is nonblank. A configuration is denoted uqivB, where all tape positions to the right of the B are blank and uv is the string spelled by the symbols on the tape from the left-hand boundary to the B. Blanks may occur in the string uv; the only requirement is that the entire nonblank portion of the tape be included in uv. The notation uqivB indicates that the machine is in state qi scanning the first symbol of v and the entire tape to the right of uv is blank.

This representation of machine configurations can be used to trace the computations of a Turing machine. The notation uqivB ⊢_M xqjyB indicates that the configuration xqjyB is obtained from uqivB by a single transition of M. Following the standard conventions, uqivB ⊢*_M xqjyB signifies that xqjyB can be obtained from uqivB by a finite number, possibly zero, of transitions. The reference to the machine is omitted when there is no possible ambiguity.

The Turing machine in Example 9.1.1 interchanges the a's and b's in the input string. Tracing the computation generated by the input string abab yields

q0BababB ⊢ Bq1ababB ⊢ Bbq1babB ⊢ Bbaq1abB ⊢ Bbabq1bB ⊢ Bbabaq1B ⊢ Bbabq2aB ⊢ Bbaq2baB ⊢ Bbq2abaB ⊢ Bq2babaB ⊢ q2BbabaB.

Example 9.1.2
The Turing machine COPY with input alphabet {a, b} produces a copy of the input string. That is, a computation that begins with the tape having the form BuB terminates with the tape BuBuB.

[State diagram of COPY.]

The computation copies the input string one symbol at a time, beginning with the leftmost symbol in the input. Tape symbols X and Y record the portion of the input that has been copied. The first unmarked symbol in the string specifies the arc to be taken from state q1. The cycle q1, q2, q3, q4, q1 replaces an a with X and adds an a to the string being constructed. Similarly, the lower branch copies a b, using Y to mark the input string. After the entire string has been copied, the X's and Y's are changed back to a's and b's.

9.2 Turing Machines as Language Acceptors

Turing machines have been introduced as a paradigm for effective computation. A Turing machine computation consists of a sequence of elementary operations. The machines constructed in the previous section were designed to illustrate the features of Turing machine computations. The computations read and manipulated the symbols on the tape; no interpretation was given to the result of a computation. Turing machines can be designed to accept languages and to compute functions. The result of a computation can be defined in terms of the state in which the computation terminates or the configuration of the tape at the end of the computation.
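The uqivB notation lends itself to a small tracing utility. The sketch below is not from the text; it re-creates the trace of the machine of Example 9.1.1 on the input abab, printing each configuration with the state written immediately to the left of the scanned square and a single trailing blank, as in the convention above.

```python
BLANK = "B"

DELTA = {
    ("q0", BLANK): ("q1", BLANK, "R"),
    ("q1", "a"): ("q1", "b", "R"),
    ("q1", "b"): ("q1", "a", "R"),
    ("q1", BLANK): ("q2", BLANK, "L"),
    ("q2", "a"): ("q2", "a", "L"),
    ("q2", "b"): ("q2", "b", "L"),
}

def configuration(tape, state, head):
    """Render a configuration in the uq_ivB form described above."""
    last = max([i for i, s in tape.items() if s != BLANK] + [head])
    cells = [tape.get(i, BLANK) for i in range(last + 1)]
    if cells[-1] != BLANK:
        cells.append(BLANK)            # show exactly one trailing blank
    return "".join(cells[:head]) + state + "".join(cells[head:])

def trace(delta, q0, w):
    tape = {i + 1: s for i, s in enumerate(w)}
    state, head = q0, 0
    configs = [configuration(tape, state, head)]
    while head >= 0 and (state, tape.get(head, BLANK)) in delta:
        new_state, written, move = delta[(state, tape.get(head, BLANK))]
        tape[head] = written
        state = new_state
        head += 1 if move == "R" else -1
        configs.append(configuration(tape, state, head))
    return configs

print(" |- ".join(trace(DELTA, "q0", "abab")))
# q0BababB |- Bq1ababB |- Bbq1babB |- ... |- Bq2babaB |- q2BbabaB
```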
In this section we consider the use of Turing machines as language acceptors; a computation accepts or rejects the input string. Initially, acceptance is defined by the final state of the computation. This is similar to the technique used by finite-state and pushdown automata to accept strings. Unlike finite-state and pushdown automata, a Turing machine need not read the entire input string to accept the string. A Turing machine augmented with final states is a sextuple (Q, Σ, Γ, δ, q0, F), where F ⊆ Q is the set of final states.

Definition 9.2.1
Let M = (Q, Σ, Γ, δ, q0, F) be a Turing machine. A string u ∈ Σ* is accepted by final state if the computation of M with input u halts in a final state. A computation that terminates abnormally rejects the input regardless of the state in which the machine terminates. The language of M, L(M), is the set of all strings accepted by M.

A language accepted by a Turing machine is called a recursively enumerable language. The ability of a Turing machine to move in both directions and process blanks introduces the possibility that the machine may not halt for a particular input. A language that is accepted by a Turing machine that halts for all input strings is said to be recursive. Being recursive is a property of a language, not of a Turing machine that accepts it. There are multiple Turing machines that accept a particular language; some may halt for all input while others may not. The existence of one Turing machine that halts for all inputs is sufficient to show that the language is recursive.

Membership in a recursive language is decidable; the computations of a Turing machine that halts for all inputs provide a decision procedure for determining membership. The state of the machine in which the computation terminates indicates whether the input string is in the language. The terms recursive and recursively enumerable have their origin in the functional interpretation of Turing computability that will be presented in Chapter 13.

Example 9.2.1
The Turing machine

[State diagram: q0 moves right to q1 on the blank; q1 loops on b and moves to q2 on a; q2 returns to q1 on b and enters the final state q3 on a.]

accepts the language (a ∪ b)*aa(a ∪ b)*. The computation

q0BaabbB ⊢ Bq1aabbB ⊢ Baq2abbB ⊢ Baaq3bbB

examines only the first half of the input before accepting the string aabb. The language (a ∪ b)*aa(a ∪ b)* is recursive; the computations of M halt for every input string. A successful computation terminates when a substring aa is encountered. All other computations halt upon reading the first blank following the input.

Example 9.2.2
The language {a^i b^i c^i | i ≥ 0} is accepted by the Turing machine

[State diagram of the machine; the tape symbols X, Y, and Z replace the matched a's, b's, and c's.]

The tape symbols X, Y, and Z mark the a's, b's, and c's as they are matched. A computation successfully terminates when all the symbols in the input string have been transformed to the appropriate tape symbol. A transition on the blank in the state that reads the first input symbol accepts the null string.

9.3 Alternative Acceptance Criteria

The acceptance of a string by the Turing machine defined in Definition 9.2.1 is based upon the state of the machine when the computation halts. Alternative approaches to defining the language of a Turing machine are presented in this section.

The first alternative is acceptance by halting. In a Turing machine that is designed to accept by halting, an input string is accepted if the computation initiated with the string causes the Turing machine to halt. Computations for which the machine terminates abnormally reject the string.
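Before the formal treatment of acceptance by halting below, a sketch contrasting the two criteria may help. The transition table is our reading of the machine of Example 9.2.1 for (a ∪ b)*aa(a ∪ b)*, and the function names are ours. Because this particular machine halts on every input, applying the halting criterion to it would accept every string, which is why acceptance by halting is used only with machines designed for it.

```python
BLANK = "B"

# Transition table transcribed from Example 9.2.1 as best it can be read from
# the scan; q3 is the single final state and has no outgoing arcs.
DELTA = {
    ("q0", BLANK): ("q1", BLANK, "R"),
    ("q1", "a"): ("q2", "a", "R"),
    ("q1", "b"): ("q1", "b", "R"),
    ("q2", "a"): ("q3", "a", "R"),
    ("q2", "b"): ("q1", "b", "R"),
}
FINAL = {"q3"}

def run(delta, w, q0="q0", max_steps=10_000):
    """Return (halting state, normal); normal is False for abnormal termination."""
    tape = {i + 1: s for i, s in enumerate(w)}
    state, head = q0, 0
    for _ in range(max_steps):
        key = (state, tape.get(head, BLANK))
        if key not in delta:
            return state, True
        new_state, written, move = delta[key]
        tape[head] = written
        state = new_state
        head += 1 if move == "R" else -1
        if head < 0:
            return state, False
    raise RuntimeError("step cap reached (sketch only)")

def accepts_by_final_state(w):
    """Definition 9.2.1: accept when the computation halts in a final state."""
    state, normal = run(DELTA, w)
    return normal and state in FINAL

def accepts_by_halting(w):
    """The criterion just introduced: accept whenever the computation halts
    normally.  On this machine, which halts for every input, it would accept
    all of {a,b}*; it is meaningful only for machines built to accept by
    halting (Definition 9.3.1 below)."""
    _, normal = run(DELTA, w)
    return normal

print([w for w in ["aabb", "ab", "baab", "bbb"] if accepts_by_final_state(w)])
# ['aabb', 'baab']
```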
When acceptance is defined by halting, the machine is defined by the quintuple (Q, Σ, Γ, δ, q0). The final states are omitted since they play no role in the determination of the language of the machine.

Definition 9.3.1
Let M = (Q, Σ, Γ, δ, q0) be a Turing machine that accepts by halting. A string u ∈ Σ* is accepted by halting if the computation of M with input u halts (normally).

Theorem 9.3.2
The following statements are equivalent:
i) The language L is accepted by a Turing machine that accepts by final state.
ii) The language L is accepted by a Turing machine that accepts by halting.

Proof Let M = (Q, Σ, Γ, δ, q0) be a Turing machine that accepts L by halting. The machine M' = (Q, Σ, Γ, δ, q0, Q), in which every state is a final state, accepts L by final state.

Conversely, let M = (Q, Σ, Γ, δ, q0, F) be a Turing machine that accepts the language L by final state. Define the machine M' = (Q ∪ {qf}, Σ, Γ, δ', q0) that accepts by halting as follows:
i) If δ(qi, x) is defined, then δ'(qi, x) = δ(qi, x).
ii) For each state qi ∈ Q − F, if δ(qi, x) is undefined, then δ'(qi, x) = [qf, x, R].
iii) For each x ∈ Γ, δ'(qf, x) = [qf, x, R].

Computations that accept strings in M and M' are identical. An unsuccessful computation in M may halt in a rejecting state, terminate abnormally, or fail to terminate. When an unsuccessful computation in M halts, the computation in M' enters the state qf. Upon entering qf, the machine moves indefinitely to the right. The only computations that halt in M' are those that are generated by computations of M that halt in an accepting state. Thus L(M') = L(M). ∎

Example 9.3.1
The Turing machine from Example 9.2.1 is altered to accept (a ∪ b)*aa(a ∪ b)* by halting. The machine below is constructed as specified by Theorem 9.3.2. A computation enters qf only when the entire input string has been read and no aa has been encountered.

[State diagram of the altered machine.]

The machine obtained by deleting the arcs from q0 to qf labeled a/a R and b/b R also accepts (a ∪ b)*aa(a ∪ b)* by halting.

In Exercise 7 a type of acceptance, referred to as acceptance by entering, is introduced that uses final states but does not require the accepting computations to terminate. A string is accepted if the computation ever enters a final state; after entering a final state, the remainder of the computation is irrelevant to the acceptance of the string. As with acceptance by halting, any Turing machine designed to accept by entering can be transformed into a machine that accepts the same language by final state.

Unless noted otherwise, Turing machines will accept by final state as in Definition 9.2.1. The alternative definitions are equivalent in the sense that machines designed in this manner accept the same family of languages as those accepted by standard Turing machines.

9.4 Multitrack Machines

The remainder of the chapter is dedicated to examining variations of the standard Turing machine model. Each of the variations appears to increase the capability of the machine. We prove that the languages accepted by these generalized machines are precisely those accepted by the standard Turing machines. Additional variations will be presented in the exercises.

A multitrack tape is one in which the tape is divided into tracks.
A tape position in an nntrack tape contains m symbols from the tape alphabet, The diagram depicts a two-track ‘ape with the tape head scanning the second position,268 Chapter Turing Machines Track 2 Track | ‘The machine reads an entire tape position, Multiple tracks increase the amount of informa- tion that can be considered when determining the appropriate transition. A tape position in a 1wo-track machine is represented by the ordered pair (x, y, where x is the symbol in track 1 and y in track 2 The states, input alphabet, tape alphabet, initial state, and final states of a two- track machine are the same as in the standard Turing machine. A two-track transition reads and rewrites the entire tape position. A transition of a two-track machine is writen 5(qi. (x.y) = [gi [zw d}, where d € {LR} ‘The input t0 a two-track machine is placed in the standard input position in track 1 All the positions in track 2 are initially blank, Acceptance in multitrack machines is by final state ‘We will show that the languages accepted by two-track machines are precisely the recursively enumerable languages. The argument easily generalizes to n-track machines ‘Theorem 9.4.1 A language L is accepted by a two-track Turing machine if, and only if, it is accepted by a standard Turing machine. Proof Clearly, if L is accepted by a standard Turing machine it is accepted by a two- track machine, The equivalent two-track machine simply ignores the presence of the sec- ond track Let M =(Q, ©, 1, 6, qo, F) be a two-track machine, A one-track machine will be constructed in which a singe tape square contains the same information as a tape position in the twostrack tape. The representation ofa two-track tape positon as an ordered pair indicates how this can be accomplished. The tape alphabet of the equivalent one-track ‘machine M’ consists of ordered pais of tape elements of M. The input to the two-track ‘machine consists of ordered pairs whose second component is blank. The input symbol a of M is identified with the ordered pair [a, of M'. The one-track machine =(QE «(BLP x P,8q0,F) with transition function Sisley) = 6x yD accepts LOM). .95 Two-Way Tape Machines 269 9.5 Two-Way Tape Machines ‘A Turing machine with a two-way tape is identical to the standard model except that the tape extends indefinitely in both directions. Since a two-way tape has no left boundary, the input can be placed anywhere on the tape. Al other tape positions are assumed to be blank. The tape head is initially positioned on the blank to the immediate left of the input string. ‘A machine with a two-way tape can be constructed to simulate the actions of a stan- dard machine by placing a special symbol on the tape to represent the left-hand boundary of the one-way tape. The symbol #, which is assumed not to be an element of the tape alphabet ofthe standard machine, is used to simulate the boundary of the tape. A computa- tion in the equivalent machine with two-way tape begins by writing # to the immediate left of the initial tape head position, The remainder of a computation inthe two-way machine is ‘dentical to that ofthe one-way machine except when the computation of the one-way ma chine terminates abnormally, When the one-way computation attempts to move to the left Of the tape boundary, the two-way machine reads the symbol # and enters a nonaccepting, state that terminates the computation, The standard Turing machine M accepts strings over (a,b) in which the first 6, if present, is preceded by at least three a's, aR ala dal ala BIB a bie. 
me @eee a BL @ BL Gy [All the states of M other than qo are accepting. When the first b is encountered, the tape head moves four positions to the left, if possible. Acceptance is completely deter- ‘mined by the boundary of the tape. A string is rejected by M whenever the tape head attempts to cross the left-hand boundary. ll computations that remain within the bounds of the tape accept the input ‘The transitions from states q, and q, insert the simulated endmarker to the left of the initial position of the tape head of M', the wwo-way machine that accepts L(M). After ‘writing the simulated boundary, the computation enters a copy of the one-way machine M. The failure state q is entered in M’ when a computation in M attempts to move to the left of the tape boundary.270 Chapter9 Turing Machines y ala L BBL ‘We will now show that a language accepted by a machine with a two-way tape is accepted by a standard Turing machine, The argument utilizes Theorem 9.4.1, which establishes the interdefinability of two-track and standard machines. The tape positions of the two-way tape can be numbered by the complete set of integers. The initial position of the tape head is numbered zero, and the input begins at position one. 4% Imagine taking the two-way infinite tape and folding it so that position ~ sits di rectly above position . Adding an unnumbered tape square over position zero produces ‘a two-track tape. The symbol in tape position / of the two-way tape is stored in the cor- responding position of the one-way, (wo-track tape. A computation on a two-way infinite tape can be simulated on this one-way, two-track tape, : LetM = (Q, E, 1, 5, gp, F) bea Turing machine with a two-way tape. Using the cor- respondence between a two-way tape and two-track tape, we construct a Turing machine (M’ with two-track, one-way tape to accept L(M). A transition of M is specified by the state95 Two-Way Tape Machines 271 and the symbol scanned. M’, scanning a two-track tape, reads two symbols at each tape position. The symbols U (up) and D (down) are introduced to designate which ofthe two tracks should be used to determine the transition. This information is maintained by the states ofthe two-track machine. ‘The components of M’ are constructed from those of M and the symbols U/ and D. Y= QU (qu geh) x UD} Bead rerum P= ((g.U). (gn DI gi € Fh, The initial state of M’ isa pair [qy, D]. The transition from this state writes the marker # ‘on the upper track in the leftmost tape position. A transition from [q,, D] returns the tape head to its original position to begin the simulation of a computation of M. The transitions ‘of M’ are defined as follows 1. 8'qs, DI, (B, BY) = qr, DI (8.41, RI. 2. Forevery x €1,5((gr, Dh. x. B) = [lqo, Dh [xB], LI 3. For every 2€ 1 [#} and d € (L,R}, 8(Ugis Dh Ut ever 5(4i,£) = [aja isa transition of M, 4, Forevery xe — (#) andd ¢ (L, R}, 8'UgiUI x1) =llg/.U, [eyh 1 when ever 8(qi,) = [q), sd] isa transition of M, where d’ is the opposite direction of d. 5. 81g. Dh (eA) = Ug. UL, [ys]. R] whenever 8(q).2) = [qy. £1 is a transition of M. 6. 8g. Dh, (x. = lq). Dh Lvl, RY whenever 3(g;,) = [aie¥s R] is a transition of M. 7. 8(qi.U), be MD) = lay, DI, [yA], R] whenever 8(g,,.) = [qj.¥sR1 is & transition of M 8. 5(qi.U), Lx.) = [lay UI, [yA], R] whenever 8(g;,x) = [aj ys] is a transition of. lq). DL fy. zh dl when- A transition generated by schema 3 simulates a transition of M in which the tape head begins and ends in positions labeled with nonnegative values. 
In the simulation, this is represented by writing on the lower track of the tape. Transitions defined in 4 use only the upper track of the two-track tape. These correspond to transitions of M that occur to the left of position zero on the two-way infinite tape. ‘The remaining transitions simulate the transitions of M from position zero on the two-way tape, Regardless of the U or D in the state, transitions from position zero are determined by the tape symbol on track 1. When the track is specified by D, the transition is defined by schema 5 or 6. Transitions defined in 7 and 8 are applied when the state is (a. U1272 Chapter Turing Machines ‘The preceding informal arguments outline the proof of the equivalence of one-way and two-way Turing machines. Theorem 9.5.1 A language L is accepted by a Turing machine with a two-way tape if, and only if, itis accepted by a standard Turing machine 9.6 Multitape Machines ‘A ketape machine consists of k tapes and k independent tape heads. The states and alpha- bets of a multitape machine are the same as in a standard Turing machine. The machine reads the tapes simultaneously but has only one state. This is depicted by connecting each of the independent tape heads to a single control indicating the current state tres ([T . LLE t Tape 2 \ Tape 1 ‘A transition is determined by the state and the symbols scanned by each of the tape heads. A transition in a multitape machine may i) change the state ii) write a symbol on each of the tapes iit) independently reposition each of the tape heads. The repositioning consists of moving the tape head one square to the left or one square to the right or leaving it at its current position. A transition of a two-tape machine scanning x, on tape 1 and x2 on tape 2is written 8(q,.¥1,2) = [43 Yiedis where x1.) €T and d; € {L, R, S}. This transition causes the machine to write y; on tape 7. The symbol d, specifies the direction of the movement of tape head i: L signifies a move to the left, R & ‘move to the right, and S means the head remains stationary. ‘The input to a multitape machine is placed in the standard position on tape 1. All the ‘other tapes are assumed to be blank. The tape heads originally scan the leftmost position of each tape. Any tape head attempting to move to the left of the boundary of its tape terminates the computation abnormally.96 Mutttape Machines 273 ‘A standard Turing machine is a multitape Turing machine with a single tape. Con- sequently, every recursively enumerable language is accepted by a multitape machine. A ‘computation in a two-tape machine can be simulated by a computation in a five-track ma- chine. The argument can be generalized to show that any language accepted by a k-tape machine is accepted by a 2k + I-track machine. Theorem 9.6.1 A language L is accepted by a multitape Turing machine if, and only if, i standard Turing machine is accepted by a Let M =(Q, E, 1, 8, go, F) be a two-tape machine, The tape heads of a multiape machine are independently positioned on the two tapes. Tape 2 Tape 1 ‘The single tape head of a multitrack machine reads all the tracks of a fixed position. The five-track machine M’ is constructed to simulate the computations of M. Tracks | and 3 maintain the information stored on tapes | and 2 of the two-tape machine. Tracks 2 and 4 have a single nonblank square indicating the position of the tape heads of the mulitape machine, Tracks [# 1 Track 4 Socecct tas [apoyo fefey ck? 
| 1 me Pb ‘The initial action of the simulation in the multitrack machine is to write # in the left- ‘most position of track 5 and X in the leftmost positions of tracks 2 and 4. The remainder of the computation of the multitrack machine consists of a sequence of actions that simulate the transitions ofthe two-tape machine. ‘A ansition ofthe two-tape machine is determined by the two symbols being scanned and the machine state. The simulation in the five-track machine records the symbols marked by each of the X's. The states are S-tuples of the form [s, gis X1s 42. ¥1s 92 dis ds}, where qr €Q: xi, ye © EU (Us and d; € [L, R,S,U). The element s represents the274 Chapter Turing Machines status of the simulation ofthe transition of M. The symbol U, added tothe tape alphabet and the set of directions, indicates that this item is unknown, Let 8(gi,81-42) = (4): yisdis y2sda] be a two-tape transition. M’ begins the simu lation of a transition of M in the siate [f1, qi U, U, U, U, U, U}. The following five actions simulate transition of M in the multitrack machine, 1. 1 (ind frst symbol: M’ moves to the right until it reads the X on track 2. State Lf 1. gi, x1, U. U, U, U, U] is entered, where x; is the symbol in track | under the X. After recording the symbol on track | in the state, M’ returns to the initial position ‘The # on track 5 is used to reposition the tape head {2 (find second symbol): The same sequence of actions records the symbol beneath the X on track 4. M’ enters state [f2, qi. 41,2. U, Uy Uy U] where x2 isthe symbol in track 3 under the X. The tape head is then returned to the initial position. 3. M'enters the state [p], gj, 115 425 Ys 925i, del. This state contains the information needed to simulate the transition of the M 4. pl (print frst symbol): M’ moves tothe right to the X in track 2 and writes the symbol 1 on track 1. The X on track 2 is moved in the direction designated by di. The machine then returns tothe initial position, 5. p2 (print second symbol): M’ moves to the right to the X in track 4 and writes the symbol 92 on track 3, The X on track 4 is moved in the direction designated by d ‘The simalation eycle terminates by returning the tape head tothe initial position 1f8(gi,411,22) is undefined in the two-tape machine, the simulation halts after return. ing to the initial position following step 2. A state [/2, qi, 41, 1, U, U, U, U1 is a final state of the multitrack machine M’ whenever qj isa final state of M. Example 9.6.1 ‘The set (at |k isa perfect square) is a recursively enumerable language. The design of a three-tape machine that accepts this language is presented. Tape | contains the input sting ‘The input is compared with a string of X°s on tape 2 whose length is a perfect square Tape 3 holds a string whose length is the square root ofthe string on tape 2. The initial configuration for a computation with input aaaaa is Tape 3 I Cl Tape2—e [| Tape 1 input [[[[@ [= [ ]96 Mutttape Machines 275 ‘The values of and are incremented until the length of the string on tape 2s greater than or equal to the length of the input, A machine to perform these comparisons consists, of the following actions: 1, AF the inpot is the nul sting, the computation halts in an accepting state. If not, tapes 2 and 3 are initialized by writing X in position one. The three tape heads are then moved 1 position one. 2. Tape 3 contains a sequence of k X's and tape 2 contains k? X's. Simultaneously, the hheads on tapes 1 and 2 move tothe right while both heads scan nonblank squares. 
The hhead reading tape 3 remains at position one. ) If both heads simultaneously read a blank, the computation halts and the string is accepted, b) If tape head 1 reads a blank and tape head 2 an X, the computation halts and the string is rejected, 3. The tapes are reconfigured for comparison with the next perfect square. a) An X is added to the right end of the string of X's on tape 2 ) Two copies ofthe string on tape 3 are concatenated to the right end of the string on tape 2. This constructs a sequence of (k + 1)? Xs on tape 2. ©) An X is added to the right end of the string of X°s on tape 3. This constructs a sequence of +1 X's on tape 3. 4) The tape heads are then repositioned at position one of their respective tapes 4, Steps 2 through 4 are repeated. ‘Tracing the computation forthe input string aaaaa, step | produces the configuration res x ne Taper [[De I He Tape 1 —input [fe Tale [aTo 1 (fe ‘The simultaneous left-to-right movement of tape heads 1 and 2 halts when tape head 2-scans the blank in position two.276 Chapter Turing Machines were fT EEE vet—iee (eTETEELEEECE H Part (c) of step 3 reformats tapes 2 and 3 so thatthe input string can be compared with the next perfect square. Tape 3 ~ ‘Tape2—2° x [x [x De Tope 1 input [Peta [2 [a [a CUI Another iteration of step 2 Tape 3—3 Tope 23" Tape 1 — input ‘The machine outlined above is defined by the following transitions:96 Mutttepe Machines 277 5(qo, BB, B)=[qi; B,R; B,R; B,R] (initialize the tape) 8(q1.4,B, B= (42; a,S; X,S; X,S] 8(q2,a,X,X) =[gx; a, X.R; X,S] (compare strings on tapes 1 and 2 8(q2,B. BX) = [gsi B.S; B.S; XS] (accept) 8(q2.0,B,X) = [gsi a8; XR; X,S1 5(qs.a.B,X) = [gsi aS; XR; X,$]_ (rewrite tapes 2 and 3) 8(g6.a,B.B) = (go, aL; BLL; XL) 5(qs.a,B.X) = lqas a, S; XR: XR) 8(go.a,X,X) = las: a,Ls X.L; X,L1 (reposition tape heads) 8(qo.a,X,B) = (96: a,b; X,L; B,S] 5(qo,a, B, B) =| qo; a,L; B.S; B.S} 3(q6. BX, B) = (qs; B.S; X,L; B.S) 8(qo,B.B,B) (ass B,R; BR; B.RI- (repeat comparison cycle) The accepting states are gi and qs. The null string is accepted in gy. Strings at, where k is ‘perfect square greater than zero, are accepted in gs. Since the machine designed above halts for all input strings, we have shown that the language (a" | kis a perfect square) is not only recursively enumerable but also recursive a Example 9.6.2 ‘A multitape machine can also be represented by a state diagram. The ares in the diagram ‘contain the information for each tape. The two-tape Turing machine, whose state diagram follows, accepts the language (uu | u ¢ {a,b}*). The symbols x and y on the labels of the ares represent an arbitrary input symbol. The computation begins by making a copy of the input on tape 2. When this is com- plete, both tape heads are to the immediate right of the input. The tape heads now move back to the left with tape head 1 moving two squares for every one square that tape head 2 ‘moves. If the computation halts in gs the input string has odd length and is rejected. The loop in qs compares the first half of the input with the second; if they match, the string is accepted in state qs278 Chapter 9 Turing Machines [ote Bec] x —~ wien, oer) §~) (60, BBL Laie yh xe tab) xg wmansin So) wsis00 QO) Fateh LiL yy 5] [B18 R, iy R) t & Latex} bik, BIB R] @) 9.7. Nondeterministic Turing Machines A nondeterministic Turing machine may specify any finite number of transitions for a ‘given configuration. 
The components of a nondetetministic machine, with the exception of the transition function, are identical to those of the standard Turing machine. Transitions in a nondeterministic machine are defined by a partial function from Q x T' to subsets of QxP x {LR}. Whenever the transition function indicates that mote than one action is possible, a ‘computation arbitrarily chooses one of the transitions. An input string is accepted by a nondeterministic machine if there is a computation that terminates in an accepting stat. As usual, the language of a machine is the set of strings accepted by the machine, Example 9.7.1 ‘The nondeterministic Turing machine given below accepts strings containing a c preceded or followed by ab.9.7 Nondeterministic Turing Machines 279 Hoe @weL The machine processes the input in state q, until a is encountered. When this occurs, the computation may continue in state gy, enter state 42 to determine if the ¢ is followed by ‘ab, or enter gs to determine ifthe cis preceded by ab. In the language of nondeterminism, the computation chooses ac and then chooses one of the conditions to check, a ‘A nondeterministic Turing machine may produce several computations for a single input string. The computations can be systematically produced by ordering the alternative transitions fora state, symbol pair. Let m be the maximum number of transitions defined for any combination of state and tape symbol. The numbering assumes that 5(q,x) defines m, ‘not necessarily distinct, ransitions for every state qj and tape symbol x with 8(qi.2) #8. Ifthe wansition function defines fewer than n transitions, one transition is assigned several ‘numbers to complete the ordering. ‘A sequence (m),mz,...,ma) of values from | ton defines a computation in the nondeterministic machine. The computation associated with this sequence consists of k or fewer transitions. The jth transition is determined by the state, te symbol scanned and ‘my, the jth number in the sequence. Assume the j ~ Ist transition leaves the machine in sate q scanning x. If5(g,,.x) = @, the computation halts. Otherwise, the machine executes the transition in 8(g),x) numbered m). ‘The transitions of the nondeterministic machine in Example 9.7.1 can be ordered as shown in Table 9.7.1. The computations defined by the input sing acab and the sequences (11,1, Dy 1,2, 1 1), and @, 23, 3,1) are qvBacabB 1 — quBacabB 1 ——quBacabB 2 FBqacabB | BqyacabB 1 — + BqyacabB 2 BageabB 1 + Bagieabh 2+ BagicabB 3 F RacquabB 1 + BacqzabB 1+ BgsacabB F BacagibB 1 + BacagsbB 1 FBacabg:B + BacabggB ‘The number on the right designates the transition used to obtain the subsequent configura- tion, The third computation terminates prematurely since no transition is defined when the280 Chapter 9 Turing Machines TABLE 9.7.1 Ordering of Transitions State Symbol Transition State Symbol Transition a8 1q.B.R 9 a Lana R 2qB.R anak 34 BR 34s.a,k a8 Voie ae LassbR 2aqaR 2aebR Sana. k 3 qurb, R a8 TqnbR gs Vasbb 2qub.R Daub 3 qubR 3 donb be ee Age Rs en Vanat Zane 2 qnak Basel Banal ‘machine is in state gs scanning an a. The string acab is accepted since the computation defined by (1, 1,2, 1, 1) terminates in state q4- ‘The machine constructed in Example 9.7.1 accepts strings by final state. As with standard machines, acceptance in nondeterministic Turing machines can be defined by final state or by halting alone, A nondeterministic machine accepts a string u by halting if there is at least one computation that halts normally when run with u. 
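The sequence-numbering scheme can be turned into a deterministic search, which is the idea behind the three-tape simulation constructed below. The sketch that follows is ours: Table 9.7.1 is not legible in this scan, so the transition relation is a reconstruction of a machine for the language of Example 9.7.1 (a c preceded or followed by ab), and the wrap-around indexing stands in for the text's convention of assigning several numbers to one transition when a pair has fewer than n choices.

```python
import itertools

BLANK = "B"

# Our reconstruction of a nondeterministic machine for strings over {a,b,c}
# containing a c preceded or followed by ab.  Each (state, symbol) pair maps
# to a list of transitions; reading c in q1 is the nondeterministic choice.
DELTA = {
    ("q0", BLANK): [("q1", BLANK, "R")],
    ("q1", "a"): [("q1", "a", "R")],
    ("q1", "b"): [("q1", "b", "R")],
    ("q1", "c"): [("q1", "c", "R"),   # keep scanning
                  ("q2", "c", "R"),   # check that the c is followed by ab
                  ("q3", "c", "L")],  # check that the c is preceded by ab
    ("q2", "a"): [("q4", "a", "R")],
    ("q4", "b"): [("q5", "b", "R")],  # c followed by ab: accept in q5
    ("q3", "b"): [("q6", "b", "L")],
    ("q6", "a"): [("q5", "a", "L")],  # c preceded by ab: accept in q5
}
FINAL = {"q5"}

def run_sequence(delta, q0, w, seq):
    """Run the computation selected by seq = (m1, ..., mk), as in the
    numbering scheme above.  Returns (state, halted)."""
    tape = {i + 1: s for i, s in enumerate(w)}
    state, head = q0, 0
    for m in seq:
        options = delta.get((state, tape.get(head, BLANK)), [])
        if not options:
            break                                   # computation halts early
        new_state, written, move = options[(m - 1) % len(options)]
        tape[head] = written
        state = new_state
        head += 1 if move == "R" else -1
        if head < 0:
            return state, False                     # abnormal termination
    halted = not delta.get((state, tape.get(head, BLANK)), [])
    return state, halted

def accepts(delta, q0, finals, w, n, max_len=8):
    """Examine the computations defined by all sequences over 1..n in order
    of length.  A genuine simulation searches forever on rejected strings;
    the max_len cap is only so that this sketch terminates."""
    for k in range(1, max_len + 1):
        for seq in itertools.product(range(1, n + 1), repeat=k):
            state, halted = run_sequence(delta, q0, w, seq)
            if halted and state in finals:
                return True
    return False

print(accepts(DELTA, "q0", FINAL, "acab", n=3))   # True  (c followed by ab)
print(accepts(DELTA, "q0", FINAL, "abca", n=3))   # True  (c preceded by ab)
print(accepts(DELTA, "q0", FINAL, "acba", n=3))   # False
```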
Exercise 23 establishes that these alternative approaches accept the same languages. Nondeterminism does not increase the capabilities of Turing computation; the lan- ‘guages accepted by nondeterministic machines are precisely those accepted by determinis- tic machines, Let M = (Q, , TP’, 8, qo) be anondeterministic machine that accepts strings by halting. Assume thatthe transitions of M have been numbered according tothe previous scheme, with n the maximum number of transitions fora state, symbol pair. A determinis- tic three-tape machine M' is constructed to accept the language of M, Acceptance in M’ is also defined by halting. ‘The computations of M are simulated in M’. The correspondence between a sequence (7m)..-.smg) and a computation of M’ ensures that all possible computations of length & are examined. A computation in M’ consists of the actions: 1. A sequence of integers from | to n is written on tape 3. 2. The input string on tape | is copied to the standard position on tape 2.8.7 Nondeterministic Turing Machines 281 Rollover) WIR Q o~ BIL /(yerement sequence) W2L (ind end of sequence) FIGURE 9.1 Turing machine generating (1, 2, .-..n)” 3. The computation of M defined by the sequence on tape 3 s simulated on tape 2 4. Ifthe simulation halts, the computation of M’ halts and input is accepted 5. Anew sequence is generated on tape 3 and steps 2 through 5 are repeated. ‘The simulation is guided by the sequence of values on tape 3. The deterministic Turing machine in Figure 9.1 generates all sequences of integers from 1 10 n. Sequences of length one are generated in numeric order, followed by sequences of length two, and so on. A computation begins in state go at position zero. When the tape head returns to position zero the tape contains the next sequence of values. The notation i/abloreviates I, 222, --- n/n Using this exhaustive generation of numeric sequences, we now construct a determin- istic three-tape machine M’ that accepts L(M). The machine M’ is constructed by inter- ‘weaving the generation ofthe sequences on tape 3 withthe simulation on tape 2. M’ halts ‘when the sequence on tape 3 defines a computation that halts in M. Recall that both M and 1M accept by halting Let © and I” be the input and tape alphabets of M. The alphabets of M’ are Ew 7 z txt |e ET}U(L.. cam).282 Chapter® Turing Machines Symbols of the form #x represent tape symbol x and are used to mark the leftmost square ‘on tape 2 during the simulation of the computation of M. The transitions of M’ are natu rally grouped by their function. States labeled q,.j are used in the generation of a sequence ‘on tape 3. These transitions are obtained from the machine in Figure 9.1. The tape heads reading tapes | and 2 remain stationary during this operation, 5(q0, B.B, BY =[qu5 BS: B.S; BER) (Qs. B Bst) = [gus BS; BLS; REO Lean 5 (qs, BBB) = [qsa; B.S: B.S; BL] 5(q,2, BB.) = [gsi BySs B.S; AL) 8(q,.2.B, Bt ~ 1) BS; UL 0 5(qz2,B,B,B) B.S; BR) 5(q.3.8,B, 1) BS; BS; AR] 8(qu3, B,B, B) B,S; BS; \.L1 5 (qua, B, Bt) B.S; BS: tL) t ” 5(qca B, BB) B.S; B.S; BS) ‘The next step is to make a copy of the input on tape 2. The symbol #2 is written in position 0 to designate the left boundary of the tape. 5(qc0, B.B,B) = [yeni B,Ri #BLR: B.S) B(ge.in¥+B, BY = [gens X/Ri x.Rs B,S] for allx eT (Bh 8(qc.1,B, B,B) = [ei BLL; BL; B.S) 8lge2s%sx, BY = les XL5 815 BS] forall er 8(q.2, B.#B, B) = [go; B.S: #B, 8; BR) The transitions that simulate the computation of M on tape 2 of M’ are obtained directly from the transitions of M. 
(qi, B.x,1) = [ays BLS: yudit, RI where 8,2) =[4j, Yd is 5 (gi B,#x.1) = [ays B,Si¥y.d; t,R) transition of M numbered ¢ If the sequence on tape 3 consists of & numbers, the simulation processes at most k transitions. The computation of M’ halts if the computation of M specified by the sequence ‘on tape 3 halts. When a blank is read on tape 3, the simulation has processed all of the transitions designated by the current sequence. Before the next sequence is processed, the result of the simulated computation must be erased from tape 2. To accomplish this, the tape heads on tapes 2 and 3 are repositioned at the leftmost position in state ge and ge, respectively. The head on tape 2 then moves tothe right, erasing the tap.9.7 Nondeterministic Turing Machines 283 5 (qi, Bx, B) B.S; x,8;B,8] forallxeT (qi, B,#x, B) B,S:¥x,S; B.S] forall eT 5geo,B,x,B) = (eo: B,S:x,L; BS] forall x eT (geo, B, 4x, B) B,S;B,S; BL] forall xe B(qe.B. Bot) BS:BS: tL) ¢ (qe. B, By BY = [dea BS; B, Ry BR] SCdea, Bri) = [Ge2i BS: B,R: LR 8(q.2 BB, B) = deat BS; B,L; BL) 8(qe3,B, Bt) = [ess BS: B,L; tL) 8(qe3. BB, B) = [goo B.S:B,S; B,S] ‘When a blank is read on tape 3, the entire segment of the tape that may have been accessed during the simulated computation has been erased. M’ then returns the tape heads to their initial position and enters q,o to generate the next sequence and continue the simulation of computations. ‘The process of simulating computations of M, steps 2 through $ of the algorithm, continues until a sequence of numbers is generated on tape 3 that defines a halting com- putation. The simulation of this computation causes M’ to halt, accepting the input. Ifthe input string is not in L(M), the cycle of sequence generation and computation simulation in M’ will continue indefinitely, The actions the determinstic machine constructed following the preceding strategy are illustrated by examining the following sequence of machine configurations. Consider a nondeterministic machine M in which the computation defined by the sequence (1, 3, 2, 1,1) and input string abed is qoBabed BA F BgiabedB 3 b BdgnbedB 2 F BgxdacdB. The sequence (1, 3,2, 1,1) that defines the computation of M is writen on tape 3 of M' ‘The configuration of the three-type machine M’ prior tothe execution of the third transition of Mis284° Chapter9 Turing Machines Tape 3 — sequence Tape 2— simulation ‘Tape 1 — input Transition 2 fom state gz with M scanning b causes the machine to print a, enter state qs, and move to the left. This transition is simulated in MY by the transition 342, B.b,2) {qs; B.S; a,L; 2, R]. The wansition of M’ alters tape 2as prescribed by the transition of 'M and moves the head on tape 3 to indicate the subsequent transition. ee eos ‘Tape 2— simulation [a Nondeterministic Turing machines can be defined with a multitrack tape, two-way tape, or multiple tapes. Machines defined using these alternative configurations can also be shown to accept precisely the recursively enumerable languages. 9.8 Turing Machines as Language Enumerators In the preceding sections, Turing machines have been formulated as language acceptors: ‘A machine is provided with an input string, and the result of the computation indicates the acceptability of the input. Turing machines may also be designed to enumerate a language. The computation of such a machine sequentially proxluces an exhaustive listing of the elements of the language. 
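As an informal preview of the definition that follows, the sketch below models an enumerator in ordinary Python rather than as a k-tape machine: it takes no input, emits the strings of its language one after another, and separates them with the marker #, mirroring the output-tape convention. The language {a^i b^i c^i | i ≥ 0} is chosen because the machine of Example 9.8.1 below enumerates it; the function names are ours.

```python
def enumerate_aibici():
    """Yield the strings of {a^i b^i c^i | i >= 0} in order of increasing i."""
    i = 0
    while True:               # an enumerator of an infinite language never halts
        yield "a" * i + "b" * i + "c" * i
        i += 1

def output_tape(gen, count):
    """Return the prefix of the output tape after `count` strings are written,
    using # to separate the enumerated strings."""
    tape = "#"
    for _, u in zip(range(count), gen):
        tape += u + "#"
    return tape

print(output_tape(enumerate_aibici(), 4))
# ##abc#aabbcc#aaabbbccc#
```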
An enumerating machine has no input; its computation ‘continues until it has generated every string in the language.9.8 Turing Machines as Language Enumerators 285, Like Turing machines that accept languages, there are a number of equivalent ways to define an enumerating machine, We will use k-tape deterministic machine, k > 2, asthe underlying Turing machine mode! in the definition of enumerating machines. The first tape is the output tape and the remaining tapes are work tapes. A special tape symbol # is used fon the output tape fo separate the elements of the language that are generated during the computation. Since the machines in this section are designed for two distinet purposes, a machine that accepts a language will be denoted M while an enumerating machine will be de noted E. Definition 9.8.1 A ketape Turing machine B= (Q,,P, 5, qo) enumerates a language L if {) the computation begins with all apes blank i) with each transition, the tape head on tape 1 (the output tape) remains stationary or ‘moves to the right iit) at any point in the compuation, the nonblank portion of tape 1 has the form, Biuyturt..ttut or Bau uatt.. tus, where u; € Land v€ 5° iv) u will be written on tape I preceded and followed by # if, and only if, w€ L. ‘The last condition indicates that the computation of a machine F that enumerates L eventually writes every string in L on the outpot tape. Since all of the elements of a Tanguage must be produced, a computation enumerating an infinite language will never hhalt. The definition does not require a machine to halt even if itis enumerating a finite language. Such a machine may continue indefinitely after writing the last element on the output tape. Example 9.8.1 ‘The machine E enumerates the language L this language was given in Example 9.2.2 (a‘bic! | > 0}. A Turing machine accepting286 Chapter9 Turing Machines (Bia Raa] [Bit R, ala 8} WB R, BIB W# R Bla) GAMBR BRL GUO. BOS) (iB, BIBL) GS) (wR, ala t [iB , BiB R) \ @)) Weer, aia) (wins, Bly \é (18S, aia L} ‘The computation of E begins by writing ## on the output tape, indicating that 4 € L. Simultaneously, an a is written in position 1 of tape 2, with the head returning to tape position 0, At this point, E enters the nonterminating loop described below. 1. The tape heads move to the right, writing an a on the output tape for every a on the ‘work tape. 2. The head on the work tape then moves right to left through the a’s and a b is written fon the output tape for each a. 3. The tape heads move to the right, writing a.¢ on the output tape for every a on the work tape, 4, Ama is added to the end of the work tape and the head is moved to position 1 5. Adis written on the outpot tape, Afr a string is completed on the output tape, the work tape contains the information required to construct the next string in the enumeration. a The definition of enumeration requires that each string in the Language appear on the ‘output tape but permits a string to appear multiple times. Theorem 9.8.2 shows that any Janguage that is enumerated by a Turing machine can be enumerated by one in which each string is written only once on the output tape.98 Turing Machines as Language Enumerators 287 Theorem 9.8.2 Let L be a language enumerated by a Turing machine E. Then there is a Turing machine that enumerates L and each string in L appears only once on the output tape oF E’ Proof Assume E is a k-lape machine enumerating L. 
A k + I-tape machine E’ that satisfies the “single output” requirement can be built from the enumerating machine E. Intuitively, Eis a submachine of E’ that produces strings to be considered for output by E’. ‘The output tape of Bis the additional tape added to E, while the output tape of E becomes, a work tape for E’. For convenience, we call tape | the output tape of E’. Tapes 2,3, ..- K+ 1 are used to simulate E, with tape 2 being the output tape of the simulation. The actions of E’ are 1. The computation begins by simulating the actions E on tapes 2,3, ...,k-+ 1 2, When the simulation of E writes #u# on tape 2, B’ initiates a search procedure to see if w already occurs on tape 2. 3. If'wis not on tape 2, its added to the output tape of E’ 4, The simulation of E is restarted to produce the next string. Searching for another occurrence of uw requires the tape head to examine the entire non- blank portion of tape 2. Since tape 2 is not the output tape of E’, the restriction that the tape head on the output tape never move to the left is not violated. . ‘Theorem 9.8.2 justifies the selection of the term enumerate to describe this type of computation, The computation sequentially and exhaustively lists the strings in the lan- ‘guage. The order in which the strings are produced defines a mapping from an initial sequence of the natural numbers onto L. Thus we can talk about the zeroth string in L, the first string in L, etc. This ordering is machine specific; another enumerating machine ‘may produce a completely different ordering. Turing michine computations now have two distinct ways of defining a language: by acceptance and by enumeration, We show that these two approaches produce the same languages. Lemma 9.8.3, IFL is enumerated by a Turing machine, then L is recursively enumerable. Proof Assume that L is enumerated by a k-tape Turing machine E, A k-+ I-tape ma- chine M accepting L can be constructed from E. The additional tape of M is the input tape; the remaining & tapes allow M to simulate the computation of E. The computation of M begins with a string w on its input tape. Next M simulates the computation of E. When the simulation of E writes #, a string w € L has been generated. M then compares u with w and accepts u if u = w. Otherwise, the simulation of E is used to generate another string from L and the comparison cycle is repeated. If u € L, it will eventually be produced by E and consequently accepted by M. .288 Chapter 9 Turing Machines ‘The proof that any recursively enumerable language L can be enumerated is compli cated by the fact that a Turing machine M that accepts L need not halt for every input string. A straightforward approach to enumerating L. would be to build an enumerating, ‘machine that simulates the computations of M to determine whether a string should be ‘written on the output tape. The actions of such a machine would be 1. Generate a string u € * 2. Simulate the computation of M with input u 3. IM accepts, write w on the output tape. 4, Repeat steps 1 through 4, until all strings in 5* have been tested. ‘The generate-and-test approach requires the ability to generate the entire set of strings over for testing. This presents no difficulty, as we will sce late. However, step 2 of this naive approach causes it to fail. It is possible to produce a string w for which the computation ‘of M does not terminate. In this ease, no strings after u will be generated and tested for membership in L. 
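The construction in the proof of Lemma 9.8.3 is easy to express as code. In the sketch below (ours, not the text's), the enumerator is modeled as a Python generator and the acceptor simply compares its input with each enumerated string in turn; as in the lemma, the loop halts exactly when the input belongs to the language, so it yields acceptance rather than decision. The example language {a^(2i) | i ≥ 0} is our choice.

```python
from itertools import count

def enumerate_even_length_a():
    """Enumerate the language {a^(2i) | i >= 0}."""
    for i in count(0):
        yield "a" * (2 * i)

def acceptor_from_enumerator(make_enumerator):
    """Lemma 9.8.3: an acceptor built from an enumerator."""
    def accepts(w):
        # Compare w with each enumerated string.  If w is in L the loop finds
        # it and halts; if w is not in L (and L is infinite) the loop runs
        # forever -- the language is accepted, not decided.
        for u in make_enumerator():
            if u == w:
                return True
    return accepts

accepts = acceptor_from_enumerator(enumerate_even_length_a)
print(accepts("aaaa"))        # True
# accepts("aaa") would loop forever: "aaa" is never enumerated.
```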
To construct an enumerating machine, we first introduce the lexicographical ordering of the input strings and provide a strategy to ensure that the enumerating machine E will check every string in Σ*. The lexicographical ordering of the set of strings over a nonempty alphabet Σ defines a one-to-one correspondence between the natural numbers and the strings in Σ*.

Definition 9.8.4
Let Σ = {a1, ..., an} be an alphabet. The lexicographical ordering lo of Σ* is defined recursively as follows:
i) Basis: lo(λ) = 0, lo(ai) = i for i = 1, 2, ..., n.
ii) Recursive step: lo(ai u) = lo(u) + i · n^length(u).

The values assigned by the function lo define a total ordering on the set Σ*. Strings u and v are said to satisfy u < v, u ≤ v, and u > v if lo(u) < lo(v), lo(u) ≤ lo(v), and lo(u) > lo(v), respectively.
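Definition 9.8.4 is concrete enough to compute with. In the sketch below, lo follows the recursive definition as reconstructed above, lo(a_i u) = lo(u) + i · n^length(u); the inverse function, which produces the k-th string of the ordering, is our addition and is not taken from the text.

```python
def lo(u, alphabet):
    """Position of u in the lexicographical ordering of alphabet*."""
    n = len(alphabet)
    if u == "":                          # basis: lo(lambda) = 0
        return 0
    i = alphabet.index(u[0]) + 1         # u = a_i v with symbols numbered from 1
    return lo(u[1:], alphabet) + i * n ** (len(u) - 1)

def string_with_index(k, alphabet):
    """Inverse of lo: the k-th string in the ordering (k = 0 is the null string)."""
    n, u = len(alphabet), ""
    while k > 0:
        k -= 1
        u = alphabet[k % n] + u
        k //= n
    return u

alphabet = ["a", "b"]
print([string_with_index(k, alphabet) for k in range(8)])
# ['', 'a', 'b', 'aa', 'ab', 'ba', 'bb', 'aaa']
print([lo(w, alphabet) for w in ["", "a", "b", "aa", "ab", "ba", "bb", "aaa"]])
# [0, 1, 2, 3, 4, 5, 6, 7]
```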
¢ a BR 1 | gBL qua gueR eR n gees gnbL 4) Trace the computation for the input string aabea. ) Trace the computation for the input string bebe. ©) Give the state diagram of M. 4) Describe the result of a computation in M. 2, Let M be the Turing machine defined by a |B a > « go | aB.R 1 | BR guaR GbR a @ Geb gal a) Trace the computation for the input string abcab, 'b) Trace the first six transitions of the computation for the input string abab. ©) Give the state diagram of M. dd) Describe the result of a computation in M. 3. Construct a Turing machine with input alphabet (a, b} to perform each of the follow- ing operations. Note that the tape head is scanning position zero in state q whenever ‘a computation terminates,292 Chapter 9 Turing Machines a) Move the input one space to the right. Input configuration qoBuB, result qBBuB. ) Concatenate a copy of the reversed input string to the input, Input configuration qoBuB, result gy Buu* B. ©) Insert a blank between each of the input symbols. For example, input configuration qoBabaB, result q; BaBbBaB. «d) Erase the b’s from the input. For example, input configuration qoBbabaababB, result qpBaaaaB. 4, Construct a Turing machine with input alphabet {a,b,c} that accepts strings in which the first c is preceded by the substring aaa. A string must contain a c to be accepted by the machine, 5. Construct Turing machines with input alphabet (a,>) that accept the following lan- guages by final state a) {a'bijiz0, 72a) b) ta'bla'b! |i, j > 0) ©) Strings with the same number of a’s and b's 6. Modify your solution to Exercise S(a) to obtain a Turing machine that accepts the Tanguage (a'b' | 7 > 0, j > 1) by halting 7. Am altemative method of acceptance by final state can be defined as follows: A string. wis accepted by a Turing machine M if the computation of M with input u enters (but does not necessarily terminate in) a final state, With this definition, a string may bbe accepted even though the computation of the machine does not terminate. Prove thatthe languages accepted by this definition are precisely the recursively enumerable languages. 8 Construct a Turing machine with two-way tape and input alphabet (a} that halts if the tape contains a nonblank square, The symbol « may be anywhere on the tape, not necessarily to the immediate right of the tape head. 9. & wwo-dimensional Turing machine is one in which the tape consists of a two- dimensional array of tape squares.Exercises 293, A transition consists of rewriting a square and moving the head to any one ofthe four adjacent squares. A computation begins with the tape head reading the corner pos tion. The transitions of the two-dimensional machine are written 8(q, x) = [qj.).dh, where d is U (up), D (down), L (lef, or R (right). Design a two-dimensional Turing ‘machine with input alphabet (a) that halts if the tape contains a nonblank square LetL be the st of palindromes over (a,b) 4) Build a standard Turing machine that accepts L. ) Build a two-tape machine that accepts L in which the computation with input u should take no more than 3fengeh(u) + 4 transitions Construct a two-tape Turing machine that accepts strings in which each a is followed by an increasing number of 6's, that i, the strings are of the form abMab™...ab",k > 0, Where my <2 <--
3, ii) w contains the same number of a’s and b's. 17. Construct a two-tape nondeterministic Turing machine that accepts the strings of odd Tength over (a, b,c) with a c in the middle position. Every computation with input w should halt after at most length(w) +2 transitions. 18, Construct a two-tape nondeterministic Turing machine that accepts L {a,5}*). Every computation with input w should terminate after at most 2fength(w) + 2 transitions. Using the deterministic machine from Example 9.6.2 that accepts L, ‘what is the maximum number of transitions required for a computation with an input of length n’? 19, A machine that generates all sequences consisting of integers from I to n was given “gure 9.1, Trace the first seven cycles of the machine for n = 3. A cycle consists of the tape head returning to the initial position in state go 20. Construct a Turing machine that generates the set L = {a‘ |/ is divisible by 3). Lis ‘generated by a machine M under the following conditions: j) After the first transition, whenever the machine is in state go scanning the left- ‘most square, an element of L is on the tape. ii) All elements of L are eventually generated, 21. Construct a Turing machine that generates the set (a'b! | 7 = 0} 22. Let L be a language accepted by a nondeterministic Turing machine in which every computation terminates. Prove that Lis recursive, 23, Prove the equivalent of Theorem 9.3.2 for nondeterministic Turing machines. 24, Prove that every finite language is recursive. 25. Prove that a language L is recursive if, and only if, Land L. are recursively enumer- able. 26. Prove that the recursive languages are closed under union, intersection, and comple- ment 27, Build a Turing machine that enumerates the set of even length strings over {a}, 28. Build a Turing machine Ex that enumerates E* where ¥ = (0, /}. Nore: This ma- chine may be thought of as enumerating all finite length bit strings. 29, Build a machine that enumerates the ordered pairs N x N. Represent a number by a string of m +1 J's, The output for ordered pair [/,/] should consist of the representation of the number i followed by a blank followed by the representation of j. The markers # should surround the entire ordered paiBibliographic Notes 295 30. In Theorem 9.8.7, the proof that every recursive language can be enumerated in lexi- ccographical order considered the cases of finite and infinite languages separately. The argument for an infinite language may not be sufficient fora finite language. Why? 31. Prove that the two-stack automata introduced in Section 8.6 accept precisely the re- ccursively enumerable languages. 32, Define a nondeterministic two-track Turing machine. Prove that these machines ac- cept precisely the recursively enumerable languages. 33. Prove that every context-free language is recursive. Hint: Construct a nondeterminis- tic two-tape Turing machine that simulates the computation of a pushdown automaton, Bibliographic Notes The Turing machine was introduced by Turing [1936] as a model for algorithmic com- putation. Turing’s original machine was deterministic, consisting of a two-way tape and a single tape head. Independently, Post [1936] introduced a family of abstract machines with the same computational capabilities as Turing machines ‘The capabilities and limitations of Turing machines as language acceptors are ex- amined in Chapter 11. 
The use of Turing machines for the computation of functions is presented in Chapters 12 and 13. The books by Kleene [1952], Minsky [1967], Brainerd and Landweber [1974], and Hennie [1977] give an introduction to computability and Turing machines.

CHAPTER 10
The Chomsky Hierarchy

Phrase-structure grammars provide a formal system for generating strings over an alphabet. The productions, or rules, of a grammar specify permissible string transformations. Families of grammars are categorized by the form of the productions. The regular and context-free grammars introduced in Chapter 3 are two important families of phrase-structure grammars. In this chapter, two additional families of grammars, unrestricted grammars and context-sensitive grammars, are presented. These four families make up the Chomsky hierarchy, named after Noam Chomsky, who proposed them as syntactic models of natural language.

Automata were designed to mechanically recognize regular and context-free languages. The relationship between grammatical generation and mechanical acceptance is extended to the new families of grammars. Turing machines are shown to accept the languages generated by unrestricted grammars. A class of machines, the linear-bounded automata, obtained by limiting the memory available to a Turing machine, accepts the languages generated by context-sensitive grammars.

10.1 Unrestricted Grammars

The unrestricted grammars are the largest class of phrase-structure grammars. A production u → v indicates that an occurrence of the substring u in a string may be replaced with the string v. A derivation is a sequence of permissible replacements. The only constraint on a production of an unrestricted grammar is that the left-hand side not be null. These general string rewriting systems are also called type 0 grammars.

Definition 10.1.1
An unrestricted grammar is a quadruple (V, Σ, P, S) where V is a finite set of variables, Σ (the alphabet) is a finite set of terminal symbols, P is a set of productions, and S is a distinguished element of V. A production of an unrestricted grammar has the form u → v, where u ∈ (V ∪ Σ)+ and v ∈ (V ∪ Σ)*. The sets V and Σ are assumed to be disjoint.

The previously defined families of grammars are subsets of the more general class of unrestricted grammars. A context-free grammar is a phrase-structure grammar in which the left-hand side of every rule is a single variable. The productions of a regular grammar are required to have the form

i) A → aB
ii) A → a
iii) A → λ,

where A, B ∈ V and a ∈ Σ.

The notational conventions introduced in Chapter 3 for context-free grammars are used for arbitrary phrase-structure grammars. The application of the rule u → v to the string xuy, written xuy ⇒ xvy, produces the string xvy. A string q is derivable from p, p ⇒* q, if there is a sequence of rule applications that transforms p to q. The language of G, denoted L(G), is the set of terminal strings derivable from the start symbol S. Symbolically, L(G) = {w ∈ Σ* | S ⇒* w}.

Example 10.1.1
The unrestricted grammar

V = {S, A, C}
Σ = {a, b, c}

S → aAbc | λ
A → aAbC | λ
Cb → bC
Cc → cc

with start symbol S generates the language {a^i b^i c^i | i ≥ 0}. The string a^i b^i c^i, i > 0, is generated by a derivation that begins

S ⇒ aAbc ⇒* a^i A(bC)^(i-1) bc ⇒ a^i (bC)^(i-1) bc.

The rule Cb → bC allows the final C to pass through the b's that separate it from the c's at the end of the string. Upon reaching the leftmost c, the variable C is replaced with c.
This process is repeated until each occurrence of the variable C is moved to the right of all the sand transformed into a ¢ ° Example 10.1.2 ‘The unrestricted grammar with terminal alphabet (a,b, } defined by the productions Sr aT la} bTIN 0 T+ aT[A| BIBL Aa aA Ab DA Ba aB Bb > bB Ala) B)>d) ‘generates the language (ulu]| w € (4,6}") The addition of an a or b to the beginning of the string is accompanied by the gen- eration of the variable A or B. Using the rules that interchange the position of a variable and a terminal, the derivation progresses by passing the variable through the copy of the siring enclosed in the brackets. When the variable is adjacent to the symbol |, the appropri- ate terminal is added to the second string. The entire process may be repeated to generate additional terminal symbols or be terminated by the application of the rule T{- [. The derivation S=aTta) => aaT {Aa} > aaT [aA] > aaT{aa] > aabT {Baa} > aabTlaBal > aabT aaB) = aabT [aad] > aablaad] exhibits the roles of the variables in derivation, In the preceding grammars, the letchand side of each rule contained a variable. This, is not required by the definition of unrestricted grammar. However, the imposition of the300 Chapter 10 The Chomsky Hierarchy restriction that the left-hand side of a rule contain a variable does not reduce the set of languages that can be generated (Exercise 3). ‘Theorem 10.1.2 Let G = (V, , P, 5) be an unrestricted grammar. Then L(G) isa recursively enumerable language. Proof We will sketch the design of a three-tape nondeterministic Turing machine M that accepts L(G). Tape 1 holds an input string p from Z*. We will design M so that its ‘computations simulate derivations of the grammar G. A representation of the rules of G is, written on tape 2. A rule w —> v is represented by the string uv, where # isa tape symbol reserved for this purpose. Rules are separated by two consecutive #"s. The derivations of Gare simulated on tape 3, ‘A.computation of the machine M that accepts L(G) consists of the following actions: S is written on position one of tape 3 ‘The rules of G are written on tape 2. ‘Avrule wiv is chosen from tape 2 An instance of the string w is chosen on tape 3, if one exists. Otherwise, the computa tion halts in a rejecting state, The string w is replaced by v on tape 3 If the strings on tape 3 and tape 1 match, the computation halts in an accepting state. 7. To apply another rule, steps 3 through 7 are repeated. ‘Since the length of w and v may differ, the simulation of a rule application xuy =» xvy ‘may require shifting the position of the string y. For any string p € L(G), there is a sequence of rule applications that derives p. This derivation will be examined by one of the nondeterministic computations of the machine M. Conversely, the actions of M on tape 3 generate precisely the strings derivable from $ in G. The only strings accepted by M are terminal strings in L(G). Thus, L(M) = L(G). . Example 10.1.3 ‘The language L = (a‘bic! |i = 0) is generated by the rules S$ aAbe | A aAbC| 2 Ch bc Com ce. ‘Computations of the machine that accepts L simulate derivations of the grammar. The rules of the grammar are represented on tape 2 by101 Unrestricted Grammars 301 BSta Abcitt Sit Ata AbCHRAHHECDHDCHEC CHC. 
‘The rule $2 is represented by the string S###, The first # separates the left-hand side Of the rule from the right-hand side, The right-hand side of the rule, the null string in this cease, is followed by the string # 0 Theorem 10.1.3 Let Lbe a recursively enumerable language. Then there san unrestricted grammar G with LG Proof Since Lis recursively enumerable, itis accepted by a deterministic Turing ma- chine M = (Q, ©, P, 8, qo. F). An unrestricted grammar G = (V, ©, P.$) is designed ‘whose derivations simulate the computations of M. Using the representation of Turing ‘machine configuration a8 string that was introduced in Section 9.1, the effec of a Turing ‘machine transition (qx) = [gj.y. R) onthe configuration uq.xB can be represented by the string transformation ugixvB = uygvB. “The derivation of a terminal sting in G consists of three distinct subdervations: i) The generation ofa string ulgoBu] where u € E* ii) The simulation of a computation of M on the sting [goBu] i) IF M accepts u, the removal of the simulation substring ‘The grammar G contains a variable A; foreach terminal symbol a; € ©. These vari ables, along with S,7,{, and |. are used in the generation of the strings wlqpBul, The simulation of a computation uses variables corresponding tothe states of M. The variables Eg and Ex are used in the third phase ofa derivation. The set of terminals of G is the input alphabet of M Bm laraa, sda) {(S.T, Bk Ebsle Als Aan) UQ “The rules for each ofthe three parts ofa derivation are given separately, A derivation begins by generating ulqoBu), where w is an arbitrary sting in E*. The strategy used for generating strings ofthis form was presented in Example 10.1.2. 1, $+ a Tla|[qoB] fort sisn 2. Tl aTIAi|goB fori sicn 3. Aiaj > ajAi fort
ai] forlsisn “The computation ofthe Turing machine with input wis simulated on the string go8 The rules are obtained by rewriting ofthe transitions of M as string transformations 5. gury > cay whenever 8(qi.2) = [aj.e, Rland y eT 6. gaxl—> £4)B)_ whenever 5(g,.x) = [gj,2- RI 7. yqpx > qjye whenever (g),x) =[qj.2, EL] and y eT802 Chapter 10 The Chomsky Hierarchy If the computation of M halts in an accepting state, the derivation erases the string Within the brackets. The variable Ep erases the string to the right of the halting position of the tape head. Upon reaching the endmarker J, the variable E:, (erase left) is generated, 8. qix > Eg whenever 5(gj,x) is undefined and q; € F 9. Eee > Eq forxer 10. Ex) Ex M sEp—> Ey forse 12, (Epa ‘The derivation that begins by generating ulqoBu] terminates with u whenever u € L(M). Ifu ¢ L(M), the brackets enclosing the simulation of the computation are never erased and, the derivation does not produce a terminal string, . Example 10.1.4 ‘The consinuetion of a grammar that generates the language accepted by a Turing machine is demonstrated using the machine ala BBR Gp) —HBR_ GE that accepts a*b(a U b)*. When the first b is encountered, M halts and accepts in state gy ‘The variables and terminals of G are B= (a,b) T, Ep, Este AX) U {90.41 ‘The rules are given in three sets Input generating rules: S$ aT {a} | bT(6] | (go T+ aTIALOTIX | laoB Aa aA Ab bA Al al Xa aX Xb bx x1) ‘Simulation rules:101 Unrestricted Grammars 303 Rules 5(go.B) = [41.8 R] qua > Bua Bb > Byyd BB > BB 4B) ~+ Ba] Sig.a)=l4naR) quae > aga tab + agb nak > anB anal > aqiB] 5g1.8)= [4.8K] gia > Bava q1Bb > Bab BB > BB 8) > BaiB) Erasure rules: gb En Ena En aE, > Ex Exb>Eg bEL > Ex EgB—> En BEL Ey Ea Ec (Ev ‘The computation that accepts the string a in M and the corresponding derivation in the grammar G that accepts ab are qoBabB S=>aTta) bBqabB = abTIXa) + BagbB —— =>ablqyBXal = ablqoBaX) = ablqnBabl = ablBquab] = ablBaq,b] = ab|BaEg] > ablBak, > ablBEL => ablEy sab o304 Chapter 10 The Chomsky Hierarchy Properties of unrestricted grammars can be used to establish closure results For recur- sively enumerable languages. The proofs, similar to those presented in Theorem 8.5.1 for context-free languages, are left as exercises. Theorem 10.1.4 ‘The set of recursively enumerable languages is closed under union, concatenation, and Kleene star 10.2 Context-Sensitive Grammars ‘The context-sensitive grammars represent an intermediate step between the context-free and the unrestricted grammars. No restrictions are placed on the left-hand side of a pro-
v, where u ∈ (V ∪ Σ)+, v ∈ (V ∪ Σ)+, and length(u) ≤ length(v).

A rule that satisfies the conditions of Definition 10.2.1 is called monotonic. Context-sensitive grammars are also called monotonic or noncontracting, since the length of the derived string remains the same or increases with each rule application. The language generated by a context-sensitive grammar is called, not surprisingly, a context-sensitive language.

Context-sensitive grammars were originally defined as phrase-structure grammars in which each rule has the form uAv → uwv, where A ∈ V, w ∈ (V ∪ Σ)+, and u, v ∈ (V ∪ Σ)*. The preceding rule indicates that the variable A can be replaced by w only when it appears in the context of being preceded by u and followed by v. Clearly, every grammar defined in this manner is monotonic. On the other hand, a transformation defined by a monotonic rule can be generated by a set of rules of the form uAv → uwv (Exercises 10 and 11).

The noncontracting property of the rules guarantees that the null string is not an element of a context-sensitive language. Removing the rule S → λ from the grammar in Example 10.1.1 produces the unrestricted grammar

S → aAbc
A → aAbC | λ
Cb → bC
Cc → cc

that generates the language {a^i b^i c^i | i > 0}. The lambda rule violates the monotonicity property of context-sensitive rules. Replacing the S and A rules with

S → aAbc | abc
A → aAbC | abC

produces an equivalent context-sensitive grammar.

A nondeterministic Turing machine, similar to the machine in Theorem 10.1.2, is designed to accept a context-sensitive language. The noncontracting nature of the rules permits the length of the input string to be used to terminate the simulation of an unsuccessful derivation. When the length of the derived string surpasses that of the input, the computation halts and rejects the string.

Theorem 10.2.2
Every context-sensitive language is recursive.

Proof Following the approach developed in Theorem 10.1.2, derivations of the context-sensitive grammar are simulated on a three-tape nondeterministic Turing machine M. The entire derivation, rather than just the result, is recorded on tape 3. When a rule u → v is applied to the string xuy on tape 3, the string xvy is written on the tape following xuy#. The symbol # is used to separate the derived strings.

A computation of M with input string p performs the following sequence of actions:

1. S# is written beginning at position one of tape 3.
2. The rules of G are written on tape 2.
3. A rule u#v is chosen from tape 2.
4. Let q# be the most recent string written on tape 3:
   a) An instance of the string u in q is chosen, if one exists. In this case, q can be written xuy.
   b) Otherwise, the computation halts in a rejecting state.
5. xvy# is written on tape 3 immediately following q#.
6. a) If xvy = p, the computation halts in an accepting state.
   b) If xvy occurs at another position on tape 3, the computation halts in a rejecting state.
   c) If length(xvy) > length(p), the computation halts in a rejecting state.
7. Steps 3 through 7 are repeated.

There are only a finite number of strings in (V ∪ Σ)* with length less than or equal to length(p). This implies that every derivation eventually halts, enters a cycle, or derives a string of length greater than length(p). A computation halts at step 4 when the rule that has been selected cannot be applied to the current string. Cyclic derivations, S ⇒* w ⇒+ w, are terminated in step 6(b). The length bound is used in step 6(c) to terminate all other unsuccessful derivations.

Every string in L(G) is generated by a noncyclic derivation.
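In programming terms, the procedure underlying this proof is an exhaustive search of derivations bounded by the length of the input; the nondeterministic choice of rule and position is replaced by trying every possibility. The following Python sketch (an added illustration, not part of the text, assuming each variable and terminal is written as a single character, with the function name chosen here) decides derivability for a set of monotonic rules:

    def derivable(rules, start, target):
        # rules: list of pairs (u, v) with 1 <= len(u) <= len(v)  (monotonic rules)
        # Returns True if target can be derived from start, by generating every
        # derivable string whose length does not exceed len(target).
        bound = len(target)
        seen = {start}
        to_process = [start]
        while to_process:
            current = to_process.pop()
            if current == target:
                return True
            for u, v in rules:
                i = current.find(u)
                while i != -1:
                    derived = current[:i] + v + current[i + len(u):]
                    if len(derived) <= bound and derived not in seen:
                        seen.add(derived)
                        to_process.append(derived)
                    i = current.find(u, i + 1)
        return False

For example, derivable([("S", "aSb"), ("S", "ab")], "S", "aaabbb") returns True. Termination is guaranteed because only finitely many strings of length at most len(target) exist and each is examined at most once, which is the same length bound used by the machine M above.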
The simulation of such a derivation causes M to accept the string, Since every computation of M halts, L(G) is recursive (Exercise 9.22) . 10.3 Linear-Bounded Automata We have examined several alterations to the standard Turing machine that do not alter the set of languages accepted by the machines. Restricting the amount of the tape avail- able for a computation decreases the capabilities of a Turing machine computation. A. Tinear-bounded automaton is a Turing machine in which the amount of available tape is determined by the length of the input string, The input alphabet contains two symbols, { and ), that designate the left and right boundaries of the tape Definition 10.3.1 A linear-bounded automaton (LBA) is a structure M = (Q, &, T, 5, qo. (.). F), where Q.E, I, 5, qo, and Fare the same as for a nondeterministie Turing machine. The symbols (and ) are distinguished elements of ©. ‘The initial configuration of a computation is qo(w), requiring lengrh(w) +2 tape positions. The endmarkers ( and } are written on the tape but not considered part of the input, A computation remains within the boundaries specified by ( and ). The endmarkers may be read by the machine but cannot be erased. Transitions scanning { must designate a move to the right and those reading ) to the lef. A string w € ( — {(,)})* is accepted by ‘an LBA if a computation with input (w) halts in an accepting state. We will show that every context-sensitive language is accepted by a linear-bounded automaton. An LBA is constructed to simulate the derivations of the context-sensitive grammar, The Turing machine constructed to simulate the derivations of an unrestricted ‘grammar begins by writing the rules of the grammar on one of the tapes. The restriction fon the amount of tape available to an LBA prohibits this approach. Instead, states and transitions of the LBA are used to encode the rules. Consider a context-senstive grammar with variables (S, A) and terminal alphabet (a) ‘The application of a rule Sa — @AS can be simulated by a sequence of transitions in an LBA (Figure 10.1), The first two transitions verify thatthe string on the tape beginning at the position ofthe tape head matches the left-hand side of the rule ‘The application ofthe rule generates a string transformation Sav => ua ASv. Before Sais replaced with aS, the sting » is traversed to determine whether the derived string fits on the segment of the tape available to the computation If the is read, the computation terminates. Otherwise, the string v i shifted one position to the right and Sa is replaced by aAS.103 Linear-Bounded Automata 307 ala R ey AAR SSR 58 tg ako) a XIKR ‘is L BIS NAL XlaL FIGURE 10.1 LBA simulation of application of Sa + aAs Theorem 10.3.2 Let L be a context-sensitive language. Then there is a linear-bounded automaton M with, Le) = Proof Since L is a context-sensitive language, L = L(G) for some context-sensitive ‘grammar G = (V, ©, P, S). An LBA M with a two-track tape is constructed to simulate the derivations of G. The first track contains the input, including the endmarkers. The second track holds the string generated by the simulated derivation. Each rule of G is encoded in a submachine of M. A computation of M with input (p} consists ofthe following sequence of actions 1. Sis writen on track 2 in position one. 2. The tape head is moved into a position in which it scans a symbol of the string on wack 2, 3. Arle u — v is nondeterministically selected, and the computation attempts to apply the rue. 
4, a) Ifa substring on track 2 beginning at the position of the tape head does not match 1, the computation has in a rejecting state,308 Chapter 10 The Chomsky Hierarchy ) Irthe tape head is scanning u but the string obtained by replacing u by w is greater than Jengrh(p), then the computation halts in a rejecting state. ©) Otherwise, w is replaced by v on track 2 5, Itrack 2 contains the string p, the computation halts in an accepting state 6, Steps 2 through 6 are repeated Every string in L is generated by a derivation of G. The simulation of the derivation causes M to accept the string. Conversely, a computation of M with input (p) that halts in an accepting state consists of a sequence of string transformations generated by steps 2 ‘and 3. These transformations define a derivation of p in G. . ‘Theorem 10.3.3 Let L be a language accepted by a linear-bounded automaton. Then L ~ (2) is a context- sensitive language. Proof LetM=(Q. Ea. P, 6. go, (. js F) bean LBA that accepts L. A context-sensitive ‘grammar G is designed to generate L(M). Employing the approach presented in Theo- rem 10.1.3, a computation of M that accepts the input string p is simulated by a derivation of p inG. The techniques used to construct an unrestricted grammar that simulates a Tut- ing machine computation cannot be employed since the rules that erase the simulation do not satisfy the monotonicity restrictions of a context-sensitive grammar. The inability to erase symbols in a derivation of context-senstive grammar restricts the length ofa derived. string to that of the input. The simulation is accomplished by using composite objects as variables “The terminal alphabet of G is obtained from the input alphabet of M by deleting the endmarkers. Ordered pairs are used as variables. The first component of an ordered pair is terminal symbol. The second isa string consisting of a combination ofa tape symbol and possibly a state and endmarker(s) Eo=Eu— {6} Va (S.Acle rh fay (eh lair} [ae (YD laser, 3,14) Lai. de(h Laie (gurl (aisaex)h Lai.xgelh (@iege 2) leis (gax)} Ua (xa) Th. where a; € Zg.x €P,and gx €Q. ‘The S and A rules generate ordered pairs whose components represent the input string and the initial configuration of a computation of M. 1. S$ fa,.go(alAa = lai.gota;)] for every a; € Ee 2. As lanai = [aia for every a) € EG410.4 The Chomsky Hierarchy 309 Derivations using the S and A rules generate sequences of ordered pairs of the form Lai.gotai| Ley go Main igh [igs ig) “The string obtained by coneatenating the elements inthe first components of the ordered pats, aici... dys Fepresents the input string to a computation of M. The second compo- nents produce qo(ai ai...) the initial configuration of the LBA. The rules that simulate & computation are obtained by rewriting the transitions of M as transformations that alter the second components of the ordered pairs. Note that the second components do not produce the string qo); the computation with the null string as {npatis not simulated by the grammar. The techniques presented in Theorem 10.1.3 can be modified to produce the rules needed to simulate the computations of M. The details are left as an exereise Upon the completion of a successful computation, the derivation must generate the oviginl input sting. When an accepting configuration is generated, the variable with the accepting state in the second component of the ordeted pair is transformed into the terminal symbol contained inthe fist component. 3. faiegete] > a Ta. 
gee] + a whenever 5(qu, () = and ge € F laj.xg4)] > a; la, (xge)] > a whenever 8(g4, )) Band qy €F lai, gex) > as Lai. gux) > ay as. (gu) > a; lav. (gus) > a whenever 3(q...) = and gx € F The derivation is completed by transforming the remai contained in the first component. ing variables to the terminal 4. (ai.ulaj > aay ala.) + aya; for every aj € Yc and [a;,u] eV . 10.4 The Chomsky Hierarchy Chomsky numbered the four families of grammars (and languages) that make up the hi crarchy. Unrestricted, context-sensitive, context-free, and regular grammars are referred to310 Chapter 10 The Chomsky Hierarchy as type 0, type 1, type 2, and type 3 grammars, respectively. The restrictions placed on the rules increase with the number ofthe grammar. The nesting of the families of grammars of the Chomsky hierarchy induces a nesting of the corresponding languages. Every context- free language containing the null string is generated by a context-free grammar in which Sis the only lambda rule (Theorem 5.1.5). Removing this single lambda rule pro- duces a context-sensitive grammar that generates L ~ (2). Thus, the language L — (2) is ccontext-sensitive whenever L is contextiree. Ignoring the complications presented by the null string in context sensitive languages, every type i language is also type (# ~ 1) The preceding inclusions are proper. The set [a'b’ |i > 0) is context-free but not regular (Theorem 7.5.1). Similarly, {a'b'c! | 7 > 0} iscontext-sensitive but not context-free (Example 8.4.1). In Chapter 11, the language Ly is shown to be recursively enumerable but not recursive. Combining this result with Theorem 10.2.2 establishes the proper inclusion of context sensitive languages in the set of recursively enumerable languages. Each class of languages in the Chomsky hierarchy has been characterized as the lan- guages generated by a family of grammars and accepted by a type of machine. The rela- tionships developed between generation and recognition are summarized in the following, table, Grammars Languages Accepting Machines Type 0 grammars, Recursively ‘Turing machine, phrasestructre grammars, enumerable snondeterministic Unrestricted grammars Fanguages Turing machine Type | grammars, Context-semitive Linear bounded cntext-semsitive grammars, languages automata ‘monotonic grammars ‘Type 2 grammars, Contexc-free Pushdown automata languages ‘Type 3 grammars, Regular Deterministic finite regular grammars, languages automat, Tett-linear grammars, ondeterministe right-linear grammars finite automata Exercises, |. Design unrestricted grammars to generate the following languages: a) [a'bla‘b! |i,j > 0) by (a'bie'd' |i > 0) ©) {www | w € (a,6}*) 2. Prove that every terminal string generated by the grammarExercises 311 SaAbe| i A> aAbC |i Cb 0c Comce has the form a’B'c! for some i > 0. Prove that every recursively enumerable language is generated by a grammar in which each rule has the form u — v where u € V* and v € (VU 5)* Prove that the recursively enumerable languages are closed under the following oper- ations: a) union ») intersection ©) concatenation 4) Kleene star ©) homomorphic images Let M be the Turing machine “ak ) Give a regular expression for L(M), 'b) Using the techniques from Theorem 10.1.3, give the rules of an unrestricted gram- ‘mar G that accepts LM), «) Trace the computation of M when run with input bab and give the corresponding derivation in G. LetG be the monotonic grammar G S+SBAla BA> AB aA > aaB Bob. a) Give a derivation of aaabb. b) Whatis L(G)? 
©) Construct a context-free grammar that generates L(G), Let L be the language {a'b%a! | > 0)312 Chapter 10 The Chomsky Hierarchy £8) Use the pumping lemma for context-free languages to show that L is not context- free. b) Construct a contextsensiive grammar G that generates L. ©) Give the derivation of aabbbbaa in G. 4) Construct an LBA M that accepts L. ©) Trace the computation of M with input aabbbbaa. 8, LetL = [atb/eh|0
uwv that define the same transformation as the monotonic rule AB > CD. Hint: A sequence of thee rules, each of whose left-hand side and right hand side is of length wo, suffices 12, Use the results from Exercises 10 and 11 to prove that every context-sensitive lan guage is generated by a grammar in which each rule has the form wAv—> wu, where we (VUE)* and uve (VUE). 13, Let T be a full binary tree. A path through Tis a sequence of left-down (L), right. down (R), oF up (U) moves. Thus paths may be identified with stings over E = {L,R.U}. Consider the language L = (w € E°* | w describes a path from the root back to the root} For example, 3, LU. LRUULU € Land U, LRU ¢ L. Establish L's place in the Chomsky hierarchy. 14, Prove that the context-sensitive languages are not closed under arbitrary homomor phisms. A homomorphism is i-fre if hu) = A implies u = 2, Prove thatthe context. sensitive grammars are closed under J-free homomorphisms. 15. LetL be a recursively enumerable language over © and ¢ a terminal symbol notin , Show that there is acontext-sensitive language L’ over © U (c} such that, for every we E+, we Lif, and only if, we! eL’ for some i = 0Bibliographic Notes 313 16, Prove that every recursively enumerable language is the homomorphic image of a context sensitive language. Hint: Use Exercise 15, 17. A grammar is said to be context-sensitive with erasing if every rule has the form uAv—> wow, where Ae V and u,v,w € (VU B)*, Prove that this family of grat mars generates the recursively enumerable languages. 18. A linear-bounded automaton is deterministic if at most one transition is specified for each state and tape symbol. Prove that every context-free language is accepted by a deterministic LBA, 19. Let be a context sensitive language that is accepted by a deterministic LBA, Prove that Eis context-senstive. Note that a computation in an arbitrary deterministic LBA need not halt Bibliographic Notes The Chomsky hierarchy was introduced by Chomsky [1959]. This paper includes the proof thatthe unrestricted grammars generate precisely recursively enumerable languages. Linear-bounded automata were presented in Myhill [1960]. The relationship between linear-bounded automata and context-sensitive languages was developed by Landweber [1963] and Kuroda [1964]. Solutions to Exercises 10, 11, and 12, which exhibit the re- lationship between monotonic and contextsensitive grammars, can be found in Kuroda 11964)PART IV Decidability and Computability‘The Turing machine represents the culmination of a series of increasingly powerful ab- stract computing devices, The Church-Turing thesis, proposed by logician Alonzo Church in 1936, asserts that any effective computation in any algorithmic system can be accom- plished using a Turing machine. The formulation of this thesis for decision problems and its implications for computability are discussed in Chapter | With the acceptance of the Church-Turing thesis, it becomes obvious that the extent of algorithmic problem solving can be identified with the capabilities of Turing machines. Consequently, to prove a problem to be unsolvable it sutfices to show that there is no Turing machine that solves the problem. The first several problems that we show to have no algorithmic solution are concerned with properties of computations, but later problems exhibit the undecidability of questions on desivability using the rules of a grammar, search, procedures, and properties of context-free languages. 
Arguments establishing both the decidability and undecidability of problems utilize the ability of one Turing machine to simulate the computations of another. To accomplish this, an encoding of the transitions of the machine to be simulated is provided as input. This is the same philosophy as that employed by a stored-program computer, where the instructions of the program to be executed are loaded into the memory of the computer for execution.

The capabilities of Turing machines are extended by defining the result of a computation by the configuration of the tape when the computation terminates. This generalization permits Turing machines to compute functions and perform other standard computational tasks. The extended Church-Turing thesis asserts that any algorithmic process can be performed by an appropriately designed Turing machine. As supporting evidence, we show that a computation defined using a typical programming language can be simulated by a Turing machine.

The examination of the properties of computability is completed with a characterization of the set of functions that can be algorithmically computed. This provides an answer to the question concerning the capabilities of algorithmic computation that motivated our excursion into computability theory.

CHAPTER 11
Decidability

A decision problem consists of a set of questions whose answers are either yes or no. A solution to a decision problem is an effective procedure that determines the answer for each question in the set. A decision problem is undecidable if there is no algorithm that solves the problem. The ability of Turing machines to return affirmative and negative responses makes them an appropriate mathematical system for constructing solutions to decision problems. The Church-Turing thesis asserts that a Turing machine can be designed to solve any decision problem that is solvable by any effective procedure. Consequently, to establish that a problem is unsolvable it suffices to show that there is no Turing machine solution. Techniques are developed to establish the undecidability of several important questions concerning the capabilities of algorithmic computation.

Throughout the first four sections of this chapter we will consider Turing machines with input alphabet {0, 1} and tape alphabet {0, 1, B}. The restriction on the alphabets imposes no limitation on the computational capabilities of Turing machines, since the computation of an arbitrary Turing machine M can be simulated on a machine with these restricted alphabets. The simulation requires encoding the symbols of M as strings over {0, 1}. This is precisely the approach employed by digital computers, which use the ASCII (American Standard Code for Information Interchange) or EBCDIC (Extended Binary Coded Decimal Interchange Code) encodings to represent characters as binary strings.

11.1 Decision Problems

A decision problem P is a set of questions, each of which has a yes or no answer. The single question "Is 8 a perfect square?" is an example of the type of question under consideration in a decision problem. A decision problem usually consists of an infinite number of related questions. For example, the problem PSQ of determining whether an arbitrary natural number is a perfect square consists of the following questions:

p0: Is 0 a perfect square?
p1: Is 1 a perfect square?
p2: Is 2 a perfect square?
...

A solution to a decision problem P is an algorithm that determines the appropriate answer to every question p ∈ P.
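The problem PSQ admits a simple decision procedure. The following Python sketch (an illustration added here, not part of the text, with the function name chosen for this example) answers each question pn and halts for every natural number n:

    def is_perfect_square(n: int) -> bool:
        # Answers question p_n of the problem PSQ: is n a perfect square?
        k = 0
        while k * k < n:
            k += 1
        return k * k == n

Because the loop terminates for every n, the procedure produces an answer, yes or no, for each question in PSQ.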
Since a solution to a decision problem is an algorithm, a review of our intuitive notion of algorithmic computation may be beneficial. We have not defined—and probably cannot precisely define—algorithm. This notion falls into the category of “Tcan't describe it but 1 know one when I see one.” We can, however, list several properties that seem fundamental ‘o the concept of algorithm. An algorithm that solves a decision problem should be © Complete: It produces an answer, either positive or negative, to each question in the problem domain, ‘© Mechanistic: It consists of a finite sequence of instructions, each of which can be carried out without requiring insight, ingenuity, or guesswork ‘© Deterministic: When presented with identical input, it always produces the same re- sult ‘A procedure that satisfies the preceding properties is often called effective ‘The computations of a standard Turing machine are clearly mechanistic and deter- ministic. A Turing machine that halts for every input string is also complete, Because of the intuitive effectiveness of their computations, Turing machines provide a formal frame- work that can be used to construct solutions to decision problems. A problem is answered affirmatively if the input is accepted by a Turing machine and negatively if itis rejected, Recall the newspaper vending machine described at the beginning of Chapter 6. Thirty cents in nickels, dimes, and quarters is required to open the latch. If more than 30 cents is inserted, the machine keeps the entire amount. Consider the problem of a miser who wants to buy a newspaper but refuses to pay more than the minimum. A solution to this problem 1s a procedure that determines whether a set of coins contains a combination that totals exactly 30 cents11.41 Decision Problems 319 Te transformation of & decison problem from its natural domain to an equivalent problem that can be answered by a Turing machine is known as constructing a represen- tation of the problem, To solve the corret-change problem with a Turing machine, the problem must be formulated as a question of accepting strings. The miser's change can be represented as an element of (dig) where md, and q designate a nickel, a dime, and a quarter, respectively. Note that a representation is not unigue; there are six strings that represent the se of coins consisting ofa nickel, a dime, and a quarter ‘The Turing machine in igure 11.1 solves the correct-change problem. The sequences in the star state represent the five distinct combinations of cons that provide an afirmative answer tothe question. For example, the solution consisting of one dime and four nickels is represented by (d, 4m). The sequences in the state entered as the result of a transition specify the coins that, when combined with the previously processed coins, produce & combination tha totals exactly 30 cents. The input qqdnd is acceped since these of coins represented by this string contains a quarter and a nickel, which total precisely 30 cents We have chosen the standard model of the Turing machine as a formal system in which solutions to decision problems are formulated. The completeness property of ef- fective computation requires the computation of the machine to terminate for every input string, Thus the language accepted by a Turing machine that solves a decision problem is recursive. Conversely, every deterministic Turing machine M that accepts a recursive language can be considered a solution to a decision problem. 
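As an aside, the correct-change problem described above can be sketched the same way; the following Python fragment (added for illustration, not part of the text, with hypothetical names VALUE and exact_change) decides whether a string over {n, d, q}, representing nickels, dimes, and quarters, contains a combination of coins totaling exactly 30 cents, which is the property checked by the machine of Figure 11.1:

    from itertools import combinations

    VALUE = {"n": 5, "d": 10, "q": 25}

    def exact_change(coins: str) -> bool:
        # Try every subset of the coins and accept if one totals 30 cents.
        for size in range(len(coins) + 1):
            for subset in combinations(coins, size):
                if sum(VALUE[c] for c in subset) == 30:
                    return True
        return False

For the input qqdnd, the subset consisting of a quarter and a nickel totals 30 cents, so the string is accepted; since the search always terminates, the set of accepted strings is recursive.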
The machine M solves the problem consisting of questions of the form “ls the string w in L(M)?” for every string wert The duality between solvable decision problems and recursive languages can be ex: ploited to broaden the technigucs available for establishing the decdabiity ofa decision problem. A problem is decidable if thas a representation in which the set of accepted in- Put strings form a recursive language, Since computations of deterministic multitrack and ‘mullitape machines can be simulated on a standard Turing machine, solutions using these ‘machines aso establish the decidabilty ofa problem. Example 11.1.1 The decision problem Psq is decidable. The three-tape Turing machine from Exam- ple 9.6.1 solves Psq. a Determinism is one of the fundamental properties of algorithms. However, it is often ‘much easier to design a nondeterministic Turing machine to solve a decision problem. In Section 9.7 it was shown that every language accepted by a nondeterministic Turing ma- chine is also accepted by a deterministic one. A solution to a decision problem requires. more than a machine that accepts the appropriate strings; it also demands that all compu- tations terminate. A nondeterministic machine in which every computation terminates can be used to establish the existence ofa decision procedure. The languages of such machines are recursive (Exercise 9.22), ensuring the existence of a deterministic solution,320 Chapter 11. Decidabumty Wak aig alge Ofer Ofer OO aoe ae ee Oh ce 2d jd aiqR a _ an - or | wae lan |_aae {4" 3d My a 24,20 fe 2n aan h2n nin ig in R 2a lan sn dar q bn 4 win R_,| 4n Bion 2 ‘ 420 ain ‘ aia aia aia FIGURE 11.1. Solution to the correct-change problem. Example 11.1.2 ‘The problem of determining whether there is a path from a node v; to a node vj in a directed graph is decidable. A directed graph consists of a set of nodes N and ares A C N x N. To represent the graph as strings over (0, 1}, node vs is encoded by11.2 The Church-Turing Thesis 321 14, Am are [vs, is represented by the string en(v,)Oen(v,), where en(u,) and en(v,) are the encodings of nodes v, and v,. The string 00 is used to separate arcs. The input to the machine consists of a representation of the graph followed by the encoding of nodes v; and v;. Three 0's separate en(v) and en(v;) from the representation of the graph. The directed graph N= (v1.v2,03} A= (let. vah Lv. vih, [v2 3h [v3.02 is represented by the string 101100101001 101110011101. A computation to determine ‘Whether there isa path from vs {0 vs in this graph begins with the input 10/700701001 10 1110011101 100011101. ‘A nondeterministic two-tape Turing machine M is designed to solve the path problem. ‘The actions of M are summarized below. 1, The input is checked to determine if its format is that of a representation of a directed graph followed by the encoding of two nodes. If not, M halts and rejects the string. The input is now assumed to have form ROO0en(v;)0en(u;), where R is the represen. tation of a directed graph. If/ = j, M halts in an accepting state 3. The encoding of node followed by 0 is written on tape 2 in the leftmost position, 4. Let v, be the rightmost node encoded on tape 2. An arc from v, t0 uj, where vis not already on tape 2, is nondeterministically chosen from R. If no such arc exists, M halts in a rejecting state 5. If = j, Mhalts in an accepting state. 
Otherwise, en(v;)0 is written atthe end of the string on tape 2 and steps 4 and 5 are repeated Steps 4 and 5 generate paths beginning with node v; on tape 2, Every computation of M {erminates after at most iterations, where 1 is the number of nodes in the graph, since step 4 guarantees that only noncyclic paths are examined. It follows that L(M) is recursive and the problem is decidable. a 11.2. The Church-Turing Thesis ‘The concept of an abstract machine was introduced to provide a formalization of algo- rithmic computation. Turing machines have been used to accept languages and solve de- Cision problems. These computations are restricted to returning a yes or no answer. The822 Chapter 11 Decidat ty ‘Church-Turing thesis asserts that every solvable decision problem can be transformed into an equivalent Turing machine problem. By defining the result of a computation by the symbols on the tape when the machine halts, Turing machines can be used to compute functions. Chapters 12 and 13 examine capabilities and limitations of evaluating functions using Turing machine computations. ‘A more general and concise form of the Church-Turing thesis in terms of effectively ‘computable functions is presented in Chapter 13. Since our attention has been focused on. ryesino problems, the Church-Turing thesis is presented and its implications are discussed for this class of problems. ‘A solution to a decision problem requires the computation to return an answer for very instance of the problem, Relaxing this restriction, we obtain the notion of a partial solution, A partial solution 10 a decision problem P is not necessarily complete effective procedure that returns an affirmative response for every p & P whose answer is yes. Ifthe answer to p is negative, however, the procedure may return no of fail to produce an answer. Just as a solution to a decision problem can be formulated as a question of member- in a recursive language, a partial solution to a decision problem is equivalent to the uestion of membership in a recursively enumerable language. The Church-Turing thesis for decision problems There is an effective procedure to solve a decision problem if, and only if, there is @ Turing machine that halts forall input strings and solves the problem, ‘The extended Church-Turing thesis for decision problems A decision problem P is partially solvable if, and only if, there is a Turing machine that accepts precisely the elements of P whose answer is yes. To appreciate the content of the Church-Turing thesis, it is necessary to understand the nature of the assertion. The Church-Turing thesis is not a mathematical theorem; it cannot be proved. This would require a formal definition of the intuitive notion of an effective procedure, The claim could, however, be disproved. This could be accomplished by discovering an effective procedure that cannot be computed by a Turing machine. There is an impressive pool of evidence that suggests that such a procedure will not be found. More about that later, The Chusch-Turing thesis may be considered to provide a definition of algorithmic computation. This is an extremely limiting viewpoint. There are many systems that sat- isfy our intuitive notion of an effective algorithm, for example, the machines designed by Post {1936}, recursive functions [Kleene, 1936], the lambda calculus of Church [1941], and, more recently, programming languages for digital computers, These are but a few of the systems designed to perform effective computations. 
Moreover, who can predict the formalisms and techniques that will be developed in the future? The Church-Turing the- sis does not claim that these other systems do not perform algorithmic computation. It does, however, assert that a computation performed in any such system ean be accom- plished by a suitably designed Turing machine. Perhaps the strongest evidence supporting, the Church-Turing thesis is that all known effective procedures have been able to be trans formed into equivalent Turing machines.11.3 TheHalting Problem for Turing Machines 323, ‘The robustness of the standard Turing machine offers additional support for the Church-Turing thesis. Adding multiple tracks, multiple tapes, and nondeterministic com- putation does not increase the set of recognizable languages. Other approaches to compu- tation, developed independently of the formalization of Turing machines, have been shown to recognize the same languages. In Chapter 10 we demonstrated thatthe recursively enu- 'merable languages are precisely those generated by unrestricted grammars. ‘A proof by the Church-Turing thesis is a shortcut often taken in establishing the existence of a decision algorithm, Rather than constructing a Turing machine solution to a decision problem, we describe an intuitively effective procedure that solves the problem, The Church-Turing thesis guarantees that a Turing machine can be designed to solve the problem. We have tacitly been using the Church-Turing thesis in this manner throughout the presentation of Turing computability. For complicated machines, we simply gave an effective description of the actions of a computation of the machine, We assumed that the complete machine could then be explicitly constructed, if desired. ‘The Church-Turing thesis asserts that a decision problem P has a solution if, and only if, there is a Turing machine that determines the answer for every p € P. Ifno such Turing ‘machine exists, the problem is said to be undecidable. A Turing machine computation is not encumbered by the restrictions that are inherent in any “real” computing device, The existence of a Turing machine solution to a decision probiem depends entirely on the na: lure of the problem itself and not on the availability of memory or central processor time ‘The universality of Turing machine computations also has consequences for undecidabil- ity. If a problem cannot be solved by a Turing machine, it clearly cannot be solved by a resource-limited machine. The remainder of this chapter is dedicated to demonstrating the undecidability of several important problems from computability theory and formal lan- uage theory. 11.3 The Halting Problem for Turing Machines The most famous ofthe undecidable problems is concerned with the properties of Turing machines themselves. The halting problem may be formulated as follows: Given an arbi- ‘rary Turing machine M with input alphabet © and a string w « :*, will the computation ‘of M with input w halt? We will show that there is no algorithm that solves the halting problem. The undecidability of the halting problem is one of the fundamental results in the theory of computer science. Ttis important to understand the statement of the problem. We may be able to dete. rine that a particular Turing machine will halt fora given string. In fact, the exact set of strings for which a particular Turing machine halts may be known. For example, the ‘machine in Example 9.3.1 halts for all and only the strings containing the substring aa. 
‘A solution to the halting problem, however, requires a general algorithm that answers the halting question for every possible combination of Turing machine and input string.324 Chapter 11. Decldabiity ‘A solution to the halting problem requires the Turing machine M and the string w to be represented as an input string. Recall that, because ofthe ability to encode arbitrary symbols as strings over {0,1}, we are limiting our analysis to Turing machines that have in- putalphabet (0, /) and tape alphabet (0, 1, 8}. The states of a Turing machine are assumed tobe named (go, 4u-- dnl With go the start state, ‘A Turing machine is completely defined by its transition function. A transition of a delerministic Turing machine has the form 5(q).x) =[qj..dl, Where qi, 4) €Q: x. y ePsandd € [L, R). We encode the elements of M using strings of 1's: Symbol__Eneoding 0 1 1 un B mt ® 1 a u ca re L 1 R u Let en(z) denote the encoding of a symbol z. A transition 5(gi.) = [g). ysdl is encoded by the string en(q,)Oen(x)Oen(q )0en(y)Oen(d). The 0's separate the components of the transition, A representation of the machine is con- structed from the encoded transitions, Two consecutive 0's are used to separate transitions, ‘The beginning and end of the representation are designated by three Example 11.3.1 ‘The Turing machine113. The Halting Problem for Turing Machines 325 accepts the null string and strings that begin with 77. The computation of M does not terminate for any input string beginning with 0. The encoded transitions of M are given in the table below. ‘Transition Encoding 50.8) ={qi.8.R]— 107170170111011 511.) = (go. 0,L) 1401010101 Sq.) = laa 1.R) Morton 10rr011 Sign. 1) = (go. 1.L) 110110101101 The machine M is represented by the string (0001011 1011011011001 101010101001 101 1011101101 1001110110101101000. 3 ‘A Turing machine can be constructed to determine whether an arbitrary string u 10,1)" isthe encoding of a deterministic Turing machine. The computation examines u to see if it consists ofa prefix 000 followed by a finite sequence of encoded transitions sepa rated by 00's followed by 000, A string that satisfies these conditions is the representation Cf some Turing machine M. The machine M is deterministic if the combination of the state and input symbol in every encoded transition is distinct. Utilizing the encoding described above, the representation of a Turing machine with input alphabet (0, 1} is itself a string over (0, 1}. The proof of Theorem 11,3.1 does not depend upon the features of this particular encoding. The argument is valid for any representation that encodes a Turing machine as a string over its input alphabet. The representation of the machine M is denoted R(M). Theorem 11.3.1 ‘The halting problem for Turing machines is undecidable. Proof The proof is by contradiction, Assume that there is a Turing machine H that solves the halting problem, A string is accepted by H if {) the input consists of the representation of a Turing machine M followed by a string w ii) the computation of M with input w halts. If either of these conditions is not satisfied, H rejects the input. The operation of the machine H is depicted by the diagram M halts with Halting [|___inpute ceo RM) —e] machin _—— R Mdoes otha ‘with input w326 Chapter 11. Decidabilty The machine H is modified to construct a Turing machine H’. The computations of H are the same as H except H' loops indefinitely whenever H terminates in an accepting state, that is, whenever M halts on input w. 
The transition function of H’ is constructed from that of H by adding transitions that cause HY to move indefinitely tothe right upon entering an accepting configuration of H. His combined with acopy machine to construct another Turing machine D. Te input to Dis a Turing machine representation R(M). A computation of D begins by creating the string R(M)R(M) from the input R(M). The computation continues by running H’ on ROM)ROM), Malis with = input ROM) i iM COPY FR(M) —>) Hr mer ~ hale ™M doesnot halt with input ROM The input to the machine D may be the representation of any Turing machine with alphabet (0, 1, 8). In particular, Dis such a machine. Consider a computation of D with input R(D). Rewriting the previous diagram with M replaced by D and ROM) by R(D), we get D halts with fe input RD) ROY copy |= RiD)RD) W loop ~ ———» alt does not halt with input RD) Examining the preceding computation, we see that D halts with input R(D) if, and only if, D does not halt with input R(D). This is obviously a contradiction, However, the machine D can be constructed directly from a machine H that solves the halting problem. The assumption that the halting problem is decidable produces the preceding contradiction. ‘Therefore, we conclude that the halting problem is undecidable. . Corollary 11.3.2 ‘The language Lyi = (R(M)w | R(M) is the representation of a Turing machine M and M hhalts with input} over (0, 1}* is not recursive. AA similar argument can be used to establish the undecidability of the halting problem for Turing machines with arbitrary alphabets. The essential feature of this approach is the ability ( encode the transitions of a Turing machine as a string over its own input alphabet, ‘Two symbols are sufficient 0 construct such an encoding.11.4 A Universal Machine 327 — — 11.4 A Universal Machine The halting problem provides a negative result conceming the ability to determine the ‘outcome of a Turing machine computation from a description of the machine. The unde- cidabilty of the halting problem establishes thatthe language Ly, consisting of all strings R(M)w for which machine M halts with input w, is not recursive. We will exhibit a sin- ‘gle machine U that accepts the input R(M)w whenever the computation of M halts with input w. Note that this machine does not solve the halting problem since U is not guaran- teed to reject strings that are notin Lyx. In fact, a computation of U with input R(M)uo will continue indefinitely whenever M does not halt with input w. ‘The Turing machines in this section are assumed to be deterministic with tape alpha- bet (0, 1, B). The machine U is called a universal Turing machine since the outcome of, the computation of any machine M with input w can be obtained by the computation of, ‘Uwith input R(M)w. The universal machine alone is sufficient to obtain the results of the computations of the entire family of Turing machines. Universal |___M halts with cept Mw —A] machine U loop M does not halt swith input w Theorem 11.4.1 The language La is recursively enumerable. Proof A deterministic three-tape machine U is designed to accept Ly by halting. A ‘computation of U begins with the input on tape 1. Te encoding scheme presented in Section 11.3 is used to represent the input Turing machine. Ifthe input string has the form R(M)w, the computation of M with input w is simulated on tape 3. The universal machine uses the information encoded in the representation R(M) to simulate the transitions of M A.computation of U consists of the following actions 1. 
Ifthe input string does not have the form R(M)w for a deterministic Turing machine Mand string w, U moves indefinitely to the right 2. The string w is written on tape 3 beginning at position one. The tape head is then repositioned at the leftmost square of the tape. The configuration of tape 3 is the initial configuration of a computation of M with input w. 3. A single J, the encoding of state gy, is written on tape 2. 4, A transition of M is simulated on tape 3. The transition of M is determined by the symbol scanned on tape 3 and the state encoded on tape 2. Let x be the symbol from tape 3 and q; the state encoded on tape 2.928 Chapter 11. Decidabitty 4) Tape 1 is scanned for a transition whose first two components match en(q,) and eno) If there is no such transition, U halts accepting the input by Assume tape I contains the encoded transition en(qi)Oen(x)Oen(q;)0en(y)0en\d. ‘Then i) en(qi) is replaced by en(g,) on tape 2 ji) The symbol y is written on tape 3. ii) The tape head of tape 3 is moved in the direction specified by d 5. The next transition of M is simulated by repeating steps 4 and 5 The simulations of the universal machine U accept the strings in Ly. The computations of U loop indefinitely for strings in (0, 1)* ~ Ly. Since Lyy = L(U), Ly. is recursively enumerable 7 Corollary 11.4.2 The recursive languages are a proper subset of the recursively enumerable languages Proof The acceptance of Li by the universal machine demonstrates that Ly is recur- sively enumerable while Corollary 11.3.2 established that Ly is not recursive, . In Exercise 925 it was shown that a language L is recursive if both Land Care recursively enumerable, Combining this wth Theorem 11.4.1 and Corollary 11.3.2 vields Corollary 11.4.3, The language Lar is not recursively enumerable. ‘The computation of the universal machine U with input R(M) and w simulates the computation M with input w. The ability to obtain the results of one machine via the Computations of another facilitates the design of complicated Turing machines. When we say that a Turing machine M’ “runs machine M with input w,” we mean that M’ is supplied with input R(M)w and simulates the computation of M. 11.5 Reducibility ‘A decision problem P is many-to-one reducible 0 a problem P’ if there is a Turing ma- chine that takes any problem p; € P as input and produces an associated problem py ¢ P \where the answer to the original problem p; can be obtained from the answer to p. As the ‘name implies, the mapping from P to P’ need not be one-to-one: Multiple problems in P ‘may be mapped to the same problem in P. Ifa decision problem P’ is decidable and P is reducible to P, then P is also decidable. ‘A solution to P can be obtained by combining the reduction with the algorithm that solves P’ (Figure 11.2).118 Reductbilty 329 a Reduction of a Algorithm 0] = Yes pep ai PoP’ Pier — solve P” No Solution to P - FIGURE 11.2 Solution of p, using reduction top Reduction is a technique commonly employed in problem solving. When faced with ‘anew problem, we often try 1 tranform it into a problem that has previously been solved. ‘This is precisely the strategy employed in the reduction of decision problems. Example 11.5.1 Consider the problem P of accepting strings in the language L = {uu | w= a'be! for some i > 0}. The machine M in Example 9.2.2 accepts the language (a'bic! | > 0}. We will reduce the problem P to that of recognizing a single instance of a'bc!. The original problem can then be solved using the reduction and the machine M. 
The reduction is obtained as follows:

1. The input string w is copied.

2. The copy of w is used to determine whether w = uu for some string u ∈ {a, b, c}*.

3. If w ≠ uu, then the tape is erased and a single a is written in the input position.

4. If w = uu, then the tape is erased, leaving u in the input position.

If the input string w has the form uu, then w ∈ L if, and only if, u = a^i b^i c^i for some i ≥ 0, that is, if, and only if, u is accepted by the machine M. If w does not have the form uu, the reduction produces the single symbol a, which is rejected by M. In either case, w ∈ L exactly when M accepts the string left in the input position by the reduction.
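At the level of strings, the reduction amounts to a simple preprocessing step followed by a single run of M. The Python sketch below mirrors the four steps above; it is an illustration only, and the names reduce_instance, accepts_aibici, and in_L are ad hoc stand-ins (the last two play the roles of the machine M of Example 9.2.2 and of a decision procedure for L, respectively).

```python
def accepts_aibici(u: str) -> bool:
    """Stand-in for the machine M of Example 9.2.2: accept exactly the strings a^i b^i c^i, i >= 0."""
    i = u.count("a")
    return u == "a" * i + "b" * i + "c" * i

def reduce_instance(w: str) -> str:
    """Steps 1-4 of the reduction: map w to a string that M accepts if, and only if, w is in L."""
    half, rem = divmod(len(w), 2)
    if rem == 0 and w[:half] == w[half:]:   # w has the form uu
        return w[:half]                     # leave u in the input position
    return "a"                              # otherwise write a single a, which M rejects

def in_L(w: str) -> bool:
    """Decide membership in L = {uu | u = a^i b^i c^i} by running M on the reduced instance."""
    return accepts_aibici(reduce_instance(w))

assert in_L("abcabc") and in_L("") and not in_L("abc") and not in_L("abab")
```

The Turing machine version of the reduction performs the same copy, compare, and rewrite phases with tape operations before handing the rewritten tape to M.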
v, where u € * and v E*. There is no division ofthe symbols into Variables and terminals nor i there a designated stat symbol. As before, u = v signifies that v is derivable from w by a finite number of rule applications. The word problem for semi-Thue systems isthe problem of determining, for an arbitrary semi-Thue system = (, P) and stings u,v € E*, whether vis derivable from 1 in S. ‘We will show that the halting problem is reducible othe word problem for semi-Thue systems. The reduction is obtained by developing a relationship between Turing machine computations and derivations in appropriately designed semi-Thue systems. LetM = (Q, E, I, 8, gps F) bea deterministic Turing machine. Using a modification of the construction presented in Theorem 10.1.3, we can construct a semi-Thue system Sy = (Ey,Pyo) Whose derivations simulate the computations of M. The alphabet of Suis the set QUT U (f.]egy+dav dt). The set Py of rules of Sy is defined by ‘xy > eqyy whenever 8(.x) Rland ye ‘gix) > 2qB) whenever 8(gi.2) = [qj.2.R] git > give whenever 8(qi.*) = [gjo2sL] and y € TP 4 > gn iF 8(q.2) is undefined ane > qn force? wal qu an force + Lan lay The rules that generate the string (qo8w] in Theorem 10.1.3 are omitted since the ‘word problem for a semi-Thue system is concemed with derivability of a string v from another string u, not from a distinguished star symbol. The erasing rules (5 through 8)936 Chapter 11. Decidability have been modified to generate the string (q] whenever the computation of M with input w halts ‘The simulation of a computation of M in Sy manipulates strings of the form (1g) With u,v e F*, and g © QU {ay.4e,qu}- Lemma 11.71 lists several important properties, ‘of derivations of Sy that simulate @ computation of M. Lemma 11.7.1 Let M be a deterministic Turing machine, Sy be the semi-Thue system constructed from M, and w = [ugu] be a string with u,v €T*,andq € QU (ay.4n.ae) i) There is at most one string z such that w => < ii) IF there is such a 2, then z also has the form (wq'v] with w QUigy-gaae) Proof The application of a rule replaces one instance of an element of QU (qy.ga-4L} With another. The determinism of M guarantees that there is at most one rule in Py that can be applied to [ugu] whenever q € Q. If q = qe there isa unique rule that can be applied to {ign}. This rule is determined by the first symbol in the string v}. Similarly, there is only ‘one rule that can be applied to fwqz]. Finally there are no rules in Pyq that can be applied toa string containing gy. Condition (ii) follows immediately from the form of the rules of Pu. . 9", and q'e A computation of M that halts with input w produces a derivation (qq Bw] 2> ugg). The erasure rules transform this string to [9]. These properties are combined to yield Lemma 11.7.2 Lemma 41.72 ‘A detemminitie Turing machine M halts with input w if, and only if [goBw 2] 2 lay Example 11.7.1 ‘The language of the Turing machine bbR ala Me oe eee ” il is (@Ub)*c(a Ub Uc)*. The computation that accepts ac is given with the corresponding
(BagicB + aacq,B = (Bacq:B) > (Bacqe] = [Bacqi] (Baqi) = (B41) = lal > [asl a Theorem 11.7.3 ‘The Word problem for semi-Thue systems is undecidable Proof The preceding lemmas sketch the reduction of the halting problem to the word problem for semicThue systems, For a Turing machine M and corresponding semi-Thue system Sy, the computation of M with input w halting is equivalent wo the derivability of {q1 from (4oBwB} in Sy. An algorithm that solves the word problem could also be used to solve the halting problem. . By Theorem 1.7.3, there is no algorithm that solves the word problem for an arbitrary semi-Thue system $= (Z, P) and pair of strings in Z*. The relationship between the computations of a Turing machine M and derivations of Sy developed in Lemma 11.7.2 ccan be used to prove that there are particular semi-‘Thue systems whose word problems are undecidable. Theorem 11.7.4 Let M be a deterministic Turing machine that accepts a nonrecursive language. The word problem for the semi-Thue system Sy is undecidable. Proof Since M recognizes a nonrecursive language, the halting problem for M is unde- cidable (Exercise 11). The correspondence between computations of M and derivations of Sw yields the undecidabiity ofthe word problem for this system. . 11.8 The Post Correspondence Problem ‘The undecidable problems presented in the preceding sections have been concerned with the properties of Turing machines or mathematical systems tha simulate Turing machines The Post correspondence problem is a combinatorial question that can be described as &338 Chapter 11. Decidabilty simple game of manipulating dominoes. A domino consists of two strings from a fixed alphabet, one on the top half of the domino and the other on the bottom. aba bbaba The game begins when one of the dominoes is placed on a table, Another domino is then placed to the immediate right of the domino on the table. This process is repeated, constructing a sequence of adjacent dominoes. A Post correspondence system can be thought of as defining a finite set of domino types. We assume that there is an unlimited number of dominoes of each type; playing a domino does not limit the number of future A string is obtained by concatenating the strings in the top halves of a sequence of dominoes. We refer to this as the top string. Similarly, a sequence of dominoes defines a bottom string. The game is successfully completed by constructing a finite sequence of dominoes in which the top and bottom strings are identical. Consider the Post correspon dence system defined by dominoes cach Le te | =| ac | ba is a solution to this Post correspondence system. Formally, a Post correspondence system consists of an alphabet F and a finite se of ordered pairs (uj), = 1,2,...4m, where 0; € E*. A solution to a Post correspon- dence system isa sequence f.i3, jx such that The problem of determining whether a Post correspondence system has a solution is the Post correspondence problem.11.8 The Post Correspondence Problem 339 Example 11.8.1 The Post correspondence system with alphabet (a,b) and ordered pai {6aa,abaaa] has a solution [aaa,aa), m= [om [om on [otal ow | fone 3 Consider the Post correspondence system with alphabet (a,b) and ordered pairs [ab, aba], [bba, aa}, faba, bab). A solution must begin with the domino Example 11.8.2 ab aba since this is the only domino in which prefixes on the top and bottom agree. The string in the top half ofthe next domino must begin with a. 
There are 1wo possibilities: | Jab | aba @ ‘The fourth elements of the strings in (a) do not match, The only possible way of construct- ing a solution is to extend (b). Employing the same reasoning as before, we see that the first element in the top ofthe next domino must be b. This lone possibility produces ‘aba | bab {ab | aba | bho ‘which cannot be the initial subsequence of a solution since the seventh elements in the top and bottom differ. We have shown that there is no way of “playing the dominoes” in which the top and bottom strings are identical. Hence, this Post correspondence system has no solution, a340 Chapter 11. Decidabity ‘Theorem 11.8.1 ‘There is no algorithm that determines whether an arbitrary Post correspondence system has a solution, Proof Let S = (5, P) be a semi-Thue system with alphabet (0. /} whose word problem is unsolvable. The existence of such a system is assured by Corollary 11.4.2 and Theo- rem 11.74 For each pair of strings u,v € E*, we will construct a Post correspondence system Cus that has a solution if, and only if, wv, Sinee the later problem is undeeidale, there can be no general algorithm that solves the Post correspondence problem. ‘We begin by augmenting the set of productions of $ with the rules 0 —> O and / —> 2. Derivations in the resulting system are identical to those in $ except for the possible audition of rule applications that do not transform the sting. The application of such a rule, however, uarantees that whenever vv may be obiined from u by a derivation of even length. By abuse of notation, the augmented system is also denoted S. Now let u and v be strings over (0 1". A Post correspondence system Cy,» is con- structed from u,v, and 8. The alphabet of Cy, consists of 0,6, 1.7, [, ] and ¥. A string 1w consisting entirely of “barred” symbols is denoted T. Each production x; — yj, = 1,2,...9m, of $ Gneluding 0—> 0 and 1 1) defines two dominoes The system is completed by the dominoes r lie ela t elle la oe ‘The dominoes o|(al{. |i eee Jee ccan be combined to form sequences dominoes that spell11.8. The Post Correspondence Problem 341 for any string w € (0, 1}*. We will feel free to use these composite dominoes when con- structing @ solution to a Post correspondence system Cy, First we show that C,,y has & solution whenever u => v. Let Wu sE bbe a derivation of even length. The rules 0 > 0 and / -> J ensure that there is derivation (of even length whenever v is derivable from u. The ith step ofthe derivation can be written Mot = PreiSj Ment > Pinte = Mir where u; is obtained from u;-1 by an application ofthe rule x), —> yyy. The sting (uo 4 yup # eR) isa solution oC. This solution can be constructed as follows: 1. Initially play 2. To obtain a match, dominoes spelling the string producing tug on the bottom are played, tw | me | 5, (ae ee ‘The dominoes spelling py and qo are composite dominoes. The middle domino is ‘generated by the rule x p. 3. Since poy jgo = ur. the top string can be written [up + 7 and the bottom [up. Repeat- ing the strategy employed above, dominoes must be played to spell 7 on the bottom,842 Chapter 11 Decidability tw |r | %m | He | = | m | | a] (ee ee producing [ug + /#u2+ on the top. 4. This process is continued for steps 2, 3, .. . ,k~ 1 of the derivation, producing tw | a] etal fa [+ | [ee Ce Po: [ey [tee [ee | Pi a Fes 5. Completing the sequence with the domino produces the string [up + )#u> * --» Hk 17ua] in both the top and the bottom, solv ing the correspondence system. 
‘We will now show that a derivation w + w can be constructed from a solution to the Post ‘correspondence system Cy... A solution to C,,., must begin with t since this is the only domino whose strings begin with the same symbol. By the same argument, a solution must end with ‘Thus the string spelled by a solution has the form [u* w¥v]. If w contains J, then the solution can be written (4 + x#v]yFu]. Since ] occurs in only one domino and is the rightmost symbol on both the top and the bottom of that domino, the string [u # x0] is, also a solution of Cy».11.9 Undecldable Problems in Context-Free Grammars 343 In light of the previous observation, let (ue -: #0] be a string that is a solution of the Post correspondence system Cy,» in which ] occurs only as the rightmost symbol. The information provided by the dominoes atthe ends ofa solution determines the structure of the entire solution. The solution begins with ur LI A sequence of dominoes that spell « on the bottom must be played in order to match the string already generated on the top. Let u = xj.» Xg be bottom strings in the dominoes that spell u in the solution, Then the solution has the form we | 5, t fm [| Since each domino represents a derivation x), => yj,. we combine these to obtain the derivation u 251, where a1 = yy.» YeThe prefix of the top string of the dominoes that make up the solution has the form (u'7, and the prefix ofthe bottom string is (us. Repeating this process, we see tht a solution defines a sequence of stings [wt yFu> * #0] [uxt yFup * IF... Fe) lu + iy Fup + sF ug... FU] where 2 ui with up Combining these produces a derivation u => ‘The preceding two arguments constitute a reduction of the word problem for the semi- Thue system S to the Post correspondence problem. It follows thatthe Post correspondence problem is undecidable. 7 11.9 Undecidable Problems in Context-Free Grammars Context-free grammars provide an important tool for defining the syntax of programming languages. The undecidability ofthe Post correspondence problem can be used to establish344 Chaptor 11. Decidabilty the undecidability of several important questions concerning the languages generated by context-free grammars. Let C = (Be. {fs 01h [ua.02} + [uqsty) be a Post correspondence system. ‘Two context-free grammars Gu and Gy are constructed from the ordered pairs of C Gus Vo= (Suh EoU U2...) Su > m Suh, Sv > mai |i an) Svh Ey = Ec U(l,2,....n] Py = (Sy > uSyi, Sy —> uff |i = 1,2,...,0] Determining whether a Post correspondence system C has a solution reduces to deciding the answers to certain questions concerning derivability in corresponding grammars Gu; and Gy. The grammar Gy generates the strings that can appear in the top half of a sequence cof dominoes. The digits in the rule record the sequence of dominoes that generate the string (in reverse order). Similarly, Gy generates the strings that can be obtained from the bottom half of a sequence of dominoes. ‘The Post correspondence system C has a solution if there is a sequence iyi. ive such that In this case, Gu and Gy contain derivations Su 2 wig ta iinet iy Sy 3a tle lkat dae Where tig tin iyiae-t ooo doit = Vita Vig wigelk «fit, Hleneey the inter- section of L(Gu) and L(Gy) is not empty. Conversely, assume that w € L(G) ML(Gy). Then w consists of a string we ZG followed by a sequence iuip1 .-i2i. The string w! = wits. diy _Miy = UiVig >> Ui, 4 18 a Solution to C. Theorem 11.9.1 There is no algorithm that determines whether the languages of two context-free grammars are disjoint. 
Proof Assume there is such an algorithm. Then the Post correspondence problem could be solved as follows: 1. For an arbitrary Post correspondence system C, construct the grammars Gy and Gy from the ordered pais of C.1.9. Undecidable Probloms in Context-Free Grammars 345 2. Use the algorithm to determine whether L(Gy) and L(Gy) are disjoint. 3. Chas a solution if, and only if, L(Gy) M L(Gy) is nonempty Step I reduces the Post correspondence problem to the problem of determining whether two context-free languages are disjoint. Since the Post correspondence problem has al- ready been shown to be undecidable, we conclude that the question of the intersection of context-free languages is also undecidable. . Example 11.9.1 ‘The grammars Gy and Gy are constructed from the Post correspondence system (aaa, aa], {baa,abaaaa} from Example 11.8.1 Gy: Sy > aaaSyl | aaa Gy: Sy > aaSyl [al > baaSy2 | baad + abaaaSy2 | abaaa2 Derivations that exhibit the solution to the correspondence problem are Sy = aaaSyt Sy =aaSyl = aaabaaSy21 = aaabaaaSy21 => aaabaaaaa\21 = aaabaaaaal2l. 0 ‘The set of context-free languages is not closed under complementation. However, for tn arbitrary Post correspondence system C, the languages L(Gu) and L(Gy) are context- free, The task of constructing context-free grammars that generate these languages is left Theorem 11.9.2 There is no algorithm that determines whether the language of a context-free grammar G=(V, BP, S)is E* Proof First, note that L = 5 is equivalent to C=. We use this observation and show that there is no algorithm that determines whether L(G) is emp a ‘Again, let C be a Post correspondence system. Since L(@Gv) and EGy) are context- free, 80 is L = L(Gu) U (Gy). Now, by DeMorgan’s law, [= L(Gu) N L(Gy). An alk gorithm that determines whether L = can also be used to determine whether L(Gu) and UGy) are . Theorem 11.9.3 There is no algorithm that determines whether an arbitrary context-free grammar is ain biguous. Proof A grammar is ambiguous if it contains a string that can be generated by two distinct leftmost derivations. As before, we begin with an arbitrary Post correspondence346 Chapter 11. Decidability system C and construct Gy and Gy. These grammars are combined to obtain the grammar G:V5{5,5v,5v) B=ke P=PyUPVU[S > Sy,8 > Sv) ‘with start symbol 5. Clearly all derivations of G are leftmost; every sentential form contains at most one variable. A derivation of G consists ofthe application of an $ rule followed by a derivation of Gy or Gy. The grammars Gy and Gy are unambiguous; distinct derivations generate distinct suffixes of integers. This implies that G is ambiguous if, and only if, L(Gy) 0 L(Gy) #@. But this condition is equivalent to the existence of a solution to the original Post correspondence system C. Since the Post correspondence problem is reducible to the problem of determining whether a context-free grammar is ambiguous, the later problem is also undecidable. . Exercises In Exercises | through 5, describe a Turing machine that solves the specified decision problem. Use Example 11.1.2 a @ model for defining the actions of a computation of the ‘machine. You need not explicitly construct the transition function, 1. Design a two-tape Turing machine that determines whether two strings w and v over (0.1) are identical. The computation begins with BuBvB on the tape and should require no more than 2engrh(u) + 1) transitions. 
2, Design a Turing machine whose computations decide whether a natural number is prime, Represent the natural number n by a sequence of + 1 I's, 3. LetG=(V, 5, P, $) bea regular grammar. 4) Construct a representation forthe grammar G over (0, } b) Design a Turing machine that decides whether a sting w is in L(G). The use of nondeterminism facilitates the construction of the desired machine. 4. A tour in a directed graph is a path pp, pic--+s Px in which 1) Po= Pm ii) ForO
) Trace the computation of M with input 07 and give the corresponding derivation in Sm. 25. Finda solution for each of the following Post correspondence systems. 8) (a,aa}, (66,0), [a,b] ») fa,aaal, [aab,b}, labact ab} ©) (aa, ab}, (bb, ba}, (abb.b] &) (a,ab}, (ba,aba}, (6,aba}, (bba,b) 26. Show thatthe following Post correspondence systems have no solutions. 8) [b,da), [aa,b}, (bab, a}, fab, ba} ) [ab,a}, (6a, bab), [6,24], (oa, ab} ©) [ab aba}, baa, ca}, aba, baa] 4) [ab, Bb, faa, ba. [ab,abb), (bb, bab] ©) [abb, ab}, [aba, bal, [aab,abab] 27. Prove that the Post correspondence problem for systems with a one-symbol alphabet is decidable. 28. Build the context-free grammars Gy and Gy that are constructed from the Post corre- spondence system (0,6) [aa, baa, [ab,a]. Is (Gy) (Gy) = 8? 29, LetC bea Post correspondence system. Construct a context-free grammar that gener- ates L(Gv) 30. Prove that there is no algorithm that determines whether the intersection of the lan- _uages of two context-free grammars contains infinitely many elements. 31. Prove that there is no algorithm that determines whether the complement of the Ian- guage of a context-free grammar contains infinitely many elements Bibliographic Notes ‘Turing {1936] envisioned the theoretical computing machine he designed to be capable of performing all effective computations. This viewpoint, now known as the Church- Turing thesis, was formalized by Church {1936]. Turing's original paper also included the undecidability of the halting problem and the design of a universal machine. The proof of the undecidability ofthe halting problem presented in Section 11.3 is from Minsky [1967] A proof that an arbitrary Turing machine can be simulated by a machine with tape al- phabet (0, 1,8) can be found in Hoperoft and Ullman [1979]. Techniques for establishing tundecidability using properties of languages were presented in Rice {1953] and [1956] ‘The string ransformation systems of Thue were introduced in Thue [1914], The undecid- ability ofthe word problem for semi-Thue systems was established by Post [1947]350 Chaptor 11. Decidabitty ‘The undecidabilty of the Post correspondence problem was presented in Post [1946], ‘The proof of Theorem 11.8.1, based on the technique of Floyd [1964], is from Davis and Weyuker [1983]. Undecidabitity results for context-free languages, including Theo- rem 11.8.1, ean be found in Bar-Hillel, Perles, and Shamir [1961], The undecidability of ambiguity of context-free languages was established by Cantor [1962], Floyd [1962], and Chomsky and Schutzenberger [1963]. The question of inherent ambiguity was shown t0 be unsolvable by Ginsburg and Urtian {1966a},CHAPTER 12 Numeric Computation Turing machines have been used as a computational framework for constructing solutions to decision problems and for accepting languages. The result of a computation was de- termined by final state or by halting, In either case there are only two possible outcomes: accept or reject. The result of a Turing machine computation can also be defined in terms of the symbols written on the tape when the computation terminates. Defining the result in terms of the halting configuration permits an infinite number of possible outcomes. This technique will be used to construct Turing machines that compute number-theoretie func tions. 12.1 Computation of Functions A function f : X > ¥ can be thought of as a mapping that assigns at most one value of the range Y to each element of the domain X. 
Adopting a computational viewpoint, we refer to the variables of fas the input of the function, The definition of a function does not specify hhow to obtain /(x), the value assigned to x by the function f, from the input x, Turing machines will be designed to compute the values of functions. The domain and range of a function computed by a Turing machine consist of strings over the input alphabet of the ‘machine, Recall that the term function refers to both partial and total functions. 351352 Chaptor12 Numeric Computation Definition 12.1.1 A deterministic one-tape Turing machine M = (Q, E. P, 5. go. ¢7) computes the unary function f : 5" > E* if {) there is only one transition from the state qo and it has the form 8(go, B) = (qi. B. R] ii) there are no transitions of the form 6(qi.) = [go..d] for any gi €Q, x <1 and de (LR) iii) there are no transitions of the form 8(gy. B) iv) the computation with input 1 hats in the configuration q Bu B whenever f(u) _) the computation continues indefinitely whenever fu). ‘A function is said to be Turing computable if there is a Turing machine that com- ‘putes it, A Turing machine that computes a function has two distinguished states: the initial state go and the halting state. A computation begins with a transition from state go that positions the tape head at the beginning of the input string, The state go is never reentered: its sole purpose is to initiate the computation, All computations that terminate do so in state gy. Upon termination, the value of the function is written on the tape beginning at position one, The remainder ofthe tape is blank. ‘An arbitrary function need not have the same domain and range. Turing machines can be designed to compute functions from E* to a specific set R by designating an input alphabet © and a range R. Condition (iv) is then interpreted as requiring the string v 10 be an element of R, To highlight the distinguished states qo and q, a Turing machine M that computes a function is depicted by the diagram ot] Intuitively, the computation remains inside the box labeled M until termination, This dia- ‘gram is somewhat simplistic since Definition 12.1.1 permits multiple transitions to state q and transitions from qj. However, condition (ii) ensures that there are no transitions from 4g when the machine is scanning a blank. When this occurs, the computation terminates ‘with the result written on the tape. Example 12.1.1 ‘The Turing machine M computes the function f from (a, b}* to (a, b}* defined by A. ifwcontains ana otherwise Fw) -(121 Computation of Functions 353 bibR ‘aR BL BIBR BOR bp BIBR ‘The function f is undefined if the input does not contain an a. In this case, the ‘machine moves indefinitely to the right in state q). When an ais encountered, the machine enters state g> and reads the remainder of the input. The computation is completed by erasing the input while returning to the initial position. A computation that terminates produces the configuration 4 BB designating the null string as the result, o The machine M in Example 12.1.1 was designed to compute the unary function f. It should be neither surprising nor alarming that computations of M do not satisfy the re 4Quirements of Definition 12.1.1 when the input does not have the anticipated form. A com- putation of M initiated with input BbB8bBaB terminates in the configuration BbBbg rB. In this halting configuration, the tape does not contain a single value and the tape head is ‘notin the correct position. 
This is just another manifestation of the time-honored “garbage in, garbage out" principle of computer science. Functions with more than one argument are computed in a similar manner. The input is placed on the tape with the arguments separated by blanks. The initial configuration of a computation of a temary function f with input aba, bbb, and bab is If f(aba, bbb, bab) is defined, the computation terminates with the configuration gy B ‘faba, bbb, bab) B. The initial configuration for the computation of f (aa, i, Bb) is Seceeo t [«) ‘The consecutive blanks in tape positions three and four indicate that the second argument is the null string354 Chapter 12 Numeric Computation Example 12.1.2 ‘The Turing machine given below computes the binary function of concatenation of strings over {a}. The initial configuration of a computation with input strings and v has the form qoBuBuB. Either or both of the input strings may be null. Bin R ‘The initial string is read in state qy. The eycle formed by states qo, gs, gs, 42 translates fan a one position to the left. Similarly, q2. qs, 4s, 42 shift a b 10 the left. These cycles are repeated until the entire second argument has been translated one position to the let, producing the configuration gy BuvB. 0 12.2 Numeric Computation ‘We have seen that Turing machines can be used to compute the values of functions whose domain and range consist of strings over the input alphabet. In this section we turn our Attention to numeric computation, in particular the computation of number-theoretic func- tions. A number-theoretie function is a function ofthe form f :N x Nx--- x NN. ‘The domain consists of natural numbers or n-tuples of natural numbers. The function sq N+ N defined by sq() =n isa unary number-theoretc function. The standard op- erations of addition and multiplication define binary number-theoretic functions The transition from symbolic to numeric computation requires only a change of per- spective since numbers are represented by strings of symbols. The input alphabet of the Turing machine is determined by the representation of the natural numbers used in the computation. We will represent the natural number» by the string /**', The number zero is represented by the string /, the number one by 11, and s0 on. This notational scheme is known as the unary representation of the natural numbers. The unary representation122 Numeric Computation 355 of a natural number 1 is denoted 11. When numbers are encoded using the unary represen- tation, the input alphabet for a machine that computes a number-theoretic function is the singleton set {1} ‘The computation of f(2,0,3) in a Turing machine that computes a temary number theoretic function f begins with the machine configuration CEEEy EEE Eee yee 5 . the computation terminates with the configuration If £2,0,3) {a AA Kevariable total number-theoretic function r:N x N x +N => (0,1) defines a ‘ary relation R on the domain of the function. The relation is defined by Inj.ngy.cesmi] € Rif romemay.-.m) = 1 ng] Rif rm ma, “The function ris called the characteristic function of the relation R. A relation is Turing computable if its characteristic function is Turing computable. ‘Turing machines that compute several simple, but important, number-theoretic func tions are given below. The functions are denoted by lowercase letters and the correspond ing machines by capital letters. [rns ny) =0. 
‘The successor function: s(n) =n + 1 oo ‘The zero function: z(n} WIR UBL san bo) ani &) ann mit356 Chapter 12 Numeric Computation ‘The empty function: e(n) ¢ BIBR WIR ue The machine that computes the successor simply adds a / to the right end of the input siring. The zero function is computed by erasing the input and writing / in tape position one. The empty function is undefined for all arguments; the machine moves indefinitely 10 the right in state q The zero function is also computed by the machine BR BIBL eee oe ‘That two machines compute the same function illustrates the difference between functions and algorithms. A function maps elements in the domain to elements in the range. A “Turing machine mechanically computes the value of the function whenever the function is defined, The difference is that of definition and computation, In Section 125 we will se that there are number theoretic functions that cannot be computed by any Turing machine. The value ofthe k-varable projection function pis defined as the ith argument of the input, pl! (n,n... »-sme) =i. The superscript k specifies the number of arguments and the subscript designates the argument that defines the result of the projec: tion. The superscript is placed in parentheses so i is not mistaken for an exponent. The machine that computes p leaves the fist argument unchanged and erases the remaining arguments MIR aR max BIBL me (-) ann $e § we qtr Lane Looe ove 4 wor Sune " @ a fas ‘The function p maps a single input to itself. This function is also called the identity {Function and is denoted id. Machines P® that compute p!) wil be designed in Exam- ple 23.1 Example 12.2.1 ‘The Turing machine A computes the binary function defined by the addition of natural numbers.123. Sequential Operation of Turing Machines 357 WR mR me ~~ ie a mee Ga Gp MOL GOL, GLY ‘The unary representations of natural numbers mand m are J"! and "+! The sum of these numbers is represented by 7", This string is generated by replacing the blank between the arguments with a J and erasing two /°s from the right end of the second argument, a Example 12.2.2 ‘The predecessor function 0 ifm=0 oret= Ly Shere is computed by the machine D (decrement) FFor input greater than zero, the computation erases the rightmost / on the tape. 12.3 Sequential Operation of Turing Machines ‘Turing machines designed to accomplish a single task can be combined to construct ma- chines that perform complex computations. Intuitively, the combination is obtained by running the machines sequentially. The result of one computation becomes the input for the succeeding machine. A machine that computes the constant function ¢(n) = 1 can be constructed by combining the machines that compute the successor and the zero functions. Regardless ofthe input, a computation of the machine Z terminates with the value zero on the tape, Running the machine $ on this tape configuration produces the number one, ‘The computation of Z terminates with the tape head in position zero scanning a blank. ‘These are precisely the input conditions for the machine S. The initiation and termination conditions of Definition 12.1.1 were introduced to facilitate this coupling of machines. The handoff between machines is accomplished by identifying the final state of Z with858 Chapter 12 Numeric Computation the initial state of S. Except for this handoff, the states ofthe two machines are assumed to be distinct. 
This can be ensured by subscripting each state of the composite machine with the name of the original machine wR BL ee Ss}am Sey aR, > au. > jaar BIL @s) @) \) oy HR WL ‘The sequential combination of two machines is represented by the diagram The state names are omitted from the initial and final nodes in the diagram since they may be inferred from the constituent machines. ‘There are certain sequences of actions that frequently occur in a computation of a Turing machine. Machines can be constructed to perform these recurring tasks. These machines are designed in a manner that allows them to be used as components in more complicated machines. Borrowing terminology from assembly language programming, we call a machine constructed to perform a single simple task a macro. ‘The computations of a macro adhere to several of the restrictions introduced in Def- inition 12.1.1. The intial state go is used strictly to initiate the computation, Since these ‘machines are combined to construct more complex machines, we do not assume that a computation must begin with the tape head at position zero, We do assume, however, that each computation begins with the machine scanning a blank. Depending upon the op eration, the segment of the tape to the immediate right or left of the tape head will be examined by the computation. A macro may contain several states in which a computa- tion may terminate. As with machines that compute functions, a macro is not permitted 10 contain a transition of the form 8(q. B) from any halting state 4123 Sequential Operation of Turing Machines 359 A family of macros is often described by a schema. The macro MRy moves the tape hhead to the right through i consecutive natural members (sequences of 1's) on the tape, MR} is defined by the machine MR MR, ot_) MBR¢ is constructed by adding states to move the tape head through the sequence of k natural numbers. WR mR me pu CC) \ C wn: XH Ag ape Soy) ome, wR YY nae The move macros do not affect the tape to the left of the intial position of the tape head. A computation of MR» that begins with the configuration B7,qo B72 B73 BB terminates in the configuration Bm Bn BMq j Bn. Macros, like Turing machines that compute functions, expect to be run with the input having a specified form. The move right macro MR; requires a sequence of at least 7 natural numbers to the immediate right of the tape at the initiation of a computation. The design of a composite machine must ensure the appropriate input configuration is provided to each macro. Several families of macros ate defined by describing the results of a computation of the machine. The computation of each macro remains within the segment of the tape indicated by the initial and final blank in the description. The application of the macro will neither access nor alter any portion of tape outside of these bounds. The location of the tape head is indicated by the underscore, The double arrows indicate identical tape positions in the before and after configurations. ML (move left) BRBR2B...BMB — k>0 ? t FR (find right): BBB 120 tot Bi BRB360 Chapter 12 Numerle Computation FL (find le) pan’ 120 +t Baie (eras) BR BmB.. BB kz I t + BB BB PY: (copy i Big. Bin, BBB BB 1 + + + Bi BRB... BR, BN; BRB... BAB CPY4, (copy through i numbers): Bi BiB... Bi, Biiss -.. Brn BB BB Ok>I + + 1 Bin BiyB... Brig Bygs... BA BR BiB... BAB T (eanslat) BBnB i=0 tt Baba ‘The find macros move the tape head into a position to process the first natural number t0 the right or left of the current position. 
Fy erases a sequence of k natural numbers and halts ‘with the tape head in its original position, ‘The copy machines produce a copy of the designated number of integers. The segment Of the tape on which the copy is produced is assumed to be blank. CPY,, expects a sequence of k +i numbers followed by a blank segment large enough to hold a copy of the first k numbers. The translate macro changes the location of the first natural number to the right of the tape head. A computation terminates with the head in the position it ‘occupied at the beginning of the computation with the translated string to its immediate right, ‘The input to the macro BRN (branch on zero) is a single number. The value of the input is used to determine the halting state of the computation. The branch macro is depicted123 Sequential Operation of Turing Machines 361 oO —faws [a9 n>0 ) ‘The computation of BRN does not alter the tape nor change the position of the tape head. Consequently, it may be run in any configuration BRB. The branch macro is often used in the construction of loops in composite machines. ‘Additional macros can be created using those defined above, The machine O-fer,]-O-[ e J-o-l + boa -O-{ 7 ]-O-Mn]-O interchanges the order of two numbers. The tape configurations for this macro are INT (interchange): BM BBB $ ¢ Biba eB In Exercise 3, you are asked to construct a Turing machine for the macro INT that does not Teave the tape sexment Bri BmB. Example 123.1 The computation ofa machine that evaluates the projection function p') consists of three distinct actions: erasing the initial i ~ 1 arguments, translating the th argument to tape Position one, and erasing the remainder ofthe input. A machine to compute p" can be designed usin the macros FR, FL, Ey. MR, and. 0 E,fO-[7 Ox{Me -O~! re +O! -o{n +o ‘Turing machines defined to compute functions can be used like macros in the design of composite machines, Unlike the computations of the macros, there is no a priori bound ‘on the amount of tape required by a computation of such a machine. Consequently, these ‘machines should be run only when the input is followed by a completely blank tape.362 Chapter 12 Numeric Computation Example 12.3.2 ‘The macros and previously constructed machines can be used to design a Turing machine that computes the function f(n) = 3m. O-fer}-O-[un,}-O-ferv|-o-f a be@xlmt Oxf +o ‘The initial state of the complete machine is that of the macro CPY). The machine A, constructed in Example 12.2.1, adds two natural numbers. A computation with input 7 ‘generates the following sequence of tape configurations. ‘Machine Configuration ‘ane cry, BaBnB MRy BaBRB oy BrBRBRB A BaBRTAB ML; A Bata RB [Note that the addition machine A is run only when its arguments are the two rightmost encoded numbers on the tape. o Example 12.3.3 ‘The one-variable constant function zero defined by z(n) =0, for all n é N, can be built from the BRN macro and the machine D that computes the predecessor function. © es] ESD 0 >] Example 1234 ‘A Turing machine MULT is constructed to compute the multiplication of natural numbers, Macros can be mixed with standard Turing machine transitions when designing a compos- ite machine, The conditions on the initial state of a macro permit the submachine to be123. 
entered upon the processing of a blank from any state. The identification of the start state of a macro with a state qi of the composite machine is depicted

[Diagram: the start state of the macro identified with state qi of the composite machine.]

Since the macro is entered only upon the processing of a blank, transitions may also be defined for state qi with the tape head scanning nonblank tape symbols.

[State diagram of the machine MULT.]

If the first argument is zero, the computation erases the second argument, returns to the initial position, and halts. Otherwise, a computation of MULT adds m to itself n times. The addition is performed by copying m and then adding the copy to the previous total. The number of iterations is recorded by replacing a 1 in the first argument with an X when a copy is made. □

12.4 Composition of Functions

Using the interpretation of a function as a mapping from its domain to its range, we can represent the unary number-theoretic functions g and h by the diagrams

[Diagram: g and h pictured as mappings from their domains to their ranges, and the combined mapping obtained by identifying the range of g with the domain of h.]

A mapping from N to N can be obtained by identifying the range of g with the domain of h and sequentially traversing the arrows in the diagrams. The function obtained by this combination is called the composition of h with g. The composition of unary functions is formally defined in Definition 12.4.1. Definition 12.4.2 extends the notion to n-variable functions.

Definition 12.4.1
Let g and h be unary number-theoretic functions. The composition of h with g is the unary function f : N → N defined by

f(x) = ↑      if g(x)↑
     = ↑      if g(x) = y and h(y)↑
     = h(y)   if g(x) = y and h(y)↓

The composite function is denoted f = h ∘ g.

The value of the composite function f = h ∘ g for input x is written f(x) = h(g(x)). The latter expression is read "h of g of x." The value h(g(x)) is defined whenever g(x) is defined and h is defined for the value g(x). Consequently, the composition of total functions produces a total function.

From a computational viewpoint, the composition h ∘ g consists of the sequential evaluation of the functions g and h. The computation of g provides the input for the computation of h:

Input x → compute g → g(x) → compute h → Result h(g(x))

The composite function is defined only when the preceding sequence of computations can be successfully completed.

Definition 12.4.2
Let g1, g2, . . . , gn be k-variable number-theoretic functions and let h be an n-variable number-theoretic function. The k-variable function f defined by

f(x1, . . . , xk) = h(g1(x1, . . . , xk), . . . , gn(x1, . . . , xk))

is called the composition of h with g1, g2, . . . , gn and written f = h ∘ (g1, . . . , gn). f(x1, . . . , xk) is undefined if either

i) gi(x1, . . . , xk)↑ for some 1 ≤ i ≤ n, or

ii) h is undefined for the values g1(x1, . . . , xk), . . . , gn(x1, . . . , xk).
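Treating an undefined value as None, the compositions of Definitions 12.4.1 and 12.4.2 can be sketched directly in Python. This is only an illustration of the definitions, not a Turing machine construction; compose and the sample functions s, z, and e are assumed names mirroring the successor, zero, and empty functions of Section 12.2.

```python
def compose(h, *gs):
    """h ∘ (g1, ..., gn) as in Definition 12.4.2, with None standing for an undefined value."""
    def f(*xs):
        values = [g(*xs) for g in gs]
        if any(v is None for v in values):   # some gi(x1, ..., xk) is undefined
            return None
        return h(*values)                    # may itself be None when h is undefined there
    return f

# Unary functions mirroring Section 12.2: successor, zero, and the (everywhere undefined) empty function.
s = lambda n: n + 1
z = lambda n: 0
e = lambda n: None

c1 = compose(s, z)                 # the constant function c(n) = 1 obtained by coupling Z and S
assert c1(41) == 1
assert compose(s, e)(5) is None    # composition with the empty function is everywhere undefined
```

As with the coupling of the machines Z and S in Section 12.3, the composite is undefined as soon as any stage of the pipeline is undefined.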
[State diagrams of the machines M1 and M2.]

Both M1 and M2 compute the unary constant function. The two machines differ only in the names given to the states and the markers used during the computation. These symbols have no effect on the result of a computation and hence the function computed by the machine. Since the names of the states and tape symbols other than 1 and B are immaterial, we adopt the following conventions concerning the naming of the components of a Turing machine:

i) The set of states is a finite subset of Q0 = {qi | i ≥ 0}.

ii) The input alphabet is {1}.

iii) The tape alphabet is a finite subset of the set Γ0 = {B, 1, Xi | i ≥ 0}.

iv) The initial state is q0.

The transitions of a Turing machine have been specified using functional notation; the transition defined for state qi and tape symbol x is represented by δ(qi, x) = [qj, y, d]. This information can also be represented by the quintuple

[qi, x, y, d, qj]

whose components are, in order, the current state, the symbol scanned, the symbol to write, the direction of movement, and the new state. With the preceding naming conventions, a transition of a Turing machine is an element of the set T = Q0 × Γ0 × Γ0 × {L, R} × Q0. The set T is countable since it is the Cartesian product of countable sets. The transitions of a deterministic Turing machine form a finite subset of T in which the first two components of every element are distinct. There are only a countable number of such subsets. It follows that the number of Turing computable functions is at most countably infinite. On the other hand, the number of Turing computable functions is at least countably infinite since there are countably many constant functions, all of which are Turing computable by Example 12.4.2.

Theorem 12.5.1
The set of computable number-theoretic functions is countably infinite.

In Section 1.4, the diagonalization technique was used to prove that there are uncountably many total unary number-theoretic functions. Combining this with Theorem 12.5.1, we obtain Corollary 12.5.2.

Corollary 12.5.2
There is a total unary number-theoretic function that is not computable.

Corollary 12.5.2 vastly understates the relationship between computable and uncomputable functions. The former constitute a countable set and the latter an uncountable set.

12.6 Toward a Programming Language

High-level programming languages constitute a well-known family of computational systems. A program defines a mechanistic and deterministic process, the hallmark of algorithmic computation. The intuitive argument that the computation of a program written in a programming language and executed on a computer can be simulated by a Turing machine rests in the fact that a machine (computer) instruction simply changes the bits in some location of memory. This is precisely the type of action performed by a Turing machine, writing 0's and 1's in memory. Although it may take a large number of Turing machine transitions to accomplish the task, it is not at all difficult to envision a sequence of transitions that will access the correct position and rewrite the memory. In this section we will explore the possibility of using the Turing machine architecture as the underlying framework for high-level programming.
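Because a deterministic machine is, under these conventions, nothing more than a finite set of quintuples, its computation can be replayed mechanically from that set, which is the same observation that underlies the universal machine of Section 11.4. The sketch below is a minimal Python illustration, not the string encoding of Section 11.3: run_tm, the dictionary of quintuples, and the small successor-style machine are assumed names and conventions introduced here.

```python
def run_tm(delta, tape_input, max_steps=10_000):
    """Run a standard Turing machine given by quintuples.

    delta maps (current state, symbol scanned) to (symbol to write, direction, new state),
    the quintuple [qi, x, y, d, qj] regrouped as a Python dictionary. The tape is one-way
    infinite; position 0 is blank and the input starts at position 1. Returns the halting
    state and the tape with surrounding blanks trimmed, or None if the machine moves off
    the left end or has not halted within max_steps."""
    tape = {i + 1: c for i, c in enumerate(tape_input)}   # absent cells are blank ("B")
    state, head = "q0", 0
    for _ in range(max_steps):
        symbol = tape.get(head, "B")
        if (state, symbol) not in delta:                  # no transition defined: the machine halts
            cells = [tape.get(i, "B") for i in range(max(tape, default=0) + 2)]
            return state, "".join(cells).strip("B")
        write, direction, state = delta[(state, symbol)]
        tape[head] = write
        head += 1 if direction == "R" else -1
        if head < 0:                                      # abnormal termination at the tape boundary
            return None
    return None                                           # step bound exceeded; M may never halt

# A successor-style machine on unary input: append one 1 and return to the left end.
succ = {
    ("q0", "B"): ("B", "R", "q1"),   # leave position zero and move to the input
    ("q1", "1"): ("1", "R", "q1"),   # scan right across the 1's
    ("q1", "B"): ("1", "L", "q2"),   # write an additional 1
    ("q2", "1"): ("1", "L", "q2"),   # return toward position zero and halt there
}
state, result = run_tm(succ, "111")  # unary representation of 2 (n is written as n+1 ones)
assert result == "1111"              # unary representation of 3
```

A driver of this kind cannot, of course, decide whether the simulated machine halts; the step bound merely cuts off simulations that have not yet terminated.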
The development of a program- ‘ming language based on the Turing machine architecture further demonstrates the power of the Turing machine model which, in turn, provides additional support for the Church- ‘Turing thesis. In developing an assembly language, we use Turing machines and macros to define the operations. The objective ofthis section is not to create a functional assembly language, but rather to demonstrate further the universality of the Turing machine archi- tecture The standard Turing machine provides the computational framework used throughout this section, The assembly language TM is designed to bridge the gap between the Turing machine architecture and the programming languages. The first objective of the assembly language is to provide a sequential description of the actions of the Turing machine, The “program flow” of a Turing machine is determined by the arcs in the state diagram of the machine. The flow of an assembly language program consists of the sequential execution of the instructions unless this pattern is specifically altered by an instruction that redirects, the flow. In assembly language, branch and goto instructions are used to alter sequential program flow. The second objective of the assembly language is to provide instructions that simplify memory management, ‘The underlying architecture of the Turing machine used to evaluate an assembly lan- guage program is pictured in Figure 12.1. The input values are assigned to variables Dis +++ Uk ANE Vey, «- +5 Uq ate the local variables used in the program. The values of the variables are stored sequentially separated by blanks. The input variables are in the standard input position for a Turing machine evaluating a function. A TM program begins by declaring the local variables used in the program, Each local variable is initialized to 0 atthe start of a computation ‘When the initialization is complete, the tape head is stationed at the blank separating the variables from the remainder of the tape. This will be referred to as the home position. Between the evaluation of instructions, the tape head returns to the home position. To the126 Toward a Programming Language 371 TABLE 12.6.1 TM Instructions TM Instruction Interpretation INIT; Initialize local variable v, t0 0 HOME: Move the tape head to the home position when variables are allocated LOAD wt Load value of variable into register r STOR wnt ‘Store value in register into location of , RETURN wy Erase the variables and leave the value of wy in the output pesition CLEAR Erase value in register BRNL, ‘Branch to instruction labeled L if value in register # is 0 GOTOL Execute instruction labeled L NOP [No operation (used in conjunction with GOTO commands) Ince Increment the value of register ¢ DECr Decrement the value of register? ZERO! Replace value in register ¢ with O right of the home position is the Turing machine version of “registers.” The first value to the right is considered to be in register 1, the second value in register 2, etc. The registers must be assigned sequentially; that is, register 7 may be written to or read from only if, registers 1,2,...,4 ~ 1 are assigned values. The instructions of the language TM are given in Table 12.6.1 “The tape initialization is accomplished using the INIT and HOME commands. INIT v reserves the location for local variable v; and initializes the value to 0. Since variables are stored sequentially on the tape, local variables must be initialized in order at the beginning. of a TM program. 
Upon completion of the initialization of the local variables initiation, the HOME instruction moves the tape head to the home position. These instructions are defined by Instruction Definition INITy, HOME! MR, where ZR is the macro that writes the value 0 to the immediate right of the tape head position (Exercise 3). The initialization phase of a program with one input and two local variables would produce the following sequence of Turing machine configurations:372 Chapter 12 Numeric Computation Instruction Configuration iB itr? Bi8dB INIT 3 aiBdROB HOME BinoeoR where i is the value of the input to the computation. The position of the tape head is indicated by the underscore In TM, the LOAD and STOR instructions are used to access and store the values of the variables, The objective ofthese instructions is to make the details of memory management transparent to the user. In Turing machines, there is no upper bound to the amount of tape that may be required to store the value of a variable. The lack of a preassigned limit to the amount of tape alloted to each variable complicates the memory management of a Turing. ‘machine. This omission, however, is intentional, allowing maximum flexibility in Turing machine computations. Assigning a fixed amount of memory to a variable, the standard approach used by conventional compilers, causes an overflow error when the memory required to store a value exceeds the preassigned allocation, The STOR command takes the value from register # and stores it in the specified vari able location, The command may be used only when ¢ is the largest register that has an assigned value. In storing the value of register 1 in a variable vj, the proper spacing is maintained for all the variables. The Turing machine implementation of the store com: mand utilizes the macro INT to move the value in the register to the proper position, The macro INT is assumed to stay within the tape segment BXBYB (Exercise 3). ‘The STOR command is defined by sro. (2) we i)" copes 9) ns MR, ca} RES oe (wer) MR ER, Mua ‘where ¢ > 1 and nis the total number of input and local variables. The exponents n ~ i +1 and n ~ i indicate repetition of the sequence of macros. After the value of register 1 is stored, the register is erased.126 Toward a Programming Language 373 ‘The configurations of a Turing machine obtained by the execution of the instruction STOR v2, | are traced to show the role of the macros in TM memory management, Prior to the execution of the instruction, the tape head is atthe home position. Machine Configuration ML INT MLi INT BU BTBU BB MRi BU BFBTIBUB INT BU BrBUBUSB MR: BUBTBTBUB Ey Boy BrBSBR ‘The Turing machine implementation of the LOAD instructions simply copies the value of variable wy to the specified register. Instruction Definition LOAD wt MLy-rst CPYininie MRy-i+t ‘As mentioned above, o load a value into register ¢ requires registers 1,2, ...,f~ 1 to be filled. Thus the Turing machine must be in configuration Bi, BY, BBY, Bi, B..BE, BF, BB... BF,_, B 4 for the instruction LOAD vj,t to be executed, “The instructions RETURN and CLEAR reconfigure the tape to return the result of the computation, When the instruction RETURN vy is run with the tape head in the home374 Chapter 12 Numeric Computation position and no registers allocated, the tape is rewritten placing the value of vy in the Turing, machine output position. CLEAR simply erases the value inthe register. Instruction Definition RETURN ML, Ei MR; FR Ey itt AL CLEAR MR. 
B Mbit Arithmetic operations alter the values in the registers. INC, DEC, and ZERO are defined by the machines computing the successor, predecessor, and zero functions. Ad- ditional arithmetic operations may be defined for our assembly language by creating a ‘Turing machine that computes the operation, For example, an assembly language instruc- tion ADD could be defined using the Turing machine implementation of addition given by the machine A in Example 12.2.1. The resulting instruction ADD would add the val: ues in registers | and 2 and store the result in register 1. While we could greatly increase the number of assembly language instructions by adding additional arithmetic operations, INC, DEC, and ZERO will be sufficient for purposes of developing our language. ‘The execution of assembly language instructions consists of the sequential operation of the Turing machines and macros that define each of the instructions. The BRN and GOTO instructions interrupt the sequential evaluation by explicitly specifying the next instruction t© be executed, GOTO L indicates that the instruction labeled L is the next to be executed. BRN L, 1 tests register r before indicating the subsequent instruction, If the register is nonzero, the instruction immediately following the branch is executed Otherwise, the statement labeled by L.is executed. The Turing machine implementation of the branch is illustrated by BRNLA “instruction 1 L “instruction 2"126 Toward a Programming Language 375 O— im} =+O—{E} +O EES 10 machine for instruction 2 ‘The value is tested, the register erased, and the machines that define the appropriate in- struction are then executed. Example 12.6.1 ‘The TM program with one input variable and two local variables defined below computes the function f(x1) = 2x1 + 1. The input variable is vy and the computation uses local variables vp and vs, INIT ez INIT oy HOME 3 u LOAD 1,1 INC STOR v4.1 LOAD 1,1 DEC STOR 2,1 GoTOLt LR LOAD v INC STOR 11,1 RETURN 1 ‘The variable v2 is used as a counter, which is decremented each time through the loop defined by the label L1 and the GOTO instruction. In each iteration, the value of vy is incremented. The loop is exited after n iterations, where n is the input. Upon exiting the loop, the value is incremented again and the result 2; + 1 is left on the tape. a ‘The objective of constructing the TM assembly language is to show that instructions ‘of Turing machines, like those of conventional machines, can be formulated as commands376 Chapter 12 Numeric Computation ina higher-level language. Utilizing the standard approach to programming language def- inition and compilation, the commands of a high-level language may be defined by a sequence of the assembly language instructions. This would bring Turing machine com: putations even closer in form to the algorithmic systems most familiar to many of us. Exercises 1. Construct Turing machines with input alphabet {a,b} that compute the specified func tions. ‘The symbols w and v represent arbitrary strings over [a,b)* a fw wy f= {% lenathowiseren ©) flu) =uk flay {% Hienerhe > tengo 2. Construct Turing machines that compute the following number theoreti functions and relations, Do not use macros in the design ofthese machines 2) finy=2043 by haf n) = [n/2) where {xi the greatest integer Iss than oF equal to ©) flnjnasns) =m +m +n venta) =| Hniseven oeainmea[) Hem D iiam={) hee . 0 otherwise 3. Construct Turing machines that perform the actions specified by the macros fisted below. 
The computation should not leave the segment ofthe tape specified inthe input configuration. a) ZR; input BBB, output BOB b) FL; input B78 B, output BABB ©) Ex: input BRBMmB, output BBM "9B 4) T; input BB°%B, output BAB’ B ©) BRN input BB, output BRB 1 INT; input BHBMB, output Br BRB 4, Use the macros and machines constructed in Sections 12.2 through 12.4 to design ‘machines that compute the following functions:10. Exercises 377 a) f(a) =2n+3 b) fm) =n? +2042 ©) f(jengyns) =n) ma tn3 firm) =m* ©) S(mi.naym3) =a + 2m Design machines that compute the following relations. You may use the macros and machines constructed in Sections 12.2 through 12.4 and the machines constructed in Exercise 2 9 erm =| Shere ercatny {1 itis. perfect square © esate) = {5 oterise © dein) = | n> 0, m> Oandm die ‘race the actions of the machine MULT for computations with input a) n=0.m=4 b) n=1m=0 on Describe the mapping defined by each of the following composite functions: a) add 0 (mult o (id,id),add 0 fid.id)) b) pPolsop?,e0p?) ©) mult 0 (add 0 (p'?,5 0 p)) €) mutt 0 (mult 0 (p?,p°?), p') Give examples of total unary aumber-theoretic functions that satisfy the following conditions 2m =2. a) g isnot id and h is not id but g oh =id b) g isnot a constant function and /t is not a constant function but g o/h is @ constant function Give examples of unary number-theoretic functions that satisfy the following condi- tions: a) g isnot one-to-one, his not total, fo g is total b) g Heh Fe,hog =e, where cis the empty function ©) gid, hid, ho g = id, where id is the identity function 4) g is total, h is not one-to-one, ho g = id Let F be a Turing machine that computes a total unary number-theoretie function {f. Design a machine that retums the first natural number m such that (1) =0. A
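The instruction set of Table 12.6.1 can also be exercised at a higher level of abstraction. The Python sketch below interprets a TM-style program directly on integers; it deliberately abstracts away the tape, the home position, and the macro definitions, its registers behave like a small stack, and the 2x1 + 1 program is a re-transcription in these simplified conventions rather than a literal copy of the listing in Example 12.6.1.

```python
def run_tm_program(program, inputs):
    """Interpret a TM-style program (cf. Table 12.6.1) on Python integers.

    program is a list of (label, opcode, args) triples; labels may be None.
    Registers behave like a stack: LOAD pushes, STOR and BRN pop, and INC/DEC
    act on the most recently loaded value, which suffices for this example."""
    variables = list(inputs)            # v1, ..., vk, then the local variables
    registers = []
    labels = {lab: i for i, (lab, _, _) in enumerate(program) if lab}
    pc = 0
    while pc < len(program):
        _, op, args = program[pc]
        pc += 1
        if op == "INIT":
            variables.append(0)         # allocate a local variable initialized to 0
        elif op == "HOME":
            pass                        # tape-head bookkeeping: a no-op at this level
        elif op == "LOAD":
            registers.append(variables[args[0] - 1])
        elif op == "STOR":
            variables[args[0] - 1] = registers.pop()
        elif op == "INC":
            registers[-1] += 1
        elif op == "DEC":
            registers[-1] = max(0, registers[-1] - 1)
        elif op == "BRN":               # erase the register; branch if its value was 0
            if registers.pop() == 0:
                pc = labels[args[0]]
        elif op == "GOTO":
            pc = labels[args[0]]
        elif op == "RETURN":
            return variables[args[0] - 1]
    return None

# A program in the spirit of Example 12.6.1: f(x1) = 2*x1 + 1,
# using v2 as a loop counter and v3 as an accumulator.
double_plus_one = [
    (None, "INIT", ()), (None, "INIT", ()), (None, "HOME", ()),
    (None, "LOAD", (1,)), (None, "STOR", (2,)),                  # v2 := v1
    ("L1", "LOAD", (2,)), (None, "BRN", ("L2",)),                # exit the loop when v2 = 0
    (None, "LOAD", (3,)), (None, "INC", ()), (None, "INC", ()), (None, "STOR", (3,)),  # v3 := v3 + 2
    (None, "LOAD", (2,)), (None, "DEC", ()), (None, "STOR", (2,)),                      # v2 := v2 - 1
    (None, "GOTO", ("L1",)),
    ("L2", "LOAD", (3,)), (None, "INC", ()), (None, "STOR", (3,)),                      # v3 := v3 + 1
    (None, "RETURN", (3,)),
]
assert run_tm_program(double_plus_one, [5]) == 11
```

Running the program on input 5 returns 11, matching the function f(x1) = 2x1 + 1 computed in Example 12.6.1.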