1
Remaining Solutions Previous Homework
  • Decidable languages are closed under
    concatenation
  • Why the idea of bridging the transitions
    doesn't work
  • Proof by splitting the input into parts
  • Why this proof does not work to prove that
    Turing-recognizable languages are closed under
    concatenation
  • How to repair the proof
  • {a^n b^n c^n : n = 1, 2, …} is decidable
  • K-Stack pushdown automata

2
Multi-tape Turing Machines
We add a fixed number of tapes

3
Multi-tape Turing Machines: Transitions
NEW SLIDE
Transitions in a multi-tape Turing machine allow
the heads to stay put where they are. So the general
form of a transition for a 2-tape Turing machine is
δ: Q × (Σ × Σ) → Q × (Σ × Σ) × ({L, R, S} × {L, R, S})
So, for example, the transition
((q, (a, b)), (q, (a, a), (L, S)))
moves the head of the first tape to the left while
keeping the head of the second tape where it was.
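To make the shape of these transitions concrete, here is a minimal sketch in Python (not from the slides; the table delta, the blank symbol '_', and the helper step are illustrative assumptions) that stores 2-tape transitions in exactly this (state, symbol pair) → (state, symbol pair, move pair) form and applies one of them.

# Minimal sketch (not from the slides): store 2-tape transitions as a table
# mapping (state, (sym1, sym2)) to (new_state, (write1, write2), (move1, move2)),
# with moves drawn from {'L', 'R', 'S'}. All names below are illustrative.

BLANK = '_'

# The example transition from above: in state 'q', reading (a, b), write (a, a),
# move head 1 left and keep head 2 where it is.
delta = {
    ('q', ('a', 'b')): ('q', ('a', 'a'), ('L', 'S')),
}

def step(state, tapes, heads):
    """Apply one 2-tape transition if one is defined; return None to halt."""
    symbols = tuple(tape[h] if h < len(tape) else BLANK
                    for tape, h in zip(tapes, heads))
    if (state, symbols) not in delta:
        return None                       # no applicable transition: halt
    new_state, writes, moves = delta[(state, symbols)]
    new_heads = []
    for tape, h, w, m in zip(tapes, heads, writes, moves):
        while h >= len(tape):             # extend the tape with blanks on demand
            tape.append(BLANK)
        tape[h] = w                       # overwrite the scanned cell
        h += {'L': -1, 'R': 1, 'S': 0}[m]
        new_heads.append(max(h, 0))       # keep heads on the tape (sketch only)
    return new_state, new_heads

tapes = [list('ab'), list('ba')]
print(step('q', tapes, [0, 0]))           # reads (a, b) -> ('q', [0, 0]); tape 2 is now 'aa'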
4
Multi-tape Turing Machines vs Turing Machines
[Diagram: the 2-tape machine M2, with tape 1 holding a1 a2 … ai and tape 2 holding b1 b2 … bj, each tape with its own head]
  • We can simulate a 2-tape Turing machine M2 with a
    (single-tape) Turing machine M
  • We can represent the contents of the 2 tapes on
    the single tape by using special symbols
  • We can simulate one transition of M2 by
    constructing multiple transitions in M
  • We introduce several (but finitely many) new states into M

5
Using States to Remember Information
Configuration in a 2-tape Turing Machine M2
[Diagram: M2 in state s; tape 1 holds a b … a b and tape 2 holds b b … a, with the two head positions marked]
  • M2 is in state s
  • The cell pointed to by the first head of M2 contains a b
  • The cell pointed to by the second head of M2 contains an a

6
Using States to Remember Information (2)
How many states are there in M?
Yes, we need a large number of states for M, but that
number is finite!
7
Configuration in a 2-tape Turing Machine M2
[Diagram: M2 in state s; tape 1 holds a b … a b and tape 2 holds b b … a, with the two head positions marked]
Equivalent configuration in the Turing Machine M:
State in M: s_{b1,a2} (the state s together with the
scanned symbols, b on tape 1 and a on tape 2)
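A minimal sketch (using Python tuples as an assumed encoding, not the slides' notation) of how composite states such as s_{b1,a2} can be enumerated; it also shows why the number of states of M is large but finite.

# Minimal sketch (assumed encoding): the composite state s_{b1,a2} of M packs
# the original M2 state together with the symbols scanned by the two heads.
from itertools import product

states_M2 = {'s', 'q'}                 # hypothetical states of M2
alphabet  = {'a', 'b', '_'}            # '_' stands for the blank symbol

# One composite state per (M2 state, symbol under head 1, symbol under head 2).
composite_states = set(product(states_M2, alphabet, alphabet))

print(len(composite_states))                 # 2 * 3 * 3 = 18: large, but finite
print(('s', 'b', 'a') in composite_states)   # True -- the state written s_{b1,a2}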
8
Simulating M2 with M
  • The alphabet Σ of the Turing machine M extends the
    alphabet Σ2 of M2 by adding separator symbols that
    delimit the two tape regions (plus an end marker),
    and two mark symbols that flag the cells currently
    scanned by the two heads (see the encoding sketch
    below)
  • We introduce more states for M, one for each tuple
    p_{σ1,τ2}, where p is a state in M2 and σ1, τ2
    indicate that the head of the first tape points to
    σ and the head of the second tape points to τ
  • We also need states of a similar form for control
    purposes
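The flattened encoding might look like the following minimal sketch (the concrete separators '<1', '1>', '<2', '2>', '$' and the mark '^' are placeholders chosen for illustration; in the actual construction each separator and mark is a single fresh symbol of Σ).

# Minimal sketch (placeholder symbols): flatten a 2-tape configuration of M2
# onto the single tape of M, delimiting the two tape regions with separators
# and placing a mark in front of each currently scanned cell.

SEP1, SEP2, SEP3, SEP4, END = '<1', '1>', '<2', '2>', '$'   # assumed separators
MARK = '^'                                                  # assumed head mark

def encode(tape1, head1, tape2, head2):
    """Flatten a 2-tape configuration into one single-tape string."""
    def region(tape, head):
        cells = []
        for i, sym in enumerate(tape):
            if i == head:
                cells.append(MARK)          # flag the scanned cell
            cells.append(sym)
        return ''.join(cells)
    return (SEP1 + region(tape1, head1) + SEP2 +
            SEP3 + region(tape2, head2) + SEP4 + END)

print(encode('abab', 1, 'bba', 2))          # -> '<1a^bab1><2bb^a2>$'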

9
Simulating transitions in M2 with M
10
Simulating transitions in M2 with M (2)
  • To check whether a transition (q, (σ, τ), …) is
    applicable, we scan forward from the first cell
    (see the sketch below)
  • If the move is R (or L), we move the marker to the
    right (left)
  • To overwrite characters, M must first determine
    the correct position
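A minimal sketch (reusing the placeholder encoding above) of the first step: scanning the single tape left to right to read off the symbols under the two simulated heads, which is how M checks whether a transition of M2 is applicable.

# Minimal sketch: scan the flattened tape and collect the symbol that follows
# each head mark; these are the symbols currently scanned by the two heads.

def scanned_symbols(encoded, mark='^'):
    """Return the pair of symbols under the two simulated heads."""
    symbols = []
    i = 0
    while i < len(encoded):
        if encoded[i] == mark:
            symbols.append(encoded[i + 1])   # the marked (scanned) cell
            i += 2
        else:
            i += 1
    return tuple(symbols)

print(scanned_symbols('<1a^bab1><2bb^a2>$'))   # -> ('b', 'a')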

11
[Diagram: M starts in state s; its single tape holds the encoded contents of both tapes of M2, delimited by the separator symbols and with the mark symbols flagging the scanned cells; after reading the symbol marked in the first tape region, M moves to the composite state s_{b1}]
12
Multi-tape Turing Machines vs Turing Machines
(final)
  • We conclude that 2-tape Turing machines can be
    simulated by Turing machines. Thus, they don't
    add computational power!
  • Using a similar construction we can show that
    3-tape Turing machines can be simulated by 2-tape
    Turing machines (and thus, by Turing machines).
  • Thus, k-tape Turing machines can be simulated by
    Turing machines

13
Implications
  • If we show that a function can be computed by a
    k-tape Turing machine, then the function is
    Turing-computable
  • In particular, if a language can be decided by a
    k-tape Turing machine, then the language is
    decidable

Example: Since we constructed a 2-tape TM that
decides L = {a^n b^n : n = 0, 1, 2, …}, L is
decidable.
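A minimal sketch in Python of the 2-tape idea (only an illustration, not the course's actual machine): tape 2 collects one symbol for every a read from the input, and the b's are then matched against tape 2.

# Minimal sketch of the 2-tape strategy for L = {a^n b^n : n >= 0}:
# phase 1 copies one marker to tape 2 for every 'a' on tape 1;
# phase 2 consumes one marker from tape 2 for every 'b'.

def decide_anbn(w):
    tape2 = []                             # plays the role of the second tape
    i = 0
    while i < len(w) and w[i] == 'a':      # phase 1: copy the a's
        tape2.append('a')
        i += 1
    while i < len(w) and w[i] == 'b':      # phase 2: match the b's
        if not tape2:
            return False                   # more b's than a's
        tape2.pop()
        i += 1
    # accept iff the whole input was read and every 'a' was matched
    return i == len(w) and not tape2

for w in ['', 'ab', 'aabb', 'aab', 'ba', 'abab']:
    print(repr(w), decide_anbn(w))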
14
Summary of Previous Class
  • There are languages that are not decidable
  • (we have not proved this yet)
  • Why not extend Turing machines just as we did
    with finite automata and pushdown automata?
  • First try: multi-tape Turing machines
  • More convenient than Turing machines
  • But multi-tape Turing machines can be simulated
    with Turing machines
  • Therefore, anything we can do with a multi-tape
    Turing machine can be done with a Turing machine
  • On the bright side, we can use this to our
    advantage!
  • Second try: k-stack pushdown automata
  • But k-stack pushdown automata are equivalent to
    Turing machines for k ≥ 2 (see the sketch below)
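The core of the k ≥ 2 claim is that two stacks can represent a tape together with its head. A minimal sketch of that standard idea (not taken from these slides; names and symbols are illustrative):

# Minimal sketch: a tape with a head is represented by two stacks. 'left'
# holds the cells to the left of the head (top = nearest cell); 'right'
# holds the scanned cell and everything to its right (top = scanned cell).

BLANK = '_'

def read(left, right):
    return right[-1] if right else BLANK

def write(left, right, sym):
    if right:
        right[-1] = sym
    else:
        right.append(sym)

def move_left(left, right):
    right.append(left.pop() if left else BLANK)

def move_right(left, right):
    left.append(right.pop() if right else BLANK)

# Tape 'ab' with the head on 'a': left is empty, right is ['b', 'a'] (top last).
left, right = [], ['b', 'a']
write(left, right, 'c')        # tape is now 'cb', head still on the first cell
move_right(left, right)        # head moves to 'b'
print(read(left, right))       # -> 'b'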

15
Third Try: Multi-Head Turing Machines
update
16
Configuration in a 2-head Turing Machine
[Diagram: a single tape holding a b a b with two heads positioned on it; the machine is in state s]
And we use states to remember the head positions: s_{b1,b2}
17
Configuration in a 2-head Turing Machine
[Diagram: a single tape holding a b … a b with two heads positioned on it; the machine is in state s]
18
Fourth Try: Nondeterministic Turing Machines
Reminder: when is a word accepted by a
nondeterministic automaton?
Given an NFA A, a string w ∈ Σ* is accepted by A
if at least one of the configurations yielded by
(s, w) is a configuration of the form (f, e), with f
a final state:
(s, w) ⊢ (p, w′) ⊢ … ⊢ (q, w″) with w″ ≠ e
(s, w) ⊢ (p, w′) ⊢ … ⊢ (q, e) with q ∉ F
(s, w) ⊢ (p, w′) ⊢ … ⊢ (f, e) with f ∈ F
⋮
We will do something similar for nondeterministic
TMs
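As a refresher, here is a minimal sketch of that acceptance condition (a toy in Python, with an assumed transition-table format and no e-moves): w is accepted iff some chain of configurations starting from (s, w) ends in a configuration (f, e) with f ∈ F.

# Minimal sketch (toy NFA, assumed delta format): explore all configurations
# level by level and accept if any of them consumes the input in a final state.

def nfa_accepts(w, start, finals, delta):
    """delta maps (state, symbol) to a set of next states."""
    configs = {(start, w)}
    while configs:
        next_configs = set()
        for q, rest in configs:
            if rest == '' and q in finals:
                return True                       # reached a configuration (f, e)
            if rest:
                for p in delta.get((q, rest[0]), ()):
                    next_configs.add((p, rest[1:]))
        configs = next_configs
    return False

# Toy usage: an NFA over {a, b} accepting strings that end in 'ab'.
delta = {('s', 'a'): {'s', 'p'}, ('s', 'b'): {'s'}, ('p', 'b'): {'f'}}
print(nfa_accepts('aab', 's', {'f'}, delta))   # True
print(nfa_accepts('aba', 's', {'f'}, delta))   # False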
19
Deciding, Recognizing Languages with
Nondeterministic Turing machines
Definition. A nondeterministic Turing machine NM
decides a language L (L is said to be decidable)
if:
  • 1. If w ∈ L, then
  • at least one possible computation terminates in
    an accepting configuration
  • no computation ends in a rejecting state
  • 2. If w ∉ L, then
  • no computation ends in an accepting state
  • at least one possible computation terminates in
    a rejecting configuration

Definition. A Turing machine recognizes a
language L if it meets condition 1 above (L is
said to be Turing-recognizable)
20
Example of Nondeterministic TM
Find a nondeterministic TM recognizing the
language aabb
Solution: [Diagram of the nondeterministic machine ML, ending in the accept state]
21
Nondeterministic TMs can be Simulated by TMs
Theorem. If a nondeterministic Turing machine,
NM, recognizes a language L, then there is a
Turing machine, M, recognizing the language L.
Theorem. If a nondeterministic Turing machine,
NM, decides a language L, then there is a Turing
machine, M, deciding the language L.
22
Nondeterministic TMs can be Simulated by TMs
Idea
Let w be the input.
  • Simulate all computations of NM by exploring
    them in breadth-first order (see the sketch
    below)
[Diagram: the tree of configurations of NM rooted at
the initial configuration C0, with children C11, C12,
C13, …, C1n at the first level, C21, C22, … at the
second level, and so on; M explores this tree level
by level]
  • If w ∈ L, one of the Cij is an accepting
    configuration, which will therefore also be
    found by M
  • A similar argument can be made for w ∉ L
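Below is a minimal sketch in Python of this breadth-first exploration (a toy: the successors function, which returns every configuration reachable in one step, and the step bound that makes the example terminate are assumptions for illustration).

# Minimal sketch: breadth-first search over the configurations of a
# nondeterministic machine; accept as soon as any branch accepts.
from collections import deque

def bfs_accepts(initial_config, successors, is_accepting, max_steps=10_000):
    queue = deque([initial_config])
    explored = 0
    while queue and explored < max_steps:
        config = queue.popleft()            # breadth-first: oldest level first
        explored += 1
        if is_accepting(config):
            return True                     # one accepting branch is enough
        queue.extend(successors(config))    # enqueue all nondeterministic choices
    return False                            # no accepting branch found (so far)

# Toy usage: nondeterministically append bits; accept on the string '101'.
succ = lambda s: [s + '0', s + '1'] if len(s) < 3 else []
print(bfs_accepts('', succ, lambda s: s == '101'))   # True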
23
Context-Free Languages are Decidable
  • Let L be a context-free language
  • Let A be a pushdown automaton accepting L
  • We simulate the pushdown automaton A using a
    2-tape nondeterministic Turing machine M. The
    second tape is used for simulating the stack
  • The head of the first tape points to the next
    character to be processed, and the head of the
    second tape points to the last element pushed
    (the top of the stack)

24
Context-Free Languages are Decidable
For each transition ((q, α, β), (q′, γ)) in A, we
construct several transitions in M performing the
following steps (see the sketch below):
  • Check whether the next input character is α (check
    if the head of Tape 1 points to α, and move the
    head to the right)
  • Check whether the top of the stack is β (check if
    the head of Tape 2 points to β, write a blank, and
    move the head to the left)
  • If 1 and 2 hold, push γ onto tape 2 (move the head
    to the right and write γ)
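A minimal sketch of how these three steps could be carried out (hypothetical helper names and a simplified transition format ((q, α, β), (q′, γ)), where an empty string means "read/pop/push nothing"); a Python list plays the role of tape 2 holding the stack.

# Minimal sketch: apply one PDA transition on the simulating machine,
# where tape 1 is the remaining input and tape 2 is the stack.

def apply_pda_transition(state, remaining_input, stack, transition):
    """Return (new_state, remaining_input, stack), or None if not applicable."""
    (q, alpha, beta), (q2, gamma) = transition
    if state != q:
        return None
    if alpha:                                    # step 1: next input char must be alpha
        if not remaining_input or remaining_input[0] != alpha:
            return None
        remaining_input = remaining_input[1:]    # move the tape-1 head right
    if beta:                                     # step 2: top of the stack must be beta
        if not stack or stack[-1] != beta:
            return None
        stack = stack[:-1]                       # erase it (pop)
    if gamma:                                    # step 3: push gamma on tape 2
        stack = stack + [gamma]
    return q2, remaining_input, stack

# Toy usage with a hypothetical transition ((q, 'a', ''), (q, 'A')): read 'a', push 'A'.
print(apply_pda_transition('q', 'aab', [], (('q', 'a', ''), ('q', 'A'))))
# -> ('q', 'ab', ['A'])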

25
Possible Extensions of Turing Machines
[Diagram: possible extensions of Turing machines, grouped into physical and computational extensions]
None of these extensions adds computational power
to Turing Machines.
26
Homework for Friday
  • 1. Please provide a secret nickname, so I can
    post all grades on the web site.
  • 2. Prove that the function z = x + y is
    decidable. Assume that x, y and z are binary
    numbers. Hint: use a 3-tape Turing machine as
    follows:
  • Tape 1: x
  • Tape 2: y
  • Tape 3: z
  • 3. Suppose that a language L is enumerated by an
    enumerator Turing machine (we say that L is
    Turing-enumerable).
  • Prove that L must be Turing-recognizable
  • Can we prove that L is decidable? Provide the
    proof or explain why not
  • 4. Problem 3.14 (note that there are 2 parts)