
CS310: Automata Theory 2020

Lecture 27: Turing Machines and Computability

Instructor: S. Akshay

IITB, India

Spring 2020

This lecture

Turing Machines and Computability

In this lecture we will cover:
- Variants of Turing Machines
- Encoding descriptions of Turing Machines as strings
- Turing recognizability and decidability

Reading Material
- Sections 8.4, 8.5, Introduction to Automata Theory, Languages and Computation, Hopcroft, Motwani and Ullman
- Sections 3.2, 4.1, Introduction to the Theory of Computation, Michael Sipser


Recall: Definition of Turing Machines

Definition
A Turing Machine is a 7-tuple (Q, Σ, Γ, δ, q0, qacc, qrej) where
1. Q is a finite set of states,
2. Σ is a finite input alphabet,
3. Γ is a finite tape alphabet, where the blank symbol ⊔ ∈ Γ and Σ ⊆ Γ,
4. q0 is the start state,
5. qacc is the accept state,
6. qrej is the reject state,
7. δ : Q × Γ → Q × Γ × {L, R} is the transition function.
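The 7-tuple definition translates directly into a small simulator. Below is a minimal sketch, assuming a sparse-tape representation and a single-sided infinite tape; the names (BLANK, run_tm) and the example machine (accept binary words ending in 1) are illustrative, not from the lecture.

```python
# A minimal sketch of a deterministic Turing machine simulator,
# following the 7-tuple definition above.
BLANK = "_"

def run_tm(delta, q0, q_acc, q_rej, word, max_steps=10_000):
    """Simulate a TM; returns True/False, or None if no verdict in max_steps."""
    tape = dict(enumerate(word))      # sparse tape: unvisited cells are blank
    q, head = q0, 0
    for _ in range(max_steps):
        if q == q_acc:
            return True
        if q == q_rej:
            return False
        a = tape.get(head, BLANK)
        q, b, move = delta[(q, a)]    # deterministic: exactly one move
        tape[head] = b
        head += 1 if move == "R" else -1
        head = max(head, 0)           # left end of a single-sided tape: stay
    return None                       # did not halt within the step budget

# Hypothetical example: accept words over {0,1} that end in 1.
delta = {
    ("q0", "0"): ("q0", "0", "R"),
    ("q0", "1"): ("q1", "1", "R"),
    ("q1", "0"): ("q0", "0", "R"),
    ("q1", "1"): ("q1", "1", "R"),
    ("q0", BLANK): ("qrej", BLANK, "R"),
    ("q1", BLANK): ("qacc", BLANK, "R"),
}
print(run_tm(delta, "q0", "qacc", "qrej", "0101"))  # True
print(run_tm(delta, "q0", "qacc", "qrej", "0100"))  # False
```

The max_steps budget is only a practical guard: as the next slides stress, a real TM computation may genuinely never halt.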

Recall: configurations

A configuration is a snapshot of the machine during computation. It is described by the current state, the tape contents, and the current head location.

Notation: C = u q v, where uv is the tape content, the current state is q, and the head is on the first symbol of v.

We define "C1 yields C2" if the TM can move from C1 to C2 in one step:
- left move: u a qi b v yields u qj a c v if δ(qi, b) = (qj, c, L)
- right move: u a qi b v yields u a c qj v if δ(qi, b) = (qj, c, R)
- right end: we treat u a qi as the same as u a qi ⊔, since the tape has blanks to the right.
- left end (for a single-sided infinite tape): qi b v yields (1) qj c v if the transition moves left, or (2) c qj v if it moves right.
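The yields relation can be sketched as a step function on triples (u, q, v). The triple representation, BLANK, and the left-end clamping below are illustrative coding choices, not the lecture's notation.

```python
# A sketch of the one-step "yields" relation on configurations C = u q v,
# with the right-end and left-end conventions of the slide above.
BLANK = "_"

def yields(config, delta):
    """One step of a deterministic TM: (u, q, v) -> successor configuration."""
    u, q, v = config
    if v == "":                       # right end: u q is the same as u q _
        v = BLANK
    b, rest = v[0], v[1:]
    qj, c, move = delta[(q, b)]
    if move == "R":                   # u a qi b v  yields  u a c qj v
        return (u + c, qj, rest)
    if u == "":                       # left end: head stays on cell 0
        return ("", qj, c + rest)
    return (u[:-1], qj, u[-1] + c + rest)  # left move: u a qi b v -> u qj a c v

# Hypothetical one-rule machine: in q0 reading 1, write 1, move right.
delta = {("q0", "1"): ("q1", "1", "R")}
print(yields(("", "q0", "1"), delta))   # ('1', 'q1', '')
```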


Recall: computation of a Turing Machine

- We define start (q0 w), accepting (∗ qacc ∗), rejecting (∗ qrej ∗) and halting configurations.
- A Turing Machine computation on a given input may not halt!
- A TM M accepts input word w if there exists a sequence of configurations C1, C2, ..., Ck (called a run) such that
  - C1 is the start configuration,
  - each Ci yields Ci+1,
  - Ck is an accepting configuration.
- The language of TM M, denoted L(M), is the set of strings accepted by it.


Recall: Turing recognizable and decidable languages

Turing recognizable or recursively enumerable (r.e.)
A language is Turing recognizable (or r.e.) if there is a Turing machine accepting it.

Turing decidable or recursive
A language is decidable (or recursive) if there is a Turing machine accepting it which, in addition, halts on all possible inputs.

Every decidable language is Turing recognizable, but is the converse true?


Variants of a Turing Machine

- Multi-tape TMs
- Non-deterministic TMs
- Multi-head TMs
- Single-sided vs. double-sided infinite tape TMs
- ...

What are the relative expressive powers? Do we get something strictly more powerful than standard TMs?

Multi-tape to single-tape TM

Definition
A multitape TM is a TM with several tapes, each having its own head for reading and writing. The input is on the first tape; the others start blank.

Formally, δ : Q × Γ^k → Q × Γ^k × {L, R}^k, where k is the number of tapes, e.g.,
δ(qi, a1, ..., ak) = (qj, b1, ..., bk, L, R, ..., L)

Theorem
Every multi-tape TM has an "equivalent" single-tape TM.

Proof idea:
- Keep a special marker # to separate the contents of the tapes.
- Keep a marked copy of each alphabet symbol to record where the different heads are.
- When you encounter # during the simulation, shift cells to make space.
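The proof idea's single-tape layout can be sketched concretely. Below, tapes are separated by '#' and each head position is marked by a "dotted" symbol (here, a '.' prefix); this concrete format is an illustrative choice, not the books' exact notation.

```python
# A sketch of encoding a multi-tape configuration on one tape:
# '#' separates tapes, '.' before a symbol marks a head position.
def encode(tapes, heads):
    """Encode k tape contents with k head positions on a single tape."""
    parts = []
    for tape, h in zip(tapes, heads):
        cells = [("." + a if i == h else a) for i, a in enumerate(tape)]
        parts.append("".join(cells))
    return "#" + "#".join(parts) + "#"

# Two tapes; heads on cell 1 of the first tape and cell 0 of the second:
print(encode(["01", "ab"], [1, 0]))   # '#0.1#.ab#'
```

The simulating single-tape TM sweeps this string to read all marked symbols, then sweeps again to apply the multi-tape move, shifting cells right whenever a tape needs to grow past its '#'.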


Non-deterministic TMs

At any point in the computation, the TM may proceed according to several possibilities. Thus the transition function has the form:
δ : Q × Γ → 2^(Q × Γ × {L, R})

Theorem
Every non-deterministic TM is equivalent to a deterministic TM.

Proof idea:
1. View the NTM N's computation on an input as a tree.
2. Explore this tree using BFS and, for each node (i.e., configuration) encountered, check whether it is accepting.

For more details, see Theorem 3.16 (Section 3.2) in Sipser's book or Theorem 8.4.4 in Hopcroft-Motwani-Ullman.
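The BFS idea above can be sketched directly: keep a queue of configurations, pop in breadth-first order, and accept as soon as an accepting configuration appears. Configurations reuse a (u, q, v) triple form; the step budget and the guess-and-check example machine are illustrative assumptions.

```python
# A sketch of determinization by BFS over the NTM's configuration tree.
from collections import deque

BLANK = "_"

def ntm_accepts(delta, q0, q_acc, word, max_steps=10_000):
    """delta maps (q, a) to a *set* of (q', b, move) triples."""
    frontier = deque([("", q0, word)])     # level-by-level exploration
    for _ in range(max_steps):
        if not frontier:
            return False                   # tree exhausted: no accepting run
        u, q, v = frontier.popleft()
        if q == q_acc:
            return True                    # some branch accepts
        b = v[0] if v else BLANK
        rest = v[1:] if v else ""
        for qj, c, move in delta.get((q, b), ()):
            if move == "R":
                frontier.append((u + c, qj, rest))
            elif u:
                frontier.append((u[:-1], qj, u[-1] + c + rest))
            else:
                frontier.append(("", qj, c + rest))  # left end: head stays
    return None                            # no verdict within the budget

# Guess-and-check: nondeterministically accept any word containing a 1.
delta = {
    ("q0", "0"): {("q0", "0", "R")},
    ("q0", "1"): {("q0", "1", "R"), ("qacc", "1", "R")},
}
print(ntm_accepts(delta, "q0", "qacc", "001"))  # True
print(ntm_accepts(delta, "q0", "qacc", "000"))  # False
```

BFS (rather than DFS) matters: one infinite branch must not starve an accepting branch elsewhere in the tree.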


Decidable languages

- A TM accepts language L if it has an accepting run on each word in L.
- A TM decides language L if it accepts L and halts on all inputs.

Decidable and Turing recognizable languages
- A language L is decidable (recursive) if there exists a Turing machine M which decides L (i.e., M halts on all inputs and M accepts L).
- A language L is Turing recognizable (recursively enumerable) if there exists a Turing machine M which accepts L.

Algorithms and Decidability

Algorithms ⇐⇒ Decidable (i.e., a TM decides it)

- A decision problem P is said to be decidable (i.e., to have an algorithm) if the language L of all yes instances of P is decidable.
- A decision problem P is said to be semi-decidable (i.e., to have a semi-algorithm) if the language L of all yes instances of P is r.e.
- A decision problem P is said to be undecidable if the language L of all yes instances of P is not decidable.

Examples of Decidable languages and problems

- (Acceptance problem for DFA) Given a DFA, does it accept a given word?
- L_ADFA = {<B, w> | B is a DFA that accepts input word w}
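A decider for L_ADFA simply runs the DFA on w: it makes exactly |w| moves and then halts with the answer. A minimal sketch, where the Python dictionary stands in for the encoding <B, w> (an illustrative choice):

```python
# Decider for the DFA acceptance problem: always halts, since a DFA
# makes exactly len(w) moves on input w.
def dfa_accepts(delta, q0, finals, w):
    """Run DFA B = (delta, q0, finals) on w and report acceptance."""
    q = q0
    for a in w:
        q = delta[(q, a)]             # DFA: exactly one successor state
    return q in finals

# Hypothetical DFA over {0,1} accepting words with an even number of 1s:
delta = {("e", "0"): "e", ("e", "1"): "o",
         ("o", "0"): "o", ("o", "1"): "e"}
print(dfa_accepts(delta, "e", {"e"}, "101"))  # True
print(dfa_accepts(delta, "e", {"e"}, "1"))    # False
```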


But how do we encode <B>? And, in general, how do we encode a TM?

Encoding Turing machines as strings

Every Turing machine can be encoded as a string in {0,1}∗
- Just encode the description of the machine (in binary)!
- That is, encode the 7-tuple (states, alphabet, transition function, etc.) as one long binary string.
- See Section 3.3 of Sipser's book and Section 9.1.2 of Hopcroft, Motwani and Ullman's book.

Every string in {0,1}∗ is a TM
Of course, some strings do not represent a valid encoding; we map those to a TM that does nothing.

Notation
- M → <M>, a string representation of M.
- α → Mα, a TM representation of a string.


How many Turing machines are there?

Consider the set S of all Turing Machines. Is this set countable?

Recall: countable sets
- A set is countable if there is an injective map from the set to N.
- The set {0,1}∗ is countable.

A corollary
- From the previous slide, each TM in S corresponds to a string in {0,1}∗.
- The number of strings is countable.
- Hence S is countable, i.e., there are countably many Turing machines. It also follows that the set of Turing recognizable languages is countable!
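The countability of {0,1}∗ is witnessed by a concrete enumeration: list strings in length-then-lexicographic order, which pairs each string with a unique natural number. A small sketch:

```python
# {0,1}* is countable: enumerate strings by length, then lexicographically.
# The position of a string in this list is an injective map into N.
from itertools import count, product

def binary_strings():
    """Enumerate {0,1}* : '', '0', '1', '00', '01', '10', '11', ..."""
    for n in count():
        for bits in product("01", repeat=n):
            yield "".join(bits)

gen = binary_strings()
print([next(gen) for _ in range(7)])  # ['', '0', '1', '00', '01', '10', '11']
```

Composing this with the TM-to-string encoding gives the injective map from S into N claimed in the corollary.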


Examples of Decidable languages and problems

- (Acceptance problem for DFA) Given a DFA, does it accept a given word?
- L_ADFA = {<B, w> | B is a DFA that accepts input word w}
- (Emptiness problem for DFA) Given a DFA, does it accept any word?
- L_EDFA = {<B> | B is a DFA, L(B) = ∅}
- (Equivalence problem for DFA) Given two DFAs, do they accept the same language?
- L_EQDFA = {<A, B> | A, B are DFAs, L(A) = L(B)}
- What about NFAs and regular expressions?
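The emptiness problem reduces to graph reachability: L(B) is nonempty iff some final state is reachable from the start state, and a search over the finite state set always halts. A minimal sketch over the same dictionary representation of a DFA used earlier (an illustrative encoding):

```python
# Decider for DFA emptiness via reachability: explore the state graph
# and report whether any final state is reachable from the start state.
def dfa_nonempty(delta, q0, finals):
    """True iff L(B) is nonempty; halts since the state set is finite."""
    seen, stack = {q0}, [q0]
    while stack:
        q = stack.pop()
        if q in finals:
            return True
        for (p, a), r in delta.items():   # follow every outgoing edge of q
            if p == q and r not in seen:
                seen.add(r)
                stack.append(r)
    return False

delta = {("s", "0"): "s", ("s", "1"): "f", ("f", "0"): "f", ("f", "1"): "f"}
print(dfa_nonempty(delta, "s", {"f"}))   # True: '1' is accepted
print(dfa_nonempty(delta, "s", set()))   # False: no final states at all
```

Equivalence then follows by the product construction: A and B are equivalent iff the symmetric-difference DFA accepts the empty language.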


Relationship among languages

Regular ⊊ Decidable ⊆? Turing recognizable ⊆? All languages
DFA/NFA < Algorithms / halting TMs ≤? Semi-algorithms / TMs

(The "?" marks the inclusions whose strictness we have not yet established.)

