Why Language Acquisition Is A Snap
Stephen Crain and Paul Pietroski
Abstract
Nativists inspired by Chomsky are apt to provide arguments with the follow-
ing general form: languages exhibit interesting generalizations that are not
suggested by casual (or even intensive) examination of what people actually
say; correspondingly, adults (i.e., just about anyone above the age of four)
know much more about language than they could plausibly have learned on
the basis of their experience; so absent an alternative account of the relevant
generalizations and speakers’ (tacit) knowledge of them, one should conclude
that there are substantive “universal” principles of human grammar and, as a
result of human biology, children can only acquire languages that conform to
these principles. According to Pullum and Scholz, linguists need not suppose
that children are innately endowed with “specific contingent facts about nat-
ural languages.” But Pullum and Scholz don’t consider the kinds of facts that
really impress nativists. Nor do they offer any plausible acquisition scenarios
that would culminate in the acquisition of languages that exhibit the kinds of
rich and interrelated generalizations that are exhibited by natural languages.
As we stress, good poverty-of-stimulus arguments are based on specific princi-
ples – confirmed by drawing on (negative and crosslinguistic) data unavailable
to children – that help explain a range of independently established linguistic
phenomena. If subsequent psycholinguistic experiments show that very young
children already know such principles, that strengthens the case for nativism;
and if further investigation shows that children sometimes “try out” construc-
tions that are unattested in the local language, but only if such constructions
are attested in other human languages, then the case for nativism is made
stronger still. We illustrate these points by considering an apparently disparate
– but upon closer inspection, interestingly related – cluster of phenomena in-
volving: negative polarity items, the interpretation of ‘or’, binding theory, and
displays of Romance and Germanic constructions in Child-English.
1. Introduction
Before getting down to brass tacks, let us sketch two perspectives on the topic
of this special issue. The first is a Chomskian view, which we endorse. On
this view, human languages exhibit interesting and unexpected generalizations.
The linguist’s job is to find them and provide theories that explain why these
generalizations hold. However, the utterances speakers make, along with the
conversational contexts in which they make them, do not reveal the theoreti-
cally interesting linguistic phenomena – much less the deeper principles that
help unify and account for these phenomena. Moreover, facts about expres-
sions that speakers don’t use and the meanings that speakers don’t assign to
(otherwise well-formed) expressions are at least as important as facts about
what speakers actually say, and the meanings they actually assign. As gen-
eralizations over this large initial data set emerge, across various languages,
linguists propose and test hypothesized grammatical principles that would (in
the course of normal human experience) give rise to languages that exhibit the
phenomena being described. But inquiry is hard. Many of the relevant facts
appear to be contingencies of human psychology, which may well have been
shaped in part by demands imposed on it by the kinds of signals and interpre-
tations that human minds can process; and little is known about these demands
on human psychology, especially on the meaning side of the equation. More-
over, the space of logically possible grammatical principles is immense. The
principles discovered so far describe a tiny fraction of the space of possible
languages, despite having impressive empirical coverage with regard to actual
human languages. So theorists search for constraints that would allow for only
the relatively small number of languages that humans can naturally acquire and
use.
Fortunately, linguists can draw upon a vast amount of positive and nega-
tive evidence – both within and across languages – and they can consider all
of the evidence at once. Nevertheless, there remains a familiar tension between
explanatory power and descriptive adequacy: good theories do not merely sum-
marize observed facts; and in a complex world, any nontrivial generalization
introduces an “idealization gap” between theory and data. As in all areas of
scientific enquiry, considerations of simplicity and theoretical economy are
relevant in linguistic theories. In addition, linguistic theories must be com-
patible with observations about the nature of children’s cognitive abilities, as
well as their histories of linguistic stimulation. For all normal children acquire
adult linguistic competence despite the considerable latitude in environmental
input to different children. So any principles posited as descriptions of lan-
guages spoken by adults must be such that they are acquirable by any normal
child who undergoes the kind of experience that the corresponding adults un-
derwent. Any principles that are posited must also be learnable in the time
1. We use ‘grammar’ to talk about psychological properties of speakers. If one finds this termi-
nology objectionable, one can substitute Chomsky’s (1981) term ‘I-grammar’ (internal gram-
mar).
2. Perhaps we are not criticizing the rebuttal by Pullum and Scholz of what they take to be
the argument from the poverty-of-stimulus (APS). But if so, then so much the worse for
them in focusing on that particular version of the APS. No charitable reader of Chomsky
could think that his arguments for nativism are supposed to be independent of the detailed
grammatical theories that he has defended. Indeed, we find it hard to see how one could
advance an interesting version of linguistic nativism that is independent of specific claims
about the grammatical knowledge of adults: until one has a firm grip on what adult linguistic
competence is like, one can’t even begin to hypothesize about the cognitive equipment that
children would need (in addition to their experience) to achieve adult-like states.
to exhibit. On the first view, there are natural seams (or parameters) of natural
language, and child speech should follow these seams, even when it diverges
from the speech of adults. Children will, under the pressure of experience, ex-
plore some part of the space of humanly possible languages; but they will never
“try out” a language that violates core principles of Universal Grammar. By
contrast, given what Pullum and Scholz say, the obvious prediction is that chil-
dren’s constructions should simply be less articulated ones than those of adults.
Children should initially try out “simple” construction types that may need to
be refined in light of experience. As we’ll see in Section 3 below, the evidence
from studies of child language favors the nativist view and resists explanation on
the view taken by Pullum and Scholz.
2. Empirical details
We now turn to some empirical details. These illustrate the problems that beset
the perspective we attribute to Pullum and Scholz. There are many phenomena
we could discuss in this context. But since we want to display the form of
(what we take to be) a good poverty-of-stimulus argument, we will focus in
some detail on just one cluster of closely related facts.
Let’s start with some much-discussed facts concerning negative polarity
items (NPIs) like any, ever, or the idiomatic a red cent. The appearance of
such items is perfectly fine in many linguistic contexts, but somehow wrong
in others. The following ten examples illustrate a small fraction of the con-
struction types that permit negative polarity items: sentences with negation (1)
or negative adverbs (2); prepositional phrases headed by before (3) or without
(4); antecedents of conditionals (5); verb-phrases headed by forbid (6) or doubt
(7); the first argument of no (8) and its second argument (9); the first argument
of every (10). The oddity of example (11) illustrates that NPIs are not licensed
in the second argument of every. 3
(1) I don’t talk to any other linguists.
(2) I never talk to any other linguists.
(3) I usually arrive at the gym before any other linguist wakes up.
(4) I went to the gym without any money.
(5) If any linguist goes to the gym, I go swimming.
3. One can specify the meaning of a quantificational expression using (something like) set-
theoretic relations. On this view, a quantificational expression in a simple declarative sentence
names a relation between two sets: first, there is the set picked out by the NP (e.g., ‘linguist
with any brains’ in (8)); second, there is the set picked out by the VP (e.g., ‘admires Chomsky’
in (8)). We will refer to these as the first and second arguments of the quantifier.
4. We restrict attention, in the present discussion, to any on its “true universal” as opposed to
“free choice” uses (see, e.g., Horn 2000; Kadmon and Landman 1993; Ladusaw 1996). While
speakers may assign an interpretation to I went to lunch with any money – i.e., I went to lunch
with any money at hand – the use of any in that construction clearly contrasts with that in
I went to lunch without any money. Some relevant contrasts to (1–10) include the following
degraded constructions (setting aside free-choice uses): I talk to any other linguists; I usually
arrive after any other linguist wakes up; if I go swimming, any other linguist goes to the
gym; some linguist with any brains admires Chomsky; some linguist admires any philosopher.
Using other negative polarity items, compare I never paid a red cent for that book and every
linguist who ever disagreed with me likes you with the degraded I paid a red cent for that book
and some linguist who ever disagreed with me likes you. Finally, while I think any linguist can
refute Chomsky sounds fine (arguably because it involves the use of free-choice any), compare
I doubt you ever paid a red cent for that book with the terrible I think you ever paid a red cent
for that book.
However, there are also disjunctive construction types that can only be un-
derstood with a “conjunctive” interpretation. For example, a disjunctive con-
struction with negation, such as not (A or B), is understood to be equivalent
in meaning to (not A) and (not B). Despite the abundance of exclusive-or in-
terpretations of disjunctive statements in the input to children, the exclusive-or
reading of the disjunction operator cannot be the source of the conjunctive
interpretation of disjunctive statements, because the negation of a disjunctive
statement using exclusive-or is true if both disjuncts are satisfied. In forming
the conjunctive interpretation of disjunctive statements (e.g., with negation),
children must somehow ignore the available evidence from “positive” state-
ments with disjunction, in which exclusive-or is favored, as in (12) and (13);
children must somehow learn to use inclusive-or instead, at least when inter-
preting disjunctive statements with negation. 5
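For concreteness, the contrast can be checked with a small truth table. The following Python sketch (our illustration only; the helper names are invented for expository purposes) shows that negating an inclusive disjunction yields exactly the conjunctive reading, while negating an exclusive disjunction does not:

```python
# Illustrative truth-table check (not part of the linguistic analysis itself):
# compare the conjunctive reading with the negation of inclusive vs. exclusive 'or'.
from itertools import product

def inclusive_or(a, b):
    return a or b

def exclusive_or(a, b):
    return a != b

for a, b in product([True, False], repeat=2):
    conjunctive = (not a) and (not b)        # (not A) and (not B)
    neg_incl = not inclusive_or(a, b)        # not (A or B), with inclusive-or
    neg_excl = not exclusive_or(a, b)        # not (A or B), with exclusive-or
    print(a, b, conjunctive, neg_incl, neg_excl)

# The negation of inclusive-or matches the conjunctive reading on every row;
# the negation of exclusive-or comes out True when both disjuncts are True,
# so exclusive-or cannot be the source of the conjunctive interpretation.
```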
How do children navigate through their linguistic experience to discover
when to assign an exclusive interpretation to disjunctive statements, and when
not to? To answer this, it pays to look at a list of construction types that ex-
hibit the conjunctive interpretation of disjunction. Here are ten of the relevant
constructions: sentences with negation (14) or negative adverbs (15); preposi-
tional phrases headed by before (16) or without (17); antecedents of condition-
als (18); verb-phrases headed by forbid (19) or doubt (20); the first and second
arguments of no (21) and (22); and the first argument of every (23). Exam-
ple (24) illustrates that the second argument of every permits an exclusive-or
interpretation of disjunction, so this linguistic environment does not require a
conjunctive interpretation.
(14) I don’t talk to linguists or philosophers.
(15) I never talk to linguists or philosophers.
(16) I try to get to the gym before the linguists or philosophers.
(17) I go to the gym without the linguists or philosophers.
(18) If a linguist or a philosopher goes to the gym, I go swimming.
(19) I forbid linguists or philosophers from going to the gym.
(20) I doubt the linguists or the philosophers can refute Chomsky.
(21) No linguist or philosopher admires Chomsky.
(22) No one with any brains admires linguists or philosophers.
5. Our use of the term ‘reading’ is not intended to commit us to the view that or is ambiguous in
English, or that disjunction is ambiguous in any natural language. As we discuss shortly, it is
reasonable to suppose that the meaning of or conforms to that of disjunction in standard logic
(i.e., inclusive-or), but that statements with or are often judged to be true only in a subset of
their truth conditions, namely those that are associated with exclusive-or. Similar remarks apply
to the meaning of any (see Footnote 4).
Much work remains. One wants to know why downward entailment constrains
both NPI licensing and the interpretation of disjunctive statements. And why
these constraints? Ludlow (2002) explores with ingenuity the suggestion that
NPIs are, as their name suggests, indeed licensed by the presence of negation –
and that despite surface appearances, all of the licensing environments involve
an element of negation at the level of Logical Form (cf. Laka 1990). Chierchia
(2000) proposes that the so-called exclusive readings of disjunctive statements
in examples like (12) and (13) result from a kind of Gricean implicature that
is computed within the human language system. The idea is that a sentence
with a scalar term has both a “basic” meaning and a “derived” meaning, where
the derived meaning is determined by conjoining the basic meaning with the
negation of a corresponding statement in which the basic scalar operator is
replaced with the next strongest operator on the scale. If the derived meaning is
more informative than the basic meaning, then a speaker using the sentence will
be heard as “implicating” the more informative claim. For example, the logical
operators ‘v’ (inclusive disjunction) and ‘&’ (conjunction) form a simple scale;
the latter is stronger than the former, since ‘A & B’ is true only if ‘A v B’ is true,
but not vice versa. If ‘v’ gives the basic meaning of or, the derived meaning
of disjunctive statements of the form A or B is given by ‘(A v B) & not (A
& B)’ – which is equivalent to ‘A exclusive-or B’. On this view, or always
stands for inclusive disjunction, but the derived meaning of A or B is more
informative than its basic meaning; correspondingly, a speaker who says A or
B will be heard as making a claim with the following implication: not (A and
B). However, if or appears in the scope of negation, as in (21), the derived
meaning is not more informative than the basic meaning. 6
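The strengthening step can be made explicit with a small sketch (ours; the function names basic, alternative and derived are invented labels for the pieces of Chierchia’s recipe as we have summarized it):

```python
# Sketch of scalar strengthening for 'or' (illustrative; names are ours).
def basic(a, b):
    return a or b                     # 'or' as inclusive disjunction 'v'

def alternative(a, b):
    return a and b                    # the next strongest scalar item, '&'

def derived(a, b):
    # Conjoin the basic meaning with the negation of the stronger alternative:
    # (A v B) & not (A & B), which coincides with exclusive-or.
    return basic(a, b) and not alternative(a, b)

def derived_negated(a, b):
    # Applying the same recipe under negation gives
    # not (A v B) & not [not (A & B)], i.e., not (A v B) & (A & B),
    # which is false on every assignment (cf. Footnote 6); so only the
    # basic, de Morgan ("conjunctive") reading survives.
    return (not basic(a, b)) and not (not alternative(a, b))
```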
Both the proposal by Ludlow and the one by Chierchia strike us as plausible.
Regardless of whether either of them is correct, however, we see no reason to
doubt the truth of what all the research in this area suggests: that some semantic
principle – call it ‘downward entailment’ – unifies what otherwise seem to
be the disparate phenomena of NPI licensing, the interpretation of disjunctive
statements, and the validity of inferences like the one in (26b).
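For readers who want the principle stated in the set-theoretic terms of Footnote 3: a position is downward entailing if a sentence remains true when the set contributed at that position is replaced by any of its subsets. The sketch below (ours; the particular sets are invented for illustration) spot-checks this for the two arguments of every and no:

```python
# Illustrative spot-check of downward entailment, treating 'every' and 'no'
# as relations between two sets (cf. Footnote 3). Not a proof, just instances.
def every(first, second):
    return first <= second            # every A is B  iff  A is a subset of B

def no(first, second):
    return not (first & second)       # no A is B  iff  A and B do not overlap

linguists       = {'lee', 'kim', 'pat'}
gym_linguists   = {'lee'}             # a subset of the linguists
admires_chomsky = {'lee', 'kim', 'pat', 'jo'}
admires_both    = {'lee'}             # a subset of admires_chomsky
philosophers    = {'jo'}

print(every(linguists, admires_chomsky))      # True
print(every(gym_linguists, admires_chomsky))  # still True: first argument is DE
print(every(linguists, admires_both))         # False: second argument is not DE

print(no(linguists, philosophers))            # True: the sets are disjoint
print(no(gym_linguists, philosophers))        # still True: first argument is DE
print(no(linguists, set()))                   # still True: second argument is DE

# This lines up with the distribution above: NPIs and the conjunctive reading
# of 'or' appear in the downward entailing positions illustrated in (1)-(10)
# and (14)-(23), but not in the second argument of 'every', as in (11) and (24).
```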
One could deny all this, of course, and reject the claim that human grammars
(which children attain) are properly characterized by any deep generalization
like (27). Perhaps only the descriptive generalization in (25) – or no general-
ization at all – is correct. But simply denying apparent generalizations is just
bad science. One can’t avoid nativist conclusions by refusing to do linguistics.
And we don’t think Pullum and Scholz would advocate this approach. Instead,
we suspect that they would offer an alternative proposal about what children
6. In natural languages, not (A or B) is equivalent to not (A) and not (B) as in de Morgan’s laws;
but the derived meaning of not (A or B) – ‘not (A v B) & not [not (A & B)]’ – would be a
contradiction, and thus not a viable interpretation of not (A or B).
know when they know the descriptive generalization in (25). But any such prin-
ciple that is empirically equivalent to (27) will provide the basis for a poverty-
of-stimulus argument, absent a credible account of how all normal children
could learn the principle. Perhaps one can avoid direct appeal to (Universal
Grammar) constraints concerning downward entailment by saying that chil-
dren record what they hear in terms of abstract construction types that respect
(25). But then the question reduces to why children deploy those construction
types, as opposed to others.
The challenge for Pullum and Scholz, therefore, is to describe a plausible
acquisition scenario (e.g., for the descriptive generalization in (25)), according
to which children avoid uncorrectable overgeneralizations, without supposing
that children approach the acquisition process with specific linguistic knowl-
edge of the sort they regard as unnecessary (e.g., the linguistic property of
downward entailment). So far as we can tell, Pullum and Scholz offer no hint
of how to formulate a learning account that eventuates in attainment of the spe-
cific linguistic knowledge that nativists tend to focus on, such as downward
entailment. In short, it’s not enough to mention ways in which children could
learn some things without Universal Grammar. To rebut poverty-of-stimulus ar-
guments, one has to show how children could learn what adults actually know;
and as close investigation reveals, adults know a lot more than casual inspection
suggests. That is the nativist’s main point.
Let’s continue in this vein a bit further. Even given a characterization, say in
terms of downward entailment, of which construction types license NPIs, fur-
ther work remains. For example, we have seen that certain constructions with
a negative element, such as not, license NPIs, such as any (see (1) above). But
one wants to know how the negative element needs to be related to the NPI in
order to license it. One logical possibility is that the NPI any is licensed in con-
structions in which not precedes any. But in both of the following examples,
not precedes any, whereas any is licensed only in the second example.
(28) The news that Noam had not won was a surprise to some/*any of the
linguists.
(29) The news that Noam had won was not a surprise to some/any of the
linguists.
(31) a. . . . neg+V+V+NP+P+some
b. . . . V+neg+NP+P+some/any
Of course, one is left to wonder how children know to keep records of this
sort, as opposed to others. It seems implausible, to say the least, that children
are recording everything they hear and searching for every possible pattern.
Do children learn to apply category labels like NP, V and P, or is this part of
the cognitive apparatus human beings are disposed to project onto their expe-
rience?
But even setting such questions aside, the proposal that c-command is the
relevant structural relationship for the licensing of NPIs has much to recom-
mend it, as opposed to the construction type approach advocated by Pullum
and Scholz. The c-command account has unexpected and independent support
from a host of other linguistic constructions. Consider (32), for example.
(32) a. The bear who laughed never expected to find any dogs at the party.
b. *The bear who never laughed expected to find any dogs at the party.
In (32a), the negative adverb never c-commands any, but not in (32b). Cor-
respondingly, only (32a) is acceptable. Adopting the Pullum and Scholz ap-
proach, one could suppose that children encode the facts in (32) in terms of
construction types, where another construction type that permits NPIs is one
of the form never+V+INF+. . ., but NPIs would not be encountered in con-
structions of the form never+V+V+INF+. . . But even if some such proposal
could describe the facts, record keeping of this kind fails to explain why NPIs
are licensed in the first type of construction, but not in the second; and it fails
7. While some linguists seem to use the licensing of NPIs as a diagnostic of c-command, its
precise definition and the level of representation at which it applies (d-structure, s-structure,
LF, semantic representation) are the subject of considerable debate (see, e.g., the papers in
Horn and Kato 2000).
to tie this fact together with the fact that NPIs are licensed in (31b), but not in
(31a).
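To make the structural notion concrete, here is a toy sketch (ours) that computes c-command over crude schematic bracketings of (32a) and (32b); the trees are drastic simplifications used only to show why never stands in the relevant relation to any in the first case but not the second:

```python
# Toy c-command checker over schematic constituent structures (illustrative
# simplification only; these are not serious analyses of (32a)/(32b)).
class Node:
    def __init__(self, label, *children):
        self.label = label
        self.children = list(children)
        self.parent = None
        for child in self.children:
            child.parent = self

    def dominates(self, other):
        return any(c is other or c.dominates(other) for c in self.children)

def c_commands(a, b):
    # a c-commands b iff neither dominates the other and some sister of a
    # is b or dominates b.
    if a.dominates(b) or b.dominates(a) or a.parent is None:
        return False
    return any(s is b or s.dominates(b)
               for s in a.parent.children if s is not a)

# (32a): [S [NP the bear who laughed] [VP never [VP expected to find any dogs ...]]]
never_a, any_a = Node('never'), Node('any')
tree_a = Node('S', Node('NP-subject'),
              Node('VP', never_a, Node('VP', Node('expected-to-find'), any_a)))

# (32b): [S [NP the bear [RC who never laughed]] [VP expected to find any dogs ...]]
never_b, any_b = Node('never'), Node('any')
tree_b = Node('S',
              Node('NP', Node('the-bear'),
                   Node('RC', Node('who'), never_b, Node('laughed'))),
              Node('VP', Node('expected-to-find'), any_b))

print(c_commands(never_a, any_a))   # True: 'never' c-commands 'any' in (32a)
print(c_commands(never_b, any_b))   # False: buried in the relative clause in (32b)
```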
There are ample reasons for thinking that c-command plays a crucial role
in the interpretation of these constructions, and in many other constructions
where the licensing conditions for NPIs are not at issue (see Epstein et al. 1998).
To take a familiar kind of example, the pronoun he cannot be referentially
dependent on the referring expression, the Ninja Turtle, in (33); whereas this
relationship is possible in (34); and referential dependence is only possible, in
(35), between the reflexive pronoun himself and the referring expression, the
father of the Ninja Turtle (but not Grover or the Ninja Turtle).
Thus far, we have been pressing (what we take to be) familiar kinds of nativist
considerations. A less obvious problem for the kind of learning scenario ad-
vanced by Pullum and Scholz concerns the pattern of non-adult constructions
that appear in the language of young children. Other things being equal, Pullum
and Scholz should predict that children (in so far as they diverge from adults)
will initially employ constructions that are less articulated than those employed
by adults. Complexity in the child’s hypotheses about the local language should
be driven by what the child hears; otherwise, complex hypotheses will look like
reflections of a mental system that imposes certain structures more or less in-
dependently of experience.
But according to the perspective of linguists working within the generative-
transformational tradition, children should be expected to sometimes follow
developmental paths to the adult grammar that would be very surprising from
a data-driven perspective. Of course, any normal child quickly internalizes a
grammar equivalent to those of adults around them. But a child who has not
yet achieved (say) a dialect of American English can still be speaking a natu-
ral language – albeit one that is (metaphorically) a foreign language, at least
somewhat, from an adult perspective. And interestingly, the children of En-
glish speakers often do exhibit constructions that are not available in English
– but ones that are available in other languages spoken by actual adults. This
is unsurprising if children project beyond their experience, rather than being
inductively driven by it. From a nativist perspective, children are free to try out
various linguistic options (compatible with Universal Grammar) before ‘setting
parameters’ in a way that specifies some particular natural grammar, like that
of Japanese or American English. A natural extension of this line of thought is
sometimes called the Continuity Hypothesis (Pinker 1984; Crain 1991; Crain
and Pietroski 2001). According to the Continuity Hypothesis, child language
can differ from the local adult language only in ways that adult languages can
differ from each other. The idea is that at any given time, children are speak-
ing a possible (though perhaps underspecified) human language – just not the
particular language spoken around them. If this is correct, we should not be sur-
prised if children of monolingual Americans exhibit some constructions char-
acteristic of German, Romance or East Asian languages, even in the absence
of any evidence for these properties in the primary linguistic data. Indeed, such
mismatches between child and adult language may be the strongest argument
for Universal Grammar.
We conclude with two examples: Wh-questions that reveal a trace of Ro-
mance in Child-English; and Wh-questions that reveal a trace of Germanic in
Child-English. In each case, the relevant facts come into view only when they
are framed within a detailed theory of some non-English phenomena, along-
This brings us, at last, to Child-English. It has frequently been noted that
why-questions in Child-English tend to lack (subject-auxiliary) inversion to a
greater extent than questions with other wh-elements, and that the absence of inversion in
why-questions persists in children’s speech well after inversion is consistently
present in other wh-questions. Adopting the Continuity Hypothesis, de Vil-
liers (1990) and Thornton (2001) have both suggested that children of English-
speaking adults initially treat the question-word why in the same way as Italian
adults treat perché or come mai. That is, children of English-speaking adults
base generate why in a structural position that differs from the position occu-
pied by other wh-expressions. On this view, children base generate why in a
position that does not require I-to-C movement – unlike other wh-elements.
This explains the absence of inversion in Child-English.
Following the Rizzi-style analysis, Child-English should nevertheless re-
quire inversion for long-distance why-questions, even if a particular child does
not require inversion for matrix why-questions. If this is correct, such a child
should differ from English-speaking adults in the way he forms matrix why-
questions (without inversion), but the child should parallel English-speaking
adults in producing well-formed long-distance why-questions. From a data-
driven perspective, this pattern of (non)conformity is surely not anticipated.
University of Maryland
References
Braine, Martin D. S. and Barbara Rumain (1981). Development of comprehension of ‘or’: evidence
for a sequence of competencies. Journal of Experimental Child Psychology 31: 46–70.
— (1983). Logical reasoning. In Handbook of Child Psychology, vol. 3: Cognitive Development,
John Flavell and Ellen Markman (eds.), 46–70. New York: Academic Press.
Chierchia, Gennaro (2000). Scalar implicatures and polarity phenomena. Paper presented at NELS
31, Georgetown University, Washington, DC.
Chierchia, Gennaro, Stephen Crain, Maria Teresa Guasti, and Rosalind Thornton (1998). “Some”
and “or”: a study on the emergence of logical form. In Proceedings of the Boston Univer-
sity Conference on Language Development 22, Annabel Greenhill, Mary Hughes, Heather
Littlefield, and Hugh Walsh (eds.), 97–108. Somerville, MA: Cascadilla Press.
Chomsky, Noam (1981). Lectures on Government and Binding. Dordrecht: Foris.
— (1986). Knowledge of Language: Its Nature, Origin and Use. New York: Praeger.
Crain, Stephen (1991). Language acquisition in the absence of experience. Behavioral and Brain
Sciences 14: 597–650.
Crain, Stephen and Paul Pietroski (2001). Nature, nurture and Universal Grammar. Linguistics and
Philosophy 24 (2): 139–186.
Crain, Stephen and Rosalind Thornton (1998). Investigations in Universal Grammar. A Guide to
Experiments on the Acquisition of Syntax and Semantics. Cambridge, MA: The MIT Press.
Crain, Stephen, Andrea Gualmini, and Luisa Meroni (2000). The acquisition of logical words.
LOGOS and Language 1: 49–59.
de Villiers, Jill (1990). Why questions? In Papers in the Acquisition of Wh: Proceedings of the
UMass Roundtable, Thomas L. Maxfield and Bernadette Plunkett (eds.), 155–171. Amherst,
MA: University of Massachusetts Occasional Papers.
Epstein, Samuel D., Erich M. Groat, Ruriko Kawashima, and Hisatsugu Kitahara (1998). A Deriva-
tional Approach to Syntactic Relations. Oxford: Oxford University Press.
Fromkin, Victoria (ed.) (2000). Linguistics: An Introduction to Linguistic Theory. Malden, MA:
Blackwell Publishers.
Grice, H. Paul (1975). Logic and conversation. In Syntax and Semantics 3: Speech Acts, Peter Cole
and James Morgan (eds.), 41–58. New York: Academic Press.
Heim, Irene (1984). A note on negative polarity and downward entailingness. Proceedings of the
North East Linguistic Society, 14: 98–107.
Horn, Laurence (1989). A Natural History of Negation. Chicago, IL: University of Chicago Press.
— (2000). Pick a theory (not just any theory). In Negation and Polarity: Syntactic and Semantic
Perspectives, Laurence Horn and Yasuhiko Kato (eds.), 147–192. Oxford: Oxford University
Press.
Horn, Laurence and Yasuhiko Kato (eds.) (2000). Negation and Polarity: Syntactic and Semantic
Perspectives. Oxford: Oxford University Press.
Hornstein, Norbert and David Lightfoot (1981). Introduction. In Explanations in Linguistics: The
Logical Problem of Language Acquisition, Norbert Hornstein and David Lightfoot (eds.), 9–
31. London: Longman.
Kadmon, Nirit and Fred Landman (1993). Any. Linguistics and Philosophy 16: 353–422.
Ladusaw, William (1996). Negation and polarity items. In Handbook of Contemporary Semantic
Theory, Shalom Lappin (ed.), 321–342. Oxford: Blackwell.
Laka, Itziar (1990). Negation in syntax: on the nature of functional categories and projections.
Unpublished Ph.D. dissertation, MIT, Cambridge.
Lightfoot, David W. (1991). How to Set Parameters: Arguments from Language Change. Cam-
bridge, MA: MIT Press.
Ludlow, Peter (2002). LF and natural logic: the syntax of directional entailing environments. In
Logical Form and Language, Gerhard Preyer and George Peter (eds), 132–168. Oxford: Ox-
ford University Press.
McDaniel, Dana (1986). Conditions on wh-chains. Ph.D. dissertation, City University of New
York.
May, Robert (1985). Logical Form. Cambridge MA: MIT Press.
Munn, Alan B. (1993). Topics in the syntax and semantics of coordinate structure. Unpublished
Ph.D. dissertation, University of Maryland, College Park.
Musolino, Julien, Stephen Crain, and Rosalind Thornton (2000). Navigating negative semantic
space. Linguistics 38: 1–32.
Pinker, Steven (1984). Language Learnability and Language Development. Cambridge, MA: Har-
vard University Press.
Progovac, Ljiljana (1994). Negative and Positive Polarity: A Binding Approach. Cambridge, MA:
Cambridge University Press.
Rizzi, Luigi (1997). The fine structure of the left periphery. In Elements of Grammar: Handbook
of Generative Syntax, Liliane Haegeman (ed.), 281–337. Dordrecht: Kluwer Academic Pub-
lishers.
Thornton, Rosalind (1990). Adventures in long-distance moving: the acquisition of complex wh-
questions. Ph.D. dissertation, University of Connecticut, Storrs.
— (1995). Children’s negative questions: a production/comprehension asymmetry. In Proceed-
ings of ESCOL, J. Fuller, H. Han, and D. Parkinson (eds.), 306–317. Ithaca, NY: Cornell
University.
— (1996). Elicited production. In Methods for Assessing Children’s Syntax, Dana McDaniel,
Cecile McKee, and Helen S. Cairns (eds.), 77–102. Cambridge, MA: MIT Press.
— (2001). Two tasks. Paper presented at the 2nd Annual Tokyo Conference of Psycholinguistics,
Workshop, Keio University, Tokyo.
Tomasello, Michael (2000). Do young children have adult syntactic competence? Cognition 74:
209–253.