Introduction to syntax
1. Introduction to Syntax
Syntax is a fundamental branch of linguistics that focuses on the rules and principles
governing the structure of sentences in a language. It explores how words combine to
form phrases, clauses, and complete sentences, and how these structures convey
meaning. The study of syntax seeks to uncover the underlying patterns that dictate
word order, grammatical relations, and hierarchical sentence organization. Syntax is
not just about sentence correctness; it also involves understanding how different
sentence forms reflect emphasis, intention, and context in communication.
2. Phrase Structure
Phrase structure refers to the hierarchical organization of words into larger units or
phrases, such as noun phrases (NP), verb phrases (VP), and clauses, which together
form the syntactic skeleton of a sentence. In traditional generative grammar,
especially within the Chomskyan tradition, phrase structure is described using formal
rules—Phrase Structure Rules (PSRs)—which specify how smaller units combine to
form larger syntactic constituents. For example, a typical rule might be S → NP + VP,
indicating that a sentence is composed of a noun phrase followed by a verb phrase. In
A Critical Introduction to Syntax, Miller (2011) approaches this concept with a
critical and analytical lens. He outlines the basic assumptions of phrase structure
theories, particularly the idea that sentence formation is governed by recursive, rule-
based systems. However, Miller does not accept these assumptions uncritically. He
challenges the notion that all languages share a universal phrase structure,
highlighting significant cross-linguistic variability and structural flexibility that resist
rigid rule-based modeling.
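The rule-based view Miller describes can be sketched as a toy grammar. The snippet below is a minimal illustration, not Miller's own formalism: it encodes a few rules in the `S → NP + VP` style and expands them recursively to produce a bracketed constituent tree. The rules and vocabulary are invented for illustration.

```python
# A toy phrase-structure grammar: each rule rewrites a symbol as a
# sequence of smaller symbols, mirroring notation like S -> NP VP.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "Det": ["the"],
    "N":   ["cat", "dog"],
    "V":   ["saw", "slept"],
}

def expand(symbol):
    """Recursively expand a symbol using its first rule, returning a
    bracketed tree string that shows the hierarchical structure."""
    if symbol in LEXICON:
        return f"({symbol} {LEXICON[symbol][0]})"
    children = " ".join(expand(s) for s in RULES[symbol][0])
    return f"({symbol} {children})"

print(expand("S"))
# (S (NP (Det the) (N cat)) (VP (V saw) (NP (Det the) (N cat))))
```

The recursion in `expand` is the point: the same small rule set generates ever-larger hierarchical structures, which is exactly the rule-based, recursive picture that Miller examines critically.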
Miller also questions the idealization of language that underlies much of traditional
phrase structure grammar. He argues that models which treat syntactic knowledge as a
set of abstract rules often fail to account for actual language use, especially in
spontaneous speech, idiomatic expressions, and non-canonical constructions. For
example, constructions like "There is a book on the table" or "The more you read, the
more you understand" do not easily fit into standard phrase structure models without
introducing numerous exceptions or transformations. In this context, Miller introduces
alternative approaches, such as Construction Grammar and Lexical-Functional
Grammar, which view phrase structure not as a set of abstract syntactic rules, but as
emergent from meaningful constructions stored in the mental lexicon. These models
are better suited to handling idiomaticity, irregularity, and frequency-based usage
patterns.
Another critical point raised by Miller is the cognitive status of phrase structures. He
raises the question of whether speakers actually process and mentally represent
sentences using hierarchical phrase structures, or whether this is an analytical artifact
developed by linguists. He draws on psycholinguistic evidence suggesting that
language comprehension and production may rely more on linear processing and
memory-based strategies than on abstract tree structures. Thus, while phrase structure
remains a valuable theoretical tool for analyzing sentence grammar, Miller urges
caution in treating it as a direct reflection of how language works in the mind.
3. Dependency Relations
In dependency grammar, sentence structure is described through direct head-dependent relations between individual words rather than through phrase-level constituents. Miller critiques phrase structure grammars for often being more complex
than necessary. He notes that they frequently require additional theoretical
machinery—such as empty categories, movement traces, or transformations—to
account for surface-level variations in sentence structure. Dependency grammars, by
contrast, allow these variations to be captured more transparently through changing
dependency patterns without altering the underlying grammar. Miller argues that this
makes dependency approaches not only simpler but also more adaptable to
computational modeling, as seen in natural language processing and machine
translation.
However, Miller does not advocate for a wholesale replacement of phrase structure
theories. Instead, he promotes a pluralistic and critical approach, encouraging
linguists to consider which model best suits the linguistic data at hand. He also
recognizes that dependency grammars are not without limitations, particularly in
handling phenomena like coordination and certain types of displacement. Nonetheless,
Miller regards the dependency model as a powerful descriptive tool that aligns well
with psycholinguistic evidence, showing that humans often process language based on
direct word-to-word relationships rather than abstract phrase groupings.
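The word-to-word relationships at the heart of the dependency model can be sketched directly in code. The analysis below is a rough illustration, with relation labels chosen for readability rather than taken from any particular dependency scheme: each word simply points at its head, with no intermediate phrase nodes.

```python
# A dependency analysis of "She gave him a book": every word records
# its head directly. Indices and relation labels are illustrative.
words = ["She", "gave", "him", "a", "book"]
# (dependent_index, head_index, relation); the root has head -1
deps = [
    (0, 1, "subject"),
    (2, 1, "indirect object"),
    (3, 4, "determiner"),
    (4, 1, "direct object"),
    (1, -1, "root"),
]

def dependents_of(head_word):
    """List the words that depend directly on the given head."""
    head = words.index(head_word)
    return [words[d] for d, h, _ in deps if h == head]

print(dependents_of("gave"))  # ['She', 'him', 'book']
```

Note how a surface variation such as fronting an object would change only the word order, not the arcs themselves, which is the transparency Miller credits dependency approaches with.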
4. Lexicon
The lexicon plays a foundational role in any theory of syntax, acting as the mental
repository of a speaker’s knowledge about words, including their meanings,
grammatical properties, and syntactic behavior. In traditional generative grammar, the
lexicon is often seen as a separate component from the syntactic system, providing the
basic building blocks—words with specific features—that syntax combines according
to rule-based phrase structures. However, in A Critical Introduction to Syntax, Miller
(2011) challenges this modular separation and presents a more integrated and
dynamic view of the lexicon, drawing on insights from functional and usage-based
models of grammar.
According to Miller, the lexicon is not just a static list of vocabulary items but a richly
structured system containing detailed information about how words function in
context. This includes morphological information (such as tense and number),
syntactic categories (like noun or verb), subcategorization frames (e.g., whether a
verb requires an object), and argument structure. Importantly, Miller critiques the
traditional Chomskyan model for minimizing the lexicon’s role in syntactic
explanation, arguing that many grammatical patterns can be more effectively
understood through lexical properties rather than abstract syntactic rules.
One of the key perspectives Miller brings to the discussion is the idea that much of
what is traditionally attributed to syntax may actually reside in the lexicon itself. For
example, the fact that the verb give typically requires three arguments (She gave him a
book) reflects a lexical pattern, not just a syntactic rule. Similarly, verbs like sleep or
arrive require only one argument (He slept, She arrived), and these valency patterns
are encoded in the lexicon. From this point of view, syntax emerges not from an
independent rule system but from the interaction of lexical items and their stored
usage patterns.
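The valency patterns discussed above can be sketched as lexical entries. The representation below is an assumption-laden simplification (real subcategorization frames record argument types, not just counts), but it shows the idea that well-formedness follows from stored lexical information rather than from a separate rule system.

```python
# A sketch of lexical valency: each verb's entry records how many
# arguments it selects. Entries are illustrative, not from Miller.
LEXICON = {
    "give":   {"category": "verb", "valency": 3},  # She gave him a book
    "sleep":  {"category": "verb", "valency": 1},  # He slept
    "arrive": {"category": "verb", "valency": 1},  # She arrived
}

def valency_ok(verb, num_arguments):
    """Check whether a clause supplies exactly the arguments
    its verb's lexical entry selects."""
    return LEXICON[verb]["valency"] == num_arguments

print(valency_ok("give", 3))   # True:  She gave him a book
print(valency_ok("sleep", 2))  # False: *He slept the bed
```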
Miller also draws attention to the Construction Grammar perspective, where the
boundary between the lexicon and syntax becomes blurred. In this model, both words
and larger grammatical constructions (like the causative or passive) are stored as part
of a continuum of form-meaning pairings. That is, constructions themselves have
lexical-like status, and syntax is viewed as an extension of the lexicon rather than a
separate module. This approach allows linguists to better explain idiomatic
expressions, irregular forms, and cross-linguistic variation without resorting to
complex transformational rules.
In summary, Miller presents the lexicon not as a passive list of words but as a central,
dynamic component of syntax—one that stores not only lexical items but also
grammatical patterns and constructional knowledge. His approach underscores the
importance of lexical information in understanding syntactic structure and argues for
a more integrated and cognitively realistic view of grammar.
5. Constructions
Miller emphasizes that constructions are not limited to regular, rule-governed patterns
but also include idiomatic and irregular expressions that cannot be predicted from
general syntactic principles. Examples like give me a break, the more, the merrier, or
let alone show that language users acquire holistic units that carry both form and
meaning. These expressions defy traditional phrase structure analysis but are easily
captured in a construction-based model. In such models, a construction is not only a
syntactic template but also carries semantic and sometimes pragmatic information.
For instance, the ditransitive construction [Subject + Verb + Object1 + Object2] as
in She gave him a book conveys a meaning of transfer, regardless of the specific verb
used.
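The ditransitive example can be made concrete as a form-meaning pairing. The sketch below assumes a deliberately crude representation (a slot template paired with a meaning paraphrase); it is meant only to show that the construction itself contributes the "transfer" meaning, whatever verb fills the verb slot.

```python
# A construction pairs a form template with a meaning, independent of
# the particular verb that instantiates it. A rough sketch only.
DITRANSITIVE = {
    "form": ["Subject", "Verb", "Object1", "Object2"],
    "meaning": "{subject} causes {object1} to receive {object2}",
}

def interpret(subject, verb, object1, object2):
    """Fill the construction's stored meaning template with the
    words of a particular clause."""
    return DITRANSITIVE["meaning"].format(
        subject=subject, object1=object1, object2=object2
    )

print(interpret("She", "gave", "him", "a book"))
# She causes him to receive a book
```

Because the meaning template never mentions the verb, even a novel verb slotted into the same frame ("She faxed him a letter") would receive the transfer reading, which is the construction-based point.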
6. Case Theory
In GB theory, Case Theory posits that every noun phrase must receive (abstract) case
from an appropriate governor (like a verb, preposition, or tense). For example, in the
sentence He saw her, the verb saw assigns accusative case to the object her, while the
subject he receives nominative case from the finite tense (T). According to Miller, this
theory was introduced to explain patterns of pronoun usage, word order, and
movement phenomena across languages. It accounts for why certain sentences are
ungrammatical—not because of semantic issues, but because case has not been
properly assigned.
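The case-licensing mechanism in He saw her can be sketched as a lookup. This is a minimal illustration of the GB-style idea, with invented governor labels; it checks only whether a pronoun's form matches the case its governor assigns.

```python
# A minimal sketch of abstract case assignment for English pronouns:
# finite tense licenses nominative; verbs and prepositions license
# accusative. Governor labels here are illustrative.
CASE_BY_GOVERNOR = {
    "finite_tense": "nominative",
    "verb": "accusative",
    "preposition": "accusative",
}

PRONOUNS = {("he", "nominative"), ("him", "accusative"),
            ("she", "nominative"), ("her", "accusative")}

def licensed(pronoun, governor):
    """Is this pronoun form compatible with the case
    its governor assigns?"""
    return (pronoun, CASE_BY_GOVERNOR[governor]) in PRONOUNS

print(licensed("her", "verb"))          # True:  He saw her
print(licensed("he", "verb"))           # False: *He saw he
print(licensed("he", "finite_tense"))   # True:  He saw her
```

On this picture, *He saw he is out not for semantic reasons but because the object position licenses only accusative forms, which is precisely the kind of explanation Miller goes on to question.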
Miller is critical of the abstractness and complexity of Case Theory. He questions the
psychological and empirical basis for abstract case, especially since many languages
do not mark case overtly. For instance, English largely lacks morphological case
distinctions beyond pronouns. Miller argues that such an abstract system might reflect
theoretical elegance rather than actual linguistic behavior. He also notes that
alternatives—like Dependency Grammar and Construction Grammar—handle
grammatical roles through syntactic relationships rather than abstract licensing
mechanisms like case.
Furthermore, Miller draws attention to languages with rich case systems, such as
Russian or Latin, where case is morphologically visible. Even here, he argues, a
functionalist explanation—where case helps disambiguate meaning or clarify
syntactic roles in free word order contexts—may be more appropriate than abstract
rule-based theories. He emphasizes the need for syntax to be grounded in language
use and communicative function rather than idealized models.
7. Movement
Miller explains that movement rules were originally devised to account for word order
variations that cannot be explained by surface structure alone. These include wh-
movement (as in questions), NP movement (as in passives), and head movement (as
in auxiliary inversion). Generative syntax assumes that such movements leave behind
a trace or empty category, which is used in interpretation. However, these invisible
elements increase the theory’s complexity and abstractness.
He also discusses alternatives such as Construction Grammar, where the need for
movement is greatly reduced or eliminated altogether. In these models, constructions
that appear to involve movement (e.g., questions or passives) are treated as distinct
patterns with their own syntactic and semantic properties, learned and stored
holistically by speakers.
8. Grammaticality
Miller critiques the idea that grammaticality is strictly determined by syntactic rules
abstracted from real language use. He points out that many sentences judged
ungrammatical by native speakers are not necessarily syntactically ill-formed—they
might be unusual, pragmatically odd, or simply unfamiliar. For example, sentences
with heavy center embedding (The boy the girl the teacher scolded liked cried) are
technically grammatical by rule but are almost incomprehensible in practice.
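The gap between rule-generability and processability is easy to demonstrate. The sketch below, with invented vocabulary, nests each relative clause inside the previous subject noun phrase: the recursion is perfectly regular at any depth, yet the output becomes unreadable almost immediately.

```python
# Center-embedded sentences are rule-generable to any depth, yet
# quickly become impossible to process. Vocabulary is illustrative.
def center_embed(nouns, verbs):
    """Build 'the N1 the N2 ... Vn ... V1' by nesting each relative
    clause inside the preceding subject noun phrase."""
    subjects = " ".join(f"the {n}" for n in nouns)
    predicates = " ".join(reversed(verbs))
    return f"{subjects} {predicates}"

print(center_embed(["boy"], ["cried"]))
# the boy cried
print(center_embed(["boy", "girl", "teacher"],
                   ["cried", "liked", "scolded"]))
# the boy the girl the teacher scolded liked cried
```

One level of embedding ("the boy the girl liked cried") is already effortful; two levels, as in the example above, are nearly incomprehensible even though no rule has been violated.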
9. Locality Conditions
Locality conditions are constraints in generative grammar that restrict how far
elements can "move" or relate to each other in syntactic structures. These conditions
are meant to explain why certain transformations, such as wh-movement or raising,
cannot cross specific boundaries. Miller (2011) explores these conditions as part of
his broader critique of movement and transformational rules, questioning their
necessity and universality.
In traditional models, locality principles like Subjacency and the Minimal Link
Condition limit movement to the nearest available position to avoid ungrammaticality.
For instance, the unacceptability of What do you wonder whether John bought? is
attributed to a violation of Subjacency, where what attempts to move across too many
clause boundaries. These principles are designed to make movement more predictable
and constrained.
Miller suggests that syntactic theory should move away from abstract, rule-driven
constraints and focus more on empirical evidence from language use, acquisition, and
processing. He promotes a more dynamic, flexible model of syntax that
accommodates variation without relying on rigid locality conditions.