Introduction to syntax

1. Introduction to Syntax

Syntax is a fundamental branch of linguistics that focuses on the rules and principles
governing the structure of sentences in a language. It explores how words combine to
form phrases, clauses, and complete sentences, and how these structures convey
meaning. The study of syntax seeks to uncover the underlying patterns that dictate
word order, grammatical relations, and hierarchical sentence organization. Syntax is
not just about sentence correctness; it also involves understanding how different
sentence forms reflect emphasis, intention, and context in communication.

In traditional grammar, syntax was largely prescriptive, focusing on how language should be used. However, modern linguistic approaches view syntax descriptively,
analyzing how speakers naturally construct sentences. One of the most influential
contributions to modern syntax comes from Noam Chomsky, who introduced the
concept of generative grammar—an approach that proposes a set of formal rules
capable of generating all grammatical sentences of a language. This theory
emphasizes the innate, cognitive aspects of syntactic knowledge.

Furthermore, syntax interacts closely with other linguistic components such as morphology (word formation), semantics (meaning), and pragmatics (language use in
context). By studying syntax, linguists aim to better understand the cognitive
structures behind language, the diversity of sentence patterns across languages, and
the universal principles that may underlie all human languages.

2. Phrase Structure

Phrase structure refers to the hierarchical organization of words into larger units or
phrases, such as noun phrases (NP), verb phrases (VP), and clauses, which together
form the syntactic skeleton of a sentence. In traditional generative grammar,
especially within the Chomskyan tradition, phrase structure is described using formal
rules—Phrase Structure Rules (PSRs)—which specify how smaller units combine to
form larger syntactic constituents. For example, a typical rule might be S → NP + VP,
indicating that a sentence is composed of a noun phrase followed by a verb phrase. In
A Critical Introduction to Syntax, Miller (2011) approaches this concept with a
critical and analytical lens. He outlines the basic assumptions of phrase structure
theories, particularly the idea that sentence formation is governed by recursive, rule-
based systems. However, Miller does not accept these assumptions uncritically. He
challenges the notion that all languages share a universal phrase structure,
highlighting significant cross-linguistic variability and structural flexibility that resist
rigid rule-based modeling.
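
To make the rule-based picture concrete, the following minimal sketch (in Python, with an invented four-rule grammar and toy vocabulary; nothing here comes from Miller's text) shows how recursive phrase structure rules such as S → NP + VP can generate sentences mechanically:

    import random

    # A toy phrase structure grammar (illustrative rules and words, not taken
    # from Miller 2011). Each rule expands a category into daughter categories;
    # the PP option inside NP makes the system recursive.
    GRAMMAR = {
        "S":  [["NP", "VP"]],
        "NP": [["Det", "N"], ["Det", "N", "PP"]],
        "VP": [["V", "NP"], ["V"]],
        "PP": [["P", "NP"]],
    }
    LEXICON = {
        "Det": ["the", "a"],
        "N":   ["student", "book", "table"],
        "V":   ["read", "saw"],
        "P":   ["on", "near"],
    }

    def generate(category):
        """Recursively expand a category down to a list of words."""
        if category in LEXICON:                       # terminal: choose a word
            return [random.choice(LEXICON[category])]
        daughters = random.choice(GRAMMAR[category])  # nonterminal: choose a rule
        return [w for d in daughters for w in generate(d)]

    print(" ".join(generate("S")))  # e.g. "the student read a book on the table"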

Miller also questions the idealization of language that underlies much of traditional
phrase structure grammar. He argues that models which treat syntactic knowledge as a
set of abstract rules often fail to account for actual language use, especially in
spontaneous speech, idiomatic expressions, and non-canonical constructions. For
example, constructions like "There is a book on the table" or "The more you read, the
more you understand" do not easily fit into standard phrase structure models without
introducing numerous exceptions or transformations. In this context, Miller introduces
alternative approaches, such as Construction Grammar and Lexical-Functional
Grammar, which view phrase structure not as a set of abstract syntactic rules, but as
emergent from meaningful constructions stored in the mental lexicon. These models
are better suited to handling idiomaticity, irregularity, and frequency-based usage
patterns.

Another critical point raised by Miller is the cognitive status of phrase structures. He
raises the question of whether speakers actually process and mentally represent
sentences using hierarchical phrase structures, or whether this is an analytical artifact
developed by linguists. He draws on psycholinguistic evidence suggesting that
language comprehension and production may rely more on linear processing and
memory-based strategies than on abstract tree structures. Thus, while phrase structure
remains a valuable theoretical tool for analyzing sentence grammar, Miller urges
caution in treating it as a direct reflection of how language works in the mind.

Ultimately, Miller’s treatment of phrase structure encourages a broader and more flexible understanding of syntax—one that integrates cognitive, functional, and cross-
linguistic perspectives. He advocates moving beyond rigid rule systems and toward
models that reflect how language is used, processed, and learned in real-life contexts.
Phrase structure, then, is not discarded but re-evaluated through a critical lens that
prioritizes empirical adequacy and cognitive realism.

3. Dependency Relations

Dependency relations offer an alternative to phrase structure theories by modeling syntax as a set of binary relationships between individual words, rather than
hierarchical groupings of phrases. In a dependency-based approach, the structure of a
sentence is built upon the relationships between a head word and its dependents. For
example, in the sentence The student read a book, the verb read is considered the
syntactic head of the sentence, while the student and a book are its dependents. This
model focuses on how words function in relation to one another, rather than how they
are grouped into nested constituents. According to Miller (2011), dependency
grammars provide a more direct and often more cognitively plausible representation
of syntactic structure, especially for languages that exhibit free or flexible word order.
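
The head-dependent idea can be made concrete with a small sketch. The arcs below encode the example sentence as direct word-to-word links; the relation labels are assumptions following common dependency conventions, not Miller's own notation:

    # A dependency analysis of "The student read a book", encoded as arcs from
    # heads to dependents. Note there are no NP or VP nodes: every link
    # connects two words directly.
    words = ["The", "student", "read", "a", "book"]

    # (dependent_index, head_index, relation); head index 0 marks the root.
    arcs = [
        (1, 2, "det"),   # The     <- student
        (2, 3, "subj"),  # student <- read
        (3, 0, "root"),  # read heads the whole sentence
        (4, 5, "det"),   # a       <- book
        (5, 3, "obj"),   # book    <- read
    ]

    for dep, head, rel in arcs:
        head_word = "ROOT" if head == 0 else words[head - 1]
        print(f"{rel:>5}: {words[dep - 1]} -> {head_word}")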

Miller gives considerable attention to the advantages of dependency relations in syntactic analysis. He emphasizes that this model is particularly effective for dealing
with languages that do not conform easily to fixed word orders or strict phrase
structures, such as many Slavic languages. In such cases, the notion of dependencies
allows syntacticians to map grammatical roles (such as subject, object, or modifier)
without having to rely on rigid constituent hierarchies. The dependency model
simplifies syntactic representation by eliminating the need for intermediate nodes like
NP (Noun Phrase) or VP (Verb Phrase), instead drawing direct connections between
heads and dependents. This economy of representation is one of its major theoretical
strengths.

Moreover, Miller critiques phrase structure grammars for often being more complex
than necessary. He notes that they frequently require additional theoretical
machinery—such as empty categories, movement traces, or transformations—to
account for surface-level variations in sentence structure. Dependency grammars, by
contrast, allow these variations to be captured more transparently through changing
dependency patterns without altering the underlying grammar. Miller argues that this
makes dependency approaches not only simpler but also more adaptable to
computational modeling, as seen in natural language processing and machine
translation.

However, Miller does not advocate for a wholesale replacement of phrase structure
theories. Instead, he promotes a pluralistic and critical approach, encouraging
linguists to consider which model best suits the linguistic data at hand. He also
recognizes that dependency grammars are not without limitations, particularly in
handling phenomena like coordination and certain types of displacement. Nonetheless,
Miller regards the dependency model as a powerful descriptive tool that aligns well
with psycholinguistic evidence, showing that humans often process language based on
direct word-to-word relationships rather than abstract phrase groupings.

In conclusion, Miller uses dependency relations to challenge traditional syntactic assumptions, highlighting their descriptive clarity, theoretical economy, and cross-
linguistic relevance. He positions them as a valuable framework within a broader
syntactic toolkit, particularly in cases where conventional phrase structure analysis
falls short.

4. Lexicon

The lexicon plays a foundational role in any theory of syntax, acting as the mental
repository of a speaker’s knowledge about words, including their meanings,
grammatical properties, and syntactic behavior. In traditional generative grammar, the
lexicon is often seen as a separate component from the syntactic system, providing the
basic building blocks—words with specific features—that syntax combines according
to rule-based phrase structures. However, in A Critical Introduction to Syntax, Miller
(2011) challenges this modular separation and presents a more integrated and
dynamic view of the lexicon, drawing on insights from functional and usage-based
models of grammar.

According to Miller, the lexicon is not just a static list of vocabulary items but a richly
structured system containing detailed information about how words function in
context. This includes morphological information (such as tense and number),
syntactic categories (like noun or verb), subcategorization frames (e.g., whether a
verb requires an object), and argument structure. Importantly, Miller critiques the
traditional Chomskyan model for minimizing the lexicon’s role in syntactic
explanation, arguing that many grammatical patterns can be more effectively
understood through lexical properties rather than abstract syntactic rules.

One of the key perspectives Miller brings to the discussion is the idea that much of
what is traditionally attributed to syntax may actually reside in the lexicon itself. For
example, the fact that the verb give typically requires three arguments (She gave him a
book) reflects a lexical pattern, not just a syntactic rule. Similarly, verbs like sleep or
arrive require only one argument (He slept, She arrived), and these valency patterns
are encoded in the lexicon. From this point of view, syntax emerges not from an
independent rule system but from the interaction of lexical items and their stored
usage patterns.
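
As a rough illustration of valency stored in the lexicon rather than in syntactic rules, the sketch below records each verb's argument frame and accepts a clause only if that frame is satisfied (the frames and role names are simplified assumptions, not a complete grammar):

    # A sketch of valency information stored in the lexicon: a clause is
    # accepted only when the verb's stored argument frame is satisfied,
    # with no separate syntactic rule needed.
    VALENCY = {
        "give":   ["agent", "recipient", "theme"],  # She gave him a book
        "sleep":  ["agent"],                        # He slept
        "arrive": ["theme"],                        # She arrived
    }

    def satisfies_frame(verb, arguments):
        """Check that a verb appears with exactly the arguments it requires."""
        frame = VALENCY.get(verb)
        return frame is not None and len(arguments) == len(frame)

    print(satisfies_frame("give", ["She", "him", "a book"]))  # True
    print(satisfies_frame("sleep", ["He", "a book"]))         # False: sleep is monovalent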

Miller also draws attention to the Construction Grammar perspective, where the
boundary between the lexicon and syntax becomes blurred. In this model, both words
and larger grammatical constructions (like the causative or passive) are stored as part
of a continuum of form-meaning pairings. That is, constructions themselves have
lexical-like status, and syntax is viewed as an extension of the lexicon rather than a
separate module. This approach allows linguists to better explain idiomatic
expressions, irregular forms, and cross-linguistic variation without resorting to
complex transformational rules.

Furthermore, Miller emphasizes that the lexicon is shaped by experience, usage frequency, and context. He argues that language users do not just internalize abstract
rules but develop a network of lexical associations based on what they encounter in
real-world communication. This view aligns with cognitive and psycholinguistic
research, which supports a usage-based model of language acquisition and processing.

In summary, Miller presents the lexicon not as a passive list of words but as a central,
dynamic component of syntax—one that stores not only lexical items but also
grammatical patterns and constructional knowledge. His approach underscores the
importance of lexical information in understanding syntactic structure and argues for
a more integrated and cognitively realistic view of grammar.

5. Constructions

The concept of constructions is central to alternative approaches to syntax that move beyond rule-based, generative models. In A Critical Introduction to Syntax, Miller
(2011) provides a comprehensive and critical discussion of constructions, especially
in relation to Construction Grammar, a framework that challenges the traditional
division between syntax and lexicon. According to this view, constructions are
learned pairings of form and meaning that range from single words to complex
sentence patterns. This approach contrasts with formal syntactic theories, where rules
are abstract, universal, and operate independently of meaning. For Miller,
constructions offer a more flexible and realistic account of how syntax actually works
in natural language.

Miller emphasizes that constructions are not limited to regular, rule-governed patterns
but also include idiomatic and irregular expressions that cannot be predicted from
general syntactic principles. Examples like give me a break, the more, the merrier, or
let alone show that language users acquire holistic units that carry both form and
meaning. These expressions defy traditional phrase structure analysis but are easily
captured in a construction-based model. In such models, a construction is not only a
syntactic template but also carries semantic and sometimes pragmatic information.
For instance, the ditransitive construction [Subject + Verb + Object1 + Object2] as
in She gave him a book conveys a meaning of transfer, regardless of the specific verb
used.
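
A hedged sketch of this idea: the ditransitive construction below is stored as a pairing of a form template with a transfer meaning. The representation and field names are invented for illustration and are not drawn from any particular Construction Grammar formalism:

    # A construction as a stored form-meaning pairing. The ditransitive
    # template itself contributes the transfer meaning, whichever verb
    # fills the verb slot.
    DITRANSITIVE = {
        "form":    ["Subject", "Verb", "Object1", "Object2"],
        "meaning": "{subj} causes {obj1} to receive {obj2} by {verb}-ing",
    }

    def interpret(subj, verb, obj1, obj2):
        return DITRANSITIVE["meaning"].format(
            subj=subj, verb=verb, obj1=obj1, obj2=obj2)

    print(interpret("She", "give", "him", "a book"))
    # She causes him to receive a book by give-ing
    print(interpret("She", "fax", "him", "the contract"))
    # the transfer reading appears even with verbs new to this frame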

Miller further argues that constructions are psychologically real, as supported by usage-based and cognitive linguistics research. Speakers and learners of a language
internalize these constructions as wholes, often without abstracting away from the
individual patterns. This perspective aligns with findings in language acquisition,
where children appear to learn language by acquiring frequent constructions before
they understand or apply abstract grammatical rules. Thus, constructions are not
byproducts of rules but are fundamental units of syntactic knowledge.

One of Miller's key contributions in discussing constructions is his critique of the sharp distinction drawn in traditional generative grammar between the lexicon (where
meaning is stored) and syntax (where structure is generated). In Construction
Grammar and related theories, there is no such division: both words and larger
syntactic patterns are stored and accessed in similar ways. This model better accounts
for the fluid and gradient nature of language, including how speakers often innovate
with existing constructions or blend them in creative ways.

Moreover, Miller stresses the importance of frequency and context in shaping constructions. Frequently used patterns become entrenched in the minds of speakers
and shape expectations during language processing. This focus on actual usage
contrasts with idealized models that ignore performance data. In his critical stance,
Miller advocates for a syntax theory that accounts for the richness and variability of
real-world language use, something that the construction-based approach does
effectively.

In conclusion, Miller treats constructions as a central element of syntactic theory, essential for understanding both the regularities and irregularities of language. He
views them as cognitively grounded, empirically observable, and more adequate in
explaining the dynamic nature of syntax than abstract rule-based systems.

6. Case Theory

Case Theory is a central component of generative grammar, particularly in Chomskyan frameworks, and deals with how noun phrases (NPs) receive grammatical
"case" in syntactic structures. Miller (2011) critically reviews Case Theory, especially
its use in earlier transformational grammar and Government and Binding (GB) theory.
In these models, case is not simply a morphological marker (like the possessive
“’s” in English or the nominative/dative endings of other languages); instead, it is treated as an
abstract syntactic requirement that ensures well-formedness of sentence structures.

In GB theory, Case Theory posits that every noun phrase must receive (abstract) case
from an appropriate governor (like a verb, preposition, or tense). For example, in the
sentence He saw her, the verb saw assigns accusative case to the object her, while the
subject he receives nominative case from the finite tense (T). According to Miller, this
theory was introduced to explain patterns of pronoun usage, word order, and
movement phenomena across languages. It accounts for why certain sentences are
ungrammatical—not because of semantic issues, but because case has not been
properly assigned.
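
The logic of abstract case assignment can be sketched as a simple filter. This is a toy model of the GB idea rather than Miller's own formulation: the subject must carry the form licensed by finite tense, the object the form licensed by the verb:

    # A toy GB-style case filter for English pronouns. Sentences fail on
    # case alone, not on meaning: finite T licenses nominative on the
    # subject, the verb licenses accusative on its object.
    NOMINATIVE = {"I", "he", "she", "we", "they"}
    ACCUSATIVE = {"me", "him", "her", "us", "them"}

    def case_licensed(subject, obj):
        """A simple transitive clause passes only if both case demands are met."""
        return subject in NOMINATIVE and obj in ACCUSATIVE

    print(case_licensed("he", "her"))   # True:  "He saw her"
    print(case_licensed("him", "she"))  # False: "*Him saw she" is out on case grounds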

Miller is critical of the abstractness and complexity of Case Theory. He questions the
psychological and empirical basis for abstract case, especially since many languages
do not mark case overtly. For instance, English largely lacks morphological case
distinctions beyond pronouns. Miller argues that such an abstract system might reflect
theoretical elegance rather than actual linguistic behavior. He also notes that
alternatives—like Dependency Grammar and Construction Grammar—handle
grammatical roles through syntactic relationships rather than abstract licensing
mechanisms like case.

Furthermore, Miller draws attention to languages with rich case systems, such as
Russian or Latin, where case is morphologically visible. Even here, he argues, a
functionalist explanation—where case helps disambiguate meaning or clarify
syntactic roles in free word order contexts—may be more appropriate than abstract
rule-based theories. He emphasizes the need for syntax to be grounded in language
use and communicative function rather than idealized models.

In conclusion, while acknowledging the explanatory power of Case Theory within formal syntax, Miller calls for a more critical, empirical, and usage-based approach.
He encourages linguists to consider whether abstract mechanisms like case are
genuinely necessary or if simpler, more cognitively plausible models can better
account for syntactic patterns.

7. Movement

Movement refers to the transformational process where constituents are displaced from their original positions to satisfy syntactic constraints. In generative grammar,
particularly in Chomsky’s Transformational-Generative and Government and Binding
models, movement is a central concept. For example, in a question like What did you
eat?, the object what is said to move from its original position after eat to the
beginning of the sentence. Miller (2011) examines movement critically,
acknowledging its role in traditional syntactic explanation but questioning its
necessity and cognitive reality.

Miller explains that movement rules were originally devised to account for word order
variations that cannot be explained by surface structure alone. These include wh-
movement (as in questions), NP movement (as in passives), and head movement (as
in auxiliary inversion). Generative syntax assumes that such movements leave behind
a trace or empty category, which is used in interpretation. However, these invisible
elements increase the theory’s complexity and abstractness.
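
The following sketch shows, in simplified notation, how a movement analysis represents the fronted wh-word and the trace it leaves behind; the structures are illustrative, not a full derivation:

    # Wh-movement sketched as a before/after pair, where "t" stands for the
    # trace, the empty category left in the object position after "eat".
    base_order    = ["you", "ate", "what"]              # simplified pre-movement clause
    surface_order = ["What", "did", "you", "eat", "t"]  # after fronting and inversion

    # Interpretation recovers the fronted word from the position of its trace:
    fronted = surface_order[0]
    gap = surface_order.index("t")
    print(f"'{fronted}' is interpreted at position {gap}, as the object of 'eat'")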

Miller’s critique focuses on this abstractness. He questions whether postulating such movements is the most economical or empirically justified way to account for
variation. In many languages, surface word order can be explained through linear
dependencies and discourse-driven choices without resorting to movement. Moreover,
constructionist and usage-based theories explain such structures in terms of stored
patterns and frequency rather than derivations from deep structure.

He also discusses alternatives such as Construction Grammar, where the need for
movement is greatly reduced or eliminated altogether. In these models, constructions
that appear to involve movement (e.g., questions or passives) are treated as distinct
patterns with their own syntactic and semantic properties, learned and stored
holistically by speakers.

In summary, while movement provides a powerful way to explain complex syntactic patterns within generative grammar, Miller highlights its theoretical cost and
questions whether alternative, non-derivational models may offer more cognitively
and descriptively appropriate explanations.

8. Grammaticality

Grammaticality refers to whether a sentence is considered acceptable according to the syntactic rules of a language. In generative grammar, grammaticality is often
judged in terms of conformity to an internalized set of formal rules, not whether the
sentence is meaningful or commonly used. Miller (2011) challenges this rule-based,
binary notion of grammaticality and calls for a more nuanced, usage-based
understanding.

Miller critiques the idea that grammaticality is strictly determined by syntactic rules
abstracted from real language use. He points out that many sentences judged
ungrammatical by native speakers are not necessarily syntactically ill-formed—they
might be unusual, pragmatically odd, or simply unfamiliar. For example, sentences
with heavy center embedding (The boy the girl the teacher scolded liked cried) are
technically grammatical by rule but are almost incomprehensible in practice.
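
The point is easy to demonstrate: the sketch below builds center embeddings of increasing depth with one and the same rule, so all of its outputs are grammatical by rule, yet the deepest one is barely processable:

    # Generating center embeddings of increasing depth. Every level is
    # produced by the same simple rule, so each sentence is "grammatical
    # by rule", yet by depth three it is close to unprocessable.
    def center_embed(nouns, verbs):
        """Stack the noun phrases, then unwind the verbs in mirror order."""
        return " ".join(f"the {n}" for n in nouns) + " " + " ".join(verbs)

    print(center_embed(["boy"], ["cried"]))
    # the boy cried
    print(center_embed(["boy", "girl"], ["liked", "cried"]))
    # the boy the girl liked cried
    print(center_embed(["boy", "girl", "teacher"], ["scolded", "liked", "cried"]))
    # the boy the girl the teacher scolded liked cried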

He proposes that grammaticality should be seen as a gradient rather than a binary concept. Language users make acceptability judgments based on multiple factors:
frequency of usage, processing difficulty, contextual appropriateness, and semantic
clarity. Miller supports this view with psycholinguistic and corpus-based evidence,
showing that acceptability often correlates with familiarity and exposure, not just with
syntactic conformity.

In constructionist and usage-based models, grammaticality emerges from patterns in language use. If a certain construction is frequently heard and used, it is judged
grammatical. This perspective challenges the idea that there is a universal grammar
separate from language performance and encourages linguists to study language as it
is actually used.

Thus, Miller reframes grammaticality as an empirical, experience-based phenomenon rather than a product of idealized rules. He advocates for syntactic theories that align
more closely with language users' actual intuitions and linguistic behavior.

9. Locality Conditions

Locality conditions are constraints in generative grammar that restrict how far
elements can "move" or relate to each other in syntactic structures. These conditions
are meant to explain why certain transformations, such as wh-movement or raising,
cannot cross specific boundaries. Miller (2011) explores these conditions as part of
his broader critique of movement and transformational rules, questioning their
necessity and universality.

In traditional models, locality principles like Subjacency and the Minimal Link
Condition limit movement to the nearest available position to avoid ungrammaticality.
For instance, the unacceptability of What do you wonder whether John bought? is
attributed to a violation of Subjacency, where what attempts to move across too many
clause boundaries. These principles are designed to make movement more predictable
and constrained.

Miller critiques these conditions as ad hoc solutions to problems created by the movement framework itself. He argues that many such constraints are introduced not
because they reflect language use, but to patch up theoretical inconsistencies in
transformational models. He asks whether these conditions are psychologically real or
whether they simply reflect the internal logic of a complex theory.

In construction-based approaches, where movement is either minimized or eliminated, such locality constraints are unnecessary. Instead of explaining ungrammaticality
through failed movements, these models account for acceptable constructions through
pattern recognition and frequency of usage. For example, rare or dispreferred
structures may be judged unacceptable not because they violate formal constraints,
but because they are unfamiliar or semantically awkward.

Miller suggests that syntactic theory should move away from abstract, rule-driven
constraints and focus more on empirical evidence from language use, acquisition, and
processing. He promotes a more dynamic, flexible model of syntax that
accommodates variation without relying on rigid locality conditions.

In conclusion, while locality conditions have been important in generative syntax, Miller encourages linguists to question their necessity and to seek models grounded in
cognitive plausibility and actual language behavior.
