Topics in Grammar
A Workbook
John Corbett
University of Sao Paulo: Letras
2019
Grammars of English
Chapter 1
Early Approaches to Grammatical Analysis
1.0 Introduction
The name of this coursebook is ‘Topics in Grammar’, with the emphasis on topics. Its aim is to give an overview of some of the main grammatical theories developed in the twentieth and early twenty-first centuries. The workbook is, however, not exhaustive – some grammatical theories are not covered – and I do not wish to imply that grammatical theorising began in the early twentieth century. People were thinking and writing about grammatical relationships in Ancient Greece – indeed, many of our traditional categories (such as noun, verb, conjunction, gender, case, tense, and so on) can be found in the work of early Greek scholars such as Dionysius Thrax, who lived in the second century B.C. However, in the last one hundred years, the study of the grammar of contemporary vernacular languages has regained the status of an academic subject. Different ‘schools’ of grammatical theory have developed, all of which seek to account for the facts of language.
Many university undergraduate courses select the grammatical approach of one of these
‘schools’ and deliver a detailed introduction to that approach. This course is different.
We will be looking at several approaches in somewhat less detail, in order to give a
general orientation towards what grammatical theories try to do. We will argue that
grammatical theories can be distinguished by their initial assumptions about what
language is and by the overall goals of their grammatical descriptions. Much of this
course will focus on the various approaches in order to show how their different
assumptions lead them to think about grammar in rather different ways. The workbook
is not organized chronologically, but thematically. To preview the workbook as a
whole, we begin with functional grammar. We look at a popular and widely discussed
linguistic theory, Systemic-Functional (SF) grammar, which assumes that grammar
realises a set of meaningful options that develop to serve the individual and social
requirement to communicate in different situations. The goal of the systemic-functional
grammarian is to identify and describe the set of options or meaning potential of any
given language. We then contrast functional grammar with a sequence of more formal
approaches, some of which may be familiar to you from earlier courses you have taken.
One of the earliest formal approaches, Immediate Constituent (IC) grammar, supposes
that language is a complex set of structures which must be described using procedures
which can be scientifically validated. A later approach, Transformational-Generative
(TG) grammar, supposes that the structures that are acceptable in any given language
derive from a set of instinctive mental procedures, and so the job of a grammarian is not
just to describe the structures, but to model the kind of mental procedures which will
generate the kind of sentences which are naturally produced by a native speaker. A
development of TG grammar, Universal Grammar (UG) seeks to model and describe
the initial mental processes which are supposedly common to all communicating
humans, and which allow them to develop the particular grammars of their various
mother tongues. More recently, since the early 1980s, technological progress in the
development of corpus-based linguistic analysis has allowed the computerised
searching of large amounts of ‘authentic’ language production to verify or challenge our
intuitions about how grammar works. Corpus-based grammars claim to be ‘data-
driven’ – that is, they claim to arise from the analysis of language as it is used, not from
the intuitions of the contemplative grammarian. However, as we shall see, there is no
such thing as a description of grammar that is innocent of some kind of theoretical
assumption. Finally, we combine corpus-based grammar with a consideration of more
recent cognitive grammars, which attempt to account for the facts of grammar by
appealing to more general ways in which human beings perceive and make sense of the
world around them.
The breadth of this workbook is achieved at the cost of detail. This course is intended
to give a flavour of each grammatical theory by suggesting its goals and hinting at its
procedures. In the class tests and final essay, you are expected to demonstrate your
familiarity with the broad sweep of the survey presented here, but you may also wish to
look in greater depth at one of the theories that appeals to you particularly. In other
words, you are expected to use your own initiative to find out more about at least one of
the grammatical theories introduced here. To this end, some introductory reading (upon
which much of this workbook is based) is recommended in the closing pages of this
workbook, and you will be given credit if you can show explicitly that you have
engaged with some of these books in a critical and perceptive way in your assessed
work.
In most basic grammar courses, you would first of all give these words form labels, e.g.
The form labels tell us what kind of word each of the above is. They do not tell us the
relationship between the words as such. To show this, we give the words function
labels – labels that tell us the relationship of each word to the others around them. The
three options for functional labels at the level of word and phrase are modifier (M),
headword (H) and neither (x):
H     H       M     H       x        H     H     H
I     joined  this  course  because  I     love  grammar.
pn    V       d     N       c        pn    V     N
By looking at the functional labels, we can divide the sentence into phrases, usually
labelling each according to its headword:
H       H          M H              x        H       H        H
(I)     (joined)   (this course)    because  (I)     (love)   (grammar).
NP pn   VP V       NP d N           c        NP pn   VP V     NP N
We can now examine the functional relationship – not only between the words but
between the phrases. Basically, this means asking what kind of job each NP is doing,
what each VP is doing, and so on. In other words, we can begin to perceive the function
of each phrase as a Subject, Predicator, Object, Complement or Adverbial, and we
might come up with something like the following:
S H     P H        O M H            x        S H     P H       O H
(I)     (joined)   (this course)    because  (I)     (love)    (grammar).
NP pn   VP V       NP d N           c        NP pn   VP V      NP N
S H     P H        O M H            A:  x        S H     P H       O H
Se { MCl [ (I)  (joined)  (this course)  SCl [ because  (I)  (love)  (grammar). ] ] }
NP pn   VP V       NP d N           ACl:  c      NP pn   VP V      NP N
Now we have shown that the embedded clause, [because I love grammar], functions as
an Adverbial in the main clause: it is therefore an Adverbial clause.
Not every introduction to grammar, nor every grammar reference book, uses the same
labels to describe grammatical categories – some books call Adverbials Adjuncts, for
example, and others merge Objects and Complements as different kinds of Complement.
But basically, with a little adjustment or flexibility with labelling, we will assume that
you can take a relatively complex sentence like the example shown above, and you can
identify the form labels, and how the words (= pn, N, V, etc) combine, as modifiers plus
headword, into phrases (NP, VP, AjP etc), which in turn function as clause elements
(SPOCA), which finally combine into sentences.
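If it helps to see how the layers stack up, the whole analysis can be written out as a nested structure. The short Python sketch below is purely illustrative – the representation is my own convenience, not part of any grammatical theory – but it records the same form and function labels as the example above:

# A hand-built representation of:
#   (I) (joined) (this course) because (I) (love) (grammar).
# Each phrase is a (clause element, phrase type, [(word class, word), ...]) triple;
# the conjunction stands outside the phrases, so its element label is 'x'.

main_clause = [
    ("S", "NP", [("pn", "I")]),
    ("P", "VP", [("V", "joined")]),
    ("O", "NP", [("d", "this"), ("N", "course")]),
]

adverbial_clause = [
    ("x", "c", [("c", "because")]),
    ("S", "NP", [("pn", "I")]),
    ("P", "VP", [("V", "love")]),
    ("O", "NP", [("N", "grammar")]),
]

sentence = {
    "MCl": main_clause,
    "ACl": adverbial_clause,   # functions as Adverbial (A) inside the main clause
}

# Walk the structure and print each word with its form and function labels.
for clause, phrases in sentence.items():
    for element, phrase_type, words in phrases:
        for word_class, word in words:
            print(f"{word:10} word class: {word_class:3}  phrase: {phrase_type:3}  "
                  f"clause element: {element}  ({clause})")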
If you want to practise this kind of analysis, there is a good online program provided by University College London, The Internet Grammar of English, available at https://www.ucl.ac.uk/internet-grammar, and I strongly recommend that you explore that resource. This course moves beyond this kind of basic analysis. The main aim of this
course will be to consider why we analyse sentences in the ways that we do – and to
examine a range of different approaches. You can therefore expect some of the
grammatical theories we are going to look at to make different assumptions about
matters like the structure of the NP, or the relationship between Predicator and Object.
This can be confusing at first, but as you become familiar with each theory, you should
begin to understand the principles on which it is based.
What we will do is look at the main theories of English grammar over the past century:
the European tradition is represented by the Prague School linguists and their heirs,
particularly the British linguist, Michael Halliday, while the American tradition is
represented by Leonard Bloomfield and Noam Chomsky and their followers. We shall
also take a brief look at the work of corpus grammarians, focusing in particular on John
Sinclair, and we complete this overview by looking at the cognitive grammar promoted
by Ron Langacker, and others. Despite their differences, which should become
apparent, these grammarians have struggled collectively to make explicit the rules
governing the structure of sentences: how are sentences organised; what are the best
ways of classifying the linguistic items; what is the best way of representing the rules by
which sentences are described, or even generated? These are the questions which link
the bewildering array of modern grammarians, and they are questions which are
popularly traced back to the work of a Swiss scholar, often described as the ‘father of
modern linguistics’, Ferdinand de Saussure (1857-1913).
It was Saussure who formulated, in a clear and accessible way, some guidelines for future linguists to follow.
A crucial distinction he made was between langue and parole – two words that are
sometimes inadequately translated as ‘language’ and ‘speaking’. Most anglophone
linguists therefore still use the French terms when referring to these concepts. ‘Parole’
is probably easier to define – the set of actual utterances which people produce when
they are speaking or writing a language constitutes the ‘parole’. The ‘langue’ is the
abstract language system that people share, the evidence for which comes from the
actual utterances, the parole. According to Saussure, what the linguist does is look at
the parole and, using it as evidence, he or she tries to describe the langue. The rules of
grammar are therefore aspects of the ‘langue’ and we can use these rules to describe
actual utterances, the ‘parole’. To use Saussure’s famous analogy with chess, the researcher looks at actual games of chess (the equivalent of ‘parole’) and, on the evidence of the way people play, tries to write the rules of the game (the ‘langue’).
If you think about it, this distinction – between actual events and the abstract system
they provide evidence for – has been influential well beyond grammar, or linguistics. It
is the key theoretical concept in structuralist theories of literature or media studies. One
way into genre studies – whether of folk-tales or film noir – is to consider a tale or a
movie as a particular example of a set of conventions or rules which it is the critic’s job
to make explicit. For this reason, you hear people talk about constructing a ‘grammar’
of folk tales, or a ‘grammar’ of film. As we move from concrete instances to the
abstract conventions that explain those instances, we are moving from parole to langue.
Consider, for example, the following two sentences, set out one above the other:

I joined this course because I love grammar.
She dropped this course because she hates grammar.

Paradigmatic relations are on the vertical axis between these two sentences: in other words, the words ‘I’ and ‘she’ are somehow related, and we know this because one can be substituted for the other. Similarly, ‘joined’/’dropped’ and ‘love’/’hates’ can be
substituted for each other – so these must be somehow related. To acknowledge this
relationship, we classify these words similarly – ‘I’/’she’ are pronouns, while
‘joined/dropped’ and ‘love/hates’ are verbs. Part of the grammarian’s rationale for
classifying these words as belonging to the same set is because they exist in
paradigmatic relation to each other. ‘I’ and ‘the’ cannot be substituted for each other –
they are not in paradigmatic relation to each other – and so they are therefore different
parts of speech.
Syntagmatic relations are on the horizontal axis. You might have noticed that there is a
problem above with the straight substitution of ‘love’ and ‘hates’: if you try swapping
them you get *’I hates’, and *’she love’ – neither of which is acceptable in standard
English. What is happening here? Obviously, the form of the pronoun is influencing
the form of the verb – you may remember this grammatical fact described in your
earlier studies as Subject-Verb concord or agreement. Since concord is a grammatical
relation along the horizontal axis of the sentence, it is syntagmatic.
Now, armed with these two powerful grammatical relations, we can begin to develop a
methodology for analysing sentences. We can ask (i) what can be substituted for any
particular linguistic unit (i.e. what exists in paradigmatic relation to it?) and (ii) what
effect does the presence of a linguistic unit have on the others around it (i.e. what are its
syntagmatic relations)? Every grammatical theory that we will look at has at its core the
questions of classification and combination – what basic parts of speech are there, and
how can they be combined into more complex units?
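To make the two tests concrete, here is a small, purely illustrative Python sketch. The tiny lexicon and the concord rule are drastic simplifications, not a serious grammar of English; the point is only that substitution (the paradigmatic test) and co-occurrence restrictions such as concord (the syntagmatic test) can be stated as explicit checks:

# Paradigmatic test: words that can replace each other in the same slot belong
# to the same class. Syntagmatic test: the choice of Subject constrains the
# form of the present-tense verb (Subject-Verb concord).

subjects = {"I": "1sg", "she": "3sg"}
present_tense_verbs = {"love": {"1sg"}, "hates": {"3sg"}}

def concord_ok(subject: str, verb: str) -> bool:
    """Return True if the present-tense verb form agrees with the subject."""
    return subjects[subject] in present_tense_verbs[verb]

for subj in subjects:                    # paradigmatic choices for the Subject slot
    for verb in present_tense_verbs:     # paradigmatic choices for the verb slot
        marker = "OK " if concord_ok(subj, verb) else "*  "
        print(f"{marker}{subj} {verb} grammar")

Running the sketch prints the acceptable combinations (I love grammar, she hates grammar) and stars the unacceptable ones (*I hates grammar, *she love grammar), mirroring the discussion above.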
6.0 Conclusion
I have begun this workbook as I mean to go on, by being very selective in the concepts
that I’ve chosen to present, in the hope that these concepts will seem reasonably clear
and simple. In every simplification there is a distortion, however, and again I hope that
you will engage in enough background reading to come to a more sophisticated
understanding of the main theoretical ideas introduced in this workbook.
A last word about Saussure – he lectured on much more than I have mentioned here –
on phonology, writing systems, dialect and even on diachronic linguistics. He was
interested in the relationship between language and the mind, and between language and the social group – as we shall see, these separate interests became the main concerns of American and European grammarians respectively, the former arguably more interested in the mind, the latter in society. Saussure is also credited
with inventing the discipline of semiology, or semiotics – ‘the study of the life of signs’
– and he saw language study as being part of this wider, as yet largely undeveloped,
discipline. This insight has again been enormously influential in twentieth century
literary and cultural studies, as well as in linguistic study.
7.0 Activities

1. Identifying words
a) Which of the following are prepositions and which are adverbs, and which can be
both? How can your knowledge of paradigmatic relations help you decide this?
b) Using similar ‘grammatical tests’, decide which of the following are prepositions
and which are conjunctions
c) Can you think of a syntagmatic test which can help you to distinguish the following
adverbs and conjunctions? Which can be either?
Note
Grammatical tests can help us identify parts of speech, but few are reliable in isolation.
Usually, several different tests have to be devised and applied together to determine how a word is functioning.
2. Identifying phrases
Identify the phrase structure of the following by marking the phrases with round
brackets. Label the phrases only.
a) a man
b) a loud-voiced man
c) a very loud-voiced man
d) A very loud-voiced man is calling for you because he wants to take you away
in his big, flashy automobile.
e) Don’t let him!
Chapter 2
Introducing Functional Grammar
1.0 Introduction
Many university departments as a whole, or individual scholars within them, are primarily
interested in what has come to be known as a ‘functional’ explanation of language
structure. Even within the functional tradition, there are many different schools. One of
the most highly developed functional explanations of grammatical structure is Systemic
Functional Grammar (SFG), and that is what we shall focus on here.
A functional approach to grammar asks questions such as:

Who is speaking?
Why is this being said?
What is the context?
What alternative ways of saying this are there?
Why has this particular realisation been chosen?
SF grammarians largely draw inspiration from the work of Michael Halliday, a British
linguist born in 1925. Much of the next few chapters is based on Halliday's work,
directly or indirectly. Although Halliday is the ‘father figure’ in systemic-functional
linguistics, he has precursors.
One important precursor was the British linguist J.R. Firth, who gave a central place to context, function and meaning in his programme for a new kind of linguistic description -- a programme which Halliday was to inherit and develop.
A crucial concept in the developing linguistic theory is system. Firth (1957: 143) wrote:
Linguists and sociologists have to deal with systems, but systems very different
from physical systems. Personal systems and social systems are actively
maintained (with adaptation and change) in the bodily behaviour of men.
***
Language and personality are built into the body, which is constantly taking part
in activities directed to the conservation of the pattern of life. We must expect
therefore that linguistic science will also find it necessary to postulate the
maintenance of linguistic patterns and systems (including adaptation and change)
within which there is order, structure, and function. Such systems are maintained
by activity, and in activity they are to be studied. It is on these grounds that
linguistics must be systemic.
VOICE --[ active
        [ passive
The network has a point of entry: the voice system at clause level. In this system there
are two options, active and passive. We can build on this network. In English it is also
possible to say James was kissed -- i.e. we can delete the actor in the passive voice.
This possibility can be added to our network:
VOICE --[ active
        [ passive --[ actor explicit
                    [ actor implicit
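One informal way of thinking about such a network is as a set of nested choices. The following sketch – an illustration of the idea in Python, not SF notation – encodes the small voice network above as nested dictionaries and lists every complete chain of choices through it:

# A dict maps a system name to its options; an option may itself open up a
# further system (here, passive opens the choice of expressing the actor).
voice_network = {
    "VOICE": {
        "active": None,
        "passive": {
            "ACTOR": {
                "actor explicit": None,   # James was kissed by Janet
                "actor implicit": None,   # James was kissed
            }
        },
    }
}

def paths(network, chosen=()):
    """Yield every complete chain of choices through the network."""
    for system, options in network.items():
        for option, deeper in options.items():
            if deeper is None:
                yield chosen + (option,)
            else:
                yield from paths(deeper, chosen + (option,))

for path in paths(voice_network):
    print(" -> ".join(path))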
We shall return to the topic of systemic networks in more detail in the next chapter.
Systemic networks show the system: the relationship between different structures. But
so far we have said little about the functional values of the elements of the structures.
That is, the systemic network has shown us the difference between Janet kissed James
and James was kissed by Janet, but we still have said little about the function of Janet in
each sentence.
In 1844 the German linguist Henri Weil published two theses which were written for his
doctorate at the University of Paris. One of them was called De l'ordre des mots dans
les langues anciennes comparées aux langues modernes -- later translated into English
as The Order of Words in the Ancient Languages compared with that of the Modern
Languages (1887). One of Weil's arguments -- which is obvious from the title of the
book -- is that word order in a sentence is meaningful. The question is: what kind of
meaning does word-order carry? Weil divided the sentence into two parts and
explained them so:
It was in the first place necessary that this other personage, with whom it was
desired to communicate, should be placed at the same point of view with the
speaker; it was necessary that a word of introduction should precede the remark
which it was intended to utter; it was necessary to lean on something present, and
known, in order to reach out to something less present, nearer, or unknown. There
is then a point of departure, an initial notion which is equally present to him who
speaks and him who hears, which forms, as it were, the ground upon which the
two intelligences meet; and another part of discourse which forms the statement
(l'énonciation), properly so called. This division is found in almost all we say.
From Weil, H. (1887; 1978) The Order of Words in the Ancient Languages Compared with that of the Modern Languages, translated by C.W. Super, Amsterdam: John Benjamins, p. 29
It is worth quoting Weil, or rather the 1887 translation of Weil, at some length, because
his attempt to articulate the differences between the two components of the sentence
finds many echoes down the decades, through the Prague School, to Hallidayan
functional grammar. Consider the functions of his two ‘parts’ of the sentence:
This formulation greatly influenced the Prague School linguist Mathesius, and he
labelled the two parts of the sentence Theme and Rheme: the Theme being the given
information, or point of departure, and the Rheme being the relatively new information.
How does this division work in practice? To answer this, consider the two versions
below of a paragraph written by the cultural theorist, Raymond Williams. Which of the
two versions do you find more readable?
Version A
What is the history of film? We are likely to put a defining emphasis on 'film' and pass lightly
over 'history' in considering this question. The noun that brings us to our subject is 'film'. Its
already defined properties seem to be followed naturally by its history or any other intellectual process relevant to it.
Version B
What is the history of film? In considering this question, we are likely to pass lightly over 'history'
and put a defining emphasis on 'film'. 'Film' is the noun that brings us to our subject. Its history,
or any other intellectual process relevant to it, seem to follow naturally from its already defined
properties.
From ‘Film History’ in Raymond Williams (1990) What I Came to Say Hutchinson p132
B is the version as it was originally published. If you look at it, you can see the
thematic progression from sentence to sentence. Sentence (1) asks a question. In
Version B, sentence (2) begins by referring to this question (which has just been read
and is therefore ‘known’), and ends with the claim that ‘film’ rather than ‘history’
should be the focus of the discussion. Sentence (3) picks up the Theme of ‘film’ and
Sentence (4) picks up the Theme of ‘history’.
In Version A, as I have rewritten the paragraph, each sentence begins with a Theme
which carries a considerable amount of ‘new’ information, and many of the Rhemes
also carry ‘given’ information. The question in sentence (1) is followed by thematic
‘We’ in sentence (2). Sentence (2) then presents the question that we have already read
in sentence (1) as its ‘new’ information at the end. Sentence (3) opens with ‘The noun
that brings us to our subject’ -- the thematic position in the sentence suggests that we
should be acquainted with this noun but we are not. The noun turns out to be ‘film’ --
which was introduced in sentence (1). The final sentence begins with a reference to
film's ‘already defined properties’ -- which, again, because of its thematic position in
the sentence, we perhaps feel that we should know something about.
If you found Version A less readable, it might be because the Theme and Rheme in each of
its sentences defy our expectation that sentences in English should begin with relatively
‘known’ or ‘given’ information, and proceed to relatively ‘new’ information. That at
least was the expectation of Mathesius, following Weil. The study of Theme and
Rheme was subsequently developed into a theory of Functional Sentence Perspective
(FSP) by Prague School linguists such as Firbas and Daneš. Prague School linguists
today see Theme and Rheme as a conflation not only of word order but also of
intonation patterns -- the nucleus of the tone unit falls on the new information.
...the Theme is the starting-point for the message; it is what the clause is going to
be about. So part of the meaning of any clause lies in which element is chosen as
its Theme. There is a difference in meaning between a halfpenny is the smallest
English coin, where a halfpenny is Theme ('I'll tell you about a halfpenny'), and
the smallest English coin is a halfpenny, where the smallest English coin is
Theme ('I'll tell you about the smallest English coin'). The difference may be
characterised as 'thematic'; the two clauses differ in choice of Theme. By glossing
them in this way, as 'I'll tell you about...' we can feel that they are two different
messages.
The point to grasp here is that the ‘functional’ part of the ‘Systemic-Functional’ label
should be seen in the context of a tradition of trying to understand grammar as the
organisation of meaningful, functional elements which presuppose a relationship
between a speaker, a hearer and a text. Theme and Rheme are two such functional
constituents. There are more, as we shall see in Chapter 3.
Whereas some grammarians would see a sentence in the active voice and a sentence in
the passive voice as preserving meaning, SF grammarians view the two sentences as
different in meaning. In SF grammar, each sentence would represent a choice from a
system of meaningful options, and the difference in meaning can be understood by
referring to the functional elements which make up each clause – elements such as
Theme and Rheme.
What I have tried to do here is to give a rough sketch of some of SF grammar’s intellectual
ancestry. In the coming chapters we shall consider system and function in more detail.
Chapter 3:
What is ‘Systemic’ about Systemic-Functional Grammar?
1.0 Systems
Chapter 2 considered very briefly the general concept of ‘systems’ in systemic
functional grammar. This chapter moves on to a more detailed description of some
important systems of English grammar.
Systemic descriptions of a language are prompted by the realisation that languages are
complex, interwoven, interrelated structures: i.e. systems. Elements of a systemic
grammar have value because they are in contrast with other elements as part of a
system; in other words, each element of a systemic grammar does not have a value in
itself, but has value in contrast with other elements, which form part of a network of
related elements. In Margaret Berry's words: ‘A system, then, is a list. It is a list of
things between which it is possible to choose.’ (1975: 143)
The example used in Chapter 2 was the verb system of voice: verbs in English can be
active or passive. The individual categories, ‘active’ or ‘passive’, would not mean
anything by themselves: their value arises from the contrast, the fact that there is, in this
case, a binary option between them. Voice is therefore a system: it is a short list of
things (active, passive) and it is possible to choose between them.
We can take the process of describing systems a step further by introducing the notion
of delicacy. ‘Delicacy’ refers to the degree of complexity given in our descriptions.
We could confine our description of the voice system to the listing of the options,
active/passive, or we could go on to specify further options in the system. In Chapter 2
we added one more set of options to our network: the option of expressing or deleting
the actor in a passive structure. This addition to the network gives our description a
greater degree of delicacy.
To sum up, part of the project of SF grammar is to account for the possibilities of
English grammar, by displaying the options available in so-called ‘systemic networks’
or ‘systemic nets’. The remainder of this chapter will be devoted to looking at the way
a more complex net is built up, one which incorporates the simpler system of voice.
One final word before we embark on the adventure of building up a complex systemic
network: the basis for differentiating between options in a systemic net is meaning. The
basis, then, for differentiating between active and passive in the verb system, is that the
structures have related but contrasting meanings: Janet kissed James has a different
meaning from James was kissed by Janet. Moreover, James was kissed by Janet has a
different meaning from James was kissed. Note that this claim -- that the meanings are
different in some way -- distinguishes SF grammar from some other theories of
grammar, which argue that in the passive transformation meaning is preserved.
Propositional meaning is preserved, in this case, but more emphasis is placed in SF
grammar on nuance.
2.0 Entry Conditions
Systems have a point of entry -- that is, the point at which we start. There are two
things which should be noted about this point of entry. First, it is a point which will
lead onto a series of distinctions between elements which are related in meaning. There
is no reason for trying to distinguish between elements which are not related in meaning
-- between, for example, the number system and the tense system. Whether an element
is singular or plural has little bearing on whether it is marked for past or present tense.
So, when you are starting out, you confine yourself to those areas of the grammar where
the meanings are in some way related.
Secondly, this relationship in meaning has to have some representation on the ‘surface’
of the grammar. To take the example of the number system again, English has a two-part
number system consisting of singular and plural. Plurality is marked by such ‘surface’
features of the grammar as morphology (-s suffix for regular plurals) and verb concord.
Some other languages are different: some languages have a three-part number system,
consisting of singular, dual and plural (plural now being more than two items). This
kind of system will also have representation on the ‘surface’ of the grammar, for
example different noun suffixes for one item, two items, and more than two items.
Some languages categorise their nouns partly at least by their perceived length, or in
some cases by how dangerous they are (cf. Lakoff's Women, Fire and Dangerous
Things). In these languages, the noun systems will be represented somehow on the
surface of the language – e.g. by the choice of particular determiners for each class of
item.
We might note in passing that English does categorise nouns in a particular way -- in
the distinction between count and mass nouns. We talk happily of one/two/three pens
but not *one/two/three chalks. We would therefore be justified in writing a systemic
network for the count/mass distinction because it does have a representation in the
grammar of the language.
Once we have identified an area of the language where there is a choice between a set of
related meanings, and once we have satisfied ourselves that these related meanings are
represented in the grammar of the language, we can begin our network. We usually
start by specifying the rank of the unit to which the system applies. That is, we state
whether the system in question applies at the rank of clause, phrase, word, or morpheme.
The system of gender, for example, would relate to the rank of lexical item (Berry 1977:
62) and we would look at the way words are classed according to whether they are
animate/inanimate and (if animate) masculine/feminine/neuter, etc.
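As a rough aide-memoire, each system can be recorded together with its rank, its point of entry, its options, and the ‘surface’ features that expose those options. The record below is a hypothetical illustration only – the field names are mine, not standard SF terminology – using the English number system discussed above:

from dataclasses import dataclass, field

@dataclass
class System:
    name: str
    rank: str                    # clause, phrase, word, morpheme...
    entry_condition: str         # where in the grammar the choice arises
    options: list                # the related meanings we choose between
    surface_exponents: list = field(default_factory=list)  # how the choice shows up

number = System(
    name="number",
    rank="word (noun)",
    entry_condition="count noun",
    options=["singular", "plural"],
    surface_exponents=["-s suffix on regular plurals", "Subject-Verb concord"],
)

print(number)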
3.0 A Case Study: Transitivity and Voice
As a case study, we are going to look at the way a complicated systemic network is built
up for the system of transitivity, and we shall also see the way in which the system of
voice relates to the transitivity system.
The transitivity network (shown in part at the end of this chapter) is adapted slightly
from the one shown in Berry (1975; 1989: 189).
Step 1
The point of entry is at clause level. Here there are two options: major clause and minor
clause. Major clauses are those which have verbs, eg Carol read a book. Minor clauses
do not contain a verb, and include expressions like How about you? and University
lecturer forgery shock! Since we are concerned partly with processes, and since
processes are expressed usually by verbs, we shall concentrate here on major clauses.
CLAUSE -- TRANSITIVITY --[ major
                         [ minor
Step 2
The first step shows that the system allows two options, between major and minor
clauses. The major clause has many further options (the number varies from three to
six, depending on the theorist). Halliday (1985; see also Chapter 9) distinguishes six
types of process that are expressed by the verbs in major clauses: material, mental,
relational, behavioural, verbal and existential. Material processes are concerned with
‘physical’ events and actions, eg She shot the albatross. Mental and verbal processes
are fairly self-explanatory. Behavioural processes are restricted to those actions which
people might indulge in, eg He smiled. (Unlike material processes, these processes
relate to conscious participants and never take objects.) Relational processes express
processes of being eg She is happy, and existential processes express existence via the
distinctive There is/there are type of structure, eg There is a cat on the window-sill.
Some SF theorists conflate verbal and mental processes, arguing that verbal processes
are a kind of externalised mental process. Others stress their differences.
For a summary of the set of criteria for distinguishing process types, see Halliday, 1985:
154. Several surface criteria, for example, distinguish material from mental processes.
One of the most obvious is that the unmarked, ‘factual’, expression of a mental process
is given in the simple present (eg What do you think? I think that....), while a ‘factual’
expression of a material process is given in the present continuous (What are you
doing? I'm washing my hair...). Again, a perceived difference in meaning is supported
by identifiable differences in surface grammar.
In our example, we shall concentrate on only one type of process, the material process.
                                    [ material
                                    [ mental
CLAUSE -- TRANSITIVITY --[ major --[ verbal
                         [          [ relational
                         [          [ behavioural
                         [          [ existential
                         [ minor
Step 3
So far our systemic network has shown a straightforward list of options. A clause is
either major or minor, and in a major clause the process can be material, mental, verbal,
relational, behavioural or existential. Once we look at the options available to a
material process, however, the choice is less straightforward. At this point various
options are available simultaneously. First of all, material processes can be either actions
(She shot the albatross) or events (The boiler exploded). Actions involve participants
which are animate and have intentions. Events involve participants which are inanimate
and therefore events are not intentional.
However, there are other, simultaneous choices available to material processes. They
can be either restricted in the number of participants involved, or relatively unrestricted.
Shooting, for example, implies two participants: the shooter and the shot, and it is
therefore restricted. Opening, however, might involve either one or two participants: (a)
The window opened, or (b) She opened the window. It is therefore unrestricted. Note
that in sentence (b), we have made two simultaneous choices: unrestricted process and
action process (the actor is animate). Sentence (a), The window opened is still
unrestricted, but the process has changed from action to event (the actor is inanimate).
The third and final simultaneous choice made at this point is between typical and
untypical animacy. It is possible but unlikely (=untypical) that an action process will
have an inanimate actor: compare The gun shot the albatross and The gun murdered the
albatross. The first possibility here could be taken as a restricted event with typical
animacy -- shooting does not necessarily imply intention, if the shooter is inanimate.
But murdering does normally imply intentionality: so in this case we would have a
restricted action with untypical animacy. (You can see opportunities for discussion and
debate beginning to appear!)
The three simultaneous choices give rise to a set of possible combinations:
action/restricted/typical; or event/restricted/untypical; or event/unrestricted/typical; and
so on. In a systemic network, the simultaneity of the choices is marked by the right-
facing bracket.
            {  [ action
            {  [ event
            {
material -- {  [ restricted
            {  [ unrestricted
            {
            {  [ typical animacy
            {  [ untypical animacy
We shall concentrate on that part of the systemic network that concerns voice: the
choice between restricted and unrestricted numbers of participants.
Step 4
Unrestricted processes can be further categorised in terms of causation: She opened the
window is causative; the window opened is non-causative.
Restricted processes, in turn, can be further categorised in terms of whether they are so-
called ‘middle’ or ‘non-middle’ processes. Restricted processes imply a fixed number
of either one or two participants. If there is normally only one participant (the actor)
then the process is a middle one (eg The albatross died). If there are normally two
participants (actor and goal), then the process is non-middle (She shot the albatross).
Step 5
There is a further subnetwork of the non-middle and middle processes. Non-middle
processes, remember, are defined by the fact that we expect there to be two participants,
actor and goal (She shot the albatross). Clauses where both participants are present or
explicit are transitive. However, in some cases the goal is absent or inexplicit, and the
clause is intransitive (She shot in the air).
In middle processes we expect there to be only one participant (He died). The
subnetwork of the middle process depends on whether or not there is a marked second
participant, eg At the Glasgow Empire, comedians died a death. Other possible
examples of typical and untypical middle processes are He walks every day (typical
middle; usually one participant) and He walks the dog every day (untypical middle; two
participants where you normally have one). Note that the categorisation very much
depends on the meaning and use of the verb.
Step 6
At this point -- at long last -- we arrive at the voice system: the choice between active
and passive with which we began this discussion. How does that system fit into the
transitivity network?
One thing that the voice system needs is at least two participants, an actor and a goal.
Therefore, voice relates to unrestricted processes which are causative (She opened the
door); untypical restricted middle processes (He walked the dog); and transitive
restricted non-middle processes (She shot the albatross).
For these process-types, there is a choice between active and passive realisation. If
passive is chosen, there is the further choice of whether or not the actor is explicit. The
choices are shown in the network given at the end of this chapter.
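To see how the steps fit together, the sketch below walks some of the example clauses from this chapter through the part of the network we have built. The clause descriptions are hand-annotated, and the function is only an illustrative restatement of Step 6, not an automatic analysis procedure:

# Each example clause is annotated by hand with the choices made in the
# transitivity network; has_voice_choice() then applies the rule from Step 6:
# voice is available for unrestricted causative processes, untypical
# (restricted) middle processes, and transitive (restricted) non-middle ones.

examples = {
    "She opened the door":    {"unrestricted": True,  "causative": True},
    "The window opened":      {"unrestricted": True,  "causative": False},
    "He walked the dog":      {"restricted": "middle",     "untypical": True},
    "The albatross died":     {"restricted": "middle",     "untypical": False},
    "She shot the albatross": {"restricted": "non-middle", "transitive": True},
    "She shot in the air":    {"restricted": "non-middle", "transitive": False},
}

def has_voice_choice(features: dict) -> bool:
    if features.get("unrestricted"):
        return features.get("causative", False)
    if features.get("restricted") == "middle":
        return features.get("untypical", False)
    if features.get("restricted") == "non-middle":
        return features.get("transitive", False)
    return False

for clause, features in examples.items():
    verdict = "active/passive choice" if has_voice_choice(features) else "no passive available"
    print(f"{clause:26} -> {verdict}")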
Summary
One of the main purposes of plunging you into part of the transitivity network is simply
to alert you to the complexities of devising a systemic network. More information on
the topic is given in Margaret Berry's An Introduction to Systemic Linguistics, Volumes
1 & 2 (1975; reprinted 1989) and in Suzanne Eggins’ An Introduction to Systemic
Functional Linguistics (1994).
What should you have learned from this chapter? First of all, you should have begun to
grasp what a systemic network looks like -- even from the partial example given here.
Secondly, you will have realised that a systemic network is a way of trying to deal with
some of the complexities of English grammatical behaviour that are not necessarily
apparent in a basic grammatical analysis. In basic grammatical analyses, for example,
She opened the door and She shot the albatross are treated as being identical, since both
are SPO structures. (But if they are identical, why can we say The door opened as an
alternative to the first, but not *The albatross shot as an alternative to the second?)
Possible realisations
Close the door!
Let’s have dessert.
Where have you been?
Have you seen my phone anywhere?
We’re from Sao Paulo.
Possible labels
open, closed, imperative, interrogative, exclusive, inclusive, declarative, indicative
Consider how a network for this part of the transitivity system (ie material processes) might explain the
following realisations:
action process
intention process
event process
supervention process
typical animacy
untypical animacy
How would you explain, in simple terms, the following notation for complementary
entry into a system?
x --}
    }---
a --}
Transitivity and Voice System (outline of the network, adapted from Berry 1975)

clause
    major -- TRANSITIVITY (AND VOICE): simultaneous choices for material processes
        action (intention)  /  event (supervention)
        typical animacy  /  untypical animacy
        unrestricted -- causative (voice choice)  /  non-causative
        restricted
            middle -- typical  /  untypical (voice choice)
            non-middle -- transitive (voice choice)  /  intransitive
        [other process types: mental, etc.]
        voice (where available) -- active  /  passive (actor explicit / actor implicit)
    minor
Chapter 4
What is 'Functional' about Systemic-Functional Grammar?
1.0 Introduction
In the last chapter, we looked at the 'systemic' aspects of systemic-functional grammar,
which can be simply summarised as stating that grammatical realisations exist as part of a
system of choices. These choices are determined by differences in meaning: if you want
to mean one thing, you make one set of choices; if you wish to convey another meaning,
you make another set of grammatical choices. Systemic networks are a way of trying to
give an account of the choices available at any point in the grammatical system.
What Chapter 3 did not do was actually analyse any individual sentences. True, certain
clause types (active, passive, middle, non-middle, etc) were displayed in opposition to
each other, but in doing so we were trying to put down on paper the meaning potential of
English. In other words, we were trying to show in a fairly abstract way the kinds of
meanings that English grammatical choices can communicate. In this chapter we shall be
looking at actual sentences, and how grammatical functions are realised in these
sentences. These grammatical functions relate to the systems illustrated in the previous
chapter – it is possible to devise a systemic network that applies to them. We are simply
moving from a statement of the choices that can be made in the grammar as a whole, to a
description of the choices that have been made in any given sentence.
The question of form is usually dealt with in morphology: certain classes of word will
often have certain roots and affixes. Function in a basic grammar course is usually
confined to a consideration of how words and phrases relate to each other. For instance,
determiners and nouns relate to each other as modifier and head, and together make up an NP. If an NP shows the relationship of concord with a VP, then we have a Subject-Predicator relationship. What was labelled underneath the sentence (i.e. the word and phrase labels) were form labels, while what went above the sentence (the modifier/head or SPOCA labels) were function labels. When it was difficult to decide the classification of
a word based on form and function, the slippery and unreliable third criterion of meaning
might be called in:
S M H                P M H           A x H                function labels
Se { MCl [ (The albatross)    (was flying)    (to shore) ] }
     NP d N                   VP a V          PP pr N      form labels
In SF grammar, clauses have functional constituents which again relate to types of meaning expressed in a communicative situation. These additional functions are sometimes called ‘metafunctions’:

the ideational metafunction: the clause as a representation of experience
the interpersonal metafunction: the clause as an exchange between speaker and hearer
the textual metafunction: the clause as a message
In a systemic network, these metafunctions are simultaneous choices made for the clause:
each clause in English will be representational, it will exchange something, and it will be
constructed to communicate a message in a particular way. These functions will be
realised by the nature and sequence of the grammatical components. Altogether, the
three metafunctions are an attempt to look at the way grammar is organised and to relate
that organisation to quite specific things we do with language: describe the world,
exchange information, goods or services, and construct messages. Grammatical
categories are made meaningful – SF grammar is sometimes called a ‘semanticised
grammar’ because its categories are based on meaningful relationships rather than formal
characteristics. In the rest of this chapter, we shall consider each metafunction in turn.
You will have noticed that the ideational metafunction is very much to do with the systemic network of transitivity, discussed in Chapter 3: the ideational components of any given
clause are the result of choices made at different points in the transitivity system. Other
grammatical theories also have wrestled with this topic: Charles Fillmore's ‘case
grammar’ is an attempt to deal with similar meanings within a transformational-
generative framework (see Brown and Miller 1980, Ch 18 for a discussion based on this
approach).
You might remember from the transitivity network that different process types are
available in English. Halliday lists them roughly as follows, with key associated
participants. (Other analysts have a slightly different set of processes and participants, so
again be aware of the possibility of the terms meaning different things from theory to
theory.)
Process Type Key Participants
material Actor, Goal, Beneficiary, Range
behavioural Behaver
mental Senser, Phenomenon
verbal Sayer, Target, Receiver, Verbiage
existential Existent
relational (= ‘being’) Token, Value, Carrier, Attribute, Possessor, Possessed
The processes and participants above are illustrated in the examples below:
BEHAVER   BEHAVIOURAL
Tom       snores.

ø         EXISTENTIAL   EXISTENT
There     were          many policemen.
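For quick reference, the pairing of process types with their key participants can be held in a simple lookup, as in the illustrative Python sketch below (the role labels follow the table above; the data structure itself is just a convenience, not part of the theory):

# Halliday's process types with their key participant roles (from the table above).
process_participants = {
    "material":    ["Actor", "Goal", "Beneficiary", "Range"],
    "behavioural": ["Behaver"],
    "mental":      ["Senser", "Phenomenon"],
    "verbal":      ["Sayer", "Target", "Receiver", "Verbiage"],
    "existential": ["Existent"],
    "relational":  ["Token", "Value", "Carrier", "Attribute", "Possessor", "Possessed"],
}

# Hand-labelled analyses of the two examples above.
analyses = [
    ("Tom snores.", "behavioural", {"Behaver": "Tom"}),
    ("There were many policemen.", "existential", {"Existent": "many policemen"}),
]

for clause, process, roles in analyses:
    expected = process_participants[process]
    labelled = ", ".join(f"{role}: {filler}" for role, filler in roles.items())
    print(f"{clause} -> {process} process; {labelled} (possible roles: {', '.join(expected)})")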
Circumstances are optional elements in the clause, mainly expressed by Adverbials.
They are indirect participants in the clause. Common circumstances express the
following meanings:
Circumstances Examples
extent (space and time) ....for a mile ...for a month
location (space and time) ...in a box ...in a minute
manner (means, quantity & comparison) ...by hard work ...a lot ...like a slave
cause (reason, purpose & behalf) ...because I must ...to help you ...for love
accompaniment ...with Fred and Barney
matter ... about a matter of some delicacy
concession ...although not in the first term
frequency ...six times a week
condition ...if you give it in on time
result ...as a direct consequence
role ...as a friend
The ‘ideational metafunction’ is the technical phrase used to express one job that the
clause does: i.e. to present a ‘picture’ of some kind of real or fictional universe. It is a
universe in which participants of different kinds get involved in processes of different
types, under certain kinds of circumstance. There is not a natural one-to-one relationship
between the universe and the language used to describe it: other choices from the system
are always possible. Language, then, constructs particular world-views. Imagine, for
example, that two people express affection by touching lips. This action in the real or
fictional world can be encoded in language in a number of equally plausible ways. The
way in which it is expressed, however, will subtly change the way in which factors like
‘responsibility for the action’ are realised. Look at the following possibilities and
consider how the grammatical options chosen alter the way the action is represented:
These functions are also encoded into the grammar of English as part of the interpersonal
function, realised by constituents that we shall call the mood and the residue. Of these
two constituents, the more important is the mood, which can be further subdivided into
Subject and Finite. The Subject expresses the ‘thing’ by reference to which the
proposition can be affirmed or denied, and the Finite is the part of the verb which ‘places’
the proposition in time or ‘factuality’. In the basic grammar descriptions, these rather
complicated notions are introduced as Subject-Predicator concord: the fact that there is
a relationship between the grammatical Subject and the verb of a clause. It is this
relationship which is explored further by the investigation of mood.
The Finite ‘places’ the proposition, usually with reference to time and/or the speaker's
judgement. The Finite may be realised by an auxiliary verb or it may be ‘fused’ with the
lexical verb (as in the second example below):
The Finite is usually seen as placing the proposition in terms of time or belief: the
examples above can be interpreted as placing the proposition in the present (is) or past
(was), or in terms of possibility, present or past (might/could). These possibilities can be
seen as simultaneous options in the systems of tense and modality.
3.2.2 Residue
The structure of the residue is made up of those clause constituents which are not Subject
or Finite: namely, the Predicator, Object, Complement and Adverbial. (Note that
Halliday groups Object and Complement together as two different kinds of Complement,
extrinsic and intrinsic respectively; and Adverbial is termed Adjunct).
The Predicator (minus the Finite element) gives secondary information about time
(‘secondary tense’), and it also gives information about other verb systems, such as
aspect and voice. The lexical verb also gives information about the type of process
involved.
The Complement is either the element in the residue which might become the Subject of
another clause (ie the Object) or the element in the residue which expresses an attribute of
the Subject (ie the Complement).
The Adjunct is an element in the residue which does not have the potential to become the
Subject of another clause: it gives circumstantial information, it acts as a discourse
linking device, or it expresses a range of meanings similar to those of the modal auxiliary
verbs.
     RESI-         [ MOOD           ]     -DUE
     Adjunct         Subject   Finite     Predicator   Complement
1.   Yesterday,      she       was        peeling      potatoes.
2.   However,
3.   Perhaps,
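The same division can be spelled out element by element. In the hand-coded Python sketch below, the first example from the table is split into its Mood (Subject + Finite) and Residue elements; the segmentation is done by hand, since deciding where those boundaries fall is precisely the analyst's job:

# Interpersonal analysis of: "Yesterday, she was peeling potatoes."
clause = [
    ("Yesterday,", "Adjunct",    "Residue"),
    ("she",        "Subject",    "Mood"),
    ("was",        "Finite",     "Mood"),
    ("peeling",    "Predicator", "Residue"),
    ("potatoes.",  "Complement", "Residue"),
]

mood    = [word for word, _, part in clause if part == "Mood"]
residue = [word for word, _, part in clause if part == "Residue"]

print("Mood:   ", " ".join(mood))       # she was
print("Residue:", " ".join(residue))    # Yesterday, peeling potatoes.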
One such scholar, Mathesius, coined the terms (i) Theme, to refer to the initial element in
a clause, which often gives information that is known to the hearer, and from which the
speaker proceeds, and (ii) Rheme, which often contains new or salient information. Fairy
tales often give simple examples of the linear development of Themes:
The initial Theme (‘Once upon a time...’) is formulaic, placing the story in time (‘long
ago’) but more probably activating expectations by signalling the genre of the discourse.
The Rheme of the first sentence introduces new information, which can then become
given, thematic information in the second clause. This linear development is continued
in the third clause: Rheme becomes Theme. However, clauses 4-6 are different:
In these three clauses, the ‘given’, thematic information is derived from one Theme, the
Theme of clause three, which Mathesius calls a ‘hypertheme’. The study of how
thematic and rhematic elements contribute to the development of discourse is termed
Functional Sentence Perspective (FSP).
Within the Prague School, later scholars such as Firbas and Daneš argue that ‘givenness’
is the defining criterion of the Theme, and Themes therefore do not necessarily have to be
sentence-initial. Prague School linguists developed a theory of communicative dynamism
(CD): the part of the sentence that has the newest information has the highest CD and is
therefore the Rheme. The part of the sentence that has the least new information has the
lowest CD and is therefore the Theme. Sentence position is not crucial to the argument,
although lowest CD is usually found in sentence-initial elements.
Which element of the answers to the following two questions has the highest and lowest
CD? How might a shift in CD be signalled by intonation?
Halliday is rather vague on this point, but we can possibly consider Theme as providing
an orientation or ‘mind-set’ for the hearer-reader: it provides a framework within which
the Rheme can be interpreted. Halliday distinguishes between three types of Theme: textual, interpersonal and topical.
The Theme of any particular clause is not considered ‘complete’ until the topical Theme
is realised:
Theme                                            Rheme
Textual       Interpersonal    Topical
However,      fortunately,     she               saved the albatross.
However,      fortunately,     yesterday         she saved the albatross.
Theme            Rheme
Topical
Last month       we wanted to drive to Ubatuba,
Theme                                            Rheme
Textual       Interpersonal    Topical
but           sadly            our car           broke down.
It should be noted that some SF grammarians always include the Subject of the clause in the Theme – they would therefore include ‘we’ as part of the Theme in Last month we wanted to drive to Ubatuba. There are arguments for and against such a categorisation – if writing on Theme, make sure that your own position is explicit and consistent.
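As with Mood and Residue, a Theme analysis can be recorded element by element. The sketch below simply restates the labelling from the tables above for the two clauses about the albatross and the car; choosing the labels is the analytical work, and the code merely records it:

# Multiple Themes followed by the Rheme (labels as in the tables above).
clauses = [
    [("However,", "textual Theme"), ("fortunately,", "interpersonal Theme"),
     ("she", "topical Theme"), ("saved the albatross.", "Rheme")],
    [("but", "textual Theme"), ("sadly", "interpersonal Theme"),
     ("our car", "topical Theme"), ("broke down.", "Rheme")],
]

for clause in clauses:
    theme = " ".join(part for part, label in clause if "Theme" in label)
    rheme = " ".join(part for part, label in clause if label == "Rheme")
    print(f"Theme: {theme!r}  |  Rheme: {rheme!r}")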
4.0 Summary
The ‘functional’ part of systemic-functional grammar therefore is a way of categorising
the constituents of clauses using meaning as the primary criterion for classification.
Form and function have also to be considered, but the main questions an SF grammarian
asks include: is this constituent representing a process, participant or circumstance; how
does the clause articulate the relationship between speaker and hearer; and what does
the construction of the clause tell us about the orientation of the message, and what is to
be considered given and new? Based on the answers to these questions, we can identify
and label the grammatical constituents which make up the processes, participants and
circumstances, the mood and residue, and the Theme and Rheme. However, since
meaning is such a difficult thing to agree about, there are a number of differences
amongst SF grammarians about how these constituents are to be defined and labelled.
(1) Identify the participants, processes and circumstances in the following sentences
(apparently originating in authentic motor insurance claim forms!) and use your
description to account for the fact that they are ‘howlers’.
(a) The other car collided with mine without giving warning of its intention.
(b) I had been shopping for plants all day and was on my way home. As I reached an
intersection a hedge sprang up obscuring my vision and I did not see the other car.
(c) My car was legally parked as it backed into the other vehicle.
(d) A pedestrian hit me and went under my car.
From http://www.businessballs.com/amusement-stress-relief/insurance-claims-forms-
gaffes/
(2) Identify the participants, processes and circumstances, and discuss the way male and
female babies are therefore represented in the following birth congratulations cards:
(a) In the 20th century, America saved freedom, transformed science, and redefined
the middle class standard of living for the entire world to see. Now, we must step
boldly and bravely into the next chapter of this great American adventure, and we
must create a new standard of living for the 21st century. An amazing quality of
life for all of our citizens is within our reach.
We can make our communities safer, our families stronger, our culture richer, our
faith deeper, and our middle class bigger and more prosperous than ever before.
But we must reject the politics of revenge, resistance, and retribution -- and
embrace the boundless potential of cooperation, compromise, and the common
good.
(b) In this great nation, Americans are skipping blood pressure pills, forced to choose
between buying medicine or paying rent.
Maternal mortality rates show that mothers, especially black mothers, risk death
to give birth and in 14 states, including my home state, where a majority want it,
our leaders refuse to expand Medicaid, which could save rural hospitals, save
economies, and save lives.
We can do so much more, take action on climate change, defend individual
liberties with fair-minded judges. But none of these ambitions are possible
without the bedrock guarantee of our right to vote.
Let’s be clear. Voter suppression is real. From making it harder to register and
stay on the rolls, to moving and closing polling places to rejecting lawful ballots,
we can no longer ignore these threats to democracy.
While I acknowledge the results of the 2018 election here in Georgia, I did not
and we cannot accept efforts to undermine our right to vote. That’s why I started a
nonpartisan organization called Fair Fight to advocate for voting rights. This is
the next battle for our democracy, one where all eligible citizens can have their
say about the vision we want for our country.
We must reject the cynicism that says allowing every eligible vote to be cast and
counted is a power grab. Americans understand that these are the values our brave
men and women in uniform and our veterans risk their lives to defend.
Is there any way of pronouncing this exchange so that it sounds less strange? Compare
the Prague School concept of Communicative Dynamism, with the Hallidayan concept of
Theme in your ‘revised’ reading.
(2) Consider the following two extracts from newspaper reports of a 21-12 defeat of
Scotland by England in a rugby match. Pay attention to the thematic structure: can you tell from it which extract is from a Scottish paper, and which from an English paper?
(a) It would be ungracious to deny Scotland the credit they deserve for the defensive
scheme they threw across the Twickenham pitch like a seine net across the Spey,
which anomalously also contributed much to their downfall, as time and time
again they were penalised for lying up offside.
England won the match in the set pieces; the scrum was rock-solid and Ackford,
Dooley, and Richards provided an impregnable wall at the line-out, which gave
them emphatic possession against the Scots' more scrambled efforts.
But the English pack, which at all times moved forward like a bulldozer shifting
snowflakes, was continually checked by Scottish grit and resistance, which was
implicit in their tackling, in which Sole, Jeffrey, White and Turnbull played a
huge part.
(b) Scotland were simply bombed out. Not just by Hodgkinson's boot, but by the
accurate line-kicking of their stand-off Rob Andrew.
Scotland couldn't escape from defence often enough to mount any kind of
sustained attack.
England's juggernaut forwards saw to that. They gave nothing away in the scrum,
exerting the sort of shove which made life a joy ride for their scrum-half Hill.
Sole, Gray, Turnbull, White and Jeffrey all had England at panic-stations at odd
times, but the defence quickly regrouped and in the end yielded nothing.
The suspicion that the Scots were a bit bare in ideas behind the scrum was borne
out. There were individual touches, yes, but few combined movements.
It should be clear from the activities you have just completed that the kind of
grammatical analysis you have just done is often used in stylistic and critical discourse
analysis.
Chapter 5
Immediate Constituent Grammar
1.0 Introduction
In Chapter 1 we considered the effect of Ferdinand de Saussure's contribution to
twentieth century linguistics. We focused on a number of topics in his work: first of all,
his concentration on the state of language at a particular time, rather than its development
over time -- ie synchronic rather than diachronic linguistics. Secondly we discussed his
argument that the object of linguistic study should be the principles governing the set of
utterances as a whole, rather than any one individual utterance -- ie the grammarian
should attempt to describe the abstract rules represented by langue through reference to
individual utterances, or parole. Finally, we considered the two axes of de Saussure's
proposed grammatical relations: the paradigmatic relations which govern the selection of
different items which might be slotted into any particular part of a sentence, and the
syntagmatic relations which govern the 'horizontal' relations between any one constituent
in a sentence and another.
This chapter takes the notion of paradigmatic and syntagmatic relations a stage further,
and looks at the detailed description of sentence structure suggested by the American
linguist Leonard Bloomfield and his followers, the so-called Structural Linguists, from
the 1930s onwards. The Structuralists were less interested than Saussure in the general,
theoretical procedures governing linguistic description; they were more interested in the
practical necessities of identifying and relating the linguistic constituents of fast-
disappearing native American languages. This anthropological interest led some to
reconsider the nature of the grammatical constituents of English and to think again about
how these constituents work together.
Well, first of all, they are not random. There is a limited set of sounds and symbols, and
they seem to recur and interact in a systematic fashion. This you take to be a sign of
intelligence, so you look closer. You start sorting your data into groups of sounds or
symbols which seem to conform to patterns. One group might look like the following:
As aliens from another planet, your first act is to put this data through your Universal
Translator. This machine notices several things. First of all, the sequence laugh,
laughing, laughter occurs. This seems to be a combination of a basic form laugh with
some kind of additional forms ing and ter. Does the form ing occur elsewhere? Yes, in
rocking and making. Furthermore, the -ing forms seem regularly to be preceded by is or
are. The forms think and thinks co-occur too, as do Tom, Tommy, Bill, Billy, Billy's.
Interesting.
Your Universal Translator will now go on to look for forms like laugh or Tom or pour, or
even -ing, -n't, or -s -- none of which can be broken down further into smaller units. And
it will also pay attention to the possible ways of combining these small units into larger,
complex units -- laugh-ing, mak-ing, does-n't etc -- until it builds up a picture of which
combinations are and are not possible in this alien language. In other words, it will write
a grammar of the language by building up a picture of which selections and combinations
of grammatical units are possible. The sequences that it analyses will have the following
(idealised) form:
construction = constituent A + constituent B
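If you enjoy a little programming, the following short Python sketch (entirely my own illustration, with an invented word list and function names, and not part of any published discovery procedure) mimics the distributional reasoning described above: it proposes stems and suffixes purely from recurring form, with no reference to meaning.

# A toy 'discovery procedure': propose stems and suffixes purely from
# recurring form, with no reference to meaning (illustrative sketch only).

words = ["laugh", "laughing", "laughter", "rock", "rocking",
         "make", "making", "think", "thinks", "Tom", "Tommy"]

def candidate_splits(word, vocabulary):
    """Return (stem, suffix) pairs where the stem is itself an attested word."""
    splits = []
    for i in range(1, len(word)):
        stem, suffix = word[:i], word[i:]
        if stem in vocabulary and suffix:
            splits.append((stem, suffix))
    return splits

vocab = set(words)
suffix_counts = {}
for w in words:
    for stem, suffix in candidate_splits(w, vocab):
        suffix_counts[suffix] = suffix_counts.get(suffix, 0) + 1

# Suffixes that recur across several stems are good morpheme candidates.
for suffix, count in sorted(suffix_counts.items(), key=lambda x: -x[1]):
    print(suffix, count)

Run on this toy data, the sketch picks out -ing as the most promising suffix candidate, which is exactly the kind of hypothesis the Universal Translator is imagined to form.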
IC analysis is not entirely dissimilar to the kind of analysis done in first year. Then we
broke sentences down into phrases and phrases down into words. Morphemes were even
mentioned briefly. There is, however, one difference in IC analysis: constituents are,
wherever possible, broken into two smaller constituents, until such a division is no longer
possible (and you have your basic unit, or morpheme). This means that sequences such
as the NP the big yellow taxi or the VP should be arriving are broken into two then two
again, until the morphemes are arrived at. This gives us the following structures:
should be arriving
should + (be arriving)
should + (be + (arriv + ing))
The structural description shown above is quite detailed. Within each phrase, we can see
what the relationship of each part is: for example, in the NP the determiner the modifies
not just the head word taxi, but the sequence big yellow taxi. Similarly, in the VP the
modal auxiliary should modifies not just the headword arriving, but the sequence be
arriving. IC analysis does not usually label sequences of words as NPs or VPs –
generally IC grammarians content themselves with showing the grammatical relations
between constituents by annotating ‘tree diagrams’ such as those above, in ways that we
shall shortly consider more closely.
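Purely as an illustration (the encoding below is my own, not a standard IC notation), the binary bracketing of should be arriving can be represented in Python as nested pairs, and the ultimate constituents recovered by walking through the pairs.

# Binary IC analysis encoded as nested pairs: each construction is split
# into exactly two immediate constituents, down to the morphemes.
# (Illustrative encoding only; not a standard notation.)

ic_tree = ("should", ("be", ("arriv", "ing")))   # should + (be + (arriv + ing))

def ultimate_constituents(node):
    """Flatten a nested binary IC analysis into its morphemes."""
    if isinstance(node, str):
        return [node]
    left, right = node
    return ultimate_constituents(left) + ultimate_constituents(right)

print(ultimate_constituents(ic_tree))   # ['should', 'be', 'arriv', 'ing']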
The notion of aliens visiting Earth and trying to work out a grammar of English
according to the above principles might seem whimsical, but in fact it is not that far
distant from the motivations governing the American structural linguists. In early 20th
century America, linguistic anthropologists like the German immigrant Franz Boas
(1858-1942) were aware of the threats to the indigenous Amerindian languages, and they
very much wanted to record and describe these very different languages before they
became extinct. They needed some systematic, scientific discovery procedures to help
them figure out the meanings and structures behind the strings of sound produced by
speakers of these very different languages. The procedures developed by this
anthropological project were fed back into English language studies and used in the
description of English grammar too.
It was Leonard Bloomfield in his book Language (1933) who proposed this idea of a
basic grammatical unit, the morpheme. The morpheme is the ultimate constituent,
something that cannot be broken down into further grammatical units. For example, pour
is a morpheme, and ing is a morpheme. Both give grammatical information. One
type of morpheme can occur independently in an utterance (Tom, laugh, pour) -- these
are 'free' morphemes. Others, such as -ing, -s, 's, -n't, can only occur in combination with
other morphemes -- these are called 'bound' morphemes.
3.1: Endocentric and exocentric constructions
Take the simple sentence:
Happy people live in Recife.
Our assumption is that even a simple sentence like this is made up of complex units.
Some of these units can be reduced, for example, 'happy people' can be reduced to
'people' and the sentence will still make sense. 'Happy people' is therefore probably
structurally related in some way to the single word, ‘people’.
We can reduce the phrase 'in Recife', too, to one word 'there' -- but notice that this type of
reduction is different. We reduced 'happy people' to one of its constituents 'people' --
this single constituent is therefore deemed to be EQUIVALENT to the more complex
phrase, and so is described as its HEAD. If we can reduce a complex unit to one of its
constituent units (ie the head), we say that it is an ENDOCENTRIC construction.
However, when we reduce 'in Recife', we do not get a head word in the same class (ie a
single preposition or a noun), but another type of word, the adverb 'there'. This unit, then,
is an EXOCENTRIC or 'headless' construction.
As we have seen, 'happy people' can be reduced only to 'people' -- 'happy' is subordinate
to 'people' – in other words it is a modifier. But in the other two cases, you have a choice
of head if you reduce the phrases. 'Men and women live in Recife' can be reduced to
either 'Men live in Recife' or 'Women live in Recife'. This is therefore described as a
coordinative endocentric construction. Similarly, 'Bruno, a grammarian, lives in Recife'
can be reduced either to 'Bruno lives in Recife' or 'A grammarian lives in Recife'. Again,
we have a choice of headwords when we reduce the phrase, so it is a coordinative
endocentric construction.
The labels might sound daunting at first but remember that their function is to distinguish
between different types of relationships between constituents in a phrase. These different
relationships can be summarised in a table:
Type of construction | Reduction characteristics | Relationship between constituents
Endocentric:
The arrows > or < in endocentric subordinative constructions show the relationship
between the modifiers and the headword. The arrow points from the modifier to the
headword.
There are various points to note about this tree diagram. First, it could go further and
subdivide gangsters and raided into the morphemes gang + sters and raid + ed.
However, the level of analysis has stopped at the word. Otherwise, it is an attempt to
show the kinds of grammatical relationship between the elements of the sentence, by
breaking them into their immediate constituents, two by two. So we begin by breaking
the sentence into two: Some gangsters from Cupar and Elie + raided this store. Then
each of these constituents is further broken down, two by two, until the ultimate
constituents are reached: Some + gangsters from Cupar and Elie; gangsters + from
Cupar and Elie; from + Cupar and Elie; raided + this store; this + store. The annotations
show the kind of grammatical relationship holding between each constituent, eg the
arrows show what is modifying what.
4.1 Ambiguity:
It is not always immediately obvious from the structure of a sentence alone how it is to be
interpreted. Many sentences and structures have more than one possible interpretation.
ICA can show the possibilities but by itself it cannot identify which interpretation is
preferable in a given context. For example, the phrase Some thieves and varlets from
Elgin is open to various interpretations, such as:
(a) Some thieves from Elgin and some varlets from Elgin.
(b) Some thieves (from somewhere) and some varlets from Elgin.
(c) Some thieves and (an undetermined number of) varlets from Elgin.
etc
ICA can show by tree diagrams the different possibilities for interpretation, but it cannot
explain how people decide which is plausible in any given context.
A ‘hamburger’ (like a ‘frankfurter’) was originally a foodstuff named after its place of
origin. We therefore have historical justification for analysing it into the morphemes
hamburg+er by analogy with frankfurt+er. However, with the popularity of hamburgers,
and possibly the confusion caused by the accidental presence of the element ham, we
now have words like beefburger, cheeseburger, and so on – suggesting a morphemic
analysis of beef+burger, cheese+burger...and ham+burger. Here diachronic and
synchronic linguistics seem to pull us in different directions: the analysis hamburg+er
accords with the historical facts of the language (and indeed with the ingredients of the
foodstuff), while ham+burger accords with the way the language currently operates, with
burger as a still-productive free morpheme. It is difficult to say which is a ‘correct’
representation of the ultimate constituents of this word. Both analyses have arguments in
their favour.
The problem with sheep is that it belongs to a small class of English words that do not
formally mark the plural. Most English words mark plural by adding the bound
morpheme –s to the free morpheme of the stem, e.g. book+s. Usually, then, we can say
that the ultimate constituents of the English plural are made up of two morphemes, one of
which marks plurality. To make words like sheep, deer and aircraft fit this pattern, then,
some grammarians propose the existence of the ‘zero morpheme’ which is added to sheep
(singular) in order to arrive at sheep (plural). The former sheep is made up of one
morpheme; the latter is made up of two. There are drawbacks to this proposal, clearly
discussed in Brown and Miller (1980: 161-230). One problem, if you begin proposing
‘zero morphemes’ as ultimate constituents, is that it is difficult to know where to stop.
For example, if we argue that number in English is to be marked by the addition of a
singular or plural morpheme, then we can argue that actually the singular form has a
zero-morpheme: i.e. in book+s, plurality is marked by the morpheme +s, while in book,
singularity is marked by a zero-morpheme. If we follow this logic with sheep, however,
then we are led to the position that both singular and plural forms are marked by zero-
morphemes!
The logic here might seem tortuous, but it represents a theoretical problem which IC
analysis needs to resolve. The morpheme is an abstract grammatical concept, and
although it accounts for many of the facts of grammatical behaviour, we cannot account
for them all without tinkering with the nature of the abstract concept.
4.3 Cross-cutting
In complex constructions, word-boundaries and morpheme boundaries do not entirely
correspond. For example, in 'loud-voiced man', the word 'loud' is assigned to 'voice', not
to 'voiced' (compare 'man with a loud voice'). The morpheme 'voice' therefore cuts across
the word boundary:
loud-voiced man
Although this is not a serious theoretical problem, it does cause difficulties when we are
attempting to represent grammatical relationships diagrammatically. See Simpson (1979:
117) for a further explanation of this and other examples.
4.4 Discontinuous constituents:
Another problem arises when we are trying to represent a discontinuous constituent, that
is, a structure which is interrupted by another structure. The rather messy solution is to
make one of your lines ‘hop over’ the other, so:
For further explanation and examples, again see Simpson (1979: 117).
The problem is that it is in fact almost impossible to give an adequate description of any
part of the language system without already making assumptions about any other part,
and it is very difficult to classify items without considering their meanings. Otherwise,
we would always be barking up the wrong tree: for example, as Simpson observes (1979:
123), we would think that gooseberry had something to do with geese, and that
hipp+opotam+us shared a morpheme opotam with Mes+opotam+ia.
Still, structuralist linguistics has taught us to think carefully about how we classify
structures, and to recognise the assumptions, inconsistencies and sometimes even
contradictions that go into grammatical descriptions.
1. A brief examination revealed the proceeds of the robbery.
4. hare-brained scheme
5.2 Morphemes
What are the 'ultimate constituents' of the following words? On what basis have you
identified the morphemes? What problems arise during this identification?
1. disarm
2. dismay
3. solemn
4. condemn
5. connect
6. potash
7. potato
8. pottery
9. aircraft
10. crafty
A: Klingon
Let us begin by returning to the fantasy of visiting an alien planet and attempting to
untangle the grammatical structure of a non-human species, by observing how they
communicate. Let us imagine that you have been orbiting the Klingon homeworld, in a
cloaked starship, with a mission to boldly do an Immediate Constituent Analysis of
tlhIngan, Klingon. So far you have figured out a paradigm for intransitive verbs such as
Qong (sleep). In Klingon speech, Qong appears with pronoun prefixes:
Singular: jIQong, bIQong, Qong
Plural: maQong, SuQong, Qong
With transitive verbs, however, a whole range of verbal prefixes is employed to indicate
simultaneously Subject and Object. Take, for example, the verb legh (see). The prefixes
seem to fall into certain groups, some of which are given below (S = Subject):
Through careful ethnographic observation, you discover that one Klingon challenges
another to a duel by saying ‘show’ (ang) ‘face’ (qab). The adversary will usually
answer ‘I don’t hide (So’) it’. Observers might at that point murmur, ‘He/she shows
his/her face clearly’.
With all this data in hand, can you identify the various morphemes in the following
challenge issued by Chancellor Gowron to Worf, with Kor observing? In particular, can
you spot the morphemes that express imperative, possession and negation?
For further information, see Okrand, M (1985, 1992) The Klingon Dictionary (Pocket Books) and Okrand,
M (1997) Star Trek: Klingon for the Galactic Traveler (Pocket Books).
B: Scottish Gaelic
1. Bha an cù dubh.
2. Bha an cat bàn.
3. Bha Calum mór.
4. Bha an cù sgìth.
5. Bha Màiri beag.
6. Bha an gille mór.
7. Bha an cù beag.
8. Bha Màiri bàn.
9. Bha an gille beag.
10. Bha an cat mór.
11. Bha Màiri beag.
12. *An cat dubh bha.
13. *Bha cat an dubh.
14. *Bha dubh an cat.
15. *Bha an Calum sgìth.
16. *Bha cat dubh.
C: Bolivian Quechua
Here are six question-and-answer drills from a textbook on Bolivian Quechua, a South
American Indian language. Can you identify the morphemes that are being taught in this
unit from the textbook? (At the end of this chapter, the English translation is given and
the morphemes in question are identified.)
D. Basque
The activities above have been simplified by selecting and arranging the data. What (if
any) deductions about Basque can you make from the following excerpt from a guide to
the Guggenheim Museum in Bilbao? Can you identify recurring morphemes and make
any deductions about, say, the form of the nouns, verbs, adjectives or adverbs? Again, a
translation into English is given at the end of the chapter.
The morphemes being taught in this unit are the suffixes –chus and –pis. The form
–chus is attached to pi (who), imapaq (what for) and mashkha (how many) to indicate
that they are interrogative sentences. For example, in sentence 3 the morpheme –chus
marks the sentence as an interrogative:
This morpheme is also used in the declarative responses to indicate doubt. This doubt is
reinforced by adding –pis to the verb, eg to the relevant forms of apamuy (bring), as in
sentence 2:
For further information on Quechua, see Luis Morató Peña and Luis Morató Lara (1994)
Quechua Boliviano Trilingüe. La Paz: Los Amigos del Libro.
Translation of D: Basque
The Guggenheim Museum Bilbao opens its doors with the threefold mission of bringing
together and interpreting the most representative art of our time, fostering artistic
education and the public’s knowledge and understanding of the arts, and complementing
the extensive collection of the Solomon R. Guggenheim Foundation. The Guggenheim
Foundation, founded in the twenties by Solomon R. Guggenheim and his artistic advisor
Hilla Rebay, has collected objects produced in the twentieth century from the full range
of Western visual arts.
Chapter 6
Towards a Generative Grammar
Structuralist grammar dominated American linguistics from the 1930s to the 1950s.
Then, in 1957, a young man at the Massachusetts Institute of Technology published a slim
monograph which was to revolutionise linguistics. His name was Noam Chomsky, his
book was Syntactic Structures, and his 'big idea' was that a real grammar of English
would not just tell you the patterns that sentences conformed to, or even how to discover
the patterns that sentences conformed to -- a grammar of English should tell you how to
make these sentences. This is a hugely ambitious undertaking, because, if you have a
grammar that tells you how to generate potentially all the possible sentences in English
(and only those sentences which are acceptable), then maybe you've got a model of that
other sentence-generating device, the human mind itself. Chomskyan grammar, or
transformational-generative (TG) grammar, dominated linguistics for the next 30 years or
so.
Despite the revolutionary nature of Chomsky's grammar, he did build on concepts that he
inherited from earlier linguists: that language was a structured set of sentences, and that
certain relationships held between the structures. Like Saussure and Bloomfield,
Chomsky and his followers attempt to devise a grammar of formal rules which explain
language behaviour. What is different about TG grammar is that it is an attempt to devise a
grammar that tells us how to generate acceptable sentences, not just a grammar that
describes them when they occur. Chomsky began by devising what are called 'rewrite
rules'. There are three types of rewrite rule -- phrase structure (PS) rules,
transformational (T) rules, and morphophonemic rules – in this workbook we shall focus
mainly on two of these: PS and T-rules.
Consider the sentence The detective punched the gangster. This sentence could be
represented as a 'tree diagram' showing the constituent parts:
NP VP
Vtr NP
There are various things to note about this ‘tree diagram’ or ‘derivational tree’. The first
is that it shows slightly different grammatical relationships from those you may have
encountered in earlier grammar courses. There is no representation of functional
constituents such as SPOCA or modifiers and headwords. Chomsky’s is a formal
grammar, and so tends to ignore functional elements as such. You will note that, as in
structuralist grammar, the sentence is initially divided into two: in traditional grammars
these elements were called subject and predicate. This division means that the second
NP is deemed to be part of the verb phrase – and, indeed, this sentence would be
considered incomplete if the NP was missing: *The detective punched. The NP, then, is
considered to be an essential part of this VP.
The tree is made up of branches and nodes. The labels given to the various nodes are
sadly not consistent, even from one version of TG grammar to the next. Here Det is used
for the determiner, the, but you might equally find Art (Article) in some books. Prepare
to be flexible when you read grammatical theory, but also try to be consistent when you
are constructing your own tree diagrams! The other nodes here are NP (Noun Phrase),
VP (Verb Phrase), Vtr (transitive Verb), Vs (Verb stem), Tns (Tense, here the past tense
marker), and N (Noun). Sometimes the relationship between the labelled elements is
described as a ‘family tree’. Thus the Det and N are ‘daughters’ of the NP, which itself is
a ‘sister’ of the VP. Another way of expressing the relationship is to say that the NP
‘governs’ the Det and N, just as the Vtr ‘governs’ the Vs and Tns.
This 'derivational tree' still simply describes the sentence given. However, it can be
recast as rewrite rules which account equally well for the phrase structure of the
sentence. The arrow → means ‘can be rewritten as’.
S → NP + VP
VP → Vtr + NP
NP → Det + N
Vtr → Vs + Tns
Tns → ed
Vs → punch
N → detective, gangster
Det → the
The first four lines of this sequence of rewrite rules would allow you to generate any
sentence of this type. The next four lines fill in the lexical and morphological
components, ie the words and their inflexions. The beauty of rewrite rules is that they
account for all similar structures in the language. As well as The detective punched the
gangster, these PS rules can generate sentences such as The gangster punched the
detective, The stewardess boarded the flight, The pilot landed the plane, The car rounded
the bend. The possibilities are, in principle, boundless (though see section 5.0 below).
(a) the rule tells you to rewrite the symbol on the left of the arrow as a string of symbols
on the right of the arrow
(b) only one symbol may appear on the left of the arrow
(c) apart from the first symbol, S, anything which appears on the left of the arrow must
already have appeared further up, on the right.
(d) no symbol may be used on the left hand side more than once
(e) the symbols on the right which are not subject to rewriting constitute the lexicon
(words and morphemes); a comma separating them indicates a choice.
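To see how such a rule set actually ‘generates’ sentences, here is a minimal Python sketch of the PS rules above. The dictionary encoding and the generate function are my own illustrative assumptions, and no morphophonemic rules are applied, so the output keeps the abstract form punch ed.

import random

# The PS rules from the text, encoded as a dictionary. A comma in the text's
# lexical rules (e.g. N -> detective, gangster) indicates a choice, encoded
# here as alternative right-hand sides. (Illustrative encoding only.)
rules = {
    "S":   [["NP", "VP"]],
    "VP":  [["Vtr", "NP"]],
    "NP":  [["Det", "N"]],
    "Vtr": [["Vs", "Tns"]],
    "Tns": [["ed"]],
    "Vs":  [["punch"]],
    "N":   [["detective"], ["gangster"]],
    "Det": [["the"]],
}

def generate(symbol):
    """Rewrite a symbol until only lexical items (terminals) remain."""
    if symbol not in rules:          # terminal: a word or morpheme
        return [symbol]
    expansion = random.choice(rules[symbol])
    result = []
    for s in expansion:
        result.extend(generate(s))
    return result

print(" ".join(generate("S")))   # e.g. "the gangster punch ed the detective"

Repeated runs produce the detective punch ed the gangster, the gangster punch ed the detective, and so on: every string that the four structural rules and the small lexicon allow.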
2.1 The Quest for Power and Economy
As the complexity and power of rewrite rules develop in TG grammar, their form and
order changes from the kind of simple sequence given above. The goal of a formal
description of language by rewrite rules is to achieve a powerful and economic set of
rules which account for as many acceptable structures as possible. To that end, for
example, more sophisticated PS rules place morphemes like Tns before rather than after
the Verb stem to which they apply. This would give the rewrite rule as
Vtr → Tns + Vs. If directly rewritten as a phrase, this would obviously give the affix
before the verb stem: edpunch. To avoid this happening, it is assumed that all verbs
undergo a transformation (see Chapter 5) which makes the affix ‘hop’ from the
beginning to the end of the verb. The reasons for doing this are complex, but they boil
down to the fact that the combination of a PS rule that positions affixes before verbs, plus
a transformation rule that makes affixes ‘hop’ to the end of the verb, is in the long run
more economical and powerful than having PS rules which position affixes after the verb
stems.
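A rough Python sketch of the ‘affix hopping’ idea (the list representation and the set of affixes are my own simplifications, not a formulation from the TG literature) shows how a string generated with Tns before the verb stem can be repaired by one mechanical movement.

# A sketch of 'affix hopping': the PS rules place Tns before the verb stem
# (Vtr -> Tns + Vs), and a transformation then moves each affix to the end
# of the following verb stem. (Simplified; names are illustrative.)

AFFIXES = {"ed", "ing", "en", "s"}

def affix_hop(constituents):
    """Swap every affix with the verb stem that immediately follows it."""
    output = list(constituents)
    i = 0
    while i < len(output) - 1:
        if output[i] in AFFIXES:
            output[i], output[i + 1] = output[i + 1], output[i]
            i += 2          # skip past the stem+affix pair just formed
        else:
            i += 1
    return output

deep = ["the", "detective", "ed", "punch", "the", "gangster"]
print(affix_hop(deep))   # ['the', 'detective', 'punch', 'ed', 'the', 'gangster']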
NP VP
FIRE sing
This derivational tree accounts for sentences like The notes were burned by the fire.
Look particularly at the Aux and V elements: the inflexional information about the
formation of the tense of BE and the past participle of BURN are both included before
the verb stems themselves. For further details, see, for example, Brown and Miller (1980:
204-221). The crucial point to understand here is simply that the rules and constituents of
TG grammar are formal abstractions and do not need to conform to the sequences of
phonemes and graphemes constructed by a speaker or writer. What people actually say
and write are handled by morphophonemic rules which turn the abstract constituents into
sounds and scribbles.
those in the lexicon, the words and morphemes. That is to say, in the rewrite rules and in
the tree diagram, the word detective is actually an abstraction, a token of the word as it
might be spoken or written. If we then attempt to describe written and spoken behaviour
(and not just grammatical structures) we would have to add another set of rewrite rules,
along the lines of (for my own speech):
the → /ðə/
detective → /dɪ'tɛktɪv/
gangster → /'ɡaŋstər/
punch+ed → /'pʌnʃt/
Obviously, not all rewrite rules are this simple. To consider a small complication,
imagine that the sentence had been, 'The detective saw the gangster'. The PS rules
governing the verb would then have been:
VP → Vt + Tns
Vt → see
Tns → ed
The symbols 'see' and 'ed' would mark the choice of verb and tense, but we don't actually
say 'see+ed'. We handle this problem by the morphophonemic or morphographemic
rewrites:
Notice that for these rewrite rules, it is possible to have more than one symbol on the left-
hand-side of the arrow.
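The same point can be made with a toy morphographemic component in Python (the rule table and the realise function are my own invention): two abstract symbols on the left of the arrow are rewritten as a single surface form.

# A sketch of morphophonemic/morphographemic rewrites: abstract morpheme
# sequences are mapped to surface forms, including irregular combinations
# such as see + ed -> saw. (Rule set and spellings are illustrative.)

morphographemic = {
    ("see", "ed"): ["saw"],      # more than one symbol on the left-hand side
    ("punch", "ed"): ["punched"],
}

def realise(morphemes):
    """Apply two-symbol rewrites where they match, else keep the morpheme."""
    output, i = [], 0
    while i < len(morphemes):
        pair = tuple(morphemes[i:i + 2])
        if pair in morphographemic:
            output.extend(morphographemic[pair])
            i += 2
        else:
            output.append(morphemes[i])
            i += 1
    return output

print(realise(["the", "detective", "see", "ed", "the", "gangster"]))
# ['the', 'detective', 'saw', 'the', 'gangster']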
1. Can a word or string of words be replaced by another phrase of a
given type? If so, it is also a phrase of the given type.
Who hit you with the book? What did she do?
The woman in the red dress. Hit me with a book.
NP VP
But not:
The man next door and his wife are very nice.
He is very clever but rather inarticulate.
But not: *The man next door and rather inarticulate are nice.
6. Can the word or string of words be 'shared' by two clauses, linked
by 'and' or 'but'? If so, it is a phrase.
That wonderfully gifted professional footballer on whose every word the tabloid
press hang in wonderment missed a penalty last night.
I first saw you in the crumbling yet picturesque streets of old Salerno.
I first saw you there. [pro-PP]
Douglas thinks Gillian is very aggressive but I've never found her so.
[pro-AP]
Douglas won't go to the party but Gillian will [go to the party].
Gillian likes mixing her drinks, but Douglas doesn't [like...].
But not:
Tests such as those above help us to decide which 'strings of words' should go together to
be analysed. For example, the final test (8) suggests that the phrases go to the party and
mixing her drinks should be analysed as a unit (ie a VP), because they can be omitted
under certain conditions. The auxiliaries 'might' and 'doesn't' are analysed separately (ie
outside the VP) because they cannot be omitted. Thus a tree diagram for 'Gillian might
go to the party' could look like this:
S
NP M VP
N V PP
P NP
Det N
S → NP + M + VP
NP → (Det) + N
VP → V + PP
PP → P + NP
M → might
Det → the
N → Gillian, party
V → go
P → to
Note that for reasons of economy, the NP rewrite rule is only given once, with the
brackets signifying that the determiner is optional.
There is little space here to go into lexical constraints in any depth. For more detail, see
Brown and Miller (1980, Ch 7). In brief, one way of solving the ‘colourless green ideas’
problem is to assign two types of categorisation to each lexical item. The first, ‘inherent
subcategorisation’, refers to the kind of lexical item a particular word is – is it
‘inherently’ a noun, a verb, an adjective, etc? The ‘inherent subcategorisation’ of green,
for example, is as an adjective. The second categorisation, or ‘strict subcategorisation’,
constructs a frame which describes the linguistic environment in which a word can occur.
Effectively, rules are constructed in order to determine which words can collocate with
which. For example, green can collocate with concrete nouns such as armchair, but not
abstract nouns like ideas. Other categorisations that are relevant to nouns include
whether or not they are animate/inanimate, human/non-human, male/female or
common/proper. The lexical constraints on green therefore would look something like
this:
The first part of this description (Adj) is the inherent subcategorisation, telling us that
green is an adjective. The rest forms the strict subcategorisation, telling us what kind of
linguistic environments green occurs in. ‘Cop _____’ tells us that green follows
copulative verbs (ie those which are naturally followed by an AjP, such as be, become,
seem, etc). The final part, ‘NP (N[+concrete]) ____’ tells us that green can also occur in
NPs where the noun is concrete. The constraints therefore allow sentences and phrases
like:
Every word in the language would need a detailed inherent and strict subcategorisation in
order to ensure that the rules account for ‘normal’ language usage. Metaphorical usage is
more difficult to account for – poets such as Andrew Marvell tend to talk about things
like ‘a green thought in a green shade’, but in fairness to Chomsky, poetic language
allows ‘deviation’ from the normative rules.
The general rule for lexical insertion can be formulated thus (adapted from Brown and
Miller 1980):
For any terminal symbol of the PS rules:
(i) select from the lexicon a member of the class named by the terminal symbol in
question (ie select a noun for the symbol N , a verb for V, etc)
(ii) Attach this item as a daughter of the relevant symbol
(iii) The strict subcategorisation for the relevant item must not conflict with the
environment into which the item is to be inserted.
Like the strict subcategorisation description suggested above, the lexical insertion rule
can be elaborated upon, but the simplified version given above illustrates the general
principle by which words are selected to fill the slots generated by the PS rules.
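For readers who like to see the mechanism spelled out, here is a small Python sketch of lexical insertion with strict subcategorisation. The feature names and the tiny lexicon are my own assumptions, not Chomsky’s notation: green is admitted before a concrete noun but rejected before an abstract one.

# A sketch of lexical insertion with strict subcategorisation: 'green' may be
# inserted before a [+concrete] noun, so 'green armchair' passes while
# 'green ideas' is rejected. (Feature names and entries are illustrative.)

lexicon = {
    "green":    {"cat": "Adj", "needs_concrete_noun": True},
    "armchair": {"cat": "N", "concrete": True},
    "ideas":    {"cat": "N", "concrete": False},
}

def can_premodify(adjective, noun):
    """Check the strict subcategorisation of an adjective against a noun."""
    adj, n = lexicon[adjective], lexicon[noun]
    if adj["cat"] != "Adj" or n["cat"] != "N":
        return False
    if adj.get("needs_concrete_noun") and not n.get("concrete"):
        return False
    return True

print(can_premodify("green", "armchair"))  # True
print(can_premodify("green", "ideas"))     # False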
6.0 Summary
To sum up, then, part of the project of Transformational-Generative grammar is to go
further than Immediate Constituent Analysis: the constituents of a sentence are
discovered and categorised in such a way as to make possible the writing of formal
rewrite rules, which will not simply describe but also generate possible English
sentences.
a) The curious student looked up the word in the dictionary on the shelf.
b) The curious student looked up the kilt of the soldier on the ladder.
3. He might have been -- but he might not have been -- writing a letter.
He might have -- but he might not have -- been writing a letter.
He might -- or he might not -- have been writing a letter.
5. She says he might have been writing a letter and so he might have been.
She says he might have been writing a letter and so he might have.
She says he might have been writing a letter and so he might.
The different decisions made about the categories used to construct this sentence
obviously affect the writing of the PS rules. How do the rewrite rules for (a), (b) and (c)
differ?
NP VP
He VERB NP
AUX V D N
2. Chomsky, Logical Structure (1955/75)
NP PREDP
He AUX VP
M ASPECT V NP
NP AUX VP
He M ASPECT V NP
Other TG grammarians have come to other interpretations of the same sentence. See
Radford (1988: 163) for examples, and a discussion.
Chapter 7
Between Words and Phrases: X-Bar Theory
1.0 Introduction
In Chapter 6, we looked at the first stages of PS grammar: we considered the constituent
structure of sentences (categorised with reference to various structure tests) and we
started analysing sentences with a view to devising a set of rewrite rules that would
generate sentences of a similar type to the ones we were analysing.
In this chapter, before we move onto the 'transformational' part of TG grammar, we shall
look again at phrase structure, focusing less on complete sentences and more on phrases
themselves. In doing so, we shall consider some of the more subtle problems that arise if
you are trying to write a grammar that will generate language.
(the Queen (of Sheba))
NP: d + N + PP; PP: p + N (M = modifier, H = head)
Bear in mind that ICA usually breaks things down into twos: the Queen of Sheba is
broken down first of all into the and Queen of Sheba, then Queen of Sheba is further
broken down into Queen and the postmodifying PP of Sheba, which in turn is broken
down into preposition of and noun Sheba. As we saw in Chapter 5, the relationship
between the and Queen of Sheba, and of Sheba and Queen, is subordinative endocentric
(since both can be reduced to the head word: Queen), and the relationship between of and
Sheba is exocentric (because these words cannot be reduced to a single head word).
A reasonable question is ‘How do you know what the first step is?’ In other words, how
do you know to break the sequence into the + Queen of Sheba and not the Queen + of
Sheba (which a basic grammatical analysis might actually suggest)? Here we bring in
structure tests of the type we were using in the previous chapter.
First of all, let us test that 'the Queen of Sheba' as a whole is indeed a NP. We can
replace the phrase with other NPs like the Queen or the pronoun she, so it does indeed
seem to be a NP.
Secondly, the postmodifier of Sheba does seem to be a full PP. It can be used as a
sentence fragment by itself:
She was the Queen of Sheba and monarch of all she surveyed.
Here the seems to modify not just Queen of Sheba but also the later monarch of all she
surveyed. This suggests that the sequences Queen of Sheba and monarch of all she surveyed
are in some way self-contained units. Now, if we accept the argument that there is a
structural division between 'the' and 'Queen of Sheba', then we are posed with a problem
when drawing a tree diagram showing the phrase markers:
NP
Det ?
the N PP
Queen P N
of Sheba
The constituent marked '?' is obviously bigger than a word, but is it a phrase? This
problem is particularly taxing for the TG grammarian, because he or she wishes to
generate similar phrases using PS rules. So, let us try calling it a phrase (specifically, a
NP) and see what happens:
Det → the
N → Queen, Sheba
P → of
The problem with this is that it does not work. First of all we have two separate rules for
making NPs. We should only have one. (Remember that anything appearing on the left
side of the arrows should appear only once.) More specifically, the first rewrite rule does
not work if you try to generate phrases from it. If we accept that 'the' is a determiner and
'the Queen' is a NP, then D + NP actually gives us 'the the Queen'. In technical terms, the
rule is 'recursive' -- ie the same structure appears on both sides of the arrows, signifying
some kind of embedding. Sometimes recursion does operate in English phrase structure,
but not with the determiner! So, in this case, the '?' in the tree diagram must be
something else -- neither a word nor a phrase, but something in between. In TG
grammar, it's given the name N-bar, abbreviated to N', which gives us the rewrite rules:
This rule has the advantage both of working (in the sense that it generates similar
phrases) and of reflecting the relationship between constituents such as is suggested by
our structure tests.
a) a collector of butterflies
b) an actress with talent
Again, superficially, these NPs look like 'the Queen of Sheba', in that our first-year
analysis of them both would have shown a noun premodified by a determiner, and
postmodified by a PP. But structure tests suggest that there is something different about
them. For example, (a) can be paraphrased:
In (a) the postmodifying PP tells us what the collector collects; whereas in (b) the
postmodifying PP tells us extra information about the actress: she has talent. In (a) we
call the prepositional phrase the Complement; in (b) the Adjunct. (Note that these are
phrasal Complements and Adjuncts and must be distinguished from the kind of clause-
level Complements and Adjuncts that are usually found in basic grammatical
descriptions.) The phrase structures of a prepositional Complement and a prepositional
Adjunct are different, as can be seen in a noun phrase which has both: a collector of
butterflies with talent. The tree diagram for this is shown below.
The PS rules for NPs can now be made more sophisticated as:
You may have noticed that the first of these N’ rules seems to be illegal:
N’ → N’+ PP has the same symbol on both sides of the arrow and therefore invites
recursion. However, this is one area in English where recursion is possible: you can
indefinitely expand the number of adjunct PPs embedded inside the NP, as in:
a collector with talent, with charm, with a winning personality, with halitosis...
Having a rewrite rule which invites recursion is therefore justified in this special case.
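The recursive rule can be mimicked with a short Python function (my own illustrative encoding, not part of the formalism itself): each adjunct PP re-applies N' → N' + PP, so the phrase can grow without limit.

# A sketch of the recursive N' rule: N' -> N' + PP lets adjunct PPs be
# stacked indefinitely inside one NP. (Encoding is my own illustration.)

def n_bar(head, complement=None, adjuncts=()):
    """Build an N' string: head (+ complement PP), then any number of adjunct PPs."""
    phrase = head if complement is None else f"{head} {complement}"
    for pp in adjuncts:                 # each adjunct re-applies N' -> N' + PP
        phrase = f"{phrase} {pp}"
    return phrase

print("a " + n_bar("collector",
                   complement="of butterflies",
                   adjuncts=["with talent", "with charm", "with halitosis"]))
# a collector of butterflies with talent with charm with halitosis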
N"
Det N'
a N' PP (Adjunct)
P N
N PP (Complement)
with talent
collector P N
of butterflies
the phrase. This N' allows us to write rules in such a way that we do not end up with
unacceptable phrases or gibberish.
X"
(Specifier) X'
X (Complement)
As we saw earlier, in the NP 'the Queen of Sheba', the Determiner 'the' functions as
Specifier; and the PP 'of Sheba' functions as the Complement. How does this work in
other phrases? A few examples are given below (the relevant phrases are italicised).
Notice that all the phrases shown below are variations on the basic X’ structure given
above. In other words, there seems to be a fundamental structure common to all phrase
types in English, and represented by the diagram immediately above.
NP M V"
Gillian be V PP
living P N
in London
4.2 Adjective Phrases
Douglas was extremely worried about grammar.
A"
Spec A'
extremely A PP
worried P N
about grammar
P"
Spec P'
right P N
on target
In each of the above, the analysis and the construction of PS rules is facilitated by
positing a structural unit at the level of the X-bar. Again, individual phrases can
obviously be much more complicated than those we have looked at in this chapter, but the
point is that they can be analysed using exactly the same principles, and -- theoretically --
rules can be devised to generate all the complex phrases we need.
5.0 Review Activities
These activities are based on the work of the last two chapters, and focus particularly on
formal analyses of phrases.
a) N" → (Det) + N
b) N' → N' + PP
c) N' → N + (PP)
d) N' → (NP) + N
V"
ADVP V'
ADV V NP
this task.
V" → ADVP + V'
ADVP → ADV
V' → V + NP
NP → Det + N
ADV → completely
V → understand
Det → this
N → task
Note
The chapters here on transformational generative grammar and its developments merely
scratch the surface of a complex topic. For a fairly approachable and much more detailed
discussion, see Radford (1988), especially Chs 4 & 5, or another introduction to TG and
its successors. The more daring of you might try reading some unadulterated Chomsky,
but be warned -- it's challenging stuff!
Chapter 8
Movements:
Transformations, S-structure and D-structure
-- but the sentences were simply categorised as active and passive voice, and their
structures were described independently. However, in TG there is a crucial new
assumption: first, it is assumed that one sentence is not just related to the other -- one is
derived from the other, and we can write a T-rule which will describe the process of that
derivation. Secondly, and possibly even more importantly, TG grammarians argue that
both sentences are originally derived from a 'base structure', a kind of basic formula from
which surface structures of different kinds are generated. This set of basic formulae is
known variously as the 'base structure', the 'deep structure' -- or now simply as the 'D-
structure' of the language. What we recognise as sentences, derived from these 'D-
structure formulae', used to be called 'surface-structures' but now are more usually known
as 'S-structures'.
The main reason for devising transformational rules is, as mentioned in Chapter 6, the
desire for power and economy in our formal description of English grammar. Without
transformational rules, the TG grammarian would have to devise separate PS rules for,
say, both active and passive constructions. However, with a transformational rule the
grammarian can derive the passive structure from the active structure, and only one set of
PS rules is necessary. In general, the incorporation of transformational rules into the
grammar greatly increases the power and economy of the PS rules by reducing the need
for a proliferation of PS rules.
2.0 Transformational Rules
Transformational rules do three things: (1) they change the order of words in a sentence;
(2) they delete items in a sentence; (3) they add items to a sentence. Simple examples of
transformations in operation would be:
What would you need to do to devise a transformational rule for passivisation? Let us
assume for the moment that the active voice is primary:
The Topt Pass Rule (i.e. optional transformation: passive rule) here can be formulated as:
This transformational rule reorders the elements in the sentence, adds the elements 'be'
and 'by', and shows (by bracketing) that the PP phrase can be deleted.
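As a very rough Python sketch (the string representation is my own simplification, and no attempt is made to handle tense or agreement), an optional passive transformation can be written as a function that reorders the NPs, adds be and the participle marker, and optionally deletes the by-phrase.

# A sketch of an optional passive transformation over an abstract string
# NP1 + V + NP2: reorder the NPs, add 'be' and the participle marker 'en',
# and make the by-phrase optional. (Representation is my own simplification.)

def passivise(np1, verb, np2, keep_agent=True):
    """Derive a passive S-structure from an active NP1-V-NP2 string."""
    passive = [np2, "be", verb + "+en"]
    if keep_agent:                      # the PP (by + NP1) may be deleted
        passive += ["by", np1]
    return " ".join(passive)

print(passivise("the detective", "punch", "the gangster"))
# the gangster be punch+en by the detective
print(passivise("the detective", "punch", "the gangster", keep_agent=False))
# the gangster be punch+en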
So, we now have a grammar that gives a formal account of the way sentences are related.
But remember that this account assumes that there is a D-structure from which both S-
structures here, both the active and passive voice -- are derived. What does this D-
structure look like?
D-structures → MOVEMENT TRANSFORMATIONS → S-structures
Consider the two related sentences:
1. Douglas is weird.
2. Is Douglas weird?
Let's assume the second is derived from the first, 'Douglas is weird'. What could this be
derived from? Basically, we have a NP 'Douglas' and a VP 'be weird'. The VP is made
up of the V 'be' and the AjP 'weird'. The deep structure of this sentence might therefore
look something like:
Provisional D-Structure
NP VP
N V AjP
Aj
Douglas be weird.
What happens to this structure is that the verb 'be' is then marked for tense and number.
The abbreviation for this is I (=Inflexion). When this happens, a transformation (called
V-movement) takes 'be' out of the VP, so:
NP I VP
N V AjP
Aj
Douglas is ø weird.
What happens next is a bit trickier. In order to get to the further derived structure 'Is
Douglas weird?' we have to take the inflexion 'is' out of the sentence altogether. 'Is'
becomes what is known as a 'Complementiser' -- that is, a clause-introducing particle
which remains outside the structure of the clause as such, because it relates to the clause
as a whole. In order to bring the complementiser into our analysis, we have to postulate
an S-bar in the same way as we postulated an X-bar in Chapter 7:
S'
C S
NP I VP
I N V AjP
Aj
Is Douglas ø ø weird?
The inflexion has now moved from within S to a complementising position outside the
clause as such (cf Radford, 1988: 298-303). The two transformations can be summed up
in the following diagram (Radford 1988: 420):
D-structure → V-MOVEMENT → I-MOVEMENT → S-structure (e = empty)
The revised D-structure simply shows an empty (e) complementiser slot, ready to be
filled by the I-MOVEMENT transformation later on.
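Informally, and in an encoding of my own invention, the I-movement step can be sketched in Python: the clause is stored as labelled slots, and the inflection is copied into the complementiser position, leaving an empty (e) slot behind.

# A sketch of I-movement: the inflected auxiliary moves from I, inside S,
# to the complementiser position C, giving the interrogative S-structure.
# (The clause representation is my own illustrative assumption.)

clause = {"C": None, "NP": "Douglas", "I": "is", "VP": "weird"}

def i_movement(s):
    """Move the inflection into C, leaving an empty (e) slot behind."""
    moved = dict(s)
    moved["C"], moved["I"] = s["I"], "e"
    return moved

def linearise(s):
    # The empty I slot is simply not pronounced in this sketch.
    parts = [s["C"], s["NP"], s["VP"]] if s["C"] else [s["NP"], s["I"], s["VP"]]
    return " ".join(parts)

print(linearise(clause))              # Douglas is weird
print(linearise(i_movement(clause)))  # is Douglas weird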
First of all, thinking about the relationships between sentences, and thinking about
derivations and basic forms -- even in quite general terms -- can help us sort out
superficially similar structures into distinct categories. This can be illustrated by asking
the question, 'When is a passive not a passive?'
Consider the following sentences. Which sentences are in the passive voice? Which are
not? And which are ambiguous?
We can use structure tests based on derivational rules to argue that only (1) is a 'true'
passive. Only (1) could be derived from an underlying structure such as
In (2) and (3) 'unexpected' and 'unusual' simply act as adjectives: despite the surface
similarity, the sentences are derived from a different D-structure from (1). Other
structural tests confirm that (2) and (3) are grammatically different from (1) at a deeper
level. Consider the transformation known as NP-RAISING:
In the full sentence, the second of these clauses is in some way subordinated to the first.
In the final derivation, the 'there' is raised from the subordinate clause to the main clause.
This works for the verb 'expected' but not for 'unexpected' or 'unusual'.
One of the advantages of ‘unpacking’ deep structures from surface structures should now
be clear. By thinking of derivations, and by formalising the relationships between
sentences, we can systematically devise structure tests that will help us to sort out our
grammatical constituents into categories that might not be immediately obvious from S-
structures alone.
That is the first justification. The second was mentioned briefly in the opening section of
this chapter but is worth restating. By adding T-rules to our PS-rules we potentially make
our TG grammar much more economical and powerful. This is because we don't need as
many rules if we have T-rules -- we don't need, for example, one set of PS rules for active
sentences and another set of PS rules for passive sentences. Instead, we just have a set of
D-structures, one set of PS rules which will realise S-structures, and a set of T-rules
which will further transform our set of S-structures (if necessary) into final derived
structures.
4.1 Recent Developments in Chomskyan Linguistics: Minimalism
In 1995, Chomsky published a book called The Minimalist Program, which aimed to
refine further his thinking on the ways in which language works. The Minimalist Program is in
line with his earlier thinking insofar as it seeks economy by arguing for a grammatical
description that has ‘the minimal set of theoretical and descriptive apparatus necessary’
(Radford, 1997b: 265). This ‘minimal set’ still requires a formidable set of principles and
conditions, but in very rough terms, languages are seen as consisting of three elements (cf
Chomsky, 2000: 10):
IP
D I´
We I VP
don’t V IP
expect DP I´
D N I VP
ø students to V DP
enjoy D N
the course.
If you follow this tree diagram from the ‘bottom up’, you can see how the word ‘course’
merges with ‘the’ to form a ‘determiner phrase’, which then merges with ‘enjoy’ to form
a VP. This phrase then merges with ‘to’ to form an ‘Inflection-bar’, that is, not a
complete phrase. The complete Inflection Phrase ‘students to enjoy the course’ is
constructed by merging the I-bar with the Determiner Phrase ‘students’, which in turn is
constructed by merging ‘students’ with the zero-determiner. The IP is merged with
‘expect’ to form a VP, the VP is merged with the auxiliary verb to form another
incomplete phrase, or I-bar, and finally the I-bar is merged with ‘We’ to form the
maximum projection, the final IP. The whole construction is formed by merging
consecutive constituents until the maximum projection is formed. The resulting analysis
is again not entirely dissimilar to that which is found in Immediate Constituent analysis.
Movements in the Minimalist Program are constrained by the ‘Minimality condition’ that
requires that words or phrases be moved from one position in a structure to another by the
shortest possible steps. Thus ‘She may get arrested’ derives from a deep structure in
which ‘she’ is originally the complement of ‘arrested’ (compare the active ‘arrested her’).
The pronoun moves first into the position before ‘arrested’, then into the position before
‘get arrested’, and finally into the position before ‘may get arrested’, at which point the
nominative case of ‘she’ checks with the feature of ‘may’ that requires a nominative
head. The step-by-step movement of the pronoun from the complement of ‘arrested’ to
the head of ‘may’ satisfies the ‘Minimality condition’ imposed by the Minimalist
Program.
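The bottom-up process of Merge can be caricatured in a few lines of Python (the merge function and the tuple encoding are my own illustration; the labels follow the tree above).

# A sketch of bottom-up Merge: each step combines two constituents into a
# labelled pair, building the tree for 'We don't expect students to enjoy
# the course' from the bottom up. (Illustrative encoding only.)

def merge(label, left, right):
    """Combine two constituents into a new labelled constituent."""
    return (label, left, right)

dp_course   = merge("DP", "the", "course")
vp_enjoy    = merge("VP", "enjoy", dp_course)
i_bar_low   = merge("I'", "to", vp_enjoy)
dp_students = merge("DP", "ø", "students")        # zero determiner
ip_low      = merge("IP", dp_students, i_bar_low)
vp_expect   = merge("VP", "expect", ip_low)
i_bar_high  = merge("I'", "don't", vp_expect)
ip_full     = merge("IP", "We", i_bar_high)       # the maximum projection

print(ip_full)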
More can be found out about the Minimalist Program by reading Chomsky (1995), or the
rather more accessible Radford (1997a&b). Chomskyan linguistics is abstract and
difficult, and it embodies a more diverse set of approaches than is sometimes realised.
However, its goals remain reasonably consistent: to construct an abstract model of
grammar that accounts for the construction of acceptable English sentences using as few
phrase structure and movement rules as possible. The more economical the model, the
better the theory. In Chomskyan grammar less is definitely more.
TG grammarians are concerned with constructing a model which will generate all the
possible acceptable sentences in a language. They are not concerned to describe
language in use -- the language that is actually used, after all, only represents part of the
language that is possible. Chomsky differentiates between competence and performance:
competence is what any native speaker of a language intuitively knows about how
sentences are formed -- it is this knowledge that TG rules attempt to model. Performance
-- what people actually say and write -- only represents the rather distorted output of
these rules, and is of little value to the TG grammarian. The TG grammarian values
intuition -- our knowledge about whether or not sentence X or Y is acceptable. However,
language being language, and people being people, intuitions about acceptability vary.
And as we move into the age of computerised corpora, which can gather together and
search quantities of data undreamt of in the 1950s and 1960s, it is becoming evident that
we do not always use language in the ways that we think we do. In the past two decades,
performance rather than competence has come back to the fore (see further, Chapter 10).
Secondly, Chomsky, like the structuralist linguists who preceded him, downplayed the
role of semantics in his grammar. To some degree, semantics enters into the constraints
that TG grammarians have devised for the lexicon of English (see Chapter 6, Section 5.0)
but it is safe to argue that ideally a TG grammar would present a set of rules without
recourse to the slippery subject of meaning. That is well and good, but some of us are
interested in meaning, and in the way that grammar encodes meanings. Within the TG
tradition there have been attempts to shoehorn in a semantic component -- Case Grammar
is the best known of these attempts -- but even Charles Fillmore, the foremost Case
Grammarian, admits that attempts to formalise meaning in ways compatible with TG
have so far been unsatisfactory.
C = Complementiser; e = empty
S'
C S
NP I VP
e Douglas does V NP
eat meat.
What kind of transformations are involved in the following derived structures? Draw tree
diagrams of them.
Say why the active sentences below are ambiguous but their passive counterparts are not.
Chapter 9
The Acquisition of Grammatical Competence
UG theories also developed partly as an attempt to account for the similarities among the
world's languages. As such, it is one of a number of ‘universalist’ theories of language,
going back at least to the 17th Century. UG has evolved from transformational-generative
models of English, which accounts for a certain ‘ethnocentric’ interest in features like
word-order. Even so, UG takes its cue from the assumption that all normal humans are
born with a Language Acquisition Device (LAD) as part of their mental makeup.
Universal Grammar attempts to describe the characteristics of the LAD. Obviously, UG
relates to first language acquisition, but the theory has also been very influential in
theories of second language acquisition (SLA). We shall look in particular at some of the
consequences of the theory for teaching English as a Foreign Language later in this
chapter. (It should also be noted that there are other ways of accounting for the
acquisition of language, more in keeping with the functional models of grammar we
considered in Chapters 2-4 of this workbook. A different approach to first-language
acquisition can be found, for example, in M.A.K. Halliday (1975) Learning How to Mean.
London: Edward Arnold.)
Theorists continue to argue about the need to construct a Universal Grammar; however, it
is undeniably difficult to account for the sophistication of children’s linguistic acquisition
without arguing for some kind of ‘built in’ or instinctive knowledge about how grammars
work.
That the set of options is limited accounts for the similarity between languages: human
beings are designed to set their linguistic parameters in a finite number of ways. UG also
claims to account for the fact that children acquire their first language (L1) despite poor
feedback from the environment: a powerful in-built universal grammar constrains (i.e.
limits) the number of hypotheses that a learner will make. Otherwise, the number of
hypotheses which could be made is limitless.
2.1 Principles
We have already noted one very basic principle about language that children seem to
know ‘naturally’ – that word-order is important. Word-order is particularly important in
present-day English, in which it is often the only way of identifying Subject and Object,
but it is also important in those languages which have case-endings and therefore more
flexible word-orders. A more subtle principle in English involves the knowledge that
children acquire about when they can and cannot contract a spoken form like ‘want to’ to
‘wanna’ (cf White, 1989: 6-7). Consider the examples below. The asterisks (*) identify
those examples which are considered ‘unacceptable’:
Evidence suggests that ‘wanna-contraction’ is avoided in spoken sentences like (Dii) and
(Eii), while it is ‘allowed’ in (Aii-Cii). Universal Grammarians argue that there is a
principle at work here. (Dii) and (Eii) result from a transformation, or movement, of an
element from between ‘want’ and ‘to’. In recent TG grammatical theory, it is assumed
that elements which have been moved leave a ‘trace’ behind, shown in the sentences
below as ø. The element being moved is a pronoun, shown in the sentences below as
someone/who.
Notice that in sentences A-C there is no ‘trace’ of a moved element between ‘want’ and
‘to’, blocking the contraction to ‘wanna’.
Children seem to acquire knowledge of this principle (namely, that a trace element
between ‘want’and ‘to’ blocks their contraction to ‘wanna’), despite the facts that (a) the
trace element is an invisible and inaudible abstraction, and (b) they are exposed to
variable and unreliable amounts of input from adults on which to base their hypotheses.
Arguably, then, children are ‘primed’ instinctively to acquire such rules, thanks to innate
knowledge of a range of grammatical principles.
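The blocking effect of the trace can itself be expressed as a tiny rule, sketched here in Python (the use of ø to mark a trace in a word string, and the example sentences, are my own devices).

# A sketch of the trace constraint on wanna-contraction: 'want to' may
# contract only when no trace (ø) of a moved element intervenes.

def contract_wanna(words):
    """Contract 'want to' to 'wanna' unless a trace 'ø' sits between them."""
    output, i = [], 0
    while i < len(words):
        if words[i] == "want" and i + 1 < len(words) and words[i + 1] == "to":
            output.append("wanna")
            i += 2
        elif (words[i] == "want" and i + 2 < len(words)
              and words[i + 1] == "ø" and words[i + 2] == "to"):
            output.extend(["want", "to"])   # the trace blocks contraction
            i += 3
        else:
            output.append(words[i])
            i += 1
    return " ".join(output)

print(contract_wanna("I want to win".split()))              # I wanna win
print(contract_wanna("Who do you want ø to win".split()))   # Who do you want to win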
2.2 Parameters
As well as knowing principles, children seem to be born knowing about different options
available to different kinds of language. Two well-known parameters are the pro-drop
parameter, and the head-position parameter (see White, 1989, Ch 4).
The interesting thing about the Head-position parameter is that it is consistent. Once the
value of the parameter has been set, or in other words, after one of the two options has
been chosen, then the positions of head and complement are consistent throughout the
language. There are no known languages that vary the position of head and complement.
However, there are languages such as German in which the Head-position parameter does
not seem to operate in an entirely straightforward manner, and there is also considerable
debate about the transferability of this parameter to second language (L2) learners
(White, 1989: Ch 4).
1. Are the open parameters of the UG still available for the second language learner,
or do these learners have to ‘reset’ or ‘readjust’ the parameters which have been
given values? The answer to this question will have implications for contrastive
analysis and language transfer (i.e. the study of differences between languages
and of how these differences affect second language learning).
Formal grammar teaching went into a partial decline with the advent of task-based
learning, and this decline was fostered by some SLA theorists (for example, Stephen
Krashen) who popularised the view that formal grammar teaching was in fact an obstacle
to effective language acquisition. This led in some quarters to the active avoidance of
formal grammar instruction in the second language classroom. However, the so-called
learning-acquisition dichotomy is not as popular as it was in the early 1980s.
Grammar teachers today begin with the assumption that language learning is a natural
process which goes through certain stages: the foreign language learner (with his/her
knowledge of UG) is continually in the process of developing hypotheses about the target
language. The developing interim grammars which result from these hypotheses are
called interlanguages. The second language teacher's problem is how to promote the
quick and efficient development of these interlanguages so that they result in something
approximating target language competence. Most teachers nowadays further assume
that:
a) both conscious and subconscious learning are necessary if learners are to achieve
both accuracy and fluency in the L2.
Grammar teaching which stresses the process of learning, and the need to match
instruction to the learner's readiness to acquire certain items, has been labelled
consciousness-raising. The associated exercises are consequently known as ‘C-R
activities’. (See 6.0 below.)
It is step two which causes second language teachers grief: after having apparently
‘mastered’ a structure, students later regularly get it wrong. What is often happening is
that the learner is freeing up processing space to analyse the chunk of language for the
first time, and when this happens, errors tend to occur. However, in time students should
proceed to step three, although the phenomenon of ‘fossilisation’ (ie getting stuck at step
two) is not uncommon. ‘Fossilisation’ is one of the phenomena which distinguishes first
language from second language acquisition.
b) A ‘Natural Order’
It is argued -- but it is still controversial -- that languages are acquired in a roughly
predictable order. That is, you can say in general which items are likely to be acquired
early or late, although it would be rash to make these claims for any one particular
learner. The following areas might well cause problems for learners whose competence
is still quite low:
1. Beginners like the grammar and the meaning to be related in a fairly linear
fashion. There is therefore a reluctance to delete items which reinforce this
linearity: eg the final pronoun is often kept in sentences like:
*That's the boy who my mother hit him.
Most of these problems are experienced by learners from a wide range of language
backgrounds, in the early stages of their acquisition of English. The problems are not
necessarily specific to learners whose L1 is a particular language, although the ‘errors’
can correspond to correct usage in the learner's L1 (i.e. there are languages, such as
Arabic, in which there are constructions equivalent to *‘That’s the boy who my mother
hit him’). In such cases, errors can be said to be the result of negative transfer.
UG theorists are rather more sophisticated in their approach -- and consequently the
model is much more complicated. If all languages share certain features, eg an attention
to word order, then some transfer from L1 to L2 will be positive. However, it may be
that some parameters have to be ‘reset’ with new values in the L2.
One example of parameter resetting occurs, as we have seen, with word order. English
has a rigid SVO or SVC pattern; other languages might have greater flexibility,
employed, for example, to vary the focus of information. For example, a Spanish student
might use a VCS pattern (*’Was very interesting that movie’). An English speaker could
also have ‘that movie’ as the climax of the sentence, but only if he or she maintains the
SVC pattern by inserting a dummy Subject (‘It was very interesting, that movie.’)
Spanish students, then, come to a study of English with the inbuilt knowledge that word
order carries meaning (positive transfer of a principle). However, some of the parameters
of their own language have to be assigned new values, as is the case with word-order
flexibility. Where parameters have to be reset, we can expect negative transfer from the
L1.
1. Skeleton sentences
How many ways can these words be combined to make a sentence in English? In what
kind of contexts would these sentences be appropriate?
This activity raises to consciousness the relationship between discourse and grammar.
The various constructions (elicited and/or presented) will all be used to shift the focus of
attention around the clause, whilst retaining a basic SPOCA pattern. This and the
matching activity below encourage learners to pay attention to the constraints of
discourse upon syntax.
2. Matching sentences
a) Match the following sentences so that the answer follows the question fairly
naturally. (The sentences are a little artificial. More than one answer might
sometimes be possible, although some answers are likelier than others.)
g) What did Mary do with the knife?
h) Who sliced what?
8. It was with a knife that Mary sliced the carrots.
9. It was Mary who sliced the carrots.
10. What Mary sliced the carrots with was a knife.
3. Grammatical judgement
Semantic realisations can also cause problems for learners: not all languages allow
various case roles to occupy Subject position in the clause; and marked roles (eg
Instrument) will be later acquired than unmarked roles (eg Agent). The two exercises
below are designed to raise to consciousness the relationship between syntax and
semantics in English. Such exercises can easily be adapted for group-work or self-
correction, using data gathered from class error analysis.
For each gap in the passage below, decide which of the listed verbs can grammatically fill it:

(a) avoid / deny / deprive / forbid / keep / prevent / prohibit
(b) avoided / denied / deprived / forbidden / kept / prevented / prohibited
(c) avoid / deny / forbid / keep / prevent / prohibit
(d) avoid / deny / deprive / forbid / keep / prevent / prohibit
(e) avoid / deny / deprive / forbid / keep / prevent / prohibit

to ___ (a) it of its French-speaking identity, no-one can say that they are ___ (b) to speak
English. That is, in making French the official language of Quebec, the laws still do not
___ (c) anyone from speaking whatever language they choose. Some people speak French
and ___ (d) speaking English. In Canada, they don’t ___ (e) you your rights.
The latter of these two activities can be related to the lexical constraints on the use of
vocabulary items, briefly discussed in Chapter 6. The vocabulary options for each ‘slot’
in the sentence are similar in meaning, but they behave differently in different
grammatical contexts. For example, some can be followed by infinitive forms and others
cannot, while only some can be followed by the preposition from + Ving.
This kind of C-R activity effectively raises to consciousness the grammatical constraints
on the use of vocabulary items which are otherwise similar in meaning.
7.0 Summary
In this chapter we have again shifted our focus, in order to look at the impact of
grammatical theories, and in passing the impact of Universal Grammar, on the way we
understand how languages (and particularly English) are acquired. In brief, UG
developed out of the interest Chomskyan linguists had in modelling the psychological
processes which lead to the production of a grammar. UG, unlike the grammars
discussed earlier in this book, is not so much a grammar of English, as a description of
the initial state of instinctive knowledge from which English, and every other language,
develops. Every child is assumed to be born with an innate knowledge of basic linguistic
principles, and a set of parameters -- a finite set of options -- that are assigned values
through exposure to the mother tongue, and interaction with its speakers. UG seeks to
define what these principles and parameters might be.
Researchers into language acquisition are interested in the processes by which children
and adults learn their first and second languages. If the UG model is correct, then it
seems likely that most children will follow similar routes towards first language
acquisition, and that second language acquisition too will follow a ‘natural order’
determined by the application of universal principles and the setting and resetting of
parameters. Where the values assigned to parameters are similar in the L1 and L2, there
is likely to be positive transfer; where they are different, there is likely to be negative
transfer until the parameters are reset. While the acquisition of the grammar of one’s first
language is an instinctive, largely subconscious process, it is clear that the acquisition of
a second language by adults is aided by some degree of formal instruction, for example
through ‘consciousness-raising’ activities.
The Chomskyan tradition has focused much scholarly attention on the formal and
cognitive aspects of language behaviour. It has its critics – for example, in a book
entitled Educating Eve (1997), Geoffrey Sampson offers a spirited criticism of the very
foundations of post-Chomskyan linguistics. Another criticism of this tradition is that it
neglects to consider the social function of language. When language is seen primarily as
a set of formal operations which model the mental processes of the individual, there is
little scope for explanations which attempt to take into account the relationships of
individuals in social and cultural groups, or the relationships of individuals with their
world. Formal and functional grammarians’ linguistic descriptions might well be
incompatible in some respects simply because they are interested in fundamentally
different things and look to different criteria for evidence of their explanations. The next
chapter of this workbook returns us to issues relating to performance rather than
competence.
Chapter 10
Data-driven Grammars
How does Fillmore distinguish between the corpus and armchair grammarian? The
armchair grammarian, he says, sits by the fireside in a cosy armchair. He or she -- let us
say it is a he -- sits for long hours, a glassy expression in his eyes. Suddenly he sits up,
strikes his forehead, cries, ‘Gee, that's an interesting fact!’ and writes it down. It might be
a classification of determiners, a new rule for affix-hopping, or a systemic network for
ergativity. Let us say that it is the principle for ‘wanna-contraction’ discussed in Chapter
9, Section 2.1. The corpus grammarian – let us assume this one is a woman -- comes
along and looks at the fact, and comments, ‘Yes that is interesting -- but how do you
know it's true?’ The ‘wanna-contraction’ principle is a good example of the armchair
grammarian's reliance on intuition – who is to say that the ‘unacceptable’ forms are
indeed unacceptable?
The corpus grammarian relinquishes the armchair for the computing laboratory. She
scans into her computer memory thousands, perhaps millions, perhaps even a billion
words of running text, gathered from carefully constructed, representative samples of
speech and writing. From this vast array of data, using concordances, statistics and
sophisticated tagging and parsing programs, she comes up with a fact about language
whose truth she can demonstrate. Let us say that the fact concerns the statistical
frequency of the use of ‘wanna’ in a given spoken genre. She prints this out. The
armchair grammarian happens by, perhaps on his way to buy a new pair of slippers, and
comments, ‘Yes, that is true -- but is it interesting?’
The point of this story is to illustrate two extreme examples of the way grammarians,
past, present and probably future, operate. Until recently, people who tried to formulate
grammars of a language were for practical reasons limited to quite small collections of
data. Even the Survey of English Usage was confined to one million words – a
drop in the ocean when you think of the number of words an individual uses in a day, a
week or a month. Let us say, then, that you wanted to show how the demonstratives this,
that, these, those worked in English -- you generally looked at some examples, and you
used your intuitive knowledge of the language to fill in the blanks. You did not look at
how the demonstratives were used in a billion words of contemporary running text,
because it would take too long to read and tabulate. Furthermore, if you were a
Chomskyan linguist, looking at a billion -- or two or three hundred billion -- words would
not only have been impractical but a terrible waste of time. Your billion words would be
an example of performance and what you really want to produce is an account of
competence -- the deep structure rules that allow such sentences to be generated.
The armchair linguists have a point. Data in itself is not a theory -- and even if you have
an almost limitless corpus of text to search, then you still have to have some idea of what
you're looking for. And -- depending on the complexity of what you are looking for --
you must program the computer to find it. What kinds of things might you wish the
computer to find? Some of the possibilities include:
frequencies of words
frequencies of phrases
frequencies of word-types (n., v., av., aj., etc)
frequencies of phrase-types (NP, VP, AjP, etc)
frequencies of sentence types
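By way of a sketch of how the simpler of these counts might be produced, the following Python fragment uses the Natural Language Toolkit (NLTK) and its freely downloadable Brown corpus as a stand-in for a purpose-built corpus such as the BNC; the corpus choice and the simplified ‘universal’ tagset are assumptions made purely to keep the example self-contained and runnable.

import nltk
from collections import Counter
from nltk.corpus import brown

# One-off downloads of the sample corpus and the simplified tagset.
nltk.download('brown')
nltk.download('universal_tagset')

# Frequencies of words: a simple tally over the running text.
words = [w.lower() for w in brown.words()]
word_freq = Counter(words)
print(word_freq.most_common(10))

# Frequencies of word-types (n., v., aj., etc.), using the corpus's own tagging
# mapped onto a coarse 'universal' tagset (NOUN, VERB, ADJ, ...).
tagged = brown.tagged_words(tagset='universal')
type_freq = Counter(tag for _, tag in tagged)
print(type_freq.most_common())

Frequencies of phrase-types and sentence types would, as the discussion below notes, require the text to be parsed as well as tagged.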
This kind of information is obviously useful to grammarians. Let us say we want to find
out how the word ‘out’ is used, both as a preposition, and in phrasal verbs such as ‘find
out’, ‘come out’, ‘break out’ etc. We can look at the data in our corpus, break down the
instances into related groups, and compare the frequencies of occurrence. The
information obtained might be used in different applications, such as syllabus design in
teaching English as a foreign language.
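A first rough pass at this kind of question can also be sketched in a few lines of Python: here the instances of ‘out’ in the Brown corpus (again a stand-in for whatever corpus you are actually working with) are grouped by the word that precedes them, which helps to separate likely phrasal-verb uses such as ‘find out’ or ‘turned out’ from other uses. A proper classification would still need to be checked by hand.

from collections import Counter
from nltk.corpus import brown
# Assumes nltk.download('brown') has been run, as in the earlier sketch.

tokens = [w.lower() for w in brown.words()]

# Tally the word immediately preceding each occurrence of 'out'.
preceding = Counter(tokens[i - 1] for i, w in enumerate(tokens) if w == 'out' and i > 0)

# Frequent verb + 'out' pairs ('find out', 'turned out', 'carried out', ...) suggest
# phrasal-verb uses; other patterns point to prepositional or adverbial uses.
for prev, n in preceding.most_common(15):
    print(prev, 'out', n)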
That kind of information (the occurrences of a particular word) is now reasonably easy to
find -- ten minutes in the STELLA lab and you are well on your way. Difficulties begin
to arise once you wish to move into more abstract areas – e.g. frequencies of word,
phrase or sentence types. The computer can recognise the letter sequence ‘o-u-t’ but it
will not immediately recognise that it is a preposition, or a part of a verb, or even,
perhaps, in some contexts, an adverb.
Therefore, if you want to get information about the frequencies of word, phrase and
sentence types, then you will have to tag each instance of each word, phrase and sentence
-- probably manually, though some of the work can be done by crude automatic parsers.
(It is presently best to check their tagging, though!) Notice that if you are tagging items,
then you must already have a framework for grammatical description, and if you have a
framework, then you must already have a theory. The paradox (which is a normal
paradox in scientific experimentation) is that the theory predates the data analysis --
though the data analysis may then modify the theory. Some corpora have been tagged
using a generative framework (LOB has been tagged in this way) while others have been
tagged using a systemic-functional framework (e.g. the Polytechnic of Wales -- POW --
Corpus). Arguments can then be made about the frequency of certain surface
realisations. While corpus grammars claim to be data-driven, this does not mean that the
data precedes the theory. However, it is fair to say that the relationship between
theory and data has changed, since it is now much easier to check theoretical intuitions
with reference to a vast amount of hitherto inaccessible evidence. The status of data – of
performance – is much more significant in grammars that have been formulated with
reference to computer corpora.
Obviously, over the past two or three decades a lot of hard work has been done for us --
large corpora have been assembled on statistically-sound principles, some have been
tagged carefully, and they are in the process of being analysed. The armchair
grammarian's ideas are being checked and new questions are being asked. The rest of this
chapter will be spent looking at specific instances: (i) where armchair grammarians' ideas
have been tested, and (ii) where new types of question about grammar have been asked.
Halliday (1985) also notes that ‘of’ is a little strange – it is not strictly a preposition, he
says, because it is only found in phrases which act as post-modifiers in NPs -- except in
the single instance where ‘of’ means ‘about’ and marks a circumstance of matter, e.g. He
spoke of strange and terrible omens. This constraint on the use of ‘of’ leads Halliday to
class it as mainly a structure marker in a nominal group.
This observation seems borne out by Quirk et al.'s Comprehensive Grammar of the
English Language. ‘Of’ appears intermittently but is treated in greatest detail in a
subsection of ‘The Noun Phrase’ entitled ‘Postmodification by prepositional phrases’. In
this section the following meanings are the main ones suggested for Of-phrases:
Sinclair ran a search for ‘of’ in a corpus of machine-readable text, and found that, as
expected, by far the majority of instances (c.80%) occurred in noun phrases. The
remaining 20% occurred (a) in set phrases like ‘of course’, (b) after certain verbs (eg
reminded of), and (c) after certain adjectives (eg capable of). Circumstances of matter (of
mice and men, etc) are few and far between.
By far the majority of ‘of-phrases’ occur in noun phrases. Now, in basic grammar
courses, we analysed a noun phrase with an embedded PP like this:
(this kind (of problem))
Analysis (word class given first, grammatical function in brackets): the whole sequence is an NP; this = d (M); kind = N (H); (of problem) = PP (M); of = pr (x); problem = N (H).
This analysis seems plausible enough: ‘kind’ is the headword of the noun phrase as a
whole, and ‘of problem’ simply postmodifies that headword. Or does it? Sinclair
grouped some of his occurrences thus:
The problem here is that the headword of the PP seems more important than the first
headword, i.e. the headword of the NP in these phrases. We do not normally expect
embedded headwords to be more salient -- more important -- than the headword which
the embedded phrase modifies. We shall return to this problem shortly (2.5 below).
On the basis of the evidence gathered from his search of the corpus, Sinclair divided the
occurrences of ‘of’ in NPs as follows:
a lot of the houses
some of these characteristics
a number of logistic support ships
Group F (Support nouns: (1) nouns which are rarely used alone)
the notion of machine intelligence
the position of France
an object of embarrassment
various kinds of economic sanctions
many examples of local authorities
the context of a kitchen
the familiar type of the peppery conservative
Group H (Metaphors)
the juices of their imaginations
the grasp of the undertow
a twilight of reason and language
the treadmill of housework
Group I (Titles)
the Duchess of Bedford
the United States of Europe
the new President of Zaire
the garden of Allah
Group J (Nominalisations)
the British view of the late senator
widespread avoidance of the call-up
a wonderful sketch of her
the aim of the lateral thinker
reflection of light
the description of the lady
the growth of a single-celled creature
the teaching of infants
the expectation of a million dollars
the design of nuclear weapons
It will now be clear that the use of ‘of-phrases’ in nominal groups (or NPs) can be
clarified by collecting instances from a large-scale corpus, and classifying the instances
into groups. Sinclair has four broad categories: measures, focus nouns, support nouns
and double-headed nominal groups. Let us consider them in more detail.
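Although Sinclair’s own groupings were made by hand from concordance lines, a crude first harvest of candidate examples can be sketched in Python: the fragment below pulls NOUN + of (+ determiner) + NOUN sequences out of POS-tagged text, again using NLTK’s Brown corpus purely as a convenient stand-in for a larger corpus, and tallies the first noun in each sequence.

from collections import Counter
from nltk.corpus import brown
# Assumes nltk.download('brown') and nltk.download('universal_tagset') have been run.

# POS-tagged running text, restricted to one category to keep the loop quick.
tagged = list(brown.tagged_words(categories='news', tagset='universal'))

first_nouns = Counter()
for i in range(len(tagged) - 3):
    (w1, t1), (w2, _), (w3, t3), (w4, t4) = tagged[i:i + 4]
    if t1 == 'NOUN' and w2.lower() == 'of':
        # e.g. 'notion of machine intelligence' or 'position of the government'
        if t3 == 'NOUN' or (t3 == 'DET' and t4 == 'NOUN'):
            first_nouns[w1.lower()] += 1

# The most frequent first nouns are candidates for Sinclair-style measure,
# focus and support groupings ('lot', 'number', 'kind', 'sort', ...).
print(first_nouns.most_common(20))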
2.2 Groups C, D and E: Focus nouns
Focus nouns do not express measurement as such, but they focus in on part of the second
noun (Group C), or indeed some specialised part (Group D), or some component, aspect
or attribute of the second noun (Group E). Again, the area of differentiation among these
groups is sometimes blurred -- is, for example, ‘the whole hull’ a component or a part of
the boat? Sinclair says ‘component’ but his reason is not clear -- perhaps it is motivated
by the presence of the determiner ‘whole’. Perhaps it is unwise to get bogged down in
detail: the overall function of the first noun in all three groups (C, D, E) is to focus on
some part or aspect of the second noun. This can be done in a variety of ways, some of
which might overlap.
A less formal, more colloquial kind of ‘support’ is in vagueness indicators such as those
in Group G. Again some kind of discussion or analysis is going on, but this time
probably more in speech or in very informal writing.
The last main type of support (Sinclair also discusses more ‘marginal’ ones) is metaphor:
Group H. In these figurative phrases, some semantic feature of the second noun is made
vivid by the metaphor expressed by the first noun.
((this kind) of problem)
Analysis: the whole sequence is an NP; (this kind) = NP (M); this = d (M); kind = N (H); of = m (x); problem = N (H).
Note what has happened here. The primacy of the second headword (problem) has been
acknowledged, as has the modifying function of the initial NP. The preposition has been
reclassified as a ‘marker’ -- so far the only one of its kind in our grammar -- and its
function -- like an ordinary preposition -- is not to modify but to indicate a particular kind
of grammatical relationship. Its function is therefore marked with an ‘x’.
2.5 Double-headed nominal groups
You will have noticed that Groups I and J have not yet been mentioned. This is because
Sinclair argues that these NPs work in a different way. He argues for the primacy of the
second noun in Groups A-H, and this primacy has been shown in our analysis above. But
in Groups I and J, according to Sinclair, both nouns are equally necessary. The phrases
are simply double-headed. The cases where this is so are the easily-identified set of
titles, as in Group I (the Duchess of Bedford), and nominalisations, as in Group J (where
actions and states are expressed as nouns: eg avoidance of call-up).
The representation of this kind of structure would demand a new kind of phrase structure.
We could keep ‘of’ as a structural marker, here a kind of phrasal conjunction perhaps, but
we would need a new type of phrase that allowed the presence of two headwords.
Perhaps the best way to do this would be to show the structure of each separate NP linked
by ‘of’:

((the President) of (Zaire))
Analysis: the whole sequence is an NP; (the President) = NP; the = d (M); President = N (H); of = m (x); (Zaire) = NP; Zaire = N (H).

((the description) of (the lady))
Analysis: the whole sequence is an NP; (the description) = NP; the = d (M); description = N (H); of = m (x); (the lady) = NP; the = d (M); lady = N (H).
But, unfortunately, even this is not entirely clear-cut. Sinclair argues that if focus and
support nouns are modified, their lexical ‘weight’ might increase to the point where they
are better regarded as double-headed noun phrases, so we have a distinction between:
((a gasp) of shock)
Analysis: the whole sequence is an NP; (a gasp) = NP (M); a = d (M); gasp = N (H); of = m (x); shock = N (H).

((a little shrill gasp) of (shock))
Analysis: the whole sequence is an NP; (a little shrill gasp) = NP1; a = d (M); little = Aj (M); shrill = Aj (M); gasp = N (H); of = m (x); (shock) = NP2; shock = N (H).
In the former analysis, a gasp modifies shock, while in the latter, a little shrill gasp is
classified as an NP on an equal footing with shock. Without a theory, of course, you pays
your money and you takes your choice. Some people may not be convinced by the
double-headed NP and prefer the old-fashioned postmodifying PP (of shock) as a solution
to titles and nominalisations, possibly because they could argue that the first NP is more
important than the second in these phrases. But the idea of focus and support nouns as
premodifiers nevertheless has some attraction.
It is worth noting in passing that double-headed nominal groups might solve other
problems in grammatical analysis. In other grammar courses, you might have
encountered the concept of ‘nouns in apposition’, that is, phrases like ‘Robert, that
rugged individual’ or ‘that rugged individual, Robert’. For the sake of convenience, we
treated such expressions as embedded noun phrases, in which the second phrase always
post-modified the first:
(Robert (that rugged individual))
Analysis: the whole sequence is an NP; Robert = N (H); (that rugged individual) = NP (M); that = d (M); rugged = Aj (M); individual = N (H).

(that rugged individual (Robert))
Analysis: the whole sequence is an NP; that = d (M); rugged = Aj (M); individual = N (H); (Robert) = NP (M); Robert = N (H).
A double-headed nominal phrase would allow us to treat the two constituents as equal in
value:
((Robert) (that rugged individual))
Analysis: the whole sequence is an NP; (Robert) = NP1; Robert = N (H); (that rugged individual) = NP2; that = d (M); rugged = Aj (M); individual = N (H).

((that rugged individual) (Robert))
Analysis: the whole sequence is an NP; (that rugged individual) = NP1; that = d (M); rugged = Aj (M); individual = N (H); (Robert) = NP2; Robert = N (H).
Here we have no structural marker, like ‘of’, to link the two phrases explicitly, but our
analysis still suggests that the two phrases within each NP are equivalent, rather than that
they exist in a Head-Modifier relationship. As before, much depends on what you
understand the relationship between the two phrases to be.
The general point is nevertheless that, with concordances, we can break free of a
dependence only on intuition, and supplement our intuitions with evidence from a large
amount of text, quite quickly and quite easily. We can check and test our grammatical
observations. But we can also ask new kinds of question, as new grammars, particularly
of spoken English, are beginning to illustrate. Fuller descriptions of the grammar of
spoken language are now available in grammar books such as Biber et al (1999) and
Carter and McCarthy (2006).
BYU-BNC: http://corpus.byu.edu/bnc/
SCOTS: http://www.scottishcorpus.ac.uk
The British National Corpus data can easily be restricted to particular registers, which
include ‘spoken’, or, defining more narrowly, ‘courtroom speech’, ‘interview’, ‘sermon’,
‘conversation’, and so on, by selecting from a pull-down menu on the BYU-BNC page,
and the SCOTS data can be restricted to the spoken documents only by selecting ‘spoken’
in the Standard or Advanced Search options. By limiting our searches to spoken
documents, we can begin to explore aspects of the grammar of speech. Here we focus on
one common feature of speech to which the availability of corpora has drawn our
attention, namely delexicalised verbs.
If you simply take cuttings from an apple tree they will grow vigorously…
Here the subject of the first clause is you, the object is cuttings, and the verbal process is
take, which has its basic dictionary meaning of moving something or someone from one
place to another. However, if we search for the sequence take a in the spoken section of
the SCOTS corpus, we find other possible uses of take:
I take a drink
Do they take a big jump at the top
Before you had to take a breath
maybe you can just also take a look at this one
ye just take a nap or a kip yeah
just take a wee sippie at a time
In these examples, take has lost its meaning of moving something from one place to
another – in other words it has become delexicalised. What seems to be happening here is
that the delexicalised verb substitutes for a verb that has been turned into a noun and put
in the object position in the clause (drink, jump, breathe, look, nap/kip, sip). The reason
for this is possibly that the action that would have been expressed as a verb can more
easily be modified when it has been turned into a noun (big jump, wee sippie).
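The search itself is easy to sketch. The Python fragment below scans a handful of utterances (the SCOTS examples quoted above, hard-coded here because the corpus itself has to be queried through its own website) for ‘take a/an’ followed by whatever comes next, which is enough to surface candidate delexicalised uses for closer inspection.

import re
from collections import Counter

# Stand-in data: the spoken examples quoted above.
utterances = [
    "I take a drink",
    "Do they take a big jump at the top",
    "Before you had to take a breath",
    "maybe you can just also take a look at this one",
    "ye just take a nap or a kip yeah",
    "just take a wee sippie at a time",
]

# 'take a' or 'take an', capturing the next one or two words.
pattern = re.compile(r"\btake an? (\w+(?: \w+)?)", re.IGNORECASE)

tally = Counter(m.group(1).lower() for u in utterances for m in pattern.finditer(u))
print(tally.most_common())   # 'drink', 'big jump', 'breath', 'look at', 'nap or', 'wee sippie'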
Delexicalised verbs are also a relatively common feature of written English but they seem
particularly useful in spoken language, where there is perhaps greater emphasis on
evaluating events and actions. Other common delexicalised verbs are have and give, as in
the following examples, also from the SCOTS data:
4.0 Colligation
The availability of language corpora has allowed linguists to turn their attention more
fully to colligation, that is, the grammatical relationships that words and phrases form.
Hoey (2005, p.43) defines colligation as:
1. the grammatical company a word or word sequence keeps (or avoids keeping)
either within its own group or at a higher rank
2. the grammatical functions preferred or avoided by the group in which the word
or word sequence participates
3. the place in a sequence that a word or word sequence prefers (or avoids)
In other words, to explore the colligation of a word or phrase, we would consider the
following questions:
• does the word or phrase typically appear as part of the subject, predicator, object,
complement or adverbial in a clause?
• does the word or phrase typically function as the head or modifier in a phrase?
In this fashion, we build up a profile of the grammatical behaviour of the word or phrase
in question. To explore colligation, let us consider a fairly rare lexical item, eco-friendly.
In the 100 million words of the BNC, eco-friendly occurs only 15 times, in the following
contexts:
1. It’s more eco-friendly, as (a) the plants are a replaceable resource, and (b)
burning ethanol distilled from them doesn’t add to atmospheric CO2.
2. …I would like to be allowed to put my faith in wine merchants such as the
Kendricks or Simon Loftus of Adnams when they tell me which of their wines are
eco-friendly.
3. The play is a musical about eco-friendly aliens whose mission is to save our
planet.
4. Muji’s own make of eco-friendly transport follows sturdy, basic designs…
5. Eco-friendly collectives such as Catweasle Press, Conscious Earthwear and No
Lo Go (a label and an Oxfam shop in London’s Marylebone High Street) are
embracing unbleached cotton, old bedspreads and jumble sale clothes.
6. And it may be a comforting thought to some that an Australian company is
experimenting with eco-friendly coffins made of newspapers, which are cheap and
biodegradable.
7. …that salted peanuts are a killer for birds; that eco-friendly insecticides are a
contradiction in terms;
8. In Japan and traditionally eco-friendly European countries such as Switzerland
and Denmark, it has never been popular.
9. …a wing of guest rooms in every hotel converted to an eco-friendly
environment, to be monitored over two years to see how energy consumption
compares with standard rooms.
10. Will my hon. friend look at the work being done in Austria and France to make
an eco-friendly diesel fuel from oilseed rape and other oil crops?
11. Enter Goldfinger, the eco-friendly banana.
12. The initiative, based on ideas introduced by the Inter-Continental group,
focuses on areas such as energy-saving heating, recycling waste and buying eco-
friendly products.
13. Do you want to know how easy it is to affect the environment of the world by
planting trees or buying eco-friendly products?
14. Eco-friendly power plant planned for capital’s centre…
15. They were impressed by the eco-friendly solvent spinning operation, which
starts with harvested woodpulp and uses chemicals which can be totally recycled.
On the basis of the 15 examples from the data provided by the BNC, then, we can make
the following tentative suggestions about the colligation of eco-friendly.
Eco-friendly modifies nouns. More specifically, it modifies nouns expressing human or
human-like beings and institutions (aliens, collectives, European countries), products
(ethanol, wines, (make of) transport, coffins, insecticides, diesel fuel, banana, products
(x2)), industrial plant or processes (power plant, solvent spinning operation), and
ambience (environment). The most common type of headword is product. Eco-friendly
can in turn be modified by the intensifying adverb more, indicating that it is a quality.
Does the word or phrase typically appear as part of the subject, predicator, object,
complement or adverbial in a clause?
If, for the sake of argument, we look mainly at the function that the phrase containing
eco-friendly performs in its clause or subordinate clause, then we find the following
results:
There is a fairly even distribution of phrases amongst the clause functions. In the subject
position, people and things that are described as eco-friendly engage in actions (embrace,
enter, follow), are described (are a contradiction in terms) and are subject to action in
passive constructions (are planned). They also participate as objects in other clauses, in
which they are made and bought (x2). Alternatively, things such as ethanol and wine are
described as eco-friendly, and those things and people that are eco-friendly are present in
different kinds of Adverbial (about eco-friendly aliens, with eco-friendly coffins, in…eco-
friendly countries, to an eco-friendly environment, by the eco-friendly solvent spinning
operation).
Does the word or phrase typically function as the head or modifier in a phrase?
In the overwhelming majority of instances (13 of the 15), eco-friendly is a modifier,
preceding a noun. In two instances it is the head of its own phrase, and once it is modified
by more.
Task: Colligation
Eco-friendly is of course a fairly straightforward word, which yields sufficient examples
to provide a quick and fairly rough analysis. Using the BNC data, you might wish to
attempt a colligational profile of another word – a more frequent and variable one, like
baby.
1. Go to the BYU-BNC at http://corpus.byu.edu/bnc/.
2. Search for the sequences ‘baby [n*]’ and ‘baby [aj*]’. Your results will give you
insights into the use of the word in phrases like baby boom, and baby fresh.
3. Then search for ‘[n*] baby’ and ‘[aj*] baby’. Your results will show you instances of
baby as a headword.
4. Since there are over 8000 instances of baby in the BNC, take a sample of perhaps 100
instances and track the use of phrases with baby as subject, predicator, object,
complement and adverbial. One question that such an analysis would answer is how
much agency babies tend to be given in Anglophone culture – do they tend to be the
subject or the object of active clauses? A sketch of how the pattern searches in steps 2
and 3 might be automated follows below.
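Since the BYU-BNC interface does the pattern-matching for you, the fragment below instead runs the equivalent searches over NLTK’s POS-tagged Brown corpus; that substitution is simply an assumption made to keep the example self-contained, so the counts it produces are illustrative rather than BNC figures.

import nltk
from collections import Counter
from nltk.corpus import brown
# Assumes nltk.download('brown') and nltk.download('universal_tagset') have been run.

tagged = list(brown.tagged_words(tagset='universal'))

modifier_use = Counter()   # baby + NOUN: baby as modifier ('baby boom')
head_use = Counter()       # ADJ + baby: baby as head of its phrase ('tiny baby')

for (w1, t1), (w2, t2) in nltk.bigrams(tagged):
    if w1.lower() == 'baby' and t2 == 'NOUN':
        modifier_use[w2.lower()] += 1
    if t1 == 'ADJ' and w2.lower() == 'baby':
        head_use[w1.lower()] += 1

print('baby as modifier:', modifier_use.most_common(10))
print('baby as head:', head_use.most_common(10))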
Grammarians talk about different verb systems when they attempt to relate the different
forms of verb phrases to their meanings. Verb systems include tense and aspect (whereby
the verb form usually changes to express meanings related to time and duration). Other
verb systems include mood (the distinction between statements, questions and
commands), modality (the use of modal auxiliaries to express concepts like possibility
and obligation, e.g. might work, should work), voice (the distinction between active and
passive uses, e.g. he has remodelled the house, the house has been remodelled), and
finiteness (the capacity of the verb phrase to signal tense, as in is/was working, or not to
signal tense, as in working). Here we will touch briefly on the use of corpora to explore
two features of verb systems, namely aspect and voice.
One of the features often taught to learners of English as a foreign language is that certain
types of verb, namely verbs of perception and affect, like see and love, tend to be
expressed using the simple aspect, even though the actions the verbs refer to have
duration and may be happening at the moment of utterance. That is, learners are taught
that I see is preferable to I’m seeing, and that I love you is preferable to I’m loving you.
A corpus can help us to investigate exactly how these verbs behave with respect to tense
and aspect. For example, we can run a search for ‘see*’ in the BNC, restricting the search
to spoken data. First of all we can note that the instances of see far outnumber the
instances of seeing. Many of the instances of see can be accounted for by the common
discourse marker, I see. Even so, it is revealing to compare the uses of see/seeing in oral
presentations, e.g.:
We can observe that there is an option in English to choose either the simple or
continuous aspect in this kind of context – but there is a subtle change in meaning. The
first group of utterances treat see as an uncontested fact – something has presented itself
to our sight or our understanding. In the second set of utterances, the emphasis is on
seeing as a process of perception or understanding – the process is what is at stake in
these utterances, and it might be more easily contested than in the first group of
utterances.
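The raw comparison is easy enough to sketch: the fragment below counts simple forms of see against progressive be + seeing in NLTK’s Brown corpus (standing in, as before, for the spoken section of the BNC), allowing one intervening word so that strings like ‘is now seeing’ are caught. Interpreting the difference in meaning, of course, still requires reading the concordance lines themselves.

import re
from nltk.corpus import brown
# Assumes nltk.download('brown') has been run.

text = ' '.join(w.lower() for w in brown.words())

# Simple (non-progressive) forms of SEE.
simple = len(re.findall(r"\b(?:see|sees|saw)\b", text))

# Progressive BE + seeing, allowing one optional intervening word ('is now seeing').
progressive = len(re.findall(r"\b(?:am|is|are|was|were)\b(?: \w+)? seeing\b", text))

print('simple forms of SEE:', simple)
print('progressive BE + seeing:', progressive)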
5.2 Formal and informal passive constructions
O’Keeffe, McCarthy and Carter (2007: 106-114) analyse and discuss the meanings of be-
passives and the less formal get-passive, as in
He was arrested.
He got arrested.
They conclude that the get-passive is used more in informal contexts when ‘speakers are
marking attitude, most probably that attitude denoting concern, problematicity in some
way, or, at the very least, noteworthiness of the event as judged by the speaker, beyond
its simple fact of occurring’ (ibid, pp.113-14; emphasis in original).
Their observations can be tested by running a search on the spoken data in BNC for was
*ed and got *ed and comparing the ‘neutrality’ or otherwise of the speaker’s stance in
the results. Some of the results might support the suggestions made by O’Keeffe,
McCarthy and Carter; in other cases the stance of the speaker using the informal get-
passive is more difficult to gauge. Compare the following examples:
Er well there was none of them got married during the time that I was there.
Er my biggest downfall was that the guy that employed me who was the eldest
brother of the two that owned the company got killed in a bloody erm riding
accident…
And eleven of them got involved in a fist fight in the middle of one of those New
York streets.
Arguably, in the view of O’Keeffe, McCarthy and Carter, the use of the get-passive by
the second group of speakers problematises the actions of being married, killed or
involved more explicitly than does the use of the be-passive in the first, although in some
instances the use of the get-passive simply signals that the event is ‘noteworthy or of
some significance to the speaker’ (ibid, p.111). An alternative theory is that, more
explicitly than the be-passive, the get-passive assigns responsibility for the action to those
affected by it. Thus if the speaker says I was involved in the 1926 strike, the speaker’s
agency is not explicitly expressed; he or she might have been involved by accident. But if
the speaker says I got involved in the 1926 strike, then his or her agency, or carelessness,
is more explicitly expressed. If those affected by an action bear some of the responsibility
for it, and the speaker chooses to express this, then such situations are probably also those
in which the action is problematised.
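The search that O’Keeffe, McCarthy and Carter describe can be approximated with a simple regular expression, as in the sketch below, which looks for was or got followed by a word ending in -ed. The sentences it is run over here are just examples quoted earlier in this section; in practice you would point it at whatever transcribed data you have to hand, and it will inevitably catch some strings that are not passives at all, which need weeding out by hand.

import re

sentences = [
    "Er well there was none of them got married during the time that I was there.",
    "And eleven of them got involved in a fist fight in the middle of one of those New York streets.",
    "He was arrested.",   # the be-passive example discussed earlier in this section
]

# was/got followed by a word ending in -ed: candidate be- and get-passives.
pattern = re.compile(r"\b(was|got) (\w+ed)\b")

for s in sentences:
    for aux, verb in pattern.findall(s):
        print(aux, verb, '<-', s)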
Before the development of computerised corpora, linguists had to rely for their
observations on more limited language data, manually
collected and analysed, or alternatively, they had to rely on intuition, their reflections on
their own knowledge of language and their feelings about what is acceptable and
unacceptable, and what particular constructions mean. The view that grammarians should
rely on intuition was strengthened, from the 1950s on, by the prominent linguist Noam
Chomsky’s distinction between competence, an individual’s knowledge about language,
and performance, the spoken and written language that an individual actually produces
(see Chomsky 1965). Chomsky made the description of competence, or knowledge of
grammar, the goal of linguistic scholarship, and played down the value of performance.
For armchair grammarians following in Chomsky’s footsteps, intuition is the key to
eliciting generalisations about language structure and to formulating rules that show the
relationship between one structure and another. In generalising about the structures of
language and the relationships between these structures, they attempt to model
knowledge about grammar. Performance, as represented by corpora, plays little or no part
in this project. Corpus grammarians, therefore, have had to engage in restating the value
of analysing performance. They claim that the study of language data on a large scale
brings to light structures and behaviour that are not available to intuition alone. At their
most extreme, corpus linguists argue that their models of grammar are ‘data-driven’, that
they emerge from a study of the language behaviour of thousands of people. Corpus
linguists must show that data-driven analysis leads to genuinely innovative insights into
and models of grammatical behaviour, as in Hunston and Francis (1999).
Despite the relative novelty of corpora, the proven insights that corpus data have given us
into the behaviour of words and phrases now make it difficult for any grammarian to
dispense with the immensely powerful tools that corpora represent for the study of
language. Performance is back on the linguistic agenda. However, it is an indisputable
fact that data does not automatically give rise to theories that explain it; we still use our
intuition to search corpora for features that we think might be interesting: we construct
hypotheses based on our intuition or a partial analysis of the data, and we test those
hypotheses against further data. There is therefore a continuous interaction between our
intuitions and our data-based analyses. For example, people brought up in Scotland might
feel intuitively that the distribution and meanings of modal auxiliary verbs in the Scottish
speech community vary from those that are current south of the border. They might feel
on the basis of their intuitions about their own and their fellow Scots’ practice that certain
modals were avoided, others used, and yet others had meanings particular to the Scottish
community. They could then form a hypothesis based on their intuitions, test them
against corpus data and refine them in the light of their findings.
Data driven grammars based on corpus data, then, are powerful tools for the description
of the behaviour of a speech community – whether that community is conceived of as
being determined by geography, class, gender, profession or other criteria. But some
grammarians remain interested in accounting for grammar by appealing to the mind of
the user, not the collected output of a given community of users. And some grammarians
wish to do this without necessarily appealing to the formal mechanisms of generative
syntax. This desire has given rise to the relatively new field of cognitive grammar.
Chapter 11
Cognitive Grammar
Not all grammarians, as we have also seen, share this primary interest in
psycholinguistics, and not all of the grammarians who are interested in the nature of the
mind share the formal linguists’ fascination with the formulation of generative rules. In
the 1980s, also largely in America, an alternative school of cognitive linguistics began to
form, influenced by the work of Ron Langacker, who was, in turn, interested in research
into linguistic topics such as cognitive metaphor, by scholars such as George Lakoff and
Mark Johnson (e.g. Lakoff, 1996). Lakoff and Johnson, to simplify their work
considerably, popularised a shift in the study of metaphor from an analysis of linguistic
texts to focus instead on the mental processes that arguably produce metaphor, which
they see as a mapping of one conceptual domain (e.g. A JOURNEY) onto another (e.g.
LIFE) which would account for our ability to comprehend sentences such as ‘I have
reached a milestone in my life.’ Lakoff and Johnson believe that such metaphors arise
from embodied experience and perceptions of the world; while cultural differences in
language and metaphor certainly exist, there seems to be a universal tendency to associate
UP with happiness (e.g. ‘I’m on a high’) and DOWN with sadness (e.g. ‘I’m feeling
low’), a tendency that probably has less to do with an innate language organ, and more to
do with bodily sensations.
Langacker’s contribution has been to take the kinds of insight Lakoff and Johnson
brought to Semantics and apply them to grammar. Unlike grammarians such as the
generativists, who are interested in formalising cognitive operations, this school of
cognitive linguists does privilege meaning in its grammatical descriptions and accounts.
To this extent, some of their interpretations might remind you of SF accounts, though the
connection is seldom made (in his introduction to cognitive linguistics, for example,
David Lee makes no direct reference to work by Halliday, though a few other
systemicists, such as Günther Kress, are cited, and their work appears in his bibliography).
This chapter of the workbook focuses on several key concepts in cognitive grammar, and
illustrates them with a few features of grammar that might cause you problems:
prepositions, phrasal verbs and raising constructions. The accounts in this chapter are
largely drawn from Lee (2001). The concepts are construal, perspective, foregrounding,
metaphor, frame and extension.
2.0 Construal
A fundamental assumption of cognitive linguistics (which it shares with SFG) is that
states and events in the natural world (and, by extension, the world of the imagination)
are ‘encoded’ into the system of language. There is no single way of doing this, and so a
state or event can be conceptualised or ‘construed’ in one of a number of ways. Different
construals may give rise to alternative structures for the same state or event or disallow
apparently similar structures. Consider the following sentences:
Here, traditionally, we have two ways of representing the ‘indirect object’ (me/to me).
We have two ways of encoding what is the same situation, and a generative grammarian
would simply account for the difference with movement rules. However, there are other
situations in which this kind of equivalence seems less ‘natural’, e.g.
While the first sentence sounds natural enough, the second one doesn’t. This suggests
that while the structures look similar, they represent different conceptualisations, or – to
put it in the terms of cognitive grammar – the relationships between the actors and the
processes are construed differently in each of the four sentences. Clearly the meaning of
‘show’ and its relationship to the direct object is different in ‘show the results’ and ‘show
a good time’, and our conceptualisation of these processes and relationships makes the
realisation of the indirect object as a prepositional phrase (to me/to you) more or less
natural.
One task of the cognitive grammarian, then, is to explain why we can say one thing but
not another in terms of how we construe the events that we are encoding in language.
3.0 Perspective
The perspective of the person producing the utterance is a factor in the different ways of
construing an event or state. Points of reference are either implicit or explicit in an
utterance. They are explicit in some pairs of sentences about the same event, e.g.
Obviously, the speaker’s perspective on the event changes, even if the act of travelling
home is the same – in the first sentence above, the speaker construes the act from the
perspective of someone who is also ‘at home’ (even if that someone is not actually the
speaker) and towards whom, therefore, ‘Alicia’ is ‘coming’. In the second sentence, the
sentence is construed from the perspective of someone who is not ‘at home’ and towards
whom ‘Alicia’ is not, therefore, ‘going’ when she travels ‘home’.
As David Lee (2001: 3) points out in his discussion of perspective, the construal of the
same event from different points of view can have an impact on the meanings of identical
phrases. What does ‘a good price’ mean in the following two sentences – is it a high price
or a low price? How does your understanding of ‘a good price’ in each sentence indicate
the orientation you are taking to the utterance – in other words, your perspective on the
event?
Perspective often entails points of reference and movement, construed from different
points of view. In the first pair of sentences above, ‘Alicia’ is moving towards ‘home’,
the point of reference. In the second pair of sentences, the car is moving from Maria to
Lucas, each of whom can be a point of reference. In cognitive grammar, we refer to the
points of reference as the ‘landmark’ and whatever is moving is called the ‘trajector’. The
speaker of the first two sentences construes the movement of the trajector (‘Alicia’)
either from the perspective of the landmark (‘home’) or not. The speaker of the second
pair of sentences construes the movement of the trajector (‘the car’) from the
perspective of either Maria or Lucas (the potential landmarks) and interprets ‘for a good price’
accordingly (‘good for Maria’, or ‘good for Lucas’).
The sun’s going down. Or the earth’s coming up, as the fashionable theory has it. (Small
pause.) Not that it makes any difference.
4.0 Foregrounding
When an individual construes an event, he or she often has the choice of selecting one
particular component as being relatively more prominent. We know from SF grammar
that we can manipulate prominence, or salience, by selecting a particular participant as
Subject and by moving its position in the sentence (eg from Rheme to Theme). And so
we have alternative ways of encoding a particular event in language:
I broke your car window with my golf ball.
A golf ball has broken your car window.
Your car window has been broken.
While SF grammar focuses on the linguistic resources for manipulating salience (eg the
ideational and textual functions of language), cognitive grammar is more concerned with
the mental workings of the individual who produces the sentences: perspective and
salience seem to be rooted in visual perception. Foregrounding, then, goes beyond
indicating responsibility for an event through shifting the selection of Subject and
ordering the sequence of sentence constituents. You can foreground a perception simply
by changing a preposition:
The first of the above two sentences foregrounds the perception of ‘the road’ as a
container, while the second foregrounds the perception as a surface. What is
foregrounded in a sentence can depend on the relationship of the verb to the participants,
e.g.
Here the relationship between verb and participants indicates that in the first sentence the
speaker foregrounds ‘Scotland’ as a geographical space, while in the second, the
construal of ‘Scotland’ that is foregrounded is that of the collective of the voting
population.
Foregrounding has a long history in literary and critical linguistics, where textual patterns
are assessed in discussions about salience in the interpretation of a given text. In Prague
School linguistics, for example, foregrounding is known as aktualizace. The interest of
the cognitive grammarian, however, is less in textual patterns and more in what part of
the individual’s knowledge base is activated by a word, phrase or larger structure. If a
man says to his wife The dog’s been bitten by a snake then the phrase the dog presumably
activates both her knowledge about dogs as a category, and their own dog in particular –
her knowledge about its visual appearance and behaviour, for example. Her knowledge
about the dog will be greater than her knowledge or concern with the snake (compare the
husband’s possible sentence A snake’s bitten our dog! which, by selecting the snake as
Subject and shifting it to thematic position foregrounds the snake).
5.0 Frame
The concern with the knowledge that speakers and listeners bring to interactions extends
to what in cognitive linguistics are known as ‘frames’. ‘Frames’ refer to an individual’s
knowledge of a situation and how the elements that make up that situation function
within it. For example, which frames are triggered by the word goal in the following
sentences?
We make sense of these three sentences in relation to our knowledge of the situations that
they are likely to refer to: football (or ‘soccer’), rugby, and business, respectively.
Even the sentence The dog’s been bitten by a snake is likely to conjure up a frame in
which the dog is running free in the countryside, rather than being taken for a walk in the
city.
Conceptual frames are culturally relative and change over time. How natural, for you, are
the following sentences?
Bananas and apples fall into our prototypical category of fruit, though Brazilians and
British people might, for example, be more inclined to think of one or the other as a
preferred example. Tomatoes and cucumbers are – technically – also fruit, insofar as they
also develop from the flower and contain seeds. But they do not usually fall into the same
frame as apples and bananas when we think of eating ‘fruit’. Avocados are more
contentious. Brazilians may think of them as being like apples and bananas; British
people (like me) might be more inclined to think of them as being like vegetables.
Personally, I think of avocados as a type of savoury foodstuff, which I add olive oil to
and eat in salads. The idea of avocado mousse, with sugar added, or an avocado
smoothie, I initially found repulsive. For me, it was like suggesting a cucumber mousse
or tomato smoothie. But, living in Brazil for some time now, I have readjusted my frame
of what you eat in a salad and what you eat as dessert…or in a smoothie.
The default realisation of a mental process is the simple aspect, while the default
realisation of a material process is the present continuous. Thus the following sentences
are both acceptable and make sense:
The argument goes that we frame the actions as either mental (hear) or physical (listen)
processes. Typically, physical processes have a beginning, a middle and an end point:
when I listen to you, I might turn my head towards you, attend to you when you are
speaking, and stop listening when you stop speaking. The present progressive
acknowledges the duration of this physical activity. Mental processes do not have this
same ‘bounded’ characteristic: one doesn’t start or stop ‘hearing’, particularly in the
sense that it has here, which is something like ‘register and understand’. So, normally,
mental processes do not acknowledge duration, and the ‘default’ realisation is the simple
aspect.
But, as we have seen, frames are culturally relative, and the way we perceive things
might change. What is the difference between these two sentences – in other words, what
kind of ‘frame’ about the nature of the process do they trigger?
The first of these two sentences frames ‘loving’ as an unbounded process, something with
no beginning or end, which in this instance has attached itself to the series on Netflix. In
the second sentence we understand ‘loving’ as an experience of intense enjoyment that
began as the series started (or when I started watching it), and so is bounded by the
duration of the series so far. The second sentence is also more likely to be produced by a
slightly younger speaker – the use of ‘love’ and its ability to enter into these kinds of
structures has changed in the last few decades. Cognitive grammarians account for such
grammatical changes as cultural and conceptual shifts in the relationship of the word,
phrase or larger structure with the frames that govern the way we think about entities
(like fruit and vegetables) and processes (like events that are bounded or unbounded).
6.0 Metaphor
We have seen that cognitive grammarians account for aspects of sentence formation and
structure by appealing to the ways in which we construe an event or state from a
particular perspective, foregrounding the salient aspects and relating them to our frames
of typical situations. Often, however, we make sense of one conceptual domain with
reference to another – this is the basis of metaphor. For example, we might understand
the phenomenon of death in terms of absence, or being elsewhere:
Metaphor is far from a marginal issue in cognitive grammar – it is an everyday and
pervasive phenomenon, and it provides a means of accounting for grammatical behaviour
that might otherwise appear random and unrelated. It is normally part of the extension of meaning of a word, phrase
or structure that allows it to be used in a wider range of contexts with a wider range of
meanings. Metaphorical extension can be illustrated in the many nuanced interpretations
of the preposition through (cf Lee, 2001: 39-48 for more detail).
First of all, let us assume that the prototypical meaning of through relates to a locational
frame, that is, it is part of our knowledge about how things move. They can go over,
under, around or through, for example. If we think of how speakers visualise movement
through we can come up with what is called an ‘image schema’ which might look like
this:
[Image schema for ‘through’: a trajector moves from a source, enters and then exits a landmark, and continues towards a goal.]
In other words, the ‘basic’ meaning of through involves a trajector moving from a source
towards a goal, and, on the way, it enters and leaves a landmark. In a prototypical
sentence, all or most of these elements might be present, e.g.
The diesel fuel is transferred from an underground tank through pipes to above-ground
dispensers.
In this sentence, the trajector is the diesel fuel, the tank is the source, the pipes are the
landmark and the above-ground dispensers are the goal. However, if you look at a given
corpus of English, remarkably few instances of through actually appear in sentences of
physical motion and location. The basic meaning based on the image schema – of a
trajector moving into and out of a landmark towards a goal – frequently goes through a
number of increasingly abstract and metaphorical extensions. Consider the following
examples.
She looked through the window at the rain.
He walked through the door and sat down.
As David Lee observes (2001: 41), examples like these treat the trajectory and/or the
landmark in a more abstract fashion. In the first sentence, there is no actual trajectory,
just the woman’s line of sight which is directed beyond the window to the rain outside. In
the second example, the man presumably does not literally walk through the door (or he
would be hurt!); rather he passes through the doorway. In each of these cases, the basic
image schema is being extended in some way. The meaning of through becomes
increasingly more metaphorical in the following examples, where the preposition might
be used adverbially, or, sometimes, is combined with a particular verb to form a phrasal
verb.
First, there is the possibility that the trajector’s passage affects the landmark in some way,
possibly damaging it or consuming it completely:
The minister’s confessions are causing panic through the entire government.
I’m afraid we’ve gone through all our budget.
You and me, we’re through.
Then there is the possibility that the landmark is difficult for the trajector to pass through:
We had to push our way through the crowd to reach the exit.
In most of the examples we have considered so far, the motion is through space (although
sometimes that space is metaphorical, e.g. a relationship can be perceived as a space that
the lovers pass through, until the relationship is finished). If we draw upon the common
conceptual metaphor that TIME is SPACE, then we can understand the following
sentences:
The USA experienced a series of social revolutions through the 1960s and 1970s.
I woke up half way through the night, desperate for something to drink.
If we combine the metaphorical notion that TIME is SPACE, and that the landmark is an
obstacle, then events can be construed as ordeals to be endured:
She has been going through a very difficult divorce.
We sat through three hours of speeches.
In another type of metaphorical shift, there is also the possibility that the landmark is
perceived as an instrument by which the trajector can move from the source to the goal:
She got the job through a friend.
He obtained the tickets through an online agency.
In the above two sentences there is a metaphorical shift from the conceptual domain of
MOTION to the domain of ACQUISITION. To accomplish this shift, we need to think of
the trajector not as something that is moving, but as something that is acquiring. The goal
is the acquisition, and the landmark is the means of acquisition.
In the case of words like through, there is what is called ‘radial extension’ from the core image
schema. In other words, the basic meaning, rooted in our experience of a world where
objects might enter and leave landmarks on their way from source to goal, ‘radiates out’
to encompass other meanings, through the processes of abstraction (objects become
concepts) and metaphor (e.g. the passage through space is reconceptualised as a passage
through time, or as the process of acquiring something). The result is that words, phrases
and larger structures might be used to express meanings that, at first glance, seem
unrelated, but which can be explained by attending to the abstractions and metaphorical
relations that the linguistic phenomena enter into. At first glance, there might be little to
relate the use of through in We drove through the tunnel in the mountain range between
São Paulo and Minas and He’s going through a really hard time just now. But if we think
of the steps by which the meaning radiates from the core image schema, the relationship
may become clearer.
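As a rough way of visualising this radial extension, the following Python sketch records some of
the senses of through discussed above as a small network, in which each extended sense points
back to the sense it radiates from. The sense labels and the shape of the network are my own
informal summary of the discussion, not a formal cognitive-grammar analysis.

# Each sense of 'through' points back to the sense it radiates from (None = core).
RADIAL_NETWORK = {
    "motion through space (core)": {
        "extends": None,
        "example": "The diesel fuel is transferred from an underground tank "
                   "through pipes to above-ground dispensers.",
    },
    "abstract path (line of sight)": {
        "extends": "motion through space (core)",
        "example": "She looked through the window at the rain.",
    },
    "passage consuming the landmark": {
        "extends": "motion through space (core)",
        "example": "I'm afraid we've gone through all our budget.",
    },
    "motion through time": {
        "extends": "motion through space (core)",
        "example": "I woke up half way through the night.",
    },
    "enduring an ordeal": {
        "extends": "motion through time",
        "example": "He's going through a really hard time just now.",
    },
    "means of acquisition": {
        "extends": "motion through space (core)",
        "example": "She got the job through a friend.",
    },
}

def chain(sense):
    """Trace a sense back to the core meaning it radiates from."""
    steps = [sense]
    while RADIAL_NETWORK[sense]["extends"] is not None:
        sense = RADIAL_NETWORK[sense]["extends"]
        steps.append(sense)
    return " -> ".join(steps)

print(chain("enduring an ordeal"))
# enduring an ordeal -> motion through time -> motion through space (core)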
7.0 Conclusion
Cognitive grammar, then, shares with Chomskyan approaches to syntax a concern not so
much with text as with the mental processes that explain why we can say certain things in
certain ways. The difference between the approaches lies in the Chomskyan assumption
that we can account for syntax by creating a formal model that will generate all (and
only) the acceptable sentences of English, and that the most efficient version of this
model will effectively represent a human-specific ‘language organ’. Cognitive
grammarians do not share this assumption. Rather they assume that language originates
in our perceptual experiences of the world and the ‘image schemata’ that result from our
embodied perceptions. Meanings radiate out from these core image schemata via
abstraction and metaphor. The resulting structures continue to exhibit the trace evidence
of these processes, and they continue to express the knowledge frames, attributions of
salience and the perspective of the speaker.
8.0 Activities
1. Construal
From the perspective of cognitive grammar, how would you account for the factors that
trigger the following encodings in English?
a) Matteo kissed Gabriella.
b) Gabriella kissed Matteo.
c) Gabriella and Matteo kissed.
d) He’s eaten every biscuit on the plate.
e) He’s eaten each biscuit on the plate.
f) Would you like a chocolate?
g) Would you like some chocolate?
h) The woman at the corner table wants coffee.
i) The woman at the corner table wants a coffee.
j) The local team is playing really well at the moment.
k) The local team are playing really well at the moment.
2. Foregrounding
What is being foregrounded and backgrounded in the following sentences? Does any
seem less ‘natural’ than the others? If so, why?
a) To test her reflexes, I tapped a small hammer against her knee.
b) To test her reflexes, I tapped her knee with a small hammer.
c) To test her reflexes, I tapped a small hammer against Wendy.
d) To test her reflexes, I tapped Wendy with a small hammer.
3. Framing
What knowledge frames do you draw upon to make sense of the following sentences?
a) After five days, we saw land.
b) After five hours, we saw the ground.
Certain texts have been deliberately devised to deprive readers of contextual frames. In
the absence of a frame, how do you make sense of the following passages? How does
your understanding change when the frame becomes clear?
d) The procedure is actually quite simple. First, you arrange things into different
groups. Of course, one pile may be sufficient depending on how much there is
to do. If you have to go somewhere else due to lack of facilities that is the next
step; otherwise, you are pretty well set. It is important not to overdo things.
That is, it is better to do too few things at once than too many. In the short run,
this may not seem important but complications can easily arise. A mistake can
be expensive as well. At first, the whole procedure will seem complicated.
Soon, however, it will become just another fact of life. It is difficult to foresee
any end to the necessity for this task in the immediate future, but then one can
never tell. After the procedure is completed one arranges the materials into
different groups again. Then they can be put into their appropriate places.
Eventually, they will be used once more, and the whole cycle will then have to
be repeated. However, that is part of life.
Passages are from Bransford, J. D., and M. K. Johnson. “Contextual Prerequisites for
Understanding: Some Investigations of Comprehension and Recall.” Journal of Verbal
Learning and Verbal Behavior 11, no. 6 (1972): 717-726.
4. Extension
Match the sentences with the developing image schemata reproduced below. These
examples are based on the discussion of out in Lee (2001: 35ff).
Once you have matched the sentences, consider the radial network and discuss how
useful (or not) you find the visualisations of the image schemata.
[Figure: Radial network for out (Lee, 2001: 35), showing trajector (TR) and landmark (LM)
configurations for related senses of out, including (1) exteriority (the central meaning), (4) entry
into/presence in the conceptual field, (5) exit/absence from the conceptual field, (6) entry
into/presence in the cognitive field, and (7) exit from the cognitive field.]
9.0 Envoi
This course has introduced you to a fairly wide range of grammatical theories. The
purpose of looking at these different ‘grammars of English’ is to try to get ‘underneath’
the kinds of grammatical description you were probably exposed to in earlier parts of
your language study, and to offer you reasons why grammatical phenomena are described
in the ways that they are.
As you look back on this course, you might reconsider the evidence used by the different
grammars of English: the ‘discovery procedures’ of the structural grammarians, the
structure tests of the formal and generative grammarians, and the slightly more subjective
ways that surface features are linked to meanings by the SF and cognitive grammarians.
Corpus grammarians sometimes stress the data they can collect by computerised searches
more than the rigorous application of any one theory – but even their descriptions must
proceed on some theoretical basis and using some – inevitably debatable – assumptions
about what grammar is and what kinds of things grammars can and should tell us.
Each school has its own primary goal: the description of given structures, the generation of possible sentences, or the linking of
sentences to their social context or mental perceptions. However, it should by now be
obvious that insights from one school do often shape the procedures used in another
school. For instance, TG, SF and cognitive grammarians attempt to account for
‘processes, participants and circumstances’ in recent versions of their grammars, although
they tend to approach this topic in quite different ways. So do not expect much
consistency across textbooks and theoretical discussions in the field!
As emphasised throughout, this course can only deliver a sketch (sometimes approaching
a caricature) of the various grammars mentioned. The recommended reading gives some
guidance in the selection of introductory textbooks and more advanced work on each of
the grammars covered. None of the more advanced texts is particularly easy reading –
but it is in the nature of theory to be difficult. This workbook should at least help you get
oriented as you start your investigation of a sometimes tough but always rewarding
subject.
Further Reading
The following list includes books used heavily in the preparation of this course. Other good
books are coming on the market all the time. If a book does not appear on this list, its absence is not
necessarily an indication of its lack of worth!
General:
The following books give useful general background to grammar, or are standard reference
guides, including the mighty Quirk et al. (1985) A Comprehensive Grammar of the English Language and more
recent, corpus-informed pretenders to its authoritative throne, Biber et al (1999) and Carter and
McCarthy (2006). It is worth looking at them with a view to discovering which grammatical
theories underpin the descriptions given. Simpson (1979) gives a brief account of pre-20th
century grammatical theories as well as more detailed descriptions of 20th century schools of
thought. Sampson (1980) is also excellent if a little dated.
Biber, D., E. Finegan, S. Johansson, S. Conrad and G. Leech (1999) Longman Grammar of
Spoken and Written English
Carter R. and M. McCarthy (2006) Cambridge Grammar of English: A Comprehensive Guide
Crystal, D (1987) The Cambridge Encyclopedia of Language
Huddleston, R (1984) Introduction to the Grammar of English
Huddleston, R (1988) English Grammar: An Outline
Leech, G & Svartvik, J (1975) A Communicative Grammar of English
Palmer, F (1983) Grammar
Sampson, G (1980) Schools of Linguistics: Competition and Evolution
Simpson, JMY (1979) A First Course in Linguistics
Quirk, R & Greenbaum, S (1973) A University Grammar of English
Quirk, R, Greenbaum, S, Leech, G & Svartvik, J (1985) A Comprehensive Grammar of the English Language
Ferdinand de Saussure:
An introduction to the ‘father of modern linguistics’ by Culler, and the reconstituted ‘cours’ in
translation:
Systemic-Functional Grammar
Davidse, K (1987) ‘MAK Halliday's Functional Grammar and the Prague School’ in Dirven, R
and Fried, V eds Functionalism in Linguistics
Eggins, S (1994) An Introduction to Systemic Functional Linguistics
Halliday, MAK (1985) An Introduction to Functional Grammar
Halliday, MAK and Matthiessen, C (2004) An Introduction to Functional Grammar 3rd edn
Hudson, R (1986) ‘Systemic Grammar’ Linguistics 24 pp 791-815 (A review of Butler 1985 and
Halliday 1985)
Thompson, G (2004) Introducing Functional Grammar 2nd edn
Structural Grammar
Simpson (1979) has an accessible account of the strengths and weaknesses of structural grammar.
Lyons (1968) is more complicated but worth a look. The other books are primary reading –
Bloomfield’s legendary Language sketches out some grammatical principles amidst a wealth of
other information about phonetics and morphology. Fries fleshes out the skeleton; Hymes and
Fought provide a useful reflection on the continuing impact of structuralism, post-Chomsky.
Universal Grammar and Second Language Grammar
Most of these books presuppose some familiarity with UG and/or a Chomskyan grammatical
model. They are useful examples of how formal grammatical theory has been applied to second
language education. For a critical perspective, see Atkinson (1982).
Corpus-informed Grammar
See Biber et al (1999) and Carter and McCarthy (2006) in the general section above for recent
reference grammars that use corpus-informed insights. Sinclair’s anthology of articles is still a
good introduction to corpus grammar; Hunston (2002) goes from theory to applications; Hunston
and Francis (1999) suggest a new data-driven model of grammar. Anderson and Corbett (2017)
give a basic introduction to corpus-driven language analysis and guidance on how to use online
corpora.
Anderson, W. and J. Corbett (2017) Exploring English With Online Corpora 2nd edn.
Biber, D, Conrad, S and Reppen, R (1998) Corpus Linguistics: Investigating language structure
and use
Hunston, S. and G. Francis (1999) Pattern Grammar: a corpus-driven approach to the lexical
grammar of English
Hunston, S. ed (2002) Corpora in Applied Linguistics
McEnery, T, and Wilson, A (1996) Corpus Linguistics
Meyer, CF (2002) English Corpus Linguistics: An Introduction
O'Keeffe, A., M. McCarthy and R. Carter (2007) From Corpus to Classroom: Language Use and
Language Teaching
Sinclair, J (1991) Corpus, Concordance, Collocation
Cognitive Grammar
The main theorist behind Cognitive Grammar is Langacker, and his texts are the foundational
ones in the field. An accessible introduction is by David Lee, whose account is used as the basis
for the chapter in this workbook. A broader view of cognitive semantics and grammar is Lakoff
(1987).
Lakoff, G. (1987) Women, Fire and Dangerous Things: What Categories Reveal about the Mind
Langacker, R. (1987) Foundations of Cognitive Grammar: Theoretical Prerequisites
Langacker, R. (1990) Concept, Image and Symbol: The Cognitive Basis of Grammar
Lee, D. (2001) Cognitive Linguistics: an Introduction
Rudzka-Ostyn, B. (ed) (1988) Topics in Cognitive Linguistics (includes Langacker’s chapter ‘An
overview of cognitive grammar’).
Useful Web Resources
University College London has a web-based grammar course for undergraduates. It can be found
at http://www.ucl.ac.uk/internet-grammar
There has been an explosion of online corpora in the last two decades and they are becoming
more sophisticated and varied. They include:
The BYU portal at http://corpus.byu.edu/ gives access to the most powerful collection of corpora available
online. You can spend hours browsing these corpora, the most useful of which is probably the Corpus of
Contemporary American English. If you are interested in language change and diachronic linguistics, the
Corpus of Historical American English and TIME corpus are fascinating. If you are interested in pop
culture there is the SOAP corpus and there are also massive corpora of World English and news sites. You
have to register for this site and you will be limited to a certain number of queries per day unless you pay a
modest registration fee.
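If you would rather experiment offline, a simple concordance of the kind these sites provide can also be
produced with a few lines of Python using the freely available NLTK toolkit and its copy of the Brown
corpus. The sketch below is purely illustrative and has no connection with the BYU interface.

import nltk
from nltk.corpus import brown
from nltk.text import Text

# Fetch the Brown corpus the first time the script is run.
nltk.download("brown", quiet=True)

# Wrap the corpus in a Text object, which provides a simple concordancer.
brown_text = Text(brown.words())

# Print fifteen keyword-in-context lines for 'through'.
brown_text.concordance("through", width=79, lines=15)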
GlossaNet
GlossaNet, at http://glossa.fltr.ucl.ac.be/, run by the University of Louvain in Belgium, facilitates
concordance analysis of daily-updated corpora of newspaper texts in many languages. Users can specify
language and search term requirements, and receive concordances by email. The GlossaNet Instant facility
provides concordances online.
Hong Kong Polytechnic University Language Bank
The Hong Kong PolyU Language Bank resource, at http://langbank.engl.polyu.edu.hk/indexl.html, offers
access to a bank of corpora (of English and other languages), all of which can be searched, and
concordances created. The available corpora include the BNC Sampler, and corpora in the domains of
business, academia, travel and tourism, medicine and fiction.
Scottish Corpus of Texts & Speech (SCOTS) & Corpus of Modern Scottish Writing (1700-1950)
SCOTS, available at www.scottishcorpus.ac.uk, contains 4 million words of texts in Scottish English and
varieties of Scots, covering a wide range of genres from conversations and interviews to prose fiction,
poetry, correspondence and official documents from the Scottish Parliament. Twenty per cent of the corpus
is made up of spoken texts, which are presented as orthographic transcripts synchronised with streamed
audio/video recordings. Features include a concordancer and map visualisation. Complete texts can be
viewed and downloaded, and audio/video recordings can also be downloaded. Extensive demographic and
textual metadata is available for each text, and can be used to refine a search. A historical counterpart of 4
million words of written English in Scotland (including a subcorpus of transcribed letters) is available via
the same website.
WebCorp
WebCorp allows the user to harness the World Wide Web for use as a language corpus of English and other
languages: http://www.webcorp.org.uk/. WebCorp features collocation analysis, the possibility of filtering
results according to date and collocates, and a word list generator, which creates word lists for individual
web pages. While it is very difficult to use the Web to make quantitative statements about language,
because the overall quantity of data and the proportions of different registers are almost impossible to establish
(not least because it is constantly changing), the unparalleled quantity of authentic language data which the
Web offers makes it a valuable resource for exploring features of language such as uncommon words and
neologisms.
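To give a flavour of what treating the web as a corpus involves, here is a very rough do-it-yourself sketch
in Python. It is not WebCorp and omits all of WebCorp’s refinements (filtering, collocation analysis, word
lists); it simply fetches a single page, strips the HTML crudely, and prints keyword-in-context lines for a
search term. The URL in the final line is only a placeholder.

import re
import urllib.request

def kwic(url, keyword, context=40):
    """Crude keyword-in-context display for a single web page."""
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    # Strip scripts, styles and tags, then collapse whitespace (very roughly).
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    text = re.sub(r"\s+", " ", text)
    for match in re.finditer(r"\b%s\b" % re.escape(keyword), text, re.IGNORECASE):
        left = text[max(0, match.start() - context):match.start()]
        right = text[match.end():match.end() + context]
        print(f"{left:>{context}}[{keyword}]{right}")

# Placeholder URL - substitute any page you want to treat as a mini-corpus.
kwic("https://example.com/", "example")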
Essay Titles
The following essay titles are suggested for this part of the Topics in Grammar course. In the essay, you
are expected to show that you have engaged with and understood some of the recommended reading for the
course, and additional credit will be given if you have sought out and incorporated some recent research
relevant to the topic (e.g. in journals or newly-published books). You should consult at least two or three
different sources for whichever essay you choose; some questions require more independent work than others – credit
will be given for more ambitious essays.
Your essay should directly address the topic given, and make reference where appropriate to the
background reading (using proper citations and references). Credit will be given if you (a) adapt the
examples from your background reading to show that you have understood the theoretical principles being
discussed; and (b) engage in a critical discussion of the background reading, rather than simply reproducing
the ideas of others.
1. Explain the roles of the three ‘metafunctions’ in functional grammar. Discuss, too, the advantages
and disadvantages of devising ‘semantic’ definitions of grammatical constituents.
2. Discuss the acquisition of grammar EITHER by children OR second language learners from
EITHER the perspective of Universal Grammar OR systemic-functional grammar.
3. How are the constituents of Immediate Constituent Analysis identified and defined? In your
answer, illustrate by using ICA to give sample analyses of phrases and sentences.
4. Take two or three sentences of different types and write phrase structure (PS) rules for them.
What do PS rules attempt to do that, say, the tree diagrams of the structuralist grammarians do not
do?
6. What grammatical problems does the notion of the X-Bar attempt to solve? Furthermore, what
advantages in terms of power and economy does the concept of the X-Bar have for
Transformational-Generative grammar?
7. Illustrate different kinds of movement (or ‘transformation’). What are the advantages for a
grammatical theory of having a set of movement rules?
8. From your own reading into the subject, give an explanation of the key principles underlying the
Minimalist Program in recent Chomskyan linguistics.
9. Are corpus-based grammars really driven by data? Illustrate the kind of insights that a grammar
based on a computer corpus can give us, and discuss the roles of theory and evidence in delivering
these insights.
10. Using examples of your own, illustrate how the concepts of construal, perspective, foregrounding,
metaphor and frame are used in a cognitive approach to grammar. You might look at one of the
following topics in your discussion: prepositions & adverbs, phrasal verbs, verbal aspect, mass &
count nouns.