APPLIED LINGUISTICS

Edited by Schmitt
1st Ch.:
An Overview of Applied Linguistics
By: Sarah H. Kwair
Supervised by:
Asst. Prof. Dr. Shuooq Aboodi Ali
19th Safar 1443 A.H. (Monday, 27th September 2021 A.D.)
What is Applied Linguistics?

‘Applied linguistics’ is using what we know about
(a) language,
(b) how it is learned, and
(c) how it is used,
in order to achieve some purpose or solve some problem in the real world.
In a broad sense, Wilkins (1999: 7) defines applied
linguistics as concerned with increasing understanding of
the role of language in human affairs and thereby with
providing the knowledge necessary for those who are
responsible for taking language-related decisions whether
the need for these arises in the classroom, the workplace,
the law court, or the laboratory.
The range of these purposes is partly illustrated by the
call for papers for the American Association of Applied
Linguistics (AAAL) 2010 conference:
• analysis of discourse and interaction
• assessment and evaluation
• bilingual, immersion, heritage and language minority education
• language and ideology
• language and learner characteristics
• language and technology
• language cognition and brain research
• language, culture, socialization and pragmatics
• language maintenance and revitalization
• language planning and policy
• reading, writing and literacy
• second and foreign language pedagogy
• second language acquisition, language acquisition and attrition
• sociolinguistics
• text analysis (written discourse)
• translation and interpretation.
Out of these numerous areas, the dominant application has
always been the teaching and learning of second or foreign
languages (L2). Around the world, a large percentage of
people, and a majority in some areas, speak more than one
language.
English is the main second language being studied in the
world today, and even a decade before this book was
published, an estimated 235 million L2 learners were
learning it
(Crystal, 1995: 108).
Traditionally, the primary concern of applied linguistics
has been
• second language acquisition theory,
• second language pedagogy
and
• the interface between the two
Carter and Nunan (2001: 2) list the following sub-
disciplines in which applied linguists also take an
interest:

• literacy,
• speech pathology,
• deaf education,
• interpreting and translating,
• communication practices,
• lexicography and
• first language acquisition.
Besides mother tongue education, language planning and
bilingualism/multilingualism, two other areas that Carter
and Nunan (2001) did not list are:
• authorship identification and
• forensic linguistics.
These areas exemplify how applied linguistics knowledge
may be utilized in practical ways in non-educational areas.
Authorship identification uses a statistical analysis of
various linguistic features in anonymous or disputed texts and
compares the results with a similar analysis from texts whose
authors are known.
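To make this concrete, below is a minimal, hypothetical sketch of such a comparison (not taken from Schmitt's chapter): it profiles the relative frequencies of a few common function words, a classic stylometric feature, and attributes a disputed text to the known author with the closest profile. The feature list and author names are illustrative assumptions.

```python
from collections import Counter

# Illustrative feature set; real stylometric studies use larger, carefully
# validated sets of features (function words, sentence length, etc.).
FEATURES = ["the", "of", "and", "to", "a", "in", "that", "it", "but", "not"]

def profile(text):
    """Relative frequency of each feature word in a text."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in FEATURES]

def distance(p, q):
    """City-block distance between two frequency profiles."""
    return sum(abs(a - b) for a, b in zip(p, q))

def likely_author(disputed, known_texts):
    """Attribute the disputed text to the author with the closest profile."""
    dp = profile(disputed)
    return min(known_texts,
               key=lambda author: distance(dp, profile(known_texts[author])))

# Usage (with texts of adequate length):
# likely_author(disputed_text, {"Author A": text_a, "Author B": text_b})
```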
Applied linguistics is also interested in cases where language goes
wrong. Fromkin (1973, 1980) mentions that we can better understand
how the brain functions when we analyse what happens when
a speaker’s language system breaks down or does not
function properly. Even slips of the tongue and ear committed
by normal individuals can give us insights into how the human
brain processes language. On that belief, researchers working
on language-related disorders study the speech of
aphasic(1), schizophrenic and autistic(2) speakers, as well as
hemispherectomy patients(3).
Notes:
1. Aphasia is a condition that robs you of the ability to communicate.
It can affect your ability to speak, write and understand language, both
verbal and written. Aphasia typically occurs suddenly after a stroke or a
head injury.
2. Autism is a neurodevelopmental disorder characterized by
difficulties with social interaction and communication, and by
restricted and repetitive behavior. Parents often notice signs during the
first three years of their child's life.
3. A hemispherectomy is a rare surgery where half of the brain is
either removed or disconnected from the other half. It's performed
on children and adults who have seizures that don't respond to medicine.
The Development of Applied Linguistics
Early History
18th century

Samuel Johnson (1755) and Robert Lowth (1762)
Plato and Aristotle contributed to the design of a curriculum
beginning with good writing (grammar), then moving on to
effective discourse (rhetoric) and culminating in the
development of dialectic to promote a philosophical approach
to life.
In 1755, Samuel Johnson published his Dictionary of the
English Language, which quickly became the
unquestioned authority on the meanings of English words.
It also had the effect of standardizing English spelling,
which until that time had been relatively variable
Robert Lowth published an influential grammar, A Short
Introduction to English Grammar (1762), but whereas
Johnson sought to describe English vocabulary by collecting
thousands of examples of how English words were actually
used, Lowth prescribed what ‘correct’ grammar should be.
He had no specialized linguistic background to do this, and
unfortunately based his English grammar on a classical Latin
model, even though the two languages are organized in quite
different ways.
The result was that English, which is a Germanic language,
was described by a linguistic system (parts of speech)
which was borrowed from Latin, which had previously
borrowed the system from Greek. The process of
prescribing, rather than describing, has left us with English
grammar rules which are much too rigid to describe actual
language usage:
• no multiple negatives (I don’t need no help from
nobody!)
• no split infinitives (So we need to really think about all
this from scratch.)
• no ending a sentence with a preposition (I don’t know
what it is made of.)
These rules made little sense even when Lowth wrote them, but
through the ages both teachers and students have generally
disliked ambiguity, and so Lowth’s notions of grammar were
quickly adopted once in print as the rules of ‘correct English’.
Applied Linguistics during the Twentieth Century
An Overview of the Century
The real change in linguistic description and pedagogy occurred
during the twentieth century. At the beginning of the century,
second languages were usually taught by the ‘Grammar-
translation method’, which had been in use since the late
eighteenth century, but was fully codified in the nineteenth
century by Karl Plötz (1819–1881).
The method grew into a very controlled system, with a heavy
emphasis on accuracy and explicit grammar rules, many of
which were quite obscure. The content focused on reading
and writing literary materials, which highlighted the archaic
vocabulary found in the classics.
One of the main problems with Grammar-translation was
that it focused on the ability to ‘analyse’ language, and not
the ability to ‘use’ it. In addition, the emphasis on reading
and writing did little to promote an ability to communicate
orally in the target language
A new pedagogical direction was needed. By the beginning of
the twentieth century, new use-based ideas had coalesced into
what became known as the ‘Direct method’.

This emphasized exposure to oral language, with listening
and speaking as the primary skills. Meaning was related
directly to the target language, without the step of translation,
while explicit grammar teaching was also downplayed.
It imitated how a mother tongue is learnt naturally, with
• listening first,
• then speaking, and only later
• reading and
• writing.
The Direct method had its own problems, however.
• It required teachers to be highly proficient in the target
language, which was not always possible.
• Also, it mimicked L1 learning, but did not take into
account the differences between L1 and L2 acquisition.

One key difference is that L1 learners have abundant exposure


to the target language, which the Direct method could not
hope to match.
Michael West was interested in increasing learners’ exposure
to language through reading. His ‘Reading method’
attempted to make this possible by promoting reading skills
through vocabulary management. To improve the readability of
his textbooks, he ‘substituted low-frequency “literary” words
such as isle, nought, and ere with more frequent items such as
island, nothing, and before’
He also controlled the number of new words which could
appear in any text.
These steps had the effect of significantly reducing the lexical
load for readers. This focus on vocabulary management was
part of a greater approach called the ‘Vocabulary Control
Movement’, which eventually resulted in a book called the
General Service List of English Words (West, 1953), which
listed the most useful 2000 words in English.
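As a rough illustration of what such vocabulary management involves (a toy sketch in the spirit of West's substitutions, not his actual procedure; the word lists below are illustrative stand-ins, not the General Service List itself):

```python
# Toy illustration of the Vocabulary Control Movement idea: replace
# low-frequency 'literary' words with frequent equivalents and measure
# the resulting lexical load of a text.

SUBSTITUTIONS = {"isle": "island", "nought": "nothing", "ere": "before"}

# Stand-in for a high-frequency list such as the General Service List.
HIGH_FREQUENCY = {"the", "island", "was", "empty", "before", "dawn",
                  "and", "nothing", "moved"}

def simplify(text):
    """Swap listed low-frequency words for more frequent equivalents."""
    return " ".join(SUBSTITUTIONS.get(w, w) for w in text.lower().split())

def lexical_load(text):
    """Proportion of tokens falling outside the high-frequency list."""
    words = text.lower().split()
    return sum(1 for w in words if w not in HIGH_FREQUENCY) / len(words)

original = "the isle was empty ere dawn and nought moved"
print(simplify(original))                # the island was empty before dawn and nothing moved
print(lexical_load(original))            # 3 of 9 tokens outside the list
print(lexical_load(simplify(original)))  # 0.0 after substitution
```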
American structural linguists stepped into the gap and developed
a programme which borrowed from the Direct method,
especially its emphasis on listening and speaking. It drew its
rationale from the dominant psychological theory of the time,
Behaviourism, which essentially said that language learning was
a result of habit formation.
The method included activities which were believed to
reinforce ‘good’ language habits, such as
• close attention to pronunciation,
• intensive oral drilling,
• a focus on sentence patterns and
• memorization.
Students were expected to learn through drills rather than
through an analysis of the target language. The students who
went through this ‘Army method’ were mostly mature and
highly motivated, and their success was dramatic. This
success meant that the method naturally continued on after
the war, and it came to be known as ‘Audiolingualism’
Chomsky’s (1959) attack on the behaviourist underpinnings of
structural linguistics in the late 1950s proved decisive, and its
associated pedagogical approach – audiolingualism – began to
fall out of favour.

Supplanting the behaviourist idea of habit-formation,
language was now seen as governed by cognitive factors, in
particular a set of abstract rules which were assumed to be
innate.
Chomsky (1959) posited that children are born with an
understanding of the way languages work, which was referred
to as ‘Universal Grammar’.

Children would need only enough exposure to a language to
determine whether their L1 allowed the deletion of pronouns
(+pro drop, for example, Japanese) or not (–pro drop, for
example, English). This parameter-setting would require much
less exposure than a habit-formation route, and so appeared a
more convincing argument for how children learned language so
quickly.
In the early 1970s, Hymes (1972) added the concept of
‘communicative competence’, which emphasized that
language competence consists of more than just being able to
‘form grammatically correct sentences but also to know when
and where to use these sentences and to whom’
This helped to swing the focus from language ‘correctness’
(accuracy) to how suitable any use of language was for a
particular context (appropriacy). At the same time, Halliday’s
(1973) systemic-functional grammar was offering an alternative
to Chomsky’s approach, in which language was seen not as
something exclusively internal to a learner, but rather as a means
of functioning in society.
Halliday (1973) identified three types of function:
• ideational (telling people facts or experiences)
• interpersonal (maintaining personal relationships with people)
• textual (expressing the connections and organization within a
text, for example, clarifying, summarizing, signalling the
beginning and end of an argument)
Halliday’s (1973) systemic-functional grammar approach to
language highlighted its communicative and dynamic nature.
These and other factors pushed the field towards a more
‘communicative’ type of pedagogy
The revised 1998 version of the Council of Europe's functional
syllabus lists six broad categories of language function:
• imparting and seeking factual information
• expressing and finding out attitudes
• getting things done (suasion)
• socializing
• structuring discourse
• communication repair.
In addition, eight general categories of notions were listed,
which are shown here
with representative examples of their sub-classes:
• existential (existence, presence, availability)
• spatial (location, distance, motion, size)
• temporal (indications of time, duration, sequence)
• quantitative (number, quantity, degree)
• qualitative (shape, colour, age, physical condition)
• mental (reflection, expression of ideas)
• relational (ownership, logical relations, effect)
• deixis (anaphoric and non-anaphoric proforms, articles).
In the early 1980s, a theory of acquisition promoted by
Krashen (1982) focused attention on the role of input.
Krashen’s ‘Monitor theory’ posited that a second language was
mainly unconsciously acquired through exposure to
‘comprehensible input’ rather than being learnt through explicit
exercises, that it required a focus on meaning rather than form,
and that a learner’s emotional state can affect this acquisition
(‘affective filter’).
The methodology which developed from these factors
emphasized the use of language for meaningful
communication – communicative language teaching (CLT)
(Littlewood, 1981). The focus was on learners’ message and
fluency rather than their grammatical accuracy.
The assumption was that the learners would acquire the L2
simply by using it to learn the subject matter content,
without the L2 being the focus of explicit instruction. Taking
the communicative approach to its logical extreme, students
could be enrolled in ‘immersion’ programmes where they
attended primary or secondary schools which taught subject
matter only in the L2.
It was found that learners could indeed become quite fluent in
an L2 through exposure without explicit instruction, and that
they developed excellent receptive skills. However, the
learners continued to make certain persistent grammatical
errors, even after many years of instruction. In other words, a
communicative approach helped learners to become fluent,
but was insufficient to ensure comparable levels of
accuracy.
Until the 1980s, tests were evaluated according to three
principal criteria:
• ‘Validity’ (did the test really measure what it was supposed
to measure?)
• ‘Reliability’ (did the test perform consistently from one
administration to the next?)
• ‘Practicality’ (was the test practical to give and mark in a
particular setting?).
Messick (1989) changed this with a seminal paper which argued
that tests could not be considered ‘valid’ or ‘not valid’ in a black
and white manner by focusing only on test-internal factors;
rather, one needed to argue for the validity of a test by
considering a variety of factors.
Other criteria that should be considered:
• for what kind of examinee was the test suitable?
• what reasonable inferences could be derived from the scores?
• how did the test method affect the scores?
• what kind of positive or negative effect (‘washback’) might the
test have on stakeholders?
• and many others.
Tests are now seen in the context of a complete assessment
environment, which includes:
• stakeholders (for example, examinees, raters, administrators,
government officials),
• test conditions (for example, can everyone hear the tape
recorder clearly),
• the intended use of the scores (for example, will they be used
for relatively ‘high-stakes’ purposes (university admission)
versus relatively ‘low stakes’ purposes (a classroom quiz)) and
• characteristics of the test itself (Are the instructions clear?
What kind of tasks does the test employ?).
Within this framework, tests are generally seen as being
suitable for particular purposes and particular sets of
learners, rather than ‘one size fits all’.
Of recent technological innovations, the development of
personal computers has probably had the greatest impact
on applied linguistics.

Pedagogically, this opened the door to ‘computer-assisted
language learning’ (CALL), where learners could work on
individual computers truly at their own pace.
Computing technology also made it possible to analyse large
databases of language, called ‘corpora’.
Corpora have provided numerous insights into the workings of
language.
Corpora are now a key tool in lexicography, and have been
consulted in the development of most current learner
dictionaries.
Evidence from corpora of spoken discourse has also
highlighted the differences between spoken and written
discourse.
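As a minimal sketch of the kind of analysis corpora make possible (real corpora run to millions of words and are studied with dedicated tools; this toy example only shows the principle), frequency counts and simple collocation lists can be computed directly from text:

```python
from collections import Counter

def tokenize(text):
    return text.lower().split()

def word_frequencies(corpus):
    """A frequency list: the most basic corpus statistic."""
    return Counter(tokenize(corpus))

def right_collocates(corpus, node):
    """Words immediately following the node word: a crude collocation measure."""
    tokens = tokenize(corpus)
    return Counter(after for before, after in zip(tokens, tokens[1:])
                   if before == node)

corpus = "strong tea and strong coffee but powerful computers and powerful engines"
print(word_frequencies(corpus).most_common(3))
print(right_collocates(corpus, "strong"))    # tea, coffee
print(right_collocates(corpus, "powerful"))  # computers, engines
```

Patterns like strong tea versus powerful computers are exactly the kind of collocational preference that corpus evidence has made visible to lexicographers.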
Incorporating Social and Cultural Elements into AL

The mid-twentieth-century domination of behaviourism
as the overriding psychological paradigm (at least in
English-speaking countries) meant that only stimuli (that
is, teaching input) and reactions (student responses) which
could be observed were considered worthy of discussion in
the area of psychology.
In linguistics, a similar dichotomy occurred when Saussure
(1857–1913) split language (‘langue’) from the actual use of
language (‘parole’).

Chomsky’s (1965) ideas had a similar effect, as they
distinguished what was happening inside the learner (‘language
competence’) from what was observable outside the person
(‘language performance’).
One view of cognition, called ‘sociocultural theory’,
emphasizes individual– social integration by focusing on the
necessary and dialectic relationship between the sociocultural
endowment (the ‘inter’-personal interface between a person and
his or her environment) and the biological endowment (the
‘intra’-personal mechanisms and processes belonging to that
person), out of which emerges the individual.
Sociocultural theory suggests that in order to understand the
human mind, one must look at these two endowments in an
integrated manner, as considering either one individually will
inevitably result in an incomplete, and thus inaccurate,
representation.

For it is only through social interaction with others that
humans develop their language and cognition.
Psycholinguistic Perspectives in Applied Linguistics

One of the most noticeable recent trends has been the
establishment of a more psychological perspective on
language acquisition, processing and use.
Psycholinguistic perspectives have now become a major
influence in applied linguistics, in areas ranging from theory
building to research methodology
Perhaps the most noticeable outcome is that the current leading
theories of how second languages are acquired are all
informed by psycholinguistic thinking and research.
The point shared by these theories is that the mind
extracts recurring patterns from the language input a
learner receives.
For example, from the input alone a learner comes to know that
English words can begin with the cluster spl- (splatter, split,
spleen), but not with a cluster like zlf-.
The learner’s linguistic knowledge is ‘constructed’ through
general learning mechanisms, rather than being innately in
place, as Chomsky posited more than half a century earlier.

The process is implicit, but eventually the patterns may become
salient enough that a learner is able to describe them explicitly.
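A toy sketch of the kind of distributional extraction these usage-based accounts describe (the input list is an illustrative stand-in for real language exposure): counting word-initial letter clusters in the input shows why spl- is extractable as a possible English onset while zlf- is not.

```python
from collections import Counter

# Illustrative stand-in for the language input a learner receives.
INPUT_WORDS = ["splatter", "split", "spleen", "splash", "spring", "string", "street"]

def initial_clusters(words, length=3):
    """Count word-initial letter sequences of the given length."""
    return Counter(w[:length] for w in words if len(w) >= length)

clusters = initial_clusters(INPUT_WORDS)
print(clusters["spl"])  # 4 -> a recurring, hence extractable, pattern
print(clusters["zlf"])  # 0 -> never attested in the input
```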
A related trend is the use of psycholinguistic research
methodologies to explore language processing in much more detail
than was possible before. Previously, most language measurement required
explicit knowledge of linguistic features because learners were
required to write down or say their answers.

Newer psycholinguistic techniques can look into the inner
workings of the brain while learners are using language in various
ways.
This has now made research into the very initial pre-conscious
stages of language learning possible. For example, Schmitt
describes how this is beginning to revolutionize research into
vocabulary acquisition. He relates how
• Reaction-timing studies can inform about the development
of automaticity of lexical access.
• Priming studies can show the acquisition of collocation
pairings.
• Eye-movement studies can show how formulaic sequences
are read by native and non-native speakers.
• Event-Related Potentials (ERP) can indicate the very
earliest traces of lexical learning.
• Functional Magnetic Resonance Imaging (fMRI) can show
the locations where various types of word (for example, words
relating to parts of the body) are activated in the brain.
There is a growing awareness that the various factors also affect
each other in dynamic and fluid ways. For example, language
learners’ willingness to communicate (WTC) is partially
dependent on their levels of proficiency and on their linguistic
self-confidence.

In turn, successful communication can improve the learner’s
language proficiency and enhance their confidence.
Greater proficiency should lead to greater confidence.
Conversely, greater confidence may lead to the learners
putting themselves in situations where they use and practise
their language more, which in turn may lead to improved
proficiency
Themes to Watch For in this Book
The Interrelationship of the Areas of Applied Linguistics

Language is a big, complex subject and we are nowhere near to
being able to comprehend it in its entirety. The best any person
can do at the moment is to study a limited number of elements of
language, language use and language learning, and try to
understand those elements in detail.
As a result, applied linguistics is compartmentalized to some extent.
This compartmentalization is an expedient which enables us
to get around our cognitive limitations as human beings; it is
not the way language works in the real world. Language,
language learning and language use are a seamless whole and
all of the various elements interact with each other in
complex ways.
The Move from Discrete to more Holistic and
Integrative Perspectives

Up until the middle of the last century, language was viewed
in very discrete terms:
• grammar,
• phonology and
• vocabulary,
each of which could be separately identified and described.
Now language use is seen not just as a product of a number of
individual language ‘knowledge bits’ which reside completely
within ‘interlocutors’ (language users); it is also affected by a
number of other factors, such as:

• The social context (who you are communicating with and
for what purpose),
• The degree of involvement and interaction,
• The mode of communication (written versus spoken) and
• Time constraints.
Lexico-grammar and Formulaic Language

One of the most interesting developments in applied linguistics
today is the realization that vocabulary and grammar are
not necessarily separate things, but may be viewed as two
elements of a single language system referred to as ‘lexico-
grammar’ (Halliday, 1978). This term acknowledges that much
of the systematicity in language comes from lexical choices
and the grammatical behaviour of those choices (see p. 12).
Bringing the Language Learner into the Discussion

In the early 1970s, it was realized that learners are active
participants in the learning process and should be allowed to
take substantial responsibility for their own learning.
This first led to the development of the area of ‘learner
strategies’: it followed that what these learners did
would make a difference in the quality and speed of their
learning.
More recently, there has been a great deal of emphasis on
how the individual characteristics of each learner affect
their learning (that is, individual differences).
A range of differences either constrain or facilitate the rate at
which second languages are learned, including
• age (Birdsong, 2006),
• aptitude (Dörnyei, 2005),
• learning style preferences (Cohen and Weaver, 2006),
• strategy use (Griffiths, 2008) and
• motivation.
New Perspectives on Teaching the Four Skills

The teaching of the four language skills has long been an
important concern in second language pedagogy.
Although it is useful to give attention to the unique sub-
skills and strategies associated with each skill, it is also
important to consider the overlaps in mode (oral versus
written) and process (receptive versus productive):
                 Oral          Written
Receptive        LISTENING     READING     (decoding)
Productive       SPEAKING      WRITING     (encoding)
Top-down processing utilizes shared knowledge, pragmatic
knowledge and contextual information to achieve an
appropriate interpretation or realization of textual meanings
and messages.
Bottom-up processing depends on language resources –
lexico-grammar and phonology (pronunciation) or
orthography – as aids to the accurate decoding or
interpretation, or encoding or realization, of meaningful text.
The Lack of ‘Black and White’ Answers

Because language is created and processed both between
interlocutors and within the human mind, much of what is of
interest in applied linguistics is hidden from direct view and
study; i.e. most research has to rely on indirect evidence
observable through language processing and use.
Conclusion

Language learning and language use are not static, but are
constantly evolving. At the point in time when you read this
book, they will still be changing.
Thank
You!
