ResearchNotes

EXTRACT FROM RESEARCH NOTES ISSUE 24 / MAY 2006 / PAGES 2–5

Cambridge ESOL exams and the Common European Framework of Reference (CEFR)

LYNDA TAYLOR AND NEIL JONES, RESEARCH AND VALIDATION GROUP

Introduction

A previous Research Notes article explored issues of test comparability and the role of comparative frameworks as communicative tools (Taylor 2004). One framework which has a growing role for language testers is the Common European Framework of Reference (CEFR; Council of Europe 2001). At Cambridge we are often asked about the relationship between our ESOL exams and the CEFR; the nature of this relationship can be considered from four complementary, sometimes overlapping, perspectives.1

1. This article is based on a presentation given at IATEFL Harrogate in April 2006 and we are grateful to Dr Brian North for his helpful comments on an early draft.

The historical perspective

The origins of the CEFR date back to the early 1970s when the Council of Europe sponsored work within its Modern Languages Project to develop the Waystage and Threshold levels as sets of specified learning objectives for language teaching purposes. These two levels were designed to reflect achievable and meaningful levels of language competence, at a relatively low proficiency level, and to form part of a European unit/credit system for adult language learning. They defined levels of functional competence among language users, forming the basis for curriculum, syllabus, and later assessment design.

In the late 1980s Cambridge was one of several stakeholder organisations (with the British Council and BBC English) to provide funding and professional support for revising Threshold and Waystage (Van Ek and Trim 1998a, 1998b); the revised level descriptions underpinned test specifications for a revised PET exam in the mid 1980s and a new KET exam in the early 1990s. Linguistic and functional description of a third, higher proficiency level began in the 1990s, with support and participation on this occasion from the Association of Language Testers in Europe (ALTE); work on this third level took account of FCE and led to the publication of Vantage in 1999 (Van Ek and Trim 2001). As work extended on level descriptions for English, so the concept of a framework of reference levels began to emerge and to take on a more concrete form.

The conceptual perspective

In part, emergence of a framework formalised conceptual levels with which ELT learners, teachers and publishers had operated for some years – with familiar labels such as 'intermediate' or 'advanced'. Dr Brian North, one of the CEFR's authors, confirms its origins in traditional English Language Teaching levels:

    The CEFR levels did not suddenly appear from nowhere. They have emerged in a gradual, collective recognition of what the late Peter Hargreaves (Cambridge ESOL) described during the 1991 Rüschlikon Symposium as "natural levels" in the sense of useful curriculum and examination levels. The process of defining these levels started in 1913 with the Cambridge Proficiency exam (CPE) that defines a practical mastery of the language as a non-native speaker. This level has become C2. Just before the last war, Cambridge introduced the First Certificate (FCE) – still widely seen as the first level of proficiency of interest for office work, now associated with B2. In the 1970s the Council of Europe defined a lower level called "The Threshold Level" (now B1), originally to specify what kind of language an immigrant or visitor needed to operate effectively in society. Threshold was quickly followed by "Waystage" (now A2), a staging point half way to Threshold. The first time all these concepts were described as a possible set of "Council of Europe levels" was in a presentation by David Wilkins (author of "The Functional Approach") at the 1977 Ludwigshafen Symposium… (North 2006:8).

Cambridge's upper-intermediate level CAE exam, introduced in 1991, helped bridge the gap between FCE and CPE and was proposed as C1. Lastly, a lower Breakthrough level was proposed as A1. These six levels (A1-C2) thus constituted a 'language ladder', providing a pathway for upward progression in language teaching and learning with explicit opportunities to evaluate and accredit learning outcomes along the way. The Cambridge Main Suite exams (KET, PET, FCE, CAE and CPE) were already providing well-established and recognised accreditation 'stepping stones' along this pathway.

Emergence of these common reference levels, with their contributory elements such as language courses, public examinations, and published coursebooks, was formally confirmed through the Common European Framework project, managed between 1993 and 1996 by the Council of Europe with significant input from the Eurocentres organisation; the overarching aim was
to construct a common framework in the European context which would be transparent and coherent, to assist a variety of users in defining language learning, teaching and assessment objectives. A major strength was that it would build upon the shared understanding which already existed among teachers and other ELT stakeholders in the European context, but would also resolve some difficulties of relating language courses and assessments to one another; it would provide a common meta-language to talk about learning objectives and language levels and encourage practitioners to reflect on and share their practice. It's worth remembering that this took place in a larger context where notions of a socio-political and economic community in Europe were rapidly taking shape; an early motivation for revising Waystage and Threshold in the late 1980s had been their relevance to educational programmes of language learning for European citizenship.

Notions of framework development linked to language learning progression were nothing new. Wilkins' 1977 set of levels has already been referred to. In the UK context, the English Speaking Union (ESU) set up its 'framework project' in 1985 to devise a comprehensive frame of description for comparing the various examinations of the main English language boards (Taylor 2004). In the wider context of Europe, ALTE members were also by the early 1990s working systematically to co-locate their qualifications across different European languages and proficiency levels within a shared framework of reference. The aim was to develop a framework to establish common levels of proficiency in order to promote the transnational recognition of certification in Europe. The process of placing ALTE members' exams on the framework was based on content analysis of the tests, the creation of test guidelines for the quality production of exams, and the development of empirically validated performance indicators or Can Do statements in different European languages (see ALTE website www.alte.org). The resulting five-level ALTE Framework developed simultaneously during the mid-1990s alongside the six-level CEFR published in 1997. Since the two frameworks shared a common conceptual origin, similar aims – transparency and coherence – and comparable scales of empirically developed descriptors, Cambridge ESOL and its ALTE partners decided to conduct several studies to verify their alignment. This was achieved mainly through the ALTE Can Do Project in 1998-2000 (see below). Following publication of the CEFR in 2001 the ALTE members adopted the six CEFR levels (A1-C2).

One of the strengths of this conceptual approach to framework development has undoubtedly been its 'organic' development. Even in 1991, qualifications existed for other languages that could also be confidently associated with what were to become the CEFR and ALTE levels, including: the new advanced level DALF (Diplôme Approfondi de Langue Française) at C1; the Zertifikat Deutsch (ZD) at Threshold (B1); and the Kleines Deutsches Sprachdiplom (KDS) commonly considered an equivalent to Cambridge's CPE (C2).

The empirical perspective

Shared understanding among teachers, publishers and language testers enabled the framework concept to function quite well without extensive underpinning from measurement theory and statistics; but measurement theory has become increasingly important as attempts have been made to validate aspects of the CEFR empirically (North and Schneider 1998, North 2000a) and to link assessments to it (North 2006b).

Syllabus designers, coursebook publishers and language test providers worldwide, including Cambridge ESOL, seek to align their exams to the CEFR for reasons of transparency and coherence; claims of alignment can also assist in marketing communications to try and gain a competitive edge. However, any claim of alignment needs to be examined carefully; simply to assert that a test is aligned with a particular CEFR level does not necessarily make it so, even if that assertion is based on an intuitive or reasoned subjective judgement. To some extent, alignment can be achieved historically and conceptually as we have seen, but empirical alignment requires more rigorous analytical approaches; appropriate evidence needs to be accumulated and evaluated.

The ALTE Can Do Project (Jones 2001, 2002) was one of the empirical approaches used by Cambridge ESOL for aligning its original five levels with the six-level CEFR. Other empirical support for alignment comes from Cambridge's item-banking methodology underpinning our approach to all test development and validation (Weir and Milanovic 2003). The Cambridge-TOEFL Comparability Study, conducted in 1987-90 (Bachman et al 1995), highlighted how far the UK-based assessment tradition had relatively underplayed the psychometric dimension; for Cambridge ESOL this established an empirical imperative and we invested heavily in approaches and systems to address measurement issues such as reliability and version comparability. Latent trait methods have been used since the early 1990s to link the various Cambridge levels onto a common measurement scale using a range of quantitative approaches, e.g. IRT Rasch-based methodology, alongside qualitative research methods.
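
To make the latent trait approach more concrete, the short sketch below illustrates the general idea behind Rasch-based linking: once items have been calibrated onto a common logit scale, a candidate's estimated ability can be placed on that same scale and read against level thresholds. It is a minimal illustration only; the item difficulties, responses and cut-off values are invented for the example and do not represent Cambridge ESOL's item bank, calibration procedures or CEFR standards.

# Minimal Rasch (one-parameter IRT) sketch: place a learner on the same
# logit scale as pre-calibrated items, then read off an illustrative level.
# All numbers below are invented for illustration only.
import math

def rasch_probability(theta: float, difficulty: float) -> float:
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def estimate_ability(responses, difficulties, iterations=20):
    """Maximum-likelihood ability estimate via Newton-Raphson."""
    theta = 0.0
    for _ in range(iterations):
        expected = [rasch_probability(theta, b) for b in difficulties]
        gradient = sum(x - p for x, p in zip(responses, expected))
        information = sum(p * (1 - p) for p in expected)
        if information == 0:
            break
        theta += gradient / information
    return theta

# Item difficulties (logits) as they might sit in a calibrated item bank.
difficulties = [-1.6, -0.8, -0.2, 0.4, 1.1, 1.9]
# A candidate's scored responses to those items (1 = correct, 0 = incorrect).
responses = [1, 1, 1, 1, 0, 0]

theta = estimate_ability(responses, difficulties)

# Purely hypothetical cut-offs on the logit scale, not CEFR standards.
cutoffs = [(-2.0, "A2"), (-0.5, "B1"), (1.0, "B2"), (2.5, "C1")]
level = "A1"
for threshold, label in cutoffs:
    if theta >= threshold:
        level = label

print(f"estimated ability = {theta:.2f} logits, illustrative level = {level}")

In operational item banking the same principle is applied at a much larger scale, with common (anchor) items used to link different test versions and levels onto the shared measurement scale.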


More recently, Cambridge ESOL has supported the authoring and piloting of the Council of Europe's Manual Relating Language Examinations to the CEFR (Figueras et al 2005) which presents a linking process based on three sets of procedures:

Specification of the content and purpose of an examination

Similar procedures were conducted when the PET and KET test specifications were originally based upon Threshold and Waystage levels, and the ALTE partners' exams were aligned within the ALTE Framework; an extensive range of documentation for all our exams (test specifications, item writer guidelines, examiner training materials, test handbooks and examination reports) assists in specifying the content and purpose of existing and new exams with direct reference to the CEFR.

Standardisation of interpretation of CEFR levels

Suitable standardised materials are needed for assessment personnel and others to benchmark their tests against CEFR levels. Cambridge has helped develop such materials by supplying calibrated test items and tasks from our Main Suite Reading and Listening test item banks together with exemplar Speaking and Writing test performances from our writing examiner coordination packs and Oral Examiner standardisation materials at each CEFR level; a set of benchmarking materials, incorporating both classroom-based and test-based materials, is now available from the Council of Europe on CD or DVD.

Empirical validation studies

Empirical validation studies are a greater challenge, sometimes requiring specialist expertise and resources; Cambridge ESOL is among a relatively small number of examination providers undertaking this sort of research, partly through our routine item-banking and test calibration methodology and also through instrumental research and case studies such as the Common Scale for Writing Project (Hawkey and Barker 2004).
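
Purely as an illustration of how linking evidence might be organised under the three sets of procedures described above (and not as part of the Manual's own forms or procedures), the brief sketch below records evidence for a hypothetical exam and flags any procedure set still lacking evidence; the exam name, the entries and the helper code are invented.

# Illustrative only: a simple record of linking evidence organised under the
# Manual's three sets of procedures. The exam name and entries are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LinkingEvidence:
    exam: str
    specification: list = field(default_factory=list)        # content/purpose documentation
    standardisation: list = field(default_factory=list)      # benchmarking against CEFR samples
    empirical_validation: list = field(default_factory=list) # studies and analyses

    def gaps(self):
        """Names of the procedure sets with no recorded evidence yet."""
        return [name for name in ("specification", "standardisation", "empirical_validation")
                if not getattr(self, name)]

evidence = LinkingEvidence(
    exam="Hypothetical B2-level exam",
    specification=["test specifications", "item writer guidelines", "examination handbook"],
    standardisation=["benchmarking session using calibrated CEFR sample tasks"],
)

print("Evidence still needed for:", evidence.gaps())  # -> ['empirical_validation']
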
The evolutionary perspective

The CEFR remains 'work in progress'; it will continue to evolve as experience grows among those who use it in various ways and contexts, and as they reflect on that use. For many it already provides a useful frame of reference, offering practical guidance for their thinking and doing. Others have expressed concerns about its application: within the language testing community some fear use of the CEFR as an instrument for 'harmonisation' of policy/practice (Fulcher 2004); others question how far the CEFR provides a suitable instrument for operational test development (Weir 2005). In response, the CEFR authors emphasise the original intention of the Framework as a means of valuing and encouraging diversity, and remind us that the CEFR is not a 'cookbook' or 'how to' document. Perhaps the real value of the CEFR lies in it being used as a heuristic rather than prescriptively; it needs to be interpreted thoughtfully and intelligently if it is to be meaningful and have local validity.

Another useful role for the Framework in assessment could be in matters of quality assurance, not just to improve systems and procedures but to support the growing professionalisation of personnel and institutions involved in language learning, teaching and assessment. North (2006) notes that the scheme outlined in the Manual 'reflects the three step process of any Quality Management System (Design, Implementation, Evaluation)'. This view echoes Cambridge ESOL's long-standing commitment to addressing quality assurance issues. In the early 1990s ALTE produced its professional Code of Practice and has since then elaborated the concept of quality assurance in language testing by developing quality management instruments. Like the CEFR, the ALTE Code of Practice offers the practitioner community a common frame of reference and a shared meta-language for reflecting on and evaluating policy and practice – ensuring the door is always open for improvement.

Since 2001, the CEFR has also been a source of inspiration or a catalyst for other initiatives; one is the innovative European Language Portfolio (ELP) developed to support the language learning and teaching community with input from the EAQUALS organisation and the ALTE partners; another is the recently launched English Profile Project to develop a comprehensive set of Reference Level Descriptions for English using the CEFR levels as a springboard.

Conclusion

Today the CEFR plays a key role in language and education policy within Europe and the wider world – perhaps in ways not originally envisaged by its authors. Within Europe it is believed to serve policy goals of fostering linguistic diversity, transparency of qualifications, mobility of labour, and lifelong language learning. Beyond Europe it is being adopted to help define language proficiency levels with resulting implications for local pedagogy and assessment. For Cambridge ESOL it offers a valuable frame of reference for our work and for our stakeholder community; as it evolves, we look forward to continuing to make an appropriate professional contribution to its development.

Could it be argued that Cambridge ESOL exams 'embody' the Common European Framework? That will be for others to judge based on evidence presented here and elsewhere. It partly depends on how the word 'embody' is defined; but there does exist a growing body of evidence to support a claim that Cambridge exams contain and express the CEFR as an important feature, that they include the CEFR as part of their structure, and that they express or represent the CEFR in a variety of ways. Such embodiment is a natural outcome of several factors, such as historical legacy, conceptual synergy, and empirical underpinning. Extending the biological metaphor, we could envisage how the relationship between the CEFR and Cambridge ESOL exams will continue to evolve, partly due to the genetic makeup of the relationship itself and also as a result of external environmental factors in a changing world.

To celebrate his 80th birthday in 2004, Professor John Trim, one of the authors of the CEFR, was interviewed for Language Assessment Quarterly. In the interview, he describes the aspirations behind the Framework: 'What we were aiming at was something which will be a common reference point that people working in different fields and people using it for entirely different things and in very different ways could refer to in order to feel that they were part of a common universe' (Saville 2005:281). This focus on individual practitioners as the agents of activity is a welcome reminder that it is people, rather than frameworks, systems, or procedures, who are – or who should be – at the heart of what happens in language learning, teaching and assessment, i.e. learners, teachers, teacher trainers, course and syllabus designers, textbook writers, language test providers – anyone who is a stakeholder in the ELT or ESOL constituency, or who is a member of another language learning community.

Ultimately, it may be unhelpful to talk about 'embodiment' in relation to a course syllabus or an assessment tool; of greater interest and importance, both to the developers of the CEFR and to Cambridge ESOL, are surely the populations of human beings directly involved in language learning, teaching and test-taking, whether at the group or the individual level. The quality of the relationship between the CEFR and Cambridge ESOL exams is perhaps best judged by the extent to which together they enable language learning to flourish, encourage achievements to be recognised and so enrich the lives of individuals and communities.



References and further reading

Bachman, L F, Davidson, F, Ryan, K and Choi, I C (1995) An investigation into the comparability of two tests of English as a foreign language: The Cambridge-TOEFL comparability study, Cambridge: UCLES/Cambridge University Press.

Council of Europe (2001) Common European Framework of Reference for Languages: Learning, Teaching, Assessment, Cambridge: Cambridge University Press.

Figueras, N, North, B, Takala, S, Verhelst, N and Van Avermaet, P (2005) Relating Examinations to the Common European Framework: a Manual, Language Testing 22 (3), 1–19.

Fulcher, G (2004) Deluded by artifices? The Common European Framework and harmonization, Language Assessment Quarterly 1 (4), 253–266.

Hawkey, R and Barker, F (2004) Developing a common scale for the assessment of writing, Assessing Writing 9 (2), 122–159.

Jones, N (2001) The ALTE Can Do Project and the role of measurement in constructing a proficiency framework, Research Notes 5, 5–8.

—(2002) Relating the ALTE Framework to the Common European Framework of Reference, in Alderson, J C (Ed.), Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Case Studies, Strasbourg: Council of Europe Publishing, 167–183.

North, B (1995) The development of a common framework scale of descriptors of language proficiency based on a theory of measurement, System 23, 445–465.

—(2000a) The development of a common framework scale of language proficiency, New York: Peter Lang.

—(2000b) Linking Language Assessments: an example in a low-stakes context, System 28, 555–577.

—(2006) The Common European Framework of Reference: Development, Theoretical and Practical Issues, paper presented at the symposium 'A New Direction in Foreign Language Education: The Potential of the Common European Framework of Reference for Languages', Osaka University of Foreign Studies, Japan, March 2006.

North, B and Schneider, G (1998) Scaling Descriptors for Language Proficiency Scales, Language Testing 15 (2), 217–262.

Saville, N (2005) An interview with John Trim at 80, Language Assessment Quarterly 2 (4), 263–288.

Taylor, L (2004) Issues of test comparability, Research Notes 15, 2–5.

Van Ek, J A and Trim, J L M (1998a) Threshold 1990, Cambridge: Cambridge University Press.

—(1998b) Waystage 1990, Cambridge: Cambridge University Press.

—(2001) Vantage, Cambridge: Cambridge University Press.

Weir, C J (2005) Limitations of the Common European Framework for developing comparable examinations and tests, Language Testing 22 (3), 281–300.

Weir, C J and Milanovic, M (2003) Continuity and Innovation: Revising the Cambridge Proficiency in English examination 1913–2002, Studies in Language Testing Vol. 15, Cambridge: UCLES/Cambridge University Press.

This is an extract from Research Notes, published quarterly by University of Cambridge ESOL Examinations.
Previous issues are available on-line from www.CambridgeESOL.org.
© UCLES 2006 – this article may not be reproduced without the written permission of the copyright holder.

