Assessing foreign/second language writing ability
Christine Coombe
Dubai Men's College, Higher Colleges of Technology, Dubai, United Arab Emirates
Abstract
Purpose – Having a certain degree of assessment literacy is crucial for today’s language teachers.
The main aim of this paper is to provide that knowledge as it pertains to the writing skill. More
specifically, the purpose of this paper is to provide an overview of the main practical issues that teachers
often face when evaluating the written work of their students. It will consider issues and solutions in five
major areas: test design; test administration; ways to assess writing; feedback to students; and the effects
on pedagogy.
Design/methodology/approach – The author takes a practical and principled approach to the
complete process of assessing students' written work in a foreign or second language.
Findings – The cyclical relationship between teaching and assessment can be made entirely positive
provided that the assessment is based on sound principles and procedures. Both teaching and
assessment should relate to the learners’ goals and very frequently to institutional goals.
Practical implications – Good teachers spend a lot of time ensuring that their writing assessment
practices are valid and reliable. The author deals with the fundamental issues that underlie good test design
in a very practical and understandable way and later suggests practical steps to ensure smooth and reliable
test administration before dealing with ways to assess a range of different writing tasks. Then, the crucial
issue of how best to provide useful developmental feedback to students is considered. She concludes by
discussing how best testing practice should seek to accommodate the requirements of test takers.
Originality/value – This topic is significant as assessing foreign/second language writing skills is
one of the most problematic areas in language testing. It is made even more important because good
writing ability is very much sought after by higher education institutions and employers.
Keywords Languages, Literacy, Assessment, Language teaching
Paper type General review
Introduction
Assessing writing skills is one of the most problematic areas in language testing. It is
made even more important because good writing ability is very much sought after by
higher education institutions and employers. To this end, good teachers spend a lot of time
ensuring that their writing assessment practices are valid and reliable. This paper
explores the main practical issues that teachers often face when evaluating the written
work of their students. It will consider issues and solutions in five major areas: test design;
test administration; ways to assess writing; feedback to students; and effects on pedagogy.
Education, Business and Society: Contemporary Middle Eastern Issues, Vol. 3 No. 3, 2010, pp. 178-187. © Emerald Group Publishing Limited, ISSN 1753-7983. DOI 10.1108/17537981011070091. Published by kind permission of HCT Press.

Test design
Approaches to writing assessment
The first step in test design is for teachers to identify which broad approach to writing
assessment best suits their purposes: direct or indirect. Indirect writing
assessment measures correct usage in sentence-level constructions and focuses on
spelling and punctuation via objective formats such as multiple-choice questions and cloze
tests. These measures are supposed to determine a student's knowledge of writing
sub-skills, such as grammar and sentence construction, which are assumed to constitute
components of writing ability. Indirect writing assessment measures are largely
concerned with accuracy rather than communication.
Direct writing assessment measures a student's ability to communicate through the
written mode based on the production of written texts. This type of writing assessment
requires the student to come up with the content, find a way to organize the ideas, and
use appropriate vocabulary, grammatical conventions and syntax. Direct writing
assessment integrates all elements of writing. The choice of one approach over another
should inform all subsequent choices in assessment design.
Writing prompt. Hyland (2003, p. 221) defines the prompt as “the stimulus the student
must respond to”. Kroll and Reid (1994, p. 233) identify three main prompt formats: base,
Expected response. This is a description of what the teacher intends students to do with
the writing task. Before communicating information on the expected response to
students, it is necessary for the teacher to have a clear picture of what type of response
they want the assessment task to generate.
Post-task evaluation. Finally, whatever way is chosen to assess writing, it is
recommended that the effectiveness of the writing tasks/tests is evaluated. According to
Hyland (2003), good writing tasks are likely to produce positive responses to the following
questions:
• Did the prompt discriminate well among my students?
• Were the essays easy to read and evaluate?
• Were students able to write to their potential and show what they knew?
Topic restriction. This is in addition to the aspects of test design described above.
Topic restriction is a controversial and often heated issue in writing assessment. It is
the belief that all students should be asked to write on the same topic, with no
alternatives allowed. Many teachers believe that students perform better when
they have the opportunity to select a prompt from a variety of alternative topics.
When given a choice, students often select the topic that interests them and one
for which they have background knowledge. The obvious benefit of providing students
with a list of alternatives is that if they do not understand a particular prompt, they will
be able to select another. The major advantage of giving students a choice of writing
prompt is the reduction of student anxiety.
On the other hand, the major disadvantage of providing more than one prompt is that it
is often difficult to write prompts which are at the same level of difficulty. Many testers feel
that it is generally advisable for all students to write on the same topic because allowing
students to choose topics introduces too much variance into the scores. Moreover, marker
consistency may be reduced if all papers read at a single writing calibration session are not
on the same topic. It is the general consensus within the language testing community that
all students should write on the same topic and preferably on more than one topic. Research
results, however, are mixed on whether students write better with single or with multiple
prompts (Hamp-Lyons, 1990). It is thought that the performance of students who are given
multiple prompts may be less than expected because students often waste time selecting a
topic instead of spending that time writing. If it is decided to allow students to select a topic
from a variety of alternatives, alternative topics should be of the same genre and rhetorical
pattern. This practice will make it easier to achieve inter-rater reliability.
Self-assessment. There are two self-assessment techniques that can be used in writing
assessment: dialog journals and learning logs. Dialog journals require students to
regularly make entries addressed to the teacher on topics of their choice. The teacher
then writes back, modeling appropriate language use but not correcting the student’s
language. Dialog journals can be in a paper/pencil or electronic format. Students
typically write in class for a five- to ten-minute period either at the beginning or end of the
class. If you want to use dialog journals in your classes, make sure you do not assess
students on language accuracy. Instead, Peyton and Reed (1990) recommend that you
assess students on areas like topic initiation, elaboration, variety and use of different
genres, expression of interests and attitudes, and awareness of the writing process.
Peer assessment. Peer assessment is yet another technique that can be used when
assessing writing. Peer assessment involves the students in the evaluation of writing.
One of the advantages of peer assessment is that it eases the marking burden on the
teacher. Teachers do not need to mark every single piece of student writing, but it is
important that students get regular feedback on what they produce. Students can use
checklists, scoring rubrics or simple questions for peer assessment. The major
rationale for peer assessment is that when students learn to evaluate the work of their
peers, they are extending their own learning opportunities.
Portfolio assessment. In writing assessment, a portfolio is defined as a purposeful
collection of student writing over time, one which shows the stages in the writing
process a text has gone through and thus the stages of the writer's growth.
Several well-known testers have put forth lists of characteristics that exemplify
good portfolios. For instance, Paulson et al. (1991) believe that portfolios must include
student participation in four important areas:
(1) the selection of portfolio contents;
(2) the guidelines for selection;
(3) the criteria for judging merit; and
(4) evidence of student reflection.
Once benchmark papers have been selected, the team of experienced raters needs to rate
the scripts using the scoring criteria and agree on a score. It will be helpful to note down a
few of the reasons why the script was rated in such a way. Next, the lead arbitrator needs
to conduct a calibration session (oftentimes referred to as a standardization or norming
session) where the entire pool of raters rate the sample scripts and try to agree on the
scores that each script should receive. In these calibration sessions, teachers should
evaluate and discuss benchmark scripts until they arrive at a consensus score. These
calibration sessions are time consuming and not very popular with groups of teachers
who often want to get started on the writing marking right away. They can also get very
heated especially when raters of different educational and cultural backgrounds are
involved. Despite these disadvantages, they are an essential component of standardizing
writing scores.
Table II. Sample marking codes for writing

sp   Spelling
vt   Verb tense
ww   Wrong word
wv   Wrong verb
     Nice idea/content!
     Switch placement
{    New paragraph
?    I do not understand
Research indicates that teacher-written feedback is highly valued by second language
writers (Hyland, 1998, as cited in Hyland, 2003) and many students particularly value
feedback on their grammar (Leki, 1990). Although positive remarks are motivating and
highly valued by students, Hyland (2003, p. 187) points out that too much praise or
positive commentary early in a writer's development can make students complacent
and discourage revision.
Effects on pedagogy
The cyclical relationship between teaching and assessment can be made entirely
positive provided that the assessment is based on sound principles and procedures.
Both teaching and assessment should relate to the learners’ goals and very frequently
to institutional goals.
Process versus product. The goals of all the stakeholders can be met when a
judicious balance is established, in the local context, between process and product. In
recent years, there has been a shift towards focusing on the process of writing rather
than on the written product. Some writing tests have focused on assessing the whole
writing process from brainstorming activities all the way to the final draft (or finished
product). In using this process approach, students usually have to submit their work in
a portfolio that includes all draft material. A more traditional way to assess writing is
through a product approach. This is most frequently accomplished through a timed
essay, which usually occurs at the mid and end point of the semester. In general, it is
recommended that teachers use a combination of the two approaches in their teaching
and assessment, but the approach ultimately depends on the course objectives.
Some aspects of good teacher-tester practice. Teachers and testers know that any
type of assessment should first and foremost reflect the goals of the course, so they start
the test development process by reviewing their test specifications. They will avoid a
“snap shot” approach to writing ability by giving students plenty of opportunities to
practice a variety of different writing skills. They will practice multiple measures writing
assessment by using tasks which focus on product (e.g. essays at midterm and final) and
process (e.g. writing portfolio). They will give more frequent writing assessments because
they know that assessment is more reliable when there are more samples to assess. They
will provide opportunities for a variety of feedback from teacher and peers, as well as
opportunities for the learners to reflect on their own progress. Overall, they will ensure
that the learners' focus is maintained on the learning process and enable them to see
that the value of testing lies primarily in enhancing learning by measuring real
progress and identifying areas where further learning is required.
References
Cohen, A. (1994), Assessing Language Ability in the Classroom, Heinle & Heinle, Boston, MA.
Davidson, P. and Lloyd, D. (2005), “Guidelines for developing a reading test”, in Lloyd, D.,
Davidson, P. and Coombe, C. (Eds), The Fundamentals of Language Assessment:
A Practical Guide for Teachers in the Gulf, TESOL Arabia, Dubai.
Hamp-Lyons, L. (1990), “Second language writing: assessment issues”, in Kroll, B. (Ed.), Second
Language Writing: Research Insights for the Classroom, Cambridge University Press,
New York, NY, pp. 69-87.
Hamp-Lyons, L. (1991), “Scoring procedures for ESL contexts”, in Hamp-Lyons, L. (Ed.),
Assessing Second Language Writing in Academic Contexts, Ablex, Norwood, NJ, pp. 241-76.
Heaton, J.B. (1990), Classroom Testing, Longman, Harlow.
Hyland, F. (1998), "The impact of teacher written feedback on individual writers", Journal of
Second Language Writing, Vol. 7 No. 3, pp. 255-86.
Hyland, K. (2003), Second Language Writing, Cambridge University Press, Cambridge.
Jacobs, H.L., Zinkgraf, S.A., Wormuth, D.R., Hartfiel, V.F. and Hughey, J.B. (1981), Testing ESL
Composition: A Practical Approach, Newbury House, Rowley, MA.
Kroll, B. and Reid, J. (1994), "Guidelines for designing writing prompts: clarifications, caveats
and cautions", Journal of Second Language Writing, Vol. 3 No. 3, pp. 231-55.
Leki, I. (1990), “Coaching from the margins: issues in written response”, in Kroll, B. (Ed.),
Second Language Writing: Insights from the Language Classroom, Cambridge University
Press, Cambridge, pp. 57-68.
Markin (2002), "Electronic editing program", Creative Technology, available at: www.cict.co.uk/
software/markin4/index.htm
Paulson, F., Paulson, P. and Meyer, C. (1991), “What makes a portfolio a portfolio?”, Educational
Leadership, Vol. 48 No. 5, pp. 60-3.
Peyton, J.K. and Reed, L. (1990), Dialogue Journal Writing with Non-native English Speakers:
A Handbook for Teachers, Teachers of English to Speakers of Other Languages,
Alexandria, VA.
Santos, M. (1997), “Portfolio assessment and the role of learner reflection”, English Teaching
Forum, Vol. 35 No. 2, pp. 10-14.