To cite this article: Mercedes Douglas, Juliette Wilson & Sean Ennis (2012). Multiple-choice question tests: a convenient, flexible and effective learning tool? A case study. Innovations in Education and Teaching International, 49(2), 111–121. DOI: 10.1080/14703297.2012.677596
Innovations in Education and Teaching International
Vol. 49, No. 2, May 2012, 111–121

Multiple-choice question tests: a convenient, flexible and effective learning tool? A case study
Mercedes Douglas*, Juliette Wilson and Sean Ennis

Department of Marketing, University of Strathclyde, Glasgow, UK

The research presented in this paper is part of a project investigating assessment practices, funded by the Scottish Funding Council. Using established principles of good assessment and feedback, the use of online formative and summative multiple choice tests (MCTs) was piloted to support independent and self-directed learning and to improve performance in an efficient manner for both students and staff. The paper reviews previous studies that have examined the relevance of MCTs and presents an evaluation of the students' grades and the results of a questionnaire designed to capture their perceptions of the effectiveness of MCTs. Our findings identify improvements in students' marks and positive responses from students, who found MCTs useful in supporting their learning of basic concepts and in building confidence and self-esteem. We also argue that MCTs work more effectively when used in conjunction with other assessment methods.
Keywords: multiple choice tests; assessment; feedback; learning; higher education; first year students

Introduction
Feedback and assessment are areas of concern within higher education (HE). They are considered to be critical drivers of student learning which deeply affect the quality of student–teacher interaction, but they are demanding in terms of staff time and resources (Gibbs & Simpson, 2004). The findings presented here are part of a wider project funded by the Scottish Funding Council (REAP, 2007) to improve assessment practices in HE using new technologies. Our aim was to use methods that develop in students the ability to monitor, manage and self-direct their learning, while also providing more immediate and effective feedback without adding to staff workload. Assessment and feedback in particular were two key areas highlighted in a recent UK-wide survey of undergraduate students (NUS, 2008), where students reported concerns with their experiences. Online multiple choice tests (MCTs) were identified as a potentially useful tool to help address these concerns.
MCT’s have been widely used to replace or supplement constructed responses
(problems, essays or oral questions). Their usage has increased notably due to the
pressures of growing students’ numbers, budget constraints and the subjectivity of
constructed forms of assessment (Becker & Johnston, 1999). MCT’s allow for a
reduction of assessment bias as the results can be measured more objectively than

*Corresponding author. Email: [email protected]


other forms of assessment. This has been made possible because the science of question design has become more efficient and respected (Bacon, 2003), as questions can be created specifically to test in-depth learning (Buckles & Siegfried, 2006). The wider use and accessibility of computer networks and computerised test banks and quizzes also offer the benefits of instantaneous feedback, quicker compilation of grades and collation of results, and tracking of students' progress throughout their course (Harter & Harter, 2004). In addition, online MCTs allow for more flexibility in delivery, as students take their tests at a time and place convenient to them (Krieg & Uyar, 2001; Nicol, 2007).
Nevertheless, there are concerns about the pedagogical limitations, validity, usefulness and reliability of this instrument (Dibattista, Mitterer, & Gosse, 2004), as MCTs may only measure and foster students' ability to memorise or recall factual information rather than test higher levels of cognitive processes (Becker & Johnston, 1999). They can also encourage guessing, and feedback is limited and predetermined during test construction: driven by the needs of teacher efficiency and rapid feedback, not by pedagogic principles (Krieg & Uyar, 2001; Nicol, 2007). The students' mindset, command of the language, experience and risk aversion may also influence the selection and interpretation of alternatives, and these characteristics may also be gender-specific.
However, if designed properly, used in conjunction with other methods and
where the educator defines the dimensions of understanding that need to be mea-
sured, the tests are as effective and reliable as essays in measuring comprehension
levels (Bacon, 2003; Buckles & Siegfried, 2006).
MCT’s should be part of an overall educational process which helps students
develop self-regulation and autonomy and should start with activities which come
before the actual teaching takes place, continues after and feeds into the redesign of
a course. Orrell (2006), Brown, Gibbs, and Glover (2003), Gibbs and Simpson
(2004) and Nicol and MacFarlane-Dick (2006) have defined a number of conditions
for an assessment and feedback process which allows students to monitor their own
learning in an intelligent manner, supporting the development of their self-esteem
and motivation and enabling them to face future learning challenges. Assessments
should be distributed across topics and weeks to avoid concentrating tasks workload
all at once (Kember & Leung, 2006). It is also critical that students engage in learn-
ing in a productive manner by being given tasks that facilitate reflection, so that
they are aware of what they are learning and also of their weaknesses. Quality feed-
back has to focus on learning, on its formative role and not simply concentrate on
marks. In this context, MCT’s and feedback should support the process of reflective
learning allowing them to close the gap between current and desired performance,
linking tasks to other assessments and encouraging further application of what was
learned (Goodman & Wood, 2004).
It is important that students cover fundamentals in a scaffolding strategy which can support other assessments and help them gain confidence (Román, Cuestas, & Fenollar, 2008), particularly with first year students, who need to be aided in their transition from secondary school to tertiary education (Yorke, 2007). This strategy allows students to acquire background knowledge and understanding of the fundamentals of a subject before moving to more complex scenarios and deep learning approaches (O'Dwyer, 2007). In this respect, the questions in MCTs need not test very high levels of understanding as defined by Bloom's taxonomy (Buckles & Siegfried, 2006; Anderson, Krathwohl, & Airasian, 2001), but they can test knowledge, comprehension, application and some degree of analysis (Figure 1).

Figure 1. Bloom's cognitive domain.
Our belief is that MCTs support remembering, which is a prerequisite for understanding, and that understanding is a prerequisite for application.

The context
The module Principles of Marketing aims to provide first year students with basic knowledge of marketing as a business/societal philosophy and a managerial function. The class is large, with over 500 students drawn from a variety of disciplines. Students receive two lecture hours per week and also meet in tutorial groups (of up to 12 students) four times per semester. Tutorials are run by graduate tutors who are also responsible for grading continuous assessment. Students who achieve an average of 60% or over in course work are exempt from the end of year exam. Online MCTs were introduced to enhance the students' learning experience, improve performance and supplement other assessments. Given the size of the class, the use of electronic MCTs allows for more efficient staff time management, as they facilitate automatic test delivery, submission and feedback, reducing the direct involvement of tutors or lecturers. It was decided to keep the online tests and free response assessments separate in order to build confidence amongst students and support the essay and report (Yorke, 2007). The MCTs were delivered via WebCT, the institutional virtual learning environment (VLE), using approximately 1500 questions provided by the publisher McGraw-Hill to support the class textbook.
Six MCT’s were delivered in two forms (practice tests 1, 2 and 3; formal tests
1, 2 and 3) containing formative and summative elements (Figure 2).
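For illustration, assembling a test from a large publisher bank might be sketched as follows. This is a hypothetical reconstruction: the paper does not describe how WebCT drew items from the McGraw-Hill bank, so the data structure, chapter tagging and function names here are assumptions.

```python
import random

# Illustrative only: each bank item is assumed to be tagged with the
# textbook chapter it covers; the real bank format is not documented here.
bank = [{"id": i, "chapter": (i % 12) + 1} for i in range(1500)]

def build_test(bank, chapters, n_questions, seed=None):
    """Sample n questions restricted to the chapters examined so far."""
    pool = [q for q in bank if q["chapter"] in chapters]
    return random.Random(seed).sample(pool, n_questions)

# Later tests cover more chapters and ask more questions,
# mirroring the progressive design described in the text.
test_1 = build_test(bank, chapters=range(1, 4), n_questions=20, seed=1)
test_3 = build_test(bank, chapters=range(1, 10), n_questions=40, seed=3)
```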
All students received a voluntary opportunity to self-test during two-week 'windows' to help them revise key topics. Online practice multiple choice tests (PMCT) could be undertaken on campus or at any other location with Internet access, e.g. at home. Students were able to take the tests as often as they liked. Feedback in the form of the score and the right answers was provided immediately on submission of the PMCT to give students an indication of their achievement. This allowed students to identify the right answers to 'close the loop' in their learning and self-correct their responses when they tried subsequent tests. When students finished the tests, the marks and feedback were generated automatically within a few seconds of submission and displayed on the screen.
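The automatic marking loop just described can be sketched in miniature as below. This is not WebCT's actual interface, which the paper does not document; the answer key, question identifiers and function are all illustrative.

```python
# Minimal sketch of the PMCT feedback loop under an assumed
# question-to-answer key; all names here are hypothetical.
answer_key = {"q1": "b", "q2": "d", "q3": "a"}

def grade_submission(responses):
    """Return the score and the right answers immediately on submission."""
    score = sum(1 for q, a in responses.items() if answer_key.get(q) == a)
    # Revealing the correct answers lets students self-correct on a
    # subsequent attempt, 'closing the loop' as described above.
    return {"score": score, "out_of": len(answer_key),
            "right_answers": answer_key}

print(grade_submission({"q1": "b", "q2": "a", "q3": "a"}))
# {'score': 2, 'out_of': 3, 'right_answers': {'q1': 'b', 'q2': 'd', 'q3': 'a'}}
```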
Figure 2. Assessment modes for principles of marketing students. [The original figure is a flowchart spanning Semesters 1 and 2: each practice test (1–3) is followed by online and tutorial feedback; Formal Tests 1 and 3 are followed by online feedback and Formal Test 2 by tutorial feedback; the report follows Formal Test 1, the essay follows Formal Test 2, and the possible exam exemption follows Formal Test 3.]

Another formative element was the use of open books and notes during the formal online multiple choice tests (FMCTs) to help students with answers and explanations and to provide a further opportunity for self-evaluation and learning. If students had any problems with a question, they could look for the right answers in their books during the tests. As time was limited, students had to be well organised and know where to quickly find the right information to use in their answers. Students submitted the FMCT on campus computers only. Marks and feedback for the FMCT were not provided until all students had completed the tests. Additional feedback opportunities were offered at tutorial meetings, where a short 10-question MCT was attempted.
The tests were progressive, increasing in difficulty both in the number of questions asked and in the amount of material covered. The core textbook questions provided by the publisher were reviewed to make sure that their level of difficulty was suitable for an introductory marketing class and that they covered basic concepts, applied concepts to problems, linked steps logically and judged the outcome or effect of particular changes to situations, in line with the first four levels of Bloom's cognitive taxonomy (Buckles & Siegfried, 2006). This was also done to improve the reliability of the tests, as test bank items may present problems in this area (Bacon, 2003).

Research method
The impact of the MCTs was evaluated by means of data gathered through a questionnaire designed to generate students' views on issues related to the use of MCTs.
Table 1. Questionnaire results.

Questionnaire item                                                     Mean   % Strongly agree/Agree (of 264)   SD

Student effort: time demands and distribution
1  Mark for the FMCT reflected amount of work                          2.18   72.7   1.128
2  Preparation and work for all assignments was spread evenly          2.63   55.3   0.986

Assignments and learning: productive learning, facilitating reflection, covering fundamentals and communicating expected standards
3  FMCT materials were comprehensive                                   2.07   79.5   0.759
4  FMCT question range reflected expected level of difficulty          2.17   75.4   0.841
5  Open books during FMCT allowed students to correct mistakes         1.82   85.6   0.782
6  Repeating PMCT helped students gain confidence in own knowledge     1.86   84.1   0.783
7  PMCT improved chances of success with FMCT                          1.67   88.3   0.757
8  Tests allowed consolidation of what was learnt in lectures          2.30   70.1   0.858

Quality of feedback: learning and formative role and supporting confidence and self-esteem
9  PMCT feedback allowed students to learn from their mistakes         2.14   70.1   0.852
10 Subsequent feedback aided understanding of mistakes                 2.47   57.2   1.017

Feedback application and links to other assessments
11 Students found the right answers for questions they got wrong
   in the FMCT                                                         2.58   54.2   1.003
12 Students applied material covered in MCT to other assessments       2.60   50.8   0.946

Technology
13 Felt confident using WebCT to sit online tests                      1.78   89.4   0.815
14 Instructions for using WebCT were clear                             1.88   84.8   0.802

MCT’s. The questions were based on the principles of assessment and feedback
adapted from Brown et al. (2003), Gibbs and Simpson (2004) and Nicol and
MacFarlane-Dick (2006) and discussed above. In addition, the impact of technology
as a key enabler for assessment practices was also assessed. Likert-type scales ques-
tions were used, ranging from a value of 1 = strongly agree to 5 = strongly disagree.
These are outlined in Table 1.
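The per-item statistics reported in Table 1 (mean, percentage agreeing and standard deviation) can be reproduced from raw responses along the following lines. The study used SPSS 16.0; this Python fragment with made-up data is only an equivalent sketch of the same computation.

```python
import numpy as np

# Hypothetical responses to a single questionnaire item, coded as in
# the study: 1 = strongly agree ... 5 = strongly disagree.
responses = np.array([1, 2, 2, 1, 3, 2, 4, 1, 2, 5])

mean = responses.mean()
sd = responses.std(ddof=1)                  # sample SD, as SPSS reports
pct_agree = 100 * np.mean(responses <= 2)   # % strongly agree/agree

print(f"Mean = {mean:.2f}, % agree = {pct_agree:.1f}, SD = {sd:.3f}")
```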
The sample for the study was drawn from the 518 students registered for the first year marketing class. Two hundred and sixty-four students responded to the questionnaire, representing just over half of the target sample (51%). The split in respondents in terms of gender was very even: 135 females and 129 males. Their ages ranged from 17 to 42, with a mean of 18.7 years.
Quantitative data analysis was undertaken using SPSS 16.0. The results are discussed below.
Results and discussion

Learning issues
The study found that students generally held positive views about MCTs, with majorities agreeing or strongly agreeing for all attitudinal variables (Table 1).

MCT’s capture time demands and distribution of student effort


According to established principles (Brown et al., 2003; Gibbs & Simpson, 2004;
Nicol & MacFarlane-Dick, 2006), assessments should capture sufficient student time
and effort and tasks should distribute student effort evenly across topics and weeks.
Attitudes towards the MCT’s in this respect were positive. Almost 3/4’s of respon-
dents claimed that the mark they achieved in the FMCT’s was a fair reflection of
the amount of work they put into preparing for the exercise and only 11% of
respondents disagreed with this statement. The students who have confidence in the
test as a good indicator of their effort and performance are capable of recognising
their own capabilities, with significant correlations found between a positive attitude
towards the work put in and test results (Table 2). Students were also asked about
whether they felt their preparation and work for all five assignments was spread
evenly across the year (Kember & Leung, 2006). The majority again agreed with
this (55%), however, a small number (1/5th) disagreed with this statement. There
was no significant relationship between this variable and test results.

MCT’s should engage students in productive learning, which facilitates reflection,


covers fundamentals of the course and communicates expected standards
The tests were broken down into smaller learning units rather than one large assess-
ment to help students acquire a better understanding of the concepts and build up
their knowledge base. Responses show strong agreement that the practice tests
helped students gain confidence in their knowledge and particularly that they
improved their chances of success with the FMCT. Correlations with the actual tests
grades confirm that there were significant relationships with these variables, mostly
for the final formal test (Table 2), possibly highlighting the value of this test at rein-
forcing and consolidating learning material and also building up confidence over
time. Furthermore, three-quarters of respondents were in agreement that the range of
Table 2. Survey responses and correlation with FMCT grades.

                                                               FMCT 1   FMCT 2   FMCT 3
PMCT improved chances of success with FMCT                     .159*    .136*    .239**
Open books used for FMCT allowed students to correct mistakes  .169**   –        .150*
Repeating PMCT helped students gain confidence                 –        –        .148*
Material covered in FMCT was comprehensive                     .183**   .175**   .135*
FMCT mark reflected amount of work put in by students          .171**   .222**   .234**

*Correlation significant at the 0.05 level (2-tailed). **Correlation significant at the 0.01 level (2-tailed).
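The starred coefficients in Table 2 are bivariate correlations between item agreement and FMCT grades, flagged at the 0.05 and 0.01 levels (two-tailed). A sketch of the equivalent computation, on simulated data since the raw dataset is not available, might look as follows.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
# Simulated stand-ins for the real data: item agreement (1-5, where
# lower = stronger agreement) and an FMCT grade for 264 students.
attitude = rng.integers(1, 6, size=264)
grade = 60.0 - 3.0 * attitude + rng.normal(0.0, 10.0, size=264)

r, p = pearsonr(attitude, grade)   # two-tailed p-value
stars = "**" if p < 0.01 else ("*" if p < 0.05 else "")
print(f"r = {r:.3f}{stars} (p = {p:.3f})")
```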
This indicates that the tests had been suitably designed around the principle that the expectations of students should be matched by the materials included in the assessment. A large proportion of students (85%) acknowledged that the use of open books helped them to correct their own mistakes. This should have aided understanding rather than mere memorisation of information. The practice tests were introduced to encourage students to read chapters from the core textbook in order to reinforce lecture material and apply concepts to other assessments. Students were not involved in constructing questions. However, there were indications that their expectations about the content (comprehensive: 79% agree; supporting lectures: 70% agree) and difficulty of the test (75% in agreement) were met, and the marks received reflected this.

Quality of feedback: focusing on learning and formative role and supporting confidence and self-esteem
The value of the tests for improving learning was reflected in the positive views regarding the confidence students gained by repeating the tests and the fairness of the tests. The results also showed that the feedback obtained when they completed the practice test allowed them to learn from their mistakes (70%). However, only just over half (57%) thought the feedback after the PMCT helped them understand why they got answers wrong. No significant relationships were found between these variables and the formal test marks. Given the number of times students did the tests and checked their answers, it can be argued that by understanding where they went wrong, and also being shown where they did well, students could close the gap between their current and desired performance for future assessments. This supports the findings of earlier studies in this area (Gibbs & Simpson, 2004; Nicol & Macfarlane-Dick, 2006). We would expect that the opportunity to learn from the feedback, and the perception that the tests were a fair reflection of their effort and knowledge (discussed in the previous section), would contribute to building their confidence and self-esteem.

Feedback links tasks to other assessments and further application of what is learnt
Just over half of the respondents claimed to have applied the material from the tests to their essays and report (51%). This may reflect the fact that these latter assignments were narrower in scope and used only a limited number of chapters of the core textbook. It may also indicate that students fail to see the inter-connectivity between MCTs and the other assessments.
In addition, only 54% of students agreed that they had found the right answers to the questions that they got wrong in the formal tests. Students did not receive grades for the formal MCTs immediately, as the tests were delivered over a period of two weeks and the correct answers were not released until the tests were over, in order to avoid possible collusion.

Technology
There were no issues with the technology and the use of WebCT. The survey indicates that students felt confident using WebCT for their tests (90%), and the same level of satisfaction was expressed with the instructions (85%) and with using WebCT for other assignments (84%).
Conclusion
Before considering the findings of the survey, the authors urge readers to exercise a degree of caution when interpreting the results due to the characteristics of the sample, as it is possible that the respondents to the survey were more motivated and more effective independent learners than the non-respondents.
The use of online MCTs in this case has served to provide some evidence of their advantages and disadvantages and of their value in supporting learning and educational efforts. The general value of the tests has been confirmed, as we have been able to cover a wide range of materials, provide instant feedback and offer more objective marking with less involvement of staff. Overall, students' performance over the three tests improved and there were positive correlations with other assessments.
The way in which the tests were used helped enhance their usefulness in supporting other types of assessments and learning, by breaking them down into separate stages and using practice tests and open books to engage students with course materials. Students performed better than when answering conventional examination questions, and this improved retention rates and motivation (O'Dwyer, 2007).
Previous research has indicated some ambivalence about the ability of MCTs to cultivate a climate of 'deep', reflective learning on the part of the participants. New technology and question design have changed this, and MCTs can be used to examine higher learning levels, but we chose not to use a more complex tool at such an early stage of the students' degree. Our choice of questions did not allow for synthesis and evaluation, but such tests play an important role in developing an understanding of the basic principles, concepts and frameworks underlying a particular discipline. It was expected that these tests, used as part of a scaffolding strategy together with other free-response forms, would support the development of students' ability to monitor, manage and self-direct their own learning (REAP, 2007). The results of our study indicate that these aims have been fulfilled.
The literature indicates that MCTs have not been used in such a flexible formative role before. This has been made possible by the use of the VLE. Our method of getting students to manage their number of attempts and control their learning process online, without adding to their workload, has served as a starting point for building up in-depth and self-regulated learning (Rodríguez & Cano, 2006).
Another key dimension of our MCTs relates to the quality and level of student effort and engagement. Responses indicate that a very significant percentage (75%) thought that the mark received in their formal MCTs was an accurate and fair reflection of the amount of effort put into preparation for this task. This correlates with the work of Becker and Johnston (1999) and Bacon (2003), who noted that MCTs are perceived to be a reliable form of assessment. As marks and feedback for practice tests are released immediately, participants find it easier to analyse their performance and, more importantly, to identify where they went wrong, learn from and correct their mistakes and gain confidence in their knowledge. These are the areas which students found most valuable, and they highlight their learning expectations.
Our findings also indicate that the area of feedback drew less positive views
from the respondents. This is of particular importance as it reflects concerns raised
in the National Union of Students survey (NUS, 2008). The issue here is that stu-
dents were not inclined to seek the right answers to the questions they got wrong
for the formal tests. This raises a number of implications for educators.
It is important to recognise that there are different levels of feedback which can be provided to students. In the case of the formal tests, the correct answers and grades were given to students two weeks after completing the tests. Our findings suggest that students might have underestimated the importance of revisiting the relevant material in the text in order to identify the reasons why they made an incorrect choice for the various questions. It is also possible that many students demonstrate a level of inertia in terms of following up their areas of poor performance, and may have lost interest as the feedback was not as immediate as for the practice tests. In many cases, we suspect that once they achieved a mark of over 60% (the mark that indicates a potential exemption from the examination) they were satisfied with this position and did not bother, or have the time, to develop their learning experience any further. This observation suggests that more research needs to be undertaken with students to explore the 'pragmatic' position that they may adopt during their time in tertiary education. They manage their workload according to their needs (in this case, a mark of 60%) and learning expectations. If they pass this threshold, then there is little perceived incentive to invest more time and effort in that particular task.
There is a concurrent danger that academics take the view that a 'one size fits all' approach can be applied with regard to feedback. In any cohort, we will find some students who expect and need higher levels of feedback and greater detail, while at the other extreme, some students require minimal feedback. In the case of online MCTs, the feedback is instantaneous but restricted to the correct answer for the particular question. By definition, the feedback is not detailed. With other forms of assessment such as essays, students tend to demand lengthier feedback, particularly on how to improve performance in the future. Much of the research in the area of feedback does not really investigate the observation that students hold different levels of expectations regarding feedback. More work in this area might provide greater guidance for academics in terms of the level and content of feedback provided on assignments.
It is critical that an acceptable balance be achieved between providing sufficient feedback to students on the one hand, and maximising the productivity and efficiency of the staff involved in delivering the course on the other. Whilst acknowledging the importance of this balance, our results indicate that, in hindsight, greater direction should possibly have been given to students to become more proactive in revisiting the relevant material and engaging in reflective learning. This may not encourage those students who take the pragmatic position discussed in the preceding paragraph, but it might encourage some to 're-think' their approach.
We recommend that online MCTs be combined with other forms of assessment, where they can be used to maximum effect. In this respect, we argue strongly that in first year, introductory programmes such as the one featured in this study, it is important to take a holistic position when designing formative and summative assessments. Our MCTs addressed the lower levels in Bloom's cognitive domain. When used in conjunction with other procedures, such as essay writing, report writing and case studies, a more comprehensive coverage of other cognitive skills (analysis, creativity and evaluation) can be achieved. In an era of ever-increasing class sizes and pressure on academics to perform in areas such as research and administration, it is tempting to rely on such an assessment tool, as other approaches such as essays and reports are far more labour-intensive. Such an over-reliance on MCTs can damage the learning experience and restrict the opportunities for students to engage with key cognitive skills. In essence, we argue for a balance to be maintained between efficiency and productivity on the one hand and a positive learning environment for the students on the other.
In summary, our study indicates that students held very positive attitudes about their experiences with online MCTs. However, we counsel against an over-dependence on MCTs which do not contain more analytical and evaluative elements. They work best in foundation classes, where the emphasis is on building up a student's understanding of the basic principles and concepts that underpin a particular subject area. The added benefits of this process were the affective and motivational aspects, which promote engagement with learning and build students' confidence and self-belief (Heikkilä & Lonka, 2006).
Online MCTs, when used in an imaginative and integrated manner with other assessment tools, such as essay and report writing, can considerably enhance the learning environment for students.

Notes on contributors
Mercedes Douglas, senior tutor (Marketing) at the University of Strathclyde, was in charge of organising and supporting tutorial teaching. During the past three years, she has worked to develop better assessment and feedback practices using the VLE.

Juliette Wilson is a lecturer (Marketing) at the University of Strathclyde. Her present research interests include supply chains and networks, small businesses and effective teaching and learning practices.

Sean Ennis is a senior lecturer (Marketing) at the University of Strathclyde and has had responsibility for developing a range of blended learning classes for MSc programmes that are delivered through a bespoke VLE. His research interests are in the areas of innovative teaching, retail marketing and sports marketing.

References
Anderson, L.W., Krathwohl, D.R., & Airasian, P.W. (Eds.). (2001). A taxonomy for learning,
teaching, and assessing – A revision of Bloom’s taxonomy of educational objectives.
New York, NY: Addison Wesley Longman.
Bacon, D.R. (2003). Assessing learning outcomes: A comparison of multiple-choice and short-answer questions in a marketing context. Journal of Marketing Education, 25(April), 31–36.
Becker, W.E., & Johnston, C. (1999). The relationship between multiple choice and essay
response questions in assessing economics understanding. Economic Record, 75(231),
348–357.
Brown, E., Gibbs, G., & Glover, C. (2003). Evaluation tools for investigating the impact of
assessment regimes on student learning. Bioscience Education e-journal, Vol. 2.
Retrieved September 5, 2008, from http://bio.ltsn.ac.uk/journal/vol2/beej-2-5.aspx
Buckles, S., & Siegfried, J.J. (2006). Using multiple-choice questions to evaluate in-depth
learning of economics. Journal of Economic Education, 37(Winter), 48–57.
Dibattista, D., Mitterer, J.O., & Gosse, L. (2004). Acceptance by undergraduates of the
immediate feedback assessment technique for multiple choice testing. Teaching in Higher
Education, 9(1), 17–28.
Gibbs, G., & Simpson, C. (2004). Conditions under which assessment supports students’
learning. Learning and Teaching in Higher Education, 1, 3–31.
Goodman, J.S., & Wood, R.E. (2004). Feedback specificity, learning opportunities, and
learning. Journal of Applied Psychology, 89(5), 809–821.
Harter, C., & Harter, J.F.R. (2004). Teaching with technology: Does access to computer technology increase student achievement? Eastern Economic Journal, 30(4), 505–514.
Heikkilä, A., & Lonka, K. (2006). Studying in higher education: Students’ approaches to
learning, self-regulation and cognitive strategies. Studies in Higher Education, 31(1),
99–117.
Kember, D., & Leung, D.Y.P. (2006). Characterising a teaching and learning environment
conducive to making demands on students while not making their workload excessive.
Studies in Higher Education, 31(2), 185–198.
Krieg, R.G., & Uyar, B. (2001). Student performance in business and economics statistics:
Does exam structure matter? Journal of Economics and Finance, 25(Summer), 229–241.
Nicol, D. (2007). E-assessment by design: Using multiple-choice tests to good effect. Journal of Further and Higher Education, 31(February), 53–64.
Nicol, D., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning:
A model and seven principles of good feedback practice. Studies in Higher Education,
31, 119–218.
NUS. (2008). The great NUS feedback amnesty briefing paper. Retrieved September 29,
2009, from http://resource.nusonline.co.uk
O’Dwyer, A. (2007, May). Assessment using multiple-choice questions on the first year of a three-year pass degree programme. From the REAP international online conference on assessment design for learner responsibility. Retrieved June 8, 2007, from http://ewds.strath.ac.uk/REAP07
Orrell, J. (2006). Feedback on learning achievement: Rhetoric and reality. Teaching in
Higher Education, 11(October), 441–456.
REAP. (2007). Reengineering assessment practices in Scottish higher education. Retrieved June 8, 2007, from http://www.reap.ac.uk
Rodríguez, L., & Cano, F. (2006). The epistemological beliefs, learning approaches and
study orchestrations of university students. Studies in Higher Education, 31(5), 617–636.
Román, S., Cuestas, P.J., & Fenollar, P. (2008). An examination of the interrelationship
between self-esteem, others’ expectations, family support, learning approaches and
academic achievement. Studies in Higher Education, 33(2), 127–138.
Yorke, M. (2007, May). Assessment, especially in the first year of higher education: Old principles in new wrapping? From the REAP international online conference on assessment design for learner responsibility. Retrieved June 8, 2007, from http://ewds.strath.ac.uk/REAP07
