Assessment - Writing Good Multiple Choice Test Questions

The document provides comprehensive guidelines for constructing effective multiple choice test items, emphasizing clarity, relevance, and the avoidance of unnecessary complexity. Key points include ensuring stems are meaningful and independent, using plausible distractors, and focusing on higher-order thinking. The guidelines aim to enhance the reliability and validity of assessments while minimizing potential confusion for examinees.

GUIDELINES FOR CONSTRUCTING MULTIPLE CHOICE ITEMS

Reference: Lorimar Review Book

1. Avoid unnecessarily unfamiliar terminology. The difficulty of an item should come from the subject matter rather than from the wording.

2. In composing multiple choice item stems, use terms whose definitions are likely to be precise in the minds of examinees.

Example: After firing, an avalanche control cannon recoiled at 20 feet per second. What would be the recoil
speed of another cannon, which is twice as heavy and fires the same projectile?
Comment: a student completely unfamiliar with the term “recoil” may face unintended extra difficulty (a worked solution to this example appears after this list).

3. Avoid complex or awkward word arrangements. Also, avoid the use of negatives in the stem, as this may add unnecessary comprehension difficulties.

Example: (Poor)
Which of the following groups of citizens of the Philippines are not allowed to cast their votes in elections?

4. Avoid double negatives. Do not use negative qualifiers such as “least” with a negative stem, since the combination becomes a double negative.
Example:
(Poor) Which of the following is least likely to be a speculative purchase?
(Better) Which of the following is likely to be the most speculative purchase?

5. Do not build upon other questions. Keep questions independent of one another.

6. Do not supply answers to other questions; avoid providing cues from one question to another.

7. Avoid negative stems; word the stem positively and avoid negative phrasing such as “not” or “except.” If a negative cannot be avoided, the negative word should always be emphasized, for example by underlining, bolding, CAPITALIZATION, italics, or color: Which of the following is NOT an example…
8. Do not introduce unfamiliar vocabulary and concepts in the test unless there is a relevant stated purpose in
the test directions.
9. Use the number of alternatives appropriate to each test item, generally three to five (there is no need to use a consistent number throughout the test).

10. Sequence alternatives in logical or numerical order. Where no such order exists, randomly assign the position of the correct answer in the sequence (see the randomization sketch after this list).

11.Pay attention to grammatical consistency of all alternatives. Avoid giving clues through the use of faulty
grammatical construction.

12. Vary the position of the correct answer across the sequence of alternatives; it must not always fall on the same letter, whether A, B, C, or D.

13. Include common misconceptions as distracters.

14. Include plausible content or viable cues in each distracter. Re-use key words from the correct alternative to
make distracters more viable.

15. Avoid “all of the above”.

16. Do not use “none of the above” in a best-answer item.


17. Avoid giving unintended cues, such as making the correct answer longer than the distractors.

18. Test for important or significant information.

19. Focus on a single problem or idea for each exam question.

20. Keep the vocabulary consistent with the students' level of understanding.

21. Avoid questions based on opinion.

22. Use multiple-choice to measure higher-level thinking.

23. Be sensitive to cultural and gender issues.

24. State the stem in either question form (When did World War II begin?) or completion form (World War II began in _____).

25. Avoid window dressing (excessive verbiage) in the stem.

26. Include the central idea and most of the phrasing in the stem.

27. Avoid giving clues such as grammatically linking the stem to the answer (e.g., a stem ending in “… is an example of an:” tells test-wise students that the correct answer must begin with a vowel).

28. Place options in logical or numerical order.

29. Use letters in front of options rather than numbers; numerical answers in numbered questions may be
confusing to the students.

30. Keep options independent; options should not overlap.

31. Keep all options homogenous in content.

32. Keep the length of options fairly consistent (preferably short).

33. Avoid specific determinants such as never and always.

34. Make sure that there is only one correct option.

35. Use plausible distracters.

36. Incorporate common errors of students in distracters.

37. Avoid technically phrased distracters.

38. Use familiar yet incorrect phrases as distracters.

39. Avoid the use of humor when developing options.
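
Worked solution to the recoil example in item 2, as a quick check on the intended answer. It assumes conservation of momentum and that “twice as heavy” refers to the cannon's mass; neither assumption is spelled out in the original item. The cannon's recoil momentum balances the projectile's momentum:

\[ m_{\text{cannon}}\, v_{\text{recoil}} = m_{\text{projectile}}\, v_{\text{projectile}} \]

The projectile and its muzzle speed are unchanged, so the right-hand side is fixed; doubling the cannon's mass therefore halves the recoil speed:

\[ v'_{\text{recoil}} = \frac{m_{\text{cannon}}}{2\, m_{\text{cannon}}} \times 20\ \text{ft/s} = 10\ \text{ft/s} \]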
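
A minimal Python sketch of the randomization advice in items 10 and 12, applicable only when the alternatives have no inherent logical or numerical order. The function name and sample options are hypothetical, invented for illustration; the point is that the correct answer's letter is assigned by a random shuffle rather than by habit.

import random

def assemble_item(correct, distractors):
    """Place the correct answer at a random position among the
    distractors so that its letter carries no positional cue."""
    options = distractors + [correct]
    random.shuffle(options)
    key = "ABCDE"[options.index(correct)]
    return options, key

# Usage: the answer-key letter varies from run to run.
options, key = assemble_item(
    correct="conservation of momentum",
    distractors=["conservation of energy",
                 "Newton's law of gravitation",
                 "Bernoulli's principle"],
)
for letter, text in zip("ABCD", options):
    print(f"{letter}. {text}")
print("Answer:", key)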


Writing Good Multiple Choice Test Questions
by Cynthia J. Brame, CFT Assistant Director
Cite this guide: Brame, C. (2013). Writing good multiple choice test questions. Retrieved [todaysdate]
from https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/.

Constructing an Effective Stem
Constructing Effective Alternatives
Additional Guidelines for Multiple Choice Questions
Considerations for Writing Multiple Choice Items that Test Higher-order Thinking
Additional Resources

Multiple choice test questions, also known as items, can be an effective and efficient way to assess learning
outcomes. Multiple choice test items have several potential advantages:
Versatility: Multiple choice test items can be written to assess various levels of learning outcomes, from basic recall to application, analysis, and evaluation. Because students are choosing
from a set of potential answers, however, there are obvious limits on what can be tested with multiple choice
items. For example, they are not an effective way to test students’ ability to organize thoughts or articulate
explanations or creative ideas.
Reliability: Reliability is defined as the degree to which a test consistently measures a learning outcome. Multiple
choice test items are less susceptible to guessing than true/false questions, making them a more reliable means
of assessment. The reliability is enhanced when the number of MC items focused on a single learning objective
is increased. In addition, the objective scoring associated with multiple choice test items frees them from
problems with scorer inconsistency that can plague scoring of essay questions.
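
The claim that adding items per learning objective raises reliability can be made concrete with the Spearman-Brown prophecy formula, a standard psychometric result not cited in the text itself. If a test with reliability \( r \) is lengthened by a factor \( k \) using comparable items, the predicted reliability is

\[ r_k = \frac{k\, r}{1 + (k - 1)\, r} \]

For example, doubling (\( k = 2 \)) a test whose reliability is \( r = 0.60 \) predicts \( r_2 = 1.20 / 1.60 = 0.75 \).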
Validity: Validity is the degree to which a test measures the learning outcomes it purports to measure. Because
students can typically answer a multiple choice item much more quickly than an essay question, tests based on
multiple choice items can typically focus on a relatively broad representation of course material, thus increasing
the validity of the assessment.
The key to taking advantage of these strengths, however, is construction of good multiple choice items.
A multiple choice item consists of a problem, known as the stem, and a list of suggested solutions, known as
alternatives. The alternatives consist of one correct or best alternative, which is the answer, and incorrect or
inferior alternatives, known as distractors.
Constructing an Effective Stem
1. The stem should be meaningful by itself and should present a definite problem. A stem that presents
a definite problem allows a focus on the learning outcome. A stem that does not present a clear
problem, however, may test students' ability to draw inferences from vague descriptions rather than serving
as a more direct test of students' achievement of the learning outcome.

2. The stem should not contain irrelevant material, which can decrease the reliability and the validity of the
test scores (Haladyna and Downing 1989).

3. The stem should be negatively stated only when significant learning outcomes require it.
Students often have difficulty understanding items with negative phrasing (Rodriguez 1997). If a
significant learning outcome requires negative phrasing, such as identification of dangerous laboratory
or clinical practices, the negative element should be emphasized with italics or capitalization.

4. The stem should be a question or a partial sentence. A question stem is preferable because it
allows the student to focus on answering the question rather than holding the partial sentence in
working memory and sequentially completing it with each alternative (Statman 1988). The cognitive
load is increased when the stem is constructed with an initial or interior blank, so this construction
should be avoided.

Constructing Effective Alternatives


1. All alternatives should be plausible. The function of the incorrect alternatives is to serve as
distractors, which should be selected by students who did not achieve the learning outcome but ignored by
students who did achieve the learning outcome. Alternatives that are implausible don’t serve as functional
distractors and thus should not be used. Common student errors provide the best source of distractors.
2. Alternatives should be stated clearly and concisely. Items that are excessively wordy assess students’
reading ability rather than their attainment of the learning objective.
3. Alternatives should be mutually exclusive. Alternatives with overlapping content may be considered
“trick” items by test-takers, excessive use of which can erode trust and respect for the testing process.

4. Alternatives should be homogenous in content. Alternatives that are heterogeneous in content can
provide cues to students about the correct answer.

5. Alternatives should be free from clues about which response is correct. Sophisticated test-takers are
alert to inadvertent clues to the correct answer, such as differences in grammar, length, formatting, and
language choice in the alternatives. It's therefore important that alternatives:
- have grammar consistent with the stem.
- are parallel in form.
- are similar in length.
- use similar language (e.g., all unlike textbook language or all like textbook language).
6. The alternatives “all of the above” and “none of the above” should not be used. When “all of the
above” is used as an answer, test-takers who can identify more than one alternative as correct can select the
correct answer even if unsure about the other alternative(s). When “none of the above” is used as an alternative,
test-takers who can eliminate a single option can thereby eliminate a second option. In either case, students
can use partial knowledge to arrive at a correct answer (see the arithmetic sketch at the end of this list).

7. The alternatives should be presented in a logical order (e.g., alphabetical or numerical) to avoid a bias
toward certain positions.

8. The number of alternatives can vary among items as long as all alternatives are plausible. Plausible
alternatives serve as functional distractors, which are those chosen by students who have not achieved the
objective but ignored by students who have achieved the objective. There is little difference in difficulty,
discrimination, and test score reliability among items containing two, three, and four distractors.
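
A small arithmetic sketch of the partial-knowledge effect described in item 6 above, with illustrative numbers not drawn from the original. Each option a test-taker can eliminate raises the probability of a correct blind guess from \( 1/n \) to

\[ P(\text{correct guess}) = \frac{1}{n - e}, \]

where \( n \) is the number of alternatives and \( e \) the number eliminated. With \( n = 4 \) alternatives including “all of the above,” a student who can rule out a single substantive option thereby rules out “all of the above” as well (\( e = 2 \)), raising the chance of a correct guess from 1/4 to 1/2.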
Additional Guidelines
1. Avoid complex multiple choice items, in which some or all of the alternatives consist of different
combinations of options. As with “all of the above” answers, a sophisticated test-taker can use partial
knowledge to achieve a correct answer.

2. Keep the specific content of items independent of one another. Savvy test-takers can use information in
one question to answer another question, reducing the validity of the test.

Considerations for Writing Multiple Choice Items that Test Higher-order Thinking
When writing multiple choice items to test higher-order thinking, design questions that focus on higher levels of
cognition as defined by Bloom’s taxonomy. A stem that presents a problem that requires application of course
principles, analysis of a problem, or evaluation of alternatives is focused on higher-order thinking and thus
tests students’ ability to do such thinking. In constructing multiple choice items to test higher order thinking, it
can also be helpful to design problems that require multilogical thinking, where multilogical thinking is defined
as “thinking that requires knowledge of more than one fact to logically and systematically apply concepts to a
…problem” (Morrison and Free, 2001, page 20). Finally, designing alternatives that require a high level of
discrimination can also contribute to multiple choice items that test higher-order thinking.
