Creating Questionnaires
6. Ensure you can protect privacy. Make your plan to protect
respondents’ privacy before you begin writing your survey. This is a
very important part of many research projects.
Consider an anonymous questionnaire. You may not
want to ask for names on your questionnaire. This is one step you can
take to protect privacy; however, it is often possible to figure out a
respondent’s identity from other demographic information (such as
age, physical features, or zip code).
Consider de-identifying your respondents. Give each
questionnaire (and thus, each respondent) a unique number or word,
and only refer to them using that new identifier. Shred any personal
information that can be used to determine identity. (A minimal sketch
of one way to do this follows at the end of this step.)
Remember that it does not take much demographic
information to identify someone. People may be wary of providing this
information, so you may get more respondents by asking fewer
demographic questions (if that is possible for your questionnaire).
Make sure you destroy all identifying information after
your study is complete.
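If you track responses digitally, the de-identification step above can be automated. The sketch below is a minimal, hypothetical illustration only: it assumes the raw responses sit in a CSV file with "name" and "email" columns (none of these file or column names come from the text above), replaces those identifiers with a random study ID, and writes the ID-to-identity key to a separate file that can be destroyed once the study is complete.

```python
# Hypothetical sketch of de-identifying questionnaire records; the file names
# and the identifying columns ("name", "email") are illustrative assumptions.
import csv
import secrets

def deidentify(in_path="responses_raw.csv", out_path="responses_deid.csv",
               key_path="id_key.csv", identifying_fields=("name", "email")):
    """Replace direct identifiers with a random study ID and store the
    ID-to-identity key separately so it can be destroyed after the study."""
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return

    key_rows, deid_rows = [], []
    for row in rows:
        study_id = secrets.token_hex(4)  # e.g. 'a3f9c210'
        key_rows.append({"study_id": study_id,
                         **{k: row.get(k, "") for k in identifying_fields}})
        deid_rows.append({"study_id": study_id,
                          **{k: v for k, v in row.items()
                             if k not in identifying_fields}})

    for path, out_rows in ((key_path, key_rows), (out_path, deid_rows)):
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(out_rows[0].keys()))
            writer.writeheader()
            writer.writerows(out_rows)
```

Keeping the key file physically and logically separate from the de-identified data is the design point; the rest is bookkeeping.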
Part 2
Writing Your Questionnaire
1. Introduce yourself. Your introduction should explain who you are and
what your credentials are. You should clarify whether you are working alone
or as part of a team. Include the name of the academic institution or
company for which you are collecting data. Here are some examples:
o My name is Jack Smith and I am one of the creators of this
questionnaire. I am part of the Department of Psychology at the
University of Michigan, where I focus on developing cognition in
infants.
o I’m Kelly Smith, a third-year undergraduate student at the
University of New Mexico. This questionnaire is part of my final exam
in statistics.
o My name is Steve Johnson, and I’m a marketing analyst for
The Best Company. I’ve been working on questionnaire development to
determine attitudes surrounding drug use in Canada for several years.
2. Explain the purpose of the questionnaire.[13] Many people will not
answer a questionnaire without understanding its goal. No long
explanation is needed; a few concise sentences will do the trick. Here
are some examples:
o I am collecting data regarding the attitudes surrounding
gun control. This information is being collected for my Anthropology
101 class at the University of Maryland.
o This questionnaire will ask you 15 questions about your
eating and exercise habits. We are attempting to identify correlations
between healthy eating, frequency of exercise, and incidence of
cancer in mature adults.
o This questionnaire will ask you about your recent
experiences with international air travel. There will be three sections
of questions that will ask you to recount your recent trips and your
feelings surrounding these trips, as well as your travel plans for the
future. We are looking to understand how a person’s feelings
surrounding air travel impact their future plans.
3. Reveal what will happen with the data you collect. Are you collecting
these data for a class project, or for a publication? Are these data to
be used for market research? Depending on what you intend to do with
the data you collect from your questionnaire, there may be different
requirements that you need to pay attention to before distributing your
survey.
o Be aware that if you are collecting information for a
university or for publication, you may need to check in with your
institution’s Institutional Review Board (IRB) for permission before
beginning. Most research universities have a dedicated IRB staff, and
their information can usually be found on the school’s website.
o Remember that transparency is best. It is important to be
honest about what will happen with the data you collect.
o Include an informed consent form if necessary. Note that you
cannot guarantee confidentiality, but you will make all reasonable
attempts to protect their information.[14]
4. Estimate how long the questionnaire will take. Before someone sits
down to take your questionnaire, it may be helpful for them to know
whether it will take 10 minutes or 2 hours. Providing this information
at the outset of your questionnaire makes it more likely that you will
receive completed questionnaires in the end.
o Time yourself taking the survey. Then keep in mind that some
people will take longer than you did, and some will take less time.
o Provide a time range instead of a specific time. For
example, it’s better to say that a survey will take between 15 and 30
minutes than to say it will take 15 minutes and have some
respondents quit halfway through.
o Use this as a reason to keep your survey concise! You will
feel much better asking people to take a 20-minute survey than you
will asking them to take a 3-hour one.
5. Describe any incentives that may be involved. An incentive is
anything you can offer as a reward at the end of the questionnaire.
Incentives can take many forms: money, desirable prizes, gift
certificates, candy, and so on. There are both pros and cons to
offering incentives.
o Incentives can attract the wrong kind of respondent. You
don’t want to incorporate responses from people who rush through
your questionnaire just to get the reward at the end. This is a danger
of offering an incentive.[15]
o Incentives can encourage people to respond to your survey
who might not have responded without a reward. This is a situation in
which incentives can help you reach your target number of
respondents.[16]
o Consider the strategy used by SurveyMonkey. Instead of
directly paying respondents to take their surveys, they donate 50 cents
to a charity of the respondent’s choice when a respondent fills out a
survey. They feel that this lessens the chances that a respondent will
fill out a questionnaire out of pure self-interest.[17]
o Consider entering each respondent into a drawing for a
prize if they complete the questionnaire. You can offer a $25 gift card
to a restaurant, a new iPod, or a movie ticket. This makes it less
tempting to respond to your questionnaire for the incentive alone, but
still offers the chance of a pleasant reward (a small sketch of a
random drawing follows below).
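If you go the prize-drawing route, the pick itself is easy to make fairly. The snippet below is a hypothetical sketch; the respondent IDs are invented for illustration and are not from the text above.

```python
# Hypothetical sketch: pick one prize winner at random from the respondents
# who completed the questionnaire. The IDs are invented for illustration.
import secrets

completed = ["R012", "R015", "R019", "R023", "R031"]  # completed-response IDs
winner = secrets.choice(completed)  # uniformly random selection
print(f"Prize drawing winner: {winner}")
```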
6. Make sure your questionnaire looks professional. Because you want
people to have confidence in you as a data collector, your
questionnaire must have a professional look.
o Always proofread. Check for spelling, grammar, and
punctuation errors.
o Include a title. This is a good way for your respondents to
understand the focus of the survey as quickly as possible.
o Thank your respondents. Thank them for taking the time
and effort to complete your survey.
Part 3
Distributing Your Questionnaire
1. Do a pilot study. Ask some people you know to take your
questionnaire (they will not be included in any results stemming from
this questionnaire), and be prepared to revise it if necessary.[18] Plan
to include 5-10 people in the pilot testing of your questionnaire.[19] Get
their feedback on your questionnaire by asking the following
questions:
o Was the questionnaire easy to understand? Were there any
questions that confused you?
o Was the questionnaire easy to access? (Especially
important if your questionnaire is online).
o Do you feel the questionnaire was worth your time?
o Were you comfortable answering the questions asked?
o Are there any improvements you would make to the
questionnaire?
2. Disseminate your questionnaire. You need to determine the
best way to disseminate your questionnaire.[20] There are several
common ways to distribute questionnaires:
o Use an online site, such as SurveyMonkey.com. This site
allows you to write your own questionnaire with its survey builder,
and provides additional options such as buying a target
audience and using its analytics to analyze your data.[21]
o Consider using the mail. If you mail your survey, always
make sure you include a self-addressed stamped envelope so that the
respondent can easily mail their responses back. Make sure that your
questionnaire will fit inside a standard business envelope.
o Conduct face-to-face interviews. This can be a good way to
ensure that you are reaching your target demographic and can reduce
missing information in your questionnaires, as it is more difficult for a
respondent to avoid answering a question when you ask it directly.
o Try using the telephone. While this can be a more time-
effective way to collect your data, it can be difficult to get people to
respond to telephone questionnaires.
3. Include a deadline. Ask your respondents to have the questionnaire
completed and returned to you by a certain date to ensure that you
have enough time to analyze the results.
o Make your deadline reasonable. Giving respondents up to 2
weeks to answer should be more than sufficient. Anything longer and
you risk your respondents forgetting about your questionnaire.
o Consider providing a reminder. A week before the deadline
is a good time to send a gentle reminder about returning the
questionnaire. Include a replacement copy of the questionnaire in case
it has been misplaced by your respondent.[22]
Steps
1. Write a list of objectives. This should outline the kind of data you want to collect, as
it will serve as the basis for choosing your questions. For example, you may want to
know how customers feel about your new product, in which case your objectives
may be to gauge customer responses to the product's price range, ease of use,
durability and concept.[1]
2. Determine the type of questions your survey will use. There are 2 basic question
formats:[2]
o Fixed-response/structured. Fixed-response questions provide research
questionnaire respondents with a specified set of possible answers, and require
respondents to choose from those options. Use fixed-response questions when you
have a clear-cut way to define and categorize data that the answers provide and
when you are not looking for unique or original information from those you are
surveying. Examples of structured question formats include multiple choice, ranking,
yes/no and rating scale.
o Open-ended/non-structured. Non-structured questions are good for
collecting fresh, individual ideas from respondents. However, it can be harder to
systematically analyze, organize and categorize data collected from open-ended
questions. Any question that does not limit the scope of the answer is open-ended.
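To make the distinction concrete, here is a small, purely illustrative sketch of how the two formats might be represented in a survey script: a fixed-response question carries a list of options, an open-ended one does not. The class name and question wording are assumptions for the example, not taken from the text above.

```python
# Illustrative sketch of the two basic question formats; wording is invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Question:
    text: str
    options: Optional[list] = None  # list of choices => fixed-response; None => open-ended

    @property
    def is_fixed_response(self) -> bool:
        return self.options is not None

questions = [
    Question("How easy was the product to use?",
             options=["Very easy", "Somewhat easy", "Somewhat difficult", "Very difficult"]),
    Question("What would you change about the product?"),  # open-ended
]

for q in questions:
    kind = "fixed-response" if q.is_fixed_response else "open-ended"
    print(f"[{kind}] {q.text}")
```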
3. Decide on a method of administration. You may opt to use face-to-face interviews,
telephone interviews, web-based surveys or written questionnaires.[3]
4. Word questions effectively. Keep the following in mind:[4]
o Comprehension. Use simple language and keep sentences as brief and
concise as possible. Be sure to cover the what, where, who, when and why so that
respondents are not lacking for pertinent information regarding their input.
o Answerability. Respondents should be able to provide accurate answers
without having to do research. For example, people may not be able to tell you
exactly how much money they spend eating out in an entire year, but they should be
able to estimate what they spend eating out over the course of a week.
o Double-barreled questions. Make sure your survey questions ask only 1
thing at a time. For example, rather than ask respondents if they prefer to exercise
in the evening or after eating, break the question into 2 separate questions, 1
specific to exercising in the evening and the other specific to exercising after
eating.
o Make sure response options don't overlap. For example, multiple choice
questions should never include the same number in 2 different range options (e.g.
"between 10 and 30" and "between 20 and 40", which both contain the values 20
through 30).
5. Organize the survey questionnaire in a logical way.[5]
o If your research questionnaire covers a range of topics, then group the
questions into subjects.
o Downplay sensitive questions by grouping them with neutral questions,
and by placing them toward the end of the survey, after a rapport has been
established.
6. Design the questionnaire so that it is easy to read and navigate through.[6]
o Use a large, clear font.
o Space questions so that it is easy to see where 1 stops and the next
begins. Be sure to provide ample answer space, if you are using open-ended
questions.
o Answer space should be right below the question. Avoid asking a
question on 1 page, then requiring the respondent to flip the page in order to provide
an answer.
o Use page numbers.
7. Provide respondents with the information they will need in order to complete the
survey effectively. To achieve the best results, questionnaire design should amply
prepare respondents in the following ways:[7]
o Explain the purpose of the survey questionnaire. When people
understand the reasons behind a line of questioning, they are more likely to
volunteer accurate personal information.
o Give clear instructions for completing the questions. Explain the
question format (e.g. multiple choice, rating scale, etc.) and provide an example of
how to appropriately answer the question. Additional instructions may be to read the
questions all the way through before answering and/or to guess on an unknown
question, rather than leave it blank.
o Tell respondents how many questions the research questionnaire
includes, and give an estimated time frame for completion.
8. Revise the questionnaire as necessary. Each time you administer the survey,
analyze the results so that you can make changes that will improve the survey's
efficacy.[8]
o If you find that certain questions are consistently being skipped, then
those questions may need to be reworded for clarity.
o If respondents are unable to provide full answers due to space
constraints, you can change the layout.
o If simple yes/no answers aren't providing you with the range of data you
desire, then you may want to change to a multiple choice format.
Since my Fellowship with Edelman Berland began, research for me has become a “must
do” rather than a “could do” when creating programs for clients. Berland, our research
arm, is running an internal campaign across Edelman that aims to educate our firm’s
employees and exchange ideas with them on the ways research can benefit their
communications initiatives. Here’s how we’re sharing the basics of writing a simple
questionnaire that can become a valuable quantitative research tool.
1. Work as a Partner. Align your research theme with your client’s overall business
objectives so outcomes will complement the communications strategy. Together
you can begin to craft the concept, timing and desired findings.
2. Keep it Simple. Write short, simple, specific questions using as few words as
possible. To capture the respondent’s actual beliefs, it’s best to write a clear
statement that can be responded to without too much deliberation. The more
instinctual reaction you receive, the better.
3. Choose the Best Delivery Method. Today's surveys can be delivered over the
computer, in person, on the phone or by mail. Postal surveys can be cheap but
responses can be slow. Face-to-face can be expensive but will generate the fullest
responses. Web surveys can be cost-effective but inconsistent with response
rates. Telephone can be expensive, but will often generate high response rates
and will allow for follow-up questions to enhance findings. So, a choice must be
made.
4. Ask the Same Question Twice but in Different Ways. To make sure you are
capturing a person’s true opinion on a given topic, it’s smart to ask the same
question a couple of times. This helps you counter the respondent bias that
inevitably presents itself in every survey and gives you a better chance at
finding the person's true opinion. (A rough sketch of checking agreement
between paired questions follows this list.)
5. Be Selective From the Start. Although you may feel a person is the right one to
take the survey, it’s best to ask a series of screening questions to make sure.
Position those at the beginning so you're not wasting anyone's time. Examples
would be demographic benchmarks such as salary, education and geography.
6. Pilot the Questionnaire. By testing the survey with a small population, you’ll
determine whether it’s set to do what you need it to do. This soft launch enables you
to determine whether some questions may need to be rephrased, reordered or
removed.
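For point 4 above, agreement between the two wordings of a question can be checked mechanically once the data are in. The sketch below is a rough illustration only: the question IDs, the 1-5 scale, and the assumption that the second wording is reverse-scored are all invented for the example, not taken from the text.

```python
# Rough, hypothetical sketch of checking agreement between two wordings of
# the same question (here Q5 and its reverse-worded twin Q17, on a 1-5 scale).
def inconsistent(responses, q_a="Q5", q_b="Q17", scale_max=5, tolerance=1):
    """Flag respondents whose answers to the paired items disagree.

    q_b is assumed to be reverse-worded, so it is reverse-scored before
    comparison; a gap larger than `tolerance` points is flagged.
    """
    flagged = []
    for rid, answers in responses.items():
        a = answers[q_a]
        b_rescored = scale_max + 1 - answers[q_b]
        if abs(a - b_rescored) > tolerance:
            flagged.append(rid)
    return flagged

responses = {"R01": {"Q5": 5, "Q17": 1},   # consistent: 5 vs reverse-scored 5
             "R02": {"Q5": 5, "Q17": 5}}   # inconsistent: 5 vs reverse-scored 1
print(inconsistent(responses))             # ['R02']
```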
8. Chapter Objectives
9. This chapter is intended to help the reader to:
10. Understand the attributes of a well-designed questionnaire, and
Adopt a framework for developing questionnaires.
16. Formal standardised questionnaires: If the researcher is looking to test and quantify
hypotheses and the data is to be analysed statistically, a formal standardised
questionnaire is designed. Such questionnaires are generally characterised by:
17. prescribed wording and order of questions, to ensure that each respondent
receives the same stimuli
18. prescribed definitions or explanations for each question, to ensure interviewers handle
questions consistently and can answer respondents' requests for clarification if they
occur
19. prescribed response format, to enable rapid completion of the questionnaire during the
interviewing process.
20. Given the same task and the same hypotheses, six different people will probably come
up with six different questionnaires that differ widely in their choice of questions, line of
questioning, use of open-ended questions and length. There are no hard-and-fast rules
about how to design a questionnaire, but there are a number of points that can be borne
in mind:
21. 1. A well-designed questionnaire should meet the research objectives. This may
seem obvious, but many research surveys omit important aspects due to
inadequate preparatory work, and do not adequately probe particular issues due
to poor understanding. To a certain degree some of this is inevitable. Every
survey is bound to leave some questions unanswered and provide a need for
further research but the objective of good questionnaire design is to 'minimise'
these problems.
22. 2. It should obtain the most complete and accurate information possible. The
questionnaire designer needs to ensure that respondents fully understand the questions
and are not likely to refuse to answer, lie to the interviewer or try to conceal their
attitudes. A good questionnaire is organised and worded to encourage respondents to
provide accurate, unbiased and complete information.
23. 3. A well-designed questionnaire should make it easy for respondents to give the
necessary information and for the interviewer to record the answer, and it should be
arranged so that sound analysis and interpretation are possible.
24. 4. It should keep the interview brief and to the point and be so arranged that the
respondent(s) remain interested throughout the interview.
25. Each of these points will be further discussed throughout the following sections. Figure
4.1 shows how questionnaire design fits into the overall process of research design that
was described in chapter 1 of this textbook. It emphasises that writing of the
questionnaire proper should not begin before an exploratory research phase has been
completed.
27. Even after the exploratory phase, two key steps remain to be completed before the task
of designing the questionnaire should commence. The first of these is to articulate the
questions that research is intended to address. The second step is to determine the
hypotheses around which the questionnaire is to be designed.
28. It is possible for the piloting exercise to be used to make necessary adjustments to
administrative aspects of the study. This would include, for example, an assessment of
the length of time an interview actually takes, in comparison to the planned length of the
interview; or, in the same way, the time needed to complete questionnaires. Moreover,
checks can be made on the appropriateness of the timing of the study in relation to
contemporary events such as avoiding farm visits during busy harvesting periods.
33. It should be noted that one does not start by writing questions. The first step is to decide
'what are the things one needs to know from the respondent in order to meet the
survey's objectives?' These, as has been indicated in the opening chapter of this
textbook, should appear in the research brief and the research proposal.
34. One may already have an idea about the kind of information to be collected, but
additional help can be obtained from secondary data, previous rapid rural appraisals and
exploratory research. In respect of secondary data, the researcher should be aware of
what work has been done on the same or similar problems in the past, what factors have
not yet been examined, and how the present survey questionnaire can build on what has
already been discovered. Further, a small number of preliminary informal interviews with
target respondents will give a glimpse of reality that may help clarify ideas about what
information is required.
36. At the outset, the researcher must define the population about which he/she wishes to
generalise from the sample data to be collected. For example, in marketing research,
researchers often have to decide whether they should cover only existing users of the
generic product type or whether to also include non-users. Secondly, researchers have
to draw up a sampling frame. Thirdly, in designing the questionnaire we must take into
account factors such as the age, education, etc. of the target respondents.
40. Within this region the first two mentioned are used much more extensively than the
second pair. However, each has its advantages and disadvantages. A general rule is
that the more sensitive or personal the information, the more personal the form of data
collection should be.
43. There are only two occasions when seemingly "redundant" questions might be included:
44. Opening questions that are easy to answer and which are not perceived as
being "threatening", and/or are perceived as being interesting, can greatly assist
in gaining the respondent's involvement in the survey and help to establish a
rapport.
45. This approach should not be overused, however. It is almost always
the case that questions which are of use in testing hypotheses can also serve these same
functions.
46. "Dummy" questions can disguise the purpose of the survey and/or the
sponsorship of a study. For example, if a manufacturer wanted to find out
whether its distributors were giving the consumers or end-users of its products a
reasonable level of service, the researcher would want to disguise the fact that
the distributors' service level was being investigated. If he/she did not, then
rumours would abound that there was something wrong with the distributor.
52. It permits the respondent to specify the answer categories most suitable for their
purposes.
55. They 'suggest' answers that respondents may not have considered before.
56. With open-ended questions the respondent is asked to give a reply to a question in
his/her own words. No answers are suggested.
60. They often reveal the issues which are most important to the respondent, and this may
reveal findings which were not originally anticipated when the survey was initiated.
61. Respondents can 'qualify' their answers or emphasise the strength of their opinions.
62. However, open-ended questions also have inherent problems which mean they must
be treated with considerable caution. For example:
63. Respondents may find it difficult to 'articulate' their responses, i.e. to properly
and fully explain their attitudes or motivations.
64. Respondents may not give a full answer simply because they may forget to mention
important points. Some respondents need prompting or reminding of the types of answer
they could give.
65. Data collected is in the form of verbatim comments - it has to be coded and reduced to
manageable categories. This can be time-consuming to analyse, and there are
numerous opportunities for interviewers to make errors in recording and interpreting
the answers given. (A simple illustration of this coding step appears a few paragraphs below.)
66. Respondents will tend to answer open questions in different 'dimensions'. For example,
the question: "When did you purchase your tractor?", could elicit one of several
responses, viz:
67. "A short while ago".
"Last year".
"When I sold my last tractor".
"When I bought the farm".
68. Such responses need to be probed further if the researcher is not to be confronted with
responses that cannot be aggregated or compared.
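As a simple illustration of the coding step mentioned in point 65, the sketch below maps verbatim answers to a handful of categories by keyword. The categories and keywords are invented for the example; in practice the coding frame is built from the answers themselves and applied much more carefully.

```python
# Illustrative sketch (assumed keywords, not from the text) of coding verbatim
# open-ended answers into manageable categories.
CODES = {
    "price":   ["price", "cost", "expensive", "cheap"],
    "quality": ["quality", "reliable", "broke", "durable"],
    "service": ["service", "delivery", "staff", "support"],
}

def code_answer(text):
    """Return the set of category codes whose keywords appear in the answer."""
    text = text.lower()
    return {code for code, keywords in CODES.items()
            if any(word in text for word in keywords)} or {"other"}

print(code_answer("The delivery was slow and it felt expensive"))
# -> {'price', 'service'} (set order may vary)
```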
69. It has been suggested that open response-option questions largely eliminate the
disadvantages of both the aforementioned types of question. An open response-option
is a form of question which is both open-ended and includes specific response-options
as well. For example,
75. The one disadvantage of this form of question is that it requires the researcher to have a
good prior knowledge of the subject in order to generate realistic/likely response options
before printing the questionnaire. However, if this understanding is achieved the data
collection and analysis process can be significantly eased.
76. Clearly there are going to be situations in which a questionnaire will need to incorporate
all three forms of question, because some forms are more appropriate for seeking
particular forms of response. In instances where it is felt the respondent needs
assistance to articulate answers or provide answers on a preferred dimension
determined by the researcher, then closed questions should be used. Open-ended
questions should be used where there are likely to be a very large number of possible
different responses (e.g. farm size), where one is seeking a response described in the
respondent's own words, and when one is unsure about the possible answer options.
The mixed type of question would be advantageous in most instances where most
potential response-options are known; where unprompted and prompted responses are
valuable, and where the survey needs to allow for unanticipated responses.
77. There is a series of questions that should be posed as researchers develop the
survey questions themselves:
78. "Is this question sufficient to generate the required information?"
79. For example, asking the question "Which product do you prefer?" in a taste panel
exercise will reveal nothing about the attribute(s) the product was judged upon. Nor will
this question reveal the degree of preference. In such cases a series of questions would
be more appropriate.
80. "Can the respondent answer the question correctly?"
82. Having never been exposed to the answer, e.g. "How much does your husband earn?"
83. Forgetting, e.g. What price did you pay when you last bought maize meal?"
84. An inability to articulate the answer: e.g. "What improvements would you want to see in
food preparation equipment?"
85. "Are there any external events that might bias response to the question?"
86. For example, judging the popularity of beef products shortly after a foot-and-mouth
epidemic is likely to affect the responses.
87. "Do the words have the same meaning to all respondents?"
88. For example, "How many members are there in your family?"
89. There is room for ambiguity in such a question since it is open to interpretation as to
whether one is speaking of the immediate or extended family.
90. "Are any of the words or phrases loaded or leading in any way?"
91. For example, "What did you dislike about the product you have just tried?"
92. The respondent is not given the opportunity to indicate that there was nothing he/she
disliked about the product. A less biased approach would have been to ask a preliminary
question along the lines of, "Did you dislike any aspect of the product you have just
tried?", and allow him/her to answer yes or no.
93. "Are there any implied alternatives within the question?"
94. The presence or absence of an explicitly stated alternative can have dramatic effects on
responses. For example, consider the following two forms of a question asked of a
'Pasta-in-a-Jar' concept test:
95. 1. " Would you buy pasta-in-a-jar if it were locally available?"
2. "If pasta-in-a-jar and the cellophane pack you currently use were both
available locally, would you:
96. Buy only the cellophane packed pasta?
Buy only the pasta-in-a-jar product?
Buy both products?"
97. The explicit alternatives provide a context for interpreting the true reactions to the new
product idea. If the first version of the question is used, the researcher is almost certain
to obtain a larger number of positive responses than if the second form is applied.
98. "Will the question be understood by the type of individual to be interviewed?"
101. The careless design of questions can result in the inclusion of two items in one
question. For example: "Do you like the speed and reliability of your tractor?"
102. The respondent is given the opportunity to answer only 'yes' or 'no', whereas he
might like the speed, but not the reliability, or vice versa. Thus it is difficult for the
respondent to answer and equally difficult for the researcher to interpret the response.
103. The use of ambiguous words should also be avoided. For example: "Do you
regularly service your tractor?"
104. The respondents' understanding and interpretation of the term 'regularly' will
differ. Some may consider that regularly means once a week, others may think once a
year is regular. The inclusion of such words again presents interpretation difficulties for
the researcher.
105. "Are any words or phrases vague?"
106. Questions such as 'What is your income?' are vague and one is likely to get
many different responses with different dimensions. Respondents may interpret the
question in different terms, for example:
107. hourly pay?
weekly pay?
yearly pay?
income before tax?
income after tax?
income in kind as well as cash?
income for self or family?
all income or just farm income?
108. The researcher needs to specify the 'term' within which the respondent is to
answer.
109. "Are any questions too personal or of a potentially embarrassing nature?"
110. The researcher must be clearly aware of the various customs, morals and
traditions in the community being studied. In many communities there can be a great
reluctance to discuss certain questions with interviewers/strangers. Although the degree
to which certain topics are taboo varies from area to area, such subjects as level of
education, income and religious issues may be embarrassing and respondents may
refuse to answer.
111. "Do questions rely on feats of memory?"
112. The respondent should be asked only for such data as he is likely to be able to
clearly remember. One has to bear in mind that not everyone has a good memory, so
questions such as 'Four years ago was there a shortage of labour?' should be avoided.
119. In developing the questionnaire the researcher should pay particular attention to
the presentation and layout of the interview form itself. The interviewer's task needs to
be made as straightforward as possible.
120. Questions should be clearly worded and response options clearly
identified.
121. Prescribed definitions and explanations should be provided. This ensures that
the questions are handled consistently by all interviewers and that during the interview
process the interviewer can answer/clarify respondents' queries.
122. Ample writing space should be allowed to record open-ended answers, and to
cater for differences in handwriting between interviewers.
Use of booklets: The use of booklets, in place of loose or stapled sheets of paper, makes it easier for
the interviewer or respondent to progress through the document. Moreover, fewer pages
tend to get lost.
Simple, clear formats: The clarity of questionnaire presentation can also help to improve the ease with which
interviewers or respondents are able to complete a questionnaire.
Creative use of space and typeface: In the anxiety to reduce the number of pages of a questionnaire there is a tendency to
put too much information on a page. This is counter-productive since it gives the
questionnaire the appearance of being complicated. Questionnaires that make use of
blank space appear easier to use, enjoy higher response rates and contain fewer errors
when completed.
Use of colour coding: Colour coding can help in the administration of questionnaires. It is often the case that
several types of respondents are included within a single survey (e.g. wholesalers and
retailers). Printing the questionnaires on two different colours of paper can make the
handling easier.
Interviewer instructions: Interviewer instructions should be placed alongside the questions to which they pertain.
Instructions on where the interviewers should probe for more information or how replies
should be recorded are placed after the question.
130. whether the questions have been placed in the best order
134. Usually a small number of respondents are selected for the pre-test. The
respondents selected for the pilot survey should be broadly representative of the type of
respondent to be interviewed in the main survey.
135. If the questionnaire has been subjected to a thorough pilot test, the questions and
questionnaire will have evolved into their final form. All that remains to
be done is the mechanical process of laying out and setting up the questionnaire in its
final form. This will involve grouping and sequencing questions into an appropriate order,
numbering questions, and inserting interviewer instructions.
138. A good questionnaire is one which helps directly achieve the research objectives;
provides complete and accurate information; is easy for both interviewers and
respondents to complete; is designed to make sound analysis and interpretation
possible; and is brief.
139. There are at least nine distinct steps: decide on the information required; define
the target respondents; select the method(s) of reaching the respondents; determine
question content; word the questions; sequence the questions; check questionnaire
length; pre-test the questionnaire; and develop the final questionnaire.
140. Key Terms
141. Group focus interviews
Mailed questionnaire
Open-ended and open response-option questions
Personal interviews
Piloting questionnaires
Target respondents
Telephone interviews
145. 3. The textbook says that one does not start by writing questions. How should the
researcher begin?
146. 4. What are the two occasions when apparently "redundant" questions should be
found in a questionnaire?
148. 6. What are the three reasons why a respondent is unable to answer a question?
149. 7. What is the recommended duration of interviews carried out in rural situations?
Susan Farrell, September 25, 2016
It’s possible to mix the two kinds of surveys, and it’s especially useful to do
small, primarily qualitative surveys first to help you generate good answers to
count later in a bigger survey. This one-two-punch strategy is much preferable
to going straight to a closed-ended question with response categories you and
your colleagues thought up in your conference room. (Yes, you could add an
“other” option, but don’t count on valid statistics for options left to a catch-all
bucket.)
2. Don’t make your own tool for surveys if you can avoid it. Many solid
survey platforms exist, and they can save you lots of time and money.
3. Decide up front what the survey learning goals are. What do you
want to report about? What kind of graphs and tables will you want to
deliver?
4. Write neutral questions that don’t imply particular answers or give
away your expectations.
5. Open vs. closed answers: Asking open-ended questions is the best
approach, but it’s easy to get into the weeds in data analysis when
every answer is a paragraph or two of prose. Plus, users quickly tire of
answering many open-ended questions, which usually require a lot of
typing and explanation. That being said, it’s best to ask open-ended
questions during survey testing. The variability of the answers to
these questions during the testing phase can help you decide whether
the question should be open-ended in the final survey or could be
replaced with a closed-ended question that would be easier to answer
and analyze.
6. Carefully consider how you will analyze and act on the data. The
type of questions you ask will have everything to do with the kind of
analysis you can make: multiple answers, single answers, open or
closed sets, optional and required questions, ratings, rankings, and free-
form answer fields are some of the choices open to you when deciding
what kinds of answers to accept. (If you won’t act on the data, don’t ask
that question. See guideline #12.)
7. Multiple vs. single answers: Often multiple-answer questions are
better than single-answer ones because people usually want to be
accurate, and often several answers apply to them. Survey testing on
paper can help you find multiple-answer questions, because people will
mark several answers even when you ask them to mark only one (and
they will complain about it). If you are counting answers, consider not
only how many responses each answer got, but also how many choices
people made.
8. Front-load the most important questions, because people will quit
partway through. Ensure that partial responses will be recorded
anyway.
9. Provide responses such as “Not applicable” and “Don’t use” to
prevent people from skipping questions or giving fake answers. People get
angry when asked questions they can’t answer honestly, and it skews
your data if they answer anyway.
10. People have trouble understanding required and optional
signals on survey questions and forms. It’s common practice to use a red
asterisk (*) to mark required fields, but that didn’t work well enough,
even in a survey of UX professionals, many of whom likely design
such forms. People complained that required fields were not marked.
Pages that stated at the top that all were required or optional also didn’t
help, because many people ignore instruction text. Use “(Optional)”
and/or “(Required)” after each question, to be sure people understand.
11. When marking is not clear enough, many people feel
obligated to answer optional questions. Practically speaking, that
means you don’t have to require every question, but you should be
careful not to include so many questions that people quit the survey in
the middle.
12. Keep it short. Every extra question reduces your response rate,
decreases validity, and makes all your results suspect. Better to
administer 2 short surveys to 2 different subsamples of your audience
than to lump everything you want to know into a long survey that won’t
be completed by the average customer. 20 questions are too many
unless you have a highly motivated set of participants. People are much
more likely to participate in 1-question surveys. Be sensitive to what
your pilot testers tell you, and realistically estimate the time to complete
the survey. The more open-ended questions and complex ranking you
ask people to do, the more you’ll lose respondents.
13. People often overlook examples and instructions that are on
the right, after questions. Move instructions and examples to the left
margin instead (or the opposite side, for languages that read right to
left), to put them in the scannability zone and place them closer to the
person’s focus of attention, which is on the answer area.
14. Use one-line directions if you can. Less is more. Just as in our
original writing-for-the-web studies, people read more text when there is
a lot less of it. People complain about not getting enough information,
but when it’s there they don’t read it because it’s too long.
15. People tend not to read paragraphs or introductions. If you must
use a paragraph, bold important ideas to help ensure that most
people, who scan instead of reading, glean that information.
16. Think carefully about using subjective terms, such as
“essential,” “useful,” or “frequent.” Terms that cause people to make a
judgment call may get at how they feel, but such questions can be
confusing to evaluate logically. Ratings scales are more flexible. If you
do need to know how participants perceive a certain aspect, indicate
that’s what you want them to base their answer on (for example, instead
of asking “Is X essential for Y?” say “Do you feel that X is essential for
Y?”).
17. Define terms as needed in order to start from a shared meaning.
People might quibble about the definition, but it’s better than getting
misleading answers because of a misinterpretation.
18. Don’t ask about things that your analytics can tell you. Ask why
and how questions.
19. Include a survey professional in your test group. Your survey
method may be criticized after the fact, so get expert advice before you
conduct your survey.
20. Answer ordering and first words matter, especially in long lists.
Logical groupings, randomized lists, and short lists work better than
long, alphabetical lists. Ordering issues can skew your data, so test
alternative list orderings when you test your survey. When selecting
from a list, many people choose the first thing that sounds like it might
be right and go to the next question.
o Items at the top and bottom of lists may attract more attention
than items in the middle of long lists.
o Because people scan instead of read, the first words of items in
lists can cause them to overlook the right choice, especially in
alphabetical lists.
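One way to act on guideline 20 in an online survey is to randomize the answer order per respondent while keeping it reproducible, so position effects can still be analysed afterwards. The sketch below is a hypothetical illustration; the option list and respondent ID are invented, not taken from the text.

```python
# Hypothetical sketch of per-respondent answer-order randomization to reduce
# ordering bias in long lists. The options and IDs are invented examples.
import random

OPTIONS = ["Email", "Phone", "Live chat", "In person", "Postal mail"]

def presented_order(respondent_id, options=OPTIONS):
    """Shuffle options deterministically per respondent so the presented
    order can be reconstructed later when analysing position effects."""
    rng = random.Random(respondent_id)  # seed with the respondent ID
    shuffled = options[:]               # copy so the master list is untouched
    rng.shuffle(shuffled)
    return shuffled

print(presented_order("R042"))
```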
Conclusion
Qualitative surveys are tools for gathering rich feedback. They can also help
you discover which questions you need to ask and the best way to ask them,
for a later quantitative survey. Improve surveys through iterative testing with
open-ended feedback. Test surveys on paper first to save time-consuming
rework in the testing platform. Then test online to see the effects of page order
and question randomization and to gauge how useful the automated results
data may be.
It’s well established that all forms of research come with their own theories
and implementation methods, and qualitative research is no different.
Qualitative research is conducted to understand the thought processes of both
respondents and researchers. It is usually conducted in a natural
setting where respondents can be their true selves and respond
transparently. Results from this research are not generalized as
representative of the entire population, but the questions asked and their
vocabulary reveal the motive of the research, which makes it easier for
respondents to participate in qualitative market research.