Program Assessment Tool Kit:
A Guide to Conducting Interviews and Surveys
This work was supported by a grant from the University of Wisconsin-Madison Assessment Council to the
College of Engineering. Additional resources were provided by the Office of the Dean of the College of
Engineering and a Hilldale Foundation Grant from the UW-Madison Chancellor to the LEAD Center.
As UW-Madison moves forward with strategic planning and outcomes assessment activities
across campus, we are fortunate to have this Tool Kit from the LEAD Center. The Tool Kit
provides guidance in the use of interviews and surveys, two of the many important assessment
methods that are in wide use across campus. We encourage you to make use of this Tool Kit as
your program explores the use of interviews and surveys as a part of its assessment activities.
Please note that the authors of the Tool Kit invite you to provide feedback so they can improve it
in future editions.
For general information on other student outcomes assessment techniques, you are invited to
visit the UW-Madison Outcomes Assessment web site located at:
http://www.wisc.edu/provost/assess.html
Eden Inoway-Ronnie
Chair, University Assessment Council
Section 1
INTRODUCTION
This tool kit was developed for University of Wisconsin-Madison faculty and staff who are
involved in program assessment activities within their departments or majors. It is a complement
to a document prepared by the University Assessment Council¹, which gives a broad overview of
assessment directions and accreditation requirements as well as an introduction to a variety of
assessment techniques. The Tool Kit has the goal of being a useful and user-friendly resource
framed around pertinent questions that might be asked by faculty or staff involved in program
assessment. It is an evolving document developed through the LEAD Center’s experience with
majors, departments, schools, and colleges at UW-Madison. In particular, the document
emphasizes the use of interviews and surveys because these are the primary tools selected by the
academic units with whom we have worked and these are the tools for which the LEAD Center
offers some expertise. We offer special thanks to the departments with whom we've worked for
providing us with the opportunity to learn more about program assessment.
Because the issues and assessment needs that each program faces are unique, we provide
checklists and open-ended worksheets primarily as ways to get started. We do not believe that
“one tool fits all.” We expect that you will select and modify items to fit your own needs. The
value of this tool kit will be measured by its usefulness to you and your colleagues involved in
program assessment.
¹ “Using Assessment for Program Improvement: A Resource Manual for Academic Departments,”
January 1997, by the University Assessment Council and the Office of the Provost, University of
Wisconsin-Madison. The reader is referred to their website at
http://publications/provost/assess/manual.
How to use this “Tool Kit”
This tool kit was developed to guide you through a program evaluation by presenting key aspects
of the entire process. It is important to note that many sections are interrelated and need to be
understood in the context of the whole. As such, we do not advise taking individual sections out
of context but recommend that you review the Tool Kit from start to finish and then select those
sections or items that fulfill your program needs. Throughout this document we use the terms
assessment and evaluation interchangeably, although we recognize they are sometimes used as
separate and distinct terms.
The remainder of this kit is organized into nine sections. With the exception of Section 10 on
resources, each section provides a brief description of the topic, an overview to the section, and
worksheets or checklists that engage you in the process of implementing a program evaluation.
The worksheets provided in the kit are designed so that they can be copied as needed. They
appear on colored paper. In some of the sections we also include a “sample.” We hope that over
time this kit will serve as a central location in which a program can store all of its evaluation-
related materials, including items such as program goals, evaluation records, and results, thus
keeping assessment information readily available for accreditation-related site visits and for future
evaluators.
The Tool Kit is organized by steps in the assessment process. These are:
• Section 2 suggests some "Questions for Assessment Planners" to ask as they begin the process
of program assessment.
• Section 3, “Setting Objectives,” includes a brief discussion of the value of setting educational
goals for a department or program and offers tips on defining objectives so as to make them
measurable as well as acceptable to a diverse group of faculty members.
• Section 4, “Selecting Assessment Tools,” discusses the pros and cons of various types of
assessment tools, from surveys and interviews to standardized exams and portfolios.
• Sections 5 and 6 discuss “Surveys” and “Interviews” in more detail, and cover issues such as
wording questions, gathering data, and analyzing responses. These sections also include
worksheets for tracking surveys and interviews as they are being conducted.
• Section 7 presents some of the “Ethical Dimensions” of conducting program assessment.
• Sections 8 and 9 describe various methods for “Reporting Results” and “Acting on Results”;
both emphasize making program assessment a practical, valued, and above all, useful
undertaking.
• Section 10, on resources, recognizes that no tool kit can provide all of the information
needed for conducting a program assessment. It therefore lists a number of valuable resources
where you can find additional information on various stages of the assessment process,
including web sites, articles, books, and resources at UW-Madison.
We understand the process of evaluation to be cyclical, and as such it can become part of a plan
for continuous improvement. The diagram below illustrates components of this process and
identifies the stages in the cycle where this tool kit provides resources.
Developing and Implementing a Departmental Assessment Plan
For Programmatic Improvement
STEP 1: Define educational / programmatic goals & objectives
STEP 2: Determine instruments & methods needed for assessing student achievement
STEP 3: Determine how results will be disseminated & used for program improvement
STEP 4: Develop a timetable for assessment plan implementation
STEP 5: Submit assessment objectives, methods, & timetable to school / college academic Planning Councils
STEP 6: Implement assessment plans & revise as needed
We value your feedback on this tool kit
Please let us know how this kit works for you! You can send us your feedback via e-mail or
campus mail to the address listed below. Specifically we are interested in what aspects of the kit
you find most useful, what seems unimportant, and what you would like to see included in future
editions. Since we see this as an evolving document that can be improved and expanded over
time, we appreciate any comments you can share that will help us improve this tool kit for you
and others involved in program assessment.
Section 2
QUESTIONS FOR ASSESSMENT PLANNERS
There are two steps that you should take before you begin an assessment project:
• determine the parameters of the project (who’s doing it, why, for whom, and what resources
are available)
• take stock of what information the department, college, and university already have available
The following pages present questions that the department members responsible for assessment
should consider. Keeping a written record of the answers to these questions will ease reporting to
administrators and accreditors, and will also facilitate handing off the assessment project to your
successors.
WORKSHEET FOR PLANNING YOUR ASSESSMENT
1. Why are you conducting this assessment? Who or what initiated the assessment?
❑ Dean  ❑ Department Committee  ❑ Whole Department
❑ Department Chair  ❑ Accreditation Criteria  ❑ Other ____________
2. Who will conduct the assessment? Internal or external evaluators? Whole department or a
committee? A combination of the above?
❑ External Evaluators  ❑ Whole Department  ❑ Department Committee
❑ Department Chair  ❑ Other ____________________
5. How much money and what resources (staff time, photocopying costs, etc.) are available?
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
6. Given that a major goal of the assessment is to determine whether the department's goals for
students earning degrees in the major are being met, what do you hope or expect to learn from
the assessment?
❑ To assess students’/alumni’s content knowledge.
❑ To assess students’/alumni’s general preparedness.
❑ To assess students’/alumni’s perceptions of program quality.
❑ To assess employers’ perceptions of program quality.
❑ Other
___________________________________________________________________
10. At present, what kinds of information would be most beneficial? ____________
________________________________________________________________________
________________________________________________________________________
11. What types of information are you already gathering (through tests, exams, student records,
certifications, portfolios, existing questionnaires, standardized tests, etc.)? ____________
________________________________________________________________________
________________________________________________________________________
12. Will any of the information you currently gather be helpful to answer your present questions
and concerns?
_________________________________________________________________________
_________________________________________________________________________
13. Is there information that the university gathers that might be helpful in answering your
present questions and concerns? (E.g., through ISIS, student record data.)
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________
14. Do you want information on the following categories (many of which may be out of the
control of your department)?
❑ Courses offered by other departments?  ❑ Specific courses and/or professors?
❑ Services offered at the college or university level such as advising or career services?
❑ Other
___________________________________________________________________
16. For students and graduates in your field as a whole, do you have:
❑ approximate number of graduates in the U.S. per year?
❑ typical post-graduation plans of U.S. graduates?
❑ success rates of U.S. graduates?
❑ average starting salaries of U.S. graduates?
❑ typical career paths of U.S. graduates?
Of the data that is NOT readily available, which will be needed for the assessment? Where
can it be found? Who will find it? (See bibliography for a list of information resources.)
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
18. What can and will you do with the results of the assessment?
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
__________________________________________________________________________
Given your answers above, become familiar with the types of evaluation tools (discussed in
Section 4) in order to choose a tool that best suits your present needs.
Section 3
SETTING OBJECTIVES
Setting objectives for the program or major being assessed is a crucial, but often neglected,
starting point. If a department has well-written and well-thought-out objectives, or goals,
measuring them becomes relatively easy. In essence, program objectives tell you, your fellow
faculty members, and your students what your program values and what it expects students will
achieve by the time they graduate. For departments that don't have agreed-upon written
objectives already in place, this section provides some questions designed to stimulate ideas about
what your objectives might be, a checklist of what to do with those objectives, and some sample
program objectives from UW units. For departments that do have written objectives in place, this
section provides a stimulus to revisit and refine them. A bibliography in Section 10 refers to
sources of additional information on the usefulness of objectives and on how to go about
preparing them.
• Start brainstorming the department’s objectives before focusing on the accrediting agency’s
objectives – it’s important that the objectives be goals that your faculty can support.
• It may be useful to include the full faculty in brainstorming ideas for the objectives list, but
especially in large departments, it’s inefficient to have the full faculty involved in
“wordsmithing” the list – refer this duty to a small committee and return to the full faculty for
final approval.
• If a committee has already prioritized the objectives and refined the language, explain that to
the department, and don’t ask the full faculty for “comments” – ask them whether they can
live with the list. If they can’t, ask them to suggest alternatives.
WORKSHEET FOR SETTING PROGRAM OBJECTIVES
❑ With fellow faculty, brainstorm a list of goals for graduates of your program.
• What makes your program distinct or different from its peers/competition?
• What makes your graduates marketable, valuable?
• How would you define/describe your graduates’ abilities?
• What will your graduates know or be able to do once they graduate that they couldn’t do
when they began your program?
• What would you tell a prospective student that your program can/will give him/her?
❑ Compare the list with the objectives suggested or required by your accrediting
agency(ies) and with those of your college or school.
• Are any of the college’s or accreditors' objectives missing from your list? If so, can you
either add them or justify why you cannot?
• Are any of your objectives in conflict with those of the accrediting agency or your school?
If so, can you find a way to bring them in line with each other?
❑ Distribute the final list of objectives to both faculty and students on a regular basis.
• Are faculty encouraged, expected, or required to link their course objectives with these
department objectives?
• Are students encouraged, expected, or required to know what the program objectives are?
SAMPLE PROGRAM OBJECTIVES
Department of Industrial Engineering
SAMPLE PROGRAM OBJECTIVES
Research Capability
Graduates should have the ability to formulate relevant biological questions, generate
hypotheses, devise experiments, and interpret results.
Literacy
Graduates should be capable of communicating in clear scientific prose and of reading and
critically evaluating scientific literature.
Numeracy
Graduates should be capable of using quantitative methods of analysis and modeling.
Computer Literacy
Graduates should be capable of using computers in such areas as data processing, database
searches, and word processing.
Documentation
Graduates should be capable of establishing and maintaining a laboratory and/or field
notebook.
Section 4
SELECTING ASSESSMENT TOOLS
There are a variety of ways to collect information to answer specific questions about your
program or major. We refer to these different data-gathering techniques as “assessment tools.”
The tools you select should be consistent with the type of information you seek. In some cases,
several tools may be needed to address a breadth of issues. Using a variety of tools is often
beneficial because the information gathered by some tools can be compared with information from
other tools (this allows one to confirm findings from one source with another source), or can
deepen your understanding about issues raised in one method. Some researchers purposely use a
variety of different tools to see if they reveal similar findings and to supplement and deepen
understanding about the information – this is one form of “triangulation.” One example of a
mixed-methods design is described below:
1. Use a focus group with a subset of the sample to identify issues; then develop a
questionnaire based on the issues identified.
2. Distribute questionnaire to the entire sample.
3. After analysis of responses to questionnaire, conduct personal interviews with a subgroup
of the sample to explore issues in greater depth.
A UW-Madison Assessment Council report² presents types of assessment divided into
“Direct” and “Indirect” Indicators of Learning. We highly recommend you review this report,
which provides short summaries of each of these types of indicators.
• Direct Indicators of Learning are assessments used directly with students for the primary
purpose of assessing student learning, achievement, and performance, such as:
1. Capstone Course Evaluation
2. Course-Embedded Assessment
3. Tests and Examinations
4. Portfolio Evaluation
5. Pre-test/Post-test Evaluation
6. Thesis Evaluation
7. Videotape and Audiotape Evaluation of Performance
• Indirect Indicators of Learning are tools that do not assess student achievement per se, but
which provide important information about the program. Indirect indicators commonly
involve self-reports through surveys or interviews of students, alumni, or employers on their
experiences and perspectives about a program. These indicators include:
8. External Reviewers
9. Student Surveying and Exit Interviewing
10. Alumni Surveying
11. Employer Surveying
12. Curriculum and Syllabus Analysis
² “Using Assessment for Program Improvement: A Resource Manual for Academic Departments,”
(January 1997), The Assessment Council and the Office of the Provost, UW-Madison.
This tool kit focuses on the use of surveys and interviews as these have been the primary indirect
indicators used by departments at UW-Madison with whom we have worked. Some of the
advantages and disadvantages of a few selected types of tools are presented in the following table.
Your decision about the tools you select depends in part on your answers to the questions in
Section 2: the purpose of, and the resources available for, conducting the assessment.
WORKSHEET 1 FOR INDICATORS CURRENTLY IN USE
Notes:
Notes:
WORKSHEET 2 FOR INDICATORS CURRENTLY IN USE
Notes:
Notes:
Section 5
SURVEYS
Surveys are an excellent way to gather information that describes, compares, or explains
knowledge, attitudes, and behaviors. This section includes a worksheet on survey creation and a
worksheet on survey analysis.
We have found that THE SURVEY KIT by series editor Arlene Fink is an excellent resource for
conducting surveys. We highly recommend that you acquire and study this resource; therefore,
we do not attempt to supply much detail regarding surveys here. THE SURVEY KIT
contains 10 volumes, each focusing on a specific aspect of conducting a survey. This kit includes
volumes on asking survey questions, conducting self-administered and mail surveys, interviews by
phone and in person, designing surveys, how to select a sample for surveys, how to measure
survey reliability and validity, how to analyze survey data, and how to report on surveys. We
think it provides some of the more sophisticated information about using surveys that can guide
your survey development. THE SURVEY KIT is available through SAGE publications at the
address below:
Tips on surveys
• Allow enough time – plan for time needed to conduct first mailing, receive the first set of
responses, conduct follow-up mailing, receive follow-up responses, input data, conduct
analysis, and create report.
• Before using a survey, pilot it on a small sample of a similar audience. Then modify unclear or
poorly worded questions or delete those questions that do not seem to elicit worthwhile
information.
• Plan to do two mailings. The follow-up mailing can add up to 1/3 more responses.
• Keep the survey as short as possible.
• Sending surveys on colored paper can increase your return rate.
• Make sure the return date and sender address are bold, big, and clear on the survey.
• Use a short cover letter on the survey from the department chair. Personalize the letter so
that the students, alumni or employers recognize they have valuable information to contribute.
• Use incentives to increase the return rate – response rates can be improved if you give a pizza
coupon when the survey is returned or enclose a card that becomes part of a drawing to win a
prize. When planning these techniques, plan them in a way that preserves the anonymity of the
respondent. Anticipate an average return rate of about 20% unless you do something
exceptional as an incentive.
• Identify one person to keep records and do follow-up mailings.
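The return-rate figures in the tips above (a baseline of about 20%, with a follow-up mailing adding up to 1/3 more responses) can be combined into a rough planning estimate. The sketch below is only an illustration of that arithmetic; the function and parameter names are our own, not part of the Tool Kit.

```python
# Rough estimate of completed surveys after two mailings, using the
# rule-of-thumb figures from the tips above: ~20% baseline return rate,
# plus up to 1/3 more responses from the follow-up mailing.
def expected_responses(sample_size, base_rate=0.20, followup_boost=1/3):
    first_wave = sample_size * base_rate        # responses to first mailing
    second_wave = first_wave * followup_boost   # added by follow-up mailing
    return round(first_wave + second_wave)

# e.g., mailing to 300 alumni: 60 first-wave + up to 20 more -> about 80
print(expected_responses(300))  # prints 80
```

An estimate like this helps you judge, before any surveys are printed, whether the sample is large enough to yield a useful number of responses.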
SAMPLE QUESTIONS TO CONSIDER FOR USE IN SURVEYS OR INTERVIEWS
Do not attempt to use all of the questions below on a single assessment. These questions are
intended to provide help in brainstorming questions. This list is by no means complete, nor is
the wording of each question set in stone. To use this form, select which categories of questions
your assessment will include, then decide which questions or types of questions will best elicit the
type of information you need. Modify and tailor questions to fit your program needs.
QUESTIONS ON “DEMOGRAPHICS:”
❑ Name
❑ What is your specific area of interest within the overall field?
❑ Are you completing a minor, double major, or specialization option?
❑ What types of companies are you interviewing with and why did you choose those types?
(e.g., service, manufacturing, consulting, design, other)
❑ Have you been working while in school? How many hours per week? Summer/school year?
In what field?
❑ How many hours per week do you typically spend on coursework outside of class?
❑ How many semesters has it taken you to graduate? How many credits have you taken?
❑ Has working during school affected the quality of your classwork or the time it’s taken you to
graduate? For better or for worse? What recommendations do you have for other students
who are considering working while in school?
❑ What are the strengths of the program?
❑ What aspects of the program could be improved?
QUESTIONS ON NON-TECHNICAL SKILLS:
❑ Do you think you’ve gained the skills necessary to be an effective team member on the job?
What are those skills? Which parts of the program helped you gain those skills?
❑ Do you think you’ve gained the skills necessary to communicate effectively (orally and in
writing) on the job? Which parts of the program helped you gain those skills?
❑ What are the most important qualities or skills that a person working in this field should have?
Why? (For example: working independently, creative thinking, problem solving, time
management, communication, working in a team, intellectual curiosity, confidence in field,
ethical responsibility.)
❑ Which skills would you like to see the program encourage or improve on?
❑ What skills would you like to develop that you haven’t yet?
❑ How, if at all, would you change the way the department helps students develop their
non-technical skills?
❑ Do you plan to remain in the state (or country)? Why or why not?
❑ Since graduation, have you taken continuing education or industrial short courses? What
subjects have you studied? Why?
❑ Please write a short description of the type of work you do in your present position.
❑ Do you supervise the work of other engineers?
❑ (Especially for supervisors:) In your view, what deficiencies in preparation do people
beginning work in this field have?
❑ If you have attended or completed graduate school or are currently in graduate school, please
rate how well your undergraduate education prepared you for graduate study.
❑ How well did your training prepare you to compete within your field or current area of
employment?
❑ How does your educational preparation compare with that of peers in your field from other
schools?
❑ Rate how well your education prepared you in the following areas and also rate how useful
these areas have been in your career. [A chart listing the program courses would follow here.]
❑ If you rated any of the above topics as “very prepared” or “poorly prepared,” please
comment below on why.
❑ If you rated any of the above topics as “very useful” or “not useful,” please comment below
on why.
❑ How could the department have better prepared you for your job or for graduate school?
WRAP-UP QUESTIONS:
❑ Any other comments on the program or “messages” you’d like the faculty to hear?
❑ What suggestions do you have for addressing the limitations you’ve described?
❑ From your answers above, are there particular issues you’d like to emphasize or be sure we
include in our report?
❑ Is there anything we haven’t covered in this interview/survey that you’d like to mention?
CHECKLIST FOR CONDUCTING SURVEYS
❑ Is your survey easy to read? (on lightly colored paper, in large print, with a legible font and
good photocopy quality)
❑ Does the survey include information at the beginning and end about when and where the
survey should be returned?
❑ Do you have a system for recording the names and both old and new addresses for surveys
that were returned to sender?
❑ If you are doing a second mailing, has it been at least two weeks since the first was sent?
❑ Have you changed the return date on the second mailing of the survey?
❑ Have you used another color of paper for the second mailing of the survey?
WORKSHEET FOR CREATING A SURVEY
1. What information do you want to tell the participant about returning the survey? What can
you provide the participant to aid this process?
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
3. What information do you want to collect about participants' academic career or work
experience? (Examples: Years attended, locations, degrees earned, numbers of jobs, job
descriptions)
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
4. What are the broad topics you are interested in learning about? (Examples: Integration of
curriculum, learning preferences, preparation for work)
A.___________________________ B._________________________
C.___________________________ D._________________________
E.___________________________ F._________________________
G.___________________________ H._________________________
5. What are the particular dimensions of the topics you are interested in? (Examples: Parts of
the curriculum that seem extraneous? Particular projects or classes that help students learn
best? Ways alumni feel they were not adequately prepared for the world of work?)
Topic A Topic B
___________________________ _________________________
___________________________ _________________________
___________________________ _________________________
___________________________ _________________________
Topic C Topic D
___________________________ _________________________
___________________________ _________________________
___________________________ _________________________
Topic E Topic F
__________________________ _________________________
___________________________ _________________________
___________________________ _________________________
___________________________ _________________________
Topic G Topic H
__________________________ _________________________
___________________________ _________________________
___________________________ _________________________
___________________________ _________________________
7. Are there relationships between topics you would like to learn about? (Example: Is there a
relationship between advising and time-to-degree?)
___________________________ + _________________________
___________________________ + _________________________
___________________________ + _________________________
8. Are there topics you would like to know the value of in participants' views? (Example:
Strengths and weaknesses of internships? Value of co-op?) What specific dimensions of the
topic are you interested in?
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
WORKSHEET FOR SURVEY ANALYSIS
1. What is the demographic profile of your sample? For each category, list the sub-categories
below the heading. Add or change categories to fit your needs.
Ethnicity
Gender
GPA
Year-in-School
Other
2. If your survey has open-ended questions, can the responses be grouped into like categories? If
so, what are they?
A.___________________________________________________________________
B.___________________________________________________________________
C.___________________________________________________________________
D.___________________________________________________________________
E.___________________________________________________________________
3. For each of the categories listed above, what are the varying comments/perspectives/issues?
A.____________________________________________________________________________
B.____________________________________________________________________________
C.____________________________________________________________________________
D.____________________________________________________________________________
E.____________________________________________________________________________
4. Are there responses that relate to each other? (Do not look at grouping of questions on survey,
but at responses)
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
5. Are there dimensions to the program that respondents feel are strengths? Why?
______________________________________________________________________________
______________________________________________________________________________
Weaknesses? Why?
______________________________________________________________________________
______________________________________________________________________________
SAMPLE COVER LETTER FOR SURVEYS
[Date]
All responses will be held confidential. The surveys will be read and analyzed by [name of office
or individual], and any reports produced as a result will not contain any identifying characteristics
of individual respondents. The original surveys will be available only to [name of office or
individual]. [Name of office or individual] will prepare a report for the department based on these
surveys, and a [copy or summary] of that report will be available to you [via a department
mailing, newsletter, etc.]. [Insert description of department’s plans for responding to the survey
findings.]
Please complete and return the survey to [name & address in bold] by [date in bold]. A stamped,
addressed return envelope is provided for your convenience. If you have any questions about the
survey or the department’s self-assessment, please feel free to contact [name of office or
individual] at [address, phone number, or e-mail].
We recognize how busy you are and appreciate your time and feedback. Thank you for assisting
us in our efforts to improve the [undergraduate or graduate] program.
Sincerely,
SAMPLE THANK-YOU POSTCARD FOR SURVEYS
[Department Name]
WORKSHEET 1 FOR SURVEY TRACKING
Department: _________________________________
Date: _______________________________________
Columns (one row per student):
ID # | Name (Last, First) | Month & Year of Graduation | # of Semesters to Graduation | # of Credits to Graduation | GPA Overall | GPA in Major | Male/Female | Ethnicity | In Sample or Not?
Example row:
123 | Sample, Chris | May-93 | 11 | 132 | 3.245 | 3.125 | F | Native Amer. | yes
Summary rows at the bottom of the worksheet:
Lowest Value in Range
Highest Value in Range
Averages (total # of students; total # of years)
Totals & Percentages
WORKSHEET 2 FOR SURVEY TRACKING
Department: _________________________________
Date: _______________________________________
Columns (one row per student):
ID # | 1st Mailing (Date) | Survey "Returned to Sender" (Date) | 2nd Mailing (Date) | Completed Survey Returned (Date)
TOTALS row at the bottom of the worksheet:
(total # of names) | (total # mailed) | (total # returned) | (total # mailed) | (total # received)
Section 6
INTERVIEWS
This section summarizes many of the important points you should consider as you use interviews
to evaluate your major, program, or department. It presents some broad guidelines which can be
applied to a variety of interviewees whether they are current students, graduating students,
alumni, or employers. The section also gives general background information about conducting
interviews in various formats such as individually or in groups, over the phone or face-to-face,
and tape recorded or not. The section discusses these formats, poses questions to consider when
planning interviews, provides checklists and worksheets for conducting interviews, and discusses
methods for analyzing interviews.
These questions are inter-related and should be considered as a whole instead of in a linear way.
(See the worksheet at the end of this section, which is guided by these questions.)
Individual interviews typically last up to one hour. Group interviews usually run from one hour to one and one-half hours.
Since most programs do not have resources to interview every student, alumni, or employer, you
will likely need to ask a sample to participate in the interviews. Some suggested ways to select a
sample to interview are presented below.
• Invite a percentage: If you want to select 25 students out of 50, for example, you can choose
every other name on the student roster.
• Invite everyone in the targeted group: You can ask everyone (through posters, e-mails, or
phone calls) in the program to participate in an interview. Then wait to see how many
respond and confirm interviews with the first 25 who respond to your request. This is an
efficient way but may not allow for a very representative sample.
• Invite a representative sample: For a sample to be representative of the entire population, it
should reflect the proportions of the population on criteria you identify as important. These
criteria will vary for different groups; some of the most commonly used criteria are gender,
ethnicity, GPA, and length of time in the program. For example: An entering class in the
School of Pharmacy had 100 students who were 60% female, had an average GPA of 3.2, and
half of whom had completed their first two years at UW-Madison. The department purposely
selected a sample of twenty students who were 60% female, had a range of GPAs above and
below 3.2, and half of whom had attended UW-Madison for two years.
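The first and third selection strategies above can be sketched in code. Everything in the sketch below is illustrative: the roster is hypothetical (its 60% female split and the 25-of-50 and 20-student targets echo the examples above), and the function name is not part of any standard procedure.

```python
import random

# Hypothetical roster; in practice this would come from student records.
# i % 5 < 3 makes 30 of the 50 students female (60%), as in the example.
roster = [{"name": f"Student {i}", "gender": "F" if i % 5 < 3 else "M"}
          for i in range(50)]

# "Invite a percentage": choose every other name to get 25 of 50.
systematic_sample = roster[::2]

# "Invite a representative sample": preserve each group's share of the
# population. Rounding can leave the total off by one for some splits.
def stratified_sample(population, key, size, seed=0):
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    strata = {}
    for person in population:
        strata.setdefault(person[key], []).append(person)
    sample = []
    for group in strata.values():
        share = round(size * len(group) / len(population))
        sample.extend(rng.sample(group, share))
    return sample

representative = stratified_sample(roster, "gender", size=20)
```

With this roster, the stratified draw returns twelve women and eight men, matching the 60/40 split of the population.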
Providing interviewees with the questions ahead of time may take away any anxiety about interviewing that some interviewees may feel. It
also allows interviewees time to prepare thoughtful answers which can improve the quality of the
responses you receive. Questions can be sent to the interviewees over e-mail at the same time you
confirm the interview time or made available at the main departmental office.
11. Should we tape record or not? Should we transcribe or not?
There are three broad approaches to recording interview data: taping and transcribing, taping
plus note-taking, and note-taking alone. Each has advantages and disadvantages.
13. What problems might we anticipate?
Some problems you might anticipate are summarized below.

Problem: Interviewee does not show up.
Description/Solution: Some interviewees will not show up, or, if you are doing phone interviews,
will not be available at the time you have arranged to call them. It is appropriate to try to
re-schedule the interview, but discontinue efforts after one more contact.

Problem: Interviewee is very uncomfortable talking about issues or becomes emotional.
Description/Solution: Depending on the types of questions, interviewees may be uncomfortable
discussing certain topics, and some may become emotional and cry or withdraw. If this occurs,
tell the interviewee it is OK. If you are taping, offer to turn off the tape recorder, and offer to
take a few minutes' break. Ask if the interviewee would like water. Take a break from
questioning or ask if you should proceed.

Problem: You run out of materials.
Description/Solution: Have extra materials on hand: copies of questions, paper and pencil,
consent forms, batteries, blank tapes.

Problem: Interviewee confides important information.
Description/Solution: It is possible that the interviewee may reveal information related to issues
such as harassment or unethical or illegal activity. In the event this occurs, refer the student to
the appropriate office or counselor. A list of these references is provided in the resource section
at the end of this kit.

Problem: Equipment fails.
Description/Solution: Equipment failure happens. Have extra equipment on hand. Encourage
interviewers to take some written notes and check the tape recorder during the session. After the
interview is completed, the interviewer should also check the tape. If the machine did not work,
the interviewer can create a written summary based on notes while the discussion is still fresh in
the interviewer's mind.
Section 5 includes a worksheet on developing questions which may also be valuable in developing
interview questions. However, interview questions differ from survey questions in that they
should be open-ended rather than limited “yes-no” type questions. Their purpose is to elicit
perspectives and opinions from the interviewee. Ideally, interview questions will be developed
with input from a cross-section of stakeholders such as faculty, deans, staff, and students. For
example, the Department of Botany used a committee that represented a cross-section of the
department: two faculty, two academic staff, and one graduate student. This group brainstormed
questions related to issues that were of particular concern within the department. Some of these
questions related to the senior thesis requirement of the department and others related to the
newly-created departmental student learning goals. Drafts of interview questions and the short
survey were distributed to faculty, discussed at a faculty meeting, and then modified based on
faculty input.
Tips on Interviewing
An interview, rather than a paper and pencil survey, is selected when interpersonal contact is
important and when opportunities for follow-up of comments are desired. In-depth interviews are
characterized by extensive probing and open-ended questions. In-depth interviews are particularly
appropriate when seeking an understanding of complex or highly sensitive subject matter.
Typically the interviewer works from a list of questions or issues that are to be explored and
probes areas of particular interest. The guide helps the interviewer pace the interview and makes
the process more systematic and comprehensive.
7. Keep materials accessible. A box of materials containing the following items can be kept in
the interview room if the room can be locked when not in use.
• master lists of the interview questions
• blank surveys for students to complete prior to interviews, if applicable
• blank human subject consent forms
• tape recorders plus batteries or outlet
• paper and pens
• blank audio tapes
• a list of student names and assigned ID numbers
• a sign saying “Interview in Progress” and adhesive tape to post the sign on the door
• general instructions that summarize the interview process
Although interviews can provide a wealth of information, the analysis of interviews is challenging
because responses are usually quite varied, interconnected, complex, and flow throughout the
entire interview. Your task throughout the analysis is to look for common and recurring ideas,
issues and themes across the set of interviews, as well as try to recognize the range of the
perspectives that interviewees raise. When doing the analysis, you must continually attempt to put
aside your own perspective and be guided by what the interviewees said.
Whether you work alone or with a small committee to conduct the analysis, and whether you
work from verbatim transcriptions or written notes, the process of analysis is generally similar
to that described below:
1. If interviewers used notes, have each one collect their notes together. If transcripts were
completed, distribute these to the people who will complete the analysis. If you used several
interviewers and had the interviews transcribed, a way to distribute these to people identified
as “first” and “second” readers is described below:
Divide transcripts equally among the analysis committee to ensure that each transcript will
be read by two readers who did not conduct the particular interview. Identify a primary
reader and a secondary reader as shown in the example grid below. Both first and second
readers read the transcripts in the same way, but the first reader is the main one
responsible for understanding the perspective of the student who was interviewed and
synthesizing that information, although the second reader should expand on the thoughts
of the first.
Interview   ID #    Interviewer   First Reader   Second Reader
1           0032    A             B              C
2           4127    B             A              D
3           6219    C             D              A
4           0002    D             C              B
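An assignment like the one in the grid can also be generated rather than built by hand. The sketch below is a simple rotation, not the scheme used to build the grid above; it guarantees only that neither reader conducted the interview and that the two readers differ. The committee letters and tape IDs are taken from the example; everything else is illustrative.

```python
def assign_readers(transcripts, committee):
    """For each (transcript_id, interviewer) pair, pick a first and second
    reader, neither of whom conducted that interview."""
    assignments = []
    for i, (transcript_id, interviewer) in enumerate(transcripts):
        others = [m for m in committee if m != interviewer]
        first = others[i % len(others)]         # rotate to spread the load
        second = others[(i + 1) % len(others)]  # always distinct from first
        assignments.append((transcript_id, interviewer, first, second))
    return assignments

committee = ["A", "B", "C", "D"]
transcripts = [("0032", "A"), ("4127", "B"), ("6219", "C"), ("0002", "D")]
for row in assign_readers(transcripts, committee):
    print(row)
```

The rotation needs a committee of at least three so that two readers other than the interviewer are always available.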
3. Identify common issues and themes, cluster, and re-group. The committee should hold
lengthy meetings to review and summarize the perspectives presented in each transcript. The
two readers for each transcript should present their summaries, resulting in a blackboard or
flip charts full of comments. Patterns and common issues and perspectives will begin to
emerge. The committee should cluster or group these issues, or try to capture the range of
opinions if there is no common opinion expressed. A continual process of re-grouping and
synthesis should occur. Quotations from interviewees that represent the prevalent perspectives
can be identified.
4. Create a summary report. See Section 8 for a discussion of types of information that should
be included in the report.
WORKSHEET FOR SCHEDULING INTERVIEWS WITH STUDENTS
in Department of ______________
Week of ______________________
Columns:
Date | Interviewer | Times available to conduct interview | Scheduled Interviews (Time, Location, Student Name)
Example row:
9/3-Monday | A | 10:00-1:00 or 3:00-5:00 | 12:00-12:45, Room 512, Lisa Jones
(Blank rows follow for interviewer B on 9/3-Monday and for interviewers A and B on 9/4-Tuesday and 9/5-Wednesday.)
Which to use: Focus groups or In-depth interviews?
Factors to consider, with guidance on when each format serves best:

Group interaction
Use focus groups when: Interaction of respondents may stimulate a richer response or new and valuable thought.
Use in-depth interviews when: Group interaction is likely to be limited or nonproductive.

Group/peer pressure
Use focus groups when: Group/peer pressure will be valuable in challenging the thinking of respondents and illuminating conflicting opinions.
Use in-depth interviews when: Group/peer pressure would inhibit responses and cloud the meaning of results.

Sensitivity of subject matter
Use focus groups when: Subject matter is not so sensitive that respondents will temper responses or withhold information.
Use in-depth interviews when: Subject matter is so sensitive that respondents would be unwilling to talk openly in a group.

Depth of individual responses
Use focus groups when: The topic is such that most respondents can say all that is relevant or all that they know in less than 10 minutes.
Use in-depth interviews when: The topic is such that a greater depth of response per individual is desirable, as with complex subject matter and very knowledgeable respondents.

Data collector fatigue
Use focus groups when: It is desirable to have one individual conduct the data collection; a few groups will not create fatigue or boredom for one person.
Use in-depth interviews when: It is possible to use numerous individuals on the project; one interviewer would become fatigued or bored conducting all interviews.

Extent of issues to be covered
Use focus groups when: The volume of issues to cover is not extensive.
Use in-depth interviews when: A greater volume of issues must be covered.

Continuity of information
Use focus groups when: A single subject area is being examined in depth and strings of behaviors are less relevant.
Use in-depth interviews when: It is necessary to understand how attitudes and behaviors link together on an individual basis.

Experimentation with the interview guide
Use focus groups when: Enough is known to establish a meaningful topic guide.
Use in-depth interviews when: It may be necessary to develop the interview guide by altering it after each of the initial interviews.

Observation by stakeholders
Use focus groups when: It is desirable for stakeholders to hear what participants have to say.
Use in-depth interviews when: Stakeholders do not need to hear firsthand the opinions of participants.

Logistics
Use focus groups when: An acceptable number of target respondents can be assembled in one location.
Use in-depth interviews when: Respondents are geographically dispersed or not easily assembled for other reasons.

Cost and training
Use focus groups when: Quick turnaround is critical, and funds are limited.
Use in-depth interviews when: Quick turnaround is not critical, and the budget will permit higher cost.

Availability of qualified staff
Use focus groups when: Focus group facilitators need to be able to control and manage groups.
Use in-depth interviews when: Interviewers need to be supportive and skilled listeners.
Source: User-friendly Handbook for Mixed Method Evaluations, Directorate for Education and Human Resources,
Division of Research, Evaluation and Communication, National Science Foundation. August 1997, p. 3-11.
WORKSHEET FOR PLANNING INTERVIEWS
3. Who will conduct individual, group or phone interviews and how many will be conducted?
[ ] Individual interviews: How many?______
[ ] Group interviews: How many groups?______ How many in a group? ______
[ ] Phone interviews: How many?______
Describe how interviews will be scheduled (Who will send e-mail or call?):_____________
_________________________________________________________________________
_________________________________________________________________________
7. Where will interviews be conducted?___________________________________________
(Consider: Privacy. Can site be locked? Can materials be stored at site? Who has key?)
11. Will interviewees be provided with the questions ahead of time? [ ] yes [ ] no
If yes, how will these be distributed to interviewees? ___________________________
______________________________________________________________________
13. What problems may arise while conducting the interviews? Are there plans to address these
problems? If so, what are they?_____________________________________________
_______________________________________________________________________
_______________________________________________________________________
_______________________________________________________________________
CHECKLIST FOR PREPARING AND CONDUCTING INTERVIEWS
Conducting interviews
[ ] Put “Interview in Progress” sign on door
[ ] Review consent form with interviewee, ask permission to tape record, get student signature
[ ] If a written survey is used, review the completed survey as a taking-off point for questions,
OR ask the student to complete the written survey quickly before asking interview questions
[ ] If permission has been granted, begin audio tape
[ ] Conduct interview
[ ] Near beginning of interview, check to see that tape is recording
[ ] At end of interview, thank interviewee
[ ] Turn off audio tape
[ ] Remove “Interview in Progress” sign
[ ] Return tape recorder, blank consent forms, extra surveys, and lists of questions to the box
[ ] If using ID numbers, write the number on the audio tape and record on the master list that
the person showed up
[ ] Put tape and written survey in identified location for delivery to transcriptionist or data entry
SAMPLE GUIDE FOR INDIVIDUAL OR GROUP INTERVIEWS
Permission to tape
FOR INDIVIDUAL: “If it is OK with you, I would like to tape record our conversation. The
purpose of this is so that I can get all the details but at the same time be attentive to the
conversation with you. All of your comments will remain anonymous. No one but the
transcriptionist will listen to the tape, and others involved in summarizing the report will not
know your name. I will be interviewing approximately [number] people, and compiling a
report, but there will be no reference to individuals. If you agree to this interview, please sign
this consent form.”
FOR GROUP: “My colleague [name of colleague] will be taking notes and tape recording the
discussion so that I do not miss anything you have to say. I have already explained these
procedures to you when we set up this meeting. As you know, everything is anonymous. No
one will know who said what. I want this to be a group discussion, so feel free to respond to
me and to other members in the group without waiting to be called on. I will try to give
everyone a chance to give their opinions about each topic. However, I would appreciate it if
only one person talks at a time. The discussion will last approximately one hour. There is a
lot to discuss, so at times I may move us along a bit. Now let’s start by everyone sharing their
name and a sentence of background about themselves.”
Closure
“Thank you for coming and giving your precious time to help us improve our programs. If you
think of further information you would like to share, I can be reached at ____. We hope to have
this report completed by ______. I wish you success in your program/summer.”
SAMPLE QUESTIONS FOR PRE-INTERVIEW SURVEY
This form is intended to serve two functions. First, it provides background and a taking-off point
for the respondent’s comments during the interview. For example, it may be important to know
that a student who clearly understands the purpose of the various program requirements has
been working in the field throughout school. Second, the form can be used to identify areas to
pursue in an interview. For instance, an interviewer may want to question why a full-time
student took seven years to complete an undergraduate degree.
1. When did you enter the ______ major? Semester 1: ____ (Year) Semester 2: ____ (Year) Summer: ____ (Year)
6. Check the statement below which best represents your path in college:
____ I have studied or plan to study all of my coursework at UW-Madison.
____ I transferred into UW-Madison from another institution. If so, how many semesters
have you been at UW-Madison? ________________________
____ Other: Please comment: _____________________________________________
7. Students in the ______ major can take several foundation courses or sequences before they
take upper level courses (e.g., in biology students can take Biocore or 151-152 or a different
choice). What foundations sequence did you take? (circle one below)
Biocore 151-152 Other:__________________________________
WORK EXPERIENCE
9. Did you complete a co-op or internship?
[ ] Yes. With what company/school? ______________________________________
[ ] No.
10. Have you been working (other than co-op) while attending school?
[ ] Yes, approximately _______ hours per week during the semester.
[ ] Yes, approximately _______ hours per week during the summer.
[ ] No.
11. If you have been working, has your work been related to your field of study?
[ ] Yes, I’ve been working in (which field?) ________________________________
[ ] No.
[ ] Some yes, some no (which field?) ______________________________________
12. How many hours a week do you typically spend on coursework outside of class? _________
POST-GRADUATION PLANS
13. What are your post-graduation plans? (If you are planning on both employment and graduate
school, please indicate which you plan to do first.)
[ ] Employment in (what field?) _______________________________________
[ ] Graduate school in (what field?) ____________________________________
[ ] Undecided
If Graduate School: How well prepared do you feel you are for graduate school?
_____Not Prepared ______Prepared _____Well Prepared
If Employment:
A. Do you have a job lined up after graduation? ____Yes ____No
If yes, what type of job is it?___________________________________________
C. What type of employers are you interviewing with? Check all that apply.
[ ] Consulting [ ] Design [ ] Manufacturing
[ ] Service [ ] Other _______________________________________
ADDITIONAL SAMPLES OF PRE-INTERVIEW SURVEY QUESTIONS
SAMPLE #1
Thinking about your major in [list program name] at UW-Madison, do you think the program
provides adequate opportunity for the development of the program goals shown below?
RATING
Superior (Provides numerous opportunities) | Good (Provides adequate opportunities) | Fair (Provides limited opportunities) | Poor (Provides insufficient opportunities)
PRIMARY PROGRAM GOALS (rate each on the scale above):
Goal 1
Goal 2
Goal 3
Goal 4
Goal 5
Goal 6
Goal 7
SAMPLE #2
SAMPLE #3
SAMPLE INTERVIEW QUESTIONS
Department of Botany
University of Wisconsin-Madison
INTRODUCTION
CAREER PLANS
8. Do you think your undergraduate program has equipped you with the skills and knowledge
you are likely to need to pursue your career plans?
The following four questions all begin with the phrase: “Of the courses you have taken, are there
any courses required for the botany major which you consider to be . . .”
9. Most worthwhile? Most valuable? Why?
10. Worthless or not relevant? Why?
11. Most difficult? Why?
12. Most interesting? Why?
13. The botany major requirements specify that students take courses in five out of six areas
within botany. Have you found that to be an advantage or disadvantage and why? Do you
feel there should be more flexibility in choices or do you agree that the design is adequate?
14. We strongly encourage/require our majors to get some sort of research or field experience
either as a 699 project or a thesis. Did you (or do you plan to) do some sort of research or
field project?
If answer is NO, then: Why did you decide not to?
If YES: How valuable do you think it was?
ADVISING
15. Do you feel that you received adequate advising prior to your decision to declare botany as
your major? Why? Can you suggest ways to improve the advising system?
16. Do you feel that you received adequate advising after declaring botany as your major? Can
you suggest ways to improve our departmental advising system?
17. Do you feel a sense of belonging to the botany department? Why or why not? Do you have
any suggestions related to this?
GENERAL
18. Taken as a whole, how would you assess our undergraduate program?
19. What do you consider to be some of the strengths of the botany major?
21. Can you suggest specific ways in which our undergraduate program might be strengthened or
improved?
22. Are there any concerns, suggestions, or comments you would like to make about the program
that haven’t already been discussed?
SAMPLE INVITATION TO STUDENTS TO INTERVIEW
********
The Department of ____________ is involved in activities to evaluate and improve its curriculum
and majors. As part of this, I would like to hold individual or group interviews with as many
graduating seniors as possible sometime between [dates]. The interview can be arranged at a time
convenient for you (Mondays through Fridays between 8:30 and 5:00). Because you are a senior
who is about to graduate, your opinions and suggestions are very important to help improve your
program. The interview will last about 45 minutes and participation is voluntary. There are four
people who will conduct the interviews which will take place at [identify location]. With your
permission we would like to tape-record the interview. Only the transcriptionist and I will listen to
the tape. If you do not want your interview taped, that is also OK. All student names and
identities will remain confidential. The discussion questions will be provided in advance so that
you can think about them ahead of time. I hope you will assist us by giving your feedback and
sharing your opinions about your program.
IF YOU CAN PARTICIPATE in an interview, please e-mail me with your first, second and third
choices of dates and one-hour time frames that will work for you between the dates [enter dates].
I will quickly respond to you with the specific time and a room location.
Thank you in advance for a quick response and for your participation in this important
improvement process.
SAMPLE OF INDIVIDUAL INTERVIEW RECORD
Department: ___________________________
Year: ______________
NOTES:
SAMPLE OF THANK-YOU POSTCARD FOR INTERVIEWS
[Department Name]
WORKSHEET FOR INTERVIEW TRACKING
Columns (one row per student):
ID # | Last Name, First Name | Address/Phone | 1st Attempted Contact | Disconnected or Incorrect Contact Info | Declined Interview (Reason?) | Date/Time Scheduled | No Show | Transcript Completed
Example row:
123 | Sample, Chris | 608/555-5555; [email protected] | 3/4/98 left phone msg.; 3/7/98 reached via e-mail | 715 555-5555 | | yes (#######) | yes | yes
TOTALS row at the bottom of the worksheet:
(# of names) | (# incorrect info) | (# declined) | (# no shows)
Section 7
ETHICAL CONSIDERATIONS
No matter which type of assessment tool you use, there are several ethical dimensions to
conducting a program assessment, and a little forethought can prevent some awkward situations
later.
Although it is not required that a department contact the campus human subjects research
committees for approval of program assessment research, departments may still wish to provide a
consent form for participants, especially in interviews, where the identity of the interviewee may
be harder to conceal. A sample form is included at the end of this section. For additional
guidance on obtaining consent, departments can contact their local human subjects research
committee.
What to tell the student?
When a student has revealed sensitive information, the first concern should be to determine
whether the student wants or needs help, and in what form.
• Do they want to report the incident officially? (Do they think they are doing so by telling
you?)
• Do they want or need to talk about the incident further, perhaps with a counselor?
Whom to tell?
Section 10 of this Tool Kit lists some campus resources where students can be referred.
PROGRAM ASSESSMENT CONSENT FORM
Department: ______________________________
Year: ___________
With your permission interviews will be audiotaped. Your participation is voluntary. The
audiotapes made during telephone interviews and transcripts of both telephone and e-mail
interviews will be available only to __________________, who will use them to obtain accurate
accounts of the interviews. All of your responses will be anonymous. Any reports produced as
a result of this study will not contain identifying material.
There are no reasonably foreseeable risks, discomforts, or benefits associated with participation in
this interview. Your participation is completely voluntary; you may stop participating at any time
prior to the completion of the project.
If you have any questions about your interview or the program assessment process, you may
contact __________________________ at the following address or phone number:
_________________________________. Copies of this form are available from this address as
well.
I have read the above and give my consent to participate in the study.
_____________________________________________ _________________
Signature Date
________________________________________________
Print name
Section 8
REPORTING RESULTS
In order for evaluation efforts to inform the process of change, results from evaluation activities
will likely need to be made available to numerous players in the educational system. Findings can
be presented in a variety of ways although written reports are the most commonly used format for
disseminating evaluation findings. It is important that the evaluator know the potential audiences
of the report and tailor the findings so that they are accessible to each interested party. It
may be the case that the evaluator does not know the answer to many of the questions listed
below. If this is the case, ASK! Most often people welcome the opportunity to explain their
needs and interests. This section includes worksheets to help guide you in creating a report.
In deciding how to report the results of an evaluation, it is important to consider how the findings
will be used. If the results will be distributed to a funding source, a more detailed description of
the findings may be required. In contrast, if a department would like to present findings to its
faculty at a retreat or planning session, a summary of the findings on overhead slides may be more
appropriate.
Introduction
In many cases if there is not an abstract or executive summary, people interested in the results of a
program evaluation will read only the introduction and the conclusion of a report. The primary
purpose of an introduction is to set the stage for the remainder of the report by highlighting the
purpose of the evaluation and the intended audiences for the findings. The introduction is also a
place to clarify contextual issues surrounding the study and discuss its limitations. It also should
provide a brief "reader's guide" to the report by listing what will be discussed in each section.
Methods
This section is where methods of data collection are described. Included in the discussion of
methods are the types of instruments or protocols used (such as a survey), the procedure by which
the data were collected, the number of respondents and any issues with response rates, and the
representativeness of the sample. This section also provides a brief discussion of the analysis process.
Findings/Themes
This section is where the findings are described and analyzed and usually comprises the bulk of
the report. The interpretation of the findings is as important as the presentation, and is perhaps
the most sophisticated aspect of doing an assessment. In writing a report on the findings, it is
important that the writer provide a guide for the reader in terms of relating the findings to the
questions. One way to organize the report is around the survey or interview questions. Another
way to organize a report is based on the major issues addressed through responses. In either
format, both quantitative and qualitative data can be used and interwoven to create a thorough
picture of the findings. If the primary methodology is interview or open-ended survey, it is
important that the report represents as authentically as possible the voice of the participants.
Tables and charts can be used to present quantitative data in an organized and readable fashion.
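For fixed-choice questions, such tables can be produced directly from the tallied responses. The sketch below is illustrative only: the rating labels follow the Superior/Good/Fair/Poor scale used in the sample survey questions earlier in this kit, and the response data are hypothetical.

```python
from collections import Counter

# Hypothetical responses to one rating question on the survey.
scale = ["Superior", "Good", "Fair", "Poor"]
responses = ["Good", "Good", "Superior", "Fair", "Good", "Poor", "Good"]

counts = Counter(responses)
total = len(responses)

# Print a small frequency table suitable for dropping into a report.
print(f"{'Rating':<10}{'N':>4}{'%':>8}")
for label in scale:
    n = counts[label]
    print(f"{label:<10}{n:>4}{100 * n / total:>7.1f}%")
```

Listing the scale explicitly keeps the table in rating order and shows a zero count for any rating no one chose.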
Conclusion
This is the section where the evaluator can synthesize different findings from the evaluation. In
summarizing the overarching findings, it is important to not introduce new material, but rather
focus on what has been presented before. Depending on your situation, it may be appropriate to
make recommendations based on the summary; most often, however, recommendations should be
made only if the coordinator of the evaluation has advised you to do so.
Appendices
This is the place to attach copies of instruments, protocols, and other reference materials that will help the reader understand the questions asked and other relevant content.
Before you begin, consider the following questions:
1. What are your resource constraints for writing a report?
________________________________________________________________________
________________________________________________________________________
3. How will the results be used? (Examples: Internal feedback, documentation for
accreditation)
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________
Introduction
1. What was the purpose of the evaluation? What issues/ topics were evaluation planners
interested in?
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
2. Are there contextual issues surrounding the evaluation that the reader should know about?
(Examples: Accreditation review coming up in one year; major changes in the program)
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
Methods/sample
1. What instruments were used? (Attach copy in Appendix)
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
4. Were there difficulties in garnering participation? If so, what was done to encourage
participation?
______________________________________________________________________________
______________________________________________________________________________
5. Compared with the whole, how representative was the responding sample?
Ethnicity
Gender
GPA
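As a rough numerical check of representativeness, the breakdown of respondents on a characteristic such as gender or ethnicity can be compared against the program as a whole with a chi-square goodness-of-fit test. The sketch below is a minimal pure-Python illustration; the group names, counts, and proportions are hypothetical, and a statistics package would normally supply the test.

```python
# Chi-square goodness-of-fit: do respondent counts match the
# program-wide proportions? (All figures below are hypothetical.)

def chi_square_gof(observed, population_props):
    """observed: {category: respondent count};
    population_props: {category: proportion in the whole program}."""
    n = sum(observed.values())
    chi2 = 0.0
    for category, count in observed.items():
        expected = n * population_props[category]
        chi2 += (count - expected) ** 2 / expected
    return chi2

# Hypothetical gender breakdown: the program is 60% / 40%,
# but 70 of the 100 survey respondents were in the first group.
observed = {"group A": 70, "group B": 30}
population = {"group A": 0.60, "group B": 0.40}

chi2 = chi_square_gof(observed, population)
# With 1 degree of freedom, the 5% critical value is 3.84; a
# statistic above it suggests the sample is not representative.
print(round(chi2, 2), "representative" if chi2 < 3.84 else "skewed")
```

In this hypothetical case the statistic (about 4.17) exceeds the critical value, so the report should note that the first group is over-represented among respondents.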
Findings
1. What were the four biggest issues/topics that arose from the evaluation? (Example: Career
advising is limited)
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
2. Were there any "sensitive" issues that arose during the assessment?
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________
3. If you used multiple instruments, how does the information compare between instruments?
______________________________________________________________________________
______________________________________________________________________________
Conclusion
1. What are the key points you want the reader to leave thinking about?
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
SAMPLE ASSESSMENT REPORT TABLE OF CONTENTS
Section 9
ACTING ON RESULTS
Evaluation results are most likely to influence decision making when evaluators and top
administrators agree on the goals of the assessment and perceive information about outcomes as
an important source of feedback about organizational effectiveness (Weiss and Bucuvalas 1977).
The degree to which evaluators involve others in the process may affect the degree to which
interested parties act on results. Generally, the higher the level of participation, the greater the
involvement in acting on results. It is important to involve all the potential stakeholders
(administration, faculty, staff, students) in deciding how to act on results. The evaluator can play
a crucial role in facilitating action from findings.
In creating a plan to act on evaluation findings there are a number of potential challenges to be
aware of because they can affect participation.
2. Faculty resistance. Faculty may resist if they are asked to devote effort to evaluation without release from other responsibilities, or if they do not understand the purpose of the evaluation. Some faculty hesitate to participate in evaluation activities because they assume the evaluation will be a personal critique of their teaching rather than an assessment of the entire program. The evaluator should clarify that the purpose of program evaluation is to examine the program as a whole.
3. Costs. Administrators may be wary of evaluation costs. One way to gain support is to use as
much existing data as possible. Find out what is available through campus records, etc.
4. Timing and follow-through. Evaluation activities scheduled during busy times are likely to meet resistance from participants. In addition, if there is no follow-through on the findings, participants may feel that their time was wasted and be less inclined to participate in the future.
5. Consider the work culture. Finally, it is important to consider the culture in which you are working. People are more inclined to participate in, and act on, evaluation activities intended to improve educational programs when their work environment supports and rewards high-quality teaching and effective learning.
WORKSHEET 1 FOR ACTING ON RESULTS
This worksheet is intended to begin the process of acting on results. Ideally, it should be
completed by those who will be involved in the change process. Once the changes are underway,
or at a designated time (such as the time identified on the timeline), faculty, staff and
administrators should revisit this plan and make necessary revisions. Worksheet 2 is intended as a
follow-up guide.
WORKSHEET 2 FOR ACTING ON RESULTS
Section 10
RESOURCES
This section lists a variety of resources. It contains the following: campus-based assessment
resources; campus-based resources for student/alumni data; campus-based resources to which
students can be referred; names of faculty and staff experienced with assessment processes;
Internet resources on assessment; national sources of statistics and indicators; and literature
focused on assessment.
LEAD Center
Contact: Susan Millar
1402 University Avenue
4th Floor
ph# 265-5943; email: [email protected]
The LEAD Center maintains a library of assessment and evaluation literature and resources
that can be searched by interested faculty and staff of the UW-Madison. LEAD Center
researchers can assist with identifying relevant resources and can be hired to help programs
conduct assessments.
University of Wisconsin Survey Center (UWSC)
Contact: James A. Sweet
Rm 2412 Social Sciences Building
ph# 262-2182; email: [email protected]
Graduate School
http://www.wisc.edu/grad/
Counseling
262-1744
905 University Avenue
Madison, WI 53706
www.uhs.wisc.edu
FACULTY RESOURCES
Wayne Becker, Professor
Department of Botany
Rm B115 Birge Hall
262-5833
[email protected]
If you would be interested in having your name included here in future editions please
contact Dianne Bowcock (see Section 1, pg. 4)
Digest of Education Statistics and The Condition of Education, each produced annually by the
National Center for Education Statistics (U.S. Department of Education)
Science & Engineering Indicators, produced annually by the National Science Board (National
Science Foundation)
INTERNET RESOURCES ON ASSESSMENT
UW-Madison, Assessment Council: http://publications/provost/assess/manual
Banta, T.W., Lund, J.P., Black, K.E., & Oblander, F.W. (1996). Assessment in practice.
San Francisco: Jossey-Bass.
A very thorough all-purpose guide to assessment. Uses case studies to illustrate concepts.
Caffarella, R. S. (1995). Program planning for adults: a comprehensive guide for adult
educators, trainers, and staff developers. San Francisco: Jossey-Bass.
Focuses on planning educational programs for adults. It is designed to lead the reader
through the process of planning a program and includes a chapter on evaluation relating to
program planning.
LeCompte, M.D., & Preissle, J. (1993). Ethnography and qualitative design in educational research. (2nd ed.) San Diego: Academic Press.
An excellent introductory book. It is a scholarly book well-grounded in philosophical,
theoretical and empirical literature (and has a 30-page bibliography). It is a
methodological book and chapters are structured around primary procedural topics
(researcher role, sampling, data collection, analysis).
Marshall, C., & Rossman, G.B. (1989). Designing qualitative research. Beverly Hills:
Sage.
Easy to read. It draws on actual research studies to illustrate the pragmatic challenges that arise in designing and conducting qualitative inquiry.
Merriam, S.B., & Simpson, E. L. (1995). A guide to research for educators and trainers of
adults. Malabar, FL: Krieger.
Guides the reader through all the various steps of conducting an evaluation or research
study and provides information about the various philosophical frameworks which underpin
choices about methods and analysis. Also addresses ethical issues.
Miles, M.B., & Huberman, A.M. (1984). Qualitative data analysis: A source book of new
methods. Newbury Park, CA: Sage.
Provides a format for analyzing and reporting qualitative data.
Patton, M.Q. (1997). Utilization-focused evaluation: the new century text. Thousand
Oaks, CA: Sage.
Both practical and theoretical. It focuses on how to conduct program evaluations with each
chapter containing a review of relevant literature and actual case examples.
Patton, M.Q. (1990). Qualitative evaluation and research methods. Beverly Hills: Sage.
Provides an overview of all major aspects of qualitative design and analysis, with an
evaluation focus.
Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.
Describes analysis strategies in grounded theory studies. Easy to read.
Worthen, B. R., Sanders, J. R., and Fitzpatrick, J.L. (1997). Program evaluation:
alternative approaches and practical guidelines. NY: Longman.
Presents a variety of approaches to doing program evaluation with practical guidelines for
planning and implementing program evaluations. Also discusses analyzing and reporting on
findings.