
University of Nebraska - Lincoln

DigitalCommons@University of Nebraska - Lincoln

Honors in Practice -- Online Archive
National Collegiate Honors Council

2020

A Meaningful and Useful Twofer:
Enhancing Honors Students’ Research
Experiences While Gathering Assessment Data
Mary Scheuer Senter
Central Michigan University
Abstract: Engaging students in assessment practice benefits honors students, fac-
ulty, and administrators. Students gain meaningful research experience while honors
programs receive data to help assess student learning and prepare for program
review. A one-semester course, Program Evaluation Experiences, tasks students (n
= 10) with collecting and analyzing data from peers and faculty and then articulat-
ing its value for their personal academic development. Qualitative and quantitative
instruments and measures include an online survey (Qualtrics), personal interviews
(Rev), and focus groups (Rev, n = 30). Students complete various analyses of data
using SPSS and NVivo. Results indicate that students’ active participation in applied
research methods for program assessment benefits both student and program and,
because anchored in student experience, helps to reveal data that might otherwise
remain unexpressed. The author asserts that this type of hands-on learning provides
honors students with a wide range of practical experience not offered in non-honors
curricula. A short history of program assessment in honors is provided.
Keywords: student engagement; high-impact practices; program evaluation; effec-
tive teaching; Collaborative Institutional Training Initiative

Honors programs and their faculty must devote time and attention to the
assessment of student learning despite strong reservations about the
value of these efforts given the time and methodology involved (Carnicom
and Snyder 2010; Digby 2006; Freyman 2006; Mariz 2006; Otero and Spur-
rier 2005). Honors students can, however, be actively involved in collecting
and analyzing data that the honors program can use to document student
learning and to bolster arguments for administrative support during program
reviews. What I describe is, hence, a twofer: a course that provides meaning-
ful enhancement of students’ research skills and that creates data for justifying
and improving the honors program.

assessment in higher education


Researchers at the American Sociological Association argue that by
2011–12 the “assessment of student learning was a universal activity for soci-
ology departments” (Spalter-Roth, Kisielewski, and Van Vooren 2013: 11).
One assumes that other academic units have had similar experiences because
all the regional accrediting bodies for higher education and many of the spe-
cialized accrediting bodies mandate that programs document the extent to
which students are meeting the learning objectives that faculty establish for
them (Ewell, Paulson, and Kinzie 2011). Certainly, honors programs are not
immune to this call; in fact, a special issue of the Journal of the National Colle-
giate Honors Council fully thirteen years ago included nine essays in its “Forum
on Outcomes Assessment, Accountability, and Honors” (cited in Driscoll
2011), and the National Collegiate Honors Council published a monograph
on the topic of assessment and evaluation in 2005 (Otero and Spurrier 2005).
Meanwhile, it is an understatement that not all faculty have embraced
assessment with enthusiasm. Faculty criticism of assessment focuses on the
top-down, bureaucratic nature of many assessment initiatives; on threats to
academic freedom in reducing faculty prerogatives to evaluate students’ learn-
ing on their own terms (often by grading); on the extra (uncompensated)
work required; on the suspect methodology underlying some data gathering
for assessment; on the disconnect between assessment findings and admin-
istrative efforts to improve students’ experiences; and on the divide between
institutions that easily document the success of their already well-prepared
students and those that struggle serving students who enroll with limited col-
lege preparation (Eubanks 2018; Gilbert 2016; Lederman 2019; Snyder and
Carnicom 2011; Worthen 2018). Honors faculty, in particular, are concerned
that the kinds of educational growth promoted by honors programs are not
easily documented, requiring sophisticated qualitative analyses rather than
the more common quantitative analyses and standardized testing found in
many assessment studies (Frost 2006). Honors faculty have also argued that
the transformational learning resulting from involvement in honors programs
is best recognized later in life when students, as graduates, assume positions
of civic responsibility (Digby 2006; Freyman 2006; Mariz 2006).

Counterarguments exist, of course, with some authors arguing that the
honors community should not just embrace but take the lead on evaluation
and assessment, in part as a defense against the imposition of standardized
testing (Wilson 2006); Achterberg (2006: 39) argues that “honors cannot
survive the future on anecdotal evidence.” Several scholars provide concrete
suggestions for implementing an assessment program for honors (Wilson
and Perrine 2005; Lanier 2008) or for embarking on an effective honors pro-
gram review (Smith 2015). Jones and Wehlburg argue that we need to know
what students are learning “to know what needs to be modified or changed”
(2014: 19).
While not ignoring criticism of mandated evaluation efforts, I have
argued elsewhere that assessment can be made manageable and meaning-
ful and that the best assessment activities promote student learning by being
integrated into the curriculum rather than a burdensome add-on for faculty
(Senter 2001). In making this argument, I assumed that students would be
the subjects assessed and that assessment activities would be embedded into
their existing coursework. For example, in a capstone course, students might
complete research projects that faculty would evaluate for assessment pur-
poses. The case I make now, however, is that students can also be directly
involved in the creation of assessment instruments and gathering of useful
data and that these student-focused activities can form the core of an honors
course for undergraduates. Further, students can be guided to gather both
qualitative and quantitative assessment data as a lesson in good research prac-
tices that use multiple sources. In this way, students are modeling and learning
a multi-method program evaluation approach that draws on the strengths of
each data-gathering technique. If student involvement in assessment activi-
ties can lead to enhanced student learning, then even the most strident critics
of assessment might see some positive element in the enterprise.
My semester-long class for honors students, which both introduced them
to program evaluation and collected valuable data for program assessment
and review, illustrates a positive assessment practice. The twofer is that while
the students engaged in assessment were in a learning-rich setting, the honors
program faculty and administrators were relieved of some of the burden of
collecting and summarizing assessment data.

the context and the course


The Central Michigan University (CMU) Honors Program, founded
in 1961, enrolls approximately 800 students or about four percent of the
undergraduate student body. Most students (85%) begin the honors program
as first-year students although some students enter the program as transfer
students or after completing their first year at CMU through the honors Track
II admission process. The honors program, like all academic programs at the
university, is required to submit assessment reports each fall, summarizing
the assessment data collected in the previous year and outlining any improve-
ments in the program suggested by the data. Every seven years, all programs
go through an academic program review process that requires the creation of
a detailed self-study, including assessment findings and a SWOT (strengths,
weaknesses, opportunities, and threats) analysis. The self-study, along with a
report from an external reviewer, is submitted for commentary to the relevant
dean or vice provost and, in the end, to the provost.
The honors program director is a senior faculty member with reassigned
time to administer the program. He has extensive experience working with
honors students and conducting research on the experiences of young adults.
The program reports jointly to the Honors Council, a faculty/student/staff
committee of the Academic Senate, and to the Senior Vice Provost for Aca-
demic Affairs.
To graduate with honors, students must complete fifteen hours of honors
coursework in addition to an introductory course, first-year seminar, senior
project, writing course, and other cultural diversity and service requirements.
The fifteen hours of honors coursework can consist of special sections for
honors students offered by departments, such as an honors section of Foun-
dations of Cell Biology offered by the biology department or Women and
Politics offered by the political science and public administration department.
Alternatively, students can complete special topics courses offered by the hon-
ors program. Faculty throughout the university, such as myself, can propose
these special topics courses and are encouraged to develop courses that would
not typically be offered through one academic department. Courses that use
high-impact learning practices and include experiential learning activities are
most likely to be selected for inclusion in the honors course schedule.
As a sociology faculty member, I usually teach courses in social inequality
and research methods required for sociology majors. In spring 2019, I had the
opportunity to teach Program Evaluation Experiences, the course discussed
here, which was one of four such special topics available in honors. Students
were recruited to the class, which counts as three of the required fifteen hours,
with a description stressing that program evaluation is “a specialized form of
research that is designed to answer questions” and that it allows practitioners
to evaluate whether “the program you run now, or want to run someday, is
really doing what it is supposed to.” The description stressed that students
would be actively engaged in all components of program evaluation “from
interviews with key stakeholders to a final presentation of results” and that
students would be “given the opportunity to help the honors program address
a wide array of questions posed by the Honors Council, honors office, and of
course—students themselves.” Students were assured that “the results from
evaluation activities [would] also be utilized in a more formal program review
targeted for completion next year with the goal of improving our program.”
The objectives of the course dovetail well with the CMU Honors Mis-
sion Statement (Honors 2019), which commits the program to “providing
high academic ability students with unique educational opportunities and
experiences” and to challenging “students to aim higher and to achieve more
academically, personally, and professionally for the greater good of our dis-
ciplines, our society, and our world.” No honors course focused on program
evaluation had been offered in the past, making this course unique. In addi-
tion, no class had afforded students the opportunity to assist the honors
program by being actively involved in gathering and analyzing data for pro-
gram review or assessment, allowing them to work for the betterment of the
program itself.
Ten students enrolled in the course. They ranged from sophomores to
seniors, with eight of the ten students majoring in sociology, psychology, or
political science, one student majoring in personal financial planning, and
one in philosophy.
The course met in a seminar room twice a week for the sixteen-week
semester, with each class period lasting seventy-five minutes. A computer lab
was available for some class periods, making it possible for students to learn
appropriate software (SPSS for quantitative analysis and NVivo for qualita-
tive analysis) and to work on their final papers. The only constraint on data
gathering established by the honors director and me prior to the beginning of
the class was that students would conduct a quantitative survey, qualitative
interviews, and one or more focus groups.

course outline and activities


Pedagogy and Foundational Readings
The pedagogy for the course included a variety of high-impact prac-
tices. Students engaged in “collaborative assignments and projects” designed
to help them learn “to work and solve problems in the company of others.”
Further, they completed real-world “undergraduate research,” with the goal
of involving them “with actively contested questions, empirical observa-
tion, cutting-edge technologies, and the sense of excitement that comes from
working to answer important questions.” Finally, their activities can be con-
ceptualized as a kind of community-based learning if one defines the honors
program as one of these students’ relevant communities: students had the
opportunity “to both apply what they [were] learning in real-world settings
and reflect in a classroom setting on their service experiences” (Kuh 2008).
Given the diverse backgrounds of enrolled students, all students needed
a basic background in social science research and, in particular, in the ways
that program evaluation—with its applied, real-world focus—differs from
traditional academic research. Students were assigned a short textbook that
emphasized “small-scale evaluation” (Robson 2017), a primer on conducting
online surveys (Sue and Ritter 2012), and a selection of articles on qualitative
interviewing (Esterberg 2002), focus groups (Berg 2009), and the honors
program itself.
The course began by laying the groundwork for data collection while
students worked concurrently to develop the outline of topics to guide their
program evaluation. They then worked collaboratively in teams to develop
the specifics of their research designs. The last sections of the course focused
on data collection, followed by data analysis and report writing.

Laying the Groundwork for Data Collection


Much, but not all, of the class time during the first eight weeks of the
course was consumed with lectures and discussion based on the readings.
Course topics included:
• what is program evaluation and why do we do it;
• engaging stakeholders;
• ethics and politics;
• types of program evaluation;
• methods of data collection;
• issues of sampling;
• quantitative and qualitative data preparation;
• quantitative and qualitative data analysis (including instruction in SPSS and NVivo); and
• report writing.
The latter topics of data analysis and report writing occurred in the eleventh
and twelfth weeks of the course as students were in the process of gathering
their quantitative and qualitative data.
Meanwhile, given the constraints of a sixteen-week semester, students
needed to begin to design their honors program evaluation while the substan-
tive background was being laid in class. Hence, a tension existed throughout
the course between academic preparation or context and the actual activities
of conducting an evaluation project (Mallin 2017; Monahan 2015). Students’
first assignment, due at the beginning of the third week of class, required them
to complete the nationally recognized, online training offered by the Collab-
orative Institutional Training Initiative, which focuses on protecting human
research participants.

Creating Outlines of Topics to Guide Program Evaluation


Given the open-ended nature of the evaluation, students needed to
develop an outline of topics that would govern their efforts. In addition, they
needed to remain aware that they were conducting a real-world evaluation for
a real client. While the client for this evaluation was the honors program, stu-
dents needed to think through the issue of who, besides the honors director,
the clients were. Through brainstorming in class, they developed a list of cli-
ents that included faculty and staff who were members of the Honors Council
and the associate directors and staff of the honors program. Students were
not viewed as clients at this point in the process because their opinions and
experiences would be captured through the surveys, interviews, and focus
groups. Senior administrators such as the provost were not seen as clients
because they already had substantial input into the organization of program
reviews and the necessary components of the required self-study. Non-hon-
ors students, faculty, and staff were not included because of time constraints
although their absence led to a useful discussion about the limitations of the
evaluation.
Then, working in teams of two, students completed two or three inter-
views of clients, who were asked what they would like to know about the
honors program as well as topics, if any, that should not be included because
the information was already available or because of political issues within the
institution. I compiled the students’ work into a single document and distrib-
uted it to them.
By the beginning of the fifth week of the semester, students completed
a summary of the “questions/topics that interest many of our clients,” the
“questions/topics that interest a client but . . . that we really cannot address
through this class,” and additional topics/questions that they themselves
would like to answer. For each general topic, the students were asked whether
a student, faculty member, staff member, or administrator was “in a position
to answer the question that the client would like answered.” They were also
asked whether it would “be best to gather this information through a sur-
vey that yields quantitative data (‘which category fits you best’) or through
more open-ended qualitative methodologies such as focus groups or qualita-
tive interviews (that yield more extensive text).” Again, the responses from
all students were compiled, and class time focused on finalizing the draft
topic outline along with the methodology to address each topic. The Honors
Council then reviewed the draft outline, and the honors director approved it.

Collaborative Methods Design


Students were then assigned to one of three groups, defined by the quan-
titative, data-gathering methodology of an online survey of honors students,
qualitative interviews with honors students, or focus groups with honors
students and honors faculty. Students met with their group to assign the fol-
lowing tasks with due dates:
• to develop a budget;
• to flesh out a specific topic outline for their data gathering;
• to secure the sample necessary for gathering relevant data;
• to write drafts of invitations to respondents to participate in the proj-
ect; and
• to write a first draft of the questionnaire, focus group guide, or qualita-
tive interview guide.
Their first group project demonstrating that these tasks had been completed
was due by the end of the eighth week of the class, just before our week-long
spring break. Students chose to create Google Docs, making it easy for them
to share their work with one another and for me to comment on it. I worked
closely with each group, helping to ensure approval of the relevant budget from
the honors director and helping to secure the relevant samples from honors
program staff. I commented extensively on their work so that they were in a
good position to make changes when they returned from spring break.
The first tasks after spring break were to execute the changes that I had
proposed. In particular, they needed to finalize a working draft of their ques-
tionnaire, interview guide, or focus group guide; finalize communication
(including informed consent documents) with their respondents/partici-
pants; and secure relevant materials (e.g., recorders and water bottles). Class
time was used to provide updates on the progress of each group and to work
through solutions to dilemmas that arose as students finalized their data-col-
lection plans.
Students then pre-tested and reviewed the work of the two groups to
which they did not belong. By the beginning of the tenth week of the semes-
ter, they completed an assignment that discussed “the strong points of what
is being proposed,” “what should be changed” or “is problematic,” and “what
is missing, given our earlier interviews with our clients and the preferences of
students” enrolled in the class. I shared the responses with the student groups
in short order so that they could complete their second group project by the
end of the tenth week of the semester. This second report was largely con-
firmation that they had made the revisions requested by me and their peers
and that they had completed the work necessary actually to implement their
surveys, interviews, or focus groups, including informed consent documents
and invitations to research participants.

Quantitative and Qualitative Data Collection


Students then had a two-week period to collect their data. The honors
director facilitated this process by writing an email to all honors students tell-
ing them to expect communications from their peers about how they could
help the honors program by completing one or more evaluation activities. The
survey group then sent invitations and subsequent reminders to all honors
students asking them to complete the online survey developed through the
software package Qualtrics. In the end, 380 questionnaires were completed out
of a total of 727 for a fine response rate of 52.3 percent. In addition to demo-
graphics, the questionnaire consisted of questions on topics such as these:
• knowledge of program requirements;
• confidence in completing the requirements;
• perceptions of the meaningfulness of each of the requirements “to
your personal development”;
• ease or difficulty in securing faculty support;
• the difficulties and the meaningfulness of the senior (capstone) proj-
ect and of other honors classes;
• levels of satisfaction with honors resources and advising;
• the extent of belonging to the honors community; and
• issues related to differences, if any, between the experiences of stu-
dents beginning the honors program in their first year of college and
those joining through the Track II admission process.
The qualitative interview group completed fourteen interviews with
honors students, half of whom began the program as first-year students and
half of whom joined the program later in their college careers. The interviews were
recorded and transcribed professionally by the online service Rev. The inter-
view guide asked for a discussion of the ways the honors program had been
“meaningful to you”; the ways, if any, that students felt connected to the hon-
ors community; and the ways that honors experiences were different “from
what you were expecting.” Questions also focused on the introductory course,
the diversity requirement, and the senior (capstone) project.
The group charged with conducting focus groups completed three group
discussions: one with faculty members; one with students admitted to the
honors program as first-year students who had either completed their cap-
stone project or had an approved capstone proposal; and one with students
who were admitted to the program through the Track II process. I facilitated
a fourth focus group during class time with the students enrolled in the course,
the purpose of which was both to collect data and to model good focus group
practice. In the end, nineteen students (including members of the class) and
eleven faculty members participated in the focus group discussions. The
focus group guide for students included many of the questions posed in the
qualitative interviews; however, the guide for students admitted to the hon-
ors program after their first year of college included questions on why they
chose to join the honors program, and the guide for advanced students begin-
ning the program in their first year placed more emphasis on experiences with
the senior project. The faculty focus group guide focused on the positive and
challenging aspects of working with honors students and with the honors
program itself. Faculty were also queried about differences, if any, between
the students admitted to the program for their first year and those admitted
later in their collegiate career.

Data Analysis and Report Writing


The final three weeks of the semester were devoted to data analysis and
report writing by each of the three groups. Students worked with their groups
during the regularly scheduled class time, and I was available to provide feed-
back and support. Students used the software package SPSS to analyze the
survey data and the software package NVivo to help with analysis of the quali-
tative interviews and student focus groups. I wrote the report on the faculty
focus group discussion since it was too much to expect those students who
had fielded focus groups to complete two separate reports.
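For readers who want a sense of the kinds of summaries the survey team produced in SPSS, a rough equivalent in Python with pandas might look like the sketch below; the file name and column names are hypothetical, and the course itself used SPSS rather than Python.

import pandas as pd

# Hypothetical Qualtrics CSV export; a real export would use different column names.
responses = pd.read_csv("honors_survey_export.csv")

# Univariate frequency table for one closed-ended item, mirroring a basic
# SPSS FREQUENCIES run.
print(responses["belonging"].value_counts(normalize=True).round(3))

# Cross-tabulation comparing first-year admits with Track II admits,
# reported as row percentages.
print(pd.crosstab(responses["admission_path"], responses["belonging"],
                  normalize="index").round(3))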

evaluation of the course


There are two ways to evaluate the success of this kind of honors course:
assessing the work that students produced and analyzing student feedback
on the experience. Both the honors director, a client for our work, and I
were impressed with the quantity and quality of the students’ work. At the
final meeting of the class during the week designated for exams, the director
thanked the students for their efforts and noted the utility of their work both
for assessment and program review and for ongoing efforts to improve the
program. I was also pleased with the quality and outcomes of their work. I had
not been convinced at the outset of the course that students would be able
to complete all components of a small-scale evaluation; I was sure that they
would succeed in collecting data, but I was not confident that they would be
able to execute final reports summarizing their findings in the time allowed.
The students succeeded well beyond my expectations.
Students provided feedback on the course in three ways: the university’s
standard end-of-course evaluation instrument, the honors program’s end-of-
course evaluation instrument, and an open-ended discussion with the director
and me during the final meeting of the course. While the students were not
asked directly to comment on their learning in the university instrument,
they were asked to choose one of five Likert scale agree/disagree categories,
including the neutral “neither agree nor disagree” in response to the statement “The
instructor’s teaching helped me learn.” Seven of the ten students reported
“strongly agree” and three selected “agree” for this question, providing a mean
score of 3.7 (with “strongly agree” coded as 4 and “strongly disagree” as 0).
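For readers checking the arithmetic, the reported mean follows directly from the coding just described; the 4.7 mean on the honors program’s instrument, reported below, is computed the same way:

\[
\bar{x} = \frac{(7 \times 4) + (3 \times 3)}{10} = \frac{37}{10} = 3.7
\]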
The honors program’s instrument links directly to its mission and asks
students “To what degree do you feel this honors course offered unique edu-
cational opportunities and experiences compared to a non-honors course?”

Responses were recorded on a 5-point semantic differential scale with 1 equal
to “not at all” and 5 equal to “very much.” Seven students chose the highest
option to record their response while three students chose option 4, resulting
in a mean score of 4.7. Students’ comments following this question provided
useful insight into what students found appealing about the experience. Com-
ments included:
• I like the opportunity to be actively involved in real program evaluation.
• I think the program evaluation opportunity itself is unique, and I really
enjoyed that I was able to both learn and practice different research
methods.
• Having the ability to evaluate the honors program was a very unique
opportunity, and one that I don’t feel other programs or institutions
would offer.
• How lucky I am to be able to lead a focus group session with honors
faculty! An experience most will not get.
The emphasis on active and applied learning experiences in the course
was also reflected in the students’ final class day discussion. I began the dis-
cussion by noting the tension between learning about program review and
doing it. I then asked students what they found to be the most valuable
component of the course. The comments below are paraphrases, rather than
verbatim transcriptions, from their discussion:
• I adore honors. This was my opportunity to help out. Diving in helped
more than the textbook.
• The bigger component was the act of doing; it was very beneficial to me.
• Walking through an entire project—actually executing the project was
valuable.
• The course was very valuable for me; it was practical for me.
• I’m interested to see where this goes—there was beneficial hands-on
learning. I could see my skills improving.
• This was an interesting class to take—the background and doing and
analyzing.
Some students also directly noted the benefits of learning more about social
science research methodology:

• I gained insight into the methods and paradigms in social science. It
was cool to learn new things.
• I learned more about honors. This changed my ideas about research.

discussion
Two points are clear: within a single semester, honors students can have
valuable learning experiences while engaging in meaningful data collection
and analysis; and such data can prove useful to honors programs as they seek
to assess their programs and make improvements. Involving students directly
in some kinds of assessment-related data collection can also have method-
ological advantages. Honors students whose experiences are being assessed
might be more willing fully to share their views (the negative as well as the
positive) with fellow honors students than with honors faculty or staff. Simi-
larly, honors students might be especially aware of the ways that experiences
outside of the classroom, for example in the residence halls, impact the hon-
ors learning experience and, therefore, might be able to craft even quantitative
survey questions to address such issues.
Meanwhile, some cautionary notes are appropriate as well. First, class
size and the composition of the class matter. It would be difficult to execute
a multi-modal data collection plan with fewer than ten students and logisti-
cally challenging with more than eighteen. Teamwork and feedback to the
teams were essential. Too few students would make multiple successful teams
impossible, and too many students would hinder the instructor from provid-
ing timely and useful feedback. It also would be beneficial if all students in the
course had completed some kind of statistics or research methodology course
prior to enrolling although the diversity of student backgrounds and fields of
study was advantageous when assigning students to take the lead on specific
tasks, e.g., statistical analysis as opposed to report writing.
Second, this kind of course requires a substantial time commitment from
the instructor to accomplish essential tasks: ensuring the necessary on-time
feedback to students; trouble-shooting and assisting students with navigating
the university bureaucracy, e.g., securing the sample; processing the gift cards
used as incentives/thanks to the interview participants; and organizing the
class so that both content instruction and application can occur within the
confines of a single semester. Students recognized the importance of these
tasks, with all strongly agreeing that “the instructor was accessible to stu-
dents” (mean score = 4.0 of a possible 4.0) and nine of ten strongly agreeing
that “the instructor seemed well prepared” (mean score = 3.9).

Third, given the press of completing data collection, analysis, and report
writing, I had to abandon my initial plan to administer a content exam based
on the readings and first weeks’ class discussion. Consequently, I cannot be
certain that all students mastered some basic methodological content and
skills; such skills might include calculating the margin of error from a proba-
bility sample of a specific size or articulating the conditions when “matching”
the characteristics of an interviewer and research participant is or is not desir-
able when collecting qualitative data. Another issue is the tension between
“covering” content and applying it although requiring a statistics prerequisite,
for instance, might alleviate this tension. Although a full content exam com-
pleted by students during a class period or at home would be ideal, instructors
with time limitations might consider administering a short pretest on the first
day of class followed by a short post-test later in the semester to gauge content
learning.
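As an illustration of the margin-of-error skill mentioned above, a minimal sketch follows; it uses the completed-response and population counts reported earlier for the survey, and the function name, the conservative assumption p = 0.5, and the 95 percent confidence level are illustrative choices rather than material from the course.

import math

def margin_of_error(n, N=None, p=0.5, z=1.96):
    """Approximate margin of error for a sample proportion.

    n: completed responses; N: population size (enables the finite
    population correction); p: assumed proportion (0.5 is the most
    conservative choice); z: critical value (1.96 for ~95% confidence).
    """
    se = math.sqrt(p * (1 - p) / n)
    if N is not None:
        se *= math.sqrt((N - n) / (N - 1))  # finite population correction
    return z * se

# With 380 completed surveys from a population of 727 honors students,
# the margin of error is roughly +/- 3.5 percentage points.
print(round(margin_of_error(380, N=727) * 100, 1))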
More generally, the data collection activities in which students were
engaged provided more indirect than direct measures of student learning.
Honors student survey respondents and the participants in qualitative inter-
views and focus groups self-reported on ways the honors program provided
meaningful learning experiences. They reflected on the extent to which the
honors program was meeting its goals and on the ways the program could be
improved. Other data collection efforts are necessary and underway to eval-
uate the quality of, for example, senior projects. Nevertheless, the research
reports students provided to their client could be independently evaluated by
faculty for direct assessment purposes.
The constraints outlined above are not insurmountable, and other
honors programs and their students might benefit from designing a similar
honors course. We hope to offer the course again although we will work with
a client other than the honors program. Using this model, other programs
in which honors students participate, e.g., study abroad, can gain assistance
with their evaluation and assessment efforts while enhancing the learning of
honors students.

conclusions
Honors programs are under pressure from numerous stakeholders to
collect data on student learning. Honors faculty and staff are committed to
improving the honors experience. Both of these goals can be accomplished
by undergraduate honors students, who can successfully collect and ana-
lyze quantitative and qualitative data from their peers within the context of
a semester-long course. This type of hands-on learning and the execution of
a real, applied program evaluation project provided honors students at CMU
with a range of experiences that they could not receive in non-honors courses.
While not eliminating the criticism of assessment that exists in the literature
and that is voiced on many campuses, an assessment project that enhances
students’ experiences and saves valuable faculty and staff time is worthwhile
on its own terms. Many features of the course outlined here could be rep-
licated on other campuses, benefiting both the honors program and, most
importantly, its students.

references
Achterberg, Cheryl. 2006. “Honors Assessment and Evaluation.” Journal of
the National Collegiate Honors Council 7(1):37–39.
Berg, Bruce L. 2009. “Focus Group Interviewing.” Pp. 158–89 in Qualitative
Research Methods for the Social Sciences, 7th edition. Boston: Allyn and
Bacon.
Carnicom, Scott, and Christopher A. Snyder. 2010. “Learning Outcomes
Assessment in Honors: An Appropriate Practice.” Journal of the National
Collegiate Honors Council 11(1):69–82.
Digby, Joan. 2006. “They Graduated.” Journal of the National Collegiate Honors
Council 7(1):57–59.
Driscoll, Marsha B. 2011. “National Survey of College and University Honors
Programs and Assessment Protocols.” Journal of the National Collegiate
Honors Council 12(1):89–103.
Esterberg, Kristin G. 2002. “Interviews.” Pp. 83–114 in Qualitative Methods in
Social Research. Boston: McGraw-Hill.
Eubanks, David. 2018. “Addressing the Assessment Paradox.” Washington,
DC: Association of American Colleges and Universities. Retrieved 15 July
2019 <https://www.aacu.org/peerreview/2018/Fall/RealityCheck>.
Ewell, Peter, Karen Paulson, and Jillian Kinzie. 2011. “Down and In: Assessment
Practices at the Program Level.” National Institute for Learning Out-
comes Assessment. June. Retrieved 15 July 2019
<http://www.learningoutcomesassessment.org/documents/NILOAsurveyreport2011.pdf>.
Freyman, Jay. 2006. “When It’s Bad Cess to Assess!” Journal of the National
Collegiate Honors Council 7(1):41–42.

Frost, Linda. 2006. “Saving Honors in the Age of Standardization.” Journal of
the National Collegiate Honors Council 7(1):21–25.
Gilbert, Erik. 2016. “Why Assessment Is a Waste of Time.” Inside Higher
Education. November 21. Retrieved 15 July 2019
<https://www.insidehighered.com/views/2016/11/21/how-assessment-falls-significantly-short-valid-research-essay>.
Honors Program. 2019. “About Us.” Mount Pleasant, MI: Central Michigan
University. Retrieved 9 July 2019
<https://www.cmich.edu/office_provost/AcademicAffairs/Honors/Pages/About-Us.aspx>.
Jones, Beata M., and Catherine M. Wehlburg. 2014. “Learning Outcomes
Assessment Misunderstood: Glass Half-Empty or Half-Full.” Journal of
the National Collegiate Honors Council 15(2):15–21.
Kuh, George D. 2008. “High-Impact Educational Practices.” Washington,
DC: American Association of Colleges and Universities. Retrieved 9 July
2019 <https://www.aacu.org/leap/hips>.
Lanier, Gregory. 2008. “Towards Reliable Honors Assessment.” Journal of the
National Collegiate Honors Council 9(1):81–149.
Lederman, Doug. 2019. “Advocates for Student Learning Assessment Say
It’s Time for a Different Approach.” Inside Higher Education. April 17.
Retrieved 13 June 2019
<https://www.insidehighered.com/news/2019/04/17/advocates-student-learning-assessment-say-its-time-different-approach>.
Mallin, Irwin. 2017. “Forum: The Lecture and Student Learning. Lecture
and Active Learning as a Dialectical Tension.” Communication Education
66(2):242–43.
Mariz, George. 2006. “Accountable to Whom? Assessment for What?” Jour-
nal of the National Collegiate Honors Council 7(1):43–45.
Monahan, Nicki. 2015. “More Content Doesn’t Equal More Learning.” Faculty
Focus. October 12. Madison, WI: Magna Publications. Retrieved 16 July
2019 <https://www.facultyfocus.com/articles/curriculum-development/more-content-doesnt-equal-more-learning>.
Otero, Rosalie, and Robert Spurrier. 2005. Assessing and Evaluating Honors
Programs and Honors Colleges: A Practical Handbook. National Collegiate
Honors Council Monograph Series, Lincoln, NE: National Collegiate
Honors Council.

Robson, Colin. 2017. Small-Scale Evaluation: Principles and Practice, 2nd edi-
tion. Thousand Oaks, CA: Sage.
Senter, Mary Scheuer. 2001. “Academic Outcomes Assessment: Fads, Fallacies,
and Footsteps.” Pp. 14–23 in Charles F. Hohm and William S. Johnson,
Assessing Student Learning in Sociology, 2nd edition. Washington, DC:
American Sociological Association.
Smith, Patricia Joanne. 2015. “A Quality Instrument for Effective Honors Pro-
gram Review.” Honors in Practice 11:53–91.
Snyder, Christopher A., and Scott Carnicom. 2011. “Assessment, Account-
ability, and Honors Education.” Journal of the National Collegiate Honors
Council 12(1):111–27.
Spalter-Roth, Roberta, Michael Kisielewski, and Nicole Van Vooren. 2013. The
Victory of Assessment? What’s Happening in Your Department? The AY
2011–12 Department Survey. June. Washington, DC: The American Socio-
logical Association. <https://www.asanet.org/sites/default/files/files/pdf/2012_asa_dept_survey_two.pdf>.
Sue, Valerie M., and Lois A. Ritter. 2012. Conducting Online Surveys, 2nd edition.
Thousand Oaks, CA: Sage.
Wilson, Steffen. 2006. “Using Learning Outcomes Assessment in Honors as a
Defense Against Proposed Standardized Testing.” Journal of the National
Collegiate Honors Council 7(1):27–31.
Wilson, Steffen Pope, and Rose M. Perrine. 2005. “We Know They are Smart,
but Have They Learned Anything?: Strategies for Assessing Learning in
Honors.” Honors in Practice 1:27–37.
Worthen, Molly. 2018. “The Misguided Drive to Measure ‘Learning Outcomes.’ ”
The New York Times. February 23. Retrieved 13 June 2019
<https://www.nytimes.com/2018/02/23/opinion/sunday/colleges-measure-learning-outcomes.html>.
__________________________________________________________

The author may be contacted at [email protected].
