A Meaningful and Useful Twofer:
Enhancing Honors Students’ Research
Experiences While Gathering Assessment Data
Mary Scheuer Senter
Central Michigan University
Abstract: Engaging students in assessment practice benefits honors students, fac-
ulty, and administrators. Students gain meaningful research experience while honors
programs receive data to help assess student learning and prepare for program
review. A one-semester course, Program Evaluation Experiences, tasks students (n
= 10) with collecting and analyzing data from peers and faculty and then articulat-
ing its value for their personal academic development. Qualitative and quantitative
instruments and measures include an online survey (Qualtrics), personal interviews
(Rev), and focus groups (Rev, n = 30). Students complete various analyses of data
using SPSS and NVivo. Results indicate that students’ active participation in applied
research methods for program assessment benefits both student and program and,
because it is anchored in student experience, helps to reveal data that might otherwise
remain unexpressed. The author asserts that this type of hands-on learning provides
honors students with a wide range of practical experience not offered in non-honors
curricula. A short history of program assessment in honors is provided.
Keywords: student engagement; high-impact practices; program evaluation; effec-
tive teaching; Collaborative Institutional Training Initiative
Honors programs and their faculty must devote time and attention to the
assessment of student learning despite strong reservations about the
value of these efforts given the time and methodology involved (Carnicom
and Snyder 2010; Digby 2006; Freyman 2006; Mariz 2006; Otero and Spur-
rier 2005). Honors students can, however, be actively involved in collecting
and analyzing data that the honors program can use to document student
undergraduate student body. Most students (85%) begin the honors program
as first-year students, although some enter the program as transfer
students or after completing their first year at CMU through the honors Track
II admission process. The honors program, like all academic programs at the
university, is required to submit assessment reports each fall, summarizing
the assessment data collected in the previous year and outlining any improve-
ments in the program suggested by the data. Every seven years, all programs
go through an academic program review process that requires the creation of
a detailed self-study, including assessment findings and a SWOT (strengths,
weaknesses, opportunities, and threats) analysis. The self-study, along with a
report from an external reviewer, is submitted for commentary to the relevant
dean or vice provost and, in the end, to the provost.
The honors program director is a senior faculty member with reassigned
time to administer the program. He has extensive experience working with
honors students and conducting research on the experiences of young adults.
The program reports jointly to the Honors Council, a faculty/student/staff
committee of the Academic Senate, and to the Senior Vice Provost for Aca-
demic Affairs.
To graduate with honors, students must complete fifteen hours of honors
coursework in addition to an introductory course, first-year seminar, senior
project, writing course, and other cultural diversity and service requirements.
The fifteen hours of honors coursework can consist of special sections for
honors students offered by departments, such as an honors section of Foun-
dations of Cell Biology offered by the biology department or Women and
Politics offered by the political science and public administration department.
Alternatively, students can complete special topics courses offered by the hon-
ors program. Faculty throughout the university, myself included, can propose
these special topics courses and are encouraged to develop courses that would
not typically be offered through one academic department. Courses that use
high-impact learning practices and include experiential learning activities are
most likely to be selected for inclusion in the honors course schedule.
As a sociology faculty member, I usually teach courses in social inequality
and research methods required for sociology majors. In spring 2019, I had the
opportunity to teach Program Evaluation Experiences, the course discussed
here, which was one of four such special topics available in honors. Students
were recruited to the class, which counts as three of the required fifteen hours,
with a description stressing that program evaluation is “a specialized form of
research that is designed to answer questions” and that it allows practitioners
to evaluate whether “the program you run now, or want to run someday, is
really doing what it is supposed to.” The description stressed that students
would be actively engaged in all components of program evaluation “from
interviews with key stakeholders to a final presentation of results” and that
students would be “given the opportunity to help the honors program address
a wide array of questions posed by the Honors Council, honors office, and of
course—students themselves.” Students were assured that “the results from
evaluation activities [would] also be utilized in a more formal program review
targeted for completion next year with the goal of improving our program.”
The objectives of the course dovetail well with the CMU Honors Mis-
sion Statement (Honors 2019), which commits the program to “providing
high academic ability students with unique educational opportunities and
experiences” and to challenging “students to aim higher and to achieve more
academically, personally, and professionally for the greater good of our dis-
ciplines, our society, and our world.” No honors course focused on program
evaluation had been offered in the past, making this course unique. In addi-
tion, no class had afforded students the opportunity to assist the honors
program by being actively involved in gathering and analyzing data for pro-
gram review or assessment, allowing them to work for the betterment of the
program itself.
Ten students enrolled in the course. They ranged from sophomores to
seniors, with eight of the ten students majoring in sociology, psychology, or
political science, one student majoring in personal financial planning, and
one in philosophy.
The course met in a seminar room twice a week for the sixteen-week
semester, with each class period lasting seventy-five minutes. A computer lab
was available for some class periods, making it possible for students to learn
appropriate software (SPSS for quantitative analysis and NVivo for qualita-
tive analysis) and to work on their final papers. The only constraint on data
gathering established by the honors director and me prior to the beginning of
the class was that students would conduct a quantitative survey, qualitative
interviews, and one or more focus groups.
to help them learn “to work and solve problems in the company of others.”
Further, they completed real-world “undergraduate research,” with the goal
of involving them “with actively contested questions, empirical observa-
tion, cutting-edge technologies, and the sense of excitement that comes from
working to answer important questions.” Finally, their activities can be con-
ceptualized as a kind of community-based learning if one defines the honors
program as one of these students’ relevant communities: students had the
opportunity “to both apply what they [were] learning in real-world settings
and reflect in a classroom setting on their service experiences” (Kuh 2008).
Given the diverse backgrounds of enrolled students, all students needed
a basic background in social science research and, in particular, in the ways
that program evaluation—with its applied, real-world focus—differs from
traditional academic research. Students were assigned a short textbook that
emphasized “small-scale evaluation” (Robson 2017), a primer on conducting
online surveys (Sue and Ritter 2012), and a selection of articles on qualitative
interviewing (Esterberg 2002), focus groups (Berg 2009), and the honors
program itself.
The course began by laying the groundwork for data collection while
students worked concurrently to develop the outline of topics to guide their
program evaluation. They then worked collaboratively in teams to develop
the specifics of their research designs. The last sections of the course focused
on data collection, followed by data analysis and report writing.
institution. I compiled the students’ work into a single document and distrib-
uted it to them.
By the beginning of the fifth week of the semester, students had completed
a summary of the “questions/topics that interest many of our clients,” the
“questions/topics that interest a client but . . . that we really cannot address
through this class,” and additional topics/questions that they themselves
would like to answer. For each general topic, the students were asked whether
a student, faculty member, staff member, or administrator was “in a position
to answer the question that the client would like answered.” They were also
asked whether it would “be best to gather this information through a sur-
vey that yields quantitative data (‘which category fits you best’) or through
more open-ended qualitative methodologies such as focus groups or qualita-
tive interviews (that yield more extensive text).” Again, the responses from
all students were compiled, and class time focused on finalizing the draft
topic outline along with the methodology to address each topic. The Honors
Council then reviewed the draft outline, and the honors director approved it.
Discussion
Two points are clear: within a single semester, honors students can have
valuable learning experiences while engaging in meaningful data collection
and analysis; and such data can prove useful to honors programs as they seek
to assess their programs and make improvements. Involving students directly
in some kinds of assessment-related data collection can also have method-
ological advantages. Honors students whose experiences are being assessed
might be more willing to share their views fully (the negative as well as the
positive) with fellow honors students than with honors faculty or staff. Simi-
larly, honors students might be especially aware of the ways that experiences
outside of the classroom, for example in the residence halls, impact the hon-
ors learning experience and, therefore, might be able to craft even quantitative
survey questions to address such issues.
Meanwhile, some cautionary notes are appropriate as well. First, class
size and the composition of the class matter. It would be difficult to execute
a multi-modal data collection plan with fewer than ten students and logisti-
cally challenging with more than eighteen. Teamwork and feedback to the
teams were essential. Too few students would make multiple successful teams
impossible, and too many students would hinder the instructor from provid-
ing timely and useful feedback. It also would be beneficial if all students in the
course had completed some kind of statistics or research methodology course
prior to enrolling, although the diversity of student backgrounds and fields of
study was advantageous when assigning students to take the lead on specific
tasks, e.g., statistical analysis as opposed to report writing.
Second, this kind of course requires a substantial time commitment from
the instructor to accomplish essential tasks: ensuring the necessary on-time
feedback to students; trouble-shooting and assisting students with navigating
the university bureaucracy, e.g., securing the sample; processing the gift cards
used as incentives/thanks to the interview participants; and organizing the
class so that both content instruction and application can occur within the
confines of a single semester. Students recognized the importance of these
tasks, with all strongly agreeing that “the instructor was accessible to stu-
dents” (mean score = 4.0 of a possible 4.0) and nine of ten strongly agreeing
that “the instructor seemed well prepared” (mean score = 3.9).
Third, given the press of completing data collection, analysis, and report
writing, I had to abandon my initial plan to administer a content exam based
on the readings and first weeks’ class discussion. Consequently, I cannot be
certain that all students mastered some basic methodological content and
skills; such skills might include calculating the margin of error from a proba-
bility sample of a specific size or articulating the conditions when “matching”
the characteristics of an interviewer and research participant is or is not desir-
able when collecting qualitative data. Another issue is the tension between
“covering” content and applying it, although requiring a statistics prerequisite,
for instance, might alleviate this tension. Although a full content exam com-
pleted by students during a class period or at home would be ideal, instructors
with time limitations might consider administering a short pretest on the first
day of class followed by a short post-test later in the semester to gauge content
learning.
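As an illustration of the first of those skills, a worked margin-of-error calculation such as the following could anchor a pretest item; the confidence level, proportion, and sample size here are illustrative assumptions rather than figures from the course:
\[
ME = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{(.5)(.5)}{100}} \approx .098,
\]
that is, roughly plus or minus ten percentage points for a simple random sample of 100 respondents at the 95 percent confidence level.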
More generally, the data collection activities in which students were
engaged provided more indirect than direct measures of student learning.
Honors student survey respondents and the participants in qualitative inter-
views and focus groups self-reported on ways the honors program provided
meaningful learning experiences. They reflected on the extent to which the
honors program was meeting its goals and on the ways the program could be
improved. Other data collection efforts are necessary and underway to eval-
uate the quality of, for example, senior projects. Nevertheless, the research
reports students provided to their client could be independently evaluated by
faculty for direct assessment purposes.
The constraints outlined above are not insurmountable, and other
honors programs and their students might benefit from designing a similar
honors course. We hope to offer the course again, although we will work with
a client other than the honors program. Using this model, other programs
in which honors students participate, e.g., study abroad, can gain assistance
with their evaluation and assessment efforts while enhancing the learning of
honors students.
Conclusions
Honors programs are under pressure from numerous stakeholders to
collect data on student learning. Honors faculty and staff are committed to
improving the honors experience. Both of these goals can be accomplished
by undergraduate honors students, who can successfully collect and ana-
lyze quantitative and qualitative data from their peers within the context of
References
Achterberg, Cheryl. 2006. “Honors Assessment and Evaluation.” Journal of
the National Collegiate Honors Council 7:37–39.
Berg, Bruce L. 2009. “Focus Group Interviewing.” Pp. 158–89 in Qualitative
Research Methods for the Social Sciences, 7th edition. Boston: Allyn and
Bacon.
Carnicom, Scott, and Christopher A. Snyder. 2010. “Learning Outcomes
Assessment in Honors: An Appropriate Practice.” Journal of the National
Collegiate Honors Council 11(1):69–82.
Digby, Joan. 2006. “They Graduated.” Journal of the National Collegiate Honors
Council 7(1):57–59.
Driscoll, Marsha B. 2011. “National Survey of College and University Honors
Programs and Assessment Protocols.” Journal of the National Collegiate
Honors Council 12(1):89–103.
Esterberg, Kristin G. 2002. “Interviews.” Pp. 83–114 in Qualitative Methods in
Social Research. Boston: McGraw-Hill.
Eubanks, David. 2018. “Addressing the Assessment Paradox.” Washington,
DC: Association of American Colleges and Universities. Retrieved 15 July
2019 <https://www.aacu.org/peerreview/2018/Fall/RealityCheck>.
Ewell, Peter, Karen Paulson, and Jillian Kinzie. 2011. “Down and In: Assessment
Practices at the Program Level.” National Institute for Learning Out-
comes Assessment. June. Retrieved 15 July 2019 <http://www.learningoutcomesassessment.org/documents/NILOAsurveyreport2011.pdf>.
Freyman, Jay. 2006. “When It’s Bad Cess to Assess!” Journal of the National
Collegiate Honors Council 7(1):41–42.
Robson, Colin. 2017. Small-Scale Evaluation: Principles and Practice, 2nd edi-
tion. Thousand Oaks, CA: Sage.
Senter, Mary Scheuer. 2001. “Academic Outcomes Assessment: Fads, Fallacies,
and Footsteps.” Pp. 14–23 in Charles F. Hohm and William S. Johnson,
Assessing Student Learning in Sociology, 2nd edition. Washington, DC:
American Sociological Association.
Smith, Patricia Joanne. 2015. “A Quality Instrument for Effective Honors Pro-
gram Review.” Honors in Practice 11:53–91.
Snyder, Christopher A., and Scott Carnicom. 2011. “Assessment, Account-
ability, and Honors Education.” Journal of the National Collegiate Honors
Council 12(1):111–27.
Spalter-Roth, Roberta, Michael Kisielewski, and Nicole Van Vooren. 2013. The
Victory of Assessment? What’s Happening in Your Department? The AY
2011–12 Department Survey. June. Washington, DC: The American Socio-
logical Association. <https://www.asanet.org/sites/default/files/files/pdf/2012_asa_dept_survey_two.pdf>.
Sue, Valerie M., and Lois A. Ritter. 2012. Conducting Online Surveys, 2nd edition.
Thousand Oaks, CA: Sage.
Wilson, Steffen. 2006. “Using Learning Outcomes Assessment in Honors as a
Defense Against Proposed Standardized Testing.” Journal of the National
Collegiate Honors Council 7(1):27–31.
Wilson, Steffen Pope, and Rose M. Perrine. 2005. “We Know They are Smart,
but Have They Learned Anything?: Strategies for Assessing Learning in
Honors.” Honors in Practice 1:27–37.
Worthen, Molly. 2018. “The Misguided Drive to Measure ‘Learning Outcomes.’ ”
The New York Times. February 23. Retrieved 13 June 2019 <https://www.nytimes.com/2018/02/23/opinion/sunday/colleges-measure-learning-outcomes.html>.