PROGRAM-Based Review and Assessment
Tools and Techniques for Program Improvement
Office of Academic Planning and Assessment
Contributing Authors: Martha L. A. Stassen, Director of Assessment; Kathryn Doherty and Mya Poe, Research Associates
Publication supported in part through a grant from the President's Reserve, University of Massachusetts
This handbook is one of two campus publications designed by the Office of Academic Planning and Assessment (OAPA) to guide the practitioner through the steps of student learning assessment. COURSE-Based Review and Assessment: Methods for Understanding Student Learning offers strategies for assessing student learning at the course level and is particularly useful to instructors developing assessment strategies for their courses. The companion publication, PROGRAM-Based Review and Assessment: Tools and Techniques for Program Improvement, focuses on assessment at the department or program level and is particularly useful to department or program chairs, as well as others interested in program assessment, as a guide to program review and improvement. Both publications are available through OAPA.
The contributing authors are grateful to the many UMass colleagues who provided suggestions on earlier versions of this handbook. We'd also like to acknowledge the contributions of colleagues at other institutions of higher education whose work is referenced throughout this handbook.
Contents
Principles of Good Practice for Assessing Student Learning . . . 2
How to Use this Handbook . . . 3
Chapter 1  Getting Started: What is Program-Based Assessment? . . . 5
Chapter 2  Defining Goals and Objectives . . . 9
  Appendix 2-A  Goal Definition Worksheet . . . 16
  Appendix 2-B  Objectives Worksheet . . . 17
Chapter 3  Designing the Assessment Program . . . 19
  Appendix 3-A  Sample Assessment Plans . . . 25
Chapter 4  Assessment Strategies and Methods . . . 29
  Appendix 4-A  Glossary of Assessment Methods . . . 37
Chapter 5  Analyzing, Reporting, and Using Results . . . 49
  Appendix 5-A  Questions to Guide Assessment Reporting . . . 57
Sources and Resources . . . 59
Bibliography . . . 60
Publication supported in part through a grant from the President's Reserve.
Design: MBDesign, [email protected]
Fall 2001
Chapter 1
Getting Started: What is Program Assessment?

The purpose of this chapter
This chapter will help you think about assessment in terms of how it can benefit you and other members of your department or program. Assessment is about improvement, and program assessment can help you focus on improving student learning in your classes and in the major.

At A Glance
- What is assessment?
- Why assess?
- What is program assessment?
- What are the steps to effective program assessment?
- Are other research universities using program assessment?
What is Assessment?
The word assessment has taken on a variety of meanings within higher education. The term can refer to the process faculty use to grade student course assignments, to standardized testing imposed on institutions as part of increased pressure for external accountability, or to any activity designed to collect information on the success of a program, course, or University curriculum. These varied uses have, unfortunately, moved us away from a focus on the central role that assessment should play in educational institutions: the gathering of information to improve institutional practices. Therefore, for the purposes of this handbook:
Assessment is the systematic collection and analysis of information to improve student learning.
Defined in this manner, assessment asks you to think about the following questions:
- What should students be learning, and in what ways should they be growing?
- What are students actually learning, and in what ways are they actually growing?
- What should you be doing to facilitate student learning and growth?
Faculty members can:
- rely less on the comments that appear on student evaluations as indicators of success in teaching
- engage in more productive conversations about the status of student achievement and make better decisions about how it might be improved
- make reliable decisions about innovations or experimental projects in instruction and share successes more easily
- become the primary decision-makers in regard to setting learning goals, identifying processes for assessing them, determining whether they have been reached, and recommending future directions
adapted from the University of Nebraska-Lincoln Teaching and Learning Center, Teaching at UNL, Vol. 21, No. 2 (Oct. 1999).
Program assessment focuses on assessing student learning and experience to determine whether students have acquired the skills, knowledge, and competencies associated with their program of study.
Effective program assessment helps you answer three questions:
1. What are you trying to do?
2. How well are you doing it?
3. How (using the answers to 1 and 2) can you improve?
Assessment activities can also serve external needs by providing data to create a body of evidence for external accreditation and for the on-campus AQAD (Academic Quality Assessment and Development) program review requirements, and to support your assertions about your department's successes and strengths.
Chapter 2
Defining Goals and Objectives
At A Glance
Goals describe broad learning outcomes and concepts (what you want students to learn), expressed in general terms (e.g., clear communication, problem-solving skills).

Objectives are the specific skills, values, and attitudes students should exhibit that reflect the broader goals (e.g., for students in a freshman writing course, this might be "students are able to develop a cogent argument to support a position").
Program Goals
Developing program goals: where do you start?
Developing agreed-upon, program-specific student learning goals is not always a quick and easy task. Departments vary in the extent to which their faculty share a common disciplinary framework or epistemology. In departments that encompass many subfields, specialties, or perspectives, identifying agreed-upon goals may be more difficult than in departments with a unified approach to the discipline. Before actually writing or revising departmental goals and objectives, try some of the following activities:
Have open discussions with department faculty on one of the following topics (or similar topics):
- Describe the ideal student in your program at various phases throughout your program. Be concrete and focus on those strengths, skills, and values that you feel are the result of, or at least supported and nurtured by, the program experience. Then ask: What does this student know? What can this student do? What does this student care about?
- List and briefly describe the program experiences that contribute most to the development of the ideal student.
- List the achievements you implicitly expect of graduates in each major field.
- Describe your alumni in terms of such achievements as career accomplishments, lifestyles, citizenship activities, and aesthetic and intellectual involvement.
Review and react to goals and objectives from a similar but external unit. Try grouping the statements into broad categories of student outcomes (e.g., knowledge, attitudes, behavior).
Use the "25 percent problem" to refine or reduce a set of goal statements. Imagine that you want to reduce program or course material by 25 percent. Which goals would you keep, and which would you discard?
Administer a goals inventory or conduct an interview study. Involve a variety of groups (or stakeholders) when possible.
Use a Delphi technique or a modification. Choose an impartial facilitator to mediate a panel discussion about possible program goals. In a brainstorming session, ask each panel member to build a list of criteria that he or she thinks is important for program goals. For each criterion, have each member anonymously rank it as 1 (very important), 2 (somewhat important), or 3 (not important). Place the criteria in rank order and show the (anonymous) results to the panel. Discuss possible reasons for items with high standard deviations. Repeat the ranking process among the panelists until the panel reaches consensus. The objective is to reach consensus before writing goals and objectives. (Additional information about the Delphi technique is available in Chapter 4; a small sketch of the ranking arithmetic follows this list.)
adapted from the Ball State University, Assessment Workbook (1999).
Collect and review instructional materials. Try sorting materials by the type of learning each one is designed to promote: recognition/recall, comprehension/simple application, critical thinking/problem-solving. Use any of the following:
- Syllabi and course outlines
- Course assignments and tests
- Textbooks (especially the tables of contents, introductions, and summaries)
Collect and review documents that describe your department and its programs:
- Brochures and catalogue descriptions
- Accreditation reports
- Curriculum committee reports
- Mission statements
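To make the Delphi ranking step concrete, here is a minimal sketch in Python of the arithmetic behind one ranking round; the criteria, scores, and the 0.75 disagreement cutoff are illustrative assumptions, not part of the handbook's method.

    # Minimal sketch: compile one round of anonymous Delphi rankings.
    # Criteria names, scores, and the cutoff are hypothetical examples.
    from statistics import mean, stdev

    # Each criterion maps to the panelists' anonymous ranks:
    # 1 = very important, 2 = somewhat important, 3 = not important.
    rankings = {
        "clear written communication": [1, 1, 2, 1],
        "quantitative reasoning":      [1, 2, 3, 1],
        "knowledge of subfields":      [2, 3, 1, 3],
    }

    CUTOFF = 0.75  # illustrative threshold for a "high" standard deviation

    # Place criteria in rank order and flag items the panel should
    # discuss and re-rank in the next round.
    for name, ranks in sorted(rankings.items(), key=lambda kv: mean(kv[1])):
        spread = stdev(ranks)
        note = "  <- discuss and re-rank" if spread > CUTOFF else ""
        print(f"{name}: mean rank {mean(ranks):.2f}, sd {spread:.2f}{note}")

Repeating this over successive rounds, and stopping when no criterion is flagged, mirrors the consensus loop described above.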
noteworthy
The worksheet in Appendix 2-A at the end of this chapter may aid you in developing program goals.
It is generally a good idea to identify between three and five instructional goals for your program. However, if you and other members of your department can agree on only one goal, don't let this stall your progress. Focus on that one goal; more will come later.
Program Objectives
Developing program objectives: where do you start?
Program objectives transform the general program goals you developed above into specific student performance and behaviors that demonstrate student learning and skill development along these goals. Before drafting objectives, it might be helpful to consider these three questions, which focus on objectives in slightly different ways:
- For each of your stated goals, what are the specific student behaviors, skills, or abilities that would tell you this goal is being achieved?
- Ideally and briefly, what would a skeptic need (evidence, behavior, etc.) in order to see that your students are achieving the major goals you have set out for them?
- In your experience, what evidence tells you when students have met these goals? How do you know when they're getting it?
Types of Objectives
There are three types of learning objectives, which reflect different aspects of student learning:
- Cognitive objectives: What do you want your graduates to know?
- Affective objectives: What do you want your graduates to think or care about?
- Behavioral objectives: What do you want your graduates to be able to do?
Levels of Objectives
Objectives can also reflect different levels of learning.

Mastery objectives reflect minimum competencies within the discipline: those skills that must be mastered by the student before moving on to the next level of instruction. Mastery objectives tend to be very specific and limited in scope and, therefore, can often be articulated with great specificity (Palomba et al., 1999).
For example, all accounting students should be able to:
- balance a financial statement
- prepare an Excel spreadsheet
- track accounts receivable
Developmental objectives reflect more complex (or higher-order) learning outcomes: those learning tasks on which students can be expected to demonstrate varying degrees of progress. Note that these developmental objectives are often written in a two-stage process in which the general objective is stated along with a sample of specific learning outcomes.
For example, accounting students might also be expected to:
- Understand Generally Accepted Accounting Practices (GAAP):
  - Explain GAAP in layman's terms
  - Name one or two of the practices
  - Discuss the difference between accepted and non-standard practices
  - Give an example of when to use and reference GAAP
In this case, the objective for the student is to understand GAAP. While some students may demonstrate all four of the learning outcomes associated with this objective, some may only demonstrate three, and some only one.
Program objectives should be accepted and supported by members of the department. Developing appropriate and useful objectives is an iterative process; it's not unusual to go back a number of times to refine objectives. In most cases, it is only when you try to develop assessment techniques for program objectives that the need for refining those objectives becomes apparent.
noteworthy
Bloom's Taxonomy (1964) is a well-known description of levels of educational objectives. It may be useful to consider this taxonomy when defining your objectives. You can also use the Objectives Worksheet in Appendix 2-B at the end of this chapter to help you match your goals to specific objectives.
Cognitive behaviors include:
- Knowledge: to know specific facts, terms, concepts, principles, or theories
- Comprehension: to understand, interpret, compare and contrast, explain
- Application: to apply knowledge to new situations, to solve problems
- Analysis: to identify the organizational structure of something; to identify parts, relationships, and organizing principles
- Synthesis: to create something; to integrate ideas into a solution; to propose an action plan; to formulate a new classification scheme
- Evaluation: to judge the quality of something based on its adequacy, value, logic, or use
WORD POWER
Concrete verbs such as "define," "argue," or "create" are more helpful for assessment than vague verbs such as "know" or "understand," or passive constructions such as "be exposed to." Some examples of action words frequently used in objectives are included in the table below.
Knowledge: define, identify, indicate, know, label, list, memorize, name, recall, record, relate, repeat, select, underline
Comprehension: classify, describe, discuss, explain, express, identify, locate, paraphrase, recognize, report, restate, review, suggest, summarize, tell, translate
Application: apply, compute, construct, demonstrate, dramatize, employ, give examples, illustrate, interpret, investigate, operate, organize, practice, predict, schedule, shop, sketch, translate, use
Analysis: analyze, appraise, calculate, categorize, compare, contrast, criticize, debate, determine, diagram, differentiate, distinguish, examine, experiment, inspect, inventory, question, relate, solve
Synthesis: arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, perform, plan, prepare, produce, propose, set up
Evaluation: appraise, assess, choose, compare, contrast, decide, estimate, evaluate, grade, judge, measure, rate, revise, score, select, value
adapted from California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Social Sciences
Program Goal: Students who major in one of the social sciences will learn that they have responsibilities to themselves, their families, peer groups, communities, and society.
Objectives:
- Students can identify the role that cultural diversity plays in defining what it means to be a social being. - Students can identify the origins, workings, and ramifications of social and cultural change in their own identity. - Students can compare the distinctive methods and perspectives of two or more social science disciplines.
Natural Sciences
Program Goal: Students who major in the natural sciences will become critical thinkers who are able to judge scientific arguments created by others and see relationships between science and societal problems.
Objectives:
- Students can apply scientific methodology. - Students can evaluate the validity and limitations of theories and scientific claims in experimental results. - Students can identify the relevance and application of science in everyday life.
Humanities
Program Goal: Students who major in the humanities will begin to recognize themselves as knowers, be self-conscious about their participation in a particular culture, and cultivate their ability to discover new knowledge for themselves.
Objectives:
- Students can identify the contributions of the humanities to the development of the political and cultural institutions of contemporary society. - Students can analyze the meaning of major texts from both Western and non-Western cultures. - Students can apply the humanistic perspective to values, experiences, and meanings in their own lives.
Examples on this page have been adapted from California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Note that the previous objectives do not identify specific assignments for measuring the objectives, nor do they set specific levels of proficiency. Generally, those aspects of the objectives need to be spelled out after the department has identified its methods for assessing these basic objectives. Some examples of more specific objectives follow:
English Objectives
- Students will write five-page essays reflecting on the work of an author of their choice that present a clear and well-organized argument and use examples from the author's work to support the argument. - Students will use the conventions of Standard Written English in all writing assignments.
Education Objectives
- Students will clearly demonstrate an understanding of curriculum theory and standards by preparing a two-page curriculum plan and providing justification from the literature for the chosen curriculum method. - Students will show an understanding of real-world curriculum needs by including in the curriculum plan details on the content and order of the curriculum, the appropriate grade level, and the time frame for implementation.
noteworthy
On our campus, a group of Junior Year Writing faculty gathered to articulate cross-disciplinary learning objectives for Junior Year Writing. A handbook outlining the objectives and describing the process is available through the Office of Academic Planning and Assessment.
Appendix 2-A
Goal Definition Worksheet
WORKSHEET
Each faculty member in the department should complete a copy of this worksheet. Arrange a time for all of you to sit down together to compare notes and discuss results. The final product of this exercise should be a list of three to five broad goals that describe what department faculty believe should be characteristic of graduates in the major.

1. List any department goals that you know. This information can most likely be found in the course catalog, program brochure, or department mission statement.
2. Describe your ideal student in terms of strengths, skills, knowledge and values, and identify which of these characteristics are the result of the program experience.
3. Keeping this ideal student in mind, ask what the student:
a. knows
b. can do
c. cares about
4. What program experiences can you identify as making the most contribution to producing and supporting the ideal student?
5. What achievements do you implicitly expect of graduates in each major field?

6. What career achievements of your alumni are you most proud of?
adapted from the Ball State University, Assessment Workbook (1999).
Appendix 2-B
Objectives Worksheet
WORKSHEET
This worksheet may help you and others in your department develop specific instructional objectives from the goals you have identified. Have all faculty members complete the following table. Meet as a group to discuss your responses and try to reach consensus on desired objectives and outcomes. Remember that an objective is the specific learning behavior that the student should demonstrate in the context of achieving the goal. You may end up with more than one objective for each goal.
Program Goal     Objective(s)
1.               a)
                 b)
                 c)
2.               a)
                 b)
                 c)
3.               a)
                 b)
                 c)
4.               a)
                 b)
                 c)
5.               a)
                 b)
                 c)
Chapter 3
Designing the Assessment Program

The purpose of this chapter

Clarifying the content and focus of an assessment plan is an essential step in developing your own. By understanding what is included in an assessment plan and looking at what you already have in place, you can begin to focus on how to put together an effective assessment program for your department. In addition, you can decide what to add to existing assessments and become aware of the challenges to assessment design. The sample assessment plans at the end of the chapter illustrate one way you might approach this task.

At A Glance
- What does an assessment plan include?
- What is already in place?
- What should you add?
- What can you assess?
- What are the challenges to assessment design?
noteworthy
Sample assessment plans are included in Appendix 3-A at the end of this chapter.
Curriculum Mapping: Linking Goals/Objectives to the Curriculum

Curriculum mapping makes it possible to identify where within the current curriculum your departmental learning objectives are addressed. Below is the framework for a matrix that might be helpful to you in identifying these links between intended outcomes and curricular processes. Along the top of the matrix, list all the courses and other requirements/options (internships, service learning, theses, etc.) for the major. Along the side, list your departmental objectives. Then indicate which of the objectives are addressed in each of the requirements/options (you could also go into more detail and identify in which courses these objectives are Introduced, Emphasized, Utilized, and Assessed comprehensively, as shown in the first row).
[Matrix framework: objectives such as "Communicate effectively in writing and speech" and "Apply discipline-specific theory and principles" listed down the side, with the major's courses and requirements across the top.]
adapted from Diamond, R. M. Designing and assessing courses and curricula (1998).
You can also use matrices that link program objectives to specific course assignments, or course objectives to program objectives, or any other configuration that helps you connect what you are currently doing to the program goals and objectives your department has identified as important for graduates in the major.
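As a minimal sketch of what such a matrix can look like in practice, the following Python fragment builds and prints a small curriculum map; the course labels and the I/E/U/A markings are hypothetical, and a simple spreadsheet works just as well.

    # Minimal sketch: a curriculum map as a table of objectives by courses.
    # Course labels and I/E/U/A markings are hypothetical examples.
    courses = ["Course 101", "Course 210", "Internship", "Capstone"]

    # I = Introduced, E = Emphasized, U = Utilized, A = Assessed comprehensively
    curriculum_map = {
        "Communicate effectively in writing and speech":
            {"Course 101": "I", "Course 210": "E", "Internship": "U", "Capstone": "A"},
        "Apply discipline-specific theory and principles":
            {"Course 101": "I", "Course 210": "E", "Capstone": "A"},
    }

    # Print objectives down the side and courses across the top;
    # "-" marks a requirement that does not address the objective.
    print("Objective".ljust(50) + "".join(c.ljust(12) for c in courses))
    for objective, row in curriculum_map.items():
        cells = "".join(row.get(course, "-").ljust(12) for course in courses)
        print(objective[:48].ljust(50) + cells)

A map like this makes gaps visible at a glance: an objective whose row contains no "A", for example, is never assessed comprehensively anywhere in the major.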
More on Curriculum Mapping

Curriculum mapping provides an inventory of the link between your objectives and the curriculum. It can also serve as a catalyst for discussions about the proper sequencing of courses, the degree to which the curriculum really supports student learning, and the extent to which core objectives are appropriately addressed within the curriculum. Discussing the link between learning objectives and the curriculum may lead to a more general conversation about how processes within the major facilitate or hinder accomplishment of program goals. You may find the following questions helpful in framing that discussion:
- What are the processes (e.g., courses, activities, practica) under your control that contribute to meeting your goals and objectives?
- Are there processes that don't contribute to your goals?
- Are there processes in which you should be engaged to attain your goals?
- Are there resources not under the control of your program that might assist you in improving student learning (e.g., general education, a related minor program, courses offered outside the major, library holdings, or other support services for students)?
adapted from the Western Carolina University, Assessment Resource Guide (1999).
Such a departmental conversation can also be very helpful in identifying the key program components particularly in need of assessment. (For example, are there key points in the curriculum where it is particularly important to gauge student progress?) Revisit these questions after collecting assessment information; the assessment data should further inform your initial responses.
Inventory of Current Assessment Practices

Instructors and departments are already assessing student learning through a variety of methods, including grades, competency exams, and capstone courses, though you may not call them assessment. Before designing a department assessment program, it is important to identify what assessment information you are already collecting and match these data sources to the learning goals and objectives you outlined in Chapter 2. An assessment matrix is a particularly useful way of linking goals and objectives to assessment tools, program requirements, or course curricula. The example on the following page shows a set of departmental objectives down the first column of the matrix and, along the first row, different sets of information currently available at the department level. In this matrix, the link between objectives and data sources is identified in two ways: direct measures of the objectives (D) and indirect measures (I).
Direct methods require students to display their knowledge and skills as they respond to the instrument itself. Objective tests, essays, presentations, and classroom assignments all meet this criterion. Indirect methods, such as surveys and interviews, ask students to reflect on their learning rather than to demonstrate it (Palomba and Banta, 1999, pp. 11-12).
Departmental Processes. Are students served efficiently and effectively when they need services such as:
adapted from California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Acknowledge differences between units. Even programs within the same department may have different goals specific to that program. Assessment measures that work well in one unit may not be as successful in another. The key is to design or identify assessment techniques that are specific to the goal that you are assessing.

Allow time for mistakes and for ongoing faculty input and discussion. Pilot projects are excellent ways to try out new techniques to see how well they assess the goal or outcome you are trying to measure. Encourage and set aside time for faculty meetings to discuss assessment techniques and methods so that faculty both invest in the process and see how assessment is connected to the learning that goes on in their classrooms.

Tie the assessment methodology and instruments to the purpose of the assessment. Differences among units and the need to experiment are only two challenges you may face. You will also want to avoid the common error of designing or identifying an assessment technique, then fitting a purpose or goal to it.

Address the issues of participant attrition/retention, the actual amount of time involved, and cost and/or resources. Longitudinal studies are particularly vulnerable to these challenges. Any effective assessment plan will acknowledge these challenges and incorporate ways to address them within the development and implementation of the plan itself.
Appendix 3-A
Sample Assessment Plans
SAMPLE
2. Objectives
- identify trends or patterns in anthropological data;
- formulate a testable explanation or reasonable interpretation;
- identify data that constitute credible evidence for an explanation or interpretation;
- identify and define a significant problem or topic in anthropology; and
- analyze and interpret data in a systematic manner.
3. Outcomes Criteria
Completion by a random sample of 15% of the senior majors of identified course assignments in selected upper-division anthropology courses.

4. Assessment Methods
A cross-section of written work involving several formats and the department's three subdisciplines, including take-home essays, literature critiques, midterm essays, and final exams.

5. Time Frame
Senior majors will take the courses proposed and will complete the identified assignments for these courses. Evaluation of the assignments will be scheduled as appropriate throughout the semester.

6. Who Will Do the Assessment?
Assignments will be read and evaluated independently by three faculty members other than the course instructor and ranked on a five-point scale, with 5 as superior and 1 as inadequate.

7. Type of Feedback
At the end of each evaluation, faculty will submit their evaluations; data will be compiled and areas of strength/weakness will be identified.

8. How data will be used to improve program or revise curricula
The department will meet as a whole to discuss findings and will recommend to the Chair methods for improving curricula based on the assessment.
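As a minimal sketch of the compilation in steps 6 and 7, assuming hypothetical objective labels, scores, and strength/weakness cutoffs, the three readers' five-point ratings might be averaged like this:

    # Minimal sketch: compile three readers' five-point ratings per objective.
    # Objective labels, scores, and cutoffs are hypothetical examples.
    from statistics import mean

    # ratings[objective] = one list of the three readers' scores per assignment
    ratings = {
        "identify trends or patterns in anthropological data":
            [[4, 5, 4], [3, 3, 4], [5, 4, 4]],
        "formulate a testable explanation or reasonable interpretation":
            [[2, 3, 2], [3, 2, 2], [2, 2, 3]],
    }

    for objective, assignments in ratings.items():
        overall = mean(mean(readers) for readers in assignments)
        label = ("area of strength" if overall >= 4.0
                 else "area of weakness" if overall <= 2.5
                 else "adequate")
        print(f"{objective}: mean rating {overall:.2f} -> {label}")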
SAMPLE
BS in Chemical Engineering

1. Goals
The undergraduate degree in chemical engineering requires knowledge and awareness of:
- mathematics beyond trigonometry, emphasizing mathematical concepts and principles;
- general chemistry, organic chemistry, physical chemistry, and general physics;
- the engineering sciences that have their origins in mathematics and the basic sciences;
- the iterative decision-making process in which basic sciences, mathematics, and engineering sciences are applied to convert resources to meet project objectives;
3. Outcomes Criteria
Successful completion of the national standardized Fundamentals of Engineering (FE) Exam by all graduating seniors.

4. Assessment Methods
- analysis of overall FE exam scores in comparison with national and state scores - analysis of FE exam scores by engineering major - analysis of course content in relation to exam subject areas and scores
5. Type of Feedback.
- review of test data by faculty committees within each department of the College to determine percentages of students passing/failing the exam - evaluation of College curricula and course content in relation to areas of the exam on which students receive lower scores
6. How data will be used to improve program or revise curricula Data will be used to update curricula and course content to address identified problem areas. A senior design project is currently being considered to increase hands-on experience and practical application of learning.
SAMPLE
BA in English

1. Goal
Students are expected to be familiar with major writers, periods, and genres of English and American Literature and to be able to place important works and genres in their historical context.

2. Objectives
- Discuss a major work or author in English and/or American Literature, or compare two or more works and authors; for example, analyze the character of Satan in Milton's Paradise Lost.
- Analyze a novel, short story, poem, play, or a significant piece of prose showing familiarity with the techniques and literary contexts of the particular genre examined.
- Show knowledge of the historical context or literary period of the work or author being examined; for example, a discussion of Crane's Maggie as an example of American Naturalism.
3. Outcomes Criteria
Completion of a Senior Project consisting of a portfolio of four papers and a reflective essay demonstrating that the student has met a substantial number of the objectives outlined above.

4. Assessment Methods
Portfolios reviewed and evaluated by a departmental committee.

5. Time Frame
Students will take the course proposed and will prepare the portfolios before the end of the senior year. Evaluation of the portfolios will be scheduled for each quarter.

6. Who Will Do the Assessment?
Department Chair and appointed committee.

7. Type of Feedback
At the end of each evaluation, the committee will write a report describing the strengths and weaknesses that the portfolios demonstrate.

8. How data will be used to improve program or revise curricula
The department will meet as a whole to discuss findings and will recommend to the Chair and curriculum committee methods of improving department procedures and curricula.
SAMPLE
2. Objectives
- use techniques of differentiation and integration of one and several variables;
- solve problems using differentiation and integration;
- solve systems of linear equations;
- give direct proofs, proofs by contradiction, and proofs by induction; and
- write a simple computer program.
3. Outcomes Criteria
Completion of embedded exam questions designed to evaluate selected knowledge and skills.

4. Assessment Methods
Test questions developed by a committee of faculty and embedded in the mid-term and final exams of three upper-level classes: Calculus 3, Linear Algebra, and Advanced Calculus.

5. Time Frame
Students will take the courses proposed and will complete the mid-term and final exams for these courses. Evaluation of the exam questions will be scheduled at the semester's mid-point and end.

6. Who Will Do the Assessment?
Members of the departmental Undergraduate Committee, independent of the course instructors, will grade questions for outcomes assessment. The Department Chair and an appointed committee will review the Undergraduate Committee's report.

7. Type of Feedback
At the end of each evaluation, the committee will write a report describing the results and making recommendations for curricular revision, if appropriate.

8. How data will be used to improve program or revise curricula
The department will meet as a whole to discuss findings and will recommend to the Chair methods for improving curricula based on exam question assessment.
Chapter 4
Assessment Strategies and Methods

The purpose of this chapter

This chapter helps you identify the strategies and methods you will use to collect assessment data as part of your department's assessment program. It describes tools for assessing student learning, outlines assessment strategies, and offers guidelines for selecting assessment methods. You will also find a sample assessment timeline that illustrates implementation of a hypothetical department assessment program.

At A Glance
- What should you remember when selecting assessment methods?
- What are some ways to assess the undergraduate major?
- What are the guidelines for selecting assessment methods?
- What assessment methods can you use?
- How do you link outcomes, methods, and results?
- What are the specifics of these methods? (Appendix 4-A)
noteworthy
If you're stuck, remember that the Office of Academic Planning and Assessment can provide additional resources, can connect you to other faculty and departments who have worked through this process, or can assist you directly with specific concerns or assessment needs.
Think about the ways in which you can use one source of information for a variety of individual student and program-level purposes. This approach will improve the chances that the assessment activity will become embedded into the structure of your program, requiring less start-up work down the road.
Assessment and Grading

When the issue of assessment is raised, faculty members often say, "I already do assessment. I grade student assignments." Grades are indeed one measure of student achievement. There are significant drawbacks, however, to using grades to meet assessment's primary goal: to improve teaching and learning. Traditional grading, which offers one score to represent the sum total of a student's performance across a host of outcomes, does not provide the detailed and specific information necessary for linking student performance to program objectives and, ultimately, to improvement. Grades are global evaluations that represent the overall proficiency of students; because they don't tell you about student performance on individual (or specific) learning goals or outcomes, they provide little information on the overall success of the program in helping students attain specific and distinct learning objectives of interest.

New Information

In addition to accessing data that are already available, your department can collect new data specific to student learning in your program and designed to address departmental goals and objectives. These data sources might include information collected through:
- student internships or performance - capstone courses for graduating seniors (summary course for major) - portfolio analysis (collection of student work) - standardized tests (nationally-constructed or department-based) - surveys, interviews, or focus groups of students at entrance and exit, alumni, faculty, employers or related to course content - performance measures (activities such as writing an essay, making a presentation, completing a complex problem-solving exercise)
Using these assessment questions to guide method selection can help define your data collection priorities. Use multiple methods to assess each learning outcome. Many outcomes will be difficult to assess using only one measure. The advantages to using more than one method include:
- multiple measures can assess different components of a complex task
- no need to try to design a complicated all-purpose method
- greater accuracy and authority achieved when several methods of assessment produce similar findings
- opportunity to pursue further inquiry when methods contradict each other
Include both direct and indirect measures. Direct methods ask students to demonstrate their learning, while indirect methods ask them to reflect on their learning. Direct methods include some objective tests, essays, presentations, and classroom assignments. Indirect methods include surveys and interviews.

Include qualitative as well as quantitative measures. Not all assessment measures have to involve quantitative measurement. A combination of qualitative and quantitative methods can offer the most effective way to assess goals and outcomes.

Use an assessment method that matches your departmental culture. For example, in a department where qualitative inquiry is particularly valued, these types of methods should be incorporated into the plan. The data you collect must have meaning and value to those who will be asked to make changes based on the findings.
Qualitative measures rely on descriptions rather than numbers (Palomba and Banta, 1999). Examples include:
- ethnographic studies
- exit interviews
- formal recitals
- participant observations
- writing samples
- open-ended questions on surveys and interviews

Quantitative measures assess teaching and learning by collecting and analyzing numeric data using statistical techniques. Examples include:
- GPA
- grades
- primary trait analysis scores
- exam scores
- demographics
- forced-choice surveys
- standardized teaching evaluations
Choose assessment methods that allow you to assess the strengths and weaknesses of the program. Effective methods of assessment provide both positive and negative feedback. Finding out what is working well is only one goal of program assessment.
Be selective about what you choose to observe or measure. Assessment methods should be selected as carefully as you selected your departmental goals and objectives. As you work through this process, remember that:
- comprehensive does not mean assessing everything - choosing assessable indicators of effectiveness is key - complex methods are not necessarily the best choice - select a manageable number of methods that do not drain energy or resources
Include passive as well as active methods of assessment. In addition to assessment methods that require you to interact directly with the student in an instructional or evaluative setting, assessment measures are also available that allow you to analyze assessment information without direct student contact or effort. You can accomplish this goal by analyzing:
- student database information - attendance and course selection patterns - employer survey results - transcript analyses
Use capstone courses or senior assignments to directly assess student learning outcomes. Capstone courses and senior assignments promote faculty-student interaction and scholarly inquiry; they allow demonstration of academic breadth; and they allow demonstration of the ability to synthesize and integrate knowledge and experiences. If you use this method, however, care should be taken that:
- the course and its assignments are truly representative of requirements for the major - the course curriculum and assignment evaluation (or products) are consistent across sections - students understand the value and importance of the capstone course or senior assignment and take this requirement seriously
Enlist the assistance of assessment and testing specialists when you plan to create, adapt, or revise assessment instruments. Staff in the Office of Academic Planning and Assessment are happy to assist you in finding the appropriate resources. Areas in which you might want to seek assistance include:
- ensuring validity and reliability of test instruments - ensuring validity and reliability of qualitative methods - identifying appropriate assessment measurements for specific goals and tasks - analyzing and interpreting quantitative and qualitative data collected as part of your assessment plan.
Use established accreditation criteria to design your assessment program. Established criteria will help you:
- respond more effectively to accreditation requirements - build on the techniques and measures that you use as part of the accreditation process
Examples on these pages are adapted from University System of Georgia: Task Force on Assessing Major Area Outcomes, Assessing Degree Program Effectiveness (1992); and Western Carolina University, Assessment Resources Guide (1999).
[Matrix excerpt: assessment methods such as surveys and classroom assignments rated against selection criteria with pluses and minuses.]
adapted from Palomba, C. A., & Banta, T. W., Assessment essentials (1999).
In the next example, the learning objectives under consideration are listed in the first column and methods are outlined along the top. Completing this matrix will help you link learning objectives to specific measures that can be used to assess these objectives. Think about whether each measure is direct or indirect and note that in the appropriate column (in this example, D and I). You can also rate the extent to which each measure appropriately represents the objective, using pluses and minuses or other indicators with meaning for you.
[Matrix framework: measures such as a questionnaire or a speech listed along the top, each marked D (direct) or I (indirect) and rated for how well it represents the objective.]
The following table identifies various types of assessment data, methods for collecting these data, and the sort of information each method provides. In Appendix 4-A you will find a glossary of assessment methods.
adapted from California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Sample assessment timeline

Objective: Students will be able to demonstrate mastery of basic knowledge relevant to the field
- Assessment measure (how will you assess it?): capstone assignment(s)
- Reporting/use: AQAD report; revise curriculum and/or instruction as determined

Objective: Students understand goals and objectives of program
- Assessment measure: student perceptions
- Reporting/use: departmental discussion/review of results; revise program instruction/goals as determined

Objective: Faculty agree that goals and objectives of program are being met
- Assessment measure: faculty perceptions, gathered through focused dialogue
- Population (whom will you assess?): department faculty
- Time frame: summer
- Reporting/use: departmental discussion/review of results; revise program instruction/goals as determined
Appendix 4-A
Glossary of Assessment Methods
Alumni Surveys

Description: Surveying department alumni can provide a wide variety of information about program satisfaction, how well students are prepared for their careers, what types of jobs or graduate degrees majors have gone on to obtain, starting salaries for graduates, and the skills that are needed to succeed in the job market or in graduate study. These surveys provide the opportunity to collect data on which areas of the program should be changed, altered, improved, or expanded.

Strengths and Weaknesses: Alumni surveying is usually a relatively inexpensive way to collect program data from individuals who have a vested interest in helping you improve your program, and it offers the opportunity to improve and continue department relationships with program graduates. However, without an easily accessible and up-to-date directory of alumni, graduates can be difficult to locate. It also takes time to develop an effective survey and ensure an acceptable response rate.
adapted from Palomba et al., Ball State University, Assessment Workbook (2000).
Additional Resources:
Converse, J. M. & Presser, S. (1986). Survey questions: Handcrafting the standardized questionnaire. Newbury Park: SAGE Publications.
Dillman, D. (1978). Mail and telephone surveys: The total design method. New York: Wiley-Interscience.
Dyke, J. V. & Williams, G. W. (1996). Involving graduates and employers in assessment of a technology program. In Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (Eds.), Assessment in practice, pp. 99-101. San Francisco: Jossey-Bass.
Ewell, P. (1983). Student outcomes questionnaires: An implementation handbook. New York: National Center for Higher Education Management Systems and the College Board.
Labaw, P. J. (1980). Advanced questionnaire design. Cambridge, MA: Abt Books.
McKenna, B. Surveying your alumni: Guidelines and 22 sample questionnaires. Washington, DC: Council for Advancement and Support of Education.
Culminating Assignments

Description: Culminating assignments offer students the opportunity to put together the knowledge and skills they have acquired in the major, provide a final common experience for majors, and offer faculty a way to assess student achievement across a number of discipline-specific areas. Culminating assignments are generally designed for seniors in a major or field to complete in the last semester before graduation. Their purpose is to integrate knowledge, concepts, and skills that students are expected to have acquired in the program during the course of their study. This is obviously a curricular structure as well as an assessment technique and may consist of a single culminating course (a capstone course) or a small group of courses designed to measure competencies of students who are completing the program. A senior assignment is a final culminating project for graduating seniors, such as a performance portfolio or a thesis, that has the same integrative purpose as the capstone course.
Strengths and Weaknesses: Many colleges and universities are using capstone courses to collect data on student learning in a specific major or in general education or core requirement programs. Putting together an effective and comprehensive capstone course can be a challenge, however, particularly for those programs that mesh hands-on technical skills with less easily measurable learning outcomes. Also, there is a great deal of start-up time involved in developing appropriate and systematic methods for assessing these or other culminating experiences. See Content Analysis and Primary Trait Analysis below for further information.
adapted from the University of Wisconsin, Madison, Outcomes Assessment Manual (2000).
Additional Resources:
Southern Illinois University website: www.siue.edu/~deder/assess
Julian, F. D. (1996). The capstone course as an outcomes test for majors. In Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (Eds.), Assessment in practice, pp. 79-81. San Francisco: Jossey-Bass.
Upcraft, M. L., Gardner, J. N., & Associates. (1989). The freshman year experience: Helping students survive and succeed in college. San Francisco: Jossey-Bass.
Content Analysis

Description: Content analysis is a technique that looks at a group of students, such as majors in a program or department, and assesses samples of written work produced by this group. This assessment method uses outcomes identified as important prior to the analysis or as the analysis proceeds. For example, you might want to determine how well majors in your department write. To use content analysis to assess their writing skills, you will need a representative sample of the writing. Analysis may look at what students actually write or at the underlying meaning of their writing. Results are generally presented in written form, giving averages and examples of specific categories of outcomes (e.g., spelling errors). Primary trait analysis, which identifies important characteristics of specific assignments and assigns levels of competency to each trait, can be particularly effective in identifying student learning.

Strengths and Weaknesses: Content analysis allows you to assess learning outcomes over a period of time and can be based on products that were not created for program assessment purposes. Because writing samples can be re-examined, content analysis also makes it easier to repeat portions of the study and provides an unobtrusive way to assess student learning. However, the accuracy of the assessment is limited by the skill of the person(s) doing the analysis. Data are also limited by the set of written work and may not be relevant to technical skills valued by a particular field or major that involve hands-on performance. Pre-testing coding schemes, using more than one analyst per document, and using concrete materials and coding schemes can improve the reliability of this technique.
adapted from the California State University Bakersfield, PACT Outcomes Assessment Handbook (1999).
Additional Resources:
Babbie, E. (1995). The practice of social research (7th ed.). Belmont, CA: Wadsworth.
Walvoord, B. E. & Anderson, V. J. (1998). Effective grading: A tool for learning and assessment. San Francisco: Jossey-Bass.
Course-embedded Assessment

Description: Course-embedded assessment refers to methods of assessing student learning within the classroom environment, using course goals, objectives, and content to gauge the extent of the learning that is taking place. This technique generates information about what and how students are learning within the program and classroom environment, using existing information that instructors routinely collect (test performance, short-answer performance, quizzes, essays, etc.) or through assessment instruments introduced into a course specifically for the purpose of measuring student learning.

Strengths and Weaknesses: This method of assessment is often effective and easy to use because it builds on the curricular structure of the course and often does not require additional time for data collection, since the data come from existing assignments and course requirements. Course-embedded assessment does, however, take some preparation and analysis time, and, while well documented for improving individual courses, there is less documentation on its value for program assessment.
adapted from the University of Wisconsin, Madison, Outcomes Assessment Manual (2000), and the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Additional Resources:
Angelo, T. A. & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass.
Classroom Assessment Techniques. (1999). Center for Excellence in Learning & Teaching. www.personal.psu.edu/celt/CATs.html
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials. San Francisco: Jossey-Bass.
Walvoord, B. E. & Anderson, V. J. (1998). Effective grading: A tool for learning and assessment. San Francisco: Jossey-Bass.
Curriculum Analysis

Description: Curriculum analysis involves a systematic review of course syllabi, textbooks, exams, and other materials to help you clarify learning objectives, explore differences and similarities between course sections, and/or assess the effectiveness of instructional materials. It offers a way to document which courses will cover which objectives and helps in sequencing courses within a program. Also see Matrices.

Strengths and Weaknesses: Using curriculum analysis as an assessment tool can be a valuable way of tracking what is being taught where. It can provide assurance that specific learning goals and objectives are being covered in the program and can pinpoint areas where additional coverage is needed. This method, however, can be time-consuming, particularly in large departments with many courses and different instructors, and there may be little consistency between how learning objectives are addressed in one course and how they are taught in another.
adapted from the Ball State University, Assessment Workbook, 1999 and The University of Wisconsin, Madison, Outcomes Assessment Manual I (2000).
Additional Resources:
Bers, T., Davis, D., & Taylor, W. (1996, Nov.-Dec.). Syllabus analysis: What are you teaching and telling your students? Assessment Update, 8(6), pp. 1-2, 14-15.
Diamond, R. M. (1998). Designing and assessing courses and curricula. San Francisco: Jossey-Bass.
Ewell, P. T. (1997). Identifying indicators of curricular quality. In J. G. Gaff & J. L. Ratcliff (Eds.), Handbook of the undergraduate curriculum, pp. 608-627. San Francisco: Jossey-Bass.
Delphi Technique

Description: The Delphi technique is used to achieve consensus among differing points of view. In its original form, a team of experts, who never actually meet, are asked to comment on a particular issue or problem. Each member's response is reviewed and a consensus determined. Any member whose response falls outside of the consensus is asked to either defend or rethink the response. The anonymity provided by this technique offers more junior members of the team an equal chance to get their ideas out, as well as permitting challenges to the ideas of senior members that might never take place in an open forum. More recently, the Delphi technique has been modified so that teams of individuals are brought together to discuss an issue or problem face-to-face, reaching a consensus at the meeting. For instance, a team of faculty members might meet to review possible goals and objectives for their department in an effort to develop a set of goals and objectives on which they can agree.

Strengths and Weaknesses: The Delphi technique can be useful in bringing together diverse opinions in a discussion forum. This technique fails, however, when the facilitator lacks objectivity or when the participants feel unsafe or insecure in voicing their real opinions. For instance, a faculty member discussing intended goals and objectives might not be comfortable disagreeing with the department head. For this technique to succeed, care must be taken to appoint an impartial facilitator and to convince participants that differing opinions are welcome. Returning to the original design of this technique, with an anonymous team who never meet, might ensure more honest and open input.
Additional Resources:
Armstrong, M. A. (1989). The Delphi technique. Princeton Economic Institute. http://www.pei-intl.com/Research/MARKETS/DELPHI.HTM
Cline, A. (2000). Prioritization process using Delphi technique. www.carolla.com/wp-delph.htm
Stuter, L. M. (1996). The Delphi technique: What is it? http://www.icehouse.net/lmstuter/page0019.htm
Stuter, L. M. (November 1998). Using the Delphi technique to achieve consensus. Education Reporter (54).
Employer Surveys

Description: Employer surveys help the department determine if its graduates have the necessary job skills and if there are other skills that employers particularly value that graduates are not acquiring in the program. This type of assessment method can provide information about the curriculum, programs, and student outcomes that other methods cannot: on-the-job, field-specific information about the application and value of the skills that the program offers.

Strengths and Weaknesses: Employer surveys provide external data that cannot be replicated on campus and can help faculty and students identify the relevance of educational programs, although, as is true in any survey, ambiguous, poorly worded questions will generate problematic data. Additionally, though data collected this way may provide valuable information on current opinion, responses may not provide enough detail to make decisions about specific changes in the curriculum or program. Also, it is sometimes difficult to determine who should be surveyed, and obtaining an acceptable response rate can be cost- and time-intensive.
adapted from the Ball State University, Assessment Workbook (1999), the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999), and the University of Wisconsin, Madison, Outcomes Assessment Manual I (2000).
Additional Resources: Converse, J. M., & Presser, S. (1986). Survey questions: Handcrafting the standardized questionnaire. Newbury Park: SAGE Publications. Dyke, J. V., & Williams, G. W. (1996). Involving graduates and employers in assessment of a technology program. In Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (Eds.), Assessment in practice. San Francisco: Jossey-Bass. Lead Center, University of Wisconsin, Madison. (1998). Program assessment tool kit: A guide to conducting interviews and surveys.
Focus Groups Description: Focus groups are structured discussions among homogeneous groups of 6-10 individuals who respond to specific open-ended questions designed to collect data about the beliefs, attitudes, and experiences of those in the group. This is a form of group interview in which a facilitator raises the topics for discussion and collects data on the results. Emphasis is on insights and ideas. Strengths and Weaknesses: Focus groups can provide a wide variety of data about participants' experiences, attitudes, views, and suggestions, and results can be easily understood and used. These groups allow a small number of individuals to discuss a specific topic in detail, in a non-threatening environment. Data collected in this way, however, are not useful for quantitative results, and qualitative data can be time-consuming and difficult to analyze because of the large amount of non-standardized information. Ultimately, the success of this method depends on a skilled, unbiased moderator and appropriate groups of participants.
adapted from Palomba et al., Ball State University, Assessment Workbook (2000), and the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Additional Resources: Lead Center, University of Wisconsin, Madison. (1998). Program assessment tool kit: A guide to conducting interviews and surveys. Morgan, D. L. (1988). Focus groups as qualitative research. Newbury Park: SAGE Publications. Morgan, D. L., & Krueger, R. A. (1997). The focus group kit (Vols. 1-6). Thousand Oaks, CA: SAGE Publications.
Institutional Data Description: A variety of departmental and student data are routinely collected at the university level. These data can enhance and elaborate on data you collect in the department. Institutional data can tell you whether the program is growing, what the grade point average is for majors in the program, and what the retention rate is for your students. Strengths and Weaknesses: Institutional data are generally easily accessible and readily available. On the UMass Amherst campus, you can access this data through the Office of Institutional Research (OIR), located in 237 Whitmore. Student and departmental data are collected on a systematic and cyclical schedule that can offer you both current and longitudinal information. On the other hand, these data sets are generally large and may be difficult to sort through, particularly for those individuals who are not used to working through large databases. The data may be less useful to specific departments or programs because the information collected is very often general (age, gender, race, etc.) and may not directly relate to program goals and objectives.
adapted from the Ball State University, Assessment Workbook (1999).
Additional Resources: The Office of Institutional Research (see Sources and Resources at the back of this handbook) can provide assistance in accessing institutional data and university-wide data sets. The Information Clearinghouse website is www.umass.edu/oapa/.
Matrices Description: At its most basic, a matrix is a grid of rows and columns used to organize information. For assessment purposes, a matrix can be used to summarize the relationship between program objectives and course syllabus objectives, course assignments, or courses in a program or department. Matrices can be used for curriculum review, for selecting assessment criteria, or for test planning. A matrix can also be used to compare program outcomes to employer expectations. Strengths and Weaknesses: Using a matrix can give you a good overview of how course components and curriculum link to program objectives, can help you tailor assignments to program objectives, and can lead to useful discussions that in turn lead to meaningful changes in courses or curricula. However, because a matrix offers a clear picture of how program components are interconnected, it can also reveal where they are not; acknowledging and responding to those discrepancies may involve extensive discussion, flexibility, and willingness to change.
adapted from the Ball State University, Assessment Workbook (revised April 2000), and the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
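A curriculum matrix need not be elaborate to be useful. The Python sketch below shows the idea with invented course numbers and objectives: the same grid that displays coverage also exposes objectives no course addresses.

```python
# Invented course numbers and program objectives, for illustration only.
objectives = ["write clearly", "design experiments", "analyze data", "present orally"]
coverage = {
    "101": {"write clearly"},
    "210": {"design experiments", "analyze data"},
    "350": {"analyze data", "write clearly"},
}

# Print the matrix: one row per course, one column per program objective.
print("course  " + "  ".join(o.ljust(18) for o in objectives))
for course, objs in coverage.items():
    cells = "  ".join(("X" if o in objs else "-").ljust(18) for o in objectives)
    print(course.ljust(8) + cells)

# The grid also reveals gaps: objectives no course addresses.
uncovered = [o for o in objectives if not any(o in objs for objs in coverage.values())]
print("uncovered:", uncovered)      # ['present orally']
```

The "uncovered" list is exactly the kind of discrepancy the matrix is meant to surface, and the starting point for the departmental discussion described above.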
Additional Resources:
Diamond, R. M. (1998). Designing and assessing courses and curricula. San Francisco: Jossey-Bass. Palomba, C. A., & Banta, T. W. (1999). Assessment essentials. San Francisco: Jossey-Bass.
Observations Description: Observation as a method of assessment is an unobtrusive tool that can yield significant information about how and why students learn. You may choose to observe any relevant interactive event, such as classes, club meetings, or social gatherings. This tool is generally used when you are interested in how students study, are concerned about the effectiveness of study sessions or other supplementary activities, or when you are focusing on the relationship between out-of-class behavior and in-class performance. Data collected through observation can be correlated with test scores and/or course grades to help provide further insight into student learning. Strengths and Weaknesses: Data collected through observation can yield important insight into student behavior that may be difficult to gauge through other assessment methods. This method is typically designed to describe findings within a particular context and often allows for interaction between the researcher and students that can add depth to the information collected. It is especially useful for studying subtleties of attitudes and behavior. Observed data, however, is not precise and cannot be generalized to larger populations. Conclusions may be suggestive rather than definitive, and others may feel that this method provides less reliable data than other collection methods.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Additional Resources: Babbie, E. (1995). The practice of social research (7th ed.). Belmont, CA: Wadsworth. Palomba, C. A., & Banta, T. W. (1999). Assessment essentials. San Francisco: Jossey-Bass.
Performance Assessment Description: Performance assessment uses student activities to assess skills and knowledge. These activities include class assignments, auditions, recitals, projects, presentations, and similar tasks. At its most effective, performance assessment is linked to the curriculum and uses real samples of student work. This type of assessment generally requires students to use critical thinking and problem-solving skills within a context relevant to their field or major. The performance is rated by faculty or qualified observers, and assessment data are collected. The student receives feedback on the performance and evaluation. Strengths and Weaknesses: Performance assessment can yield valuable insight into student learning and provides students with comprehensive information on improving their skills. Communication between faculty and students is often strengthened, and the opportunity for students' self-assessment is increased. Performance assessment, like all assessment methods, depends on clear statements about learning objectives. This type of assessment is also labor-intensive, is sometimes separate from the daily routine of faculty and students, and may be seen as an intrusion or an additional burden. Articulating the skills that will be examined and specifying the criteria for evaluation may be both time-consuming and difficult.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
Additional Resources:
Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco: Jossey-Bass. Palomba, C. A., & Banta, T. W. (1999). Assessment essentials. San Francisco: Jossey-Bass.
Portfolio Evaluations Description: Portfolios are collections of student work over time that are used to demonstrate student growth and achievement in identified areas. Portfolios can offer information about student learning, assess learning in general education and the major, and evaluate targeted areas of instruction and learning. A portfolio may contain all or some of the following: research papers, process reports, tests and exams, case studies, audiotapes, videotapes, personal essays, journals, self-evaluations and computational exercises. Portfolios are often useful and sometimes required for certification, licensure, or external accreditation reviews. Strengths and Weaknesses: Portfolios not only demonstrate learning over time, but can be valuable resources when students apply to graduate school or for jobs. Portfolios also encourage students to take greater responsibility for their work and open lines of discussion between faculty and students and among faculty involved in the evaluation process. Portfolios are, however, costly and time-consuming and require extended effort on the part of both students and faculty. Also, because portfolios contain multiple samples of student work, they are difficult to assess and to store and may, in some contexts, require too much time and effort from students and faculty alike.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999), and the University of Wisconsin, Madison, Outcomes Assessment Manual I (2000).
Additional Resources:
Belanoff, P., & Dickson, M. (Eds.). (1991). Portfolios: Process and product. Portsmouth, NH: Boynton/Cook Publishers. The Washington State University Writing Portfolio (2001). http://wsu.edu/~bcondon/portpage.html. Forrest, A. (1990). Time will tell: Portfolio-assisted assessment of general education. Washington, DC: AAHE Assessment Forum.
Pre-test/Post-test Evaluation Description: This method of assessment uses locally developed and administered tests and exams at the beginning and end of a course or program in order to monitor student progression and learning across pre-defined periods of time. Results can be used to identify areas of skill deficiency and to track improvement within the assigned time frame. Tests used for assessment purposes are designed to collect data that can be used along with other institutional data to describe student achievement. Strengths and Weaknesses: Pre-test/post-test evaluations can be an effective way to collect information on students when they enter and leave a particular program or course, and they provide assessment data over a period of time. They can sample student knowledge quickly and allow comparisons between different student groups, or the same group over time. They do, however, require additional time to develop and administer and can pose problems for data collection and storage. Care should be taken to ensure that the tests measure what they are intended to measure over time (and that they fit with program learning objectives) and that there is consistency in test items, administration, and application of scoring standards.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999), and the University of Wisconsin, Madison, Outcomes Assessment Manual I (2000).
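The basic arithmetic of a pre-test/post-test comparison is simple, provided scores are matched student by student. A minimal Python sketch with invented scores follows; a real analysis would also check test reliability and attrition between the two administrations.

```python
from statistics import mean

# Hypothetical matched pre/post scores (0-100) for the same four students.
pre  = {"s1": 52, "s2": 61, "s3": 70, "s4": 45}
post = {"s1": 68, "s2": 72, "s3": 71, "s4": 66}

# Pair scores by student so each gain reflects an individual's change,
# not a difference between two unrelated groups.
gains = {s: post[s] - pre[s] for s in pre}
print("mean gain:", mean(gains.values()))                      # 12.25
print("little or no gain:", [s for s, g in gains.items() if g <= 2])  # ['s3']
```

Reporting the per-student gains alongside the mean keeps a few large improvements from masking students who did not progress.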
Additional Resources: Berk, R. (Ed.). (1986). Performance assessment: Methods and applications. Baltimore, MD: The Johns Hopkins University Press. Gronlund, N. (1991). Measurement and evaluation in teaching (4th ed.). New York: Macmillan. Palomba, C. A., & Banta, T. W. (1999). Assessment essentials. San Francisco: Jossey-Bass.
Reflective Essays Description: Reflective essays may be used as an assessment tool to gauge how well students are understanding class content and issues. They are generally short essays (5 to 10 minutes) on topics related to the course curriculum and may be given as in-class assignments or homework. Reflective essays may be voluntary or required, and may take the form of open-ended questions on surveys, required components of student portfolios, or assignments in capstone composition courses. Strengths and Weaknesses: Reflective essays as an assessment tool can offer data on student opinions and perspectives at a particular moment in a class. Essays will provide a wide array of responses and might lead to increased discussion among faculty and students. On the other hand, poorly worded, ambiguous questions will yield little, and opinions and perceptions may vary in accuracy. Analysis of essay content also takes additional time and expertise. Additional Resource: Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco: Jossey-Bass.
Scoring Rubrics Description: Scoring rubrics are typically grids that outline identified criteria for successfully completing an assignment or task and establish levels for meeting these criteria. Rubrics can be used to score everything from essays to performances. Holistic rubrics produce a global score for a product or performance. Primary trait analysis uses separate scoring of individual characteristics or criteria of the product or performance. Strengths and Weaknesses: Scoring rubrics allow the instructor to efficiently and consistently look at complex products or performances and to define precise outcomes and expectations. They are also easily shared with students. However, developing an effective rubric can be time-consuming and often requires ongoing edits to fine-tune criteria and anticipated outcomes. Training raters to use the scoring rubrics in a consistent manner also involves a significant time commitment.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999).
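The structure of a trait-based rubric is easy to see in code. The Python sketch below uses an invented four-criterion essay rubric: it keeps the per-criterion profile, in the spirit of primary trait analysis, while also reporting an overall total. A real rubric would, of course, define each level in prose rather than as a bare number.

```python
# Invented criteria and scale; a real rubric describes each level in prose.
SCALE = (1, 4)                                    # lowest and highest level
CRITERIA = ("thesis", "evidence", "organization", "mechanics")

def score_essay(trait_scores):
    """Validate each criterion rating separately, then return the
    per-trait profile alongside an overall total."""
    lo, hi = SCALE
    for trait, level in trait_scores.items():
        assert trait in CRITERIA and lo <= level <= hi, f"bad rating: {trait}"
    return trait_scores, sum(trait_scores.values())

profile, total = score_essay(
    {"thesis": 3, "evidence": 2, "organization": 4, "mechanics": 3}
)
print(profile, "->", total, "of", len(CRITERIA) * SCALE[1])   # -> 12 of 16
```

Keeping the profile rather than only the total is what makes the results diagnostic: a department can see that "evidence" lags even when overall scores look acceptable.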
Standardized and Local Test Instruments Description: Selecting a standardized instrument (developed outside the institution for application to a wide group of students using national/regional norms and standards) or a locally-developed assessment tool (created within the institution, program, or department for internal use only) depends on specific needs and available resources. Knowing what you want to measure is key to successful selection of standardized instruments, as is administering the assessment to a representative sample in order to develop local norms and standards. Locally-developed instruments can be tailored to measure specific performance expectations for a course or group of students. Strengths and Weaknesses: Locally-developed instruments are directly linked to local curriculum and can identify student performance on a set of locally-important criteria. Putting together a local tool, however, is time-consuming, as is development of a scoring key/method. There is also no comparison group, and performance cannot be compared to state or national norms. Standardized tests are immediately available for administration and, therefore, are less expensive to develop than creating local tests from scratch. Changes in performance can be tracked and compared to norm groups, and subjectivity/misinterpretation is reduced. However, standardized measures may not link to local curricula, and purchasing the tests can be expensive. Test scores may also not contain enough locally-relevant information to be useful.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999), and the University of Wisconsin, Madison, Outcomes Assessment Manual I (2000).
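One concrete consequence of having no national norm group is that a locally-developed test must build its own reference points. The Python sketch below, with an invented local sample, illustrates the local norming the description mentions: expressing a raw score as a z-score and a percentile rank relative to local results.

```python
from statistics import mean, stdev
from bisect import bisect_right

# Invented scores from a representative local sample, used as the norm group.
local_sample = sorted([55, 62, 64, 68, 71, 73, 75, 78, 82, 90])

def local_norms(raw):
    """Express a raw score as a z-score and a percentile rank relative to
    the local sample, since a locally-built test has no national norms."""
    z = (raw - mean(local_sample)) / stdev(local_sample)
    pct = 100 * bisect_right(local_sample, raw) // len(local_sample)
    return round(z, 2), pct

print(local_norms(75))   # (0.31, 70): above the local mean, 70th percentile
```

Such local norms are only as good as the sample behind them, which is why the description stresses administering the instrument to a representative group first.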
Additional Resources: Jacobs, L. C., & Chase, C. I. (1992). Developing and using tests effectively: A guide for faculty. San Francisco: Jossey-Bass. Morris, L. L., Fitz-Gibbon, C. T., & Lindheim, E. (1987). How to measure performance and use tests. Beverly Hills: Sage. National Post-Secondary Education Cooperative (NPEC) Assessment Tests Review. http://www.nces.gov/npec/evaltests Ory, J., & Ryan, K. E. (1993). Tips for improving testing and grading. Beverly Hills: Sage Publications.
Student Surveys and Exit Interviews Description: Surveys and interviews ask students to respond to a series of questions or statements about their academic experience. Questions can be both open-ended (respondents create answers) and close-ended (respondents answer from a list of simple and unambiguous responses). Surveys and interviews can be administered in writing, face-to-face, or by phone. Types of surveys include in-class questionnaires, mail questionnaires, telephone questionnaires, and interviews. Interviews include structured, in-person interviews and focus group interviews. Strengths and Weaknesses: Surveys can be relatively inexpensive and easy to administer, can reach participants over a wide area, and are best suited for short and non-sensitive topics. They can give you a sense of what is happening at a given moment in time and can be used to track opinions. Data are reasonably easy to collect and tabulate, yet the sample may not be representative of the population (particularly with a low response rate), and ambiguous, poorly written items and insufficient responses may not generate enough detail for decision making. An interview can follow up on evasive answers and explore topics in depth, collecting rich data, new insights, and focused details. It can, however, be difficult to reach the sample, and interview data can be time-consuming to analyze. Information may be distorted by the respondent, who may feel a lack of privacy and anonymity. The success of the interview depends ultimately on the skills of the interviewer.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999), and the University of Wisconsin, Madison, Program Assessment Tool Kit (1998).
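Two of the cautions above, low response rates and over-summarized results, are easy to check with very little code. The Python sketch below uses an invented close-ended item and response set.

```python
from collections import Counter

# Hypothetical close-ended item ("I can apply theory to practice", 1-5 scale).
invited = 120
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

# Report the response rate before generalizing from the results.
print(f"response rate: {len(responses) / invited:.0%}")   # 8% -- far too low
# Tabulate the full distribution rather than reporting only an average.
print(sorted(Counter(responses).items()))   # [(2, 1), (3, 2), (4, 4), (5, 3)]
```

An 8% response rate, as here, is a signal to treat the results as suggestive at best, whatever the average answer looks like.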
Additional Resources: Each spring UMass administers a survey to graduating seniors which focuses on their experiences in their major. Contact OAPA for more information. Dillman, D. (1978). Mail and telephone surveys: The total design method. New York: Wiley-Interscience. Fowler, F. J. (1985). Survey research methods. Beverly Hills: SAGE Publications.
Syllabus Analysis Description: Syllabus analysis (as well as systematic review of textbooks, exams, and other curricular material) involves looking at the current course syllabus (written or oral assignments, readings, class discussions/projects, and course expectations) to determine if the course is meeting the goals and objectives that the instructor or department has set for it. Strengths and Weaknesses: Use syllabus analysis when you want to clarify learning objectives; explore differences and similarities between sections of a course; or assess the effectiveness of instructional materials. Syllabus analysis can provide invaluable information to enhance any assessment plan. However, this review is time-consuming and, as there may be more than one reviewer, there may not be adequate consistency in collecting and analyzing the data. Additional Resources: Bers, T., Davis, D., & Taylor, W. (1996, Nov.-Dec.). Syllabus analysis: What are you teaching and telling your students? Assessment Update, 8(6), pp. 1-2, 14-15. Palomba et al. (2000). Assessment workbook. Ball State University. http://web.bsu.edu/IRAA/AA/WB/contents.htm. Walvoord, B. E., & Anderson, V. J. (1998). Effective grading. San Francisco: Jossey-Bass. White, E. M. (1994). Teaching and assessing writing. San Francisco: Jossey-Bass.
Transcript Analysis Description: Transcript analysis involves using data from student databases to explore course-taking or grade patterns of students. This tool can give you a picture of students at a certain point in their academic careers, show you what classes students took and in what order, and identify patterns in student grades. In sum, transcript analysis gives you a more complete picture of students' actual curricular experiences. Specific information can be drawn from transcripts to help answer research questions, and course pattern sequences can be examined to see if there is a coherence to the order of courses taken. Strengths and Weaknesses: Transcript analysis is an unobtrusive method for data collection using an existing student database. This information can be linked to other variables such as sex or major, or used to measure outcomes. It is important to keep in mind, however, that course patterns may be influenced by other variables in students' lives that don't show up on their transcripts. Also, solutions that arise from results of the analysis may not be practical or easily implemented. It is critical to have specific questions whose answers can lead to realistic change before conducting the analysis.
adapted from the California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999), and the Ball State University, Assessment Workbook (1999).
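Course-pattern questions such as "in what order do students actually take our courses?" reduce to counting transitions in sequence data. The Python sketch below uses invented transcripts; a real analysis would draw the sequences from the student database and tie them to a specific research question, as the description advises.

```python
from collections import Counter
from itertools import pairwise   # Python 3.10+

# Invented transcripts: each student's courses in the order they were taken.
transcripts = {
    "s1": ["101", "210", "310", "490"],
    "s2": ["101", "310", "210", "490"],
    "s3": ["101", "210", "490"],
}

# Count adjacent course-to-course transitions to surface common pathways.
transitions = Counter(
    step for sequence in transcripts.values() for step in pairwise(sequence)
)
print(transitions.most_common(2))
# [(('101', '210'), 2), (('210', '490'), 2)]
```

Frequent transitions that contradict the intended prerequisite order are exactly the kind of finding that should prompt a focused departmental question, not an immediate curricular overhaul.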
Additional Resources: Palomba, C. A., & Banta, T. W. (1999). Assessment essentials. San Francisco: Jossey-Bass. Ratcliff, J. L. (1992). What can you learn from coursework patterns about improving undergraduate education? In J. L. Ratcliff (Vol. Ed.), Assessment and curriculum reform: Vol. 80. New directions for higher education (pp. 5-22). San Francisco: Jossey-Bass.
Chapter 5
Analyzing, Reporting, and Using Results The purpose of this chapter
This chapter provides some guidance on the things to consider as you analyze and interpret assessment data. It is also designed to walk you through the process of defining an assessment report in terms of audience and needs, formatting the data for effective presentation, and distributing and sharing the results of your work.
Chapter 5
At A Glance
How do you approach data analysis and interpretation? How do you prepare and present an assessment report? What should you remember?
Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about.
An assessment plan's value to the department lies in the evidence it offers about overall department or program strengths and weaknesses, and in the evidence it provides for change (Wright, 1991). The key to realizing the value of all your work is to make the most of the information you collect through appropriate analysis and interpretation.
Best Ways to Analyze and Interpret Assessment Information In its faculty handbook on program assessment, California State University, Chico (1998) recommends:
Presenting data in relation to identified goals and objectives
Selecting and using appropriate procedures for data analysis
Using qualitative and quantitative methods to present a well-balanced picture of the program
Keeping in mind the audiences who will access and use the data, and varying your analysis and reporting procedures according to the identified audience
Preparing written statements that identify and elaborate on the pros and cons of the academic program
Developing recommendations based on analysis of data, and using identified goals as a framework within which to accomplish these changes
Also consider the extent to which your findings can help you answer the following questions.
What do the data say about your students' mastery of subject matter, of research skills, or of writing and speaking?
What do the data say about your students' preparation for taking the next step in their careers?
Are there areas where your students are outstanding? Are they consistently weak in some respects?
Are graduates of your program getting good jobs, being accepted into reputable graduate schools, and reporting satisfaction with their undergraduate education?
Do you see indications in student performance that point to weakness in any particular skills, such as research, writing, or critical thinking skills?
Do you see areas where performance is acceptable, but not outstanding, and where you would like to see a higher level of performance?
adapted from the Southeast Missouri State University, Busy Chairpersons Guide to Assessment (1997).
These are compelling and central questions for faculty, administrators, students, and external audiences alike. If your assessment information can shed light on these issues, the value of your efforts will become all the more apparent. Finally, assessment data can offer useful insight into department and program effectiveness when carefully analyzed and interpreted, in the context in which they were collected, for overall program improvement. Data are misleading, and even threatening, when they are used for purposes other than originally intended and agreed upon. For example, data from assessment of student performance in a capstone course should be used to identify areas of strength and weakness in student learning across the student's entire experience in the major. In this way, these data guide curricular modifications and departmental pedagogical strategies. These data should not be used to evaluate the performance of the capstone course instructor.
The audience for your assessment results plays an important role in defining the purpose of the report(s) you generate. For example, if the primary purpose of your report is to help faculty members in the department identify ways to improve the major, you would focus on how the results inform curricular change and improvement. For a report to an external audience, your purpose is more likely to be making a case for the quality of the educational experience students receive in your major, highlighting the program's particular strengths in fostering student learning while also documenting the improvements made as a consequence of the results.
Report Content At its most basic, your report should have enough information to answer five basic questions:
1. What did you do?
2. Why did you do it?
3. What did you find?
4. How will you use it?
5. What is your evaluation of the assessment plan itself?
noteworthy
Appendix 5-A lists additional questions to consider as you think about your assessment report.
Format of the report A comprehensive, systematic department assessment report is not necessarily a formal written report complete with charts, tables and a structured process, though it can be. It may be as simple as a presentation to the department on major results, leading to further discussion about assessment; or it can be as complex as a formal report to the Provost on assessing learning outcomes in your program.
The audience(s) for your report will also affect your presentation methods. For some purposes it may be necessary to provide statistical analyses, direct quotes from interviews, and specific supporting evidence for conclusions made. For other audiences, a general summary of major findings and a discussion of changes made by the department as a result of the findings may be more appropriate.
The reality is that a department rarely has only one purpose for engaging in assessment. Therefore you may want to develop a number of reports tailored specifically to the audiences you need to address.
Formal Reports If you have decided to prepare a formal assessment report, your report should address each of the identified audiences, elaborating on the information you outlined in the table above. Your final report for the department might contain some or all of the following:
discussion of why the assessment activity was undertaken
description of the major, goals, objectives and intended learning outcomes
description of assessment methods and choices, why they were used and how they were implemented
explanation of how the analysis was done and what methodology was used
presentation of major findings
discussion of how results are being used for program improvement
evaluation of the assessment plan/process itself: what worked and what did not work and why
outline of next steps (programmatic, curricular, and assessment-related)
appendix containing a curriculum analysis matrix, relevant assignments and outcomes, data collection methods, and other information or materials as appropriate
Summary Reports Assessment reports do not necessarily have to be pages and pages of text and graphs to be effective. You may choose to prepare a report that briefly outlines your assessment program results. By highlighting the main points and significant results, you can convey in a very concise manner what you were trying to accomplish, what you did and did not accomplish, and what changes you will implement as a result. The following forms, from Nichols (1995), provide an example of a format for reporting results and action in short, summary form.
WORKSHEET
Goals:
Intended Educational (Student), Research or Public Service Outcomes, or Departmental Administrative Objectives
1.
2.
3.
4.
5.
Form A
Nichols, J.O. The departmental guide and record book for student outcomes assessment and institutional effectiveness (1995).
WORKSHEET
Intended Educational (Student), Research or Public Service Outcome, or Departmental Administrative Objective
NOTE: There should be one Form B for each intended objective/outcome listed on Form A.
Objective:
FIRST
___a. Means of Assessment & Criteria for Success:
SECOND
___b. Means of Assessment & Criteria for Success:
THIRD
___c. Means of Assessment & Criteria for Success:
Use of Results:
Form B
Nichols, J. O. The departmental guide and record book for student outcomes assessment and institutional effectiveness (1995).
Link results to original goals/objectives Report your results in the context of your original goals and objectives to most effectively demonstrate teaching and learning within your department. Assessment results mean little if your audience does not understand what it was you were trying to assess in the first place. Successful completion of goals and objectives should be highlighted. You can also use this opportunity to show how you plan to address program areas that still need work. In this way, even less-desirable results can be used to the department's advantage by telling your audience what steps you will take for improvement.
There is a lot of help out there It is important to keep in mind that you are not alone. Some units on campus are already using student outcomes assessment, and a number of colleges and universities across the country have implemented extensive system-wide assessment programs. There are also staff on campus who specialize in assessment and data collection and analysis. Sources and Resources in this handbook lists campus and on-line resources for getting help with this process, as well as additional resources you can find in the printed literature.
Appendix 5-A
Questions to Guide Assessment Reporting
Reporting the Results
1. What were you trying to accomplish by using assessment in your department?
2. What assessment methods did you use? Why did you select these?
3. What was the most valuable thing you learned?
4. What are the three most important things you would like to share with others about your results?
a.
b.
c.
5. How will your results affect what you do with your department's courses and/or with program requirements?
On-Campus
Office of Academic Planning and Assessment
362 Whitmore Administration Building
Martha L. A. Stassen, Director of Assessment
(413) 545-5146 [email protected]
http://www.umass.edu/oapa

Office of Institutional Research
237 Whitmore Administration Building
Marilyn H. Blaustein, Director of Institutional Research
(413) 545-0941 [email protected]
http://www.umass.edu/oapa

Center for Teaching
301 Goodell Building
(413) 545-1225 [email protected]
http://www.umass.edu/cft
On-Line
American Association for Higher Education www.aahe.org
California State University - San Bernardino http://academic-affairs.csusb.edu www.co.calstate.edu/aa/sloa
ERIC Assessment Clearinghouse http://ericae.net/
Internet Resources for Higher Education Outcomes Assessment http://www2.acs.ncsu.edu/upa/archives/assmt/resource.htm
Ohio University www.cats.ohiou.edu/~insres/assessments/ncaplan.html
Penn State www.psu.edu/dus/uac/assessme.htm
Southern Illinois University www.siue.edu/~deder/assess
University of Cincinnati - Raymond Walters College www.rwc.uc.edu/phillips/index_assess.html
University of Colorado - Boulder www.colorado.edu/pba/outcomes
University of Michigan www.umich.edu/~crltmich/crlt.faq.html
University of Nebraska www.unl.edu/svcaa/priorities/assessment.html
University of Wisconsin - Madison www.wisc.edu/provost/assess.html
Virginia Tech http://aappc.aap.vt.edu