
Chapter 7

Assessment and Evaluation

ISTE Standards addressed in this chapter

NETS-T 2. Design Digital-Age Learning Experiences and Assessments. Teachers design, develop, and evaluate authentic learning experiences and assessments incorporating contemporary tools and resources to maximize content learning in context and to develop the knowledge, skills, and attitudes identified in the NETS-S. Teachers:

d. provide students with multiple and varied formative and summative assessments aligned with content and technology standards and use resulting data to inform learning and teaching.

Note: Standard 2.a is addressed in Chapter 4, 2.b in Chapter 5, and 2.c in Chapter 6.

Outcomes

In this chapter, you will
· Develop assessments and scoring practices that are aligned with content and technology standards.
· Incorporate a variety of appropriate technology-based resources for assessing learning into your instruction.
· Use technology applications to communicate student performance data and report this evidence to parents, administrators, and other stakeholders.
· Use data from multiple measures and resources to assess student use of technology as required by the NETS-S.

When you think about assessments from your experiences as a student, what feelings come to mind? How have different teachers in your school career used assessments? What were their purposes? What were the benefits of those assessments for your teachers? What were the benefits to you as a student?

Maybe you didn't think about assessments as being beneficial when you were a student. But well-designed assessments are a critical part of the teaching-learning cycle and do more than just help determine grades. As you will notice in the Stories from Practice box, assessments can provide valuable information to students, teachers, and the larger community—information that can be used to inform teaching and improve learning. Assessments have a wide range of uses including, but not limited to:

· providing feedback to students on their progress toward achieving learning goals
· motivating students and providing opportunities to build confidence
· monitoring the progress teachers are making in their curricula
· evaluating the effectiveness of instruction
· determining participation in supplemental or enrichment programs
· providing information to parents, communities, and others in the public about school performance
· comparing students and schools to others or to established criteria
· measuring progress of students or schools over time
ISTE NETS-T Standard 2 requires you to "design, develop, and evaluate authentic learning experiences and assessments incorporating contemporary tools and resources to maximize content learning in context and to develop the knowledge, skills, and attitudes identified in the NETS-S." Standard 2.d further requires you to "provide students with multiple and varied formative and summative assessments aligned with content and technology standards and use resulting data to inform learning and teaching."

Stories from Practice

I recently visited a local middle school to interview teachers and the principal. This middle school—a National Blue Ribbon School as designated by the U.S. Department of Education—is known for its innovative and exceptional use of technology. The school incorporates a web-based gradebook and reporting software that allows teachers to post student assignments and grades as well as communicate with parents online. When asked how this technology had affected her practice, one veteran teacher half-jokingly replied that she had been brought into the technology world "kicking and screaming" and that she would give up her paper-based gradebook only "over my dead body." However, she then went on to elaborate that despite her original reluctance to use the online gradebook, she couldn't imagine teaching without it now. It had helped to break down communication barriers between her and the parents, many of whom were in two-income families and did not have the opportunity to visit the classroom as often as parents did when she began her career. To her, it had "brought the parents into the classroom." Now she routinely uses the system not only to post final grades but also to inform parents of upcoming assignments and to prompt greater home involvement when students need extra help or encouragement with an assignment. The only negative feedback, joked the principal, had come from the students, who groaned that their parents now know "what they did in school" every day and are able to ensure schoolwork is completed and supported at home.

The gradebook program allows teachers to analyze and chart student performance individually, by group, or by class, and to send confidential messages to parents with real-time grade data via e-mail. These data reports allow teachers to quickly determine whether they need to reteach content to the entire class, a group, or an individual. Some of the teachers in the school described how they post helpful web resources and even their class lecture notes or presentations on the website so that both parents and students can use it to support or supplement instruction outside of class. The veteran teacher reported that the parent of one of her students routinely downloaded her classroom presentations at work and that the whole office would look forward to reviewing them. Not only does she think that this practice provides great public relations for the school, but "it's like that parent's in the classroom getting the same lesson her child did that day."

Source: John Ross

Assessing Student Learning

As you learned in Chapter 6, assessment data can be used to set goals for student learning, monitor learning through formative assessments, and evaluate learning through summative assessments. Assessments, therefore, are an important part of the learning process. How can assessment support learning? Classroom assessments that promote learning (Assessment Reform Group, 1999, p. 7):

· involve sharing learning goals with students
· aim to help students know and recognize the standards for which they are aiming
· involve students in self-assessment
· provide feedback that leads to students recognizing their next steps and how to take them
· are based on the belief that every student can improve
· involve both teachers and pupils reflecting on assessment data
Have you experienced assessments that were specifically designed to facilitate learning? Perhaps you have had projects or assignments reviewed by your teachers or peers prior to being allowed to revise and strengthen them. Perhaps you engaged in simulations that allowed you to practice the steps of an experiment before demonstrating your understanding on an exam. Formative assessments can include videotaped performances, electronic journals, checklists and rubrics, as well as many other formats that allow you to practice or check your understanding prior to being evaluated on the "big test." Formative assessments may or may not have a grade assigned to them, but if they do, these grades are often used for the purposes of monitoring student progress or providing a benchmark rather than as a final judgment.
As suggested above, feedback is critical to formative assessment. Your students will benefit most when they receive feedback about the quality of their work and suggestions for improvement—not just whether their responses were right or wrong. To improve learning, they must also receive feedback about their progress during instruction. Outcome feedback, knowing whether a response is correct or not, is the simplest and most common type of feedback, but it provides little guidance to students (Butler & Winne, 1995). Early drill-and-practice software often provided this type of feedback, in which student responses were boldly acknowledged as "CORRECT" or "INCORRECT," but rarely with an explanation why. For formative assessment to achieve maximum benefit, feedback must explain why an answer is correct or incorrect, support students when they are using appropriate strategies and content knowledge, and guide them when they are not. Cognitive feedback refers to feedback that helps students develop a better understanding of what is expected for performance, how their current understanding or skill levels compare to those expectations, and how they might improve their performances (Butler & Winne, 1995).

It's important that you don't isolate the assessment strategies and tools that you use in the Monitor and Evaluate stages of your instruction and just "tack them on" at the end. The cycle of instruction is iterative and the stages overlap. As you will learn in the next section, you plan your assessments as you set goals for student learning. In addition, some of the assessments you develop may be part of your instructional actions.

Aligning Assessments with Standards

In education, standards define what students should learn or be able to do after instruction. Think of content standards as representing the goals we have for our students. As such, they help shape what we do in the classroom. Typically, teachers create specific learning activities based on general standards that were approved at the state level and then translated into suggested units or objectives at the local level. Standards promote equity by requiring that all students, regardless of gender, ethnicity, or socioeconomic status, have similar opportunities to achieve the same content and performance goals. However, the way in which these goals are met will, by necessity, vary depending on your students' unique needs. Although standards provide a framework within which we, as teachers, operate, there is plenty of room for interpreting standards at the local level. In other words, standards do not require every teacher to teach every child in the exact same way.

Tools for Use: Instructional Objectives

Many school systems require teachers to write instructional objectives, so it is useful to know how to do so. An instructional objective describes what your students should be able to do when they have successfully completed your lesson. The addition of the qualifying term, "instructional," places it within the context of your classroom and emphasizes that it should be more specific than a state or national objective that supports a content standard. The instructional objective bridges the gap between the content standard and the assessment measure. In other words, objectives can be thought of as specifications for your assessments. For example, an objective and the associated assessment measure might look like this:

Objective: Given multiple sample checks, locate the bank routing number and customer account number on the checks.
Assessment: Review the five blank checks located in the envelope. Circle the bank routing number and underline the customer account number on each check.

An instructional objective has three parts (Dick, Carey, & Carey, 2005). These parts describe

1. What your students are going to do (the skills and knowledge identified in the content standards and objectives)
2. How the students will do it (the conditions under which they will perform the skills or behavior, including necessary materials and resources)
3. How you will know the students have done it successfully (the criteria for a successful performance)

When considering the first part of an instructional objective, what your students are going to do, content standards help determine the knowledge your students will need and the skill level(s) required to demonstrate that knowledge. The second part of an instructional objective describes the conditions under which your students will perform the skills required by the standards, including the necessary materials and resources. The third part of an instructional objective describes the criteria for a successful performance such as time limits, degrees of accuracy, or percentage correct. Examine the objective below, which contains all three components:

Objective: Using a reproduction of the architect's blueprint of your school (the conditions), locate all restrooms and drinking fountains on the first floor (the performance) with 100 percent accuracy (the criteria).

Although content standards and objectives provide the focus for a good lesson plan, you still must use your experience, knowledge, and other resources to develop a lesson that will help your students effectively build the required content-related skills and knowledge. The standards, objectives, instructional activities, materials employed, and assessment measures are interrelated and should support each other. In other words, they must all be "in alignment."

For more information on converting content standards to objectives, see the textbook's companion website.
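Because every instructional objective carries the same three parts, it can help to picture an objective as structured data. The short Python sketch below is purely illustrative: the class name, fields, and formatting are our own invention, not part of any standard or of this textbook's materials.

    from dataclasses import dataclass

    @dataclass
    class InstructionalObjective:
        """One possible way to capture the three parts of an objective."""
        conditions: str   # how: materials, resources, constraints
        performance: str  # what: the skill or knowledge demonstrated
        criteria: str     # how well: what counts as success

        def __str__(self) -> str:
            return f"{self.conditions}, {self.performance} {self.criteria}."

    blueprint = InstructionalObjective(
        conditions="Using a reproduction of the architect's blueprint of your school",
        performance="locate all restrooms and drinking fountains on the first floor",
        criteria="with 100 percent accuracy",
    )
    print(blueprint)
    # Using a reproduction of the architect's blueprint of your school,
    # locate all restrooms and drinking fountains on the first floor
    # with 100 percent accuracy.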

Standards also do not routinely describe how students will be assessed. To be more useful, standards must be converted to something more measurable. One way to do that is to write instructional objectives (see Tools for Use); another way is to develop your assessment instruments. Objectives and assessments are really two sides of the same coin. Objectives describe the skills and knowledge your students must master; assessments provide the specific means for them to actually demonstrate those skills and knowledge. The assessment strategies and tools that you select should be closely aligned with the learning goals you have developed for your students; the content standards you have selected for your lesson and unit plans; and the teaching strategies, activities, and materials used in your lesson. For example, if your students will use science simulation software for an assessment, they must have the opportunity to learn the skills that are needed to perform the simulation, including practice with the software, if necessary, prior to completing the assessment. If your students will compose a letter using word processing software, they must be given adequate practice and experience not just in using a keyboard, but in using a keyboard for letter-writing activities. Regardless of whether you actually create your assessments prior to developing learning activities, assessment and learning activities are closely linked and should be considered together.

Alignment of Standards and Objectives

Let's review a few examples of content standards. Table 7.1 presents mathematics content standards and one supporting objective for three grade levels.

Table 7.1 Examples of Mathematics Content Standards and Associated Objectives

Grade 4
Standard: The learner will understand and use graphs, probability, and data analysis.
Objective: Describe the distribution of data using median, range, and mode.

Grade 8
Standard: The learner will understand and use graphs and data analysis.
Objective: Approximate a line of best fit for a given scatterplot; explain the meaning of the line as it relates to the problem, and make predictions.

Grade 12
Standard: The learner will collect and interpret data to solve problems.
Objective: Write and interpret an equation of a curve (linear, exponential, quadratic) that models a set of data.

Source: Adapted from Public Schools of North Carolina. (2006). North Carolina standard course of study.

The standards in Table 7.1 were selected to demonstrate similar types of skills in each domain, but at different grade levels. For example, the sample standards for mathematics are so similar across grade levels that they use much of the same terminology:

Grade 4: The learner will understand and use graphs, probability, and data analysis.
Grade 8: The learner will understand and use graphs and data analysis.
Grade 12: The learner will collect and interpret data to solve problems.

In these examples, the math standards and objectives relate to data use and graphing. Although the global nature of the standards suggests which skills and knowledge are necessary, they fall short in terms of specifics. Greater specificity is provided through the supporting objectives. Although only one objective is listed for each of these examples, content standards often have multiple objectives that, in combination, make up the requisite skills and knowledge your students must develop to demonstrate they have mastered the standards. Together, the standards and objectives make up the curriculum that your state asks you to teach. The complete standard and supporting objectives for the eighth grade mathematics example are given in Figure 7.1.
To achieve curriculum alignment, you must develop lessons that are matched to the standards. Similarly, you must develop or select assessments that allow students to demonstrate content mastery at the level demanded by the standards. This means a couple of things: first, that your assessments are aligned with your lessons in terms of knowledge and skills, and second, that they are aligned in terms of the format and materials used. For example, if your objective is for students to be able to compose a five-paragraph paper and your assessment is completed in a computer lab using word-processing software, then your instruction needs to provide opportunities for students to practice composing a five-paragraph paper using the word-processing software.

Standard 8.1: The learner will understand and use graphs and data analysis.
Objective 8.1.1: Collect, organize, analyze, and display data (including scatterplots) to solve problems.
Objective 8.1.2: Approximate a line of best fit for a given scatterplot, explain the meaning of the line as it relates to the problem, and make predictions.
Objective 8.1.3: Identify misuses of statistical and numerical data.

Figure 7.1 Eighth grade mathematics standard and objectives.
Source: Adapted from Public Schools of North Carolina. (2006). North Carolina standard course of study.

Figure 7.2 Alignment among objectives, activities, and assessment. (Diagram: objectives, activities, and assessment are linked to one another through shared knowledge and skills.)

If you don't provide access to the appropriate materials, then you cannot expect your students to demonstrate mastery of the objective. By keeping all of these components aligned, you increase the odds that your students will be successful. As illustrated in Figure 7.2, curriculum alignment ensures that instructional objectives, activities, and assessment measures are all directed toward student mastery of the same content knowledge and skills.

Developing Assessments

When you develop assessment instruments you need to consider the same three components you considered when writing instructional objectives: 1) the behavior, skill, knowledge, or attitude to be demonstrated, 2) the conditions under which they will be demonstrated, and 3) the criteria that specify the level of performance or knowledge required. As you may know, the behavior, skills, knowledge, or attitudes your students are required to master are outlined in your content standards (see Chapter 3). Performance conditions identify equipment, supplies, or other resources, including technologies, that are allowed during the assessment of student skills. They also include any time limits or other constraints imposed upon students as they are demonstrating the skill. Criteria clearly describe what acceptable performance looks like. Criteria can be conveyed through rubrics, checklists, or a simple list of what an acceptable answer should include (e.g., "Your report should include five sources and no grammatical or factual errors."). When criteria are shared before students embark on an activity, students can set goals for their own performances and can continue to monitor their performances throughout their learning, whether creating a web page, writing an essay, or preparing a presentation for the class. They can also determine the essential elements of complex projects, making the project seem more approachable even to struggling students.
Whether an intended consequence or not, the assessment activities you select will indicate to your students which skills and knowledge you feel are most important or worthwhile. If you test factual knowledge, students will believe that facts are most important; if you test critical and creative thinking skills, students will pay more attention to developing these skills. So, if your goal is to create a learning environment that promotes deep understanding of content, what type of assessments should you choose? How can you communicate the interest and excitement you have for your content area, or even your interest in lifelong learning, through assessments? Think about the messages you send to your students when you select and use your assessment methods. Do these messages include a focus on meaningful understanding or are they focused on how to pass a test at the end of the year? How do your classroom assessments communicate your expectations to your students?
There is no magic formula to determine the right assessment task or the appropriate technology for every performance goal or assessment task. In fact, a variety of technologies and formats may fit any number of assessment tasks and can be used equally well in multiple content areas. In practice, you should use a variety of assessment formats and tools that are matched to your students' learning goals and that provide an adequate picture of student understanding.

Assessment Formats and Technologies That Support Them

There is little agreement regarding the classification of assessment formats. In fact, some complex assessments can blur the lines between different formats and include elements of several. We group assessment formats into four broadly defined groups: 1) forced-choice assessments, 2) open-ended response assessments, 3) performance-based assessments, and 4) project-based assessments. Characteristics of each of these four broad categories and the technologies that support them are explored next.

Forced-Choice Assessment Formats

Review the continuum, presented in Chapter 1, that describes the common stages of technology integration through which teachers progress. Remember that one of the first ways that teachers build their skills with new technologies is to replicate familiar strategies. This is true for both instruction and assessment, and in terms of assessment, most people are familiar with forced-choice question formats. These are the customary multiple-choice, true/false, matching, and fill-in-the-blank question formats that are popular in large-scale exams and that have influenced the assessment practices of many teachers.

A benefit of this assessment format is that it can be quick to administer and score. Also, items can be readily developed or obtained in these formats, especially for constructs with lower levels of cognitive demand. It is possible, but difficult, to develop forced-choice questions that accurately assess skills and knowledge at a higher level of cognitive skill, for example when students are required to apply, analyze, or evaluate content. Depending on your level of technology proficiency, and that of your students, there are a variety of technologies that support the inclusion of forced-choice formats in your assessment toolkit.

You will probably have access to a variety of electronic test-item banks, including ones that you and your colleagues develop. Item banks may be available from publishers of state-adopted resources, such as textbooks, or may be available for purchase from assessment vendors. Some states and districts offer online access to item banks that may include items from previously administered large-scale assessments. Test-item banks may provide good formative and summative assessment items for your own classroom.

Items for summative assessments must be kept secure; however, some states and districts are tackling the development of online formative assessments that are available to teachers, students, and parents at all times on an on-demand basis.

New Trends in Computer-Adaptive Testing

Computer-adaptive testing (CAT) is based on item-response theory (IRT) and has the potential to create an optimal test for every individual (Meijer & Nering, 1999). CAT has been applied to large-scale testing by the military (Fletcher, 1992), admissions testing for higher education (Bennett, 2001), and some statewide assessments (ETS, 2001). CAT assessments attempt to present an appropriate number and sequence of questions based on 1) a determination of the student's level of understanding or proficiency, and 2) the construct being measured. In simple terms, it works like this: A question is presented at a level that is determined to be in the mid-range of the student's ability—although this determination can be difficult. When the student answers a question correctly, the following questions become progressively more difficult. When the student answers a question incorrectly, the following questions become easier. The test continues until some predetermined level of accuracy is established (Wainer, 2000).
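To make the adaptive logic described in the box concrete, here is a minimal Python sketch. It assumes a bank of items tagged with difficulty levels 1 through 5 and simply steps the level up or down after each answer; real CAT systems estimate ability with IRT models rather than this simple rule, so treat the code as a toy illustration only.

    import random

    def adaptive_quiz(item_bank, answer_is_correct, start_level=3, max_items=10):
        """Toy adaptive test: raise difficulty after a correct answer,
        lower it after an incorrect one (a stand-in for IRT estimation)."""
        level = start_level
        results = []
        for _ in range(max_items):
            candidates = [item for item in item_bank if item["level"] == level]
            if not candidates:
                break
            item = random.choice(candidates)
            correct = answer_is_correct(item)
            results.append((item["text"], level, correct))
            # clamp the difficulty level between 1 and 5
            level = min(5, level + 1) if correct else max(1, level - 1)
        return results

    # Demo: a simulated student who can answer items up to level 3
    bank = [{"text": f"Q{i} (level {lvl})", "level": lvl}
            for lvl in range(1, 6) for i in range(3)]
    student = lambda item: item["level"] <= 3
    for text, level, correct in adaptive_quiz(bank, student):
        print(text, "correct" if correct else "incorrect")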

These online formative assessments capitalize on the growing number of online assessment systems—both state-based and vendor-provided—that have been developed since the late 1990s. Many online assessment systems began by replicating familiar paper-and-pencil formats, the most popular of which is the forced-choice response (Bennett, 1998).
Of course, there is a variety of testing software that supports forced-choice question formats. Scannable test forms, available in several formats, are used in many classrooms. As you probably know, students use a question booklet (or sheet) to read the questions and indicate their answers by filling in the appropriate "bubbles" on a form that can be read by a test scanner. Open-response questions (e.g., essay or short-answer questions) can also be included on some forms, but you'll probably still have to grade these responses yourself.

Online testing software is available in many schools and ranges from highly secure formative and summative tests that are matched to state content standards to editable forms you create yourself. Even if your school or district does not have access to a commercial online quiz generator, several free services are available online (see Figure 7.3). Many of these require you to register your students, often using a student identifier, so that results can be e-mailed to you or accessed later. Check with your school first before using any service in which student information is recorded.
A popular new technology that supports the use of forced-choice items is the wireless responder. There are a variety of wireless tools that can be purchased and used with or without additional hardware and software. These small tools look similar to television remote controls and allow individual students to "beam" responses anonymously, or at least confidentially, to questions posed to an entire class. Simple versions allow students to select from one of four to six buttons (e.g., A, B, C, or D, depending on the corresponding answer choice), with more complex versions available that allow students to work out problems or enter calculated responses, such as the solution to a math problem. The students' responses are beamed using infrared or radio frequency to a central source that is connected to a computer at the front of the room. These systems may also be connected to or built into some type of display hardware such as an interactive whiteboard (see Chapter 5).

Figure 7.3 Hot Potatoes is popular software for developing questions for students.
Source: © 2003–2008 Half-Baked Software Inc.

THE GAME PLAN
Forced-Choice Assessments

Set Goals
Learn more about one of the technologies used to support forced-choice assessments. Choose one of the technologies presented in this chapter such as item banks, online assessments, or wireless responders.

Take Action
Review examples of these tools by investigating them online, visiting a school that uses them, reading about them in journals or magazines, or attending an educational conference with a vendor display area.

Monitor
Determine how the assessments match the demands of your content areas. Talk with other students in your class or with practicing teachers to determine how the tools can be used in your own teaching.

Evaluate and Extend
Discuss the strengths and weaknesses of the technology with others and determine strategies for incorporating the technology in your classroom.

Detailed data collection, analysis, and reporting can occur instantaneously. This type of live polling of responses is ideal for monitoring learning through formative assessment and can help you and your students quickly determine content areas that require further instruction or where there are obvious gaps in understanding. Student data can also be stored and tracked over time to give you a picture of individual student growth and the need for supplemental instruction.
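A sketch of the kind of tally a responder system produces is shown below. It assumes responses arrive as (student, answer) pairs and flags an item for reteaching when fewer than 80 percent of students answer correctly; the data shape and the threshold are our assumptions, not features of any particular product.

    from collections import Counter

    def summarize_item(responses, correct_answer, mastery_threshold=0.8):
        """Tally one question's clicker responses and flag the item
        for reteaching if too few students answered correctly."""
        counts = Counter(answer for _, answer in responses)
        total = len(responses)
        pct_correct = counts[correct_answer] / total if total else 0.0
        return {
            "distribution": dict(counts),
            "percent_correct": round(100 * pct_correct, 1),
            "reteach": pct_correct < mastery_threshold,
        }

    # Six students respond to a question whose key is "B"
    responses = [("s1", "B"), ("s2", "B"), ("s3", "C"),
                 ("s4", "B"), ("s5", "A"), ("s6", "B")]
    print(summarize_item(responses, "B"))
    # {'distribution': {'B': 4, 'C': 1, 'A': 1}, 'percent_correct': 66.7, 'reteach': True}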

Open-Ended Response Formats

Because it is difficult to write high-quality forced-choice questions that truly tap into higher-order thinking skills, open-ended questions are often employed when students have to demonstrate higher levels of cognitive skill, such as the application or synthesis of rules, procedures, or concepts, or to demonstrate creative and original thoughts and ideas.

What does the following bring to mind? "Explain in your own words what is meant by . . ." How about, "Compare and contrast the following . . ."? These are examples of prompts for open-ended questions in the form commonly known as short-answer or essay questions. Responses may be restricted ("in 100 words or less . . .") or extended for more in-depth responses. You've undoubtedly encountered many of these questions in your career as a student and may have a few yet to experience. Although these formats are familiar, you should consider why and when to use open-ended responses and their implications for your assessment practices.
On the plus side, open-ended questions often take very little time to develop. On the scoring end, however, they usually take longer than forced-choice responses, in which there is only one correct answer that can be scored by machine. If not carefully planned and inspected by peers and other content experts, open-ended questions can be less reliable, due primarily to the subjective nature of scoring them.

Any technology that supports text entry can be used to incorporate open-ended text responses. But this extends beyond word-processing software, as a variety of communication and collaboration tools exist that allow students to respond to your queries. Students can extend their learning through participation in online communications via threaded discussions, e-mail, or even chat software. However, colloquialisms familiar to the chat and IM (instant messenger) world often do not follow standard language usage and may have limited application in settings where proper grammar or formatting is required. Depending on what you are assessing, you may or may not want your students to express IMHO (in my humble opinion), LOL (laugh out loud), or to wink ;-) at each other. In this case, technology may actually lead to superficial assessment activities.
Journaling is a common activity and can be used to facilitate self-reflection and self-assessment. Journaling can be supported by a variety of technologies, such as common word-processing software, designated journaling software, and blogs. Many word-processing applications include note and tracking features that allow you and your students to incorporate suggestions, options, and revisions within a single document. They also include tools, such as grammar and spell checkers, that can support students with difficulties in these areas when critical thought outweighs the need for mechanics. The use of grammar, spelling, and other writing tools is an important part of communicating effectively when using information and communications tools in the 21st century.

Both stand-alone and web-based tools can incorporate video, graphics, and other media that allow students to support their text-based entries. Web-based communications such as blogs or threaded discussions allow for an added layer of complexity, as you, other students, and even the original author can return to a posting and provide critique, clarification, or demonstration of further understanding or skill. This type of collaborative journal can support peer review as a type of formative assessment. Unique to threaded discussions and online forums is that messages can be sorted and organized by parameters such as author, date, or subject, making it easy to follow a student's argument or line of reasoning, and even to demonstrate growth over time.

Performance-Based Assessments

There are a variety of ways students can demonstrate mastery through performance. Performance-based assessments are possible in all content areas but may most easily be exemplified by domains that require oral communication skills or the development of psychomotor skills in conjunction with other content knowledge, such as sports, the fine arts, and many lab sciences. Performances can very quickly extend beyond the demonstration of rudimentary skill and can require students to demonstrate very complex behaviors that may exhibit choices based on personal values and creativity.

Oral communication is a ubiquitous teaching and assessment strategy. Teachers ask their students questions to determine prior knowledge, levels of understanding or misunderstanding, or simply to clarify a point. This type of questioning and dialog can be informal or can be used in formal settings, such as in the case of an oral exam or interview. And although this is a book about the use of technology, it's important to emphasize that technology should only be used when it facilitates learning, and not simply as a novelty. Sometimes you'll just want to ask a question. No technology required. But there are some ways that technology can support assessment through dialog, primarily through recordkeeping.

The ease with which digital video can be captured and edited allows it to serve as a tool for demonstrating student skills and knowledge. As demonstrated by the hundreds, if not thousands, of live early morning news shows at elementary schools across the nation, even very young students can master basic video capture and editing to record their progress. Class or small group discussions can be captured and stored in portfolios. Student presentations can be recorded and kept as a record of content understanding. And although this section focuses on technology that supports student assessment, the value of using videotape to record your own teaching as a means of evaluating and developing skills (as presented in Chapter 12) cannot be overstated.
Technology can provide a means for recording critical early literacy and numeracy development—foundational skills for later learning. In early literacy classes, the assessment technique of "running records" (see Figure 7.4) allows teachers to quickly score student performance on short reading passages to gauge student fluency, expression, accuracy, and confidence.

Figure 7.4 Wireless Generation uses handheld computers to allow teachers to gather and compile student responses almost instantaneously.
Source: http://www.wirelessgeneration.com/reading3d_demo3.html. © 2000–2008 Wireless Generation, Inc.

Developed in a print-based world, running records are now supported by handheld computers (PDAs), from companies such as Wireless Generation, that provide the added benefit of quick data reporting. Running records for an entire classroom of students can be completed in less than a class period, and data from these records can then be aggregated and reported almost instantaneously—something that takes much longer to accomplish by hand. You can make changes to your instruction as needed, even daily if necessary, based on real-time data collection. Handheld devices support other literacy assessments as well, such as oral retelling and assessments of phonemic awareness and phonics development. Whether based on formal or informal observations, many student information systems allow you to quickly add notes to student records that can be accessed and recorded on handheld and laptop computers, making student records dynamic.
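The calculations behind a running-record report are simple enough to sketch. The function below computes an accuracy rate and words correct per minute from a passage length, an error count, and an elapsed time; these two measures are standard in fluency assessment, but the data format here is our own illustration rather than any vendor's.

    def running_record_summary(total_words, errors, seconds):
        """Compute accuracy and words correct per minute (WCPM),
        the core figures a running-record app reports."""
        correct = total_words - errors
        accuracy = correct / total_words
        wcpm = correct / (seconds / 60)
        return {"accuracy_pct": round(100 * accuracy, 1), "wcpm": round(wcpm, 1)}

    # A student reads a 110-word passage in 75 seconds with 6 errors
    print(running_record_summary(110, 6, 75))
    # {'accuracy_pct': 94.5, 'wcpm': 83.2}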

Project-Based Assessments

Related to performance-based assessments is the creation of products to support project- and problem-based learning, as introduced in Chapter 3. Although project-based assessments are especially well suited for formative purposes, they are also employed in summative settings, such as the generation of capstone projects at the end of a year or course of study. There are a variety of methods for incorporating project-based assessments in a classroom, as well as many different tools to support them.

Project-based assessments are often linked to a category called authentic assessments. In Chapter 3 we introduced you to the concept of authentic intellectual work and described how authentic instruction uses real-world contexts to engage students in the actual work of a discipline. In an authentic assessment, students are required to demonstrate understanding of concepts and perform skills within the context of that authentic activity, that is, by replicating real-world performances as closely as possible (Svinicki, 2004). In these cases, the assessments may be so intricately embedded in, or linked to, the instruction that it may not be apparent to students that there is a formal assessment.

In science classes, students can perform experiments using probes and other measurement devices that scientists use and record their findings in a laboratory notebook—digital or paper; students can demonstrate writing proficiency by using word processing and layout software to create brochures or newspapers; and students in math classes can use GPS devices to measure buildings and land forms and generate scale drawings that show the application of concepts such as area.
Wiggins (1998) lists six characteristics of an authentic assessment:

1. The assessment is realistic; it reflects the way the information or skills would be used in the "real world."
2. The assessment requires judgment and innovation; it is based on solving unstructured problems that could easily have more than one answer and, as such, requires the learner to make informed choices.
3. The assessment asks the student to "do" the subject, that is, to go through the procedures that are typical to the discipline under study.
4. The assessment is done in situations as similar as possible to the context in which the related skills are performed.
5. The assessment requires the student to demonstrate a wide range of skills that are related to the complex problem, including some that involve judgment.
6. The assessment allows for feedback, practice, and second chances to solve the problem being addressed.
Project-based assessments, whether taking the form of authentic assessments or not, can also support pedagogies related to problem-based learning. As described in Chapter 3, problem-based learning, also called PBL, can help students meet the demands of standards and learning goals that require higher-order thinking skills, such as those related to problem identification, selecting and monitoring strategies for solving the problem, applying knowledge, and evaluating the success of one's efforts. Problem-based learning can also utilize performance-based assessments, as a well-designed problem can easily require students to demonstrate new knowledge and skills through performance, with the only limitation being the appropriate fit to the content being explored.

Think of the typical science fair project. This is a project that is often built around a specific problem, whether determined by the student or the sponsoring organization. The problems often meet the requirements for authentic assessment, as students are required to perform like scientists in terms of the research they complete and the tools they use. They also usually complete some type of performance in terms of explaining their projects and the new understandings and skills they have developed. So in this case, you may have an authentic, problem-based project that is assessed via performance and a product!

These distinctions are not as critical as developing assessments that appropriately meet the demands of your curriculum and the needs of your students—it usually doesn't matter if you can explicitly state whether you are engaging in project- or problem-based activities, or both. The bottom line is that project-based learning typically results in some type of product, perhaps a web page or a multimedia presentation, and it may or may not include some type of performance, such as an oral report or class presentation using the products students have created.

Technologies to Support Performance- and Project-Based Assessment

Products and performances can incorporate a wide variety of technologies, from research papers composed using word-processing software to multimedia projects that include graphics, video, and audio. Students may conduct research on the web, use a digital camera to take pictures to support their presentations, create graphics using drawing programs, and demonstrate their knowledge using presentation software. Obviously, there are many tools you can use to support performance- or project-based learning activities. In the following discussion, we'll focus on just a few of the many technologies that can help you assess student learning when using performance- and project-based assessments: 1) concept maps, 2) simulations, and 3) portfolios and work samples.

Concept Maps

As you recall from Chapter 4, concept mapping is a graphic technique for representing student understanding. Concept maps traditionally consist of nodes representing concepts and links that show the connections between nodes (see Figure 7.5 for an example of one type of concept map). Concept-mapping software is available for use by young students; however, the concepts that can be addressed and the resulting maps can become extremely complex, and thus concept mapping is often used more with older students. Concept mapping is not dependent upon technology, but concept-mapping software is widely available for facilitating the process, and a variety of map templates are readily available, including Venn diagrams (see Figure 7.6), plot outlines, timelines, lab reports, and many others. Popular concept-mapping software can also toggle back and forth between the visual map layout and a corresponding text-based outline, allowing students to generate and review content in graphic or text form.

The nodes and links in a concept map help reveal student thinking and can illuminate misconceptions. Your students can compare their maps to those created by experts—including yourself—and this comparison can provide specific cognitive feedback essential to good formative assessment.

Figure 7.5 Example of a concept map. (The map organizes types of rocks: igneous, sedimentary, and metamorphic, with links showing how each type is formed, what it is composed of, and examples such as granite, basalt, sandstone, shale, limestone, quartzite, and schist.)
Figure 7.6 Example of a Venn diagram. (The diagram compares dolphins and sharks: dolphins evolved 12 to 15 million years ago, all bear live young, take in oxygen through the air, and have skeletons made of bone and horizontal caudal fins; sharks evolved 40 million years ago, some lay eggs, take in oxygen through the water, and have skeletons made of cartilage and upright caudal fins; both live in open water as well as near and offshore.)

Although paper-and-pencil concept maps may also be analyzed for understanding and student misconceptions, the widespread availability of concept-mapping software increases their utility. As such, concept mapping has become a popular technology in classrooms at all grade levels.

One concern regarding the use of concept maps for assessment is how to implement concept mapping consistently across classrooms to generate valid and reliable results. Although technology can support concept mapping, it cannot resolve this human-dependent implementation issue. Concept maps do rate high in terms of utility, however, and can be integrated into many different content areas.
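One way to score a concept map against an expert's is to represent each map as a set of labeled links (propositions) and compare the sets: matched links suggest understanding, missing links suggest gaps, and extra links may point to misconceptions. The sketch below is one simple scheme under our own assumptions; scoring approaches in the research literature vary, so the proportion used here is only one choice.

    def map_agreement(student_links, expert_links):
        """Compare a student's concept map to an expert's by matching
        labeled links, ignoring link direction."""
        def normalize(links):
            return {tuple(sorted((a, b))) + (label,) for a, label, b in links}
        student, expert = normalize(student_links), normalize(expert_links)
        return {
            "matched": len(student & expert),
            "missing": sorted(expert - student),  # possible gaps
            "extra": sorted(student - expert),    # possible misconceptions
            "score": len(student & expert) / len(expert) if expert else 0.0,
        }

    expert = [("igneous", "formed by", "cooling magma"),
              ("sedimentary", "formed by", "breaking down rocks")]
    student = [("igneous", "formed by", "cooling magma"),
               ("igneous", "formed by", "breaking down rocks")]
    print(map_agreement(student, expert))  # matched: 1, score: 0.5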

Simulations

Simulation software can provide access to learning activities that might otherwise be difficult or impossible to create in a classroom. The simulations available for use in your classroom range in sophistication from simple animations that can be found free online to sophisticated virtual environments. You can access online simulations of weather-related events or space missions and collect student performance data using an electronic journal or laboratory notebook. Students can access historical simulations and work in small groups to create presentations to give to the rest of the class, complete structured worksheets, or write reflective essays guided by questions you pose.

Simulation software is unique in that it not only provides opportunities for learning but can support assessment as well (see Figure 7.7). Simulation software can capture different elements of student performance data, such as the paths students follow through the software and the choices they make. That data can then be analyzed, reported, and used to make judgments about skill and knowledge proficiency.
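Commercial simulations handle this kind of capture internally, but a sketch shows how little is involved. The class below is hypothetical: it records each choice a student makes, with a timestamp, so the path through the activity can be reviewed or scored later.

    import json
    import time

    class SimulationLog:
        """Record the choices a student makes in a simulation so the
        path through the activity can be reviewed or scored later."""
        def __init__(self, student_id):
            self.student_id = student_id
            self.events = []

        def record(self, step, choice, outcome):
            self.events.append({"time": time.time(), "step": step,
                                "choice": choice, "outcome": outcome})

        def save(self, path):
            with open(path, "w") as f:
                json.dump({"student": self.student_id,
                           "events": self.events}, f, indent=2)

    # A student adjusting a ballistics simulation (see Figure 7.7)
    log = SimulationLog("student_17")
    log.record("set launch angle", "60 degrees", "overshot target")
    log.record("set launch angle", "45 degrees", "hit target")
    log.save("student_17_ballistics.json")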
It's easy to imagine how games and virtual environments could be used for assessment as well. You may have to generate scoring procedures for some simple simulations or educational games that do not have data analysis and reporting features built into them. You can combine the use of these software applications with other assessment methods, such as journals or short quizzes, in order to check students' progress toward their learning goals. However, the selection and use of simulations, games, or virtual environments for assessment should meet the same rigor as selecting any other technology in your classroom and should appropriately match the needs of the content, your students, and the learning goals they are trying to meet.

Figure 7.7 This ballistics simulation has a simple form of self-assessment: once the student correctly manipulates the cannonball to hit the target, the target bursts into flames. Courtesy Oak Ridge National Laboratory.
Source: www.csm.ornl.gov


Portfolios and Work Samples

As you already know through the development of your own professional portfolio in conjunction with this textbook, portfolios contain examples of work that can be compared to competencies or standards—often through the use of some type of checklist or rubric. There are varying opinions as to the different types of assessment portfolios and what they are called. However, depending on the needs of your students and your school, you may choose to incorporate one of two different types of assessment portfolios, or perhaps a hybrid of the two. The first demonstrates growth by incorporating work samples that show a range of student proficiency over time and may be called a dossier, process, or documentation portfolio. By including original and revised versions of work, these portfolios can demonstrate changes in understanding or proficiency and can show progress over time. The second type of assessment portfolio contains samples of exemplary work and is often referred to as a showcase portfolio. This latter form of portfolio is common for assessing entry into a program. For example, the Advanced Placement (AP) Studio Art General Portfolio Program used by the College Board combines student artwork, photographic slides, and a written student reflection to determine whether student applicants receive college credit for their art skills.

Print-based versions of portfolios have made the transition to digital formats, appearing on self-contained storage media such as CDs and DVDs and in web-based portfolios. There is software designed specifically to support portfolio development, but use caution when selecting this software. Be sure you take steps to ensure that students can continue to access their work after they've left your school or classroom.

You don't want them to depend on proprietary software that is not readily accessible. By creating student work samples in common digital formats, such as word-processing documents, portable document format (PDF), graphic files, and HTML pages, you may be able to store these artifacts in a portfolio management system as well as provide copies to students for use in common software applications. In fact, as Helen Barrett from the University of Alaska demonstrates on her portfolio website, just about any type of software can support the development of portfolios. As you learned in Chapter 1, Dr. Barrett has created versions of her portfolio in word-processing, database, spreadsheet, and several multimedia applications and has posted them on the web using HTML, graphics, video, and other formats.

WEB LINK: Review Dr. Helen Barrett's work on portfolios by visiting the textbook's companion website.
When incorporating portfolios in your instruction and assessment, it is important to determine an organizing structure in advance and make it apparent to your students. For example, your own portfolio for this class is probably organized around the ISTE NETS-T. To facilitate learning and student growth, your students themselves should determine which artifacts to include. However, the guidelines and criteria for the selection of materials contained in the portfolio should be explained and made clear to your students. And sometimes you may want all students to include a specific artifact or work sample as a means of monitoring their mastery of expected standards.
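At its core, a digital portfolio boils down to artifacts in common file formats plus metadata linking each artifact to the standards it demonstrates. The manifest below is a hypothetical sketch of such an organizing structure; the field names, file names, and standard codes are invented for illustration.

    import json

    portfolio = {
        "student": "student_17",
        "organized_by": "content standards",
        "artifacts": [
            {"file": "scatterplot_report.pdf", "standards": ["8.1.2"],
             "type": "showcase", "reflection": "reflection_8_1_2.txt"},
            {"file": "data_project_draft1.pdf", "standards": ["8.1.1"],
             "type": "growth", "reflection": "reflection_8_1_1.txt"},
        ],
    }

    def artifacts_for(manifest, standard):
        """List the artifacts a student has tagged to one standard."""
        return [a["file"] for a in manifest["artifacts"]
                if standard in a["standards"]]

    print(artifacts_for(portfolio, "8.1.2"))  # ['scatterplot_report.pdf']
    print(json.dumps(portfolio, indent=2))    # a portable, software-neutral record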
As you know, portfolios also serve as a means for self-assessment and self-reflection, as it is common for students to write about or explain why they have selected the artifacts in their portfolios, what those artifacts demonstrate in terms of their learning or understanding, and why they feel the artifacts are exemplars. Your students will likely need guidance and practice in creating reflections. (See information about creating self-reflections in Chapter 2.) Reflection provides a natural opportunity for providing cognitive feedback to your students as you review and discuss the portfolio artifacts with them.

Both the process of artifact selection and the subsequent assessment of portfolios rely on some subjective judgment. Also, portfolio use is not standard across classrooms. Your students' portfolios will be highly dependent on your classroom practices, and not all teachers will place the same emphasis on portfolio development, allot the same amount of class time for their development, provide the same access to outside resources or help, or collaborate with students to the same degree in terms of selecting relevant artifacts. However, the main point to remember is that although portfolios pose problems when compared across classrooms, one of the primary benefits of portfolio use is that they can help teachers and students within a classroom become more systematic in analyzing and learning from work samples than would normally occur during instruction (Shepard, 2000).

Technologies to Support Project-Based Learning

Investigate the use of at least one technology tool, whether presented in this chapter or not, that can be used to support the creation of a project in your area of content expertise and demonstrate student mastery of content standards.

1. Identify a technology you have little experience with and find a sample to use on your own. You may be able to download freeware or trial versions of concept-mapping software, check out software or hardware from the library, or borrow from a local school.
2. Develop a small-scale project using the tool that demonstrates how you might incorporate it into your own teaching. How is the tool helpful? What limitations does it have, if any? What resources or training are required to use it fully?
3. Share your findings with others in your class.

Summary of Assessment Formats

Formative and summative assessments can be classified as forced-choice, open-ended, performance-based, or project-based. This section introduced you to a variety of technologies that can support these formats. But assessments are useless unless you score them to evaluate students' progress and determine whether your students learned what they set out to learn. Common scoring procedures are the focus of the next section.

Scoring Expectations and Practices

Once you've selected an assessment or assessment format, you must determine a method for obtaining an appropriate score for each student's performance. Scores, or grades, receive a great deal of attention in education, and you want to be sure that your assessments not only adequately allow students to demonstrate their proficiencies, but that the judgments you make based on those assessments are fair and accurate. Some scoring practices and guidelines are provided below.

Scoring Keys

When an assessment allows you to make an objective judgment as to whether a response is right or wrong, grading is rather straightforward. Your student either got it right or not. When students are required to choose between a limited set of possible answers, as with multiple-choice, matching, and true-false questions, grading guidelines consist of a list of correct responses. These forced-choice assessment formats easily lend themselves to computerized test administration and scoring.

A popular technology for scoring forced-choice responses is a test scanner. These scanners use the famous "bubble sheets" with which you are undoubtedly familiar. You can create multiple test forms to increase security, and a test scanner can score all forms within a few minutes. Test scanners also allow you to do an item analysis of an assessment to determine whether any items were too easy or too hard, which can help you improve the match between your standards, instruction, and assessment.
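The heart of an item analysis is the difficulty index: the proportion of students who answered each item correctly. The sketch below computes it from a scoring key and a set of answer sheets; the 0.9 and 0.3 cutoffs used to flag items as "too easy" or "too hard" are common rules of thumb, not fixed standards.

    def item_difficulty(answer_sheets, key):
        """Compute each item's difficulty index (proportion correct)
        and flag items that may be too easy or too hard."""
        n = len(answer_sheets)
        report = {}
        for i, correct in enumerate(key):
            p = sum(sheet[i] == correct for sheet in answer_sheets) / n
            flag = "too easy?" if p > 0.9 else "too hard?" if p < 0.3 else "ok"
            report[f"item {i + 1}"] = (round(p, 2), flag)
        return report

    key = ["B", "D", "A"]
    sheets = [["B", "D", "C"], ["B", "D", "B"], ["B", "A", "D"], ["B", "D", "A"]]
    print(item_difficulty(sheets, key))
    # {'item 1': (1.0, 'too easy?'), 'item 2': (0.75, 'ok'), 'item 3': (0.25, 'too hard?')}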
Some open-ended test formats require constructed responses that also can be judged right or wrong. Short-answer questions and even some essay tests can be scored using keys that consist of a list of acceptable answers. Scoring keys for these open-ended formats should include common variations in wording for each acceptable response. Although this type of test can also be administered and scored by computer, developing a program to score constructed responses is more difficult than developing one to score forced-choice responses.
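A minimal sketch of machine scoring for short constructed responses is shown below: the key lists acceptable wordings, and responses are normalized (case, punctuation, spacing) before matching. Real scoring engines go much further, but even this toy version shows why a key must anticipate variations in wording.

    import re

    def score_short_answer(response, acceptable_answers):
        """Match a constructed response against a key of acceptable
        wordings, ignoring case, punctuation, and extra whitespace."""
        def normalize(text):
            return re.sub(r"[^a-z0-9]+", " ", text.lower()).split()
        return any(normalize(response) == normalize(variant)
                   for variant in acceptable_answers)

    key = ["line of best fit", "best-fit line", "trend line"]
    print(score_short_answer("Best-fit LINE!", key))   # True
    print(score_short_answer("the scatterplot", key))  # False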
You may have already taken essay tests that have been scored by a computer without realizing it. At one time, these student responses were scored by multiple human raters who had to undergo intensive training and were reviewed frequently for their consistency and reliability. In several state and national tests, essays are now scored strictly by technology using artificial intelligence. Vantage Learning has developed software called IntelliMetric that uses artificial intelligence (AI) and examples of student writing to review and score student writing assessments. The computer scoring has been compared to and often surpasses the reliability of human scoring. The problem is that the scoring engine requires hundreds, if not thousands, of student samples to provide the best results, so it is still impractical for use in most classrooms. However, some states that have shifted to the assessment of student writing online also provide websites that teachers and students can use to practice and receive feedback.

Rubrics and Checklists


This chapter has introduced a variety of assessments that cannot be scored neatly by
a completely objective format. Class discussions, observations, open-ended questions
and essays, problem-based learning projects, and portfolios offer unique challenges to

scoring. Usually, teachers turn to the use of checklists or rubrics to score these types
of assessments and to reduce the subjective quality of their judgments.
Checklists are a simple way to score the observation or demonstration of a skill.
These can be factual recall skills or more complex skills involving analysis and evalu-
ation. You can create a simple yes-no checklist in which the parameter or skill you are
looking for is deemed either present or not. For example, the science lab checklist in
Figure 7.8 simply lists standard safety behaviors students should follow.
A more complex checklist can use a scale that indicates the degree to which the
parameter you are looking for is present, from a little to a lot. A checklist can help
you monitor your students’ cognitive development. That is, you can determine gaps
in student schema (or patterns of understanding), which can help you determine
whether they need reteaching or other supplemental instruction. From a develop-
mental perspective, you can monitor what your students are able to do indepen-
dently and how their skills develop over time.
You can also use checklists for specific assessments, such as when observing a stu-
dent demonstrating required concepts during a think-aloud. In the think-aloud pro-
cess, students verbalize what they are thinking when completing a task, such as read-
ing a story, solving a math proof, or even working on complex problems that may
require advanced skill in collaboration with others. The think-aloud process helps to
generate artifacts of student cognition, as students are actually telling you what they
are thinking, the strategies they are using, and questions and concerns they have. The
chart in Figure 7.9 shows a simple method for collecting data from a think-aloud pro-
tocol used to monitor comprehension of a reading passage. Checks are placed in the
corresponding column each time a behavior is observed. Comments can be
inserted in the final column but are not necessary for a simple performance audit.
Checklists can also be used to score projects or performances. In this case, some
care should be taken to determine whether the presence of an item on the checklist
actually corresponds to student understanding and skill. Are you grading the student
or the project? Multiple measures of assessment may help you develop a clear picture
of student understanding.
Science Lab Checklist

During science labs, please make sure that you
☐ Store backpacks and personal items in the storage area at the back of the room.
☐ Log on to the school network appropriately and keep your password and data files secure.
☐ Do not eat or drink in the laboratory.
☐ Restrain long hair and loose clothing and wear laboratory aprons when appropriate.
☐ Wear gloves and keep exposed skin away from all chemicals.
☐ Send only one lab partner to the supply room at a time.
☐ Close laptops before transporting them and carry them with two hands.
☐ Put all laptops, probes, and other equipment back in their designated storage spaces.
☐ Thoroughly clean all work surfaces and equipment after each use.
☐ Make certain hot plates, burners, gas, and water are turned off before leaving the laboratory.
☐ Frequently back up all lab data on your designated folder, including journal entries and worksheets.
☐ Sign off from the network at the end of class.

Figure 7.8
Science lab checklist.

Checklists can be developed and implemented using a variety of technologies, and
since their purpose is to quickly and easily collect data, the use of technologies to
capture, store, and report that data makes them even more powerful. Some simple
checklists may be available for use on PDAs or tablet computers that allow you to
enter data onscreen with a stylus. And although not every helpful checklist or inven-
tory is currently available for handheld devices, common productivity software, such
as word-processing and database software, is supported by many of these devices and
can allow you to quickly develop those checklists you use regularly for assessment.
Storing the results of checklist data in spreadsheets and databases provides powerful
analysis and reporting features. Simple summaries and graphs can be created quickly
that give you individual and group profiles, and also can help you determine the need
to modify your instruction for reteaching or enrichment. Portable devices can also
support many common observation tools, such as checklists of content-based beha-
viors and interview protocols.
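
As a concrete illustration, the short Python sketch below uses invented observation records, not any particular product's file format, to tally how often each checklist behavior was observed across a class: the kind of group profile a spreadsheet or database can generate.

from collections import defaultdict

# Invented checklist records; a spreadsheet or database would store the same
# fields. Each record is one observation of one student on one behavior.
records = [
    {"student": "Ana", "behavior": "Make predictions",              "observed": True},
    {"student": "Ana", "behavior": "Talk through confusing points", "observed": False},
    {"student": "Ben", "behavior": "Make predictions",              "observed": True},
    {"student": "Ben", "behavior": "Talk through confusing points", "observed": True},
]

# Group profile: how many students demonstrated each behavior?
observed = defaultdict(int)
total = defaultdict(int)
for record in records:
    total[record["behavior"]] += 1
    if record["observed"]:
        observed[record["behavior"]] += 1

for behavior in total:
    print(f"{behavior}: {observed[behavior]} of {total[behavior]} students")
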
Generally, checklists are rather one-dimensional. Usually, either the students did the
tasks or they didn’t. Rubrics, however, provide an added dimension that allows both you
and your students to determine gradations in quality. Rubrics are common methods for
assessing performance-based projects, especially those supported by technology. Rubrics
are malleable and can be created for any content area and assessment mode, such as the
scoring of projects, essays, portfolios, or live or videotaped student performances.
Rubrics are framed by some type of scale, but the degrees of the scale are clearly
described or defined to demonstrate different levels of quality. Generally, a three-,
four-, or five-point scale is manageable depending on the complexity of the task, proj-
ect, or performance to be scored. Too many “quality” levels for a simple skill or too
few for complex skills erode the effectiveness of the rubric. Another consideration is
whether to begin your rubric scale at no points (0) or 1. Your rubric should relate to
the standards or learning goals for the activity, lesson, or project and the descrip-
tions should clearly describe the levels of performance rather than subjective judg-
ments (Brookhart, 1999). For example, a descriptor for an exemplary writing sample
that notes that all sentences and proper nouns begin with a capital letter is a clear
description of the expected level of performance, whereas use of the terms “good”
or “weak” is subjective and provides little concrete feedback or justification.
Rubrics can be analytic or holistic (Brookhart, 1999). An analytic rubric breaks
the assessment down into component categories (see Figure 7.10). For example, an
analytic rubric for a student history presentation may include categories about accu-
racy of information, proper grammar and spelling, writing style, as well as elements
of design. A holistic rubric may have descriptors that touch on each of these ele-
ments but it does not break them down into separate rating scales per category (see
Figure 7.11).
Rubrics have become popular due to their valuable pedagogical aspects. Rubrics
can help you determine the activities and resources needed in your instruction.
Since rubrics define the different degrees of exemplary products and performances,
they are likely to delineate the critical skills and knowledge necessary for mastery.
For example, a rubric that defines excellence regarding the appropriate citation of
Internet resources requires your students to 1) find and evaluate appropriate web-
based resources and then to 2) cite them according to an accepted standard. If
they’ve never done this before, your rubric reminds you to provide them with this
knowledge before they can successfully meet the required criteria.

Did the student…                              Check for each instance    Comment
Make predictions
Use imagery by describing pictures
Link new information to prior knowledge
Talk through confusing points
Use comprehension strategies

Figure 7.9
Collecting data using a think-aloud checklist.

Multimedia Project Rubric

Assignment: Interview a friend or relative to create a biographical web page/site. You must collect the following information:
· Date and place of birth
· Your reason(s) for interviewing this person
· Most memorable event in his/her life
· An accomplishment for which he/she is most proud
Your web page/site must contain at least five paragraphs of text, one image, and one hyperlink to a supporting resource. Any quotations must be correctly formatted and appropriately cited.

Content features
Score 0: Several of the required content elements are not included or are inaccurate.
Score 1: Some of the content elements are not included or are inaccurate.
Score 2: All of the content elements are included.
Score 3: All content elements are included and are explained with significant detail and supporting data.

Grammar and punctuation
Score 0: There are many errors in grammar and punctuation.
Score 1: There are a few errors in grammar and punctuation.
Score 2: There are one or two errors in grammar and punctuation.
Score 3: There are no errors in grammar and punctuation.

Writing style
Score 0: The writing is very difficult to read throughout with little or no variation in sentence structure, and vocabulary is below grade level.
Score 1: The writing is somewhat difficult to read in some points with little variation in sentence structure, and vocabulary is below grade level.
Score 2: The writing is easy to read with some variation of sentence structures and appropriate grade-level vocabulary.
Score 3: The writing is both interesting and easy to read with a variety of sentence structures and appropriate and challenging vocabulary.

General design features
Score 0: The information is difficult to view and text is difficult to read. The use of images, colors, and other media consistently detracts from the presentation of the information.
Score 1: Some of the information is difficult to view and/or text may be difficult to read. The use of images, colors, and other media may detract from the presentation of the information in some instances.
Score 2: The information is presented clearly with text that is easy to read. The use of images, colors, and other media does not detract from the presentation of the information.
Score 3: The information is presented in a creative manner with text that is easy to read. The use of images, colors, and other media is imaginative and adds to the presentation of the information.

Figure 7.10
Analytic rubric for multimedia project.

Another pedagogical value of rubrics is realized when they are jointly developed
with students. Although this does take some time and you may not choose this
approach each time you create a rubric, the process helps students develop skills in
determining what constitutes best performance. Providing students with examples of
differences in quality of performance or products can help them grasp the differences
between various degrees of acceptable performance—for example, a 3-point and a 4-
point performance. Students can then apply this understanding when creating their
own products or performances.
Rubrics also offer a mechanism for providing detailed feedback to students
(Andrade, 2005). They set expectations at the beginning of your lesson so students
can better set their own goals for performance. They also provide support for forma-
tive self-assessment and peer assessment so that students can receive critical support
for determining their levels of performance and either continuing with or selecting
different strategies to complete their projects. Underlining or circling critical ele-
ments in the descriptors in a rubric can be much quicker than generating detailed
feedback for every student. A descriptor for exemplary performance that notes that
“there are no spelling errors” when compared to a descriptor that states “there are 2
or 3 misspelled words” gives the students a real measure for determining excellence.

Multimedia Project Rubric


Assignment: Interview a friend or relative to create a biographical web page/site. You must collect the following
information:
· Date and place of birth
· Your reason(s) for interviewing this person
· Most memorable event in his/her life
· An accomplishment for which he/she is most proud
Your web page/site must contain at least five paragraphs of text, one image, and one hyperlink to a supporting
resource. Any quotations must be correctly formatted and appropriately cited.

3 Points—All content elements are included and are explained with significant detail and supporting data. There
are no errors in grammar and punctuation. The writing is both interesting and easy to read with a variety of sen-
tence structures and appropriate and challenging vocabulary. The information is presented in a creative manner
with text that is easy to read. The use of images, color, and other media is imaginative and adds to the presentation
of the information.

2 Points—All of the content elements are included. There are one or two errors in grammar and punctuation. The
writing is easy to read with some variation of sentence structures and appropriate grade-level vocabulary. The infor-
mation is presented clearly with text that is easy to read. The use of images, colors, and other media does not detract
from the presentation of the information.

1 Point—Some of the content elements are not included or are inaccurate. There are several errors in grammar and
punctuation. The writing is difficult to read in some parts with little variation in sentence structure and vocabulary is
below grade level. Some of the information is difficult to view and/or text may be difficult to read. The use of
images, colors, and other media may detract from the presentation of the information in some instances.

0 Points—Several of the required content elements are not included or are inaccurate. There are many errors in
grammar and punctuation. The writing is very difficult to read throughout with little or no variation in sentence struc-
ture and below grade-level vocabulary. The information is difficult to view and text is difficult to read. The use of
images, colors, and other media consistently detracts from the presentation of the information.

Figure 7.11
Holistic rubric for multimedia project.

However, Shepard (2000) also cautions that simply providing explicit criteria may
not truly promote student learning if students learn to mechanically address the cri-
teria without actually developing the relevant skills or knowledge. She suggests that
students be allowed to use self-assessment as a way to understand what the criteria
mean, not just to apply them mechanically.
Rubrics can be created using commonly available software applications. There are
also websites that not only allow you to enter your descriptors to automatically gen-
erate a rubric but that house rubric examples and templates you can use for guidance.
The popular Rubistar website by 4Teachers allows you to quickly create, customize,
and save a rubric. If you are just starting out with creating rubrics, the Rubistar
engine can even suggest descriptors for each level of your rubric for a range of common
teaching models, such as the 6+1 Trait Writing Model, or even for specific skills,
such as the use of manipulatives or the explanation of mathematical concepts. In
addition to using this and similar web-based rubric generators, you may want to
collaborate with other teachers in your school or district to develop rubrics based on
your state’s content standards. Joint development and use helps to improve the
validity and reliability of your assessments and rubrics. Rubric templates, examples,
and actual rubrics matched to lesson and unit plans can be stored electronically on
shared directories or within lesson-planning software.

WEB LINK
Visit the textbook’s companion website for resources to help you create rubrics for use in your classroom.

THE GAME PLAN


Rubrics for Assessment

Set Goals
Determine how rubrics and digital exemplars of student work will be accessible to students and parents.

Take Action
Review resources that provide guidance for developing rubrics to identify methods that can be used to share artifacts developed from them. If necessary, visit the resources for rubrics on the textbook’s companion website. You may want to consider how you would share artifacts from your own rubric as an example.

Monitor
Determine whether you can successfully generate methods for sharing artifacts based on your own rubric. How can you keep critical information secure? What methods support collaboration and further reflection?

Evaluate and Extend
Extend your findings to classroom settings. Develop guidelines for the sharing of information from rubrics as well as methods for explaining student mastery of content standards to parents.

Summary of Scoring Expectations and Practices


Whether used for formative or summative assessments, scoring practices often
involve scoring keys, checklists, or rubrics. Forced-choice and some open-ended
assessments can be scored using scoring keys. Other open-ended assessments, as
well as performance and product-based assessments, usually require the use of check-
lists or rubrics. The results of these assessments can be used to give students
feedback on their learning progress and to give you feedback on the effectiveness
of your lesson.

Collecting, Analyzing, and Reporting Student Data


Once you have scored your assessments, you typically need to report the results to a
variety of other stakeholders, especially parents and school administrators. Maybe
you still own a few handwritten report cards, perhaps from your earliest years of
school. These are artifacts of a bygone era. Electronic gradebooks not only save
teachers hours of writing summative reports by hand at the end of a grading period,
but also make it easier to generate early reports that inform parents of student
successes as well as poor performances. Stand-alone gradebook soft-
ware may allow you to print progress reports for an entire class in a few minutes,
whereas online gradebooks can provide reports on-demand. As you recall from the
Story from Practice that began this chapter, online gradebooks can support frequent
and consistent communication between home and school, often through secure
e-mail or other messaging tools. The greatest boon is the early identification and
reporting of student difficulties, which can lead to helpful interventions
before the problems become insurmountable.
Electronic gradebooks can provide confirmation of trends in student performance
early in the instructional process (see Figure 7.12). Although you will undoubtedly
have some indication that some students are having difficulties, based on their daily
performances and participation, some electronic gradebooks provide visual cues and
organizational features to help identify low-performing students on a daily basis.
Graphing features are common in many electronic gradebooks and student perfor-
mances can often be visually represented in easy-to-read, colorful line or bar graphs
with a few clicks. You can use these features to check the performances of individuals
or groups of students over time or to compare performances across groups. You may

Figure 7.12
Electronic gradebooks typically include powerful graphic analysis tools. Courtesy Edline.

want to track select groups of students, such as students formally designated as
“at-risk.” Networked gradebooks may allow you and your colleagues, such as an IEP
team, to track the progress of an individual student or groups of students across mul-
tiple classes using real-time data. Try doing that with paper gradebooks!
Unlike paper-based gradebooks, electronic gradebooks provide more than just a
place to store grades. They can often support data analysis and reporting, as well.
Final grade averages are easily calculated and are often available at all times
throughout the grading period. Different types of grades, such as weighted grades or
grades for multipart assessments, are easily included and assigned appropriate
significance without the need for a calculator. Using predetermined formulas, student
grades in a class will always be calculated the same way, reducing the possibility of
error. As long as you set up the weights and averaging criteria correctly, you have
less chance of reporting student grades incorrectly.
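
The “predetermined formula” at work is usually a simple weighted average. The Python sketch below, with hypothetical category weights and scores, shows the calculation an electronic gradebook automates for every student in exactly the same way.

# Hypothetical category weights and a student's category averages (percent).
# The weights reflect a teacher's grading policy and must sum to 1.0.
weights = {"tests": 0.50, "projects": 0.30, "homework": 0.20}
category_averages = {"tests": 84.0, "projects": 92.0, "homework": 75.0}

# The gradebook applies the same formula to every student, every time.
final_grade = sum(weights[c] * category_averages[c] for c in weights)
print(f"Final grade: {final_grade:.1f}%")  # 0.50(84) + 0.30(92) + 0.20(75) = 84.6
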
Online student information systems (SIS) can store and report more than grades.
They can help record student attendance. They can present longitudinal data about
student performance as well as identify specific student needs. They can link to cafe-
teria and health records. They can indicate class and transportation schedules, such
as the appropriate buses students should ride. Some student information systems
can be accessed by teachers and staff throughout the building, sometimes through
wireless handheld devices, such as PDAs. Some may even allow you to include a pic-
ture of each student. All of these data are important for better understanding how
and why your students perform the way they do.
If your school or district has not adopted a networked student information sys-
tem, you can still use digital technologies to record a great deal of data. Electronic
collections of student performance, attendance, and related data can be as simple as
keeping records in a directory (folder) on a file server.

Reporting to Stakeholders
1. Investigate common features of student information systems (SIS). You can do this by
investigating product websites, reviewing product descriptions in journals and magazines,
visiting a local school that utilizes one, or attending an educational conference and view-
ing vendor demonstrations. You may even consider asking for a SIS demonstration for you
and others in your class from a product representative. It may be best to divide and con-
quer by having teams or groups of students investigate different products and share their
information.
2. As you review various systems, consider the following questions:
· What technology skills, required by the SIS, are you confident with and which ones
would require practice or training?
· What kinds of data are collected by the SIS? Can you include standards or learning
goals and student progress toward those standards?
· What kind of progress reports can be created for sharing with students in class or with
parents, such as in a parent conference?
· How is two-way communication supported between you and stakeholder groups? Does
it provide options for families with limited Internet access or for exporting data in
different formats?
3. Compare and discuss your findings with other members of your class.

Even stand-alone (as opposed to networked) electronic gradebooks often include the
opportunity to record student attendance and to create anecdotal notes. Records
such as these, indicating days
when students behave uncharacteristically, can be compared to other records of stu-
dent data that may be stored in different offices throughout the school. A variety of
data are sometimes required to fully understand student performance; new digital
technologies often provide a means for schools to share pertinent student data with
faculty and staff.

Evaluating Students’ Uses of Technology


In a textbook designed to help you develop your technology skills for teaching, it
seems natural that you would also consider how you will evaluate the technology pro-
ficiency of your students. It is important to know your students’ technology profi-
ciencies so that you can plan and implement appropriate learning activities and
assessments. Although many technologies have found their way into the classroom,
few have had as large an impact on teaching and learning as computers. As you
learned in Chapter 1, networked computers and the skills related to them, commonly
referred to as information and communications technology, or ICT, have also gener-
ated national and international discussions about what it means to be “technology
literate” and the role schools play in helping students become so.
The importance of technology proficiency has received greater attention through
recent legislation. The latest reauthorization of the Elementary and Secondary Educa-
tion Act, commonly called No Child Left Behind (Pub. L. No. 107-110), specifically calls
for all students entering high school to be technologically literate. The law falls short,
however, of offering any guidance on how students will demonstrate that proficiency,
any suggestions for assessments that can be used, or how the data will be collected
and reported. Although this legislation makes technology proficiency a national
priority for schools, the definition of technology literacy or proficiency keeps shifting
and is difficult to pin down. Specifically, Title II, Part D, Goal 2 (a) of the law reads as follows:
To assist every student in crossing the digital divide by ensuring that every student is
technologically literate by the time the student finishes the eighth grade, regardless of
the student’s race, ethnicity, gender, family income, geographic location, or disability.

The importance of having students prepared to live and work in a technology-
dependent environment has prompted several organizations to attempt to
hit the moving target of defining technology proficiency. As you are well aware,
ISTE established technology standards for students (see Figure 1.7 in Chapter 1),
and some version of these standards (or the original ISTE NETS-S) has been
adopted by many states. These standards provide broad guidelines of what students
should know and be able to do with technology.
Once the states adopted the ISTE NETS-S or created their own standards, they then
set out to develop assessments to determine whether students could demonstrate them.
Early tests mirrored the early computer curricula previously described in Chapter 1 that
focused heavily on recall of facts, such as important milestones in the history of compu-
ters or the names and functions of hardware components. Just as the rapid evolution of
technology changed curricula and the definition of technology proficiency, many state
assessment programs for computer literacy or technology literacy disappeared. Some
states have dropped their student technology standards altogether and instead have
embedded technology proficiencies within content standards. Whether stated or
implied, many states also encourage the use of appropriate technologies to support the
mastery of content standards rather than isolating technology skill development.
To demonstrate the embedded nature of technology proficiency within content
standards and how your assessments can help you monitor and evaluate that profi-
ciency, let’s review a few content standards and analyze them for explicit and implied
technology proficiencies. Figure 7.13 provides excerpts of English content standards
focused on writing at the middle school level from three different states. (Please note
that these are not complete standards and have been formatted for consistency of
presentation.) Notice that some technology tools are specified in some of the
objectives in each standard for each state.

California
Students write clear, coherent, and focused essays. The writing exhibits students’ awareness of audience and purpose. Essays contain formal introductions, supporting evidence, and conclusions. Students progress through the stages of the writing process as needed.
· Create compositions that establish a controlling impression, have a coherent thesis, and end with a clear and well-supported conclusion.
· Establish coherence within and among paragraphs through effective transitions, parallel structures, and similar writing techniques.
· Support theses or conclusions with analogies, paraphrases, quotations, opinions from authorities, comparisons, and similar devices.
· Plan and conduct multiple-step information searches by using computer networks and modems.
· Achieve an effective balance between researched information and original ideas.

Florida
The student writes to communicate ideas and information effectively.
· Writes text, notes, outlines, comments, observations that demonstrate comprehension of content and experiences from a variety of media.
· Organizes information using alphabetical, chronological, and numerical systems.
· Selects and uses appropriate formats for writing, including narrative, persuasive, and expository formats, according to the intended audience, purpose, and occasion.
· Uses electronic technology including databases and software to gather information and communicate new knowledge.

Texas
Writing/processes. The student selects and uses writing processes for self-initiated and assigned writing. The student is expected to:
· Generate ideas and plans for writing by using prewriting strategies such as brainstorming, graphic organizers, notes, and logs.
· Develop drafts by categorizing ideas, organizing them into paragraphs, and blending paragraphs within larger units of text.
· Revise selected drafts by adding, elaborating, deleting, combining, and rearranging text.
· Use available technology to support aspects of creating, revising, editing, and publishing texts.
· Refine selected pieces frequently to “publish” for general and specific audiences.
· Select and use reference materials and resources as needed for writing, revising, and editing final drafts.

Note: These standards have been reformatted for consistency.

Figure 7.13
Comparison of three states’ standards and objectives for writing.

But many of the other content standards,
although not mentioning specific technologies, can easily be demonstrated by writing
projects that utilize common technology-supported activities, whether electronic
journaling, using a word processor, creating an electronic presentation, developing a
web page, or participating in a threaded discussion. The choices you make for doing
so are governed by your content knowledge and technology proficiency, as well as the
technology proficiency of your students.
Over time, many of these standards for writing—like many content standards—
will remain the same. The technologies, however, will change. By selecting technologies
appropriate for the assessment of these content standards, you are also providing the
opportunity to evaluate your students’ technology proficiencies. As you can see from
reviewing these example standards, you may do so explicitly or by embedding tech-
nology proficiencies in the way you teach and your students learn relevant content.
ISTE NETS-T Standard 2 requires you to “design, develop, and evaluate authentic
learning experiences and assessments incorporating contemporary tools and
resources to maximize content learning in context and to develop the knowledge,
skills, and attitudes identified in the NETS•S.” Furthermore, you are expected to “pro-
vide students with multiple and varied formative and summative assessments
aligned with content and technology standards and use resulting data to inform
learning and teaching.” Throughout this chapter, we have been preparing you to do
this. With the shifting emphasis from isolated technology standards to embedding
technology proficiencies in content standards, your decisions for assessing technol-
ogy proficiency fall in line with assessing students’ mastery of your content standards
as a whole. As you develop and implement lessons and units, the technologies you
select for your activities and assessments should be representative of those embed-
ded in your content domain. Assessing student technology proficiency then becomes
part of monitoring and evaluating how well your students achieve your learning
goals.

New Trends in Technology-Assisted Assessments of Technology Proficiency

Some interesting assessments of technology proficiency have been developed that
rely on some of the capabilities of technology to create unique learning and assess-
ment environments. Most of these assessments occur in one of two different types of
simulated environments.
The first type of simulated environment is the re-creation of common software
tools based on general features. These simulated applications allow the user to per-
form common tasks, such as creating and saving documents, creating a calculation in
a spreadsheet application, finding information from a digital data source, or creating
and sending electronic mail. Not developed solely for measuring student proficien-
cies, these simulated programs can be used to assess the proficiencies of teachers as
well. Although the simulations are generic, many of the skills tested are still basic
operational skills and many still incorporate forced-choice question formats. Some
do allow users to complete the tasks by more than one strategy, such as using short-
cut keys, menu commands, or buttons on a toolbar to complete the same cutting and
pasting task.
The second form of simulation truly starts to draw on the power of technology to
create assessments that would not otherwise be possible with paper and pencil. These
assessments also use simulations of common software, such as web browsers, e-mail
programs, and word processors, but students use these tools to tackle complex prob-
lems. Not only do they have to demonstrate basic operational skills, but they need to
show how they use these technologies to solve the problems. Although not available
for widespread use, a few prototype assessment systems of this nature have been
developed and offer exciting possibilities for creating highly authentic and engaging
assessments in all content areas—not just for determining technology proficiency.

Chapter Summary
This chapter has emphasized the critical role of assessment when developing authen-
tic learning experiences. Assessment is more than the assigning of grades and serves
a critical role in monitoring and evaluating the academic progress of your students.
In Chapter 6, we discussed the value of using a variety of formal and informal data to
inform your instructional decisions. Assessments should be woven throughout
instruction to serve many purposes and can take many forms. Just as you will vary
your instruction and select multiple methods for presenting your instruction, you
will draw upon multiple assessment formats and tools to support them. And
although it will be important for you to monitor the technology proficiencies of your
students, you may find that the technologies you choose for doing so depend most
on what best supports your instruction and assessment. Assessment data also help
you determine the effectiveness of your own instructional choices, including the
selection of technology-based resources. As a teacher, it is important to know
whether the way you teach, with or without digital technology, is effective in
helping students learn the intended content.

YOUR
PORTFOLIO
To begin to demonstrate competency in ISTE NETS-T Standard 2.d, return to the les-
son activities you began in earlier chapters and add your assessment strategies.
1. Develop assessments and scoring guidelines based on the lesson or unit plan,
to which technology is a major contributor.
2. Make sure your assessments are clearly connected to the content and technol-
ogy standards addressed by your lesson as well as the activities contained
within the lesson.

References
Andrade, H. G. (2005). Teaching with rubrics: The good, the bad, and the ugly. College Teaching, 53(1), 27–30.
Assessment Reform Group. (1999). Assessment for learning: Beyond the black box. Cambridge, UK: University of Cambridge School of Education.
Bennett, R. E. (1998). Reinventing assessment: Speculations on the future of large-scale educational testing. Princeton, NJ: Educational Testing Service (ETS).
Bennett, R. E. (2001). How the Internet will help large-scale assessment reinvent itself. Education Policy Analysis Archives, 9(5). Retrieved January 22, 2001, from http://epaa.asu.edu/epaa/v9n5.html
Brookhart, S. M. (1999). The art and science of classroom assessment: The missing part of pedagogy. ASHE-ERIC Higher Education Report, 27(1). Washington, DC: The George Washington University, Graduate School of Education and Human Development.
Butler, D. H., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65, 245–281.
Dick, W., Carey, L., & Carey, J. O. (2005). The systematic design of instruction (6th ed.). New York: HarperCollins.
Educational Testing Service (ETS). (2001, July/August). People in the know series: Ray Christensen. Capital News & Views, 1–3. Washington, DC: State and Federal Relations Office.
Fletcher, J. D. (1992). Individualized systems of instruction. Alexandria, VA: Institute for Defense Analyses. (ERIC Document Reproduction Service No. ED 355 917)
Meijer, R. R., & Mering, M. L. (1999). Computerized adaptive testing: Overview and introduction. Applied Psychological Measurement, 23, 187–194.
Pub. L. No. 107-110. (No Child Left Behind Act of 2001).
Public Schools of North Carolina. (2006). North Carolina standard course of study. Retrieved May 9, 2006, from http://www.dpi.state.nc.us/curriculum/
Shepard, L. A. (2000). The role of classroom assessment in teaching and learning. CSE Technical Report 517. Los Angeles: Center for the Study of Evaluation, Standards, and Student Testing (CRESST).
Svinicki, M. D. (2004). Authentic assessment: Testing in reality. New Directions for Teaching and Learning, 2004(100), 23–39.
Wainer, H. (2000). Introduction and history. In H. Wainer (Ed.), Computerized adaptive testing: A primer (2nd ed., pp. 1–21). Mahwah, NJ: Lawrence Erlbaum.
Wiggins, G. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco: Jossey-Bass.
