Visible Learning for Literacy, Grades K–12 (Sample Pages)
Implementing the Practices That Work Best to Accelerate Student Learning
Douglas Fisher
Nancy Frey
John Hattie
FOR INFORMATION:
Corwin
A SAGE Company
2455 Teller Road
Thousand Oaks, California 91320
(800) 233-9936
www.corwin.com
SAGE Publications Ltd.
1 Oliver’s Yard
55 City Road
London EC1Y 1SP
United Kingdom
SAGE Publications India Pvt. Ltd.
B 1/I 1 Mohan Cooperative Industrial Area
Mathura Road, New Delhi 110 044
India
SAGE Publications Asia-Pacific Pte. Ltd.
3 Church Street
#10-04 Samsung Hub
Singapore 049483
Copyright © 2016 by Corwin
All rights reserved. When forms and sample documents are included,
their use is authorized only by educators, local school sites, and/or
noncommercial or nonprofit entities that have purchased the book.
Except for that usage, no part of this book may be reproduced or
utilized in any form or by any means, electronic or mechanical,
including photocopying, recording, or by any information storage and
retrieval system, without permission in writing from the publisher.
All trademarks depicted within this book, including trademarks
appearing as part of a screenshot, figure, or other image, are included
solely for the purpose of illustration and are the property of their
respective holders. The use of the trademarks in no way indicates any
relationship with, or endorsement by, the holders of said trademarks.
Printed in the United States of America
Library of Congress Cataloging-in-Publication Data
Names: Fisher, Douglas, author. | Frey, Nancy, author. | Hattie, John, author.
Title: Visible learning for literacy, grades K–12 : implementing the practices that
work best to accelerate student learning / Douglas Fisher, Nancy Frey, John Hattie.
Description: Thousand Oaks, California : Corwin/A SAGE Company, 2016. |
Includes bibliographical references and index.
Identifiers: LCCN 2015048505 | ISBN 9781506332352 (pbk. : alk. paper)
Subjects: LCSH: Language arts (Elementary) | Language arts (Secondary) | Literacy
—Study and teaching (Elementary) | Literacy—Study and teaching (Secondary) |
Visual learning.
Classification: LCC LB1576 .F338 2016 | DDC 372.6—dc23 LC record available
at http://lccn.loc.gov/2015048505
This book is printed on acid-free paper.
In the electronic edition of the book you have purchased, there are several
icons that reference links (videos, journal articles) to additional content.
Though the electronic edition links are not live, all content referenced may be
accessed at http://resources.corwin.com/VL-Literacy. This URL is
referenced at several points throughout your electronic edition.
1. LAYING THE GROUNDWORK FOR VISIBLE LEARNING FOR LITERACY
Who can disagree with that? Who doesn’t believe that every student, in
every classroom, deserves to be educated in ways that build his or her
confidence and competence? Let’s take apart that sentence and explore
some of the thinking behind each word or phrase.
• Every student (not just some students, such as those whose parents can afford it or those who are lucky enough to live on a street that allows them to attend an amazing school)

• but by design (yes, there are learning designs that work, when used at the right time. In fact, the literature is awash with evidence of designs that work and those that do not work)
The design we’re talking about, the one that has great potential for
impacting students’ learning and allowing all of us to be great teachers,
is John Hattie’s Visible Learning (2009). So what do we mean by visible
learning? In part, it’s about developing an understanding of the impact
that instructional efforts have on students’ learning. Notice we didn’t
limit that to teachers. Students, teachers, parents, administrators—
everyone can determine if the learning is visible. To do so, students
have to know what they are learning, why they are learning it, what it
We believed that it was time to apply John’s previous work with visible
learning to the world of literacy learning. We think that visible learning
for literacy is important for several reasons:
Visible learning for literacy requires that teachers understand which strategies and instructional routines are useful in which teaching situations.
There is no single right way to develop students’ literacy prowess. But
there are wrong ways. In Chapter 5, we will turn our attention to a specific list of practices that do not work in the literacy classroom. For now,
we will focus on those that do.
Effect Sizes
In addition to the meta-analyses, the largest summary of educational
research ever conducted (Visible Learning) contains effect sizes for each
practice (see Appendix, pages 169–173). An effect size is the magnitude,
or size, of a given effect. But defining a phrase by using the same terms
isn’t that helpful. So we’ll try again. You might remember from your statistics class that studies report statistical significance. Researchers make
the case that something “worked” when chance is reduced to 5% (as in
p < 0.05) or 1% (as in p < 0.01)—what they really mean is that the effect
found in the study was unlikely to be zero: something happened (but
there’s no hint of the size of the effect, or whether it was worthwhile!).
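To make that distinction concrete, here is a minimal sketch in Python (ours, not the book's; the effect size, the sample sizes, and the simple z-test approximation are invented purely for illustration). It shows how a trivially small effect can still clear the p < 0.05 bar when the study is large enough, while the identical effect in a small study does not:

```python
from math import sqrt, erf

def p_value_two_sided(d, n_per_group):
    """Approximate two-sided p-value for a standardized mean difference d
    observed with n_per_group students in each of two groups (z-test)."""
    z = d * sqrt(n_per_group / 2)  # standard error of d is roughly sqrt(2 / n)
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # 2 * (1 - Phi(z))

# A tiny effect (d = 0.02) in a very large study is "statistically significant" . . .
print(p_value_two_sided(0.02, 100_000))  # about 0.000008, far below 0.05
# . . . while the same tiny effect in a small study is not. Either way, d = 0.02.
print(p_value_two_sided(0.02, 50))       # about 0.92
```

The p-value only tells us that the difference was probably not zero; the effect size tells us whether it was big enough to matter.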
EFFECT SIZE FOR DIRECT INSTRUCTION = 0.59

Visible Learning provides readers with effect sizes for many influences under investigation. As an example, direct instruction has a reasonably strong effect size at 0.59 (we’ll talk more about what the effect size number tells us in the next section). The effect sizes can be ranked from those with the highest impact to those with the lowest. But that doesn’t mean that teachers should just take the top 10 or 20 and try to implement them immediately. Rather, as we will discuss later in this book, some of the highly useful practices are more effective when focused on surface-level learning while others work better for deep learning and still others work to encourage transfer. Purpose, context, and timing of practices all matter and must be considered. For general discussion of effect sizes, see Figure 1.1.
A PRIMER ON EFFECT SIZES
Let us get a sense of what an effect size means. There are two common ways to calculate an effect size:
first, when two groups are compared—such as comparing a class receiving a literacy program with a similar
class not receiving this program—and second, over time—such as comparing the performance of a group
of students at the outset and again at the end of a series of literacy instruction. In both cases, the effect size
represents the magnitude of the difference—and of course the quality of the comparison, the measuring
instruments, and the research design to control extraneous factors are critical.
An effect size of d = 0.0 indicates no change in achievement related to the intervention. An effect size
of d = 1.0 indicates an increase of one standard deviation on the outcome (e.g., reading achievement), a
d = 1.0 increase is typically associated with advancing children’s achievement by two to three years, and this
would mean that, on average, the achievement of students receiving the treatment would exceed that of
84% of students not receiving the treatment. Cohen (1988) argued that an effect size of d = 1.0 should be
regarded as a large, blatantly obvious, and grossly perceptible difference, and as an example, he referred to
the difference between the average IQ of PhD graduates and high school students. Another example is the
difference between a person at 5’3” (160 cm) and one at 6’0” (183 cm)—which would be a difference visible
to the naked eye.
We do need to be careful about ascribing adjectives such as small, medium, and large to these effect sizes.
Cohen (1988), for example, suggested that d = 0.2 was small, d = 0.5 medium, and d = 0.8 large, whereas it
is possible to show that when investigating achievement influences in schools, d = 0.2 could be considered
small, d = 0.4 medium, and d = 0.6 large (Hattie, 2009). In many cases, this attribution would be reasonable,
but there are situations where this would be too simple an interpretation. Consider, for example, the effects
of an influence such as behavioral objectives, which has an overall small effect of d = 0.20, and reciprocal
teaching, which has an overall large effect of d = 0.74. It may be that the cost of implementing behavioral
objectives is so small that it is worth using them to gain an influence on achievement, albeit small, whereas it
might be too expensive to implement reciprocal teaching to gain the larger effect.
The relation between the notions of magnitude and statistical significance is simple: Significance = Effect
size × Study size. This should highlight why both aspects are important when making judgments. Effect sizes
based on small samples or small numbers of studies may not tell the true story, in the same way that statistical
significance based on very large samples may also not tell the true story (for example, a result could be
statistically significant but have only a tiny effect size). Similarly, two studies with the same effect sizes can
have different implications when their sample sizes vary (we should place more weight on the one based on
the larger sample size). The most critical aspect of any study is the convincibility of the story that best explains
the data; it is the visible learning story that needs critique or improvement—to what degree is the story in this
book convincing to you?
Figure 1.1
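To make the two calculation routes in Figure 1.1 concrete, here is a minimal sketch in Python (ours, not the book's; the helper names and the reading scores are invented for illustration, and dividing the gain by the pre-test standard deviation is just one common convention):

```python
import statistics
from math import sqrt, erf

def effect_size_two_groups(treatment, control):
    """Route 1: compare two groups; difference in means divided by the pooled SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

def effect_size_over_time(pre, post):
    """Route 2: one group measured twice; mean gain divided by the SD at the outset."""
    return (statistics.mean(post) - statistics.mean(pre)) / statistics.stdev(pre)

def percent_below_treatment_mean(d):
    """Normal-curve reading of d: the share of the comparison group falling below
    the treatment mean. At d = 1.0 this is about 84%, as the primer notes."""
    return 0.5 * (1 + erf(d / sqrt(2)))

# Hypothetical end-of-unit reading scores, invented purely for illustration
control = [52, 48, 55, 60, 47, 50, 58, 53]
treatment = [58, 61, 57, 66, 54, 59, 63, 60]
d = effect_size_two_groups(treatment, control)
print(f"d = {d:.2f}; treatment mean exceeds {percent_below_treatment_mean(d):.0%} of the comparison group")
```

As Figure 1.1 cautions, the number these functions return is only as trustworthy as the comparison, the measuring instruments, and the research design behind the data.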
EFFECT SIZE FOR COOPERATIVE VERSUS INDIVIDUALISTIC LEARNING = 0.59

The effect size of direct instruction doesn’t mean that classrooms should be composed of all direct instruction any more than they should be fully cooperative versus individualistic (which has an effect size of 0.59). Direct instruction likely works better during surface-level literacy learning whereas cooperative learning can deepen students’ understanding of content (provided that students have sufficient surface knowledge to then make relations and extend ideas). Both can be effective when used for the right purpose. The effect size list also includes some things that don’t work.
It turns out that about 95%+ of the influences that we use in schools
have a positive effect; that is, the effect size of nearly everything we
do is greater than zero. This helps explain why so many people can
argue “with evidence” that their pet project works. If you set the bar at
showing any growth above zero, it is indeed hard to find programs and
practices that don’t work. As described in Visible Learning (Hattie, 2009),
we have to reject the starting point of zero. Students naturally mature
and develop over the course of a year, and thus actions, activities, and
interventions that teachers use should extend learning beyond what a student can achieve by simply attending school for a year.
This is why John Hattie set the bar of acceptability higher—at the
average of all the influences he compiled—from the home, parents,
Borrowing from Visible Learning, the barometer and hinge point are
effective in explaining what we focus on in this book and why. Here’s an
example of how this might play out from literacy:
Figure 1.2 The barometer for the influence of sentence combining (d = 0.15), showing the scale from reverse and developmental effects through teacher effects to the zone of desired effects.
EFFECT SIZE FOR DRAMA/ARTS PROGRAMS = 0.35

Our focus in Visible Learning for Literacy is on actions that fall inside the zone of desired effects, which is 0.40 and above. When actions are in the range of 0.40 and above, the data suggest that the learning extends beyond that which was expected from attending school for a year. Caution: That doesn’t mean that everything below 0.40 effect size is not worthy of attention. In fact, there are likely some useful approaches for teaching and learning that are not above this average. For example, drama and arts programs have an effect size of 0.35, almost ensuring that students gain a year’s worth of achievement for a year of education. We are not suggesting that drama and art be removed from the curriculum. In fact, artistic expression and aesthetic understanding may be valuable in and of themselves. Another critical finding was the very low effect of teacher’s subject matter knowledge. While we may accept the evidence that it is currently of little import, surely this means we should worry considerably and investigate, first, why it is so
Teacher Credibility
EFFECT SIZE FOR TEACHER CREDIBILITY = 0.90

A few things come to mind when we consider actions that teachers can take at the more generic level. On the top of the list, with an effect size of 0.90, is teacher credibility. Students know which teachers can make a difference in their lives. Teacher credibility is a constellation of characteristics, including trust, competence, dynamism, and immediacy. Students evaluate each of these factors to determine if their teacher is credible, and if they are going to choose to learn
EFFECT SIZE FOR CONCEPT MAPPING = 0.60

cept map and are ready to write. If you haven’t had a peer review yet, let me know. We need to get these done so that they can be included in the upcoming e-zine. If we miss the deadline, we’re out of the issue.” Mr. Chu’s students trust him and know when it’s time to focus. They appreciate his dynamic yet not overzealous style. And, parenthetically, they learn a lot.
Teacher–Student Relationships
EFFECT SIZE FOR TEACHER–STUDENT RELATIONSHIPS = 0.72

Closely related to teacher credibility are teacher–student relationships, which have an effect size of 0.72. When students believe that the teacher is credible, they are more likely to develop positive relationships with that teacher, and then learn more from him or her. But relationships go deeper than credibility. Of course, relationships are based on trust, which is part of the credibility construct. But relationships also require effective communication and addressing issues that strain the relationship. Positive relationships are fostered and maintained when teachers set fair expectations, involve students in determining aspects of the classroom organization and management, and hold students accountable for the expectations in an equitable way. Importantly, relationships are not destroyed when problematic behaviors occur, on the part of either the teacher or students. This is an important point for literacy educators. If we want to ensure students read, write, communicate, and think at high levels, we have to develop positive, trusting relationships with students, all students.
Michael spilled over to the rest of the students who didn’t think their
teacher was fair or that he was trustworthy.
• Be sincere in their pride in their students and make sure that pride is based on evidence of student work, not generalized comments
Source: Restorative Conference Facilitator Script, Restorative Conferencing, International Institute for Restorative Practices,
http://www.iirp.edu/article_detail.php?article_id=NjYy
Figure 1.3
EFFECT SIZE FOR CLASSROOM MANAGEMENT = 0.52

Institute for Restorative Practices, that allow people to figure out what went wrong and how to repair the harm that has been done. We’ve spent time on this because relationships matter, and students achieve more and better when they develop strong interpersonal relationships with their teachers. It’s these humane and growth-producing conversations that help students grow in their prosocial behaviors. (Note that the greatest effect on achievement when students join a new class or school is related to whether they make a friend in the first month—it is your job to worry about friendship, counter loneliness, and help students gain a reputation as great learners not only in your eyes but also in the eyes of their peers.) And by the way, effectively managed classrooms, ones in which students understand the expectations and are held to those expectations in ways that are consistent with relationship development and maintenance, have an effect size of 0.52. A poorly run classroom will interfere with high-quality literacy learning.
Teacher Expectations
EFFECT SIZE FOR EXPECTATIONS = 0.43

Another influence on student achievement that is important for literacy educators, but isn’t directly a literacy approach, is teacher expectations, with an effect size of 0.43. In large part, teachers get what they expect; yes, teachers with low expectations are particularly successful at getting what they expect. The more recent research has shown that teachers who have high (or low) expectations tend to have them for all their students (Rubie-Davies, 2015). Teachers’ expectations of students become the reality for students. Requiring kindergarteners to master 100 sight words, and then aligning instruction to accomplish that, communicates the expectations a teacher has for five-year-olds. Believing that ninth graders can only write five-paragraph essays with 500 words sets the bar very low, and students will jump just that high, and no higher than that. Over time, students exert just enough effort to meet teacher expectations. Hattie (2012) called this the minimax principle, “maximum grade return for minimal extra effort” (p. 93). And it gets in the way of better and deeper learning. When expectations are high, the minimax principle can work to facilitate students’ learning.
This does not mean that teachers should set unrealistic expectations.
Telling first graders that they are required to read Tolstoy’s War and Peace
is a bit too far. Teachers should have expectations that appropriately
stretch students, and yet those expectations should be within reach.
Sixth graders who are held to fourth-grade expectations will be great
fifth graders when they are in seventh grade; the gap never closes. And
students deserve more. When high-yield literacy instructional routines
are utilized, students can achieve more than a year’s growth during a
year of instruction. And that’s what this book focuses on—maximizing
the impact teachers have on students’ learning.
http://resources.corwin.com/VL-Literacy

teachers can design amazing learning environments. But it’s more than instruction. Teachers should focus on learning. It’s a mindset that we all need, if we are going to ensure that students develop their literate selves.

A major theme throughout this book is how teachers think (and also how we want students to think). Hattie (2012) suggests 10 mind frames that can be used to guide decisions, from curriculum adoptions to lesson planning (Figure 1.4).
EFFECT SIZE FOR TEACHER CLARITY = 0.75

Taken together, these mind frames summarize a great deal of the “what works” literature. In the remainder of this book, we focus on putting these into practice specifically as they relate to literacy learning, and address the better question, what works best? (Hattie, 2009). To do so, we need to consider the levels of learning we can expect from students. How,
MIND FRAMES FOR TEACHERS
8. I am a change agent.
9. I am an evaluator.
Figure 1.4
then, should we define learning, since that is our goal? As John himself
suggested in his 2014 Vernon Wall Lecture, learning can be defined as
EFFECT SIZE FOR SELF-REPORTED GRADES/STUDENT EXPECTATIONS = 1.44

students are going to set their own expectations and monitor their own achievement. But schooling should not stop there. Learning demands that students be able to apply—transfer—their knowledge, skills, and strategies to new tasks and new situations. That transfer is so difficult to attain is one of our closely kept secrets—so often we pronounce that students can transfer, but the process of teaching them this skill is too often not discussed. We will discuss it in Chapter 4.
The ultimate goal, and one that is hard to realize, is transfer (see
Figure 1.5 on the next page). When students reach this level, learning has been accomplished. One challenge to this model is that most
assessments focus on surface-level learning because that level is easier
to evaluate. But, as David Coleman, president of the College Board, said
in his Los Angeles Unified presentation to administrators, test makers
have to assume responsibility for the practice their assessment inspires.
That applies to all of us. If the assessment focuses on recall, then a great
number of instructional minutes will be devoted to developing students’
ability to demonstrate “learning” that way.
Figure 1.5 Levels of learning: surface, deep, and transfer.
The range of answers was one day to two weeks. The assessment would
change practice. Another said, “What about building transfer tasks for
students to complete so that they would know that they had mastered
the content for our courses? If we asked them to apply their knowledge
to new tasks, we’d know they learned it, right? And we wouldn’t spend
hours reviewing the past.”
The conversation continued, and this group of teachers made their decision. Our point here is not to debate the merits of final exams, but rather
to focus on the levels of learning and the fact that teachers can choose
to engage students in deeper understanding. It’s within our power, as the
mind frames suggest, to do so.
1. Challenge
2. Self-efficacy
1. Challenge
The first of these global aspects is challenge. Students appreciate challenge. They expect to work hard to achieve success in school and life.
When tasks become too easy, students get bored. Similarly, when tasks
become too difficult, students get frustrated. There is a sweet spot for
learning, but the problem is that it differs for different students. There
is a Goldilocks notion of making a task not too easy or too hard but just
right. As Tomlinson (2005) noted,
How, then, can literacy educators keep students challenged but not frustrated? There are several responses to this question, and our answer is
embedded in every chapter of this book. In part, we would respond that
the type of learning intention is important to maintain challenge.
Student-to-Student Interaction
Feedback
How else can we maintain challenge for each learner? Our third
response relates to feedback. When students are engaged in appropriately challenging tasks, they are more likely to respond to feedback because they need that information to continue growing and learning. Feedback focused on something that you already know does little
to change understanding. Feedback thrives on errors. For example,
Marco has a strong sense of English spelling. His writing is filled with
complex vocabulary terms that are spelled correctly. He understands
how to use resources to build this knowledge about words. Thus, feedback about the misspelling of the word acknowledge, which he spelled
“acknowlege” in his handwritten draft, is not likely to result in great
changes in his learning. Any spell-check program on a computer will
tell him he is wrong, and he can correct it. A better use of time might
be to focus on Marco’s use of clichés in his writing. A useful conversation with him could show him that the more familiar a term or phrase
becomes, the more often readers skip over it as they read, essentially
rendering the text ineffective.
2. Self-Efficacy
A second global consideration for literacy educators is students’
self-efficacy. Hattie (2012) defines self-efficacy as “the confidence or
strength of belief that we have in ourselves that we can make our
learning happen” (p. 45). He continues, with descriptions of students
with high self-efficacy, noting that they
• Avoid complex and difficult tasks (as these are seen as personal threats)
Figure 1.6 Difficulty and complexity: tasks range from easy to hard in difficulty and from less complex to more complex.
prophecy: the rich get richer, and the poor get poorer. Students with
poor self-efficacy see each challenge and setback as evidence that they
aren’t learning, and in fact can’t learn, which reduces the likelihood that
they will rally the forces for the next task the teacher assigns.
To this we add
We’re not saying that it’s easy to identify learning intentions and success
criteria. Smith (2007) notes, “Writing learning intentions and success criteria is not easy . . . because it forces us to ‘really, really think’ about what
we want the pupils to learn rather than simply accepting statements
handed on by others” (p. 14). We are saying that it’s worth the effort.
Learning intentions are more than a standard. There have been far too
many misguided efforts that mandated teachers to post the standard on
the wall. Learning intentions are based on the standard, but are chunked into learning bites. In too many cases, the standards are not understandable to students. Learning intentions, if they are to be effective,
Figure 1.7 contains some poorly written learning intentions and some
improvements that teachers made collaboratively as they explored the value
of this approach. Note that the intentions became longer, more specific,
and more interesting. The improved versions invite students into learning.
Of course, learning intentions can be grouped. Sometimes an activity can contribute to several learning intentions, and other times a learning intention requires several activities. However, when learning intentions spread over many days, student interest will wane, and motivation will decrease. When teachers plan a unit of study and clearly identify the learning intentions required for mastery of the content, most times they can identify daily targets. In doing so, they can also identify the success criteria, which will allow for checking for understanding and targeted feedback.
Grade K
Original: Compare the experiences of characters in two stories.
Improved: Today, we’ll read two stories about city and country life. We’ll focus on comparing the lives of the two characters and the differences in their lives based on where they live.

Grade 5
Original: Use technical language in the revisions of essays.
Improved: As we revise our opinion papers, we are going to learn how to update our word choices so that we use technical vocabulary like the authors we’ve been studying use.

Grade 7
Original: Determine the central idea of a text.
Improved: Each group has a different article, and our learning today is going to focus on locating the central or controlling idea, the idea that the author uses to hold the entire text together.

Grade 11
Original: Compare two texts for different themes.
Improved: Compare how two texts from the same point in U.S. history address a common theme and figure out what each author is trying to say in response to the theme.
Figure 1.7
• Discuss with a partner the way the author used visuals and how they helped you understand the text.

• Identify one place in the text that was confusing and how one of the visuals helped you understand that information.
Video 1.4 Making Success Criteria Visible in Fourth Grade
http://resources.corwin.com/VL-Literacy

students, “How will you know you have learned this? What evidence could we accept that learning has occurred?” In these situations, students can share their thinking about the success criteria, and often they are more demanding of themselves than their teachers are. In a sixth-grade English class focused on learning to come to group discussions prepared, the students identified several ways that they would know if they met this expectation. Several suggested that they should have their learning materials with them when they moved into collaborative learning. Others added that they should have their notes and annotations updated and be ready to talk about their reading, rather than read while they are in the group. One student suggested that they should practice vocabulary before the group so that they would be ready. Another added that they should each know their role in the group so that they can get started right away. None of these answers were wrong; they were all useful in improving the collaborative learning time. In this case, the students established the success criteria and opened the door to feedback from their peers and the teacher in their successive approximations in demonstrating mastery of their learning.
EFFECT SIZE FOR PEER TUTORING = 0.55

Further, when students understand the success criteria, they can be most involved in assessing their own success, and their progression toward this success. A simple tool allows students to put sticky notes in one of four quadrants to communicate their status (see Figure 1.8). This alerts the teacher, and other students, about help that is needed. It mobilizes peer tutoring and cooperative versus competitive learning, as well as building student-centered teaching.
EFFECT SIZE FOR COOPERATIVE VERSUS COMPETITIVE LEARNING = 0.54

EFFECT SIZE FOR STUDENT-CENTERED TEACHING = 0.54

Other times, the tools used to create the success criteria involve rubrics and checklists. For example, students in a high school language arts class were tasked with selecting a worthy cause, something that they cared passionately about and whose value they could explain to others. Students were encouraged to select topics that were personally relevant and to learn more about that topic. As part of the assignment, students wrote an analytic essay about their chosen topic. Another part of the project required that they develop a web page, a Facebook page, or another electronic way of communicating with a wider world about their cause. And still another part of their assignment required the development of an informational pamphlet that they could use to educate adults about the
SAMPLE SELF-ASSESSMENT OF LEARNING
Figure 1.8
SAMPLE PROJECT CHECKLIST
Pamphlet Portion
Item | Projected | Date Completed
Cover has the title, image, and your name
Description of your cause (minimum 10 sentences)
List 3–5 important facts
Map of where this is occurring
Demographics of who/what is impacted
Minimum of 3 images in your brochure
Contact information (websites, telephone numbers)
Upcoming events (celebrations, day, movie, anniversary date, races, etc.)
Pamphlet is attractive and well organized
Correct spelling and grammar
Figure 1.9
Conclusion
Errors should be expected and celebrated because they are opportunities for learning. If students are not making errors, they have likely previously mastered the learning intention.

Teachers, we have choices. We can elect to use instructional routines and procedures that don’t work, or that don’t work for the intended purpose. Or we can embrace the evidence, update our classrooms, and impact student learning in wildly positive ways. We can choose to move beyond surface-level learning, while still honoring the importance of teaching students surface-level skills and strategies. We can extend students’ learning in deep ways and facilitate the transfer of their learning to new tasks, texts, and projects, if we want. We can design amazing lessons that mobilize the evidence and provide opportunities for students to learn. And we can decide to evaluate our impact, if we are brave enough.

Monica was lucky enough to transfer to a school that embraced Visible Learning for Literacy. Her teachers tried out the instructional ideas, monitored progress, and provided feedback to her and to each other. Monica went from a failing student, tracked in a class with low expectations, to a lead learner providing support for her peers. Impact has a face. It’s not an abstract idea or ideal. Together, we can impact the literacy learning of every student. Let’s make it so.
FEEDBACK STRATEGIES

Feedback Strategies Can Vary in . . . : Mode
In These Ways . . . : Oral; Written; Visual/demonstration
Recommendations for Good Feedback:
• Select the best mode for the message. Would a comment in passing the student’s desk suffice? Is a conference needed?
• Interactive feedback (talking with the student) is best when possible.
• Give written feedback on written work or on assignment cover sheets.
• Use demonstration if “how to do something” is an issue or if the student needs an example.
Figure 1.10