
Visible Learning for Literacy, Grades K–12
Implementing the Practices That Work Best to Accelerate Student Learning

Douglas Fisher
Nancy Frey
John Hattie
FOR INFORMATION:
Corwin
A SAGE Company
2455 Teller Road
Thousand Oaks, California 91320
(800) 233-9936
www.corwin.com
SAGE Publications Ltd.
1 Oliver’s Yard
55 City Road
London EC1Y 1SP
United Kingdom
SAGE Publications India Pvt. Ltd.
B 1/I 1 Mohan Cooperative Industrial Area
Mathura Road, New Delhi 110 044
India
SAGE Publications Asia-Pacific Pte. Ltd.
3 Church Street
#10-04 Samsung Hub
Singapore 049483
Copyright © 2016 by Corwin
All rights reserved. When forms and sample documents are included,
their use is authorized only by educators, local school sites, and/or
noncommercial or nonprofit entities that have purchased the book.
Except for that usage, no part of this book may be reproduced or
utilized in any form or by any means, electronic or mechanical,
including photocopying, recording, or by any information storage and
retrieval system, without permission in writing from the publisher.
All trademarks depicted within this book, including trademarks
appearing as part of a screenshot, figure, or other image, are included
solely for the purpose of illustration and are the property of their
respective holders. The use of the trademarks in no way indicates any
relationship with, or endorsement by, the holders of said trademarks.
Printed in the United States of America
Library of Congress Cataloging-in-Publication Data
Names: Fisher, Douglas, author. | Frey, Nancy, author. | Hattie, John, author.
Title: Visible learning for literacy, grades K–12 : implementing the practices that
work best to accelerate student learning / Douglas Fisher, Nancy Frey, John Hattie.
Description: Thousand Oaks, California : Corwin/A SAGE Company, 2016. |
Includes bibliographical references and index.
Identifiers: LCCN 2015048505 | ISBN 9781506332352 (pbk. : alk. paper)
Subjects: LCSH: Language arts (Elementary) | Language arts (Secondary) | Literacy
—Study and teaching (Elementary) | Literacy—Study and teaching (Secondary) |
Visual learning.
Classification: LCC LB1576 .F338 2016 | DDC 372.6—dc23 LC record available
at http://lccn.loc.gov/2015048505
This book is printed on acid-free paper.

Publisher: Lisa Luedeke
Editorial Development Manager: Julie Nemer
Editorial Assistant: Nicole Shade
Production Editor: Melanie Birdsall
Contents
List of Videos
Preface
Acknowledgments
Chapter 1. Laying the Groundwork for Visible Learning for
Literacy
The Evidence Base
Meta-Analyses
Effect Sizes
Noticing What Works
Learning From What Works, Not Limited to Literacy
Teacher Credibility
Teacher–Student Relationships
Teacher Expectations
General Literacy Learning Practices
1. Challenge
2. Self-Efficacy
3. Learning Intentions With Success Criteria
Conclusion
Chapter 2. Surface Literacy Learning
Why Surface Literacy Learning Is Essential
Acquisition and Consolidation
Acquisition of Literacy Learning Made Visible
Leveraging Prior Knowledge
Phonics Instruction and Direct Instruction in Context
Vocabulary Instruction
Mnemonics
Word Cards
Modeling Word Solving
Word and Concept Sorts
Wide Reading
Reading Comprehension Instruction in Context
Summarizing
Annotating Text
Note-Taking
Consolidation of Literacy Learning Made Visible
Rehearsal and Memorization Through Spaced Practice
Repeated Reading
Receiving Feedback
Collaborative Learning With Peers
Conclusion
Chapter 3. Deep Literacy Learning
Moving From Surface to Deep
Deep Acquisition and Deep Consolidation
Deep Acquisition of Literacy Learning Made Visible
Concept Mapping
Discussion and Questioning
Close Reading
Deep Consolidation of Literacy Learning Made Visible
Metacognitive Strategies
Reciprocal Teaching
Feedback to the Learner
Conclusion
Chapter 4. Teaching Literacy for Transfer
Moving From Deep Learning to Transfer
Types of Transfer: Near and Far
The Paths for Transfer: Low-Road Hugging and High-Road
Bridging
Setting the Conditions for Transfer of Learning
Teaching Students to Organize Conceptual Knowledge
Students Identify Analogies
Peer Tutoring
Reading Across Documents
Problem-Solving Teaching
Teaching Students to Transform Conceptual Knowledge
Socratic Seminar
Extended Writing
Time to Investigate and Produce
Conclusion
Chapter 5. Determining Impact, Responding When the Impact Is
Insufficient, and Knowing What Does Not Work
Determining Impact
Preassessment
Postassessment
Responding When There Is Insufficient Impact
Response to Intervention
Screening
Quality Core Instruction
Progress Monitoring
Supplemental and Intensive Interventions
Learning From What Doesn’t Work
Grade-Level Retention
Ability Grouping
Matching Learning Styles With Instruction
Test Prep
Homework
Conclusion
Appendix: Effect Sizes
References
Index
Preface

Literacy educators have been in search of “what works” for decades. As
a group, we’ve dedicated ourselves to students’ reading and writing
(and speaking, listening, and viewing) development because we know
that literacy can change lives. Our collective search for better ways to
reach students and ensure that they develop literacy knowledge and
skills has resulted in thousands and thousands of books, hundreds of
thousands of research articles, and countless websites. So why another
one?
For us, the answer is simple. Nearly all the things teachers do work
when we ask what improves student achievement. But only a few
things work at ensuring that students gain a full year’s worth of growth
for a year of enrollment in school, and we think it’s time we focused on
what works, what doesn’t work, and what can’t hurt. And we’ve turned
to Visible Learning (Hattie, 2009) for help.
In part, this has been a personal journey. We (Nancy and Doug)
engaged in literacy instruction in a wide range of settings, including
preschools, elementary schools, middle schools, and high schools, for
many years before we read Visible Learning. We have taught students
who live in poverty, a wide range of English learners, students who are
highly engaged in their own learning, students who are homeless,
students with disabilities, students who grasp concepts almost instantly,
and students who are not so motivated to be in school. Over the years,
our classrooms have been wonderfully diverse and complex places for
learning to occur. And we did a reasonably good job with developing
students’ literacy.
Of course, we made mistakes as well, but all teachers do. Doug wishes
he could find Anthony, a ninth grader from 2009, who just never got
good enough writing instruction to pass his classes. Today, Doug would
do a better job. Nancy remembers a particular first grader who would
only work on his onset and rime cards if Nancy played background
music. Whatever it takes—that’s the job of the teacher. We tried just
about any instructional strategy that we could find to engage students in
learning.
But then, along came Visible Learning. We’ve read the research, and we
knew, for example, that vocabulary instruction works to improve
student learning. We read the book and were pleased to see that many
of the literacy approaches we recommended were included in this list of
“what works best.” We congratulated ourselves on knowing the
research literature and trying to translate that into classroom practice.
The list of effect sizes was useful in making the case that literacy
educators can have a powerful impact on students’ learning when they
engage in specific actions. And it was useful to know that a great deal
of students’ learning was under the control of the teacher (so that we
could help teachers take responsibility and reduce finger-pointing).
We started focusing on influences on student learning that had a
reasonable impact. But we didn’t have them organized in any particular
way. As a result, we noticed that not all of these approaches worked
equally well. We thought it had to be us because the research was there
to support each of the routines we used. We weren’t sure what to do, so
we kept at it, engaging students in the best learning opportunities we
could. We shared responsibility with them and guided their learning,
such that more and more of our students became their own teachers,
which is one of the major lessons learned from Visible Learning.
A chance encounter with John Hattie took us to the next level. John
talked about the value of matching specific instructional routines,
procedures, or strategies with the appropriate phase of students’
learning. Of course, we knew about Bloom’s taxonomy and Webb’s
depth of knowledge. But this was a bit different. John said that students
have to develop surface-level learning if they are ever going to go deep.
And we know that deep learning can facilitate transfer, which has been
our goal all along.
So we updated our lessons and started thinking about which
instructional routines worked at the surface level. With our colleagues,
we focused on some specific instructional approaches early in units of
study, when students needed to expand their surface-level skills. And
then, importantly, we stopped using these procedures when students
moved into deeper learning. And it worked. We all had more students,
more often, engaged in deeper learning. And students were transferring
their learning from class to class, grade to grade, and year to year.
So there we sat, realizing that it was time to write another book. This
time, we needed to explore the ways in which the Visible Learning
influences could be mobilized at three levels—surface, deep, and
transfer. And who better to collaborate with than John Hattie himself?
Together, we hoped that the literacy world might be open to rethinking
strategies and shifting focus to the alignment of these strategies in tune
with phases of learning.
The result is this book that you’re holding right now. It’s our best
thinking to date about being an effective literacy educator. Knowing
how to match instructional approaches with specific phases of learning,
knowing your impact, and taking action when the impact is not
sufficient has become our newest and most robust effort to help
students inherit the world of literacy.

In the electronic edition of the book you have purchased, there are several
icons that reference links (videos, journal articles) to additional content.
Though the electronic edition links are not live, all content referenced may be
accessed at http://resources.corwin.com/VL-Literacy. This URL is referenced
at several points throughout your electronic edition.
CHAPTER 1. LAYING THE GROUNDWORK FOR VISIBLE LEARNING FOR LITERACY

Every student deserves a great teacher, not by chance, but by design.

Who can disagree with that? Who doesn’t believe that every student, in
every classroom, deserves to be educated in ways that build his or her
confidence and competence? Let’s take apart that sentence and explore
some of the thinking behind each word or phrase.

•• Every student (not just some students, such as those whose parents
   can afford it or those who are lucky enough to live on a street that
   allows them to attend an amazing school)

•• deserves (yes, we believe that students have the right to a quality
   education)

•• a great teacher (one who develops strong relationships, knows his or
   her content and how to teach it, and evaluates his or her impact. This
   is where a lot of debate enters the picture because people differ in
   their understanding of what great teachers do and how they think)

•• not by chance (meaning that we have to move beyond the luck of the
   draw that permeates much of the educational landscape. Children’s
   education should not be left to chance, with one year being amazing
   and another average or awful. Further, children’s education should be
   left not to whatever sense of challenge or level of expectation a teacher
   may have, but to an appropriate high level of challenge and expectation)

•• but by design (yes, there are learning designs that work, when used at
   the right time. In fact, the literature is awash with evidence of designs
   that work and those that do not work)

The design we’re talking about, the one that has great potential for
impacting students’ learning and allowing all of us to be great teachers,
is John Hattie’s Visible Learning (2009). So what do we mean by visible
learning? In part, it’s about developing an understanding of the impact
that instructional efforts have on students’ learning. Notice we didn’t
limit that to teachers. Students, teachers, parents, administrators—
everyone can determine if the learning is visible. To do so, students
have to know what they are learning, why they are learning it, what it
means to be “good” at this learning, and what it means to have learned.
The adults also need to know what students are learning, why they
are learning it, what it means to be “good” at this learning, and what
it means to have learned. Some things are learned at the surface level,
others at the deep level, and still other knowledge is available for
transfer to new situations. Each of these surface, deep, and transfer levels of
learning is important; each of these is the focus, in turn, of one of the
following three chapters.

We believed that it was time to apply John’s previous work with visible
learning to the world of literacy learning. We think that visible learning
for literacy is important for several reasons:

1. Literacy is among the major antidotes for poverty.

2. Literacy makes your life better.

3. Literate people have more choices in their work and personal
lives, leading to greater freedom.

4. Literacy is great at teaching you how to think successively—that
is, making meaning one step at a time to then build a story.

5. Literacy soon becomes the currency of other learning.

Visible learning for literacy requires that teachers understand which strat-
egies and instructional routines are useful in which teaching situations.
There is no single right way to develop students’ literacy prowess. But
there are wrong ways. In Chapter 5, we will turn our attention to a spe-
cific list of practices that do not work in the literacy classroom. For now,
we will focus on those that do.

There are certain things that great teachers know:

•• Great teachers understand that different approaches work more
   effectively at different times. For example, a great approach for
   developing students’ surface-level learning is not likely to ensure
   deep learning, much less transfer. But there are times when their
   surface-level learning is what students need.

•• Great teachers know that different approaches work for some
   students better than for other students.

•• Great teachers know that different approaches work differently
   depending on where in the learning process a student may be.

•• Great teachers intervene in specific, meaningful, and calculated
   ways to increase students’ learning trajectories. This requires
   that they understand and share challenging, yet specific and
   appropriate, goals with students; monitor progress toward those
   goals; provide and receive feedback; alter their actions when
   learning is not occurring; and share in the joy that comes from
   working with students to meet the learning goals.

Visible learning asks teachers to go even a step further. It asks us to create
the conditions necessary for students to become their own teachers. We
mean not that classrooms should be surrendered and the students be
told to teach themselves, but rather that the expectation of the instruc-
tion students receive involves student engagement to the degree that
they want to, and do, learn more and better—even beyond the class-
room walls. This requires that teachers become learners of their own
teaching, which is the major focus of this book.

The Evidence Base


Meta-Analyses
The starting point for our exploration of literacy learning is John Hattie’s
books, Visible Learning (2009) and Visible Learning for Teachers (2012).
At the time these books were published, his work was based on over
800 meta-analyses conducted by researchers all over the world, which
together included over 50,000 individual studies and over 250 million
students. It has been claimed to be the most comprehensive review of
literature ever conducted. And the thing is, it’s still going on. At the time
of this writing, the database included 1,200 meta-analyses, with over
70,000 studies and 300 million students. A lot of data, right? But the
story underlying the data is the critical matter.

Before we explore the findings and discuss what we don’t cover in
this book, we should discuss the idea of a meta-analysis because it is
the basic building block for the recommendations in this book. At its
root, a meta-analysis is a statistical tool for combining findings from
different studies with the goal of identifying patterns that can inform
practice. It’s the old preponderance of evidence that we’re looking for,
because individual studies have a hard time making a compelling case
for change. But a meta-analysis synthesizes what is currently known
about a given topic and can result in strong recommendations about
the impact or effect of a specific practice. For example, there was com-
peting evidence about periodontitis (inflammation of the tissue around
the teeth) and whether or not it is associated with increased risk of cor-
onary heart disease. The published evidence contained some conflicts,
and recommendations about treatment were piecemeal. A meta-analysis
of 5 prospective studies with 86,092 patients suggested that individ-
uals with periodontitis had a 1.14 times higher risk of developing
coronary heart disease than the controls (Bahekar, Singh, Saha, Molnar,
& Arora, 2007). The result of the meta-analysis was a set of clear recom-
mendations for treatment of periodontitis, with the potential of signifi-
cantly reducing the incidence of heart disease. We won’t tell you too
many other stories about health care or business, but we hope that the
value of meta-analyses in changing practice is clear.

The statistical approach for conducting meta-analyses is beyond the
scope of this book, but it is important to note that this tool allows
researchers to identify trends across many different studies and their
participants.

Effect Sizes
In addition to the meta-analyses, the largest summary of educational
research ever conducted (Visible Learning) contains effect sizes for each
practice (see Appendix, pages 169–173). An effect size is the magnitude,
or size, of a given effect. But defining a phrase by using the same terms
isn’t that helpful. So we’ll try again. You might remember from your sta-
tistics class that studies report statistical significance. Researchers make
the case that something “worked” when chance is reduced to 5% (as in
p < 0.05) or 1% (as in p < 0.01)—what they really mean is that the effect
found in the study was unlikely to be zero: something happened (but
there’s no hint of the size of the effect, or whether it was worthwhile!).
6 VISIBLE LEARNING FOR LITERACY, GRADES K–12

One way to increase the likelihood that statistical significance is reached
is to increase the number of people in the study, also known as sample
size. We’re not saying that researchers inflate the size of the research
group to obtain significant findings. We are saying that simply because
something is statistically significant doesn’t mean it’s worth implement-
ing. For example, say the sample size is 1,000. In this case, a correlation
only needs to exceed 0.044 to be “statistically significant”; if 10,000,
then 0.014, and if 100,000, then 0.004—yes, you can be confident that
these values are greater than zero, but are they of any practical value?
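
To make this concrete, here is a minimal sketch of our own (it is not from
Visible Learning, and it uses the standard Fisher z approximation rather than
whatever test a particular study might report): it estimates the two-tailed
p value for a tiny correlation at several sample sizes, showing how a
relationship far too small to matter in practice can still clear the
“statistically significant” bar once the sample is large enough.

# Illustration only: significance of a tiny correlation at different sample
# sizes, using the Fisher z approximation for testing whether r = 0.
from math import atanh, sqrt
from statistics import NormalDist

def p_value_for_correlation(r, n):
    """Approximate two-tailed p value for the hypothesis that the true correlation is zero."""
    z = atanh(r) * sqrt(n - 3)
    return 2 * (1 - NormalDist().cdf(z))

for n in (1_000, 10_000, 100_000):
    print(n, round(p_value_for_correlation(0.01, n), 4))
# A correlation of 0.01 is trivial in practical terms, yet it moves from
# nonsignificant (p ≈ 0.75 at n = 1,000) to clearly "significant"
# (p ≈ 0.002 at n = 100,000).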

That’s where effect size comes in.


Effect size represents the magnitude of the impact that a given approach has.

Say, for example, that this amazing writing program was found to be
statistically significant in changing student achievement. Sounds good, you
say to yourself, and you consider purchasing or adopting it. But then you
learn that it only increased students’ writing performance by 0.3 on a
5-point rubric (and the research team had data from 9,000 students). If it
were free and easy to implement this change, it might be worth it to have
students get a tiny bit better as writers. But if it were time-consuming,
difficult, or expensive, you should ask yourself if it’s worth it to go to all
of this trouble for such a small gain. That’s effect size—it represents the
magnitude of the impact that a given approach has.
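
A quick worked example of our own (the scenario above reports only the raw
gain, so the spread of scores has to be assumed): if writing scores on that
5-point rubric varied with a standard deviation of roughly 1 rubric point, the
program’s impact would be about

d = gain ÷ standard deviation = 0.3 ÷ 1.0 = 0.30

less than a third of a standard deviation, which is why statistical significance
alone says little about whether a program is worth adopting.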

EFFECT SIZE FOR DIRECT INSTRUCTION = 0.59

Visible Learning provides readers with effect sizes for many influences
under investigation. As an example, direct instruction has a reasonably
strong effect size at 0.59 (we’ll talk more about what the effect size
number tells us in the next section). The effect sizes can be ranked from
those with the highest impact to those with the lowest. But that doesn’t
mean that teachers should just take the top 10 or 20 and try to implement
them immediately. Rather, as we will discuss later in this book,
some of the highly useful practices are more effective when focused on
surface-level learning while others work better for deep learning and
still others work to encourage transfer. Purpose, context, and timing of
practices all matter and must be considered. For general discussion of
effect sizes, see Figure 1.1.
A PRIMER ON EFFECT SIZES

Let us get a sense of what an effect size means. There are two common ways to calculate an effect size:
first, when two groups are compared—such as comparing a class receiving a literacy program with a similar
class not receiving this program—and second, over time—such as comparing the performance of a group
of students at the outset and again at the end of a series of literacy instruction. In both cases, the effect size
represents the magnitude of the difference—and of course the quality of the comparison, the measuring
instruments, and the research design to control extraneous factors are critical.
An effect size of d = 0.0 indicates no change in achievement related to the intervention. An effect size
of d = 1.0 indicates an increase of one standard deviation on the outcome (e.g., reading achievement). A
d = 1.0 increase is typically associated with advancing children’s achievement by two to three years, and it
would mean that, on average, the achievement of students receiving the treatment would exceed that of
84% of students not receiving the treatment. Cohen (1988) argued that an effect size of d = 1.0 should be
regarded as a large, blatantly obvious, and grossly perceptible difference, and as an example, he referred to
the difference between the average IQ of PhD graduates and high school students. Another example is the
difference between a person at 5’3” (160 cm) and one at 6’0” (183 cm)—which would be a difference visible
to the naked eye.
We do need to be careful about ascribing adjectives such as small, medium, and large to these effect sizes.
Cohen (1988), for example, suggested that d = 0.2 was small, d = 0.5 medium, and d = 0.8 large, whereas it
is possible to show that when investigating achievement influences in schools, d = 0.2 could be considered
small, d = 0.4 medium, and d = 0.6 large (Hattie, 2009). In many cases, this attribution would be reasonable,
but there are situations where this would be too simple an interpretation. Consider, for example, the effects
of an influence such as behavioral objectives, which has an overall small effect of d = 0.20, and reciprocal
teaching, which has an overall large effect of d = 0.74. It may be that the cost of implementing behavioral
objectives is so small that it is worth using them to gain an influence on achievement, albeit small, whereas it
might be too expensive to implement reciprocal teaching to gain the larger effect.
The relation between the notions of magnitude and statistical significance is simple: Significance = Effect
size × Study size. This should highlight why both aspects are important when making judgments. Effect sizes
based on small samples or small numbers of studies may not tell the true story, in the same way that statistical
significance based on very large samples may also not tell the true story (for example, a result could be
statistically significant but have only a tiny effect size). Similarly, two studies with the same effect sizes can
have different implications when their sample sizes vary (we should place more weight on the one based on
the larger sample size). The most critical aspect of any study is the convincibility of the story that best explains
the data; it is the visible learning story that needs critique or improvement—to what degree is the story in this
book convincing to you?

Figure 1.1
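
For readers who want to see the arithmetic behind this primer, the sketch below
(ours, with invented scores rather than data from Visible Learning) computes an
effect size in the two ways described above: comparing two groups, and comparing
the same group over time. Both divide a difference in means by a standard
deviation, which is what puts d on the scale discussed in Figure 1.1.

# Illustration only: effect sizes from invented scores, following the two
# approaches described in Figure 1.1 (between groups, and over time).
from math import sqrt
from statistics import mean, stdev, NormalDist

def effect_size(treated, comparison):
    """Difference in means divided by the pooled standard deviation (Cohen's d)."""
    n1, n2 = len(treated), len(comparison)
    pooled_sd = sqrt(((n1 - 1) * stdev(treated) ** 2 +
                      (n2 - 1) * stdev(comparison) ** 2) / (n1 + n2 - 2))
    return (mean(treated) - mean(comparison)) / pooled_sd

# 1. Two groups compared: a class receiving a literacy program vs. a similar class without it.
program_class = [78, 85, 91, 74, 88, 82, 90, 79]
other_class   = [70, 76, 83, 68, 80, 75, 84, 72]
print(round(effect_size(program_class, other_class), 2))  # ≈ 1.2

# 2. Over time: the same students at the outset and end of a series of lessons
#    (one common convention; others divide by the pretest or gain-score SD).
pretest  = [52, 60, 47, 58, 63, 55]
posttest = [61, 70, 55, 66, 71, 64]
print(round(effect_size(posttest, pretest), 2))           # ≈ 1.5

# The 84% mentioned above is the area of the normal curve below +1 standard deviation:
print(round(NormalDist().cdf(1.0), 2))                    # 0.84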


EFFECT SIZE FOR COOPERATIVE VERSUS INDIVIDUALISTIC LEARNING = 0.59

The effect size of direct instruction doesn’t mean that classrooms should
be composed of all direct instruction any more than they should be
fully cooperative versus individualistic (which has an effect size of 0.59).
Direct instruction likely works better during surface-level literacy learning
whereas cooperative learning can deepen students’ understanding of
content (provided that students have sufficient surface knowledge to then
make relations and extend ideas). Both can be effective when used for the
right purpose. The effect size list also includes some things that don’t work.

Noticing What Works


If you attend any conference or read just about any professional jour-
nal, not to mention subscribe to blogs or visit Pinterest, you’ll get the
sense that everything works. Yet educators have a lot to learn from prac-
tices that do not work. In fact, we would argue that learning from what
doesn’t work, and not repeating those mistakes, is a valuable use of time.
To determine what doesn’t work, we turn our attention to effect sizes
again. Effect sizes can be negative or positive, and they scale from low
to high. Intuitively, an effect size of 0.60 is better than an effect size
of 0.20. Intuitively, we should welcome any effect that is greater than
zero—as zero means “no growth” and clearly any negative effect size
means negative growth. If only it were this simple.

It turns out that about 95%+ of the influences that we use in schools
have a positive effect; that is, the effect size of nearly everything we
do is greater than zero. This helps explain why so many people can
argue “with evidence” that their pet project works. If you set the bar at
showing any growth above zero, it is indeed hard to find programs and
practices that don’t work. As described in Visible Learning (Hattie, 2009),
we have to reject the starting point of zero. Students naturally mature
and develop over the course of a year, and thus actions, activities, and
interventions that teachers use should extend learning beyond what a stu-
dent can achieve by simply attending school for a year.

This is why John Hattie set the bar of acceptability higher—at the
average of all the influences he compiled—from the home, parents,
schools, teachers, curricula, and teaching strategies. This average was
0.40, and Hattie called it the “hinge point.” He then undertook to
study the underlying attributes that would explain why those influ-
ences higher than 0.40 had such a positive impact compared with
those lower than 0.40. His findings were the impetus for the Visible
Learning story.

Borrowing from Visible Learning, the barometer and hinge point are
effective in explaining what we focus on in this book and why. Here’s an
example of how this might play out from literacy:

Let’s focus on sentence-combining efforts, which are popular in literacy
education circles. In essence, students are taught to use punctuation,
compound sentences, subordination, reduction, and apposition to take
two or more sentences and produce one. For example, students might be
given the following three sentences and asked to combine them:

John F. Kennedy was inaugurated into office in January 1961.

He was assassinated in November 1963.

He spent only 1,000 days in office.

There are a number of correct responses to this task, but students
may incorrectly think that the combined sentences are better, that
sentence complexity is important above all else, or that combined
sentences maintain the same meaning and focus as uncombined sen-
tences. But as with much of the educational research, there are stud-
ies that contradict other studies. For example, Wilkinson and Patty
(1993) compared sentence-combining instruction with a placebo
treatment and found significantly better results for sentence combin-
ing. But did their sentence-combining approach raise achievement
over that which was expected from simply attending school for a
year? That’s where the meta-analyses and effect size efforts can teach
us. The barometer and hinge point for sentence combining are pre-
sented in Figure 1.2. Note that this approach rests in the zone of
“developmental effects,” which is below the teacher effects and better
than reverse effects.
THE BAROMETER FOR THE INFLUENCE OF SENTENCE COMBINING

[Barometer graphic: a dial spanning reverse effects, developmental effects,
teacher effects, and the zone of desired effects (0.40 and above), with the
needle for sentence combining at d = 0.15, inside the zone of developmental
effects.]

Sentence Combining d = 0.15

Source: Adapted from Hattie (2012).

Figure 1.2

Our focus in Visible Learning for Literacy is on actions that fall inside
the zone of desired effects, which is 0.40 and above. When actions are in
the range of 0.40 and above, the data suggest that the learning extends
beyond that which was expected from attending school for a year.

EFFECT SIZE FOR DRAMA/ARTS PROGRAMS = 0.35

Caution: That doesn’t mean that everything below 0.40 effect size is
not worthy of attention. In fact, there are likely some useful approaches
for teaching and learning that are not above this average. For example,
drama and arts programs have an effect size of 0.35, almost ensuring
that students gain a year’s worth of achievement for a year of education.
We are not suggesting that drama and art be removed from the
curriculum. In fact, artistic expression and aesthetic understanding
may be valuable in and of themselves. Another critical finding was
the very low effect of teacher’s subject matter knowledge. While we
may accept the evidence that it is currently of little import, surely this
means we should worry considerably and investigate, first, why it is so
low and, second, how we can change what we do in the classroom to
ensure that the knowledge teachers bring to the classroom has a much
higher effect.

EFFECT SIZE FOR SIMULATIONS = 0.33

It is important to note that some of the aggregate scores mask situations
in which specific actions can be strategically used to improve
students’ understanding. Simulations are a good case. The effect size
for simulations is 0.33, below the threshold that we established. But,
what if simulations were really effective in deepening understanding
but really, really bad when used with surface learning? In this case,
the strategic deployment of simulations could be important. There are
situations like this that we will review in this book as we focus on
surface-level literacy learning versus deep literacy learning and transfer
learning. For now, let’s turn our attention to actions that teachers can
take to improve student learning.

Learning From What Works, Not Limited to Literacy

The majority of this book will focus on literacy, specifically. In this next
section, however, we focus our attention more broadly. Literacy instruction
is situated in a larger classroom environment, and learning to read,
write, speak, listen, and view is contextualized in the general learning
situations that students encounter. We believe that the following influences
deserve attention from teachers in all classes, including those
devoted to literacy.

Teacher Credibility
EFFECT SIZE FOR TEACHER CREDIBILITY = 0.90

A few things come to mind when we consider actions that teachers
can take at the more generic level. On the top of the list, with an
effect size of 0.90, is teacher credibility. Students know which teachers
can make a difference in their lives. Teacher credibility is a constellation
of characteristics, including trust, competence, dynamism,
and immediacy. Students evaluate each of these factors to determine
if their teacher is credible, and if they are going to choose to learn
from that teacher. Teachers can compromise their credibility when
they violate trust, make a lot of errors, sit in the back of the room,
or lack a sense of urgency. They compromise their credibility particularly
if they are not seen to be fair. Of course, each of these needs
to be held in balance. For example, too much pressure, and students
will think that a given teacher is a stress case. Not enough, and they’ll
think their teacher doesn’t care. Similarly, students might think a
teacher is weird when he or she fakes excitement about a topic of
study, or realize that their teacher doesn’t care about the unit at all.
Although not specifically focused on literacy, the dynamic of teacher
credibility is always at play.

Consider Angela Conner. She’s always excited about everything. She
knows her content well and works to establish trusting relationships
with her students. But every time something happens, it’s as if it’s
the most important and exciting thing ever. She is over the top with
enthusiasm. This worked well for her with her kindergarten students,
but her fifth graders think she’s a fake. As one of the students said,
“Yeah, Ms. Conner pretends to be excited, even when we get a test
back. Really? It’s important, but it’s not like she should be jumping
around like she does.” This student, and likely many more, is questioning
Ms. Conner’s credibility and thus compromising her students’
ability to learn from her.

EFFECT SIZE FOR CONCEPT MAPPING = 0.60

On the other hand, Brandon Chu exudes excitement episodically,
and his students wait for it. Things seem very important to Mr. Chu,
and he tells his students why things are important and how the class
builds on itself over the course of the year. In one lesson, Mr. Chu
said, “We’ve got some pressure on us to get some major work done.
It’s crunch time, people, and we need to support each other in our
learning. Please make sure that each of you has completed the concept
map and are ready to write. If you haven’t had a peer review
yet, let me know. We need to get these done so that they can be
included in the upcoming e-zine. If we miss the deadline, we’re out
of the issue.” Mr. Chu’s students trust him and know when it’s time
to focus. They appreciate his dynamic yet not overzealous style. And,
parenthetically, they learn a lot.

Teacher–Student Relationships
EFFECT SIZE FOR TEACHER–STUDENT RELATIONSHIPS = 0.72

Closely related to teacher credibility is teacher–student relationships,
which have an effect size of 0.72. When students believe that the teacher
is credible, they are more likely to develop positive relationships with
that teacher, and then learn more from him or her. But relationships
go deeper than credibility. Of course, relationships are based on trust,
which is part of the credibility construct. But relationships also require
effective communication and addressing issues that strain the relationship.
Positive relationships are fostered and maintained when teachers
set fair expectations, involve students in determining aspects of the
classroom organization and management, and hold students accountable
for the expectations in an equitable way. Importantly, relationships
are not destroyed when problematic behaviors occur, on the part of
either the teacher or students. This is an important point for literacy
educators. If we want to ensure students read, write, communicate, and
think at high levels, we have to develop positive, trusting relationships
with students, all students.

The optimal relationships also include when the teacher establishes
high levels of trust among the students. When students ask a question
indicating they are lost, do not know where they are going, or are just
plain wrong, high levels of peer-to-peer trust means that these students
are not ridiculed, do not feel that they should be silent and bear their
not knowing, and can depend on the teacher and often other students
to help them out.

Unfortunately, in some cases, specific students are targeted for behavioral
correction while other students engaged in the same behavior are
not noticed. This happens often across the K–12 grade span. We remem-
ber a primary-grade classroom in which a student with a disability was
repeatedly chastised for a problematic behavior, but other children
engaged in the same behavior were ignored and allowed to continue.
Yes, the children noticed. As one of the students said, “Mr. Henderson
doesn’t want Michael in our class.” It’s hard to develop positive relation-
ships, and then achieve, when you are not wanted. But, perhaps even
more importantly, the poor relationship between Mr. Henderson and
Michael spilled over to the rest of the students who didn’t think their
teacher was fair or that he was trustworthy.

Video 1.1 Teacher–Student Relationships That Impact Learning
http://resources.corwin.com/VL-Literacy
To read a QR code, you must have a smartphone or tablet with a camera. We
recommend that you download a QR code reader app that is made specifically
for your phone or tablet brand.

We have also observed this phenomenon in secondary classrooms.
There always seem to be some students who can get away with problematic
behavior. Sometimes, these students are athletes; other times,
they’re cheerleaders or drama students or musicians or students whose
parents work in the district. It doesn’t really matter which group they
belong to; their status allows them to get away with things that other
students don’t. And it always compromises the trust students have with
their teacher and the relationships that develop.

But we’re not saying that literacy educators should be strict disciplinarians
who mete out punishments and consequences for every infraction.
We are saying that it’s important to be consistent, to be fair, and to repair
relationships that are damaged when problematic behavior occurs. To
develop positive relationships, it’s important that teachers

•• Display student work

•• Share class achievements

•• Speak to the accomplishments of all students

•• Be sincere in their pride in their students and make sure that pride
is based on evidence of student work, not generalized comments

•• Look for opportunities for students to be proud of themselves
and of other students or groups of students

•• Develop parental pride in student accomplishments

•• Develop pride in improvement in addition to pride in excellence

As we mentioned above, teachers also have the responsibility to repair
harm to relationships. These restorative practices allow students to take
responsibility for their behavior and to make amends. This can be a simple
impromptu conference, a class meeting or circle, or a more formal
victim–offender dialogue. Regardless, the point is to ensure that students
understand that their actions caused harm and that they can repair that
harm. Figure 1.3 contains questions, developed by the International
Institute for Restorative Practices, that allow people to figure out what
went wrong and how to repair the harm that has been done.
RESTORATIVE CONFERENCING

Questions to Ask the Offender
•• “What happened?”
•• “What were you thinking about at the time?”
•• “What have you thought about since the incident?”
•• “Who do you think has been affected by your actions?”
•• “How have they been affected?”

Questions to Ask the Victim
•• “What was your reaction at the time of the incident?”
•• “How do you feel about what happened?”
•• “What has been the hardest thing for you?”
•• “How did your family and friends react when they heard about the incident?”

Source: Restorative Conference Facilitator Script, Restorative Conferencing, International Institute on Restorative Practices,
http://www.iirp.edu/article_detail.php?article_id=NjYy

Figure 1.3

EFFECT SIZE FOR CLASSROOM MANAGEMENT = 0.52

We’ve spent time on this because relationships matter, and students achieve
more and better when they develop strong interpersonal relationships
with their teachers. It’s these humane and growth-producing conversations
that help students grow in their prosocial behaviors. (Note that
the greatest effect on achievement when students join a new class or
school is related to whether they make a friend in the first month—
it is your job to worry about friendship, counter loneliness, and help
students gain a reputation as great learners not only in your eyes but
also in the eyes of their peers.) And by the way, effectively managed
classrooms, ones in which students understand the expectations and are
held to those expectations in ways that are consistent with relationship
development and maintenance, have an effect size of 0.52. A poorly run
classroom will interfere with high-quality literacy learning.

Teacher Expectations
EFFECT SIZE FOR EXPECTATIONS = 0.43

Another influence on student achievement that is important for literacy
educators, but isn’t directly a literacy approach, is teacher expectations,
with an effect size of 0.43. In large part, teachers get what they expect;
yes, teachers with low expectations are particularly successful at getting
what they expect. The more recent research has shown that teachers
who have high (or low) expectations tend to have them for all their students
(Rubie-Davies, 2015). Teachers’ expectations of students become
the reality for students. Requiring kindergarteners to master 100 sight
words, and then aligning instruction to accomplish that, communicates
the expectations a teacher has for five-year-olds. Believing that ninth
graders can only write five-paragraph essays with 500 words sets the bar
very low, and students will jump just that high, and no higher than that.
Over time, students exert just enough effort to meet teacher expectations.
Hattie (2012) called this the minimax principle, “maximum grade
return for minimal extra effort” (p. 93). And it gets in the way of better
and deeper learning. When expectations are high, the minimax principle
can work to facilitate students’ learning.

This does not mean that teachers should set unrealistic expectations.
Telling first graders that they are required to read Tolstoy’s War and Peace
is a bit too far. Teachers should have expectations that appropriately
stretch students, and yet those expectations should be within reach.
Sixth graders who are held to fourth-grade expectations will be great
fifth graders when they are in seventh grade; the gap never closes. And
students deserve more. When high-yield literacy instructional routines
are utilized, students can achieve more than a year’s growth during a
year of instruction. And that’s what this book focuses on—maximizing
the impact teachers have on students’ learning.

Establishing and communicating a learning intention is an important
way that teachers share their expectations with students. When these
learning intentions are compared with grade-level expectations, or
expectations in other schools and districts, educators can get a sense
of their appropriateness. We will spend a lot more time later in this
book focused on learning intentions and success criteria. Another way
to assess the level of expectation is to invite students to share their goals
for learning with their teachers—especially early in the instructional
sequence. If students have low expectations for themselves, they’re
likely hearing that from the adults around them, and often this is what
they achieve. And finally, analyzing the success criteria is an important
way of determining the expectations a teacher has for students. A given
learning intention could have multiple success criteria, some of which
may be fairly low and others of which may be high. The success criteria
communicate the level of performance that students are expected to
meet, yet are often overlooked in explorations about teacher expecta-
tions. We’ll return to success criteria in the next section of this chapter,
but before we do so, it’s important to note that teachers establish expec-
tations in other ways beyond the learning intention.

Video 1.2 Making Learning Visible With Teacher Clarity and Expectations
http://resources.corwin.com/VL-Literacy

The ways in which teachers consciously and subconsciously communicate
their expectations to students are too numerous to list. Expectations
are everywhere, in every exchange teachers and students have. When
teachers use academic language in their interactions with others, they
communicate their expectations. When teachers maintain a clean and
inviting classroom, they communicate their expectations. When teachers
assign mindless shut-up sheets, they communicate their expectations.
When teachers provide honest feedback about students’ work,
they communicate their expectations. When teachers give one class two
days to complete work and another class one day, they communicate
their expectations. We could go on. Students watch their teachers all the
time trying to figure out what is expected of them and if they are trustworthy.
Literacy learning can be enhanced when teachers communicate
specific, relevant, and appropriate expectations for students. From there,
teachers can design amazing learning environments. But it’s more than
instruction. Teachers should focus on learning. It’s a mindset that we all
need, if we are going to ensure that students develop their literate selves.
A major theme throughout this book is how teachers think (and also
how we want students to think). Hattie (2012) suggests 10 mind frames
that can be used to guide decisions, from curriculum adoptions to lesson
planning (Figure 1.4).

EFFECT SIZE FOR TEACHER CLARITY = 0.75

Taken together, these mind frames summarize a great deal of the “what
works” literature. In the remainder of this book, we focus on putting
these into practice specifically as they relate to literacy learning, and
address the better question, what works best? (Hattie, 2009).
MIND FRAMES FOR TEACHERS

   1. I cooperate with other teachers.

   2. I use dialogue, not monologue.

   3. I set the challenge.

   4. I talk about learning, not teaching.

   5. I inform all about the language of learning.

   6. I see learning as hard work.

   7. Assessment is feedback to me about me.

   8. I am a change agent.

   9. I am an evaluator.

10. I develop positive relationships.

Source: Hattie (2012). Reproduced with permission.

Figure 1.4

To do so, we need to consider the levels of learning we can expect from
students. How, then, should we define learning, since that is our goal? As John
himself suggested in his 2014 Vernon Wall Lecture, learning can be defined as

[t]he process of developing sufficient surface knowledge
to then move to deeper understanding such that one can
appropriately transfer this learning to new tasks and situations.

EFFECT SIZE FOR SELF-REPORTED GRADES/STUDENT EXPECTATIONS = 1.44

Learning is a process, not an event. And there is a scale for learning.
Some things students only understand at the surface level. As we note
in the next chapter, surface learning is not valued, but it should be. You
have to know something to be able to do something with it. We’ve never
met a student who could synthesize information from multiple sources
who didn’t have an understanding of each of the texts. With appropriate
instruction about how to relate and extend ideas, surface learning
becomes deep understanding. Deep understanding is important if
students are going to set their own expectations and monitor their own
achievement. But schooling should not stop there. Learning demands
that students be able to apply—transfer—their knowledge, skills, and
strategies to new tasks and new situations. That transfer is so difficult
to attain is one of our closely kept secrets—so often we pronounce that
students can transfer, but the process of teaching them this skill is too
often not discussed. We will discuss it in Chapter 4.

Unfortunately, up to 90% of the instruction we conduct can be completed
by students using only the surface-level skills (Hattie, 2012). Read
that sentence carefully—it did not say that teachers do not ask students
to complete deeper analyses, and it did not say that teachers do not
ask students to complete tests and assignments that focus on deeper
learning. It said that students only need a high level of surface-level
knowledge to do well on this work. Why? Because teachers value surface
learning while often preaching deeper learning. We need to balance our
expectations with our reality. This means more constructive alignment
between what teachers claim success looks like, how the tasks students
are assigned align with these claims about success, and how success is
measured by end-of-course assessments or assignments. It is not a mat-
ter of all surface or all deep; it is a matter of being clear when surface and
when deep is truly required.

The ultimate goal, and one that is hard to realize, is transfer (see
Figure 1.5 on the next page). When students reach this level, learn-
ing has been accomplished. One challenge to this model is that most
assessments focus on surface-level learning because that level is easier
to evaluate. But, as David Coleman, president of the College Board, said
in his Los Angeles Unified presentation to administrators, test makers
have to assume responsibility for the practice their assessment inspires.
That applies to all of us. If the assessment focuses on recall, then a great
number of instructional minutes will be devoted to developing students’
ability to demonstrate “learning” that way.

As teachers, we are faced with a wide range of assessments used to evaluate
student achievement and teacher performance. But these come and
go. Teachers also make tests and should assume responsibility for the
practices that result from their own creations.
LEARNING DEFINED: THE THREE-PHASE MODEL

Transfer

Deep

Surface

Figure 1.5

During an English department meeting at our school in San Diego, a
group of teachers proposed a cumulative final exam. One of them said,
“It would be better to mirror the expectations in college if we used a
final exam as part of our grades.”

As the discussion continued, another teacher asked, “How many days do
you think we’ll spend reviewing for the final?”

The range of answers was one day to two weeks. The assessment would
change practice. Another said, “What about building transfer tasks for
students to complete so that they would know that they had mastered
the content for our courses? If we asked them to apply their knowledge
to new tasks, we’d know they learned it, right? And we wouldn’t spend
hours reviewing the past.”


The conversation continued, and this group of teachers made their decision.
Our point here is not to debate the merits of final exams, but rather
to focus on the levels of learning and the fact that teachers can choose
to engage students in deeper understanding. It’s within our power, as the
mind frames suggest, to do so.

In this book, we devote time to each level or phase of learning.
Importantly, there are teacher and student actions that work best at each
of these phases. For example, note-taking works well for surface-level
learning whereas repeated reading and close reading probably work better
for deep learning. A key point that we will make repeatedly is that teachers
have to understand the impact that they have on students, and choose
approaches that will maximize that impact. Mismatching an approach
with the level of learning expected will not create the desired impact.
What and when are equally important when it comes to instruction that
has an impact on learning.

General Literacy Learning Practices


Before we dive into the levels of learning as they relate to literacy, there
are three aspects of learning that transcend the three-phase model:

1. Challenge

2. Self-efficacy

3. Learning intentions with success criteria

These should be considered in each and every learning situation as they
are global factors that impact understanding. We explain each of these
in more detail below.

1. Challenge
The first of these global aspects is challenge. Students appreciate chal-
lenge. They expect to work hard to achieve success in school and life.
When tasks become too easy, students get bored. Similarly, when tasks
become too difficult, students get frustrated. There is a sweet spot for
learning, but the problem is that it differs for different students. There
is a Goldilocks notion of making a task not too easy or too hard but just
right. As Tomlinson (2005) noted,

Ensuring challenge is calibrated to the particular needs of a
learner at a particular time is one of the most essential roles of
the teacher and appears non-negotiable for student growth.
Our best understanding suggests that a student only learns
when work is moderately challenging that student, and where
there is assistance to help the student master what initially
seems out of reach. (pp. 163–164)

How, then, can literacy educators keep students challenged but not frus-
trated? There are several responses to this question, and our answer is
embedded in every chapter of this book. In part, we would respond that
the type of learning intention is important to maintain challenge.

Learning Intention: Surface, Deep, or Transfer

The teacher should know if students need surface-, deep-, or transfer-type
work—or what combination—while ensuring the parts are explicit for
the student. In this way, the teacher can maintain the challenge while
providing appropriate instructional supports. Showing students near the
beginning of a series of lessons what success at the end should look
like is among the more powerful things we can do to enhance learning.
There are many ways to do this—among them,

•• Showing them worked examples of an A, B, and C piece of work,
and discussing how they differ
•• Giving them the scoring rubrics at the outset and teaching them
what they mean
•• Sharing last year’s students’ work in the same series of lessons
•• Building a concept map with them up front to show the
interrelationships between the various parts they will learn about

—anything to help provide a coat hanger for students to know what
good enough is, what success looks like, how they will know when they
get there. Not showing this is like asking a high jumper to jump the bar
but not telling or showing him or her how high the bar is!

Student-to-Student Interaction

EFFECT SIZE FOR COOPERATIVE LEARNING = 0.42
EFFECT SIZE FOR PEER TUTORING = 0.55

In addition, we would note that schools should be filled with
student-to-student interaction. As one of the mind frames above suggests,
classrooms should be filled with dialogue rather than monologues.
We say this for several reasons, including the fact that no one gets good
at something he or she doesn’t do. If students aren’t using language—
speaking, listening, reading, and writing—they’re not likely to excel in
those areas. Further, as students work collaboratively and cooperatively,
the assigned tasks can be more complex because there are many minds
at work on solving the tasks. Of course, this requires clear expectations
for group work and instruction about how to work with others. But
the outcomes are worth it—students learn more deeply when they are
engaged in complex tasks that involve collaboration (they don’t necessarily
learn more from collaborating with others when the learning
focuses on surface-level content). Further, when students work together
in groups, they have an opportunity to engage in peer tutoring, which
has an effect size of 0.55.

Feedback

How else can we maintain challenge for each learner? Our third
response relates to feedback. When students are engaged in appropri-
ately challenging tasks, they are more likely to respond to feedback
because they need that information to continue growing and learn-
ing. Feedback focused on something that you already know does little
to change understanding. Feedback thrives on errors. For example,
Marco has a strong sense of English spelling. His writing is filled with
complex vocabulary terms that are spelled correctly. He understands
how to use resources to build this knowledge about words. Thus, feed-
back about the misspelling of the word acknowledge, which he spelled
“acknowlege” in his handwritten draft, is not likely to result in great
changes in his learning. Any spell-check program on a computer will
tell him he is wrong, and he can correct it. A better use of time might
be to focus on Marco’s use of clichés in his writing. A useful conversa-
tion with him could show him that the more familiar a term or phrase
becomes, the more often readers skip over it as they read, essentially
rendering the text ineffective.

What Makes a Task Challenging?

Unfortunately, some people confuse difficulty with complexity. We like to think of difficulty as the amount of effort or work a student is expected to put forth, whereas complexity is the level of thinking, the number of steps, or the abstractness of the task. We don’t believe that teachers can radically impact students’ learning by making them do a lot more work. We know that students learn more when they are engaged in deeper thinking. That’s not to say that difficulty is bad. We think of this in four quadrants (see Figure 1.6). The quadrant that includes low difficulty and low complexity is not unimportant. We think that note-taking fits into that quadrant. If that’s all students experience, learning isn’t likely to be robust. However, learning to take notes, and then engaging in study skills with those notes (which likely raises the complexity but not the difficulty), could impact learning. As part of each lesson, teachers should know the level of difficulty and complexity they are requiring of students. They can then make decisions about differentiation and instructional support, as well as feedback that will move learning forward.

2. Self-Efficacy
A second global consideration for literacy educators is students’
self-efficacy. Hattie (2012) defines self-efficacy as “the confidence or
strength of belief that we have in ourselves that we can make our
learning happen” (p. 45). He continues with descriptions of students
with high self-efficacy, noting that they

•• Understand complex tasks as challenges rather than trying to avoid them

•• Experience failures as opportunities to learn, which may require additional effort, information, support, time, and so on

•• Quickly recover a sense of confidence after setbacks

By contrast, students with low self-efficacy

•• Avoid complex and difficult tasks (as these are seen as personal threats)

•• Maintain weak commitment to goals

•• Experience failure as a personal deficiency

•• Slowly recover a sense of confidence after setbacks

DIFFICULTY AND COMPLEXITY

                              More Complex
  Low Difficulty / High Complexity      High Difficulty / High Complexity
Easy                                                                  Hard
  Low Difficulty / Low Complexity       High Difficulty / Low Complexity
                              Less Complex

Figure 1.6

It almost goes without saying that the impact of self-efficacy on learning is significant. Our emotions, the sense of failure, and our anxieties are often
invoked in our learning—or more often in our resistance to engage in learn-
ing. Building a sense of confidence that you can indeed attain the criteria of
success for the lessons may be a first critical step—without a sense of con-
fidence, we often do not open our ears to what we are being taught. Most
of us are more likely to engage in difficult, complex, or risky learning if we
know there is help nearby, that there are safety nets, that we will not be
ridiculed if we do not succeed—this is where the power of the teacher lies.

Students with high self-efficacy perform better and understand that their efforts can result in better learning. This becomes a self-fulfilling
prophecy: the rich get richer, and the poor get poorer. Students with
poor self-efficacy see each challenge and setback as evidence that they
aren’t learning, and in fact can’t learn, which reduces the likelihood that
they will rally the forces for the next task the teacher assigns.

In their study about ways to increase students’ self-efficacy, Mathisen and Bronnick (2009) suggested a combination of the following (each of
which is addressed later in this book in more detail):

•• Direct instruction with modeled examples

•• Verbal persuasion through introductory information

•• Feedback on attempts made by learners

•• Guided use of techniques on well-defined problems

•• Supervised use of techniques on self-generated problems

To this we add

•• Demonstrating your credibility by being fair to all

•• Being there to help students reach targets

•• Creating high levels of trust between yourself and the students and between students

•• Showing that you welcome errors as opportunities for learning

Others have made different recommendations (e.g., Linnenbrink & Pintrich, 2003), and our point here is not to endorse one approach
over another but rather to confirm that teachers can change students’
agency and identity such that self-efficacy, the “belief that we have in
ourselves that we can make our learning happen” (Hattie, 2012, p. 46),
is fostered.

3. Learning Intentions With Success Criteria


The third and final global aspect that should permeate literacy learning
relates to being explicit about the nature of learning that students are
expected to do and the level of success expected from the lesson. Teacher clarity about learning expectations, including the ways in which students can demonstrate their understanding, is powerful. The effect size is 0.75. Every lesson, irrespective of whether it focuses on surface, deep, or transfer, needs to have a clearly articulated learning intention and success criteria. We believe that students should be able to answer, and ask, these questions of each lesson:

1. What am I learning today?

2. Why am I learning this?

3. How will I know that I learned it?

The first question requires deep understanding of the learning intention. The second question begs for relevance, and the third question
focuses on the success criteria. Neglecting any of these questions com-
promises students’ learning. In fact, we argue that these questions com-
pose part of the Learner’s Bill of Rights. Given that teachers (and the
public at large) judge students based on their performance, it seems only
fair that students should know what they are expected to learn, why
they are learning that, and how success will be determined. The marks
teachers make on report cards and transcripts become part of the per-
manent record that follows students around. Those documents have the
power to change parents’ perceptions of their child, determine future
placements in school, and open college doors. And it works. Clearly articulating the goals for learning has an effect size of 0.50. It’s the right
thing to do, and it’s effective.

We’re not saying that it’s easy to identify learning intentions and success
criteria. Smith (2007) notes, “Writing learning intentions and success cri-
teria is not easy . . . because it forces us to ‘really, really think’ about what
we want the pupils to learn rather than simply accepting statements
handed on by others” (p. 14). We are saying that it’s worth the effort.

Learning intentions are more than a standard. There have been far too
many misguided efforts that mandated teachers to post the standard on
the wall. Learning intentions are based on the standard, but are chun-
ked into learning bites. In too many cases, the standards are not under-
standable to students. Learning intentions, if they are to be effective,
have to be understood and accepted by students. Simply writing a target on the dry-erase board and then reading it aloud waters down the power of a learning intention, which should focus the entire lesson and serve as an organizing feature of the learning students do. At minimum, learning intentions should bookend lessons with clear communication about the learning target. In addition, teachers can remind students of the learning intention at each transition point throughout the lesson. In this way, the learning intention drives the lesson, and students will develop a better understanding of how close they are to mastering the expectations. Most critically, the learning intention should demonstrably lead to the criteria of success—and if you had to use only one of these, we would recommend focusing on being more explicit about the success criteria. Both help, but the judgment about the standard of work desired is more important than explication about the particular tasks we ask students to do. It is the height of the bar, not the bar, that matters.

Video 1.3 Making Learning Visible Through Learning Intentions
http://resources.corwin.com/VL-Literacy

Figure 1.7 contains some poorly written learning intentions and some
improvements that teachers made collaboratively as they explored the value
of this approach. Note that the intentions became longer, more specific,
and more interesting. The improved versions invite students into learning.
Of course, learning intentions can be grouped. Sometimes an activity can
contribute to several learning intentions, and other times a learning inten-
tion requires several activities. However, when learning intentions spread
over many days, student interest will wane, and motivation will decrease.
When teachers plan a unit of study and clearly identify the learning inten-
tions required for mastery of the content, most times they can identify
daily targets. In doing so, they can also identify the success criteria, which
will allow for checking for understanding and targeted feedback.

SAMPLE LEARNING INTENTIONS

Grade K
Poor Example: Compare the experiences of characters in two stories.
Improved Version: Today, we’ll read two stories about city and country life. We’ll focus on comparing the lives of the two characters and the differences in their lives based on where they live.

Grade 5
Poor Example: Use technical language in the revisions of essays.
Improved Version: As we revise our opinion papers, we are going to learn how to update our word choices so that we use technical vocabulary like the authors we’ve been studying use.

Grade 7
Poor Example: Determine the central idea of a text.
Improved Version: Each group has a different article, and our learning today is going to focus on locating the central or controlling idea, the idea that the author uses to hold the entire text together.

Grade 11
Poor Example: Compare two texts for different themes.
Improved Version: Compare how two texts from the same point in U.S. history address a common theme and figure out what each author is trying to say in response to the theme.

Figure 1.7

The success criteria must be directly linked with learning intentions to have any impact. The success criteria describe how students will be expected to demonstrate their learning, based on the learning intention. That’s not to say that success criteria are just a culminating activity, but they can be. Consider the following ways that students might demonstrate success based on a learning intention that reads, “Analyze visual images presented in the text and determine how this information contributes to and clarifies information.”

•• Discuss with a partner the way the author used visuals and how
they helped you understand the text.

•• Identify one place in the text that was confusing and how one of
the visuals helped you understand that information.

•• In your annotations, make sure to include situations where the visual information helped you understand the text itself.

•• Create a visual that will help another person understand the words in the text.

All of these work, in different situations. Clarity is important here. What is it that students should be learning, and how will they know (not to
mention how will the teacher know) if they learned it? That’s the power
of learning intentions and success criteria.

Importantly, students can be involved in establishing the success criteria and, in many cases, the learning intentions. Teachers can ask their students, “How will you know you have learned this? What evidence could we accept that learning has occurred?” In these situations, students can share their thinking about the success criteria, and often they are more demanding of themselves than their teachers are. In a sixth-grade English class focused on learning to come to group discussions prepared, the students identified several ways that they would know if they met this expectation. Several suggested that they should have their learning materials with them when they moved into collaborative learning. Others added that they should have their notes and annotations updated and be ready to talk about their reading, rather than read while they are in the group. One student suggested that they should practice vocabulary before the group so that they would be ready. Another added that they should each know their role in the group so that they can get started right away. None of these answers were wrong; they were all useful in improving the collaborative learning time. In this case, the students established the success criteria and opened the door to feedback from their peers and the teacher in their successive approximations in demonstrating mastery of their learning.

Video 1.4 Making Success Criteria Visible in Fourth Grade
http://resources.corwin.com/VL-Literacy

Further, when students understand the success criteria, they can be most involved in assessing their own success, and their progression toward this success. A simple tool allows students to put sticky notes in one of four quadrants to communicate their status (see Figure 1.8). This alerts the teacher, and other students, about help that is needed. It mobilizes peer tutoring and cooperative versus competitive learning, as well as building student-centered teaching.

EFFECT SIZE FOR PEER TUTORING = 0.55

EFFECT SIZE FOR COOPERATIVE VERSUS COMPETITIVE LEARNING = 0.54

EFFECT SIZE FOR STUDENT-CENTERED TEACHING = 0.54

SAMPLE SELF-ASSESSMENT OF LEARNING

•• I do not yet understand. I need coaching.
•• I am starting to understand. I need coaching but want to try some on my own.
•• I understand! I make a few mistakes, so I’m working through those.
•• I understand very well. I can explain this to others without telling them the answers.

Figure 1.8

Template available for download at http://resources.corwin.com/VL-Literacy

Other times, the tools used to create the success criteria involve rubrics and checklists. For example, students in a high school language arts class were tasked with selecting a worthy cause, something that they cared passionately about and whose value they could explain to others. Students were encouraged to select topics that were personally relevant and to learn more about that topic. As part of the assignment, students wrote an analytic essay about their chosen topic. Another part of the project required that they develop a web page, a Facebook page, or another electronic way of communicating with a wider world about their cause. And still another part of their assignment required the development of an informational pamphlet that they could use to educate adults about the issue. Students selected a range of worthy causes, from Islamophobia to endangered animals to mental health. Figure 1.9 contains the checklist that the teachers used to communicate their expectations to students. Note that many of these are compliance-related items that will subsequently allow teachers, and students, to determine if the experience left a lasting impact. The teachers were aiming to tap into creativity and integrated curricular approaches. They were also looking for evidence of learning transfer, asking students to mobilize their literacy skills for a task they had not completed before.

EFFECT SIZE FOR CREATIVITY PROGRAMS ON ACHIEVEMENT = 0.65

EFFECT SIZE FOR INTEGRATED CURRICULA PROGRAMS = 0.39

Clearly articulating the success criteria allows errors to become more obvi-
ous. Errors should be expected and celebrated because they are oppor-
tunities for learning. If students are not making errors, they have likely
previously mastered the learning intention. Also note that feedback thrives
on the presence of errors. Errors should be the hallmark of learning—if we
are not making enough errors, we are not stretching ourselves; if we make
too many, we need more help to start in a different place. Unfortunately,
in too many classrooms, students who already know the content are privi-
leged, and students who make errors feel shame. In those situations, learn-
ing isn’t occurring for students who already know the content; they’ve
already learned it. But learning isn’t occurring for the students who make
errors because they hide their errors and avoid feedback. Classrooms have
to be safe places for errors to be recognized.

SAMPLE PROJECT CHECKLIST

Pamphlet Portion (each item has columns for Date Projected and Completed)

•• Cover has the title, image, and your name
•• Description of your cause (minimum 10 sentences)
•• List 3–5 important facts
•• Map of where this is occurring
•• Demographics of who/what is impacted
•• Minimum of 3 images in your brochure
•• Contact information (websites, telephone numbers)
•• Upcoming events (celebrations, day, movie, anniversary date, races, etc.)
•• Pamphlet is attractive and well organized
•• Correct spelling and grammar

Figure 1.9

Template available for download at http://resources.corwin.com/VL-Literacy

For example, a secondary science class was focused on reviewing the changes in climate—there were clipboards everywhere, with students
running around the school checking temperatures. They had great anal-
yses and stunning box and whisker plots. But when they were asked how
long they had been doing this task, they said three weeks (and that it was
fun). What a waste. Perfection is not necessarily the aim of lessons; the
presence of errors is a better indicator of a successful lesson, and surely hints to the teacher and student about the most likely place to go next.

When errors are celebrated and expected, feedback takes hold. Feedback has a powerful impact on student learning, with an effect size of 0.75, placing it in the top 10 influences on achievement. But it’s only when the feedback is received that it works. Giving feedback is different from receiving feedback. Feedback is designed to close the gap between students’ current level of understanding or performance and the expected level of performance, which we call the success criteria. For feedback to work, teachers have to understand
•• Students’ current level of performance

•• Students’ expected level of performance

•• Actions they can take to close the gap

Feedback, as Brookhart (2008) describes it, needs to be “just-in-time, just-for-me information delivered when and where it can do the most good” (p. 1). Figure 1.10 includes information about
the ways in which feedback can vary in terms of timing, amount, mode,
and audience. We’ll focus on feedback in greater depth in the chapter on
deep literacy learning (Chapter 3). For now, we hope you appreciate the
value of feedback in impacting student learning.

Conclusion
Teachers, we have choices. We can elect to use instructional routines and procedures that don’t work, or that don’t work for the intended purpose. Or we can embrace the evidence, update our classrooms, and impact student learning in wildly positive ways. We can choose to move beyond surface-level learning, while still honoring the importance of teaching students surface-level skills and strategies. We can extend students’ learning in deep ways and facilitate the transfer of their learning to new tasks, texts, and projects, if we want. We can design amazing lessons that mobilize the evidence and provide opportunities for students to learn. And we can decide to evaluate our impact, if we are brave enough.

Monica was lucky enough to transfer to a school that embraced Visible Learning for Literacy. Her teachers tried out the instructional ideas, monitored progress, and provided feedback to her and to each other. Monica went from a failing student, tracked in a class with low expectations, to a lead learner providing support for her peers. Impact has a face. It’s not an abstract idea or ideal. Together, we can impact the literacy learning of every student. Let’s make it so.

FEEDBACK STRATEGIES

Feedback strategies can vary in timing, amount, mode, and audience. For each, the ways they can vary are noted in parentheses, followed by recommendations for good feedback.

Timing (when given; how often)
•• Provide immediate feedback for knowledge of facts (right/wrong).
•• Delay feedback slightly for more comprehensive reviews of student thinking and processing.
•• Never delay feedback beyond when it would make a difference to students.
•• Provide feedback as often as is practical, for all major assignments.

Amount (how many points made; how much about each point)
•• Prioritize—pick the most important points.
•• Choose points that relate to major learning goals.
•• Consider the student’s developmental level.

Mode (oral; written; visual/demonstration)
•• Select the best mode for the message. Would a comment in passing the student’s desk suffice? Is a conference needed?
•• Interactive feedback (talking with the student) is best when possible.
•• Give written feedback on written work or on assignment cover sheets.
•• Use demonstration if “how to do something” is an issue or if the student needs an example.

Audience (individual; group/class)
•• Individual feedback says, “The teacher values my learning.”
•• Group/class feedback works if most of the class missed the same concept on an assignment, which presents an opportunity for reteaching.

Source: Brookhart (2008).

Figure 1.10

