Teaching of Algebra
Review of Educational Research
September 2010, Vol. 80, No. 3, pp. 372-400
DOI: 10.3102/0034654310374880
© 2010 AERA. http://rer.aera.net
Authors' Note: An earlier version of this article was presented at the 2009 annual meeting
of the American Educational Research Association, San Diego, CA.
What Is Algebra?
through several complex topics with multiple sources of difficulty. These topics
form the foundation for every advanced application of mathematics.
The unique challenges of algebra. Students beginning the study of algebra face
learning challenges that form a general foundational set of understandings neces-
sary to negotiate this topic. First, algebra is often the first course in which students
are asked to engage in abstract reasoning and problem solving (Vogel, 2008).
Researchers have demonstrated that the abstract nature of algebra increases its
difficulty over arithmetic (Carraher & Schliemann, 2007; Howe, 2005; Kieran,
1989). Students who have experienced mathematics only at a concrete or proce-
dural level, typical in many classrooms, must negotiate the difficult gap from con-
crete to abstract reasoning with no preparation (Freudenthal, 1983). This
inexperience with abstractness for the construction of meaning directly affects the
ability of students to manage multiple representations of algebraic objects (Kieran,
1992; Vogel, 2008).
Second, the learning of algebra requires students to learn a language of math-
ematical symbols that is also completely foreign to their previous experiences
(Kilpatrick, Swafford, & Findell, 2001). The multiple ways in which this language
is described and used during instruction often prevent students from connecting
algebraic symbols to their intended meaning (Blanco & Garrote, 2007; Socas
Robayna, 1997). In some cases, students are completely unaware that any meaning
was intended for the symbols (Küchemann, 1978). In other cases, they may know
that meaning exists, but limited understanding prevents them from ascribing mean-
ing to the symbols, or they may assign erroneous meaning to the symbols
(Küchemann, 1978). For example, as students study topics such as functions and
graphs, they begin to understand and interpret one set of algebraic objects in terms
of another (e.g., a function equation with its graph, a data set by its equation, a data
set by its graph, as in Leinhardt, Zaslavsky, & Stein, 1990). McDermott, Rosenquist,
and Van Zee (1987) found that students are generally able to plot points and equa-
tions; however, in spite of this procedural fluency, students still lack the ability to
extract meaning from graphical representations. They concluded that the difficulty
lies in the connection of a graph to the construct being represented. Specifically,
students are readily capable of demonstrating procedural fluency, but memory and
procedural understanding are unable to guide them through problems involving
interpretation (Skemp, 1976/2006).
Kieran (1992), Howe (2005), and Carraher and Schliemann (2007) recognized
that learning the structural characteristics of algebra creates a third challenge for
students. For example, students often fail to recognize the differences between
expressions and equations. They also have difficulty conceptualizing an equation
as a single object rather than a collection of objects. The meaning of equality is
often confused within algebra contexts as well. Taken together, these three exam-
ple structural challenges often prevent students from recognizing the utility of
algebra for generalizing numerical relationships.
These three foundational understandings, abstract reasoning, language acquisi-
tion, and mathematical structure, are often unique challenges for students. Although
each in itself can serve as a unique obstacle to learning, the interaction of all
three forms a much more formidable impediment to mastering algebra for many
students. As a result, these students have a poor initial experience with algebra and
therefore fail to gain an adequate foundation for future learning. For example,
consider the expression a + b: How students interpret the meaning of each variable
depends on how well they can handle the abstract nature of the symbols.
Furthermore, students must recognize that the expression a + b represents the total
number of items from a set of a and b items (Kieran, 1992).
The teaching methods used to convey content often exacerbate these algebra
learning barriers, possibly becoming a unique barrier themselves (Leitzel, 1989;
Thorpe, 1989). Sfard (1991) found that both students and teachers often expect
immediate rewards for teaching and learning efforts. Instead, Sfard noted that rela-
tional understanding of abstract mathematical ideas often requires a lengthy, itera-
tive process. Teaching methods that focus on skill or procedural levels of cognitive
demand fail to address these foundational understandings and therefore fall short
of providing students the tools necessary to find their way once they waver from a
scripted path. Kieran (1992) extended Sfard's findings to algebra. She suggested
that a great deal of time must be spent connecting algebra to arithmetic before
proceeding to the structural ideas of algebra. Instead, teachers often spend a short
period of time reviewing arithmetic and then proceed directly into a textbook
sequence of instruction, which is often insufficient for helping students under-
stand the abstract, structural concepts necessary for supporting the demonstrated
procedural activities in algebra (Kieran, 1992).
The difficulties of achieving competence in abstract reasoning, language acqui-
sition, and mathematical structure within the learning of algebra require teaching
strategies that purposefully target the needs of learners. For example, in recogni-
tion of the difficulties some students have learning algebra in isolation, cooperative
and collaborative learning (e.g., Slavin & Karweit, 1982) offers a relevant peda-
gogical option. For students struggling to connect abstract concepts with concrete
examples, mastery learning and problem-based learning may be appropriate
strategies.
The No Child Left Behind Act (2002) called for the use of research-based strat-
egies to help districts, schools, and educators choose the most appropriate pro-
grams and materials for their particular settings. Easton (2010) advanced this call
by proposing the development of better collaboration between researchers and
practitioners. Unfortunately, educators have been largely left to synthesize a broad
base of research on their own - a time-consuming task, for which most have lim-
ited training. The best tools available to educators are systematic reviews and
meta-analyses that provide convenient summaries of current research on a particu-
lar topic. In the case of algebra, the core of the high school mathematics curricu-
lum, little progress has been made by the mathematics education research
community to compile research on effective ways of improving instruction and
learning. In recognition of the major gap in focused literature on algebra instruc-
tion, the National Council of Teachers of Mathematics compiled its 70th yearbook
around topics in algebra instruction (Greenes & Rubenstein, 2008). This compre-
hensive guide focused on offering practical advice to those wishing to improve
algebra instruction at the classroom, school, and district levels. The yearbook
addressed a portion of the gap in algebra research by building a pedagogical
knowledge base for algebra. However, other important gaps still exist; specifically,
which instructional improvement strategies have been studied, how effective those
strategies have been, and the consistency of the evidence for the efficacy of those
strategies.
Existing analyses of mathematical interventions have focused on algebra only
as a subcomponent within a larger framework. For example, one analysis exam-
ined interventions in elementary mathematics education (e.g., Slavin & Lake,
2008). Other studies examined specific improvement strategies in mathematics
across all grade levels, such as technology use (e.g., Hartley, 1977), calculators
(e.g., Ellington, 2003, 2006), peer tutoring (e.g., Hartley, 1977; Lou, Abrami, &
d'Apollonia, 2001; Lou, Abrami, Spence, & Poulsen, 1996; Roscoe & Chi, 2007),
proximal zones of influence (e.g., Seidel & Shavelson, 2007), and structured
inquiry (e.g., Butler & Winne, 1995; Klauer & Phye, 2008; Pajares, 1996;
Rosenshine, Meister, & Chapman, 1996). However, none of these studies focused
on algebra instructional improvement and its effects on student achievement.
One meta-analysis did focus on interventions in algebra (Haas, 2005); however,
several key features of a systematic review were missing, limiting the usefulness
of that research to meet the needs of the mathematics research community (e.g.,
did not provide rationale for the inclusion criteria and did not take steps to estimate
and maximize the reliability of data extraction). Haas (2005) identified six instruc-
tional intervention categories: direct instruction, cooperative learning, communi-
cation and study skills, multiple representations, problem-based learning, and
technology. Basing recommendations solely on the point estimates of effect sizes,
he concluded that educators should focus on direct instruction, problem-based
learning, and multiple representations, noting that his categories served not so
much as approaches to teaching but as tools to be incorporated into any lesson.
Several methodological issues within Haas's (2005) meta-analysis make the
validity and interpretability of his results unclear (e.g., did not state how effect
sizes were computed and weighted, did not account for nonindependent observa-
tions, and did not investigate the possible effects of publication bias). For educators
interested in improving algebra instruction, perhaps the most problematic aspect
of the review is that it is unclear what rules were used to place interventions into
different instructional categories. For example, "Direct instruction is a teaching
method type that may encompass all the others
Algebra learning foci: Teaching for understanding. "There may be debate about
what mathematical content is most important to teach. But there is growing consensus
that whatever students learn, they should learn with understanding" (Hiebert et al.,
2000, p. 2). The framework described by Hiebert and Carpenter (1992) differenti-
ated between conceptual and procedural understanding. In their framework, pro-
cedural knowledge isolated from conceptual meaning can result in
misunderstandings similar to those described by Skemp (1976/2006) as resulting
from instrumental understanding.
Skemp (1976/2006) described instrumental understanding of mathematics as a
way of learning that forces students to rely on memorization and prescriptions.
Kieran (2007) agreed with Skemp's viewpoint of the limiting nature of instrumen-
tal mathematics. Even the manipulation of symbols, once considered primarily an
algorithmic process, has become recognized as fundamentally dependent on con-
cept meaning and connections (Kieran, 2007). To better illustrate the meaning of
instrumental understanding, Skemp gave the analogy of a person trying to navigate
through a new city. A person with an instrumental understanding of the city may
have a number of ways to get from Point A to Point B. The difficulty with this
understanding arises when the person deviates from the original course. In such a
case, the person gets lost. Instrumental understanding of algebra produces similar
results. For instance, students may learn a set of prescriptions for solving equations
of the form ax + b = c; when they encounter equations of the form ax + b = cx + d,
their prescriptions are unable to accommodate the new form.
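As an illustration (the specific numbers here are ours, not drawn from the cited studies), a memorized prescription such as "subtract b, then divide by a" handles the first form but breaks down on the second, which first requires collecting the variable terms on one side:

$$
3x + 2 = 11 \;\Rightarrow\; 3x = 9 \;\Rightarrow\; x = 3,
\qquad
3x + 2 = x + 10 \;\Rightarrow\; 2x = 8 \;\Rightarrow\; x = 4.
$$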
For example, an algebra class learning about quadratic functions might be asked
purely procedural questions such as, "Use the quadratic formula to solve x² +
6x + 8 = 0." In such a problem, students can find a solution without any understanding
of the meaning of quadratic functions simply by factoring or use of the quadratic
formula. Alternatively, a conceptually focused question might ask students to
graph y = x² + 6x + 8 and explain how the x-intercepts of the graph are related to
the factors of the equation. Such an approach requires students to make connec-
tions between the factors of the function (i.e., [x + 2] and [x + 4]), the roots of the
function (i.e., x = -2, x = -4), and the x-intercepts of the function (i.e., [-2, 0] and
[-4, 0]).
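The chain of connections that the conceptual question targets can be written in one line (a restatement of the relationships named above, not an addition to them):

$$
y = x^2 + 6x + 8 = (x + 2)(x + 4), \qquad y = 0 \iff x = -2 \text{ or } x = -4,
$$

so the graph crosses the x-axis at (-2, 0) and (-4, 0): each linear factor corresponds to a root, and each root to an x-intercept.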
Students who acquire only procedural knowledge often "get lost" when sub-
jected to unfamiliar situations and are unable to apply important mathematical
concepts and structure in those situations. The key to avoiding this and other pit-
falls, according to Hiebert and Carpenter (1992), Hiebert and Grouws (2007), and
Skemp (1976/2006), is to focus first and primarily on the meaning of important
mathematical ideas and the connections linking these ideas. Such a focus runs
counter to the traditional intuitions of educators:
Since the [conceptual] program uses a format that requires the student to do
more than memorize the formula to successfully answer a series of computa-
tional problems on that concept, students' academic performance should have
favored the [procedural] group. The data support the opposite, showing the
[conceptual] group outperforming the [procedural] group on all four unit
tests. (Peters, 1992, p. 94)
The NCTM principles, along with the National Research Council (Kilpatrick
et al., 2001), have recognized the importance of conceptual understanding and
called for an increased focus on central concepts and on integrating disparate parts.
Hiebert and Grouws (2007) described two observable features of a classroom
focusing on conceptual understanding: (a) Teaching attends explicitly to connec-
tions between facts, procedures, and ideas, and (b) students are allowed to struggle
with important mathematical concepts. Procedural fluency is supported in this
goal: In a conceptually based environment, procedures are learned as emergent
from connecting concepts (Rittle-Johnson & Alibali, 1999). This conceptual
understanding epistemology holds that focusing on conceptual knowl-
edge and relational understanding carries several benefits for students: Knowledge
becomes more adaptable to new tasks, learning becomes generative, and students
begin developing their own knowledge (Hiebert & Carpenter, 1992; Skemp,
1976/2006). Memory is enhanced, while at the same time, students need to memo-
rize less (Hiebert & Carpenter, 1992; Van De Walle, 2007). Furthermore, the build-
ing of relationships between concepts and procedures identifies the identical
elements (Thorndike, 1913) necessary for preexisting knowledge to transfer to new
knowledge (Hiebert & Carpenter, 1992).
Both conceptual and procedural epistemological viewpoints can dictate how a
teaching method is implemented. For example, manipulatives, whether virtual or
physical, can be used to build connections among ideas, as in Aburime (2007);
Cavanaugh, Gillan, Bosnick, Hess, and Scott (2008); and Suh and Moyer (2007).
Manipulatives can also be used to enhance skill proficiency, as in Goins (2001),
Goldsby (1994), and McClung (1998). The same pattern is also true for every other
category of instructional treatment.
The challenges faced by students learning algebra have led to steep declines in
student mathematics achievement. Traditional instructional practices have led stu-
dents to view mathematics as a set of disjointed algorithms. We therefore used the
framework of Hiebert and Carpenter (1992) to capture the way algebra interven-
tions have been used to address this critical component of mathematics learning.
The descriptions of practice offered by Hiebert and Grouws (2007) provided the
criteria for this differentiation. By examining both the type of intervention and the
epistemological focus of each intervention, the present study seeks to provide a
valuable synthesis for both researchers and practitioners.
Study Purpose
Although the 70th NCTM yearbook (Greenes & Rubenstein, 2008) advanced
teaching strategies and theories of teaching algebra, and other meta-analyses (e.g.,
Ellington, 2003, 2006; Slavin & Lake, 2008) have touched on algebra while exam-
ining other mathematics topics, the impact of instructional improvement in algebra
on student achievement remains largely unexplored. The purpose of the current
study is to fill this gap by addressing three questions through a systematic review
and meta-analysis of the literature on algebra instruction:
To answer the first two questions, studies were organized into categories of
instructional improvement methods. The effectiveness of these categories was
measured using standardized mean difference effect sizes. Moderator tests were
used to test for category differences in effect size variance. To answer the third
question, we examined the epistemological learning focus of the study interven-
tions, specifically whether the intervention sought to develop conceptual or proce-
dural understanding. Hiebert and Grouws (2007) suggested that the degree to
which learning is focused on developing conceptual understanding may determine
the effectiveness of a teaching intervention. The synthesis of evidence addressing
these three questions may offer important insight for practitioners and researchers
on improving the learning of algebra.
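Analyses of this kind are typically based on the standardized mean difference and a between-groups homogeneity test (Lipsey & Wilson, 2001); the following standard formulas are given for reference rather than as the exact computations used in this review:

$$
d = \frac{\bar{X}_T - \bar{X}_C}{s_{pooled}}, \qquad
s_{pooled} = \sqrt{\frac{(n_T - 1)s_T^2 + (n_C - 1)s_C^2}{n_T + n_C - 2}}, \qquad
v_d \approx \frac{n_T + n_C}{n_T n_C} + \frac{d^2}{2(n_T + n_C)},
$$

with category differences assessed by $Q_B = \sum_j w_j (\bar{d}_j - \bar{d})^2$, where $\bar{d}_j$ is the weighted mean effect for category $j$, $w_j$ is the reciprocal of its variance, $\bar{d}$ is the weighted grand mean, and $Q_B$ is referred to a chi-square distribution with (number of categories − 1) degrees of freedom.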
Method
2000; Wiesner, 1989) rather than achievement. Third, the study had to employ an
experimental design with a comparison group. We included quasi-experimental
designs along with randomized experiments to maximize statistical power and external
validity. We excluded, however, observational studies (e.g., Malloy & Malloy,
1998) and exploratory studies with no treatment (e.g., Berenson, Carter, &
Norwood, 1992). Fourth, the comparison group had to have received the "usual
instruction." We therefore excluded, for example, studies that compared the effec-
tiveness of two novel treatments but did not include a usual instruction group (e.g.,
Woolner, 2004). Taken together, the second, third, and fourth criteria addressed the
construct validity of the study (Shadish, Cook, & Campbell, 2001). Based on these
inclusion criteria, we interpreted effect sizes as the magnitude of the impact of
pedagogical strategies on student algebra achievement.
We chose to refrain from setting date limitations on our sample, which included
studies from 1968 to 2008. The inclusion of studies more than 40 years old was
both acceptable and desirable for the purposes of this analysis for two reasons.
First, the differentiation between conceptual and procedural understanding can be
traced back at least as far as Brownell's (1938) statement regarding the correction
of "errors in understanding and computation" (p. 498). Second, the traditional
methods of the early and middle 20th century continue today. Welch (1978)
described the typical mathematics classroom as following a rote procedure that
focused solely on solving a high number of homework problems. Likewise, a
decade later, Stodolsky (1988) claimed that in the United States, "most instruction
is geared to algorithmic learning" (p. 7). Another decade later, the purpose of
mathematics lessons had changed little (Stigler & Hiebert, 1997, p. 18), and evi-
dence suggests that these patterns continue today (Hiebert & Grouws, 2007).
Based on these reported trends, we concluded that an arbitrary date limitation
would reduce the ability of our sample to represent teaching method interventions
focusing on procedural understanding.
Coding Studies
The coding of studies took place in four stages. All studies were coded by the
first author with a second coder examining a random sample at each step to meas-
ure interrater agreement. First, the titles and abstracts of electronic search results
were scanned; those that were clearly not related to mathematics (e.g., studies that
examined chemistry or physics teaching) were excluded. We identified 594 poten-
tially relevant studies. Second, upon completion of the electronic search, a judg-
ment was made about the likely relevance of the studies based on a reading of titles
and abstracts. Studies were considered not relevant for this review if they clearly
did not meet the aforementioned inclusion criteria; if relevance could not be deter-
mined from their titles and abstracts, the studies were obtained and reviewed. Upon
completion of the second round of coding, we retained 82 relevant studies as meet-
ing our inclusion criteria. Third, the number of independent samples within each
study was identified, followed by the recording of the appropriate student achieve-
ment means, standard deviations, and study characteristics such as study descrip-
tors (e.g., author, date of publication), the sample (e.g., age or grade level,
ethnicity), the intervention (e.g., the specific instructional improvement strategy
used), the measure used (e.g., properties of tests), and the results (e.g., effect sizes).
Fourth, instructional strategies were analyzed qualitatively to determine categories
of instructional treatment.
Interrater reliability was measured for Stages 2, 3, and 4 of the coding process
using a random sample of 65 studies from the original 594 identified studies. At
Stage 2, we focused on initial inclusion or exclusion for each study. At this stage,
we agreed initially on the inclusion status of 53 studies (88.3%). If either rater
thought a study might be relevant, it was included in Stage 3. As a result, 36 stud-
ies were retained in the reliability subsample. At Stage 3, relevance of these studies
was determined through an analysis of the full text, and we agreed initially on the
continuing inclusion status of 34 studies (94%). Based on this analysis, 12 studies
were retained for Stage 4. At this stage, we coded study characteristics (e.g.,
instructional strategy categories, number of independent samples within the study).
We agreed initially on 95% of the study characteristics; however, the disagree-
ments were not random. We found, instead, that most disagreement centered on
coding the instructional treatment category. For this study characteristic, initial
agreement measured approximately 45%. We deemed this level of reliability too
low to rely on any single judgment, so we proceeded to code the instructional treat-
ment category for all 82 studies with a panel of three mathematics education
researchers. This panel agreed that five categories represented the observed inter-
ventions independently: instructional strategies, manipulatives, technology tools,
technology-based curriculum, and nontechnology curriculum.
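As a minimal sketch of the agreement computations described above: percentage agreement is the statistic the review reports, and Cohen's kappa is included here only as a common chance-corrected companion, not as something computed in the original. The coding decisions shown are hypothetical.

```python
def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two coders assigned the same code."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)


def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement for categorical codes (illustrative addition)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    # Expected agreement if the coders' marginal code distributions were independent.
    categories = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)


# Hypothetical include/exclude decisions for part of a reliability subsample.
coder_1 = ["include", "exclude", "include", "include", "exclude"]
coder_2 = ["include", "exclude", "exclude", "include", "exclude"]
print(percent_agreement(coder_1, coder_2))  # 0.8
print(cohens_kappa(coder_1, coder_2))       # about 0.62
```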
Instructional strategies consisted of teaching methods such as cooperative
learning, mastery learning, multiple representations, and assessment strategies. In
Slavin and Karweit (1982), student teams and mastery learning were used to
address limitations in group-paced algebra instruction. Ives (2007) examined the
use of graphic organizers to clarify the meaning of algebra problems. Wineland
and Stephens (1995) investigated the effects of a spiral testing strategy for improv-
ing student achievement.
Goins (2001) defined manipulatives as "concrete objects that are used to help
students understand a concept" (p. 10). In her study, rectangular tiles were used to
help students develop polynomial multiplication skills and to develop understand-
ing of the meaning of polynomial multiplication. Aburime (2007) investigated the
use of cardboard geometric cutouts to represent shapes such as triangles, quadri-
laterals, pentagons, hexagons, circles, cubes, and cylinders.
Technology tools included calculators, graphing calculators, computer pro-
grams, and Java applets. For example, Durmus (1999) investigated the use of
graphing calculators as a method for carrying out computations and checking solu-
tions. K. B. Smith and Shotsberger (1997) focused instead on the use of graphing
calculators for changing the way students approach problem solving. Suh and
Moyer (2007) and Cavanaugh et al. (2008) examined the use of Java applets as a
substitute for physical manipulatives to learn algebraic concepts such as balancing
equations.
Technology-based curricula included computer-based curricula for use in on-
site classes, online courses, and tutoring curricula. For example, Koedinger,
Anderson, Hadley, and Mark (1997); Morgan and Ritter (2002); and Shneyderman
(2001) examined the use of the Cognitive Tutor as a way of redesigning algebra
instruction in on-site classes. Weems (2002) and O'Dwyer, Carey, and Kleiman
(2007) compared online and on-site course effectiveness for algebra learning.
Hannafin and Foshay (2008) examined the impact of the PLATO learning system
as a way of teaching algebra.
Nontechnology curricula included reform-based curricula such as Math
Thematics (Reys et al., 2003), Connected Mathematics (Reys et al., 2003), UCSMP
Algebra 1 (Thompson & Senk, 2001), and a researcher-developed curriculum
based on NCTM principles and standards (McCaffrey, Hamilton, & Stecher,
2001). This category also included traditional curricula such as Saxon (e.g.,
Johnson & Smith, 1987; Lawrence, 1992; McBee, 1984) and CORD Algebra 1
(Keif, 1995).
We referred to the theoretical framework of Hiebert and Grouws (2007) to dif-
ferentiate between instructional strategies that focused on conceptual understand-
ing or procedural understanding. In their review of research, they described
conceptual understanding as beginning with the "engagement of students in strug-
gling with important mathematics" (p. 391). Going into more detail, they described
conceptual instruction as paying "explicit attention to connections among ideas,
382
This content downloaded from 121.52.158.245 on Sun, 06 Nov 2016 16:25:30 UTC
All use subject to http://about.jstor.org/terms
Algebra Instructional Improvement
facts, and procedures" and "posing problems that require making connections and
then working out these problems in ways that make the connections visible for
students" (p. 391). Both conceptual and procedural foci often appeared within each
intervention category. For example, the concrete-representational-abstract method
of instruction concentrated on skill development in one study (Konold, 2004),
while in another study (Witzel, Mercer, & Miller, 2003), this same type of inter-
vention was used to give explicit attention to developing meaning and connections
between mathematical ideas.
Determining the appropriate epistemological foundation of the intervention
required more than a cursory reading of the full text because studies within both
groups often used the same terminology to mean two different ideas. For example,
most studies referred to standards or improving algebra instruction over traditional
instruction. In some studies, standards-based meant an explicit focus on connect-
ing the meaning of ideas while in other studies standards-based referred to adher-
ence to topics listed within a state or national standards document. For example,
the term standards-based was used to imply a conceptual focus for a computer
program titled The Learning Equation (TLE; Walker & Senger, 2007). The TLE
software, however, focuses on problem-solving heuristics. For example, in a lesson
meant to differentiate between functions and relations, the lesson focused on
developing a heuristic for parsing out relationships rather than the meaning of the
ideas. In another case, Ives (2007) used the term mathematical concepts repeat-
edly, but the intervention focused specifically on a set of prescriptive heuristics for
solving linear equations. In short, "There are two effectively different subjects
being taught under the same name, 'mathematics'" (Skemp, 1976/2006, p. 6). In
this sample, we determined that the interventions of 25 studies (approximately
30%) focused on the development of conceptual understanding.
Approximately 12% of the 82 sample studies were randomly selected to mea-
sure interrater reliability for the coding of the intervention epistemological focus.
Initial agreement (80%) indicated a high degree of reliability. Furthermore, the
resultant groups demonstrated a high degree of discriminant validity, measured by
examining the correlations (r = .054, p = .378) between the groups, as recom-
mended by Furr and Bacharach (2008) and Urbina (2004).
Independence of effect sizes. The unit of analysis for this review was the indepen-
dent sample. In most of the studies that met our inclusion criteria, more than one
effect size was obtainable for a sample of students due to multiple subscales on a
single test or multiple tests. For example, some researchers measured the same
construct multiple ways (e.g., two versions of an assessment) or at multiple times
(e.g., at posttest and at follow-up). Other researchers employed multiple treatment
groups (e.g., by comparing two different teaching strategies to instruction as usual)
or multiple comparison groups (e.g., by comparing a treatment group to multiple
control groups). Ignoring this dependence can result in a study having too much
weight in an analysis. In those cases in which the effect sizes would not be inde-
pendent, an average effect size was calculated in order to ensure independence of
data in the final data set (Lipsey & Wilson, 2001). In some cases, the samples of
various subscales and assessments overlapped but differed by a few students,
so that the sample sizes of each dependent effect size varied slightly. In these cases,
a weighted average effect size was calculated, and the final sample size used was
an average of the sample sizes. For example, Coppen (1976) implemented a treat-
ment known as Individual Mastery Instruction (IMI) in a single class. The achieve-
ment scores for that class were then compared to three classes receiving instruction
as usual. Averaging the observed effects resulted in a single average effect size for
the sample.
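A minimal sketch of this averaging step, assuming the standard large-sample variance formula for a standardized mean difference (Lipsey & Wilson, 2001); the effect sizes and group sizes below are hypothetical, not values from Coppen (1976).

```python
def smd_variance(d, n_t, n_c):
    """Approximate sampling variance of a standardized mean difference."""
    return (n_t + n_c) / (n_t * n_c) + d ** 2 / (2 * (n_t + n_c))


def average_dependent_effects(effects):
    """Collapse several dependent effect sizes from one sample of students into a
    single inverse-variance weighted average, with averaged group sizes."""
    weights = [1.0 / smd_variance(d, n_t, n_c) for d, n_t, n_c in effects]
    d_bar = sum(w * d for w, (d, _, _) in zip(weights, effects)) / sum(weights)
    n_t_bar = sum(n_t for _, n_t, _ in effects) / len(effects)
    n_c_bar = sum(n_c for _, _, n_c in effects) / len(effects)
    return d_bar, n_t_bar, n_c_bar


# Three overlapping subscale effects: one treatment class vs. comparison classes.
effects = [(0.35, 28, 80), (0.20, 27, 78), (0.42, 28, 81)]
print(average_dependent_effects(effects))
```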
In other studies, different samples were studied with no overlap (e.g., 2 years,
two different samples); for these studies, each effect size was independent and thus
both were included in the meta-analysis (Lipsey & Wilson, 2001). For example,
Coppen's (1976) study continued into a second year in which the design was
repeated with an entirely new sample of students. This repetition resulted in a
second effect size in the meta-analysis.
Such handling of multiple effect sizes does not, however, preclude within-study
clustering effects on the reported means and errors (Raudenbush & Bryk, 2002).
On the contrary, this method for handling multiple effect sizes only assures that
each effect size represents a distinct group of students; it does nothing to address
the dependence of students within a group. Because every study within our sample
was grouped by class rather than student, this within-group design effect (Kish,
1965) needed to be addressed through statistical adjustments to the computed
effect size to avoid spuriously small standard errors.
Data Clustering
We adjusted for the within-study dependence through two methods to minimize
potential Type I error. First, we computed a design effect (Kish, 1965) with intra-
class correlations provided by Hedges and Hedberg (2007). The design effect was
used to adjust the standard error for clustering, thereby reducing Type I error.
Second, we computed an empirical Bayes (EB) estimate, adjusting both the effect
size δ* (Equation 1) and its standard error using estimates from hierarchical linear
modeling (HLM) procedures (Raudenbush & Bryk, 2002; Raudenbush, Bryk,
Cheong, Congdon, & du Toit, 2004).
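A minimal sketch of both adjustments, assuming the usual form of the Kish (1965) design effect and a standard empirical Bayes (random-effects) shrinkage toward the overall mean; the intraclass correlation, cluster size, and variance values are placeholders rather than the estimates used in this review.

```python
import math


def cluster_adjusted_se(se, avg_cluster_size, icc):
    """Inflate a standard error by the square root of the Kish design effect,
    DEFF = 1 + (m - 1) * ICC, for average cluster (class) size m."""
    deff = 1 + (avg_cluster_size - 1) * icc
    return se * math.sqrt(deff)


def empirical_bayes_estimate(d_i, v_i, d_grand, tau_sq):
    """Shrink one study's effect size toward the overall mean, weighting by the
    ratio of between-study variance to total variance (its reliability)."""
    reliability = tau_sq / (tau_sq + v_i)
    d_star = reliability * d_i + (1 - reliability) * d_grand
    se_star = math.sqrt(reliability * v_i)  # conditional posterior standard deviation
    return d_star, se_star


# Placeholder inputs: class size 25 and ICC = 0.22 (cf. Hedges & Hedberg, 2007).
print(cluster_adjusted_se(se=0.15, avg_cluster_size=25, icc=0.22))
# Placeholder study: d = 0.45 with variance 0.04, grand mean 0.30, tau^2 = 0.02.
print(empirical_bayes_estimate(d_i=0.45, v_i=0.04, d_grand=0.30, tau_sq=0.02))
```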
TABLE 1
Weighted average effect sizes for intervention categories

                        Empirical Bayes adjusted        Design effect adjusted
                        fixed effects weighted          random effects weighted
Category                average, δ*        SE           average, d         SE
Curricula               0.207              0.024*       0.404              0.115*
Instructional change    0.322              0.030*       0.349              0.070*
Manipulatives           0.318              0.089*       0.335              0.132*
Technology tools        0.304              0.046*       0.165              0.073*
Technology curricula    0.311              0.050*       0.151              0.305*

*p < .05.
Results
Literature Search
TABLE 2
Pairwise category moderator tests
*p < .05. **p < .01. ***p < .001.
TABLE 3
Weighted average effect sizes for epistemological emphases

                                Empirical Bayes adjusted        Design effect adjusted
Interventions focused on        fixed effects weighted          random effects weighted
the development of:             average, δ*        SE           average, d         SE
Conceptual understanding        0.232              0.023*       0.467              0.099*
Procedural understanding        0.301              0.023*       0.214              0.044*

*p < .05.
FIGURE 1. Histogram of effect sizes for conceptual and procedural studies.
The present study began by seeking the most useful way to categorize research
on instructional methods for improving student achievement in algebra. We found
that five categories captured the breadth of interventions used to improve student
achievement in algebra: implementation of new curricula, technology-based cur-
ricula, instructional strategies, manipulatives, and technology tools. The analyses
of these categories resulted in five key findings:
FIGURE 2. Shrinkage of variability in empirical Bayes (EB) estimates.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theo-
retical synthesis. Review of Educational Research, 65, 245-282.
Carraher, D. W., & Schliemann, A. D. (2007). Early algebra. In F. K. Lester (Ed.),
Second handbook of research on mathematics teaching and learning (pp. 669-706).
Reston, VA: National Council of Teachers of Mathematics.
Carroll, W. M. (1992, April). The use of worked examples in teaching algebra. Paper
presented at the annual meeting of the American Educational Research Association,
San Francisco, CA.
*Carroll, W. M. (1994). Using worked examples as an instructional support in the
algebra classroom. Journal of Educational Psychology, 86, 360-367.
*Cavanaugh, C., Gillan, K. J., Bosnick, J., Hess, M., & Scott, H. (2008). Effectiveness
of interactive online algebra learning tools. Journal of Educational Computing
Research, 38, 67-95.
Chambers, D. L. (1994). The right algebra for all. Educational Leadership, 53, 85-86.
Chirwa, A. S. (1996). Computers stimulate the generation of questions among stu-
dents. Dissertation Abstracts International-A, 57(08), 3464.
*Chung, G. K. W. K., Delacruz, G. C., Dionne, G. B., Baker, E. L., Lee, J., &
Osmundson, E. (2007, April). Towards individualized instruction with technology-
enabled tools and methods. Paper presented at the annual meeting of the American
Educational Research Association, Chicago, IL.
*Clay, D. W. (1998). A study to determine the effects of a non-traditional approach to
algebra instruction on student achievement. Dissertation Abstracts International-A,
31(06), 3478.
Cooper, H. (1998). Synthesizing research. Thousand Oaks, CA: Sage.
*Coppen, M. J. (1976). The development and evaluation of an individualized mastery
programme in mathematics for low-ability secondary school students. New Zealand
Journal of Educational Studies, 11, 132-142.
*Denson, P. S. (1989). A comparison of the effectiveness of the Saxon and Dolciani
texts and theories about the teaching of high school algebra. Dissertation Abstracts
International-A, 50(10), 3173.
*Durmus, S. (1999). The effects of the use of the technology on college algebra students'
achievements and attitudes toward mathematics: A constructivist approach.
Dissertation Abstracts International-A, 60(10), 3622.
Duval, S., & Tweedie, R. (2000a). A nonparametric "trim and fill" method of accounting
for publication bias in meta-analysis. Journal of the American Statistical Association,
95, 89-98.
Duval, S., & Tweedie, R. (2000b). Trim and fill: A simple funnel-plot-based method of
testing and adjusting for publication bias in meta-analysis. Biometrics, 56, 455-463.
Easton, J. Q. (2010, May). Out of the tower, into the schools: How new IES goals will
reshape researcher roles. Presidential session presented at the annual meeting of the
American Educational Research Association, Denver, CO.
Edwards, T. G. (2000). Some 'big ideas' of algebra in the middle grades. Mathematics
Teaching in the Middle School, 6, 26-31.
Ellington, A. J. (2003). A meta-analysis of the effects of calculators on students'
achievement and attitude levels in precollege mathematics classes. Journal for
Research in Mathematics Education, 34, 433-463.
Ellington, A. J. (2005). A modeling-based college algebra course and its effect on
student achievement. Primus, 15, 193-214.
Ellington, A. J. (2006). The effects of non-CAS graphing calculators on student
achievement and attitude levels in mathematics: A meta-analysis. International
Journal of Instructional Media, 106, 16-26.
392
This content downloaded from 121.52.158.245 on Sun, 06 Nov 2016 16:25:30 UTC
All use subject to http://about.jstor.org/terms
Algebra Instructional Improvement
Greenes, C. E., & Rubenstein, R. (Eds.). (2008). Algebra and algebraic thinking in
school mathematics: Seventieth yearbook. Reston, VA: National Council of Teachers
of Mathematics.
Haas, M. S. (2005). Teaching methods for secondary algebra: A meta-analysis of find-
ings. National Association of Secondary School Principals Bulletin, 89, 24-46.
*Hannafin, R. D., & Foshay, W. R. (2008). Computer-based instruction's (CBI) redis-
covered role in K-12: An evaluation case study of one high school's use of CBI to
improve pass rates on high-stakes tests. Educational Technology Research and
Development, 56, 147-160.
Hartley, S. S. (1977). Meta-analysis of the effects of individually paced instruction in
mathematics. Dissertation Abstracts International, 38(01), 4003.
*Haver, W. E. (1978). Developing skills in college algebra - A mastery approach. The
Two-Year College Mathematics Journal, 9, 282-287.
*Hawkins, V. J. (1982). A comparison of two methods of instruction, a saturated learning
environment and traditional learning environment: Its effects on achievement and
retention among female adolescents in first-year algebra. Dissertation Abstracts
International-A, 43(02), 416.
*Heath, R. D. (1987). The effect of calculators and computers on problem-solving
ability and attitude toward mathematics. Dissertation Abstracts International-A,
48(05), 1102.
*Hecht, J. B., Roberts, N. K., Schoon, P. L., & Fansler, G. (1995, October). Teacher
teams and computer technology. Paper presented at the annual meeting of the
Midwestern Educational Research Association, Chicago, IL.
Hedges, L. V., & Hedberg, E. C. (2007). Intraclass correlation values for planning
group-randomized trials in education. Educational Evaluation and Policy Analysis,
29, 60-87.
Henderson, R. W., & Landesman, E. M. (1995). Effects of thematically integrated
mathematics instruction on students of Mexican descent. Journal of Educational
Research, 88, 290-300.
Hiebert, J. (2003). What research says about the NCTM standards. In J. Kilpatrick,
W. G. Martin, & D. Schifter (Eds.), A research companion to "Principles and stan-
dards for school mathematics" (pp. 5-23). Reston, VA: National Council of Teachers
of Mathematics.
Hiebert, J., & Carpenter, T. P. (1992). Learning and teaching with understanding.
In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning
(pp. 65-100). Reston, VA: National Council of Teachers of Mathematics.
Hiebert, J., & Grouws, D. A. (2007). The effects of classroom mathematics teaching
on students' learning. In F. K. Lester (Ed.), Second handbook of research on math-
ematics teaching and learning (pp. 371-404). Reston, VA: National Council of
Teachers of Mathematics.
Hiebert, J., Carpenter, T. P., Fennema, E., Fuson, K. C., Wearne, D., Murray, H., et al.
(1997). Making sense: Teaching and learning mathematics with understanding.
Portsmouth, NH: Heinemann.
Hiebert, J., Morris, A. K., Berk, D., & Jansen, A. (2007). Preparing teachers to learn
from teaching. Journal of Teacher Education, 58, 47-61.
Hiebert, J., Stigler, J. W., Jacobs, J. K., Givvin, K. B., Garnier, H., Smith, M., . . .
Gallimore, R. (2005). Mathematics teaching in the United States today (and tomor-
row): Results from the TIMSS 1999 video study. Educational Evaluation and Policy
Analysis, 27, 111-132.
*Hodo, L. B. (1989). The effects of study skills instruction on achievement and usage
of selected study strategies in Algebra II classes. Dissertation Abstracts
International-A, 50(04), 862.
Howe, R. (2005). Comments on NAEP algebra problems. Retrieved from http://www.
brookings.edu/brown/~/media/Files/Centers/bcep/AlgebraicReasoningConference
Howe.pdf.
*Hutchinson, N., & Hemingway, P. (1987, April). Teaching representation and solu-
tion of algebraic word problems to learning disabled adolescents. Paper presented
at the annual meeting of the American Educational Research Association,
Washington, DC.
*Kalyuga, S., & Sweller, J. (2005). Rapid dynamic assessment of expertise to improve
the efficiency of adaptive e-learning. Educational Technology Research and
Development, 53, 83-93.
*Keif, M. G. (1995). Performance of students completing courses using the CORD
Applied Mathematics curriculum in four Missouri school districts. Dissertation
Abstracts International-A, 57(09), 3863.
Kieran, C. (1989). The early learning of algebra: A structural perspective. In S. Wagner
& C. Kieran (Eds.), Research issues in the learning and teaching of algebra (pp. 33-56).
Hillsdale, NJ: Lawrence Erlbaum.
Kieran, C. (1992). The learning and teaching of school algebra. In D. A. Grouws (Ed.),
Handbook of research on mathematics teaching and learning (pp. 390-419). Reston,
VA: National Council of Teachers of Mathematics.
Kieran, C. (2007). Learning and teaching of algebra at the middle school through col-
lege levels: Building meaning for symbols and their manipulation. In F. K. Lester
(Ed.), Second handbook of research on mathematics teaching and learning
(pp. 707-762). Reston, VA: National Council of Teachers of Mathematics.
Kieran, C. (2008). What do students struggle with when first introduced to algebra
symbols? Retrieved from http://www.nctm.org/uploadedFiles/Research_News_
and_Advocacy/Research/Clips_and_Briefs/Brief%20-%20What%20Can%20
We%20Learn.pdf.
*Kieren, T. E. (1968). The computer as a teaching aid for eleventh grade mathematics:
A comparison study. Dissertation Abstracts International-A, 29(10), 3526.
Kilpatrick, J., Swafford, J., & Findell, B. (Eds.). (2001). Adding it up: Helping children
learn mathematics. Washington, DC: National Academy Press.
*Kim, T. (2006). Impact of inquiry-based teaching on student mathematics achievement
and attitude. Dissertation Abstracts International-A, 67(04).
Kish, L. (1965). Survey sampling. New York, NY: John Wiley.
Klauer, K. J., & Phye, G. D. (2008). Inductive reasoning: A training approach. Review
of Educational Research, 78, 85-123.
*Koedinger, K. R., Anderson, J. R., Hadley, W. H., & Mark, M. A. (1997). Intelligent
tutoring goes to school in the big city. International Journal of Artificial Intelligence
in Education, 8, 30-43.
*Konold, K. B. (2004). Using the concrete-representational-abstract teaching sequence
to increase algebra problem-solving skills. Dissertation Abstracts International-A,
65(08), 2949.
Küchemann, D. (1978). Children's understanding of numerical variables. Mathematics
in School, 9, 23-26.
*Lawrence, L. K. (1992). The long-term effects of an incremental development model
of instruction upon student achievement and student attitude toward mathematics.
Dissertation Abstracts International-A, 53(03), 747.
Lee, J., Grigg, W., & Dion, G. (2007). The nation's report card: Mathematics
2007 -National Assessment of Educational Progress at grades 4 and 8 [NCES 2007-
494]. Washington, DC: National Center for Education Statistics.
Leinhardt, G., Zaslavsky, O., & Stein, M. K. (1990). Functions, graphs, and graphing:
Tasks, learning, and teaching. Review of Educational Research, 60, 1-64.
Leitzel, J. R. (1989). Critical considerations for the future of algebra instruction or a
reaction to: "Algebra: What should we teach and how should we teach it?" In
S. Wagner, & C. Kieran (Eds.), Research issues in the learning and teaching of
algebra (pp. 25-27). Hillsdale, NJ: Lawrence Erlbaum.
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks, CA: Sage.
Lou, Y., Abrami, P. C., & d'Apollonia, S. (2001). Small group and individual learning
with technology: A meta-analysis. Review of Educational Research, 71, 449-521.
Lou, Y., Abrami, P. C., Spence, J. C., & Poulsen, C. (1996). Within-class grouping:
A meta-analysis. Review of Educational Research, 66, 423-458.
*Lwo, L. S. (1992). Effects of individualized examples and personalized contexts in
computer-based adaptive teaching of algebra word problems. Dissertation Abstracts
International-A, 53(06), 1832.
*Makanong, A. (2000). The effects of constructivist approaches on ninth grade algebra
achievement in Thailand secondary school students. Dissertation Abstracts
International-A, 67(03), 923.
Malloy, C. E., & Malloy, W. W. (1998). Resiliency and Algebra 1: A promising non-
traditional approach to teaching low-achieving students. Clearing House, 71, 314-317.
*Mathews, S. M. (1997). The effect of using two variables when there are two unknowns
in solving algebraic word problems. Mathematics Education Research Journal, 9,
122-135.
*McBee, M. (1984). Dolciani vs. Saxon: A comparison of two Algebra I textbooks with
high school students. Oklahoma City, OK: Oklahoma City Public Schools, Oklahoma
Department of Planning, Research, and Evaluation. (ERIC Document Reproduction
Service No. ED241348)
*McCaffrey, D. F., Hamilton, L. S., & Stecher, B. M. (2001). Interactions among
instructional practices, curriculum, and student achievement: The case of standards-
based high school mathematics. Journal for Research in Mathematics Education,
32, 483-517.
*McClung, L. W. (1998). A study on the use of manipulatives and their effect on student
achievement in a high school Algebra I class. Unpublished master's thesis, Salem-
Teikyo University, Salem, WV. (ERIC Document Reproduction Service No.
ED425077)
McDermott, L. C., Rosenquist, M. L., & Van Zee, E. H. (1987). Student difficulties in
connecting graphs and physics: Examples from kinematics. American Journal of
Physics, 55, 503-513.
*McKenzie, S. Y. (1999). Achievement and affective domains of Algebra I students in
traditional or self-paced computer programs. Dissertation Abstracts International-A,
60(09), 3297.
Morgan, P., & Ritter, S. (2002). An experimental study of the effects of Cognitive Tutor
Algebra I on student knowledge and attitude. Retrieved from http://www.carnegielearning.com/web_docs/morgan_ritter_2002.pdf
National Council of Teachers of Mathematics. (1989). Curriculum and evaluation
standards for school mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (2000). Principles and standards for
school mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (2008). Algebra: What, when, and for
whom. Retrieved from http://www.nctm.org/uploadedFiles/About_NCTM/
Position_Statements/Algebra%20final%2092908.pdf.
National Mathematics Advisory Panel. (2008). Foundations for success: The final
report of the National Mathematics Advisory Panel. Retrieved from http://www2
.ed.gov/about/bdscomm/list/mathpanel/report/final-report.pdf.
*Nichols, J. D., & Miller, R. B. (1994). Cooperative learning and student motivation.
Contemporary Educational Psychology, 19, 167-178.
No Child Left Behind Act of 2002, 20 U.S.C. 6301 et seq. (West, 2002).
*O'Dwyer, L. M., Carey, R., & Kleiman, G. (2007). A study of the effectiveness of the
Louisiana Algebra I Online Course. Journal of Research on Technology in Education,
39, 289-306.
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational
Research, 66, 543-578.
*Parham, J. W. (1993). An analysis of the effects of tutoring on seventh-grade students
engaged in the mastery of pre-algebra concepts. Dissertation Abstracts
International-A, 54(11), 4021.
*Peters, K. G. (1992). Skill performance comparability of two algebra programs on an
eighth-grade population. Dissertation Abstracts International-A, 54(01), 1993.
*Pierce, R. (1984). A quasi-experimental study of Saxon's incremental development
model and its effects on student achievement in first-year algebra. Dissertation
Abstracts International-A, 45(02), 443.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications
and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage.
Raudenbush, S. W., Bryk, A. S., Cheong, Y. F., Congdon, R., & du Toit, M. (2004).
HLM 6: SSI Scientific Software International: Hierarchical linear & non-linear
modeling. Lincolnwood, IL: Scientific Software International, Inc.
*Rech, J. K., Juhler, S. M., & Johnson, H. L. (1995). The effectiveness of videotapes in
the individualized instruction of intermediate algebra and college algebra.
International Journal of Mathematical Education in Science and Technology, 26,
463-472.
*Reys, R. E., Reys, B., & Lapan, R. (2003). Assessing the impact of standards-based
middle grades mathematics curriculum materials on student achievement. Journal
for Research in Mathematics Education, 34, 74-95.
Rittle-Johnson, B., & Alibali, M. W. (1999). Conceptual and procedural knowledge of
mathematics: Does one lead to the other? Journal of Educational Psychology, 91,
175-189.
*Rodgers, C. E. (1995). An investigation of two instructional methods for teaching
selected pre-algebra concepts to minority at-risk seventh-grade mathematics students.
Dissertation Abstracts International-A, 56(08), 3042.
Roscoe, R. D., & Chi, M. T. H. (2007). Understanding tutor learning: Knowledge-
building and knowledge-telling in peer tutors' explanations and questions. Review of
Educational Research, 77, 534-574.
Rosenshine, B., Meister, C., & Chapman, S. (1996). Teaching students to generate
questions: A review of the intervention studies. Review of Educational Research, 66,
181-221.
*Schumacker, R. E., Young, J. I., & Bembry, K. L. (1995). Math attitudes and achieve-
ment of Algebra I students: A comparative study of computer-assisted and traditional
lecture methods of instruction. Computers in the Schools, 11, 27-33.
*Seals, G. J. (2001). The effects of portfolio use as a learning tool on Algebra II
students' achievement and their attitudes toward mathematics. Dissertation Abstracts
International-A, 63(0'), 72.
Seidel, T., & Shavelson, R. J. (2007). Teaching effectiveness research in the past
decade: The role of theory and research design in disentangling meta-analysis
results. Review of Educational Research, 77, 454-499.
Sfard, A. (1991). On the dual nature of mathematical conceptions: Reflections on
processes and objects as different sides of the same coin. Educational Studies in
Mathematics, 22, 1-36.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2001). Experimental and quasi-
experimental design. Boston, MA: Houghton Mifflin.
*Shneyderman, A. (2001). Evaluation of the Cognitive Tutor Algebra 1 program.
Retrieved from http://oer.dadeschools.net/algebra.pdf.
*Shoecraft, P. J. (1971). The effects of provisions for imagery through materials and
drawings on translating algebra word problems, grades seven and nine. Dissertation
Abstracts International-A, 52(07), 3874.
*Silvis, K. T. (2007). A comparison of hands-on and traditional approaches for teaching
eighth grade pre-algebra. Dissertation Abstracts International-A, 68(03), 921.
*Siskind, T. G. (1994). The effect of calculator use on mathematics achievement for
rural high school students. The Rural Educator, 16, 1-4.
Skemp, R. R. (2006). Relational understanding and instrumental understanding.
Mathematics Teaching in the Middle School, 12, 88-95. (Reprinted from Mathematics
Teaching, 77, 20-26, 1976)
*Slavin, R. E., & Karweit, N. L. (1982). Student teams and mastery learning: A facto-
rial experiment in urban math nine classes. Baltimore, MD: Johns Hopkins
University, Center for Social Organization of Schools. (ERIC Document Reproduction
Service No. ED215904)
Slavin, R. E., & Lake, C. (2008). Effective programs in elementary mathematics: A
best-evidence synthesis. Review of Educational Research, 78, 427-515.
*Smith, K. B. (1994). Studying different methods of technology integration for teaching
problem solving with systems of equations and inequalities and linear programming.
Journal of Computers in Mathematics and Science Teaching, 13, 465-479.
*Smith, K. B., & Shotsberger, P. G. (1997). Assessing the use of graphing calculators
in college algebra: Reflecting on dimensions of teaching and learning. International
Journal of Instructional Media, 97, 368-376.
*Smith, L. R. (1985). Presentational behaviors and student achievement in mathematics.
Journal of Educational Research, 78, 292-298.
Socas Robayna, M. M. (1997). Dificultades, obstáculos y errores en el aprendizaje de
las matemáticas en la educación secundaria [Difficulties, obstacles, and errors in
learning mathematics in secondary education]. In L. R. Romero (Ed.), La educación
matemática en la enseñanza secundaria (pp. 125-154). Barcelona: Horsori y
Instituto de Ciencias de la Educación.
*St. John, D. J. (1992). A comparison of two methods of teaching eighth-grade
pre-algebra students how to construct tables for use in solving selected mathematical
problems. Dissertation Abstracts International-A, 54(02), 453.
Stephens, L. J., & Konvalina, J. (1999). The use of computer algebra software in
teaching intermediate and college algebra. International Journal of Mathematical
Education in Science and Technology, 30, 483-488.
Stigler, J. W., & Hiebert, J. (1997). Understanding and improving classroom mathematics
instruction: An overview of the TIMSS video study. Phi Delta Kappan, 79, 14-21.
Stodolsky, S. S. (1988). The subject matters: Classroom activity in math and social
studies. Chicago, IL: University of Chicago Press.
*Suh, J., & Moyer, P. S. (2007). Developing students' representational fluency using
virtual and physical algebra balances. Journal of Computers in Mathematics and
Science Teaching, 26, 155-173.
*Tenenbaum, G. (1986). The effect of quality of instruction on higher and lower mental
processes and on the prediction of summative achievement. Journal of Educational
Research, 80, 105-114.
ninth annual computer conference of the League for Innovation in the Community
College, Orlando, FL. (ERIC Document Reproduction Service No. ED352099)
Woolner, P. (2004). A comparison of a visual-spatial approach and a verbal approach
to teaching mathematics. In M. J. Hoines & A. B. Fuglestad (Eds.), Proceedings of
the 28th Conference of the International Group for the Psychology of Mathematics
Education, 2004 (pp. 449-456). Bergen, Norway: International Group for the
Psychology of Mathematics Education.
Authors
CHRISTOPHER R. RAKES has recently completed his doctoral studies at the University
of Louisville, Curriculum and Instruction in Secondary Mathematics Education. Over the
past 10 years, he has taught mathematics in both urban and rural settings at the secondary
and postsecondary levels. His research is framed by two overarching issues: (a) effective
strategies to teach mathematics and (b) issues and approaches in the preparation of teach-
ers of mathematics. His scholarly work involves multiple methods such as systematic
review, meta-analysis, structural equation modeling (SEM), hierarchical linear modeling
(HLM), and mixed methodology.
JEFFREY C. VALENTINE is an associate professor in the College of Education & Human
Development at the University of Louisville. His research interests include how non-
school activities affect students, both psychologically and academically. He is also inter-
ested in improving the methods used in systematic reviews and meta-analyses and in
improving assessments of study quality.
MAGGIE B. MCGATHA is an assistant professor in the College of Education & Human
Development at the University of Louisville. Her research interests include how mentoring
and coaching affect teacher practice and, in turn, student learning. She is also interested
in improving mathematics teacher education for preservice and inservice teachers.
ROBERT N. RONAU is a professor of mathematics education at the University of Louisville
whose research interests and publications include the implementation of instructional
technology, teacher knowledge, teacher preparation, and assessment. Over the past 15
years, he has played a critical role in numerous statewide and local grant efforts including
development of State Wide Mathematics Core-Content and Assessments, LATTICE
(Learning Algebra Through Technology, Investigation, and Cooperative Experience), the
Secondary Mathematics Initiative (SMI) of PRISM (Partnership for Reform Initiatives in
Science and Mathematics), Kentucky's statewide systemic reform initiative, Technology
Alliance, Teaching K-4 Mathematics in Kentucky, the Park City/IAS Geometry Project,
and U2MAST. He currently serves as a Co-PI on the NSF-funded project, Geometry
Assessments for Secondary Teachers (GAST), an initiative to determine critical geometry
knowledge for secondary teachers and to design assessments to measure levels of teacher
knowledge.