June 2017
Acknowledgments
This research was made possible through the collaboration and support of many individuals
beyond the SRI Education research team. First, we are grateful for the cooperation of the Adult
Basic Education program administrators, instructors, and adult learners and their willingness to
participate in the research and data collection. We also acknowledge the contributions of the
leadership and staff of each product vendor and their valuable insights and collaboration
throughout this study. In addition, we thank our partners, Digital Promise and Mockingbird
Education, and in particular Patti Constantakis, Tamara Thompson, and Shannon Sims for
supporting the research and its dissemination in a variety of important ways. The project also
benefited greatly from the contributions of the advisory panel members—Richard Mayer, Lynda
Ginsburg, John Fleischman, John Sabatini, David Rosen, Gabe Martinez Cabrera, and Daphne
Greenberg—who provided guidance and insight along the way and valuable feedback on early
drafts of this report. Finally, we thank The Joyce Foundation and our program officer, Matthew
Muench, for their support and commitment to this important area of study.
Authors
Robert Murphy, Marie Bienkowski, Ruchi Bhanot, Sam Wang, Tallie Wetzel, Ann House, Tiffany
Leones, and Jennifer Van Brunt
SRI Education
Suggested Citation
Murphy, R., Bienkowski, M., Bhanot, R., Wang, S., Wetzel, T., House, A., Leones, T., Van
Brunt, J. (2017). Evaluating Digital Learning for Adult Basic Literacy and Numeracy. Menlo Park,
CA: SRI International.
Executive Summary
The Need
The magnitude of the problem of unskilled labor for the U.S. workforce is well documented. More than
36 million adults in the United States do not have the basic literacy and math skills needed for
many entry-level jobs, much less for the types of jobs expected to dominate in the future.
We also know that our federal- and state-funded adult basic education (ABE) programs, the
main providers of skill development and training programs for this population, do not have the
resources, facilities, or trained staff to serve all those adults in need of further education to
improve their basic skills and job prospects. The purpose of this research was to understand the
potential role of technology as a significant part of the solution to address the needs of ABE
programs and these low-skilled adult learners. Specifically, we set out to learn whether digital
learning technologies can help ABE programs provide more efficient and effective learning
opportunities and serve more of the adults in need in their communities.
The Research
In 2014, The Joyce Foundation asked SRI Education (sri.com/about/organization/education) to
investigate the role and efficacy of digital learning technologies in (1) improving the basic
reading, writing, and math outcomes for low-skilled adults in adult basic education programs
(distinct from efforts to teach English as a second language or technology literacy) and (2)
helping programs increase their capacity to serve a greater number of students. Through this
research, we set out to understand how ABE programs might use these technologies to improve
the instruction they offer, whether such technologies are effective with low-skilled adults
(performing at fourth- to ninth-grade levels in reading and/or math) and which practices and
product features might be associated with better outcomes for students and programs.
The primary research questions were the following:
1. How are the online technologies used to support instruction and program objectives in
the courses ABE institutions offer?
2. What program factors and practices and product design features are associated with
more intense use of the online technologies?
4. Which instructional design features of the online learning technologies are associated
with better learning gains and student outcomes?
5. Which program conditions, program practices, and online learning technology uses are
associated with better learning gains and student outcomes?
To address these research questions, we studied five products at 13 sites and in 14 different
ABE programs (one site piloted two different products in two different ABE programs) using various data
collection methods and data sources. The research included visits to participating ABE sites for
classroom observations and interviews with administrators, instructors, and adult learners to
learn about their programs and how they were using the products and supporting their use. We
also surveyed instructors and students about their experiences using the products. For an
independent measure of product use, we accessed vendors’ student-level use data captured by
the products. To assess learning, we accessed scores on nationally normed standardized
assessments administered by the ABE program sites. The sites also provided demographic
information on the adult learners.
Quasi-experimental methods were used to estimate program impacts by comparing the scores
on the learning assessment for students who used a product with scores on the same
assessment for students who did not. A common statistical matching technique—propensity
score matching—was used to improve the baseline equivalence of the groups that were
compared. We also analyzed the relationship between the intensity of use of a product and
students’ performance on the site’s standardized learning measure.
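The matching step can be illustrated with a brief sketch. The report does not name the software, covariates, or matching algorithm used, so the column names (pretest, age, female, used_product, posttest) and the simple nearest-neighbor approach below are illustrative assumptions rather than the study's actual procedure.

```python
# Illustrative propensity score matching sketch; all column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("students.csv")            # one row per student
covariates = ["pretest", "age", "female"]   # baseline characteristics

# 1. Estimate each student's propensity to be a product user.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["used_product"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each user to the nonuser with the closest propensity score.
users = df[df["used_product"] == 1]
nonusers = df[df["used_product"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(nonusers[["pscore"]])
_, idx = nn.kneighbors(users[["pscore"]])
matched = nonusers.iloc[idx.ravel()]

# 3. Compare posttest scores for users and their matched nonusers.
impact = users["posttest"].mean() - matched["posttest"].mean()
print(f"Estimated difference on the posttest: {impact:.2f}")
```

In practice, the matched groups would also be checked for baseline equivalence on the covariates before any impact estimate is interpreted.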
In general, we found that the majority of instructors and students had positive experiences in
using the online products. According to self-reports from the interviews and surveys, use of the
products provided significant value in how instructors supported students while allowing
students to extend their learning time beyond the classroom.
Although these findings are promising, suggesting that digital technologies of the types studied
can play an important role in ABE programs, we found that use varied widely across program
sites. At some sites, use of a product was very limited. In those cases, use was affected by such
factors as the program site’s commitment to using the product as a required core instructional
activity, students’ intrinsic and extrinsic motivation to complete their program, students’ mobility,
and students’ access to technology outside the classroom. Some of these factors are endemic
to ABE programs and the lives of their students, but we believe others can be addressed by
providing more time for programs to plan how best to integrate technology into their curriculum
and enhance support for instructors and students.
In interpreting the findings presented here, consider that, except in two cases, the sites were
using the products in their curriculum for the first time. Thus, the findings are for ABE program
sites, instructors, and students in the early adoption stage; they may not reflect the outcomes of
product use in more mature implementations, once program sites and instructors have had time
to reflect and iterate on how they are using the products and supporting students.
The Products
After careful consideration, including input from the project’s expert advisors,1 five
products were selected for the study.
The project participants. Thirteen ABE program sites were recruited to participate in the
research. Candidate ABE program sites were identified through recommendations from the
1
The seven advisors had expertise in a range of areas such as adult learning and basic education, learning science, curriculum
design, math learning, adult literacy assessment, learning technologies, and technical assistance for the use of technology in adult
basic education programs.
advisory board and from interviews with three ABE directors in states with a long history of
support for the use of technology in ABE programming. We also asked the product vendors to
recommend high-capacity ABE sites with known interest in piloting their products. We selected
sites to represent a range of program types, governance, and goals for the adult learners: public
and county school districts, community colleges, and community-based organizations. All the
sites selected had a strong interest and willingness to participate in the research (including the
required data collection), had an existing technology infrastructure, and served a large enough
population of adult learners to provide reliable estimates of the utility and impacts of the
products. All had multiple campuses where they offered ABE-related courses and programs.
The majority of students in the research were 18–45 years old, had incoming math and/or
reading skills at the fourth- to ninth-grade levels, and were not enrolled in an ESL program.
A total of 105 instructors and 1,579 adult learners participated in the study.
The Findings
Four use models emerged from how the program sites used the products to support their instructional programs: online, blended, hybrid, and supplemental.
• Online refers to use of a product as the primary mechanism of instructional content and
delivery for the course. Students’ use of the product was required; it took place in the
delivery for the course. Students’ use of the product was required; it took place in the
program, school, or college computer labs monitored by instructors or on students’ own
time with individual support from instructors available on request. When instructors
provided direct instruction, it was in response to student needs they identified while
reviewing progress reports provided by the product or on student request.
• Blended use models require tight integration of the product into a broader curriculum
and instructional program. When we characterized product use as “blended,” instructors
had planfully integrated product use with face-to-face instruction, so the whole program
of study was partly online and partly face to face. Instructors attempted to link the
content in their lectures to the content that students were assigned in the product, or
they closely monitored student progress in the product and modified instruction in the
classroom accordingly and/or used students’ performance in the product to identify those
in need of individual attention.
• Hybrid models also combine use of the product as a core instructional activity with
instructor-led instruction during regular class time. However, in this use
model, the students’ work in the product, although required, is not necessarily connected
to instructor-led lessons and does not directly influence what instructors do in the
classroom. To a casual observer, blended and hybrid models may appear alike.
However, in hybrid use models, online activities are not coordinated with the face-to-face
instruction. Instructors do not regularly review product dashboards, nor do they use their
direct instruction time to cover topics that were revealed as potentially problematic for
students based on their performance in the product. Instructors using a hybrid model
often do so for several sound pedagogical reasons, namely (1) to provide students at
different skill levels an instructional opportunity to fill in skill gaps at their own pace so
they can better engage in the instructor-led lessons, (2) to give more advanced students an
opportunity to go beyond the current pace of the curriculum, and (3) to give all students
an opportunity to become more comfortable learning with digital resources.
• Supplemental models are product uses that are scheduled outside regular class time
(e.g., during lunch or before or after class). Students often perceive these add-on
sessions as extracurricular, and instructors often do not require attendance since the
activity is outside core instructional time. Typically, programs choose this use model
because it does not interfere with the existing core curriculum and does not require
instructors to plan for and adapt to potentially new ways of teaching.
Intensity of Use
Although many students logged significant hours on the products, overall the intensity
of use was less than expected and varied greatly by site and product. Median total hours
of use, including time students spent working with a product outside regularly scheduled time,
ranged from a low of 3 hours to a high of 68 hours over program sessions that typically ran from
8 to 16 weeks. Seven of the 14 pilot sites had median use of less than 10 hours—less than half
the time stipulated by the research team (20 hours) as a requirement for participation. Similar
variation was evident in the number of days that products were used. The median number of
log-in days across the sites ranged from 5 days to 26 days. The median student in 6 of the 14
sites used the products less than a total of 10 days, or about 1 day (or less) per week over the
duration of an ABE program site’s typical course offering.
The extent of use varied by product, reflecting differences in design features and
intended uses. On average, Core Skills Mastery, MyFoundationsLab, and ALEKS had the
highest student use levels, and GED Academy had the lowest. This cross-product variation is
most likely due to the role the vendors intended the products to have in formal education
settings and the individualized and self-paced nature of many ABE programs (including high
school equivalency diploma prep programs).
Intensity of use varied by how programs decided to use and support the product. From
the site visits and interviews with program staff and students, several factors emerged that
appeared to be associated with consistent and greater use of the products by instructors and
students, as well as product effectiveness. These factors are described below.
• Use of the products must be mandatory whenever possible. Products were also
more likely to be used, both on and off site, when use was a mandatory part of the
course rather than just encouraged. Mandating use of an online product or even class
attendance is not always feasible in an ABE setting. However, before investing in
technologies like the ones in this study, ABE sites should consider their willingness to
make product use mandatory and consequential or whether their limited financial
resources might be better invested in alternative supports for students.
• The products must be aligned with the rest of the curriculum. Usage was higher when
students and instructors viewed the products as instrumental in helping students achieve
their goals. For example, when used in GED prep programs, products that are not tightly
aligned with the GED exam may be perceived by instructors and students as providing
less support for students. In addition, procedures, approaches, and explanations within a
product’s content that were not aligned with classroom materials and instructor
explanations often caused confusion among students. In either case, products that are
not perceived as aligned with the goals of the curriculum or teacher-led instruction are
less likely to be used or taken seriously by both instructors and students.
Intensity of use also varied by different student characteristics. Within a site, product use
tended to be higher among female and older students, with some variation by product. Across
most products, female students tended to use them 60% more on average than males. Older
students (30 years old or older) tended to use the products about 80% more than younger
students (18–29 years of age). Although we have no firm evidence for the reasons behind these
differences, the females and older students in our sample appeared to be more motivated in
their coursework than their peers and perhaps more likely to attend class regularly and persist
within the digital learning technologies and their instructional programs. We also found evidence
that the intensity of product use varied by students’ incoming skills for three of the five products.
Students with lower incoming test scores tended to use Reading Horizons Elevate and
GED Academy more than their peers. In contrast, students with more advanced incoming math
and reading skills tended to spend more time on Core Skills Mastery than their peers. Variation
in the amount of use by incoming skill level is most likely a result of a combination of factors
including the product design and reading-level demands of the instructional content.
Many students reported they used the products during off hours, but a lack of access to
computers limited others. Part of the promise of instructional technology in ABE programs is
that it can extend instructional hours by providing students with access to quality learning
environments anytime and anywhere. Sixty-five percent of students surveyed reported using the
product during off hours, ranging from 40% of students in ABE programs using Reading
Horizons to 86% of students in programs using Core Skills Mastery. Many program sites
encouraged students to use products outside regularly scheduled class time but did not
mandate it. Almost half the instructors surveyed (46%) reported that the students’ lack of access
to the products at home limited their potential for improving student outcomes. About 25% of
students reported that they did not use the products at home because they did not have access
to a computer or compatible mobile device (only 5% cited a lack of Internet connectivity).
In general, students and instructors found value in using the products and believed they had
some benefit to instruction, student confidence, and student learning. Instructors interviewed
and surveyed found digital learning tools enabled them to better support students with a range
of different skill levels, something that would not be possible without the individualized
instruction provided by the products. A majority of instructors reported they would recommend
the products to colleagues (83%) and would like to use the products in future courses (78%).
Many but not all of the students interviewed said they enjoyed the experience of learning
independently with the products, appreciating that they could make mistakes and struggle in
private and receive immediate feedback. They also liked the opportunity to learn at their own
pace rather than at the pace of the class, which may have been moving slower or faster than
they were comfortable with. Fifty-nine percent of students reported that the products gave them
confidence they could learn new things on their own, while 50% reported that they had more
confidence in their ability to read or do math. Eight in 10 students reported they would
recommend the product they used to other students.
Instructors and students experienced several challenges that most likely impacted the
use and effectiveness of products. The majority of instructors reported favorably on their
experience using the products, but challenges were noted, such as some products’ insufficient
scaffolding to support struggling learners, content reading levels that may have been too difficult
for some students, and some students’ resistance to using the online learning technologies.
2
Impacts were estimated for only 6 of the 14 product pilot sites because (1) an insufficient number of eligible students were
available for analysis (five sites), (2) sites that provided grade equivalence scores for TABE failed to provide information on the level
of test used for the pretest and posttest (two sites), or (3) the site did not have a viable comparison group available because the
product was implemented in a new course (one site). Students in courses that used products were included in the impact analysis if
they used the products for 10 or more hours based on usage computed from the products’ back-end data provided by the vendors.
For a site to be included in the impact analysis, we needed to identify at least 25 eligible students in both the user and nonuser
groups. Five sites had too few eligible students because of (1) low initial enrollments and completions, (2) insufficient use of the
product (less than 10 hours), or (3) missing scores on pretest and/or posttest achievement measures.
A statistically significant negative effect was found for one product in a single site: Reading
Horizons (effect size for STAR Reading [spring] = -0.49).
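Effect sizes of this kind are typically expressed as standardized mean differences. The report does not state the exact variant used (e.g., Hedges' g), so the formula below is only the conventional form, shown for reference:

```latex
d = \frac{\bar{X}_{\text{users}} - \bar{X}_{\text{nonusers}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_{1}-1)s_{1}^{2} + (n_{2}-1)s_{2}^{2}}{n_{1}+n_{2}-2}}
```

Read this way, an effect of -0.49 would correspond to product users scoring roughly half a pooled standard deviation below matched nonusers on the outcome measure.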
These impact findings should be interpreted with caution. Even though we made every attempt
to implement the most rigorous designs available given the local research contexts by
comparing the outcomes for students using the products with outcomes for a group of similar
students who did not use them, these designs are unable to sufficiently isolate the effect of
product use from other potential contributing factors. Other plausible explanations for the
estimated impacts are differences between the groups in the quality of instruction experienced
outside the use of the products as well as potential differences in the math and reading curricula
the two groups of students were exposed to during the study period.
Recommendations
For ABE programs
• To ensure that students spend sufficient time on the products and make adequate
progress, commit to using the products as a regular part of core instruction (not as an
add-on activity) and make use mandatory and consequential.
• To support product use outside scheduled class time, help students take advantage of
federal, state, and local programs providing low-cost devices and Internet access and
make sure all students know how and where they can obtain devices and connectivity on
and off site (e.g., public libraries, workplaces, and community resource centers). In
addition, provide incentives for off-hour use.
• To help ensure that instructors commit to using the products, provide adequate time for
training, planning, and piloting so that the products are better integrated into the
curriculum and the instructors’ own practices.
• Prepare to offer students who are struggling with the transition to online learning
additional monitoring and support, including a more gradual ramp-up time on the
products and alternative instructional activities during the transition. Plan for the
likelihood that some students will not want to make a transition to digital instruction.
For developers and vendors of digital ABE products
• To ensure that all students, particularly struggling readers, can access the instructional
content, scaffold the text with audio and video presentations.
• To support blended learning models and to keep instructors invested in students’ work in
the online environment, make the content modular so that programs and instructors can
better integrate product use into the existing curriculum and with direct instruction.
• To help motivate instructors and students to use the product, make sure the content is
aligned with all current ABE standards and competency exams.
• Provide sites with a variety of models of use to support a range of student types and
program goals. Most students can learn online and independently with proper
monitoring, coaching, and motivating factors.
Conclusion
The technology revolution in K–12 and postsecondary education has yet to reach adult basic
education in a meaningful way. There is sparse research evidence and information to help ABE
program administrators, instructors, and product developers understand which products,
product features, models of use, and student supports are associated with effective learning
technology implementations. The goal of this research project was to begin to generate some
reliable independent evidence and information on the supports and practices needed to
leverage the potential value of digital technologies for an ABE student population.
Overall, programs, instructors, and students found value in the digital learning technologies they
used in the study. Instructors reported that product use enabled them to differentiate instruction
to fill gaps in basic literacy and math skills across a wide range of students in ways that were
not possible without the products. In addition, a majority of students, but not all, reported that
they enjoyed using the products and that the products helped them improve their math and
reading skills and gave them confidence they could use online resources to learn on their own
without an instructor’s direct involvement. A majority of students also reported that they used the
products to continue to learn outside the regularly scheduled instruction time.
We found evidence that under the right set of conditions programs can effectively integrate use
of these products into their curriculum, and students will use the products for significant
amounts of time on and off site and enjoy the experience. We also found that it is possible to
use digital learning technologies with low-skilled adults as the primary instructional content and
delivery mode (i.e., online model), with instructors acting as facilitators and providing
motivational and individualized support as needed. However, for the existing technology-based
instructional products like those included in this study, it is likely that for many students,
particularly those with the lowest skills, blended and hybrid models (with instructors delivering
50% or more of the instruction) will be the most prevalent and perhaps most effective use
models for ABE programs.
This research also revealed challenges in using learning technologies with low-skilled adults in
ABE programs. Use of the products at several sites was well below what had been planned at
the study outset. Instructors reported having insufficient time to plan how best to integrate the
products into their curriculum and, in particular, to learn how to best use the feedback on
student performance captured by the systems to inform their instruction and identify the
students who were struggling the most. Across the board, the training the instructors received
from vendors was relatively modest; although it was adequate to get them and their students
started on the products, it was probably insufficient to enable the instructors to leverage the full
potential of the products with their students. Vendors, state and federal agencies, and
professional associations responsible for supporting ABE programs and instructors need to
continue to develop and disseminate instructional online resources and webinar trainings that
offer practical guidance and models of implementation that have been demonstrated to be
effective across a variety of program and student populations.
Finally, the study produced no conclusive evidence that the use of the products was more
effective in raising students’ math or reading skills than the participating ABE program sites’
current curricula and approaches. The impacts estimated varied by product and site. Given that
most of the sites were in an early stage of adoption, use models evolved over time. In addition,
the designs used to estimate product impacts were not optimal for isolating the effects of
product use from other plausible factors. Clearly, more rigorous research is needed on specific
products and use models to understand their potential benefits for improving math and literacy
skills.
This research represents an initial step in exploring the product design features and program
conditions under which digital learning may support the goals of ABE programs and their
students. More rigorous research is needed to understand which product features and aspects
of online, blended, and hybrid models are the most feasible to implement and the most effective
for ABE programs with different capacities, instructors, and students. Digital learning
technologies like those selected for this study, although not the solution for all ABE program
needs, can be an important support for programs and instructors in expanding access to basic
skills instruction and improving outcomes for low-skilled adults.
Introduction
The magnitude of the unskilled labor problem for the U.S. workforce is well documented. According to a
recent Organisation for Economic Co-operation and Development (OECD) survey (OECD,
2013), more than 36 million adults in the United States do not have the basic reading, writing,
and math skills needed for many of today’s entry-level jobs, much less for the types of
jobs expected to dominate in the future. We also know that U.S. federal- and state-funded adult
basic education (ABE) programs, the main providers of skill development and training programs
for this population, do not have the resources, facilities, or trained staff to adequately help these
adults improve their skills and job prospects. A recent report (Tyton Partners, 2015) highlighted
the extent of the gap between demand and supply: Currently, ABE programs receiving federal
or state funding can serve about 4 million adults, or just over 10% of those in need. It is likely
that this gap will only widen in the future. While many factors contribute to this widening gap
(e.g., inadequate state and federal funding and the changing labor economy), researchers in
this study set out to understand the potential role of technology as a part of the solution.
Specifically, they addressed the question of whether digital learning technologies can increase
the capacity of ABE programs by providing more efficient and effective learning opportunities to
better serve the adult learning needs in their communities.
Digital learning technologies (DLTs) hold promise for ABE programs and adult learners in several ways:
• Anytime, anywhere learning: DLTs can bring learning to the adult learner wherever
he/she has access, overcoming the limits of time and place (see Warschauer & Liaw,
2010).
• More productive practice time for learners and instructors: DLTs can provide
immediate feedback to learner responses, access to solution steps, and links to
additional resources to support students in practicing newly learned skills and learning
from their mistakes, while freeing instructors’ time to work with individuals or small
groups.
• Monitoring of student progress: DLTs can provide instructors with real-time class- and
student-level progress reports, helping them monitor individual learners’ progress and
identify challenging concepts and areas where additional instructional support may be
needed.
• Development of independent learning skills: DLTs can help struggling learners gain
confidence that they can learn on their own with digital resources, potentially opening up
a broader world of digital information and learning resources for them and better
preparing them for today’s job market.
Broadly defined, learning technologies are not new to ABE, as distance education for basic
skills has been offered for many years (Fleischman, 1998; Petty, Johnston, & Shafer, 2004) in
many states. What is new is learners’ expanding access to devices with broadband Internet
connectivity and capabilities associated with technological advances—including web-based
delivery, adaptive technologies, and streaming video and audio—along with sophisticated
dashboards and embedded motivation supports. These expanding capabilities are coupled with
a better understanding of how people learn in digital environments (Means, Bakia, & Murphy, 2014).
The Research
In 2014, The Joyce Foundation’s Employment Program and Innovation Fund asked SRI
Education (sri.com/about/organization/education) to investigate the role and efficacy of online
learning technology products in improving the basic reading and math outcomes of low-skilled
adults in ABE programs (distinct from efforts to teach English as a second language or
technology literacy) and in helping these programs serve more students. Through this research,
the objective was to understand how ABE programs might use these technology products to
improve their instruction, whether such technologies are effective with low-skilled adults (those
performing at fourth- to ninth-grade levels in reading and/or math), and which practices and
product features might be associated with better outcomes for students and ABE programs.
1. How are the online technologies used to support instruction and program objectives in
the courses ABE programs offer?
2. What program factors and practices and product design features are associated with
more intense use of the online technologies?
4. Which instructional design features of the online learning technologies are associated
with better learning gains and student outcomes?
5. Which program conditions, program practices, and online learning technology uses are
associated with better learning gains and student outcomes?
To address these research questions, we used a variety of data collection methods and data
sources. The research included visits to participating ABE program sites for observations of
classroom instruction and interviews with administrators, instructors, and adult students to learn
about the programs and how they were using the online technology products and supporting
their use.3 We also surveyed instructors and students about their experiences using the
selected technology products. For an independent measure of the use of each of the products,
we obtained from the vendors student-level use data captured by the technologies.
Several limitations of the research have implications for our ability to generalize the findings to
the broader ABE program population. First, sites volunteered to participate and were selected
for having both the leadership capacity and the desire to implement learning technology
products as well as the infrastructure in place to support their use. To increase our power to
detect effects, we also selected program sites that planned to serve 100–200 students or more
3
Four sites were not visited because of their limited use of the products or the timing of their start in the research (Site 2, Site 4, Site
8, and Site 10). In such cases, researchers interviewed administrators and instructors by phone.
4
The research team had planned to administer the Education & Skills Online (ESOL) assessment at all participating ABE sites
(http://www.oecd.org/skills/ESonline-assessment/abouteducationskillsonline/). ESOL was developed by Educational Testing Service
with funding from OECD. Administered online, ESOL is based on the Survey of Adult Skills administered by the Programme for the
International Assessment of Adult Competencies (PIAAC). It was made available to SRI for use in the research in August 2015. We
attempted to have all sites administer ESOL on participating students’ enrollment and after the students had used a product for a
minimum of 20 hours. However, for reasons that varied by ABE site, compliance with ESOL administration varied greatly across
sites and response rates were extremely low. ESOL scores therefore were not used to estimate product impacts on student learning
but were used in the case of one site to explore the correlational relationship between the intensity of product use and student
learning.
Another limitation of this research is the inability to make definitive claims about the
effectiveness of the products in the study or the product features and practices associated with
effects. The quasi-experimental designs used to estimate effects do not disentangle the effects
of product use from other aspects of instruction, including direct instruction by the instructor.
(Other limitations specific to the impact analyses are covered under “Impacts of Product Use on
Learning” below.) Where we found a positive or negative effect for a product on student learning
at an ABE program site, the strongest claim we can make is that the effect was associated with
the program site’s curriculum, in which the product may have played a key role in the students’
instruction. The more central the role of the technology in the curriculum, the greater the
likelihood that use of the product contributed to the effect. Because we did not attempt to
systematically manipulate product features and program practices and test whether they
contributed more or less to a product’s effect, we cannot definitively say that particular features
or practices caused the effect. Instead, to identify features and practices that might warrant
further investigation, we highlight the distinctive features and practices of the instructional
setting that were associated with program sites where greater or smaller product effects were
detected.
Finally, each product was piloted in up to three ABE program sites, and, except for two, the sites
were implementing the products for the first time. Thus, the findings reported here are for ABE
program sites, instructors, and students in the early adoption stage and may not reflect the
outcomes of product use in more mature implementations, once program sites and instructors
have time to reflect and iterate on how they are using the products and supporting students.
Also, although we observed a number of use models and practices across the various sites and
products, these observations alone cannot establish which models or practices are most effective.
The Products
We engaged in a range of activities to identify candidate products, with a goal of selecting five
for the study. Through web searches and recommendations from ABE experts, including the
project’s expert advisors, we identified a pool of potential products. Of those, 29 products met
the selection criteria. Interviews with all 29 vendors gave us a deeper understanding of the
products, how they were being used or could be used in ABE programs, and the vendor’s
willingness to participate in the research. We also saw demonstrations of the products. On the
basis of these interviews and demonstrations, we selected 12 products as viable and qualified
for inclusion in the study. To select the final five products, we created a decision matrix of the
pros and cons of each product, trading off features to select a diversified set of products
according to important characteristics such as adaptive content and unique features such as
support for social-emotional factors in learning.
Thirteen sites were recruited to participate in the research (Table 2). ALEKS was piloted in two
institutions (one of which operated across three distinct locations), and one site piloted two
different products in two different adult basic education programs (Site 4 and Site 6). The other
products were piloted in three institutions each. Candidate ABE program sites were identified
through recommendations from the advisory board and from interviews with a selection of state
ABE directors. We also asked vendors to recommend high-capacity ABE sites with known
interest in piloting their learning technology products. Sites were selected to represent a range
of program types, governance, and goals for adult learners: public and county school districts,
community colleges, and community-based organizations. All the sites had a strong interest and
willingness to participate in the research (including the required data collection), had an existing
technology infrastructure that could support robust use of the products, and served a large
enough population of adult learners to provide reliable estimates of the utility and impacts of the
products. Sites were provided with a $20,000 stipend for their participation.
The majority of students in the research were 18–45 years old, had incoming math and/or
reading skills at the fourth- to ninth-grade level, and were not enrolled in an ESL program. A
total of 105 instructors and 1,579 adult learners participated in the study. We obtained
permission to conduct the research from each organization's research review board and/or
administration, as appropriate, and negotiated data use agreements with each site to facilitate
the sharing of student data with the research team.
Product Piloted | Site (State) | Organization Type | Number of Teachers | Number of Students
ALEKS | Site 1 (CO) | K–12 school district adult and family education program | 4 | 96
ALEKS | Site 2, Program A (CA); Program B (CA); Program C (MA) | Nonprofit organization | 3 | 56
Core Skills Mastery | Site 3 (IL) | Nonprofit adult education center | 6 | 85
The start and end time of each site’s participation in the study during the 2015–16 academic
year varied (Figure 1). The variation was due to three factors: (1) when the sites were identified,
recruited, and approved for participation in the study by their administration; (2) the scheduling
of staff training; and (3) the sites’ schedule for offering the targeted courses. All sites were
recruited with the expectation that students participating in the study would have an opportunity
to enroll for a minimum of 10 weeks of instruction and 20 hours of exposure to the digital
learning product selected. The duration of an individual student’s participation in the study may
have been shorter than the duration of the site’s participation depending on whether the site had
a rolling admissions or open-entry/open-exit policy (see Figure 1) and when the student may
have completed and exited the program. In sites with rolling admissions, students were included
in the research if (1) they were administered a pretest assessment at enrollment as a policy of
the site and (2) they completed their site’s posttest assessment (typically after 40 hours or more
of instruction, although this varied by site). Students with pretest assessment scores who left
their programs before taking a posttest were excluded from the samples used in the estimation
of product impacts on achievement measures and course outcomes.
Two outside sources supported sites in product implementation: the product vendor and
Mockingbird Education. Each vendor was responsible for scheduling a training session with
each organization. Training was delivered mainly online, although Reading Horizons Elevate
and Pearson provided it in person. Vendor trainings ranged from a single session as short as 1
hour up to a full day (Reading Horizons and Pearson delivered in-person full-day trainings)
covering start-up, student onboarding, and introductions to the product features. As indicated in
the site visit interviews, whether instructors were trained in product use depended
on whether the sites requested training, on scheduling constraints, and on staff turnover. During the
site visits we learned that often more experienced instructors supported less experienced staff
in use of the products. Each vendor was also encouraged to offer follow-up support as needed,
and several instructors reported using a vendor’s customer support service.
In addition to vendor-provided training, free technical assistance funded by the research project
was offered by Mockingbird Education (www.mockingbirdeducation.net). The primary technical
assistance provided was a full-day workshop (6 hours) for educators on blended learning in the
adult classroom with a focus on needs specific to vulnerable learning populations. Six of the
sites took advantage of this workshop, although sometimes the geographic spread of instructors
made arranging attendance difficult. Mockingbird also developed a technical assistance website
for the ABE sites that had resources for supporting and implementing their digital technology as
well as weekly project updates.
Overall, 75% of the instructors surveyed said they participated in some form of training.
However, only 52% said they were “well prepared” or “very well prepared” to use the product,
ranging from a high of 100% (ALEKS) to a low of 11% (GED Academy).
The sections that follow present the major findings from the research. We begin with a
description of the types of use models that emerged at the program sites—that is, the ways the
products were used to support teaching and learning. Then follows a description of the findings
from the analysis of the system use data and student and instructor surveys. The report ends
with an examination of the possible impacts of product use on student achievement and the
features of ABE programs, use models, and products that may have been associated with these
impacts.
In observing the products in action in classrooms and interviewing program administrators and
instructors, we noticed different types of use models across sites. Some of these were
consistent with the vendors’ intentions for the product and some were not. For purposes of this
report, we use the term use model to indicate how the product was actually used, as opposed to
its intended use. In general, although vendors may have presented preferred ways to implement
the products, ultimately it was the decision of each ABE program site and its instructors to
implement a model that they believed was best for their students.
Four use models emerged based on how the program sites used the products to support
their instructional program. Many influences shaped actual product use, from product design
features to the mission and goals of a particular ABE program center to instructors’ beliefs about
how to teach, how adults learn, and how best to motivate students. In addition, a site’s vision for
the potential role of the technology product in its curriculum and the ability to implement that
vision also most likely influenced the use models. The four use models observed across the
sites were online, blended, hybrid, and supplemental. These are defined as follows.
• Online refers to use of a product as the primary mechanism of instructional content and
delivery for the course. Students’ use of the product was required. It took place in the
program, school, or college computer labs monitored by instructors or on students’ own
time with individual support from instructors available on request. When instructors
provided direct instruction, it was in response to student needs they identified while
reviewing progress reports provided by the product or on student request.
• Blended use models require tight integration of the product into a broader curriculum
and instructional program. When we characterized product use as “blended,” instructors
had planfully integrated product use with face-to-face instruction, so the whole program
of study was partly online and partly face to face. Instructors attempted to link the
content in their lectures to the content that students were assigned in the product, or
they closely monitored student progress in the product and modified instruction in the
classroom accordingly and/or used students’ performance in the product to identify those
in need of individual attention.
• Hybrid models also combine use of the product as a core instructional activity with
instructor-led instruction during regular class time. However, in this use
model, the students’ work in the product, although required, is not necessarily connected
to instructor-led lessons and does not directly influence what instructors do in the
classroom. To a casual observer, blended and hybrid models may appear alike.
However, in hybrid use models, online activities are not coordinated with the face-to-face
instruction. Instructors do not regularly review product dashboards, nor do they use their
direct instruction time to cover topics that were revealed as potentially problematic for
students based on their performance in the product. Instructors using a hybrid model
often do so for several sound pedagogical reasons, namely (1) to provide students at
different skill levels an opportunity to fill in skill gaps at their own pace so that they can
better engage in the instructor-led lessons, (2) to give more advanced students an
opportunity to go beyond the current pace of the curriculum, and (3) to give all students
an opportunity to become more comfortable learning with digital resources.
• Supplemental models are product uses that are scheduled outside regular class time
(e.g., during lunch or before or after class). Students often perceive these add-on
sessions as extracurricular, and instructors often do not require attendance since the
activity is outside core instructional time. Typically, programs choose this use model
because it does not interfere with the existing core curriculum and does not require
instructors to plan for and adapt to potentially new ways of teaching.
Table 3 shows the use models implemented for each product by each site. Some sites left the
choice of use model up to the educators, so some sites implemented more than one use model.
Details on the use models at each site are in Appendix B.
Table 3. Use models (online, blended, hybrid, supplemental) implemented for each product and site
ALEKS: Site 1 (CO); Site 2 – Program A (CA); Site 2 – Program B (CA); Site 2 – Program C (MA)
Core Skills Mastery: Site 3 (IL) (two use models); Site 4 – Adult Diploma Program (OH); Site 5 (CO)
GED Academy: Site 6 – Adult Basic and Literacy Education (OH); Site 7 (KS); Site 8 (KY)
MyFoundationsLab: Site 9 (AZ) (two use models); Site 10 (IN) (two use models); Site 11 (RI)
Reading Horizons Elevate: Site 12 (IL); Site 13 (UT); Site 14 (KY)
Intensity of Use
Two key questions driving this research were (1) whether ABE program sites would be able to
integrate the digital learning technologies into their curriculum in a meaningful way and (2)
whether the average low-skilled ABE learner would actually use the products over a sustained period of time.
During recruitment of the ABE program sites, the research team communicated its expectations
about use of the products during the study period. First, the products were to be used as a
regular, required core instructional activity and not as a supplemental activity used at the
discretion of individual instructors or students. Second, over the period of the targeted course or
program session (typically 10–12 weeks), instructors were to dedicate a minimum of 20 hours of
instructional time to use of the product by students. This was to include both time students spent
working on the products on campus under an instructor’s or instructor aide’s supervision and the
time students spent working with the products independently on and off site. Ultimately, the use
models and actual intensity of use at each site varied depending on (1) discussions between the
vendors and the program sites about what type of use might be appropriate for each site and (2)
individual decisions made by site administrators and instructors based on program and student
needs.
To indicate use overall and across products and program sites, we report the results for two
variables that were available for all products: (1) the number of hours students used the
products—on and off site—for reading and math instruction5 and (2) the number of days
students logged on to the products. We were unable to obtain a richer set of common variables
because of differences among the products in what student use data were archived and how the
data were formatted and stored.
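As an illustration only, the sketch below shows how these two measures could be derived from a single tidy log file; the column names (student_id, login_date, minutes) are hypothetical, and the actual vendor extracts differed in structure and required product-specific processing.

```python
# Hypothetical derivation of the two common use measures from vendor log data.
import pandas as pd

logs = pd.read_csv("vendor_usage_logs.csv", parse_dates=["login_date"])

per_student = logs.groupby("student_id").agg(
    total_hours=("minutes", lambda m: m.sum() / 60),  # (1) total hours of use
    login_days=("login_date", "nunique"),             # (2) distinct log-in days
)

# Medians across students, analogous to the site-level medians reported in Table 4.
print(per_student[["total_hours", "login_days"]].median())
```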
Table 4 shows the actual level of software use for each product and ABE program site. The use
statistics shown are for all the students who enrolled in a course in one of the program sites
during the study, including those who may have joined and left the program during the study.
5
Students used MyFoundationsLab and GED Academy to receive instruction in other subject areas in addition to reading and math.
Because this research concerned the potential benefit of online instruction to help low-skilled adults improve their reading and math
skills, we analyzed and report results for use of the products for reading and math instruction only.
Table 4. Use Statistics
Product / Site | No. of Students with Log-in ID | No. of Log-in Days (median) | Total Hours (median) | % Users at 0–10 Hours | % Users at 10+ Hours
ALEKS – Site 1 | 96 | 14 | 28 | 8% | 82%
ALEKS – Site 2 (Program A/Program B/Program C)a | 56 | 17/24/1 | 9/20/1 | 57% | 43%
Core Skills Mastery – Site 3 | 85 | 26 | 68 | 8% | 92%
a Number of log-in days and total hours shown separately for the three local program sites (Program A, Program B, Program C) for Site 2.
Seventy-five percent of all students who enrolled in a course or instructional program in one of
the sites participating in the study used one of the products. Not all students who enrolled in a
site’s ABE program during the study period used the product that a site had selected. Such
students had left the ABE program before the site’s initial use of the product or decided not to
use the product either on their own or at the encouragement of instructors who might have
believed it was not a suitable learning environment for them. The percentage of students who
Specific findings related to the use of the products during the study were as follows.
Although many students logged significant hours on the products, overall the intensity
of use by students was less than expected and varied greatly by site and by product.
Median total hours of use, including time students spent working with a product outside regularly
scheduled time, ranged from a low of 2.5 hours (GED Academy at Site 7 and Reading Horizons
Elevate at Site 12) to a high of 68 hours (Core Skills Mastery at Site 3). Seven of the 14 sites
had median use of less than 10 hours (including all three sites using Reading Horizons Elevate),
or less than half the time originally expected by the research team (20 hours). Similar variation
was evident in the number of days that products were used. The median number of log-in days
ranged from 5 days (MyFoundationsLab at Site 9) to 26 days (Core Skills Mastery at Site 3).
The median student in 6 of the 14 sites used the product less than a total of 10 days, or 1 day or
less per week over the duration of an ABE program site’s typical course offering.
The extent of use varied by product, reflecting differences in design features and
intended uses. On average, Core Skills Mastery, MyFoundationsLab, and ALEKS had the
highest levels of student use, and GED Academy had the lowest. This cross-product variation is
most likely due to the role the vendors intended the products to have in formal education
settings and the individualized and self-paced nature of many ABE programs (including high
school equivalency diploma prep programs).
The products’ intended role in the curriculum most likely had a large effect on how often
students used them. Core Skills Mastery (CSM) is designed to be a stand-alone program of
instruction; students are to work independently through the 30-plus units of content with the goal
of completing them all and receiving the CSM certificate of completion. Students progress to the
next unit only after they demonstrate mastery. They are expected to work for 30 minutes to 2
hours in a single session, and the typical student works through the content in 10–60 hours, on
and off site, depending on the requirements of the ABE program site. In contrast, Reading
Horizons Elevate is meant to be used as a discrete activity in a blended or hybrid learning model
within a broader program of instruction; it was not designed to be a comprehensive literacy
curriculum or an online reading course. The product is designed to help students fill in gaps in
foundational literacy skills such as decoding and phonetics so they can more fully engage in and
benefit from the broader program of instruction.
We also found that products used in high school equivalency (HSE) diploma prep programs,
such as GED Academy, had the lowest intensity of use overall compared with products used in
other types of programs. Students enrolled in general HSE diploma prep programs enter with
widely varying motivation and skill levels. Each student is on his or her own timetable for being
prepared to take the HSE diploma exam, depending on incoming skills, family and work
responsibilities, and intrinsic and extrinsic motivating factors. Some students may need 6–8
weeks of instruction before being prepared to take the exam and leave the ABE program,
whereas others may require a year or more. It is also typical for students to rotate in and out of
these programs on the path to completing their diploma. Staff at these program sites are
sensitive to the needs of their students and go out of their way to accommodate their schedules.
They are less likely to require attendance or minimum levels of progress for students to maintain
their program eligibility. As a result, research sites using the products in HSE diploma prep
programs tended to show significantly lower overall use during the study. This was certainly the
case for Site 14 (Reading Horizons Elevate) and Site 7 (GED Academy).
Intensity of use varied by how programs decided to use and support the product. From
the site visits and interviews with program staff and students, several factors emerged that
appeared to be associated with consistent and greater use of the products by instructors and
students, as well as product effectiveness. These factors are described below.
• Use of the products must be mandatory whenever possible. Products were more
likely to be used, both on and off site, when use was a mandatory part of the course or
instructional program.
• The products must be aligned with the rest of the curriculum. Students and instructors
must view the products as instrumental in helping students achieve their goals. For
example, when used in GED prep programs, products that are not tightly aligned with
the GED exam may be perceived by instructors and students as providing less support
for students. In addition, procedures, approaches, and explanations within a product’s
content that were not aligned with classroom materials and instructor explanations often
caused confusion among students. In either case, products that are not perceived as
aligned with the goals of the curriculum or teacher-led instruction are less likely to be
used or taken seriously by both instructors and students.
Finally, we also noted that product use was higher when products were integrated into courses
that were required for advancement in a program or pathway. In several cases, products were
used in courses or instructional programs that, if completed successfully, gave students the
opportunity to advance to a higher-level course or program or to receive a high school diploma,
certification, or job. For example, Site 4 used completion of CSM as a prerequisite for entrance
to a new adult diploma program. Similarly, use of CSM at Site 3 was a requirement for
participation in a select career pathway program. At Site 10, MyFoundationsLab was the
primary source of instruction for most students in a noncredit developmental skills course
designed to help prepare them to enter a credit-bearing math-related career pathway. At all
three sites, the intensity of product use was among the highest across the participating sites.
This higher intensity may have been related to some program factors that motivated use,
including the fact that use was compulsory and was part of a program that helped students
achieve an important tangible goal. In addition to these external motivating factors, the
characteristics of the students enrolled in these programs may have also contributed to higher
levels of product use, including higher average levels of persistence and attendance relative to
students enrolled in the other ABE programs in the study.
In this section, we explore how student characteristics predicted use of the products. We
investigated the extent to which student age, gender, and incoming skill level predicted intensity
of use (total number of hours of use) for each of the products (Tables 5–8). For details on the
analytical models and results see Appendix A, Section A.3.
In general, product use tended to be higher among female and older students, with some
variation by product (Table 5). On average, female students used the products for 60% more
time than males, ranging from little or no difference for MyFoundationsLab and
Reading Horizons Elevate to two times as much for ALEKS, Core Skills Mastery, and GED
Academy. Except for Core Skills Mastery, older students (30 years old or older) also tended to
use the products more than younger students (18–29 years of age), ranging from 50% more for
MyFoundationsLab to four times more for GED Academy (Table 6). Although we have no firm
evidence for the reasons behind these differences, the females and older students in our
sample appeared to be more motivated in their coursework than their peers and perhaps were
more likely to attend class regularly and persist within the digital learning technologies and their
instructional programs.
Table 5. Total Hours of Product Use, by Gender

| Product (Sites Included) | Gender | N | Med. | Mean | SD | Min | Max |
| ALEKS (Site 1, Site 2) | Female | 75 | 25.6 | 36 | 35.7 | 0 | 205 |
| | Male | 69 | 18.3 | 19.1 | 16.0 | 0 | 60.7 |
| Core Skills Mastery (Site 3, Site 4, Site 5) | Female | 205 | 26.4 | 47.4 | 55.6 | 0 | 467 |
| | Male | 98 | 12.0 | 19.0 | 26.8 | 0 | 178.9 |
| GED Academy (Site 6, Site 7, Site 8) | Female | 166 | 6.3 | 16.5 | 23.6 | 0 | 134.1 |
| | Male | 112 | 2.7 | 9.8 | 17.0 | 0 | 85.6 |
| MyFoundationsLab (Site 9, Site 10, Site 11) | Female | 256 | 10.2 | 24.9 | 44.3 | 0.1 | 340 |
| | Male | 368 | 10.9 | 19.9 | 35.7 | 0 | 547.2 |
| Reading Horizons (Site 12, Site 13, Site 14) | Female | 112 | 6.5 | 8.1 | 7.4 | 0 | 23.9 |
| | Male | 86 | 2.9 | 5.8 | 7.0 | 0 | 23.6 |
We also found evidence that the intensity of product use varied by students’ incoming skills
(Tables 7 and 8). In particular, students with lower incoming achievement scores (below the
median score) tended to spend more time working on the products for ABE programs using
Reading Horizons Elevate (based on prior reading scores) and GED Academy (based on prior
reading and math scores) than students in the same programs who scored at or above the
median on a prior achievement test. Thus, for these products it appears that students with the
greatest needs spent more time working on the products. Interviews with instructors and
vendors indicate that the variation in use by incoming skill level might be a result of the design
of these particular products and how they were used by the program sites. For example,
instructors using Reading Horizons Elevate reported that some of their advanced students were
able to complete the available units before the end of the term and were assigned other
activities while their peers continued working in the product. In the case of GED Academy,
performance on an intake diagnostic assessment determines the sequence of content that
students work through and need to master within the product. Students with lower scores on the
intake assessment will receive a learning plan that requires them to complete more units and
master more topics than their more advanced peers before being prepared to take the GED
exam. In contrast, students with higher prior reading and math scores tended to use Core Skills
Mastery more than students with lower prior achievement scores (based on both prior math and reading scores).
Table 7. Total Hours of Product Use, by Prior Math Achievement

| Product (Sites Included) | Prior Math Score | N | Med. | Mean | SD | Min | Max |
| ALEKS (Site 1, Site 2) | Below median | 70 | 21.8 | 26.4 | 24.8 | 0 | 153.3 |
| | At or above median | 74 | 20.2 | 29.3 | 32.9 | 0.03 | 205 |
| Core Skills Mastery (Site 3, Site 4, Site 5) | Below median | 100 | 9.0 | 26.4 | 38.8 | 0 | 188.1 |
| | At or above median | 100 | 29.45 | 50.48 | 49.0 | 0 | 272.8 |
| GED Academy (Site 6, Site 7, Site 8) | Below median | 135 | 5.7 | 13.2 | 19.0 | 0 | 106.7 |
| | At or above median | 116 | 4.3 | 15.8 | 24.1 | 0 | 134.1 |
| MyFoundationsLab (Site 9, Site 10, Site 11) | Below median | 295 | 13.6 | 25.8 | 44.2 | 0.1 | 547.2 |
| | At or above median | 235 | 10.0 | 21.3 | 38.3 | 0 | 340 |
Note: Because Reading Horizons is a literacy product, it was not included in these analyses.
Table 8. Total Hours of Product Use, by Prior Reading Achievement

| Product (Sites Included) | Prior Reading Score | N | Med. | Mean | SD | Min | Max |
| Core Skills Mastery (Site 3, Site 4, Site 5) | Below median | 89 | 14.7 | 32.3 | 41.5 | 0 | 162.4 |
| | At or above median | 88 | 32.0 | 52.0 | 50.6 | 0 | 272.8 |
| GED Academy (Site 6, Site 7, Site 8) | Below median | 109 | 7.4 | 19.5 | 26.1 | 0 | 134.1 |
| | At or above median | 103 | 2.8 | 11.1 | 18.0 | 0 | 87.0 |
| MyFoundationsLab (Site 9, Site 10, Site 11) | Below median | 222 | 13.3 | 23.9 | 44.4 | 0.1 | 547.2 |
| | At or above median | 240 | 9.3 | 20.6 | 37.49 | 0 | 340 |
| Reading Horizons (Site 12, Site 13, Site 14) | Below median | 101 | 6.6 | 8.7 | 8.3 | 0 | 23.9 |
| | At or above median | 98 | 3.0 | 5.4 | 5.9 | 0 | 23.7 |
Note: ALEKS, a math product, was not included in these analyses because a score for prior reading ability was missing for a
majority of the students.
We also investigated the extent to which students used the products outside regular class
hours. Part of the promise of instructional technology in ABE programs is that it can extend
instructional hours by providing students with access to quality learning opportunities anytime
and anywhere (assuming students have access to devices and broadband Internet).
At the start of the study, an open question for us, and for many of the ABE program staff we
interviewed, was whether students would use the products on their own outside regular class
time. Most students enrolled in ABE programs have many demands on their time besides their
coursework, including family and, for many, one or more full- or part-time jobs. In addition,
several ABE administrators were concerned that students would not be able to access the
software because of a lack of home access to working computers or inadequate Internet
connectivity. In fact, almost half the instructors surveyed (46%) reported that the students’ lack
of access to the products at home limited their potential for improving student outcomes. About
25% of students surveyed reported that they did not use the products at home because they did
not have access to a computer or compatible mobile device (only 5% cited lack of Internet
connectivity). As a result, although many program sites strongly encouraged students to use the
products outside regularly scheduled class time, they did not mandate it.
However, evidence from the student survey and an analysis of system use data from two of the
products revealed that students’ off-hour use was fairly significant, albeit varying by product and
site. On the survey, 65% of students reported using the product outside regular class time,
ranging from 40% of students in ABE programs using Reading Horizons Elevate to 86% of
students in programs using Core Skills Mastery (Table 9).
Table 9. Product Use Outside Regular Class Time, as Reported in Student Survey
Table 10. Product Use Outside Regular Class Time, as Calculated from Product
System Data
The differences in the time students used Core Skills Mastery and Reading Horizons Elevate
outside class were probably associated with the intended role of the product, how it was used at
a site, and whether the ABE instructors expected students to use it outside class hours. For
example, the highest median off-hour use across the sites using Core Skills Mastery, a
complete program of math instruction, was where use outside regular class time was expected,
supported, and rewarded: a median of 19 and 56 hours for Site 3 and Site 4, respectively. In
contrast, the off-hour use of Reading Horizons was relatively modest for the median student,
ranging from 0 hours to 0.5 hour. This low off-hour use was probably related to the intended role
of Reading Horizons in the curriculum and the expectations that program sites had for students’
external use of it. Reading Horizons was designed for use in a blended or hybrid model, with
instructors using it as a discrete instructional activity for building foundational literacy skills
within a broader reading curriculum. It is not a comprehensive literacy curriculum or online
reading course and was not meant to be. While Reading Horizons can be used at home and
instructors encouraged this, the ABE sites in this study did not require that students use it
outside class time and did not have strong expectations that they would.
In general, instructors and students found value in using the products and believed they
had some benefit to instruction, student confidence, and student learning. Instructors
were relatively positive about their experiences using the digital learning tools with their
students. Many reported they felt they were better able to support students with a range of skill
levels because of the individualized instruction the products provided. On the survey, a majority
of instructors reported they would recommend the products to colleagues (83%) and would like
to use the product in future courses (78%).
A clear majority of the instructors surveyed reported that the products helped them improve the
instruction they offered. Almost 90% agreed the products helped them identify struggling
students (88%), provided immediate feedback to students (88%), and allowed students to
progress at their own pace (91%). Slightly fewer instructors, but still a significant majority,
reported that the online products helped them differentiate the content they provided students
based on individual student needs (79%).
Of the numerous challenges instructors of ABE classes face in trying to deliver effective
instruction, perhaps the most significant is adjusting to the wide variation in their students’ skills.
Some programs used placement tests to assign students to classes based on their incoming
skill levels to reduce the range of skills in a given classroom. While this “leveling” by classes
helped to some degree, it did not solve the problem, with instructors still finding it difficult to
support all students. The value of using digital learning products reported by many instructors
was the products’ ability to provide instruction that was differentiated and targeted to an
individual student’s current skill or understanding.
Many, but not all, of the students interviewed reported that they enjoyed the experience of
learning independently with the products, appreciating that they could make mistakes and
struggle in private and receive immediate feedback. They also liked the opportunity to learn at
their own pace rather than at the pace of the class, which may have been slower or faster than
they were comfortable with. Fifty-nine percent of students reported that the products gave them
the confidence they could learn new things on their own, while 50% reported that they had more
confidence in their ability to read or do math. Eight in 10 students reported they would
recommend the product they used to other students.
The majority of instructors reported favorably on their experience using the products, but
challenges were noted, such as some products’ insufficient scaffolding to support struggling
learners, content reading levels that may have been too difficult for some students, and some
students’ resistance to using online learning technologies in place of instruction delivered by
their teacher in a classroom of their peers. We believe these challenges are probably relevant to
many product developers and ABE program sites considering adopting online technologies to
support the learning of low-skilled adults.
Some students may experience anxiety and be resistant to using the product during the
transition to independent online learning environments. Although in interviews and surveys
many students expressed satisfaction with the products they used and felt they benefited from
their use, about 20% of students reported they did not. For these students, the transition to
independent online learning may take longer and require more support from instructors; some
students might never make the transition. An instructor from Site 10 described her own class’s
transition to using MyFoundationsLab as the primary mode of instruction:
Overall a lot of them are very overwhelmed at first. It’s a lot to do in a little bit of
time…. Once we kind of get going, and they see that I’m there to help, that they get
one-on-one attention, by the end of class those that stay [enrolled] tell me how
helpful it is.
However, the same instructor reported that some of her older students showed the greatest
resistance: “Older students seemed to feel like they had been shoved into a computer class and
they were not there to have a computer class…[they] felt like it was the college’s way of blowing
them off.”
In the second approach, we analyzed the relationship between the intensity of use and students’
performance on a standardized posttest measure for those students who used the product. The
question examined with this analysis was: Did students who used the products more frequently
show greater gains in learning outcomes and skill development than students who used the
products less often?
We attempted to isolate the effect of the products on learning outcomes from other factors by
controlling for factors in our models that may have been associated with better test scores and
that were external to the use of the products, such as students’ age, gender, and incoming skill
levels. A detailed explanation of the analytical models and tables with results are provided in
Appendix A.
6
Impacts were estimated for only 6 of the 14 pilot sites because (1) an insufficient number of eligible students were available for
analysis (5 sites), (2) sites that provided grade equivalence scores for TABE failed to provide information about the level of test used
for the pretest and posttest (2 sites), or (3) the site did not have a viable comparison group available because the product was
implemented in a new course (1 site). Students in courses that used products were included in the impact analysis if they used the
products for 10 or more hours based on usage computed from the products’ back end data provided by the vendors. For a site to be
included in the impact analysis, we needed to identify at least 25 eligible students in both the user and nonuser groups. Five
sites had too few eligible students due to (1) low initial enrollments and completions, (2) insufficient use of the product (less than 10
hours), or (3) missing scores on pretest and/or posttest achievement measures.
We used two sources of data to measure gains in student literacy and math skills. The primary
and most complete data set on learning was from the ABE programs’ own student records.
Except for Site 10, all institutions had a policy of testing students when they entered the
program (pretest) and when they exited the program or after a period of instructional hours as
required by federal reporting guidelines (e.g., after 40 hours of instruction). The most prevalent
assessment ABE programs used for state and federal accountability purposes was the Test of
Adult Basic Education (TABE; http://www.datarecognitioncorp.com/Assessment-
Solutions/Pages/TABE.aspx). Pretest and posttest TABE scores were available for 8 of the 10
sites included in the analyses. Scores on the STAR assessment were available and analyzed
for Site 12, and scores on the Comprehensive Adult Student Assessment Systems (CASAS;
https://www.casas.org/home) were available for students enrolled at Site 11 campuses.7
7
Scale scores were analyzed whenever available from sites. However, for 4 of the 10 sites only grade-equivalent scores were
provided and analyzed (Site 2, Site 3, Site 5, and Site 14).
Effects based on comparative impact designs. Even though the designs applied were the
most rigorous available, they could not completely isolate the impacts of the digital learning
technologies from other aspects of the use models and learning environments that might also
affect learning, such as differences in instructor quality. The comparative quasi-experimental
designs we used to collect evidence on product impacts are described in Appendix A (Section
A4). For each impact estimated, although the comparison group (nontechnology users) may
have been similar in many ways to the group of students using the product, important
differences between the two groups may still have existed (e.g., differences in curriculum,
instructor capacity, and unobserved differences in the characteristics of the students). These
existing differences may explain differences between the groups on the posttest above and
beyond any effect due to the use of the product. Thus, because we cannot completely isolate
the effect of the introduction of a product in a curriculum from other key differences between the
product user and nonuser groups, we cannot be sure the estimated impact was caused by the
use of the product alone.
Finally, the impacts estimated are based on measures of academic cognitive skills only,
assessed through the administration of comprehensive standardized tests. Use of the products
may have impacted other skills and attitudes of importance to students, ABE sites, employers,
and the product vendors (such as students’ digital literacy skills and confidence they can
acquire academic skills and can use digital resources to learn independently), but these were
not measured reliably or consistently across students and were not the primary focus of this
research.
Examining the relationship between use and student outcomes. In these analyses,
we examined the degree to which time spent using a product was related to student performance
on standardized measures of achievement (see Appendix A, Section A5, for details). Although
these models can help indicate whether a relationship between use and learning outcomes
exists, they cannot be used to establish, with any level of confidence, whether product use
caused better student learning outcomes. There are multiple plausible explanations for any of
the reported associations. The findings should be treated as exploratory and positive
associations as promising but not definitive evidence of a causal connection between greater
product use and improved learning and skill development.
To estimate the impacts of product use on student learning, we compared the scores on a
learning assessment for students who used a product and students who did not. Propensity
score matching was used, a technique for improving the baseline equivalence of the groups that
were compared (see details about the models in Appendix A). Only in Site 14 were we unable to
adjust for initial baseline nonequivalence well enough to meet standards for impact
estimation. As a result, that site was dropped from this analysis. In Site 7 and Site 8, the sample
sizes were too small to estimate an impact because of a combination of low enrollments,
missing test scores, and low use of the products (less than 10 hours of use). For Site 10
(MyFoundationsLab), the impacts estimated were based on comparing matriculation and pass
rates for an entry-level credit-bearing English course that followed the noncredit developmental
education course the products were used in (following program policy, no posttest was
administered in the developmental education course). Because few students in our sample
matriculated to the credit-bearing course, the sample available for estimating impacts was not
sufficient, so impact results for Site 10 also are unavailable. Finally, we requested scale scores
for each test administered from all program sites, but a few sites using the TABE could provide
only grade-equivalent scores. However, because grade-equivalent scores are not comparable
across different test levels (TABE uses separate forms to assess students at different skills
levels), two sites, Site 2 and Site 5, were eventually excluded from the analyses because they
could not provide information on the test form used for the pretest and posttest.
To aid in interpreting the differences in test scores across sites, tests, grade levels, and subject
areas, we report the difference in adjusted mean scores as a standardized effect size. An effect
size expresses the difference between two mean scores in terms of how spread out the scores
are. (Technically, the effect size is expressed in terms of standard deviations of outcome
scores.)8 An effect size of 0.3, for example, means that one group on average scored 0.3
standard deviation higher than the other group. This would apply whether the scale of the test
score were 0 to 100, 150 to 600, or any other measure. That is, an effect size of 0.3 would
essentially represent the same magnitude of difference regardless of the underlying point
8
An effect size is commonly computed by taking the mean difference in test scores between the treatment and comparison groups
and dividing it by the pooled standard deviation for the total sample (treatment and comparison students combined).
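Restating the footnote’s verbal definition as formulas may help some readers. The explicit pooled-standard-deviation expression below is the conventional one and is our assumption (the report defines it only in words); the small-sample correction factor ω is the one cited later in Appendix A.

```latex
% Effect size as described in the text: adjusted mean difference divided by the pooled SD,
% with the small-sample correction applied to obtain Hedges' g.
\[
g \;=\; \frac{\bar{X}_{T} - \bar{X}_{C}}{SD_{pooled}},
\qquad
SD_{pooled} \;=\; \sqrt{\frac{(n_T - 1)\,s_T^{2} + (n_C - 1)\,s_C^{2}}{n_T + n_C - 2}},
\qquad
g_{corrected} \;=\; \omega\, g, \quad \omega = 1 - \frac{3}{4N - 9}.
\]
```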
In addition to reporting an effect size for each site and outcome measure analyzed, we provide
the 95% confidence interval to give a sense of the precision and the uncertainty of an estimate.
The confidence interval describes the probability (95%) that the true impact lies somewhere
within the interval if we were to rerun the study with a different sample of schools, instructors,
and students within these sites. Confidence intervals rather than point estimates alone are often
preferred by researchers because they include information about the uncertainty of the point
estimate. Every value in the confidence interval is a plausible value for the effect. If zero is in the
interval, the null hypothesis of a zero effect cannot be rejected. In general, the larger the sample
size, the greater the precision of the point estimate and the narrower the confidence interval.
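As a concrete illustration of this interval logic, the short Python sketch below computes a Hedges’ g and an approximate 95% confidence interval from group summary statistics. The standard-error formula is the usual large-sample approximation for a standardized mean difference (not something reported by the authors), and the input numbers are hypothetical.

```python
import math

def hedges_g_ci(mean_t, mean_c, sd_t, sd_c, n_t, n_c, z=1.96):
    """Hedges' g with an approximate 95% confidence interval.

    Uses the conventional pooled-SD formula, the small-sample correction
    omega = 1 - 3/(4N - 9), and a standard large-sample variance
    approximation for a standardized mean difference.
    """
    n = n_t + n_c
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n - 2))
    d = (mean_t - mean_c) / sd_pooled
    omega = 1 - 3 / (4 * n - 9)              # small-sample correction
    g = omega * d
    se = math.sqrt(n / (n_t * n_c) + g**2 / (2 * (n - 2)))
    return g, (g - z * se, g + z * se)

# Hypothetical example: if the interval includes zero, the estimated effect is
# not statistically significant at the 5% level.
g, (lo, hi) = hedges_g_ci(mean_t=525.0, mean_c=515.0, sd_t=60.0, sd_c=62.0, n_t=70, n_c=75)
print(round(g, 2), (round(lo, 2), round(hi, 2)))
```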
Table 11 shows the results of the impact analyses for learning outcomes. It presents the
adjusted differences in the gains between groups as well as estimated effect sizes (Hedges’ g)
and confidence intervals. Statistically significant results are in boldface. Figures 2 and 3 indicate
how the distribution of the effects varies across products and program sites, in descending order
from positive to negative, for reading and language and math. The width of the bars represents
the minimum and maximum of the 95% confidence interval. When the interval includes an effect
size of zero, a no-effect result cannot be ruled out and the estimated impact is
considered not statistically significant.
Table 11. Estimated Impacts of Product Use on Learning Outcomes

| Product / Site | Test | Condition | N | Pretest Mean (SD) | Posttest Mean (SD) | Adjusted Mean Difference | Effect Size (Hedges’ g) | Effect Size 95% CI |
| MyFoundationsLab, Site 9 | TABE Math | Control | 90 | 490.6 (60.7) | 522.5 (63.8) | -5.37 | -0.08 | [-0.39, 0.23] |
| | TABE Math | Treatment | 73 | 494.2 (56.19) | 519.8 (61.7) | | | |
| | TABE Reading | Control | 89 | 549.9 (71.7) | 555.1 (61.1) | -18.21 | -0.3 | [-0.61, 0.01] |
| | TABE Reading | Treatment | 70 | 546.8 (72.0) | 536.5 (60.2) | | | |
| | TABE Language | Control | 91 | 537.3 (51.7) | 551.8 (60.3) | -8.04 | -0.13 | [-0.44, 0.18] |
| | TABE Language | Treatment | 75 | 531.9 (74.2) | 540.4 (67.5) | | | |
| MyFoundationsLab, Site 11 | CASAS Math | Control | 176 | 218.7 (9.8) | 225.8 (10.6) | -0.72 | -0.07 | [-0.34, 0.2] |
| | CASAS Math | Treatment | 76 | 220.6 (10.0) | 226.4 (9.3) | | | |
| | CASAS Reading | Control | 161 | 231.4 (10.0) | 238 (10.2) | -0.26 | -0.03 | [-0.38, 0.32] |
| | CASAS Reading | Treatment | 36 | 231.6 (9.8) | 237.9 (8.96) | | | |
| Reading Horizons Elevate, Site 12 | STAR Winter | Control | 38 | 523.7 (110.9) | 552.5 (143.9) | 29.51 | 0.19 | [-0.24, 0.62] |
| | STAR Winter | Treatment | 43 | 516.3 (129.0) | 557.0 (165.5) | | | |
| | STAR Spring | Control | 60 | 540.4 (97.3) | 711.5 (230.2) | -96.82 | -0.49 | [-0.9, -0.08] |
| | STAR Spring | Treatment | 40 | 528.9 (114.1) | 596.7 (156.5) | | | |
The results of the analyses are shown in Table 12, including the direction of the relationship
between the intensity of product use and students’ posttest scores. Overall, the results were
inconclusive but tended to be more positive than negative: greater use of the products was
generally associated with better gains in student test scores. Of the 19 estimated relationships
linking use levels to learning outcomes, 12 were positive and 7 were negative. Only two of these
relationships were statistically significant.
9
Impacts were estimated for only 6 of the 14 sites because (1) an insufficient number of eligible students were available for analysis
(5 sites), (2) sites that provided grade equivalence scores for TABE failed to provide information about the level of test used for the
pretest and posttest (2 sites), or (3) the site did not have a viable comparison group available because the product was implemented
in a new course (1 site). Students in courses that used products were included in the impact analysis if they used the products for 10
or more hours based on usage computed from the products’ back end data provided by the vendors. For a site to be included in the
impact analysis, we needed to identify at least 25 eligible students in both the user and nonuser groups. Five sites had too few
eligible students due to (1) low initial enrollments and completions, (2) insufficient use of the product (less than 10 hours), or (3)
missing scores on pretest and/or posttest achievement measures.
Table 12. Relationship Between Duration of Use per Day and Scores on Learning
Assessment
Overall, programs, instructors, and students found value in the digital learning technologies they
used in the study. Instructors reported that the use of the products enabled them to differentiate
instruction to fill gaps in basic literacy and math skills across a wide range of students in ways
that were not possible without the products. In addition, a majority of students, but not all,
reported that they enjoyed using the products and that the products helped them improve their
math and reading skills and gave them confidence they could use online resources to learn on
their own without an instructor’s direct involvement. A majority of students also reported that
they used the products to continue learning outside the regularly scheduled class time.
The significance of these findings should not be underestimated. Many of the students enrolled
in ABE programs have had little prior success developing their basic skills in formal education
environments. This was probably the first time that many of them had used learning
technologies in a meaningful way. Given the size of the population in need of the kinds of
services ABE programs provide, these findings indicate that learning technologies like those in
this study can be part of the solution, helping ABE programs and instructors do what they do
better and providing many adults with the confidence that they can use online resources on their
own time and at their own pace, inside and outside a formal ABE program.
However, this research did uncover challenges in using learning technologies with low-skilled
adults in ABE programs. Use of the products at several sites was well below what had been expected.
The primary challenge facing ABE programs, instructors, and product developers is how to
engage and support all students in online learning environments. While a majority of students
reported they enjoyed using the products, would recommend them to their peers, and thought
they benefited from using the products in important ways, 1 in 5 students did not. In general,
these students reported that they preferred working directly with instructors over learning online
and with technology. This finding will not be a surprise to anyone who has spent time with these
students. A majority of them, all over age 18 and struggling to read and do math at the fourth- to
ninth-grade level, have had difficulty learning throughout their lives. Several instructors
interviewed said they thought the reading level of several of the products was too difficult for
many of their students at the low end of the reading spectrum. Further, depending on the
learning scaffolds embedded in the products, the immediacy of instructors’ support, and
students’ ability to seek help when needed, some students may become stuck in a digital
learning environment and experience frustration. Product developers and ABE programs and
instructors must be aware that without the proper design features, supports, and monitoring in
place, a certain percentage of the most vulnerable students will struggle in any learning
environment. Products and the ABE programs that implement them must be designed to identify
these students before they enter the digital learning environment and provide appropriate
support once they start using the product.
This research does suggest that under the right conditions, ABE programs can effectively
integrate learning technology products into their curriculum and that most students will use them
for a significant amount of time on and off site and will have a positive experience. Greater use
occurred when ABE program sites and instructors were committed to using the products as a regular part of core instruction rather than as an optional add-on activity.
An initial set of recommendations for ABE program administrators, instructors, and product
developers, based on these research findings, follows.
• To ensure that students spend sufficient time on the products and make adequate
progress, commit to using the products as a regular part of core instruction (not as an
add-on activity) and make use mandatory and consequential.
• To support product use outside scheduled class time, help students take advantage of
federal, state, and local programs providing low-cost devices and Internet access and
make sure all students know how and where they can obtain devices and connectivity on
and off site (e.g., public libraries, workplaces, and community resource centers). In
addition, provide incentives for off-hour use.
• To help ensure instructors commit to using the products, provide adequate time for
training, planning, and piloting to ensure better integration of the products into the
curriculum and the instructors’ own practices.
• Prepare to offer students who are struggling with the transition to online learning
additional monitoring and support, including a more gradual ramp-up time on the
products and alternative instructional activities during the transition. Plan for the
likelihood that some students will not want to make a transition to digital instruction.
• To ensure that all students can access the instructional content, particularly struggling
readers, scaffold the text with audio and video presentations.
• To support blended learning models and to keep instructors invested in students’ work in
the online environment, make the content modular so that programs and instructors can
better integrate product use into the existing curriculum and with direct instruction.
Teachers like the fact that the products’ instruction is individualized and allows students
to work at their own pace on the skills they need. Yet many teachers wishing to
implement blended instruction feel disconnected from what students are working on
when the product’s instructional content is not the same as what is being covered in the
classroom. These teachers are less likely to be invested in the use of the technology and
are more likely to resort to potentially less effective hybrid and supplemental models of
use.
• To help motivate instructors and students to use a product, make sure the content is
aligned with all current ABE standards and competency exams.
• Provide sites with a variety of models of use to support a range of student types and
program goals. Most students can learn online and independently with proper
monitoring, coaching, and motivating factors.
This research represents an initial step in exploring how digital learning products might support
the goals of ABE programs and their students. Given the wide variety of skills of the adult
learners and the different ABE goals and resources, more rigorous research is needed to
understand which product features and aspects of online, blended, and hybrid use models are
the most feasible and the most effective for ABE programs. Digital learning technologies like
those selected for this study, although not the solution for all ABE program needs, can be an
important support for programs and instructors in expanding access to basic skills instruction
and improving outcomes for low-skilled adults.
To maintain instructor and student data confidentiality, the data were archived in a secure file
server at SRI to which only a limited number of approved SRI analysts had access. An SRI staff
member not otherwise involved in the project anonymized the data files before releasing them to
the analysis team by substituting the site-specific ID or personally identifiable information with a
consistently formatted SRI-generated ID number.
We ran separate models for the five products in the study. The total time students were logged
in to the product, measured in hours, was the dependent variable. For two products, MFL and
ALEKS, outliers were recoded so they did not exceed a maximum of 5 hours (300 minutes) per
day. Given the distribution of the dependent variable, we determined that a multiple linear
regression was not suitable. A Poisson regression model was considered. However, given
evidence of overdispersion in the dependent variable, we ultimately selected a negative
binomial regression to model the dependent variable. When provided by the program site, we
controlled for age, gender, and prior achievement. We also specified robust standard errors. For
purposes of interpretation, age was centered around the median and prior achievement was
centered around the mean. The output was reported in incidence rate ratios. Both univariate
and multivariate models were run, univariate models to determine whether there was a
statistical difference in product use with respect to a single variable and multivariate models to
determine which variables were most significant.
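The report does not include analysis code; the sketch below is a minimal Python/statsmodels illustration of the modeling approach just described (total hours of use as the outcome, a negative binomial specification with robust standard errors, and coefficients exponentiated to incidence rate ratios). The data file and variable names are hypothetical, and the authors’ actual pipeline may have differed.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per student with total hours of product use,
# an indicator for female, age, and a prior math achievement score.
df = pd.read_csv("product_use.csv")

df["hours"] = df["hours"].round()                               # treat hours as a count outcome
df["age_centered"] = df["age"] - df["age"].median()             # center age at the median
df["premath_centered"] = df["premath"] - df["premath"].mean()   # center prior score at the mean

# Negative binomial GLM for total hours of use, with robust (sandwich) standard errors,
# chosen here because the outcome shows overdispersion relative to a Poisson model.
model = smf.glm(
    "hours ~ female + age_centered + premath_centered",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit(cov_type="HC1")

# Exponentiated coefficients are incidence rate ratios (e.g., an IRR of 2.0 for `female`
# would mean roughly twice as many hours of use, holding the other variables fixed).
print(np.exp(model.params))
```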
Tables A-3 to A-7 show the results for the negative binomial regression models by product.
Table A-4. Core Skills Mastery: Results of Student-Level Predictors of Product Use (Total Hours)

| Variable | (1) Univariate Model with Gender | (2) Univariate Model with Age | (3) Univariate Model with Pre-math Score | (4) Univariate Model with Pre-reading Score | (5) Multivariate Model with Pre-math Score | (6) Multivariate Model with Pre-reading Score |
| Female | 2.499*** (0.409) | | | | 1.724** (0.431) | 1.995*** (0.524) |
| Age (median split) | | 1.017 (0.0135) | | | 1.009 (0.0132) | 1.019 (0.0152) |

Note: Coefficients are incidence rate ratios; values in parentheses are standard errors.
This section describes the preparation and modeling of academic achievement data from the
study sites.
1. We limited the sample to students who used the product for 10 or more hours during the
period of the study. The rationale was to ensure that we were estimating the impacts of
product use on learning for students who used the product for a relatively significant
period of time—at least half the total time specified by the research team (20 hours).
2. Students with complete demographic information (i.e., gender and age) and academic
achievement information (e.g., TABE scores before product use and post-TABE scores)
were included in the analysis. This enabled us to find a matched and balanced
comparison group to estimate impacts on student academic achievement.
3. We further limited the sample to students who took the pretest within 14 days of using
the product the first time. When information was available, a maximum period of 14 days
between the pretest and the student’s program or product start date (whichever was
available) was included as an exclusionary criterion in the analysis. By adding this
constraint, we attempted to standardize the timing of the administration of the prior
achievement measure (e.g., TABE scores before the start of the program) and the
amount of instructional time between the pretest and posttest so that it was comparable
across students and across the treatment and comparison groups.
Table A-8 indicates the extent to which the program site location, instructors, curriculum, and
instructional time were the same or different for treatment and control students. When data were
available, the number of campuses and instructors for each group are shown.
Table A-8. Similarities Between Treatment and Control Groups in Site Location,
Instructors, Curriculum, and Instructional Time
| Product | Site | Campus | Instructors | Course or Curriculum | Instructional Time |
| ALEKS | Site 1 | Same | Same/different (4 treatment; 4 comparison; 2 instructors the same) | Same | Same |
| GED Academy | Site 6 | Same/different (6 treatment; 25 comparison; 2 campuses the same) | Different (6 treatment; 16 comparison) | Same | Same |
| MyFoundationsLab | Site 9 | Same | Different (6 treatment; unknown multiple comparison) | Different | Same |
| MyFoundationsLab | Site 11 | Same | Different (unknown multiple treatment; unknown multiple comparison) | Different | Different |
| Reading Horizons | Site 12 | Same | Different | Different | Different |
To assess whether students in the treatment and comparison groups were similar, we took
students with complete achievement data and background information and examined the
equivalence of their pretest scores, elapsed time (days) between pretest and posttest, age, and
gender. We used the What Works Clearinghouse Procedures and Standards Handbook
(version 3) to guide the analysis.10 When a difference between treatment and comparison
students for any baseline measure had an effect size greater than 0.25, propensity score
matching was used to improve the equivalence between the groups. We used R MatchIt to
implement propensity score matching.11 Specifically, we used nearest neighbor matching and
matching with replacement to select the best comparison matches for each student in the
treatment group. Logistic regression models were used to estimate the propensity score,
defined as the probability of receiving treatment, conditional on the student characteristics. After
matching with replacement, the R package generated the weights to account for the frequency
with which each control student was used as a match to students in the treatment group.
General linear modeling (linear regression) was used for analysis, with weights generated by
MatchIt. The regression coefficient, p-value, effect size, and confidence interval of each effect
size were reported.12 The results of the model are shown in Table A-9.
10
For details, see page 15 in the What Works Clearinghouse Procedures and Standards Handbook version 3.0
(https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_procedures_v3_0_standards_handbook.pdf)
11
D. Ho, K. Imai, G. King, & E. Stuart. (2007). Matching as Nonparametric Preprocessing for Reducing Model Dependence in
Parametric Causal Inference. Political Analysis, 15, 199–236.
12
An unbiased effect size estimate corrected for small sample bias was calculated by multiplying the Hedges’ g by a factor of ω = [1
– 3/(4N – 9)], as suggested by the What Works Clearinghouse standards.
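The authors implemented matching with the R MatchIt package (see footnote 11); as a rough illustration of the same workflow, the Python sketch below estimates propensity scores with logistic regression, performs nearest-neighbor matching with replacement, builds frequency weights for reused comparison students, and fits a weighted regression on the posttest. All file and column names are hypothetical, and this is not the authors’ code.

```python
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("site_students.csv")  # hypothetical file: treated flag, covariates, pre/post scores

# 1. Propensity score: probability of being in the product-user group given baseline covariates.
covars = ["pretest", "age", "female", "days_pre_to_post"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["treated"])
df["pscore"] = ps_model.predict_proba(df[covars])[:, 1]

# 2. Nearest-neighbor matching with replacement on the propensity score.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0].reset_index(drop=True)
matches = [
    (control["pscore"] - p).abs().idxmin()   # index of the closest comparison student
    for p in treated["pscore"]
]

# 3. Weights: each treated student gets weight 1; each comparison student is weighted
#    by how many times it was selected as a match.
control_weights = pd.Series(matches).value_counts()
control = control.assign(weight=control.index.map(control_weights).fillna(0))
treated = treated.assign(weight=1.0)
analysis = pd.concat([treated, control[control["weight"] > 0]])

# 4. Weighted linear regression of the posttest on treatment status and baseline covariates.
wls = smf.wls("posttest ~ treated + pretest + age + female",
              data=analysis, weights=analysis["weight"]).fit()
print(wls.params["treated"], wls.pvalues["treated"])
```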
Table A-9. Estimated Impacts of Product Use on Learning Outcomes, by Product and Site

| Product / Site | Test | Condition | N | Pretest Mean (SD) | Posttest Mean (SD) | Adjusted Mean Difference | Effect Size (Hedges’ g) | Effect Size 95% CI |
| ALEKS, Site 1 | TABE Math | Comparison (prior cohort) | 54 | 493.6 (30.4) | 523.9 (40.7) | 11.6 | 0.28 | [-0.09, 0.65] |
| | TABE Math | Treatment | 53 | 499.3 (41.8) | 539.7 (41.4) | | | |
| Core Skills Mastery, Site 3 | TABE Math (grade equivalent) | Comparison (prior cohort) | 33 | 9.3 (2.7) | 9.3 (2.9) | 1.28 | 0.48 | [0.05, 0.91] |
| | TABE Math (grade equivalent) | Treatment | 67 | 9.2 (2.6) | 10.3 (2.5) | | | |
| | TABE Read (grade equivalent) | Comparison (prior cohort) | 42 | 8.9 (2.4) | 9.2 (2.6) | 0.24 | -0.17 | [0.64, 0.24] |
| | TABE Read (grade equivalent) | Treatment | 52 | 9.0 (2.6) | 9.3 (2.7) | | | |
| GED Academy, Site 6 | TABE Math | Comparison (concurrent) | 46 | 488.7 (62.0) | 527.7 (45.5) | -5.24 | -0.11 | [-0.54, 0.32] |
| | TABE Math | Treatment | 40 | 491.6 (52.4) | 525.9 (47.3) | | | |
| | TABE Reading | Comparison (concurrent) | 41 | 528.3 (58.5) | 532.5 (47.6) | 7.67 | 0.16 | [-0.29, 0.61] |
| | TABE Reading | Treatment | 33 | 541.5 (55.7) | 548.1 (44.3) | | | |
| | TABE Language | Comparison (concurrent) | 45 | 489.1 (68.8) | 512.8 (51.4) | 8.9 | 0.16 | [-0.29, 0.61] |
| | TABE Language | Treatment | 35 | 498.2 (64.7) | 524.2 (58.7) | | | |
| MyFoundationsLab, Site 9 | TABE Math | Comparison (prior cohort) | 90 | 490.6 (60.7) | 522.5 (63.8) | -5.37 | -0.08 | [-0.39, 0.23] |
| | TABE Math | Treatment | 73 | 494.2 (56.19) | 519.8 (61.7) | | | |
| | TABE Reading | Comparison (prior cohort) | 89 | 549.9 (71.7) | 555.1 (61.1) | -18.21 | -0.3 | [-0.61, 0.01] |
| | TABE Reading | Treatment | 70 | 546.8 (72.0) | 536.5 (60.2) | | | |
| | TABE Language | Comparison (prior cohort) | 91 | 537.3 (51.7) | 551.8 (60.3) | -8.04 | -0.13 | [-0.44, 0.18] |
| | TABE Language | Treatment | 75 | 531.9 (74.2) | 540.4 (67.5) | | | |
| MyFoundationsLab, Site 11 | CASAS Math | Comparison (prior cohort) | 176 | 218.7 (9.8) | 225.8 (10.6) | -0.72 | -0.07 | [-0.34, 0.2] |
| | CASAS Math | Treatment | 76 | 220.6 (10.0) | 226.4 (9.3) | | | |
| | CASAS Reading | Comparison (prior cohort) | 161 | 231.4 (10.0) | 238 (10.2) | -0.26 | -0.03 | [-0.38, 0.32] |
| | CASAS Reading | Treatment | 36 | 231.6 (9.8) | 237.9 (8.96) | | | |
| Reading Horizons Elevate, Site 12 | STAR Winter | Comparison (concurrent) | 38 | 523.7 (110.9) | 552.5 (143.9) | 29.51 | 0.19 | [-0.24, 0.62] |
| | STAR Winter | Treatment | 43 | 516.3 (129.0) | 557.0 (165.5) | | | |
| | STAR Spring | Comparison (concurrent) | 60 | 540.4 (97.3) | 711.5 (230.2) | -96.82 | -0.49 | [-0.9, -0.08] |
| | STAR Spring | Treatment | 40 | 528.9 (114.1) | 596.7 (156.5) | | | |
A.5. Examining the Relationship Between Use and Basic Skill Outcomes
This section describes the modeling of system use data from each of the product vendors to
examine the relationship between product use and academic outcomes. Any student who used
the product and for whom vendors provided system use data was included in the statistical
analysis. A general linear model (linear regression) was used with academic achievement as
the dependent variable.
To create a measure of product use, we first examined three variables: time spent on the
product (hours), number of days with at least one log-in, and a combination of the two variables
(average hours spent on the product per day). There was high correlation among the three
variables. In examining the relationships between the three variables and the posttest scores,
we found that average hours spent on the product per day had the highest correlation with the
test scores across the study sites. To avoid collinearity issues in the linear regression model, we
decided to use the average hours spent on the product per day as the measure for the product
use for these analyses.
We controlled for pretest scores, age, gender, and elapsed time between pre- and post-test in
the linear regression model when these variables were available for a particular program site.
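A minimal sketch of this use-versus-outcome model follows, again in Python with hypothetical file and column names; it derives the use measure described above (average hours per log-in day) and regresses the posttest score on it with the listed covariates. It illustrates the described model and is not the authors’ code.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("use_and_scores.csv")  # hypothetical merged file of system-use data and test scores

# Use measure: average hours on the product per day with at least one log-in
# (total hours and log-in days were highly correlated, so only this ratio enters the model).
df["hours_per_day"] = df["total_hours"] / df["login_days"]

# Linear regression of the posttest on the use measure, controlling for pretest score,
# age, gender, and elapsed days between pretest and posttest where available.
ols = smf.ols("posttest ~ hours_per_day + pretest + age + female + days_pre_to_post",
              data=df).fit()
print(ols.params["hours_per_day"], ols.pvalues["hours_per_day"], ols.rsquared)
```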
Table A-10 shows the results of the linear regression models.
Table A-10. Results of Analysis of Relationship Between Product Use and Learning
Outcomes
| Product / Site | Test | Covariates | N | Beta | p-Value | Total R² |
| ALEKS, Site 1 | TABE Math | Pre-TABE Math, gender, age, days between pre- and post-tests | 56 | +0.06 | 0.28 | 0.60 |
| Core Skills Mastery, Site 3 | TABE Math (Grade Equivalent) | Pre-TABE Math, gender, days between pre- and post-tests | 66 | -0.00 | 0.30 | 0.62 |
| Core Skills Mastery, Site 3 | TABE Read (Grade Equivalent) | Pre-TABE Read, gender, days between pre- and post-tests | 58 | -0.01 | 0.05 | 0.60 |
| Core Skills Mastery, Site 4 | Education Skills Online Math | Pre-ESO Math, gender | 33 | +0.10 | 0.46 | 0.30 |
| Core Skills Mastery, Site 4 | Education Skills Online Reading | Pre-ESO Read, gender | 33 | -0.10 | 0.31 | 0.25 |
Beta represents the change in the outcome score for a one-unit change in the amount of product use (average
hours spent on the product per day), given that we control for other independent variables in the model, including a
student’s pre-test score, gender, age, and days between pre- and post-tests. For example, for site 1, for every unit
increase in the average hours students spent working on ALEKS per day, on average, we observed a 0.06 increase in
the post TABE Math score. Note that since the scales for different tests are not the same, the size of the Betas
associated with different tests cannot be compared.
Site Portrait
The overall mission of Site 1 in Central Colorado is “to provide quality educational experiences
that equip all students for success as parents, citizens, and workers.” The district’s adult and
family education program offers classes in several areas including Adult Basic Education (ABE),
Adult Secondary Education (ASE), and English as a Second Language (ESL) for students over
age 17. The district’s adult education also includes a Family Literacy Program that has four
components: parent time, parent and child time, adult education, and an age-appropriate
children’s literacy program. Parents taking adult education classes in either the ABE/ASE or
ESL tracks can participate in the Family Literacy Program.
Site 1 serves a population of mostly Latino students, but this is evolving as people of other
nationalities and refugee populations enroll. Learners typically range in age from 17 to 40, with a
median age of late 20s. Most students are underemployed or unemployed and have had 3 to
12-plus years of formal education, not necessarily in a U.S. school system. Most ESL students
in the ABE/ASE program enter with both low literacy and low numeracy.
During the study year, classes using ALEKS were offered at one school campus in Central
Colorado.
Site 1 operates from August through June, with rolling enrollment offered each month. The
ABE/ASE classes are designed to help students improve their language arts and mathematics skills.
The program uses scores on the TABE administered at enrollment to assign students to classes
at the appropriate skill level (0.0–3.9 range = low-level class; 4.0–8.9 range = medium-level
class; 9.0–12.9 range = high-level class). Only students in the medium-level classes
participated in the study, and they were taught by three instructors. The medium-level classes
were further split to create two sublevels, 4.0–5.9 range and 6.0–8.9 range, and students were
assigned courses based on these sublevels.
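As a small illustration of the placement rule just described (TABE grade-equivalent ranges mapped to class levels and the medium-level sublevels), here is a hedged Python sketch; the function name and return labels are ours, not the program’s.

```python
def assign_class_level(tabe_grade_equivalent: float) -> str:
    """Map a TABE grade-equivalent score to the class levels described for Site 1."""
    if 0.0 <= tabe_grade_equivalent <= 3.9:
        return "low"
    if 4.0 <= tabe_grade_equivalent <= 5.9:
        return "medium (4.0-5.9 sublevel)"
    if 6.0 <= tabe_grade_equivalent <= 8.9:
        return "medium (6.0-8.9 sublevel)"
    if 9.0 <= tabe_grade_equivalent <= 12.9:
        return "high"
    raise ValueError("score outside the 0.0-12.9 TABE grade-equivalent range")

print(assign_class_level(5.2))  # -> "medium (4.0-5.9 sublevel)"
```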
Use Model
The instructors used ALEKS to support their math classes using either a blended or hybrid
model. Classes met twice a week for a 3-hour period, totaling 6 hours per week. During each
class, students used ALEKS for 45 minutes to an hour. Of the three instructors whose classes
were observed, two used ALEKS in the last hour of the class and the third used it in
the first hour. All students were expected to work on ALEKS for 1.5 to 2 hours
during class and up to 2 hours outside class, for approximately 3–4 hours per week.
The two instructors who taught the 4.0–5.9 range classes covered the same content sequence
and pacing, and developed a common set of assessments for their classes. These instructors
also aligned the content of their direct instruction and the textbook (Achieving TABE Success
Level M) with the topics students were working on in ALEKS.
The instructors varied in how they integrated ALEKS into their direct instruction. For example,
the instructor teaching the 6.0–8.9 skill-level students assigned homework in ALEKS, to be
completed inside and outside class time. This instructor set a goal of 80% correct for each
student on the homework and reviewed the most difficult topics with the whole class to help
students toward that goal. One of the instructors teaching the 4.0–5.9 skill-level students also
assigned homework in ALEKS but did not set any specific performance requirements. While
most students worked on ALEKS independently, some also worked together in pairs or small
groups to solve problems. After the in-class ALEKS use, the instructor reviewed some selected
ALEKS problems with the whole class.
Students were expected to use ALEKS outside class, but the time they spent varied
substantially depending on their access to technology. Because technology and Internet access
were an issue for many students outside class, the program began to offer drop-in access to the
computer lab. Some students who were intrinsically motivated to use ALEKS outside class time
did find ways to access ALEKS on other devices or accommodate use in their busy schedules.
To increase variety in the types of instruction students experienced and to lessen the chance of
fatigue setting in during extended periods working within ALEKS, one instructor began to
incorporate breaks after every 15 minutes of ALEKS use (a practice recommended by the
study’s technical assistance provider, Mockingbird Education). During each break, students
were encouraged by the instructor to present to their peers a solution to an ALEKS problem
they recently solved.
Overall, the adult education program director’s, instructors’, and students’ feedback on their
experience using ALEKS was positive. According to the program director, instructors found the
ALEKS reports and dashboards on student progress useful, as well as the way the product
differentiated instruction for students of different skill levels. The director
felt that the blended learning model worked for their program, especially for the students in the
low-level class who needed more review, practice, and instructor support. The students
interviewed also said they enjoyed their experience using ALEKS but they thought they would
prefer a blended or hybrid learning model, combining both online and direct instruction, to
online-only learning.
As mentioned, one challenge instructors and administrators faced was getting students to use
ALEKS outside class. Allowing drop-in access to the computer lab was one change made
during the study year to boost the use of ALEKS outside regular class time in order to address
many students’ lack of off-campus access to technology and the Internet.
Site Portraits
Site 2 is a nonprofit organization serving high-risk youth ages 16–24 with a network of 260
urban and rural programs in 46 states. While the individual services local program sites provide
vary, overall the local programs provide a range of comprehensive services including
empowerment, educational, and vocational training; career development; social support;
community service opportunities; access to postsecondary education; and job placement.
Two Site 2-affiliated programs in California. A local nonprofit operates two Site 2-affiliated
programs in the Los Angeles area, Program A and Program B. Both have an education
partnership with a charter school sponsored by Site 2, which has a waiver to issue credits and a
high school diploma to those over 18 who participate in Programs A and B. These programs
serve high-risk youth in economically distressed areas through a small-group cohort structure
(approximately 34 students per cohort) that combines occupational/vocational (e.g.,
construction, hospitality, and culinary arts) and educational opportunities including a credit-
based high school diploma program. Students also participate in community advocacy activities.
At each local site, the TABE is administered at the beginning and end of the program but is not
typically used for program placement; instead, school transcripts, sometimes in conjunction with
TABE scores, are used. The average student enrolling in the programs is typically at a sixth-
grade reading level and a fourth-grade math level.
One Site 2-affiliated program in northern Massachusetts. The Site 2 program in northern
Massachusetts, Program C, is operated by a foundation and financed primarily by the U.S.
Department of Labor. In contrast to the credit-based diploma programs in California, Program C
prepares high-risk students (e.g., low income, out of school, in the criminal justice system,
and/or unemployed) to pass the Massachusetts High School Equivalency Test (HiSET). In
addition to the high school equivalency preparation program, Program C provides
empowerment and vocational training, career development, social support services, community
service opportunities, access to postsecondary education, and job placement. At Program C,
students are tracked into three skill levels based on their incoming TABE scores.
Only students 18 years old and older were included in the research.
Program Organization
In Program A, approximately 60 students are on campus and another 15 are off campus as part
of an independent studies track. The program has five full-time instructors and one part-time instructor.
Approximately 80–100 students are in Program B during a year. The program academic
calendar is divided into three semesters roughly 10–12 weeks in duration. Each class has a
maximum of 25 students. The three classes in the study were algebra, pre-algebra, and multi-
craft core curriculum (MC3). In the MC3 course, students are trained in the construction and
building trade and in math skills. As part of their math curriculum, students are required to raise
their numeracy skills with the goal to move up a grade level in math. All three classes in the
study met Monday through Thursday for 55 minutes each day, and in all three classes
instructors required students to cover at least 15 topics per week and spend at least 3 hours per
week on ALEKS. The class schedule was slightly different for the MC3 course because these
students came early, at 7 a.m., to receive training in construction skills until noon. Math
instruction started at noon when this class joined the pre-algebra and algebra class.
While the number of full-time and part-time instructors at the Site 2-affiliated programs varies,
each participating program had one full-time instructor for math. For example, at Program C,
there is typically an additional person (e.g., aide, assistant, tutor) who works one on one with
students. At Program B, there are four full-time teachers each in social studies, science/math,
English, and construction and one part-time culinary teacher.
Use Model
Because Site 2 was not approached about participating in the research until January 2017 and
the study’s data collection activities ended in July, Program C students used ALEKS during the
second quarter and part of the beginning of the third quarter, and Program A and Program B
students used ALEKS during the last trimester.
Because use models varied with each program, we describe each program’s model separately.
A typical math class usually started with the instructor going over a common set of learning
objectives for the class period. The instructor then checked in with individual students and led a
group discussion on a particular math topic. After the discussion, students were given time to
work independently on ALEKS at their own pace. They sought help from the instructor as
needed, especially when they experienced difficulties entering their answers to problems.
Students were encouraged but not required to use ALEKS outside class. The instructor was not
aware of any students doing so.
The instructor reported that she did not regularly use the ALEKS reports on student progress to
monitor use or provide feedback to students because of limits on her time. However, the
instructor said she planned to go over the reports with individual students during their
evaluations at the end of the quarter.
Program B (Los Angeles, CA). ALEKS was implemented only during the last trimester, in the pre-algebra, algebra, and construction courses. About 90% of class time was spent on
ALEKS, and students primarily worked independently on topics assigned by ALEKS based on
their scores on an initial assessment administered within ALEKS. During students’ ALEKS use,
the instructor remained in the classroom to provide guidance and support when needed.
According to the instructor, two different groups of students emerged: one that was able to work
on ALEKS independently and another that needed more teacher support to make progress.
The use model for ALEKS was modified over time because of a number of factors including the
amount of time needed for all students to get logged in to ALEKS on their laptops. Originally, the
instructor had planned to do a 15-minute mini lesson at the beginning of the class and then
have students work on ALEKS for the remainder of the period. Instead, to maximize the amount
of time students spent working in ALEKS, the instructor decided to provide mini lessons only on
an as-needed basis when many students were struggling with a common topic. Typically, these
mini lessons took place no more than once a week and required advance planning by the
instructor.
Some students who were on track to graduate at the end of the program year were not required
to attend class in person and used ALEKS off campus. Those students were required to meet
with the instructor after school on their own time. The instructor noted that among these remote
users of ALEKS, students unable to schedule regular check-ins with the instructor tended to
spend little time on ALEKS.
Program C (Northern MA). Instructors had planned to spend 3 hours per week on math
instruction. The initial plan for each class period was to dedicate half the time to direct
instruction and half to ALEKS. However, the actual time allotment varied widely from class to
class, with more time spent on ALEKS than direct instruction, at least partially to accommodate
the technology setup for working on ALEKS. Some students used ALEKS outside class as well.
There was no coordination between the topics that students covered on ALEKS and what the
instructor covered during direct instruction. According to the instructor, since she planned the
direct instruction lessons 2–3 weeks in advance and students worked in ALEKS at their own
pace, it was not possible to align the direct instruction content with the ALEKS topics.
While students worked on ALEKS, the instructor ensured they remained on task and monitored
their progress. In addition, the instructor helped clarify anything students found confusing,
including the explanations provided by ALEKS.
Program B (Los Angeles, CA). To ensure students would make significant progress while on
ALEKS, Program B planned to enforce both a weekly time requirement (3 hours per week) and
a topic mastery requirement (15 topics). Noting that some students needed to spend more time
mastering a topic than others, however, the instructor adapted the topic coverage requirement
for each student.
To encourage use of ALEKS outside regularly scheduled class time, the instructor attempted to
have students use ALEKS on their smartphones, but this was not effective because many
students found the text on the mobile version too small to read. So to provide students with an
opportunity to use the product outside class time, Program B gave students access to desktop
computers on site. Many students do not have a computer or Internet access at home. (Any
Internet access is usually through a mobile device.)
Program A (Los Angeles, CA). In hindsight, the instructor believes that his use of ALEKS
would have been more effective if he had integrated and aligned the content students worked
on in ALEKS with his lectures and discussion. The instructor initially adopted a hybrid model for
using ALEKS to support his instruction: instructor-led lecture and whole-class discussion
followed by self-paced use of ALEKS along individualized pathways determined by the product.
Since ALEKS covers a breadth of topics, students might be working on topics that the instructor was not covering in class; the instructor thus felt that students' work in ALEKS was not necessarily
reinforcing what they learned in recent lectures. As a result, in the future the instructor plans to
assign ALEKS as a supplemental activity outside regular class time as part of mandatory
homework assignments.
Program C (Northern MA). The Program C instructor felt that overall implementation would
have benefited if he had started using ALEKS at the beginning of the 10-month program. This
would have provided him with more time to establish classroom norms and expectations about
how the product would be used, fostering greater student buy-in and more consistent use.
Additionally, starting the use of ALEKS earlier could have given the instructor an opportunity to work out how best to incorporate it into the program.
In every academic class period, time is allotted for independent practice. The instructor noted
that integrating ALEKS use during this portion of the class worked well. It provided an
opportunity for students to work at their own pace and to work on topics that may not have
aligned with the overall shared objectives for the class.
The instructor participated in the initial ALEKS webinar, felt adequately prepared, and was
comfortable with computer use. He found ALEKS easy to use, particularly after spending some
time on it and becoming familiar with its teacher-facing features. According to the instructor,
“Some of the reports that I was…generating took a little bit longer to figure out. I didn’t think it
was the program itself, it was just I needed to give myself enough time to play with it.”
In general, instructors reported that the individualized and adaptive nature of ALEKS was beneficial to students. Several instructors felt that while some students used ALEKS to review
and practice the material and topics being taught in the class, others used it to get exposure to
advanced topics once they mastered the topic being covered in class. From the instructors’
perspective, ALEKS provided an ability to cover more content than was possible during direct
instruction and to use class time to cover specific topics in more depth. While both breadth and depth are desirable, limited class time made it challenging to include both without the use of
ALEKS. For example, using ALEKS at Program B significantly changed how instruction was
structured and the amount of math content students were exposed to and mastered.
Given the disparity in students’ academic preparation, especially in math, within any Site 2
cohort, instructors found ALEKS useful in meeting each student’s individual needs. The initial
ALEKS assessment, along with the instructor dashboard, provided instructors with useful
information, giving them an understanding of the skills of each student so they could
appropriately target the needs of different students and groups of students in their direct
instruction. Some instructors reported that ALEKS and its dashboards also saved them time.
Overall, the students’ reactions to ALEKS were positive, although some found it more useful
and helpful than others. Students who had unfavorable opinions about the courseware often
expressed frustration with ALEKS's "knowledge checks," which serve as formative assessments
that help the instructor understand what students fully or partially understand and can inform
lesson planning. Some students became frustrated when they felt they were being tested on a
concept they had not seen before or when a concept from a prior lesson reappeared. Some
students found the knowledge checks too frequent (every 3 hours), wanting more time to
progress through the topics before completing a knowledge check. To try to minimize frustration
that might build in students who are struggling with a particular concept, ALEKS transitions
students to new topics after a certain threshold of unsuccessful attempts at solving a problem, a
practice that frustrated some students who wanted to persist and master a topic. Although some
students interviewed found these practices frustrating, in general instructors found them sound
pedagogically and supportive of student mastery. One instructor reported she liked that ALEKS
had students periodically revisit concepts and problem types to ensure mastery. This also
helped this particular instructor know whether a student had full or partial mastery of a topic.
Students and instructors had mixed reactions to the appropriateness of the reading levels. Most of the instructors and some students noted that certain explanations in the courseware were difficult for some students to understand. However, other students felt that the reading levels of the
explanatory text in ALEKS were appropriate; one student also reported particularly enjoying the
positive reinforcement messages that ALEKS provided after the student completed a problem.
In general, most students interviewed liked solving problems within ALEKS and the support it
provided, particularly compared to completing assignments on paper-and-pencil worksheets in class.
For example, one student commented that she found the explanations of concepts and
procedures in ALEKS “were a lot clearer than having a conversation with a teacher. Being in a
classroom with five or six other students, [the] teacher can’t always focus on you. On the
computer, it’s just you and the computer.”
Site Portrait
Site 3, a nonprofit organization in Northeast Illinois founded in the late 1970s, supports adults
with workforce training in health careers, manufacturing, and computer technology. Site 3
targets urban Latino and African American populations and offers literacy instruction (in
students’ native language), vocational English as a Second Language (ESL), and college
prerequisites. Site 3 aids adults in their transition to tailored job training and community college
courses.
Students are accepted into Site 3 after an application process that includes providing proof of
income and transcripts and participating in an in-person interview. Throughout the application
process, staff are looking for barriers to success, and the result is that Site 3 students are very
motivated. Site 3 serves approximately 400 students yearly and has a very good retention rate.
Some students even help as tutors after completing their Site 3 program.
All students are assigned both a case manager and an academic advisor to pave the way for
community college admission. Literacy and numeracy instruction is used to prepare students
entering the career pathways or workforce development program; there are separate entry-level
classes for health care and for manufacturing pathways. Literacy is emphasized in the context
The average age of the students in the Site 3 career pathways programs is 35–40; all students
are over age 18. The healthcare pathway enrolls predominantly Latinas, whereas the Manufacturing Training students are predominantly men, approximately half African American and half Latino. For both pathway programs, only 50% of the students are employed.
Use Model
Site 3 students in the healthcare career pathways start their training on one of three “bridges”
based on their pre-program competencies:
2. Pre-Certified Nursing Assistants (CNA) for those at an 8th- to 10th-grade reading level,
followed by CNA health assistant training when they achieve a 9th-grade level
These bridge courses are 16 weeks long in the fall and spring and 8–10 weeks long in the
summer. They are held in the evening (fall 2015 courses were 6:00–10:00 p.m. and spring and
summer 2016 were 5:30–9:30 p.m.) in a Site 3 facility that hosts regular high school classes
during the day. Manufacturing offers a 10-week basics course that emphasizes practical skills
such as blueprint reading and applied math for shop, with portions set aside for instruction in
literacy and numeracy skills. The course pathway for Manufacturing is based on stackable
credentials, common in manufacturing training.
Core Skills Mastery (CSM) was used for improving reading, mathematics, problem solving, and
use of technology (e.g., emailing and messaging). The CSM pedagogical approach is to provide
content in a self-paced learning environment with built-in motivational strategies, where
instructors are coaches and students are encouraged to develop skills in self-regulation.
Site 3 tried two different implementations of CSM in its bridge courses. First, it devoted all the class time during the first 2 weeks of a 16-week course (a total of 40 hours) to students' completion of CSM. Four instructors who were not math teaching experts served as coaches for this pilot. The second and more successful approach spread CSM use over the entire 16-week course.
The math instructor had face-to-face contact with the students each week and was less likely to
use the CSM reports or coaching messages because the face-to-face interaction allowed the
instructor to tailor instruction based on observations of students’ work during the lab or based on
what would be tested when the course was completed.
Integrating CSM regularly across the entire 10- or 16-week session seemed more effective than
limiting its use to the first 2 weeks of a course. At first, students tried to complete CSM in the
first 2 weeks of the course by using it every evening as the sole source of instruction. Spreading
it out over the entire bridge course worked better: Students could finish at their own pace
(although not all students did), had more time to get used to navigating through CSM, and were
less likely to show signs of frustration while using the software.
Site 3’s initial approach of using case managers, rather than trained teachers, to supervise the
CSM portion of the health care bridge course was not as successful as planned. The case
managers felt that they did not know the mathematics well enough to support students.
Eventually, a certified math teacher was hired and assigned to supervise students using CSM.
This instructor, who joined Site 3 during the second pilot study, was trained in the use of CSM by
CSM staff and felt more comfortable with it than instructors in the first pilot. CSM staff members
explained the coaching philosophy and demonstrated the instructor toolkit. Both the case
managers and the instructors also found value in the CSM feature of being able to experience
the course as a student. The Site 3 staff also benefited from regular check-ins from CSM
personnel.
CSM teaches not only math content and terminology, but also explicit problem-solving
approaches. Students who wanted to rush through the CSM course found this slowed them
down because CSM teaches problem solving step by step and shows multiple approaches to
solving problems. Most of the students we spoke to (four total) became self-motivated learners.
The motivational strategies built into both the CSM program and the coaching worked well. Students are required to succeed at all problems (i.e., a 100% pass rate is required to progress to
the next topic). Students thus not only learn to persist, but are also coached to take breaks and
move on to a new set of problems after repeated unsuccessful attempts at one set (students are
later guided back to the problems that were skipped). CSM encourages achievement through a
reward system (earning karate-like belts), and some students reported that they liked the sense
of accomplishment that came with earning the belts.
Students had mixed feelings about the motivational aspects built into CSM. They liked the
achievement aspects (belts) but not necessarily the effort messages because some felt unfairly
judged. Some students also felt that, despite the built-in motivational supports, working on the
computer was isolating, especially for those who were reentering formal schooling for the first
time in many years.
Site Portrait
The Adult Diploma Program (ADP) at Site 4 is a new competency-based, career-focused, high
school diploma program for adults in Ohio. The state created it to help adults earn a diploma,
prepare for job training, and earn certifications in high-demand jobs such as machining, nursing,
security, and medical billing. The program is free to Ohio adults over 22 years old and must be completed within 2 years. Students in ADP are eligible for scholarships for career
training programs. Scholarship awards are based on students’ scores on the WorkKeys
assessment.
ADP is geared to county residents who do not have a high school diploma or its equivalent,
have basic desktop/laptop computer and Internet skills, and have a fifth-grade reading level or
above. Most of the students in the course (approximately 90%) are African American females,
ages 22–60.
Core Skills Mastery (CSM) is being used to prepare students for entering ADP. Prospective
students must meet several requirements to apply to ADP, one being 100% completion of CSM.
Students must also pass the ACT WorkKeys tests in applied math, reading for information, and locating information. While they are working to complete CSM for admission to ADP, they
also work on WorkKeys preparation modules and practice tests. CSM is used to ensure that
students are prepared to succeed in ADP by developing skills to problem solve in a technology-rich environment.
According to the ADP program coordinator, the ADP differs from GED programs offered by the
same community college in its emphasis on applied versus academic knowledge. The program
coordinator believes that the workplace orientation of CSM’s instructional approach is more
effective and motivating for this population than a more traditional approach that replicates
instruction in a typical K–12 school setting, where many of these students struggled to learn in
the past.
Use Model
To prepare students for CSM, Site 4 holds an orientation session that includes a workshop on
how to use CSM. The students also are introduced to the lead instructor for the course and
receive a virtual introduction to their assigned CSM coach. Two coaches, students at the
college, were hired for this study, working under the guidance of a college instructor. The main
support for learners once they begin working in CSM comes from these coaches who have
experience in online learning. Students work at their own pace within CSM. Since completion of
CSM is a requirement for admission to ADP, motivation is high.
Coaches manage a caseload of 25–30 students. While students have access to the lead
instructor, most of the instruction is provided via CSM. Coaches contact participants weekly via
CSM’s messaging feature, email, telephone, and video chat to provide support, motivation,
regular updates, and (if needed) just-in-time instruction. They also assist students in finding out
how to get help through CSM or via external supports, advise them on time management, and
help them develop learning skills such as taking notes or printing CSM lessons to use as a
study guide.
Students are expected to spend 10–12 hours a week to complete the course in 6–10 weeks.
Students have access to the college’s Technology and Learning Center, a dedicated drop-in
computer lab, 6 hours per week, 6 days per week. Use of the lab to access CSM is not
mandatory, and some students completed CSM without ever going to the lab. Students are
provided with referrals and encouragement to use free computer and Internet access at a
variety of local nonprofit organizations and libraries (e.g., students could check out a Wi-Fi
hotspot and computer from a library), and they are free to use their own laptop/desktop
computer in their homes. However, the use of mobile devices is strongly discouraged to ensure
CSM’s dashboard provides coaches with reports on students’ progress and effort. For example,
the reports show coaches how many unsuccessful attempts a student made to solve a practice
problem and the extent to which the student interacted with the digital instructional materials
before attempting the problem. When necessary, after reviewing a report, a coach might
message the student and prompt him or her to spend more time reviewing the instructional
materials. Coaches’ use of the reports varied: One coach who was interviewed reviewed and
messaged her 25 students at least twice a week, while another coach reported he reviewed the
CSM dashboard reports only after a student reached out to him and asked for help.
Many students sought out their peers or the lead instructor when they needed additional
support. When students needed assistance, they often messaged the instructor or, more often,
relied on each other for help rather than seeking help from the coaches. Students knew that
others were working on CSM and took it upon themselves to find students to work with. Some
students who went to the drop-in labs when they needed help found other students there who
were working on the same problems, worked together, and even worked at each other’s homes.
Every other month, the coaches hosted “meet and greets” that encouraged students to come to
campus to interact and collaborate.
Although CSM discourages students from using other web resources while working on CSM
(marking this as “off-task” behavior), the coaches encouraged Site 4 students to seek out and
use external resources to support their learning, such as Khan Academy or YouTube tutorial
videos. Coaches reported that some students found the outside resources more effective in
helping them learn certain topics than the instructional approach taken by CSM.
The program coordinator provided general support to coaches, but for the most part the
coaches operated independently. The coaches reported they found value in setting up their own
accounts and working through the CSM units themselves before working with students; this
helped them review concepts they had not studied since high school.
Both coaches reported that the training was useful and helped them learn how to serve as a
coach, such as messaging the students. One of the coaches received training on CSM through
Each coach tracked students’ progress using the CSM dashboard and had weekly meetings
with the lead instructor. To prevent attrition, coaches called or emailed students to make sure
they did not fall behind. One coach called each student weekly to ask how his or her studies
were coming along. Students who were not making progress could be referred to a tutoring
program at Site 4.
Both coaches felt that the students benefited from the motivational messages that CSM
provides throughout the instruction. The program coordinator also reported value in CSM’s
motivational messages after students answered a problem incorrectly, such as, "20% of all
adults struggle with this concept." He felt that it was useful for these adult learners to receive
feedback that they are not the only ones struggling with difficult topics.
Coaches reported that students’ reading skills hindered their ability to make progress in CSM.
According to the coaches, many students taking CSM are below a ninth-grade reading level and
found it difficult to complete the math portion of CSM because they had difficulty comprehending
CSM’s text-rich explanations and problem scenarios. However, the coaches also reported that
the reading instruction components of CSM helped students understand the importance of
reading for comprehension, especially because the reading comprehension sections reiterated
the importance of reading in everyday, real-world problem solving. One coach reported that
many students eventually realized that it was best to read a math question more than once
before attempting to solve the problem.
Coaches reported that students found CSM’s use of “belts” to signal progress a significant
motivating factor. As students make progress, CSM issues them different levels of belts (e.g.,
yellow and black, as in karate) and other tokens of achievement. The coaches said students
enjoyed the belts and viewed them as confirmation that they succeeded at something: “They are
something you can see and show other people.”
The coaches said they believe that CSM is effective, helps build students’ confidence that they
can learn independently, and makes students more persistent; they did note that it takes
students time to get used to CSM’s mastery learning approach. According to one of the
coaches, students perceived CSM as more demanding, difficult, and, at times, more frustrating
than the online ACT WorkKeys preparation modules that they were also assigned after they
completed their work on CSM. CSM does not allow students to progress in the software unless
they answer all problems at the end of a unit correctly (100%); in contrast, in the WorkKeys
modules, 70% correct allows students to pass to another level. The greatest challenge for students, according to the coaches and the students themselves, is accepting that they must complete CSM, and thus continue to make progress, to be eligible for the ADP.
Coaches reported that students liked not having to come to the campus to work on CSM
because they preferred working in their own environment. When students were struggling on
their own, they often used the drop-in lab as a resource for support, particularly from other students. According to the coaches, this was the primary use of the drop-in lab, as a place for
support and tutoring, rather than as the primary location where students accessed CSM. One
coach estimated that 1 in 5 students used the drop-in lab weekly and commented that the
extended lab hours were helpful for students.
Coaches also believed that students’ computer literacy increased with CSM use. In addition,
one instructor said that CSM’s messaging functions helped overcome problems with students’
lack of familiarity with using email because, according to this instructor, the messaging system
in CSM is easier to use than a typical email program.
Site Portrait
Site 5 serves approximately 2,200 students per year across 33 locations. Enrollees enter with a
range of educational backgrounds and skills and are typically emerging and struggling readers
with an incomplete K–12 education, students with a high school degree who need to pass a
math exam to be accepted to college, or recent immigrants with degrees from other countries.
Students vary in their employment status, stability of living situation, and family responsibilities.
Courses offered vary across the 33 locations to meet local needs but typically are 15 weeks in
duration with 8–12 contact hours per week.
Site 5 used Core Skills Mastery (CSM) at four sites: two community colleges, a workforce
education program, and a community-based GED preparation program. Objectives of the
courses in which CSM was used include improving basic adult numeracy skills, helping students
pass the GED, and preparing students for college-level math courses (as measured by passing
a college math entrance exam).
The CSM curriculum was used to supplement traditional instruction in mathematics. The use
model for CSM was almost purely online—either in class or in a separate lab—with instructors
present to answer students' questions and to serve in the CSM role of coach. Some instructors
provided direct instruction for small groups when they noticed, through a review of the system’s
dashboard, that several students were struggling on the same topics within CSM.
Students’ progress in CSM is self-paced. For each module, students first take a formative
assessment and are assumed to have mastery over the content covered in that module if they
answer all the test items correctly. Otherwise, they are offered the opportunity to master the
content through additional resources, lessons, and examples before being given the opportunity
to take another assessment to gauge their understanding of the content.
Students use CSM during regular class time, sometimes with classroom laptops and sometimes
in a separate computer lab. Planned usage of CSM varied from 1.5 to 5 hours per week
depending on the schedule for a particular class (typical classes last 15 weeks). Some portion
of class time may be set aside for direct instruction that the instructor deems necessary. Students
were encouraged to use CSM at home. Because students complete CSM at different rates,
students who finished before the end of the term were provided with additional online
instructional materials, including an online component to a math textbook purchased by the
organization and made available to different campuses.
Program implementation would have benefited from a stronger technology infrastructure across
many of the participating sites at the outset of the study. Although the research team attempted
to recruit sites with the appropriate technology infrastructure, some Site 5 program sites did not
have enough computers to provide the necessary flexibility of having students use computers in
class, as originally planned, rather than needing to schedule the use of a computer lab. Some
sites also lacked sufficient Internet access or Wi-Fi speed. Program administrators and/or local
To implement CSM, instructors (or others) must be trained as coaches to support students while
they are working in CSM. Some instructors used in-class tutors to provide direct instruction
when students struggled to understand concepts within CSM.
Some instructors reported needing more training and support than was provided by CSM to
learn how to effectively coach and facilitate with CSM. Instructors received only 1 hour of
training on the use of the product from the vendor. Instructors reported that they used their own
time to learn about CSM and its features (including the coaches’ portal or dashboard), and
many did not learn about the available digital supports (such as the “playground” and YouTube
videos) until later in the implementation.
Overall, the instructors found the coaches’ portal helpful. Once they learned how to run reports,
the instructors made regular use of the portal, which displays student progress, effort, and
learning indicators and individual student strengths and topics of concern. Instructors used the
reports to identify and address specific topics students were struggling with in CSM and thus did
not have to rely on students seeking them out for help. Instructors particularly liked the simplicity
of the indicators for student progress (black/red/yellow karate belt-like levels), found the
interface easy to navigate, and liked the “coaches notes” feature where they can record notes
that are visible only to them (not to the students).
Instructors’ and administrators’ overall reaction to the product was positive. Each instructor
appreciated the personalized approach to learning that CSM provides and that the problem
scenarios attempt to connect to the adult learners’ lives. Instructors liked that CSM
accommodates students’ different skill levels and individually differentiates instruction based on
what a student needs to learn. The instructors also reported that the use of CSM adds variety to
instruction and gives students another way to learn the math content. Students, instructors, and
administrators also appreciated that CSM is available to students from anywhere they have
Internet access. Instructors encouraged students to work on CSM outside class, and some
students even continued work after completing their 15-week course.
At the same time, students working independently, off campus, reported they missed being able
to ask their peers or the instructor questions when they were having difficulty with a topic or
concept. Some of these students said they relied on family members for help. Some nonnative
English-speaking students who struggled with understanding the text-rich explanations within
CSM reported they relied on other web-based resources such as YouTube videos in their native
language on various math topics.
Instructors also appreciated that CSM content was geared toward adults with a low reading
level, although most commented that the content may not be appropriate for students scoring at
a reading level of grade 4 or below.
Site Portrait
The goal of the Adult Basic and Literacy Education (ABLE) program at Site 6 is to build adults’
basic skills in reading, math, and language so they can pass the GED. In addition to the GED
preparation courses, the ABLE program offers distance education opportunities, corrections
education, courses for English for Speakers of Other Languages (ESOL), and family literacy.
Site 6’s adult education programs also aim to bridge students from noncredit-bearing Adult
Basic Education (ABE) and Adult Secondary Education (ASE) courses into credit-bearing
community college programs, job training, and employment.
The ABLE program at Site 6 serves a variety of students. It had projected enrollment of 2,750
for 2016. Approximately 67% of ABE students need remedial education (to move them up to
eighth grade-level skills), 28% are ESOL students, and 5% are in ASE courses on a path toward
a high school education. Students range in age from 16 to 60-plus, with more than half in the
19–24 range. Forty-one percent of the students are male and 59% are female. A total of 38 affiliated institutions offer ABLE courses through Site 6.
Despite high initial enrollment in GED preparation courses, only 70–80% of students are likely to
attend their first class, with substantially fewer completing their course. Numerous factors affect these adult learners' ability to pursue and complete their education, such as changing work schedules.
Use Model
GED Academy was used in GED preparatory courses at five different campuses in the Site 6
network. These courses generally met 3–4 days per week for 3 hours per session. Each
instructor had discretion over how he or she organized the class around GED Academy, with
most instructors splitting the time equally between direct instruction and time on the product. At
least one instructor used GED Academy as the primary curriculum and mode of instruction.
However, this instructor supplemented the GED Academy instruction with one-on-one direct
instruction on more advanced math topics and essay writing. Within the class time devoted to
GED Academy, students’ work was self-paced. Students worked on modules selected by the
GED Academy software based on a diagnostic assessment administered within the product in
each of four subjects (Reading/Language/Writing, Math, Social Studies, and Science). In an
attempt to boost attendance at classes during the study period, the ABLE program required
students to attend 50% of their classes.
The availability of technology influenced how some instructors organized and planned their
instruction. For example, at one program site that had fewer computers than students, students rotated among using GED Academy, doing group work, studying the textbook independently, attending a small-group pullout session with the instructor, and working with a volunteer tutor. At another
program site at a community center, the instructor had to ensure she had scheduled the use of
the computers for her class ahead of time.
Instructors’ reports about the adequacy of the vendor’s support were mixed. Some teachers felt
the availability of GED Academy technical support (through a toll-free number) was adequate to
support their use of the product with their students, but others expressed a desire for additional
follow-up sessions, particularly before the start of the term.
Both students and instructors reported that they preferred the blending of direct instruction and
group work in class with time on GED Academy over only direct instruction or only GED
Academy. However, coordinating the self-paced GED Academy instruction with teacher-led
curriculum was a challenge: During any session or week, individual students may be working on different topics within the product.
The ABLE program's decision to purchase the textbooks that accompany GED Academy for the
students seemed to be an important motivating factor for many students. Many students said
they were excited to have their own physical book that they could highlight and write in as well
as use for review at home.
In general, students, instructors, and administrators reported that they appreciated the
opportunity to use GED Academy and wished to continue using it. Even individuals who were
somewhat critical in their feedback said that they enjoyed the experience and found the product
helpful. Instructors believed the product helped them facilitate each student’s learning.
Instructors also reported that the blended model gave them more time, during students’ use of
GED Academy, to answer individual questions and work with individual students with the
greatest needs. Students enjoyed working at their own pace, being able to skip topics they were
already proficient in and repeat topics they were struggling with.
In general, students interviewed appeared to be highly engaged with GED Academy.
They said that using a computer to access GED Academy prepared them better for the
computer-administered GED exam. Many students, but not all, said they liked the video-based
lessons presented by the animated instructor teaching a virtual class; students reported they
preferred this mode of instruction over text-based content that put a greater demand on their
literacy skills. Instructors interviewed also approved of this format and felt that the student-characters asked relevant and intelligent questions. However, some students felt that the animated characters in the virtual classroom reflected negative stereotypes or were characters they could not relate to.
The instructors’ overall response was positive, although some expressed concern about how
students might experience their struggles to progress in GED Academy. Instructors generally
believed that when some students struggle to master a topic or skill and are continuously
reminded they have not reached mastery (for example, each time they enter an incorrect
response), they may become less confident about their ability to learn the subject. The
instructors believed that these students need extra monitoring and support. In addition, the
instructors reported they found that GED Academy content was not appropriate for students
with the lowest skill levels, i.e., TABE scores at level 1 or level 2.
Most students did not have access to computers outside class and thus found it difficult to use
GED Academy outside scheduled class time. In general, use in the home was low because of a
lack of technology access or because students had to juggle work schedules and family
obligations. As a result, to get extra time on GED Academy, some students came early to
campus before class started, some stayed late and used it after class, and others accessed the
product in one of the campus computer labs.
In at least one campus, technology difficulties hindered use of GED Academy. GED Academy
expects students to review materials as PDF files that they download and print using a link in
the product. Students reported they could not download these files onto the local program’s
computers and thus were unable to review the recommended materials.
Site Portrait
The mission of the nonprofit organization Site 7 is to “provide basic skills training to meet the
changing educational requirements of the workplace, and to help students meet their education
and career goals.” Site 7 offers a variety of classes that meet throughout the year including over
the summer. It provides GED preparation classes in English and Spanish for adults age 16
years and older. Students enrolled in the Adult Basic Education (ABE) classes are working on
improving their basic skills and TABE scores so they can transition to the GED preparation
classes. Site 7 also offers the English for Speakers of Other Languages (ESOL) program and a
training program for the construction trade.
GED Academy was used in classes in the GED preparation program. Classes are offered in
English, day and evening, and in Spanish. (Students in the Spanish GED program were not
included in the study.) Enrollment is “rolling,” with students admitted on a weekly basis. Class
attendance is not mandatory.
Site 7 serves approximately 700 students a year. A majority of the students in the study were
male, which is consistent with the general student demographics at this site. Most of the
students have completed at least an eighth-grade education, but their reading and math skills average at the fourth- and fifth-grade levels. A majority of students enter the program at or below the poverty level; if they are employed, they work mostly in minimum-wage, entry-level jobs.
The program has a total of four instructors: two teach the day classes, and two teach the
evening classes. Two new instructors were hired in April 2016 to replace two instructors who left
the program in March 2016.
Use Model
The use of GED Academy was originally planned to support students in the morning GED
preparation session. This session met for 3 hours starting at 9:00 a.m., Monday through
Thursday. To accommodate the use of GED Academy, a 1-hour drop-in lab starting at 8:00 a.m. was
added to the morning session. The expectation was that students would use GED Academy for
up to 4 hours per week before their direct instruction classes, with instructors being available to
answer students’ specific questions. Because Site 7 has an open attendance policy, attendance
at the drop-in lab was not required. While most students did work on GED Academy during that
hour, some students preferred to instead use that time to work with the instructor one on one. In general, students preferred not to stay late after class or come in early before class to use GED Academy because they needed to balance family, work, and school, and many
are constrained by the public transportation schedules.
The original instructors started using GED Academy at the beginning of the study, in fall 2015.
The new instructors, who started at Site 7 in April, used GED Academy significantly less
frequently than the original instructors.
As the year progressed, GED Academy use evolved. The product was used differently by two
groups of students in the GED preparation course. One group worked on GED Academy during
the drop-in lab, and the other worked mostly online on GED Academy and rarely came into
class. The director of the program was the instructor for this group of remote online students
and tracked their progress through the GED Academy dashboard. Two of the highest users in
the Site 7 sample were from this group of students. Instructors reported that roughly half the
class came to the lab regularly.
Both groups of students had access to digital copies of the textbook, Kaplan’s 2014 GED Test
Strategies, Practice, and Review, and some students chose to purchase their own copies. The
GED Academy product aligns well with the content in this textbook, even to the extent of listing
the pages that students should read if they need help while working on the product.
All instructors and students agreed that having access to the Kaplan textbook, in both online
and physical form, was helpful because when students were struggling with a particular skill or
concept, they could use the textbook to review the topics recommended by GED Academy.
Site 7 planned to require that all students in the participating classes use GED Academy. Site
7’s open attendance policy and the use of GED Academy as supplemental activity, however,
made this difficult to enforce. Thus, not all students used it, and many did not use it consistently.
Further, the instructors who joined the program in April 2016 may not have received the same
orientation training on the use of GED Academy by the vendor as the original instructors and
thus may have been less committed to using it with their students, limiting these students’
exposure.
In general, the original instructors were very positive about their experience. They reported they
preferred using GED Academy to other online products they had used in their classes, primarily
because of the engagement they observed while students were using and talking about the
product; the instructors felt that the animated virtual classroom and instructor engaged students
in ways other products had not. Several other features that instructors believed supported
students’ learning were the ability to highlight text within the online lessons, links to external
websites giving access to additional online math resources, printable worksheets, and the online
dictionary.
Yet overall time on GED Academy was limited by the factors mentioned above (use of product
as a supplemental activity, open attendance policy, instructor turnover). In addition, the new
instructors reported they did not have the time to effectively integrate GED Academy into their
curriculum. Nor did they have the time to regularly review students’ individual progress reports
from GED Academy and provide feedback; feedback was also hindered because many students were unavailable for meetings outside regularly scheduled class time due to transportation and work schedules.
Some students resisted using the product, particularly while on campus. These students said
they preferred to use their time on campus learning from the instructor directly (the instructors
confirmed these reports). One student commented, “I can do [online learning] anyways at home;
why should I come here for that?” Some of these students liked the ability to use GED
Academy at home, but when they were on site they preferred being taught by the instructor.
For more effective use of GED Academy, some students said they wanted a more thorough
orientation to the software so they could better use all its features and resources. In the
future, Site 7 plans to have a more in-depth training for instructors on all the product’s features
so they can better inform their students about their utility. The program director also would like
to see a blended integration of GED Academy into the curriculum, using the product to provide primary instruction on skills and concepts and then using direct classroom instruction to clarify difficult concepts. The director's other plans to encourage regular use of GED
Academy include making the computer lab time integral to the instructors’ lesson plans and
scheduling use during regularly scheduled class time.
Site Portrait
Site 8 provides basic education for adults working toward their GED certification as well as
English as a Second Language classes for students in northwest Kentucky. Site 8 recently
added college preparation programs, including classes in ACT test preparation and COMPASS
test preparation (for community college). Site 8 also provides basic computer skills courses to
help adults become comfortable using technology (e.g., how to use a computer mouse, conduct
searches, and navigate websites). To help students achieve their goals, Site 8 tries to remove
obstacles by providing transportation and child care. It also offers a separate program for adults
in the Ohio correctional system.
Site 8 enrolls approximately 300 students, down from 500 since the release in 2014 of the newly
designed GED test, according to the program director. Classes are offered both during the day
and in the evening. The typical student is under 35 years old and white, with a few Hispanic and
African American students. A majority of the daytime students are unemployed or working part
time. These students typically have reading and math skill levels ranging from second- to sixth-
grade level. Most of the evening students are employed and working full time. Low class
attendance is a recurring issue at Site 8.
The GED preparation program has seven instructors, including the program director. All of them
taught the students participating in the research study. The instructors described math as the
Use Model
In general, instructors and students used GED Academy during class time. Students worked
remotely in one course, with the program director serving as the instructor and progress
monitor. In the on-campus classes, students took GED Readiness tests that helped the
instructors and students identify which subjects or topics they needed to focus on. Students
received individualized instruction on these topics, with instructors providing group lectures and
instruction when needed (there was little whole-class instruction). According to instructors, time
on GED Academy was assigned more to students who had more advanced incoming math skills and less to students at the lowest skill levels, particularly students who had less
experience with technology and computers. Some instructors felt that GED Academy and its
adaptive features would be most valuable for students working on more advanced math (e.g.,
algebra, geometry, and slope). They felt that students working on the lower skill levels (e.g.,
fractions, decimals) might become overly frustrated with the slow pace of their progress within
GED Academy, particularly when they were assigned content they had been exposed to
multiple times during their formal school years.
In the one class of students who worked on GED Academy remotely with the program director
tracking their progress online, the director used the reports provided by the system to track
student progress and effort and provide encouragement and reminders as needed. When the
director noticed students’ lack of progress on specific topics, he encouraged the students to
come to campus for one-on-one instructional support from the instructors teaching the on-
campus courses.
Students used GED Academy across all subject areas on the GED exam—math, language arts,
social studies, and science. The subject-specific use of GED Academy was informed mostly by
how students had performed on external tests, such as the GED Readiness test, along with
instructors’ own assessment of students’ progress. Students were encouraged to use GED
Academy outside class as well. Once students started scoring sufficiently well on the GED
practice tests within GED Academy, they were encouraged to take the GED exam.
To support students working independently on GED Academy either outside or during regular
class time, the instructors reviewed the product's dashboard each week to monitor student progress and then followed up with students as needed.
The instructors reported that they plan to continue using GED Academy after the study. The
primary factors behind this decision were the product’s user-friendly interface and detailed
reports of students’ mastery of concepts. The instructors found the detailed student reports,
which are based on the formative tests, particularly beneficial as they helped them identify the
topics students still needed to master.
Instructors reported that students regularly reviewed their own progress reports and found this
motivating. Students understand they can “test out” of a subject area within GED Academy
based on their scores on the built-in GED practice tests.
Another advantage of GED Academy that instructors reported was how well the product mapped
onto the GED practice tests. The instructors took advantage of this by having their advanced
students use GED Academy when they were close to being ready to take the GED exam.
Additionally, because the GED exam is computer based, instructors felt that having students
work on GED Academy helps them become comfortable with taking computer-based tests as
well as build their general computer technology skills.
However, instructor feedback on the product was not all positive. Instructors reported that they believed the animated characters used in GED Academy's virtual classroom might be ineffective for students who are not native English speakers and for older adults. They
mentioned that the colloquial references used by the characters may hinder motivation and
learning for these students.
Product: MyFoundationsLab
Site Portrait
Site 9, a member of a large community college system, provides many online courses in its
training, certificate, and degree programs, including its Adult Basic Education (ABE) programs.
Its College Bridge Pathways, the focus of the study at Site 9, helps students reach high school
equivalency and serves as a bridge to college-level programs. Bridge programs are offered at
six campuses across the county. Students are encouraged to concentrate on earning their GED
to pursue the path to higher education or professional certification. College Bridge Pathways
students range from 16 to 60-plus in age, and the program has 11 full-time and more than 75
part-time instructors. The program offers “managed enrollment,” whereby new students are
accepted every 2 weeks.
Use Model
Site 9's high school equivalency (HSE) courses focus on basic skills development and preparation for the four GED
subject exams. Courses meet three times per week for 3 hours per session. Two days per week
are spent on traditional classroom instruction, and 1 day is set aside for a MyFoundationsLab (MFL) lab session in the campus
computer lab.
All students start in the Foundations course and then continue to higher level HSE courses,
depending on their TABE scores. Differentiated classes enable instructors to adapt instruction to
student learning needs. During lab periods, however, classes may be mixed because of
scheduling and space.
The 3-hour MFL lab sessions are primarily self-directed and self-paced. Students initially take a
PathBuilder diagnostic test within MFL for each of the four content areas. Students are shown
their areas of mastery and areas where they require additional work. Although some instructors
assigned MFL modules during the lab sessions that corresponded with what they were teaching
in class, for most of the MFL lab time students chose the subjects they wanted to work on
among the modules recommended to them based on their PathBuilder diagnostic.
All instructors attended an 8-hour face-to-face training seminar at the Arizona Department of
Education delivered by a Pearson representative. Instructors also had access to Pearson's 1-hour web tutorials on the MFL functions available to them. In addition to this support from Pearson,
instructors and some administrators attended a 3-hour seminar by Mockingbird Education on
technology integration in teaching.
During students’ 3-hour lab session, a technician was available in the room to help with logging
in and basic MFL use-related issues. This was especially helpful for new students. Students
also relied on other students for help with both technical and content issues. Finally, students
reported consulting other online instructional resources when needed.
Students gave mixed reviews to Site 9’s decision to separate in time and space the periods of
direct instruction and online instruction using MFL. While some students said they preferred
having two periods per week of direct whole-class instruction and one period dedicated to the
MFL lab, others reported they would have preferred having each class period split into direct
and online instruction, thus reducing the time spent online from 3 hours per session to 1.5 hours
and possibly the fatigue associated with prolonged work in the online environment. Program
administrators seemed to agree. The administrators reported that they felt the lab sessions
would be more effective if they were shorter and if instructors were available in the lab to
address students’ content-related questions and provide support and motivation.
Students were strongly encouraged to continue their studies at home using MFL. In practice,
their use of MFL outside class varied depending on access, schedule, and motivation. While
some students had no opportunity or motivation to use the software outside the required lab
period, others reported using it for anywhere from an hour per week to an hour per day.
Students’ reviews of the MFL program varied widely. Some students thought highly of it, gave it
a “5 star” rating, and told us that before using it they had not been very successful in trying to
improve their basic skills using online resources. Students liked the flexibility of using MFL whenever they wanted, having a choice of topics to work on, and having their work and progress saved each day so they could pick up where they left off the next time they logged on. However, several students reported they would have
preferred that the course be taught entirely by their instructors and would not have used MFL if
use had not been mandatory.
Students’ views of the effectiveness of the text-rich instructional content also varied. Some
students reported that the instructional format was effective and helped them develop reading
comprehension skills, while other students found this format “boring” and not very engaging.
Many students (even those who spoke favorably overall about MFL) mentioned that the product
could benefit from some video-based instruction. Several students cited Khan Academy as an
example of a program they found more engaging and preferable “because it feels like someone
is teaching you instead of you just reading.”
Product: MyFoundationsLab
Site Portrait
Site 10, Indiana’s community college system, has more than 30 campuses across the state and
serves nearly 200,000 students annually.
MyFoundationsLab (MFL) is used at all of Site 10’s campuses in Foundations, a course for
students with reading, writing, and math remediation needs who plan to pursue technical tracks,
such as welding, automotive, and HVAC. Those tracks can lead to 2-year associate degrees or
two- to three-semester certificates. The student population for Foundations is predominantly
male, and students are age 17 and older. Before entering the course, all students take a
customized ACCUPLACER diagnostic exam and are provided with an individualized study path
in MFL based on the ACCUPLACER score. Some students require remediation in both math
and reading and some in only one subject area. For the purpose of this research, we focused
only on those students who had a reading remediation requirement and were planning to enroll in a math-related career pathway. We were interested in whether students in the Foundations course who
used MFL were better prepared for the first English course in their career pathway than students
who also had a reading remediation requirement but did not use MFL.
Use Model
MFL is the centerpiece of the Foundations course. During class time, students work
independently on MFL within their individual study paths. The goal is to achieve mastery of the
required modules. Only Modules 1–4 in MFL (in both math and reading) are required content in
Foundations classes. Course grades are based 90% on MFL competencies and 10% on other factors. Classes meet for 3–6 hours per week (depending on the campus schedule) in computer labs. Classes can be as small as 3 students and as large as 15. Courses meet for 8 to 16
weeks.
Instructors are present in the lab when students are using MFL. The instructors’ main role is to
provide individual tutoring for students who are struggling with a concept in MFL. They also
monitor students’ progress on MFL and help them set goals and stay on task. For example, an
instructor told us that when students struggled with reading tasks, she checked how they were
taking notes on the reading passages to see whether they were identifying the core concepts.
According to instructors, students in the Foundations course often have trouble managing their
time, so instructors constantly meet with them to talk about how to stay on track. As an
instructor noted, students “need encouragement, motivation, and pushing.”
In the Site 10 use model, some instructors had students work on MFL for the entire class period,
while others gave short lectures or held class discussions to start the class before turning to
work on MFL. During the lab sessions, students were allowed to refer to other online resources
such as Khan Academy, on the belief that these supplemental approaches supported their learning of the concepts covered by MFL. Students were
not required to work in MFL outside class time, but instructors encouraged them to, particularly
students who had too many MFL modules to complete during class time alone.
Instructors typically receive 1 hour of training from Pearson, the publisher of MFL. Instructors
also receive a 1-hour training session from Site 10 support staff on how to teach Foundations
with MFL.
MFL was used in Foundations courses during both 8- and 16-week terms. During the 8-week
term students spent 4 or 6 hours per week in the computer lab, and during the 16-week term
they spent 3 hours per week. The instructors interviewed felt that the 8-week term was more
effective for most students because of the more intensive time spent on MFL.
The coordinator for the Foundations course emphasized that Site 10 uses MFL because it is
correlated to the customized ACCUPLACER diagnostic. Students work only on those modules
on which they have not demonstrated proficiency. The coordinator believes that the program
has worked well for students who need remediation in reading and math.
Of the five instructors interviewed, most had generally positive views about the mastery-based
approach of MFL. They believed it works well for their students, who start with a wide range of
abilities and gaps. Using MFL also permits more individualized instruction, so that students can
work on their areas of need, and instructors can work with students one on one. The instructors said it would not be possible to provide this kind of instruction without the technology.
Instructors reported that some students, especially the older ones, struggled with the technology
at first because they lacked basic computer literacy skills. These students tended to need more
help logging on to the system and performing basic tasks like attaching files to email. After
an initial adjustment period, the struggles with technology tended to drop off. However, one
instructor observed that a reluctance to use MFL persisted throughout the course for some of
her older students due to their preference for working directly with their instructor rather than the
software.
According to the instructors interviewed, many students found value in MFL’s individualized progress reports, which helped them visualize their progress in the online system.
While instructors believe that the difficulty level of the reading passages is appropriate for their
students, they did express some concerns about whether students find the content sufficiently
engaging. One instructor observed that the passages tend to be about politics or history, which
are not topics necessarily relevant to the students’ interests. “It’s very rare that they get
something technical related to their expertise,” one mentioned. “They never get the opportunity
to use their skill in their reading. Topics are always far from being relevant to what [a] student
plans to do or is good at.”
Another instructor suggested that including fiction passages might help students become more
engaged with reading.
Instructors also observed that some students seem to get frustrated by the presentation of the
content. One instructor said, “I have students who say this doesn’t teach you, it just makes you
do the work. It just shows you the same thing over and over, if [you] don’t understand you’re
kind of stuck, it doesn’t show you an alternative way.”
Product: MyFoundationsLab
Site Portrait
Site 11 was established in the late 1970s as a regional learning center for the Rhode Island
Department of Education. Site 11 has three main facilities in three cities and two smaller
operations in two of these cities. Students are age 16 and up, with about 50% between the ages
of 25 and 44. The main goal for students is to earn a high school credential through the GED or
National External Diploma Program (NEDP). Roughly half the students are employed.
To meet a range of student needs, Site 11 offers multiple programs, including Adult Basic Education (ABE) courses, preparation for a high school credential, English as a Second Language, and a transition-to-college program. Classes are offered in a classroom environment
at learning resource centers. Course topics include math, English language arts, social studies,
and science.
Site 11’s courses are offered during fixed fall, winter, spring, and summer term dates rather than
on open or rolling enrollment. The courses are offered in both day and evening sessions. At the
larger centers, math classes are roughly grouped by skill level. Students work their way through
various skill levels, receiving both group and personalized instruction. The length of time
students spend in the program depends on their skill level on entry and the time and effort they
can dedicate to their studies.
Site 11 offers counseling and training to support students in adult education and to assist with
the transition to college and careers, including case management, life skills training, referrals,
and individualized career planning and advising. Site 11 also provides assessment services,
including the administration of the Comprehensive Adult Student Assessment System (CASAS),
the GED, and National External Diploma Program assessments, and serves as an authorized
test center for work-related and certification tests for a test publisher.
Use Model
Site 11 staff used MyFoundationsLab (MFL) to provide personalized instruction addressing the
widely ranging skill levels of the students enrolled in ABE courses. In the intermediate and
advanced ABE courses in two cities, students were required to use MFL for 3 hours per week in
math and English language arts classes, amounting to about half the class time (the amount
recommended by Pearson).
During these 3 hours per week, instructors used MFL differently in their course
sessions. One instructor used whole-class instruction (e.g., solving problems on the white
board) for the first half of class and then had students work independently in MFL for the second
half. During this time, she walked around to help students when they were stuck and to make
sure they were staying on task. The instructor assigned everyone to work on a particular topic in
MFL, and once they were finished, the students were allowed to choose what to work on in
MFL. Another instructor assigned students to work together in pairs on problems in MFL. When
new students joined the class, she paired them with other students to observe how to navigate
MFL. Students then worked individually or in pairs on problems in MFL. Students at the smaller
sites and in the beginner ABE level in one of the larger ABE programs had a more flexible
model for MFL use. They had the opportunity to use MFL during class time but were not
required to. The instructor of the beginner ABE class reported that her students did not use MFL consistently during class because she felt it was not appropriate for them.
Students in the Site 11 ABE courses were encouraged but not required to use MFL outside
class to accelerate their progress. Students’ use outside class varied: Some reported spending
hours per week on MFL on their own time, whereas others did not use it at all.
Some, but not all, Site 11 staff involved in the study were able to attend a 1-hour webinar
provided by Pearson. Program directors participated in a webinar that covered the information
the product provides to help instructors monitor student progress and performance in MFL.
Mockingbird Education also delivered a full-day in-person training session with Site 11 staff in
March 2016. This training did not focus on MFL or blended learning per se but rather on
instructional strategies to address the challenges of teaching vulnerable learners.
Internet connectivity and computer access appeared to be sufficient for the study. All students in classrooms had access to laptop computers or tablets, and the staff and students interviewed were satisfied with the availability and reliability of the Internet connection.
Some students had Internet access and were able to work on MFL at home, but others did not.
Because MFL is not formatted for smartphones, many students without home Internet access
were not able to work on it outside class.
Students sometimes used hand-held calculators to help solve problems in MFL. One instructor
encouraged students to do so because students can use calculators for the GED and high
school diploma program exams.
Instructors and students identified various benefits of MFL. Instructors cited the ability to
personalize and differentiate instruction and increase student accountability. Students liked the
ability to work at their own pace, the opportunities MFL provides to practice skills, and the
immediate feedback they received when attempting to solve problems. One student who
reported feeling anxious about speaking out in class said she particularly appreciated being able
to work independently.
Instructors and students also raised several challenges in the use of MFL. Some instructors
found the MFL ABE and GED content too advanced for students with low math and reading
skills, particularly the vocabulary and general reading level of the overview sections of the main
instructional passages. Some instructors also believed the text-rich instructional passages were
not topical and did not appear to be engaging or inspiring for students. Some students said they
had difficulty comprehending the overview sections when reading off the computer screen and
often printed hard copies of these sections so they could highlight key terms and concepts and
take notes. Some instructors also felt that the content was not well aligned with the content of
the GED exam. Some students were openly resistant to learning online with MFL and said they
preferred learning directly from the instructor.
All instructors interviewed reported that they struggled with implementing MFL and would have
benefited from earlier and more frequent training and support. Nine of 15 staff members
involved in the study did not participate in any Pearson-provided training on MFL.
Product: Reading Horizons Elevate
Site Portrait
Site 12 provides alternative education programs for dropouts and at-risk youth, ages 16–20, in a
large urban area through a multicampus system. A charter school within the city’s school
district, Site 12 operates at 19 sites, often within existing high schools. Many Site 12 students
have reenrolled in school after a 3-month to 1-year period of disengagement, with some
students having been out of school for up to 3 years. Students in Illinois are eligible to receive a
high school diploma until age 21. Students are typically enrolled in Site 12 schools for 18
months.
The mission of Site 12 goes beyond simply the attainment of a high school diploma. Site 12
provides academic classes, academic remediation, and support for social emotional
development to help students earn a diploma through multiple pathways tailored to their needs.
Site 12 also offers career pathways, support for college enrollment, and support services for
workforce readiness.
Site 12 students face multiple barriers that lessen their engagement with traditional schooling:
poverty, transient living situations, truancy, interactions with the criminal justice system, and low
literacy (e.g., fourth- to sixth-grade reading levels). Students who enter the program with low
reading levels, and whose schools have the appropriate technology infrastructure, are placed in
an online reading intervention course. The goal of the intervention is to boost students’ basic
literacy skills so they have a better chance of passing their general education courses and
recovering credits needed for graduation, as well as being successful in the workforce.
Use Model
Reading Horizons Elevate, familiar to Site 12 through use in its special education instructional
program, is used by students in a literacy lab as a pullout program taught by reading coaches or
specialists. Each site has its own way of implementing the literacy lab. The literacy intervention
must fit within the existing high school curriculum, with a focus on accumulating general
education credits for graduation. As a result, the reading intervention cannot always be in the
form of a full course. Depending on the site, during the study the intervention was delivered as part of an English language arts course, as an extra session during lunchtime or a study period, as a pullout activity during a regular class, or as an elective credit. The literacy intervention labs are
rarely scheduled for before or after school, however, because students would be unlikely to
attend at those times.
The literacy intervention labs and the use of Reading Horizons Elevate are overseen by a group of reading specialists assigned to sites for this purpose, some with certification in reading instruction. These specialists work directly with the students. The program coordinator for the literacy intervention labs also acts as a coach for these specialists, providing help with literacy lab setup, student motivation, technology use, and reading pedagogy. She also helps make
sure the intervention is consistently implemented across the sites, with an emphasis on
competency-based achievement.
How Reading Horizons Elevate was used within the labs varied depending on the site. For sites
with shorter literacy lab periods, instructors monitored students’ self-paced work in a computer
lab or library and answered questions as needed. In other cases, instructors felt that working on
the computer for an entire class period was too much for the students, so they combined
Reading Horizons Elevate with off-computer activities. One teacher let each student have one
day to read a book of their choosing during class. Another mixed in direct instruction or reading
Lexile-leveled articles from another product (Newsela) related to the social justice theme of the
school.
Even though Reading Horizons Elevate allows students to alternate between decoding practice,
reading-in-context, and reading comprehension activities, the program coordinator felt that, over
time, the use of the product as part of the literacy intervention would evolve to a mixed-mode
model. In this future model, students’ use of Reading Horizons Elevate to learn and practice
basic skills would be combined with non-computer-based reading activities that would allow
students to apply and reinforce their improved skills by reading texts of their own choosing.
The program coordinator, through her coaching of the reading specialists, was critical to the successful use of the Reading Horizons Elevate program at Site 12. She served as both a
pedagogical coach and technical support specialist for the instructors. The participating
instructors had different backgrounds in teaching reading to low-skilled adult readers. Typically,
high school English teachers are not prepared to teach basic reading skills and so are not
familiar with the decoding system that is the basis for the Reading Horizons Elevate
pedagogical approach. Thus, Reading Horizons Elevate professional development teaches
instructors not only how to use the software, but also how to approach the teaching of reading
for low-skilled adults.
Students interviewed felt that Reading Horizons Elevate was helping them learn by “breaking
words down and putting them back together.” They felt that the word decoding and reading
practice built into the product helped them learn and that the variety in the program helped them
stay engaged. Students also said that they applied the word decoding skills they learned with
the product in their other classes.
From the outset of the study, the instructors believed it was unlikely that the students would
work on Reading Horizons on their own outside the literacy lab. In the literacy lab, instructors
adopted a variety of strategies to motivate students to make progress in the product (e.g.,
incentives such as gift cards to local fast-food restaurants), yet some instructors still reported
they were disappointed in students’ progress.
Some instructors also noted that students may have viewed the content as too remedial or “juvenile,” not geared toward adults and not relevant to the workplace, which may have discouraged these students’ use of the product.
Product: Reading Horizons Elevate
Site Portrait
Site 13 is one of six specialty schools in a rural school district located in Northern Utah. Site 13
offers programs specifically designed to meet the needs of adult learners, most of whom pursue
one of the following: a high school diploma, a GED certificate, English as a Second Language
(ESL) skill development, literacy or numeracy instruction (starting at or below a high school
graduate level), or a transition to community college.
The program serves about 200 students each day, about 70% of whom are employed. Students
range in age from 16 to 50, although most are between 18 and 25. While the majority of the students enrolled in the Site 13 school are white, the program serves a higher proportion of minority students than the surrounding area.
Each class is taught by one instructor. Classes meet twice per week for 2 hours per session
typically over a 5-week period (students usually enroll in more than one 5-week session). On
average, each class has 15–20 students. After every 40 hours of instruction, each student is
tested using the TABE assessment.
The Site 13 school program has no formal sequence of literacy courses. Instead, students take
(and repeat if needed) the literacy course Reading Improvement until they are prepared to take
more advanced classes, based on the instructor’s assessment of their progress.
Use Model
The Reading Improvement course in which Reading Horizons Elevate was used does not have
a formal curriculum or textbook. Instead, each instructor prepares her own class and covers the
skill areas of phonics, writing, reading aloud, and comprehension.
The instructor who participated in this study used Reading Horizons Elevate during the last 30
minutes of each 2-hour course session, providing a total of 1 hour per week of in-class use.
Students were required to spend 7 hours total on Reading Horizons Elevate to pass the class,
so they also needed to spend 2 more hours working with it outside class. To complete this
requirement, some students chose to return to the school and work in the computer lab,
whereas others worked from home.
The regular classroom had sufficient computers to accommodate all students. After 90 minutes
of instructor-led instruction, students used the classroom desktop and laptop computers for
Reading Horizons Elevate work. While students worked on Reading Horizons, the instructor
circulated among them, checking in and working with individual students. Because of the
adaptive design of Reading Horizons Elevate content, each student worked at his or her own
level and pace, something the instructor appreciated. The instructor commented, “I have to
teach to the middle [during whole-class instruction]. Using Reading Horizons Elevate is a way
for them to have success at their level and improve but also feel part of the class.”
The school had sufficient technology hardware and infrastructure to use Reading Horizons
Elevate. It had recently been awarded grants that provided a laptop cart in each classroom, to
complement the four or five desktop computers in each room.
The instructor also mentioned that students were increasingly bringing their own computers into
the classroom. As observed during a site visit, 3 of 12 students were using their own computers
(one was a tablet) and easily navigated through the Reading Horizons Elevate screens. Two
other students brought in their own headphones.
The instructor had received a full-day orientation to the program from Reading Horizons staff.
She was positive about the experience and reported that on two additional occasions the
Reading Horizons area representative had been helpful, providing a printed handbook of
Reading Horizons materials and answering questions about the product’s Lexile scores. In
general, the instructor felt that Reading Horizons Elevate was easy to learn and that the training
and support were sufficient.
Site 13 participants had very positive reactions to Reading Horizons Elevate. The instructor,
director, and students reported that it was a good addition to the Reading Improvement course.
The students interviewed said they would not enroll in another Reading Improvement class
unless it used Reading Horizons Elevate. The teacher and director emphasized that the
students who seemed most enthusiastic about the product were ESL students.
Students reported that they found the online “short” books used for reading practice engaging
and interesting. Others also liked being able to work on their own: “It’s more private than in the
class. It’s just you and the computer. You don’t want others to know you can’t say the word.
With this, others don’t know what you can’t say.” Other students added, “With the computer,
you can keep repeating the word as much as you want,” and “It’s like you have your own
teacher.”
Challenges to using Reading Horizons Elevate appeared to be minimal. The instructor and
some students reported small technical issues, including a software bug in the Lexile test and
occasionally having the program “freeze up” during use.
Product: Reading Horizons Elevate
Site Portrait
As part of a large school district in Kentucky, Site 14 offers programs for adults to improve basic
skills and prepare for the GED test. It also offers English as a Second Language (ESL),
vocational certifications, career training, and college preparation. Site 14 serves up to 5,000
adult students in a calendar year at 10 sites across a medium-sized city in Southern Kentucky.
Site 14 offers four levels of courses based on students’ incoming skills: Basic (lowest), Foundation (grades 4–6), Intermediate (grades 6–8), and Express (for those nearly ready to
take the GED test). Students are placed according to their precourse TABE assessment results.
Courses are typically 6 weeks long, and most classes meet for 3 hours per day, 4 days per
week. These classes cover both reading and mathematics. On completion, students can
reenroll at the same level or qualify with their posttest TABE score for a higher level course.
All TABE assessments were administered in a computer lab at the central program site. At this
location, counselors were also available to help students make education and career
preparation plans.
Use Model
Site 14 started using Reading Horizons Elevate in January 2016. Instructors were given
flexibility in how to include it in their courses, and each used it slightly differently. Typically, the
instructors interviewed reported having students use Reading Horizons Elevate for 90 minutes
per class, 4 days a week, for the first 3 weeks of the session. The instructors then began direct
instruction on TABE-related content for the remaining 3 weeks of the session. Reading Horizons
Elevate was not integrated into any other reading instruction provided. Variations of this use
model were as follows:
• Local Site 1. The instructor started her class (three meetings per week for a total of 4
hours) with a direct-instruction introductory phonics lesson using instructions provided by
Reading Horizons. Students then worked individually for the remainder of the 90-minute
session on Reading Horizons Elevate. The instructor used this format for the first three
sessions before switching to having students work on the product for the entire 90-
minute session. After 3.5 weeks (20 instructional hours), the instructor stopped using
Reading Horizons Elevate and began teaching her own lessons.
• Local Site 2. The instructor used Reading Horizons Elevate through the first 3–4 weeks
of her session, totaling 20 hours of instructional time. Students worked independently on
the product for the entire 90-minute reading period.
• Local Site 3. The instructor also had students work on the product for the entire 90-
minute class session. However, because the computers at this site needed updating,
this instructor was not able to start using Reading Horizons Elevate with students until several weeks into the class, when arrangements were made for the class to use the
computers in another classroom. The instructor provided direct instruction on reading in
the weeks before and after the use of Reading Horizons Elevate.
• Local Site 4. The instructor supported student use of Reading Horizons Elevate during
an “open” lab time. Students had the freedom to use the lab and product as much or as
little as they wanted. The instructor was available to support all students in the lab,
including those students not using Reading Horizons Elevate.
The use of the Reading Horizons Elevate dashboard also varied across the instructors
interviewed. Two instructors referred to it regularly, checking the number of hours students were
spending on the program. These instructors checked student Lexile scores and the scores that
students earned on program modules. One instructor did not use the dashboard at all, and
another used it to identify a few students who were struggling with specific skills so as to provide
them with targeted supplemental materials.
At each of the adult education sites, computers were available in the classrooms, and few students had computers of their own. One program staff member spent 2 to 3 weeks helping each site get started on Reading Horizons Elevate by providing headphones, helping students log in, and diagnosing issues when computers froze.
Reading Horizons personnel provided a full-day training in September 2015. They provided
refresher training in late December 2015 remotely, and Site 14 began using Reading Horizons
Elevate in January 2016. The Reading Horizons area representative reached out to Site 14 by
email and provided clarification on a technical question the instructors wanted help with. Staff
also participated in a 7-hour training in April 2016 provided by Mockingbird Education. This
training was provided after instructors finished using Reading Horizons Elevate with their
students.
Overall, instructors reported that students liked Reading Horizons Elevate and felt that the product was very effective. For many students, this was their first use of a computer. Instructors
appreciated that this experience helped students gain confidence in using computers and build
their computer literacy skills since the students will take an online version of the GED exam. As
one student commented, “I loved it. At first I was scared to death, I thought I was going to break
the computer or something, but I got the hang of it. Now I can read a lot. It brought up my skills.”
Instructors felt that ESL students used the product the most and benefited most from it. They
also reported that those students who were more invested in their learning and motivated
benefited more from the product than others. They believed that those who were less motivated
were more likely to click through the lessons “mechanically,” with less reflection.
The instructors’ greatest concern was the product’s alignment with the content covered by the
TABE assessment. They commented that because the content taught within Reading Horizons Elevate is not directly tested on the TABE, use of the product took time away from teaching TABE-related content; several instructors observed their students’ TABE scores decrease over the course of the session in which the product was used. As one instructor stated, “I
got very positive feedback. Students loved it, and they got computer skills. My concern was that
it was not connected to our content.”
Some students reported issues associated with working on the product for extended periods of
time. These students, who worked on Reading Horizons Elevate for up to 90 minutes per
session, said the content and activities became tedious and less engaging as time
passed during a session. Finally, some students expressed frustration that they could not skip
ahead to the next lesson until they achieved a passing score on the assessment, forcing them
to repeat lessons multiple times until they did so.
SRI Education, a division of SRI International, is tackling the most complex issues in education
to identify trends, understand outcomes, and guide policy and practice. We work with federal
and state agencies, school districts, foundations, nonprofit organizations, and businesses to
provide research-based solutions to challenges posed by rapid social, technological, and
economic change. SRI International is a nonprofit research institute whose innovations have
created new industries, extraordinary marketplace value, and lasting benefits to society.