Identifying The Barriers To Games and Simulations in Education
Abstract
The purpose of this study was to create a valid and reliable instrument to measure
teacher perceived barriers to the adoption of games and simulations in education.
Previous research, interviews with educators, a focus group, a think-aloud protocol,
and an expert review were used to design a survey instrument. After finalization, the
survey was made available for trial on the Internet for a group of educators
(N = 255). A portion of the survey required respondents to rate to what degree
32 potential barriers were perceived as an impediment to the adoption of games and
simulations. Some of the highest rated barriers included cost of equipment, lack of
time to plan and implement, inability to try before purchasing, lack of balance
between entertainment and education, lack of available lesson plans/examples, lack
of alignment to state standards/standardized testing, inability to customize a game/
simulation, and inability to track student progress within the game/simulation. An
exploratory factor analysis identified seven factors that accounted for 67% of the
variability in the respondents’ rankings. Several factors were found to have significant
interactions with other questions on the survey. Implications of these results, as well
as suggestions for future research, are discussed.
1 Morehead State University, KY, USA
2 School of Teaching and Learning, College of Education, University of Florida, Gainesville, FL, USA
Corresponding Author:
Albert D. Ritzhaupt, School of Teaching and Learning, College of Education, University of Florida, 2423 Norman
Hall, PO Box 117048, Gainesville, FL 32611, USA.
Email: [email protected]
Justice and Ritzhaupt 87
Keywords
games, simulations, teachers, adoption, barriers, education
Introduction
Presently, technology-based games and simulations have been identified as a
potential learning tool (Aldrich, 2005; Annetta, Mangrum, Holmes, Collazo, &
Meng-Tzu, 2009; Barko & Sadler, 2013; Gee, 2003; Halverson, 2005; Hamlen,
2010; Mayer, Warmelink, & Bekebrede, 2012; Prensky, 2001; Shaffer, 2006;
Shaffer, Squire, Halverson, & Gee, 2005; Squire, 2006; Warburton, 2009).
For example, Rieber (1996) points out that play and imitation are natural
learning strategies; therefore, students of all ages can play games to
accommodate and assimilate extensive critical thinking and problem-solving skills.
A large portion of research has focused on identifying the benefits of using
games and simulations in education (Ke, 2008; Koh, Kin, Wadhwa, & Lim,
2011; Reese, 2007; Ritzhaupt et al., 2010; Shaffer et al., 2005; Sliney,
O’Mullane, & Murphy, 2009; Squire, 2005). Some of the identified benefits
include increasing student motivation and engagement; enhancing
problem-solving skills, peer learning, and collaboration; facilitating language acquisition;
stimulating information assimilation and retention; improving the
integration of concepts and thinking cross-functionally; and learning in a failsafe
environment (Ferdig & Boyer, 2007; Gee, 2003; Koh et al., 2011; Reese,
2007; Rosas et al., 2003; Royle & Colfer, 2010; Torrente, del Blanco,
Marchiori, Moreno-Ger, & Fernandez-Manjon, 2010; Vos & Brennan, 2010).
Additionally, technology-based games, in particular, have become integral parts
of our social and cultural environment (Oblinger, 2004). For all of these
reasons, educators have been looking for technology-based games and simulations
to facilitate the learning experience by creating a new learning culture that
better corresponds with students’ habits and interests (Kiili, 2005; Prensky,
2001; Sanford et al., 2006; Warburton, 2009).
Given the popularity of the concept of incorporating games and simulations
in education, as well as all of the potential benefits, some researchers have
found that game and simulation adoption into education has been slow
(Barko & Sadler, 2013; Gee, 2003; Gee & Levine, 2008; Kenny & Gunter,
2011; Koh et al., 2011; Prensky, 2001). For this reason, several researchers
have tried to identify the obstacles in the adoption of games and simulations
in education (Baek, 2008; Becker & Jacobsen, 2005; Boyle, Connolly, Hainey,
& Boyle, 2012; Egenfeldt-Nielsen, 2004; Kebritchi, 2010; Kenny & Gunter,
2011; Lean, Moizer, Towler, & Abbey, 2006; Moizer, Lean, Towler, &
Abbey, 2009; Rice, 2007; Ritzhaupt et al., 2010; Simpson & Stansberry,
2008). However, at present, it has been suggested that researchers are not
taking a broad enough approach to identifying the barriers to the adoption of
games and simulations in education.
88 Journal of Educational Technology Systems 44(1)
Purpose
Because of the lack of a widely accepted instrument to measure the barriers to
the use of games and simulations in the classroom, the purpose of this study is to
create a valid and reliable comprehensive tool to discover the barriers recognized
by educators in the use of games and simulations in their classrooms. More
specifically, this survey is comprehensive in that teacher demographic
information (i.e., gender, age, ethnicity, highest degree earned) and teacher unfamiliarity
with games and simulations are taken into consideration when identifying the
perceived barriers to the adoption of games and simulations in formal education.
Therefore, while designing our comprehensive tool, we kept our research
objectives in mind.
Conceptual Framework
Technology-based games and simulations can be considered a newer, more
innovative technology. Unfortunately, education has been especially resistant
to change and this has become more obvious in the adoption, or lack of
adoption, of instructional technology (Germanne & Sasse, 1997). Innovation causes
change; resistance to change is a natural reaction to the uncertainty that any
transformation creates (Rogers, 2003). One of the best known and
well-respected attempts to describe the adoption of new ideas (or technology)
through cultures is the theory of Diffusion of Innovation put forth by Everett
Rogers (2003). The theory is complex and its full spectrum is beyond the scope
of this article, but a brief synopsis, using game and simulation adoption as the
innovation, is included in Table 1 to address this concept.
As research indicates, the adoption of any new technology is not an easy
proposition for educators (Taylor, 2008). Identifying the barriers to adopting
games and simulations will not only give insight into the adoption process but
also may assist teachers, instructional designers, administrators, and policy
makers in understanding how to successfully adopt this instructional
technology. Additionally, those individuals who create and market games and
simulations may find this information useful as well.
Method
To create this instrument, we combined previously conducted research with
interviews of educators to design a comprehensive survey of the barriers to
the adoption of games and simulations in curriculum. Using these educator
interviews and previously identified barriers from research, we created a draft
of the instrument. We used a focus group, expert review, and think-aloud
protocol to increase the accuracy and efficacy of the survey instrument (American
Educational Research Association, American Psychological Association, and
National Council on Measurement in Education [AERA, APA, & NCME],
1999; Beatty, 2004; Chioncel, Van Der Veen, Wildemeersch, & Jarvis, 2003;
Grant & Davis, 1997; Jones & Hunter, 1995; Rabiee, 2004; Van Someren,
Barnard, & Sandberg, 1994; Vogt, King, & King, 2004). Upon the finalization
of the survey, we tested the survey by distributing it to a group of educators. We
then analyzed the results to ensure that the data gathered corresponded to the
intent for which the survey instrument was designed and to determine the
teacher perceived barriers to the use of games and simulations in formal education.
Figure 1 visualizes the survey development process.
Interviews
Interview participants included educators from all grade levels (i.e.,
elementary, presecondary, secondary, postsecondary, adult education) and from
all learner levels (i.e., low-level learners, general learners, gifted learners).
Interviews were done in person or by phone, depending on the location and
schedule of the interviewee. All interviewees were either public or private school
educators who were teaching during the 2011–2012 school year. We began with
a few educators who were acquaintances interested in being
interviewed. We then used the snowball, or chain referral, sampling technique
where each interviewee was given an opportunity to suggest another educator
to be interviewed (Biernacki & Waldorf, 1981; Noy, 2008; Penrod, Preston,
Cain, & Starks, 2003). We stopped the interview process when the data collected
from interviewees started becoming repetitious (Biernacki & Waldorf, 1981).
Due to the nature of this type of sampling, there are some limitations. For
example, respondents largely controlled who was being interviewed since they
suggested potential participants; therefore, many interviewees may have similar
views or social opinions since they belonged to the same social circles.
Additionally, we could not verify some interview information since we could
not observe the interviewees teaching. For instance, when asking about their
usage of games and simulations in their curriculum, interviewees may, due to
social reasons or to please the interviewer, have inflated their usage. For this
same reason, interviewees without gaming experience may have biased their
answers due to their lack of knowledge about this type of media. Similarly, those
individuals who have not had good experiences with video games may have
biased their responses about video games.
The interviews contained questions to determine demographic information
(i.e., age, gender, ethnicity, education). Additionally, educators were asked
about their experience with and opinion of games and simulations since these
characteristics may influence their opinions (Kenny & Gunter, 2011; Ritzhaupt
et al., 2010). Given that a key component of the research is the conceptual
framework of the theory of Diffusion of Innovation, we composed several
interview questions with this in mind. For example, understanding the importance of
seeing the benefits of adoption from other adopters in the Diffusion of
Innovation, we asked, “Have you seen a co-worker successfully using games
and simulations in his or her classes?” (Rogers, 2003). With the sampling
limitations in mind and the potential partiality of interview responses, we combined
the results of the interviews with the corresponding research to build the
foundation of the survey instrument for a broader perspective and reduction in bias.
For instance, by using the interview responses to generic barrier questions, along
with researcher-suggested barriers, we were able to determine a more precise list
of potential barriers.
Focus Group
After we compiled the information to help design a draft of the survey
instrument, but before finalizing and distributing the survey, we gathered a focus
group of educators to validate the content of the survey instrument draft
(Chioncel et al., 2003; Grant & Davis, 1997; Rabiee, 2004; Vogt et al., 2004).
Participants in the focus group were eight of the interviewed educators, those who
were able to meet and discuss the survey at one time (availability was the only
criterion for participation). The focus group consisted of two men and six women. The purpose
of the focus group was to check the information, especially the list of potential
barriers in the adoption of games and simulations, for accuracy and to help
clarify any information that was found confusing during the interview process.
Upon arrival to the meeting, participants were given a copy of the survey
draft. Each question had a blank to the side so that participants could grade
each item (Jones & Hunter, 1995). Since all participants were educators, they were
very familiar with the traditional A to F grading scale. We asked them to read
through the survey, independently, and grade each item. When everyone had
completed the grading process, we then discussed any item that received a grade
lower than an “A.” Participants were asked to grade questions based on
grammar, clarity, and general opinion. When a question received a
lower grade, we discussed why it did not receive an “A” and what the
participant did not like about the survey item. Other participants were encouraged
to comment and discuss the survey item during this process. We followed this
process with each survey item.
Expert Review
Following the incorporation of the comments of the focus group, the survey
instrument was ready for an expert review. The expert review consisted of three
educational technology faculty members holding terminal degrees with extensive
teaching and research experience in the field. One of these faculty members was also
an expert in research methodology and psychometrics. These professionals each
reviewed an electronic copy of the survey, separately, and returned their
electronic edits and comments. The expert review helped to further clarify the
survey. The professionals’ comments helped to reduce potential confusion of
survey respondents. Since these experts had not previously seen our survey,
they had a fresh insight into each survey item. For instance, one professional
suggested the addition of mobile devices to the question about use of technology
in curriculum. Also, because of their expertise, these professionals could easily
identify questions that were not worded clearly enough to capture the
meaning of the question. For example, one professional suggested changing the
“College/University/Technical” category to “Postsecondary” and to include
examples for each category to eliminate confusion with the “Adult
Education” category.
Think-Aloud Protocol
After incorporating the suggestions from the expert review, the survey draft
underwent a think-aloud protocol (Van Someren et al., 1994). We utilized
this type of usability test to review the survey for clarity and intent. Three
people agreed to perform the think-aloud protocol. Participants were
educators with varying experience with games and simulations (i.e., used them in
class, played them out of class, had no experience whatsoever) and varying
experience with this research (i.e., interview participation, focus group
participation, no other participation). The participants were instructed to say
everything they were thinking out loud. Participants frequently fell silent in
thought while answering an item; when this happened, we reminded them that
we needed to hear what they were thinking, repeating this reminder as needed
throughout each protocol.
Survey Design
After using the expert review, the focus group, and the think-aloud protocol
techniques to increase the accuracy and efficacy of the survey draft, the survey
was considered complete (AERA, APA, & NCME, 1999; Beatty, 2004; Van
Someren et al., 1994). To test that the survey gathered the information for
which it was designed (determining the teacher perceived barriers to the use of
games and simulations in formal education), the survey instrument was
distributed to a large number of educators so that enough responses could be
acquired and measured (Johanson & Brooks, 2010).
The survey consisted of 18 questions, two of which were open-ended, six of
which had the option of writing in an answer for clarification, and one of which
contained a list of 32 potential barriers for the respondent to rank (see Appendix A).
Respondents could rank these potential barriers on a scale from 0 to 4, where
0 is not a barrier, 2 is somewhat a barrier, and 4 is a definite barrier. If respondents
could not relate their experience to a potential barrier, they could mark don’t know
instead of ranking it from 0 to 4. Respondents were also allowed the option of
writing in a barrier to the adoption of games and simulations in the classroom in
case their barrier(s) was not on the list to rank. Additionally, five questions were
intentionally designed to have multiple responses. For example, respondents
were asked their opinion about how games or simulations could be useful for
educational purposes. The respondents were asked to mark all the answers that
apply, and there was a space available for them to write in an option if they
wanted to add anything.
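When analyzing responses on a scale like this, the don’t know option must be kept out of the numeric ratings so that it does not distort item means. A minimal sketch of that recoding step, using pandas and hypothetical item names (not the survey’s actual wording):

```python
import pandas as pd

# Hypothetical raw responses to three of the 32 barrier items (names are
# illustrative only). "DK" marks a "don't know" response, which falls
# outside the 0-4 rating scale.
raw = pd.DataFrame({
    "cost_of_equipment": [4, 3, "DK", 2],
    "lack_of_time":      [3, "DK", 4, 4],
    "class_size":        [0, 1, 2, "DK"],
})

# Coerce ratings to numeric; "don't know" becomes missing (NaN) so it
# does not distort the item means.
ratings = raw.apply(pd.to_numeric, errors="coerce")

# Item means are then computed only over respondents who gave a 0-4 rating.
item_means = ratings.mean()
print(item_means)
```

Treating don’t know as missing, rather than as a zero, keeps the item means on the intended 0 to 4 interpretation.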
until after the data were collected. These mistakes only affected the ability to
analyze the data and did not affect any actual data. More information can be
found in the Limitations and Delimitations section later. There were 275
individuals who opened the instrument and answered at least the first question, the
Informed Consent question.
level of education will be negatively affected (i.e., not learning the intended
lesson or not learning as well from a game as from other educational resources).
This category had the highest Cronbach’s alpha (α = .93), which suggests
high internal consistency among the six ranked barriers (Nunnally, 1978).
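For readers unfamiliar with the statistic, Cronbach’s alpha can be computed directly from a respondents-by-items rating matrix. A minimal sketch with made-up ratings (not the study’s data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings (5 respondents x 3 items) on the 0-4 barrier scale.
ratings = np.array([
    [4, 4, 3],
    [2, 3, 2],
    [0, 1, 1],
    [3, 3, 4],
    [1, 2, 1],
])
print(round(cronbach_alpha(ratings), 2))  # prints 0.94
```

Alpha rises when items vary together relative to the total-scale variance, which is why highly intercorrelated items like those in this category produce values near 1.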
Correlations among the seven factors:

      1       2       3       4       5       6       7
1   1
2   .364**  1
3   .293**  .474**  1
4   .482**  .582**  .458**  1
5   .675**  .543**  .389**  .508**  1
6   .429**  .480**  .523**  .500**  .504**  1
7   .400**  .408**  .292**  .470**  .415**  .388**  1

**Correlation is significant at the .01 level (two-tailed).
Item (M, SD):

- The opinion that games and simulations cause problems with classroom management and in-class student behavior (M = 1.01, SD = 1.301)
- The perception that games may cause student behavioral problems (i.e., violence or aggression) (M = 0.85, SD = 1.217)
- The perception that games may cause student obsession or addiction (M = 0.92, SD = 1.213)
- The concern that students will not learn the intended lesson using the game/simulation (M = 1.36, SD = 1.302)
- The opinion that students learn more from a teacher than from a game or simulation (M = 1.17, SD = 1.267)
- The opinion that other learning strategies are more effective than using games or simulations (M = 1.41, SD = 1.295)
Although these data are approximately normally distributed, with a mean of 1.12
and a standard deviation of 1.08, they are moderately positively skewed and
slightly leptokurtic (skewness = .97). The highest ranked barrier in this category
was the opinion that other learning strategies are more effective than games and
simulations. The lowest ranked barrier was the perception that games and
simulations may cause student aggression or behavioral problems.
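Skewness and kurtosis figures like these can be reproduced with standard routines. A small sketch with hypothetical factor scores (not the study’s data), using SciPy’s statistics:

```python
import numpy as np
from scipy.stats import kurtosis, skew

# Hypothetical factor scores on the 0-4 scale with a long right tail,
# mirroring the moderate positive skew described above.
scores = np.array([0, 0, 0.5, 0.5, 1, 1, 1, 1.5, 2, 3, 4])

print(f"skewness = {skew(scores):.2f}")    # positive value: right tail
print(f"kurtosis = {kurtosis(scores):.2f}")  # Fisher definition: 0 for a normal curve
```

A positive skew here means most respondents rated these items low, with a smaller group rating them as strong barriers.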
Technology Issues
Technology Issues was defined as a category of the barriers to the addition of
games and simulations in education that concerns problems with technology.
Item (M, SD):

- Lack of games and simulations for disabled students (i.e., access, equipment, game/simulation options) (M = 1.69, SD = 1.344)
- Cost/expense of games/simulations/equipment (M = 2.62, SD = 1.234)
- Inability to try a game or simulation before purchase (M = 2.41, SD = 1.330)
- Lack of access to games and simulations outside of school (M = 1.85, SD = 1.377)
- Lack of technical support (for teachers and students) (M = 1.93, SD = 1.424)
- Lack of technology reliability (M = 1.71, SD = 1.372)
This category includes difficulties with the technology itself, such as the cost, the
inability to preview, and the lack of accessibility to disabled students.
Additionally, this category also reflects the usage of the technology, for instance,
the reliability of the technology, the amount of technical support available, and
if the technology can be easily accessed outside of school. The item level statistics
are illustrated in Table 5.
This category had a Cronbach’s alpha of .80, which suggests a good relationship
among the six barriers in this group (Nunnally, 1978). These data are also
approximately normally distributed, with a mean of 2.04 and a standard deviation
of .95, and appear approximately symmetric (slight negative skew) and somewhat
platykurtic (skewness = −.22). The highest rated barrier in this category was the
cost or expense of the game or simulation. The lowest rated barrier was the lack
of game and simulation options for students with disabilities.
Teacher Issues
Teacher Issues was identified as a category of the barriers to the addition of
games and simulations in education that concerns problems that teachers may
face. Some of these barriers include time (i.e., to plan and implement the use of a
game or simulation), finding games or simulations that match state standards or
standardized testing, the lack of available lesson plans or examples of game and
simulation incorporation, and characteristics of the teacher (i.e., not motivated
to use games and simulations, not very tech savvy, lack of knowledge about
games and simulations). The item statistics are shown in Table 7.
Item (M, SD):

- The perception of the term game (rather than the term educational simulation, for instance) (M = 1.55, SD = 1.448)
- Lack of evidence to support the use of games and simulations in education (M = 1.46, SD = 1.239)
- Lack of parental and community support for the use of games and simulations in classrooms/lessons (M = 1.58, SD = 1.360)
- Lack of clear expectations, by administrators, for teacher usage (M = 1.55, SD = 1.321)
- Lack of administrative support (M = 1.57, SD = 1.425)
The Teacher Issues category had a Cronbach’s alpha of .79, suggesting good
internal consistency among the six barriers in this group (Nunnally, 1978). The
data are normally distributed with a mean of 1.65, a standard deviation of .87,
approximate symmetry, and a moderately flattened curve (skewness = .67). The
highest ranked barrier in this category is time to plan and implement. The lowest
ranked barrier in this category is the respondent’s own technology abilities.
Student Issues
Student Issues was defined as a category of barriers to the adoption of games
and simulations in education that are specific to students, as shown in Table 9.
Item (M, SD):

- Lack of student motivation to use games and simulations in lessons (i.e., students do not seem interested in games/simulations) (M = 0.61, SD = 0.942)
- Varying student abilities (i.e., technology skills, learning ability) (M = 1.05, SD = 1.038)
For instance, students have a wide range of technical abilities, which makes it
difficult for a teacher since some students may need extra help and other students
may become bored while waiting for others to catch up. Additionally, some
students may not be very motivated to use a game or simulation in class. For
example, the student may have had a previous bad experience with a game or
simulation and thus may have decided not to try any more games or simulations.
This category had a Cronbach’s alpha of .73, just above the .70 threshold of
acceptability, which suggests a relationship between the two barriers in this group
(Nunnally, 1978). As with the subsequent category, Incorporation Issues, this
category, Student Issues, may have a low Cronbach’s alpha because there are
only two barriers in this group. These data were approximately normally distributed,
with a mean of .83 and a standard deviation of .88, but moderately positively
skewed and leptokurtic (skewness = 1.00). Since there were only two barriers in this
category, one is the highest (varying student abilities) and one is the lowest
(lack of student motivation).
Incorporation Issues
Incorporation Issues was described as a category of barriers to the adoption of
games and simulations in education that reflect some of the specific issues in the
integration of this technology into the classroom. For example, many games and
simulations are too complex to fit into one class period or there may be too
many students in the class to help each student effectively. The item level
statistics for this construct are shown in Table 10.
The Incorporation Issues category had the lowest Cronbach’s alpha (α = .65).
These data showed a normal distribution with a mean of 1.75, a standard
deviation of 1.03, and an approximate symmetry with moderate platykurtosis
(skewness = .33). The highest ranked barrier in this category is the lack of
customizability or adaptability in a game or simulation. The lowest barrier in
this category is class size.
grade level taught by the respondent had two relationships approaching
significance: Technology Issues (p = .061) and Teacher Issues (p = .058).
Analysis of Variance
ANOVAs for each dependent variable were conducted as follow-up tests to the
MANOVA results that were significant at the .05 level. The four follow-up
analyses included two independent variables, gender and frequency of game
play. The ANOVA using gender and Issues With Negative Potential Student
Outcomes resulted in a significant difference between males and females
(F = 3.286, p = .030). Females had a mean of 0.99 (SD = 0.97, n = 145) and
males had a mean of 1.38 (SD = 1.26, n = 72). These results suggest that, on
average, males rated items in this category, Issues With Negative Potential
Student Outcomes, as more of a barrier than females did.
Also, the ANOVA using gender and technology issues resulted in a significant
difference between males and females (F = 6.164, p = .032). Females had a mean
of 2.16 (SD = 0.93, n = 146) and males had a mean of 1.80 (SD = 0.96, n = 69).
These results suggest that technology issues were, on average, more of a barrier
for females than for males. An additional significant difference between genders
surfaced in the ANOVA examining gender and teacher issues (F = 3.393,
p = .031). Females had a mean of 1.75 (SD = 0.85, n = 145) and males had a
mean of 1.47 (SD = 0.87, n = 71). These results suggest that teacher issues were,
on average, more of a barrier for females than for males.
A follow-up ANOVA between the independent variable of respondents’ game
play frequency and the dependent variable of technology issues yielded results
that were not significant at the .05 level (F = 2.030, p = .076). However, by the
definition applied previously, these results are approaching significance, since
they would be considered significant at the .10 level.
Game play frequency was based on the average amount of time the
respondent played games in one week. There were six categories: 0 hours per week
(M = 2.28, SD = 0.95, n = 29), 0–2 hours per week (M = 2.16, SD = 0.92,
n = 90), 2–5 hours per week (M = 1.93, SD = 0.92, n = 53), 5–10 hours per
week (M = 1.97, SD = 1.05, n = 23), 10–25 hours per week (M = 1.68,
SD = 0.97, n = 17), and >25 hours per week (M = 1.00, SD = 0.67, n = 3).
These results suggest that individuals who had less experience playing games
rated items in the Technology Issues category as more of a barrier.
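A follow-up one-way ANOVA of this kind can be sketched with SciPy. The group scores below are simulated, and only three of the six frequency groups are shown, so the F and p values are illustrative only:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Hypothetical Technology Issues factor scores for three game-play
# frequency groups (the study used six); group means decrease with
# play time, mirroring the pattern reported above.
never = rng.normal(loc=2.3, scale=0.9, size=30)
light = rng.normal(loc=2.0, scale=0.9, size=90)
heavy = rng.normal(loc=1.7, scale=0.9, size=20)

# One-way ANOVA: does mean factor score differ across frequency groups?
f_stat, p_value = f_oneway(never, light, heavy)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

With real survey data, each group would simply be the subset of respondents who selected that frequency category.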
identifying seven factors that accounted for most of the variability in the
respondents’ rankings. These seven categories of barriers were as follows:
Issues With Negative Potential Student Outcomes, Technology Issues, Issues
Specific to Games and Simulations, Teacher Issues, Issues With Games and
Simulations in Education, Incorporation Difficulties, and Student Ability. The
factor analysis supported the construct validity of our survey for these data.
Additionally, we conducted internal consistency reliability analyses and
demonstrated that the survey was internally consistent for these data.
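An exploratory factor analysis along these lines can be sketched with scikit-learn. The extraction and rotation choices below are generic assumptions, since the paper’s exact procedure is not reproduced here, and the data are simulated:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)

# Simulated data: 255 respondents x 32 barrier items generated from
# 7 latent factors plus noise, loosely mirroring the study's design.
n_respondents, n_items, n_factors = 255, 32, 7
latent = rng.normal(size=(n_respondents, n_factors))
loadings = rng.normal(size=(n_factors, n_items))
ratings = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

# Extract 7 factors with a varimax rotation (scikit-learn >= 0.24).
fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(ratings)

# Each row of components_ holds one factor's loadings on the 32 items;
# items with large absolute loadings define that factor.
print(fa.components_.shape)  # (7, 32)
```

In practice, each extracted factor is then named by inspecting which survey items load most heavily on it, as was done for the seven barrier categories above.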
Technology Issues
All of these individual barriers can be thought of as barriers for any technology,
not just games and simulations. For example, researchers have identified serious
problems with obtaining top-quality equipment and software in schools: the cost of
the technology, its availability and accessibility to staff and students, and the
inability to try out products, like simulations, before buying (Becker & Jacobsen,
2005; Egenfeldt-Nielsen, 2004; Koh et al., 2011; Lean et al., 2006; Moizer
et al., 2009; Rice, 2007; Rosas et al., 2003; Royle & Colfer, 2010; Russell &
Shepherd, 2010; Summers, 2004; Torrente, Moreno-Ger, Martinez-Ortiz, &
Fernandez-Manjon, 2009).
The gender of the respondent also significantly (p = .032) influenced how the
individual barriers in the Technology Issues category were ranked. In general,
female educators ranked individual technology items as more of a barrier than
male educators. Given some of the research on females and technology, this
result is provocative. For example, one study suggests that technology is a
male domain since males have positive attitudes toward technology, report
fewer problems with technology, and can integrate technology smoothly into
lessons (Bourgonjon et al., 2011). Additionally, Abbiss (2008) broadly
characterized technology as a male domain and a female deficit. Conversely, many
researchers claim that females use technology just as much as males or that
females use technology in different ways than males; therefore, their motivations
and level of confidence may not be the same as males (Annetta et al., 2009;
Jensen & De Castell, 2010; Joiner et al., 2011; Padilla-Walker, Nelson, Carroll,
& Jensen, 2010; Wilson, 2006). These starkly opposing sides to the question of
gender and technology, along with this study’s significant results, indicate this
subject deserves a closer look; however, this falls outside the purpose of this
study to create a valid and reliable survey.
Teacher Issues
This factor contains individual barriers that, although they originate from
different places (i.e., administrator pressure, state standardization,
understanding complicated technology), ultimately create a perceived barrier
for the teacher. Although many researchers claim that a major barrier to
adopting games and simulations in the classroom is the characteristics of the
teacher, perhaps, at least in this case, it is more about the beliefs and perceptions
of the teacher, rather than the characteristics of the teacher (Egenfeldt-Nielsen,
2004; Niederhauser & Stoddart, 2001; Ritzhaupt et al., 2010; Rosas et al., 2003;
Royle & Colfer, 2010; Simpson & Stansberry, 2008; Taylor, 2008; Virvou,
Katsionis, & Manos, 2005). For example, Simpson and Stansberry (2008)
suggest that political mechanisms, such as high-stakes testing, put pressure on
teachers to improve their students’ test scores. Unless a technology is perceived
as a guaranteed improvement of students’ scores, teachers may not risk an
unknown technology, like games and simulations, which could take time away
from technologies and lessons that have been proven to increase test scores.
Another example is teachers’ perceived confidence in their knowledge of
games and simulations. Since researchers consistently identify training as a
barrier to the adoption of any instructional technology, this could be a perceived
problem for teachers who have had little to no training in the use of games and
simulations (King, 2002; Koh et al., 2011; Kotrlik & Redmann, 2009;
Niederhauser & Stoddart, 2001; Royle & Colfer, 2010; Simpson & Stansberry,
2008; Smarkola, 2007).
Student Issues
This barrier construct contained two individual barriers: the lack of student
motivation and the variation in student abilities (i.e., technology skills, learning
abilities). Neither of these barriers is specific to games and simulations but could
potentially be a barrier for any new technology. Both a lack of student motivation
and a wide range of student skill and experience would make it difficult for an
educator to keep all students on task: some may be disinterested or already
familiar with the technology and become bored, whereas others will be lost
and need extra help (Schrum, Burbank, Engle, Chambers, & Glassett, 2005;
Vos & Brennan, 2010). Interestingly, there were no interactions with this barrier
category that were significant or approaching significance. One barrier did have the
lowest averaged ranking (i.e., the lack of student motivation to use games and
simulations). The other barrier was not rated very highly either (i.e., varying
student abilities), which suggests that perhaps this category is not much of a
perceived barrier overall to the adoption of games and simulations in education
in this nonrandom sample.
Incorporation Issues
This category contains three barriers: class size, complexity, and class period length. Class size can be considered a barrier typical of integration into schools generally, since many teachers have no control over the number of students in their classrooms. The complexity of games and simulations can be thought of as a game-specific barrier: because they are difficult to play through in one day, most students will have little recollection of the previous day’s session and will essentially start from scratch each day the game or simulation is played (Egenfeldt-Nielsen, 2004; Squire, 2006). Interestingly, the length of the class period could be considered both a school-based and a game-specific barrier. The length of a class period is not controlled by the teacher but is dictated by the school, so it could be considered a school-based barrier; at the same time, because of the way games and simulations are constructed, a typical game or simulation cannot be completed within a single class period, so it could also be considered a game-specific barrier.
Another identified problem with using games and simulations in the class-
room is the amount of class time needed for this complex software (Kebritchi,
2010; Koh et al., 2011; Lean et al., 2006; Royle & Colfer, 2010; Sanchez, 2009;
Torrente et al., 2010). It is difficult to learn to play a game within one class
period and then continue that play a day or two later. Additionally, class size
can be a barrier for the introduction of any new technology. With the use of
games and simulations in education, Egenfeldt-Nielsen (2004) cites larger class
sizes as a barrier to adoption. The Incorporation Issues category had no significant or approaching-significant interactions.
biased toward issues within their own educational level. However, since the
intent was to test the survey, a completely random sample was not absolutely
necessary for our purposes. Another limitation is researcher bias and its impact on the interpretation of the data collected during this study, especially the interpretation of the interviews and the definition of the seven barrier categories produced by the exploratory factor analysis.
An additional limitation of this study was not discovered until the data were analyzed. A few of the questions were not entered correctly into the Internet survey host, which prevented their use in the data analyses. For example, a question about which grade levels would benefit from the addition of games and simulations allowed respondents to choose more than one grade level, and the option for multiple answers made this question ineligible for statistical analysis. Had these data been collected correctly, they could have been included in the statistical analyses, allowing a broader approach to identifying the barriers to the adoption of games and simulations in education.
A delimitation of this study is that the survey results may not generalize to all learning with games and simulations. For example, the results may not apply to a company that trains its employees through computer simulations or a branch of the military that trains its soldiers with them. The results may also not be completely generalizable to online educational programs, since their barriers (e.g., access to the Internet) may be quite different from those of brick-and-mortar schools. Finally, because the respondents in this sample were comfortable with technology, schools with little technology may face more severe technology problems (e.g., access, comfort with technology) than this study reflects.
future researchers may have more valid results. A larger, more random popu-
lation and a corrected survey instrument (i.e., broader scope) may provide new
interactions and more understanding about those interactions that this study
found significant and approaching significant. Future study is imperative to understanding teacher perceived barriers so that those barriers may be overcome and games and simulations successfully introduced into formal education. A final consideration is conducting a confirmatory factor analysis of this instrument on a new sample to verify the factor structure.
Conclusions
Because, at present, very few, if any, studies take a broad, comprehensive look at potential barriers to the adoption of games and simulations in formal education, the purpose of this study was to create an all-inclusive survey to discern these barriers. A comprehensive survey that distinguishes whether teacher perceived barriers vary across grade categories, teacher demographics, and teacher game and simulation experience, and whether the identified barriers are general to the adoption of any new technology or specific to games and simulations, is more likely to become a widely accepted, valid, and reliable instrument for ascertaining the barriers to the adoption of games and simulations in formal education.
To achieve this goal, we combined previously conducted research with interviews of educators to design a draft survey of the potential barriers to the adoption of games and simulations in the curriculum. We used a focus group, an expert review, and a think-aloud protocol to increase the accuracy and efficacy of the survey instrument (AERA, APA, & NCME, 1999; Beatty, 2004; Chioncel et al., 2003; Grant & Davis, 1997; Rabiee, 2004; Van Someren et al., 1994; Vogt et al., 2004). This process in particular supported the content validity of the survey.
We then transferred the survey to the Internet so that we could test it by distributing it to a group of educators. Finally, we analyzed the results using an exploratory factor analysis to ensure that the data gathered corresponded to the intent for which the survey instrument was designed and to determine the teacher perceived barriers to the use of games and simulations in formal education.
The exploratory factor analysis led to the discovery of seven barrier cate-
gories: Issues With Negative Potential Student Outcomes, Technology Issues,
Issues Specific to Games and Simulations, Teacher Issues, Issues With Games
and Simulations in Education, Incorporation Difficulties, and Student Ability.
These categories accounted for approximately 67% of the variance in the results.
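The extraction step behind these seven categories can be illustrated with a minimal, hypothetical sketch in Python. The simulated ratings, the item count, and the use of scikit-learn's maximum-likelihood factor analysis with a varimax rotation are assumptions for illustration only; they do not represent the study's actual data or software.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate Likert-style ratings (0-4) for 255 respondents x 32 barrier items
# (hypothetical stand-in for the survey data).
rng = np.random.default_rng(0)
ratings = rng.integers(0, 5, size=(255, 32)).astype(float)

# Extract seven factors with a varimax rotation, mirroring the reported EFA.
fa = FactorAnalysis(n_components=7, rotation="varimax", random_state=0)
scores = fa.fit_transform(ratings)  # respondent scores on each factor
loadings = fa.components_.T         # 32 items x 7 factor loadings

print(scores.shape)    # (255, 7)
print(loadings.shape)  # (32, 7)
```

In practice, items are assigned to the factor on which they load most strongly, and the cumulative explained variance (67% in this study) guides how many factors to retain.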
By using a MANOVA and then a follow-up ANOVA on significant results, we
found that gender had a significant interaction with three barrier categories:
Issues With Negative Potential Student Outcomes, Technology Issues, and
Teacher Issues. Upon reviewing the means, it appears that males are more con-
cerned with individual barriers, like negative student behavioral outcomes and
112 Journal of Educational Technology Systems 44(1)
negative student learning outcomes, in the Issues With Negative Potential Student
Outcomes category. Female educators ranked individual barriers in the
Technology Issues category (i.e., technical support, technology reliability, acces-
sibility outside of school) as more of a barrier to the adoption of games and
simulations in their curriculum than male educators. And finally, female educa-
tors thought that individual barriers in the Teacher Issues category (i.e., time to
plan and implement, matching to standards or standardized testing) were more of
a barrier than male educators did. Another significant interaction was between Respondent Game Play Frequency and Technology Issues: a review of the means suggests that individuals who are inexperienced with playing games and simulations ranked the individual barriers in the Technology Issues category (i.e., technical support, technology reliability, accessibility outside of school) as more of a barrier. All of these results from the test of the survey can be explained and supported by prior research, which further suggests that we have created a valid, reliable, and comprehensive tool for identifying the teacher perceived barriers to using games and simulations in education.
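The omnibus-then-follow-up procedure described above (a MANOVA across factor scores, followed by a one-way ANOVA on each significant factor) can be sketched as follows. The data frame, column names, and effect-free simulated scores are hypothetical illustrations of the procedure's shape, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA
from scipy.stats import f_oneway

# Hypothetical factor scores for two barrier categories, by gender.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "gender": rng.choice(["male", "female"], size=255),
    "tech_issues": rng.normal(size=255),
    "teacher_issues": rng.normal(size=255),
})

# Omnibus MANOVA: do the factor-score means differ by gender?
manova = MANOVA.from_formula("tech_issues + teacher_issues ~ gender", data=df)
print(manova.mv_test())

# Follow-up one-way ANOVA on a single factor.
groups = [g["tech_issues"].to_numpy() for _, g in df.groupby("gender")]
f_stat, p_value = f_oneway(*groups)
print(f_stat, p_value)
```

Running the ANOVAs only after a significant multivariate result, as the authors did, limits the inflation of Type I error from testing each factor separately.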
All of our research helped to create a valid and reliable survey instrument to
discern teacher perceived barriers to the adoption of games and simulations in
formal education. With slight improvements (e.g., changing the multiple-response questions to single response, as noted in the limitations section), this survey instrument can be used on a larger scale with a more random set of educators to more definitively ascertain the barriers that teachers identify as preventing them from using games and simulations in their curriculum.
White/Caucasian
Native American
Other
5) Highest Degree Earned
Associates
Bachelors
Masters
Specialist
Doctorate
Other (please specify)
6) What grade level do you currently teach?
Elementary
Middle school
High school
Postsecondary (i.e., college, university, technical)
Adult education (i.e., ABE/GED, ESL/ESOL, adult high school)
Other (please specify)
7) How do you use technology in your curriculum? Please check all that apply.
Electronic presentations (i.e., PowerPoint, Prezi, SlideRocket, and so on)
Digital programs included with textbooks
District programs (i.e., Discovery Education, Standardized Test Prep pro-
grams, and so on)
Learning/Course management systems (i.e., BlackBoard, Angel, WebCT
and so on)
Mobile digital devices
Internet searches/Research
Internet/Specific websites
Electronic meeting place (i.e., Elluminate, Wimba, and so on)
Gaming platforms (i.e., Wii, Xbox, PlayStation, and so on)
Computer games/simulations (i.e., software, Internet, mobile application)
Teacher-created digital media for lesson
Students create digital media
Other (please explain)
8) How often do you play games (board, card, Internet, software, gaming plat-
form, mobile application, and so on)? Please check the box next to the number
of hours per week you play games.
0 hours per week
0–2 hours per week
2–5 hours per week
5–10 hours per week
10–25 hours per week
More than 25 hours per week
High school
Postsecondary (i.e., college, university, technical)
Adult education (i.e., ABE/GED, ESL/ESOL, Adult High School)
Other (please explain)
15) What learner level(s) do you think would benefit from the addition of educa-
tional games and simulations? Please check all that apply.
None
Low-level learners
General (intermediate) learners
Gifted (high-level) learners
Mixed learners (two or more groups combined in one class)
Other (please explain)
16) Please rate each potential barrier according to your opinion of how
much the item may be an obstacle to your use of educational games
and simulations in your classroom, teaching practices, lesson plans, or curriculum.
In other words, how much does each of these potential barriers prevent you
from using games and simulations?
Scale: 0 (not a barrier), 1, 2 (somewhat a barrier), 3, 4 (definitely a barrier)
1) Lack of time (i.e., find a game or simulation, learn the game or simulation,
incorporate a game or simulation into the lesson)
2) Lack of games and simulations for disabled students (i.e., access, equipment,
game/simulation options)
3) Lack of games and simulations with a good balance between education and
entertainment (i.e., game/simulation is entertaining but with little learning, or it
has enough learning but has little entertainment)
4) Complexity (too difficult) of games and simulations for my students
5) Simplicity (too easy) of games and simulations for my students
6) Lack of customizability or adaptability in a game or simulation (i.e., inability
to modify game/simulation subjects, goals, or objectives)
7) Lack of the ability to track and assess student progress within a game/
simulation
8) Lack of knowledge about how to use games and simulations appropriately
9) The opinion that games and simulations cause problems with classroom
management and in-class student behavior
10) The perception that games may cause student behavioral problems (i.e.,
violence or aggression)
11) The perception that games may cause student obsession or addiction
12) The concern that students will not learn the intended lesson using the game/
simulation
13) The opinion that students learn more from a teacher than from a game or
simulation
14) The opinion that other learning strategies are more effective than using
games or simulations
[Factor loading table omitted: survey items and their loadings on Factors 1 through 7 did not survive extraction.]
Funding
The authors received no financial support for the research, authorship, and/or publication
of this article.
References
Abbiss, J. (2008). Rethinking the “problem” of gender and IT schooling: Discourses in
literature. Gender and Education, 20, 153–165.
Aldrich, C. (2005). Learning by doing: A comprehensive guide to simulations, computer
games, and pedagogy in e-learning and other educational experiences. San Francisco,
CA: Pfeiffer.
American Educational Research Association, American Psychological Association, and
National Council on Measurement in Education. (1999). Standards for educational and
psychological testing. Washington, DC: American Psychological Association.
Annetta, L., Mangrum, J., Holmes, S., Collazo, K., & Meng-Tzu, C. (2009). Bridging
realty to virtual reality: Investigating gender effect and student engagement on learn-
ing through video game play in an elementary school classroom. International Journal
of Science Education, 31(8), 1091–1113.
Arrindell, W. A., & Van der Ende, J. (1985). Cross-sample invariance of the structure of self-reported distress and difficulty in assertiveness. Advances in Behaviour Research and Therapy, 7, 205–243.
Baek, Y. K. (2008). What hinders teachers in using computer and video games in the
classroom? Exploring factors inhibiting the uptakes of computer and video games.
CyberPsychology & Behavior, 11(6), 665–671.
Barko, T., & Sadler, T. (2013). Practicality in virtuality: Finding student meaning in video
game education. Journal of Science Education and Technology, 22, 124–132.
Beatty, P. (2004). The dynamics of cognitive interviewing. In S. Presser, J. Rothgeb, M. Couper, et al. (Eds.), Methods for testing and evaluating survey questionnaires. New York, NY: Wiley.
Becker, K., & Jacobsen, D. (2005). Games for learning: Are schools ready for what’s to
come? Proceedings of DIGRA 2005 Conference, 16–20 June 2005. Vancouver, Canada.
Biernacki, P., & Waldorf, D. (1981). Snowball sampling: Problems and techniques of
chain referral sampling. Sociological Methods and Research, 10(2), 141–163.
Boyle, E., Connolly, T., Hainey, T., & Boyle, J. (2012). Engagement in digital
entertainment games: A systematic review. Computers in Human Behavior, 28(2012),
771–780.
Bourgonjon, J., Valcke, M., Soetaert, R., de Wever, B., & Schellens, T. (2011).
Parental acceptance of digital game-based learning. Computers & Education, 57(2011),
1434–1444.
Buckingham, D. (2003). Media education: Literacy, learning, and contemporary culture.
Cambridge, England/Malden, MA: Polity Press/Blackwell Publishing.
Charsky, D., & Mims, C. (2008). Integrating commercial off-the-shelf video games into
school curriculums. Tech Trends, 52(5), 38–44.
Chioncel, N. E., Van Der Veen, R. G. W., Wildemeersch, D., & Jarvis, P. (2003). The
validity and reliability of focus groups as a research method in adult education.
International Journal of Lifelong Education, 22(5), 495–517.
De Aguilera, M., & Mendiz, A. (2003). Video games and education. ACM Computers in
Entertainment, 1(1), 1–14.
Egenfeldt-Nielsen, S. (2004). Practical barriers in using educational computer games. On
the Horizon, 12(1), 18–21.
Kenny, R., & McDaniel, R. (2011). The role teachers’ expectations and value assessments
of video games play in their adopting and integrating them into their classrooms.
British Journal of Educational Technology, 42(2), 197–213.
Kerlinger, F. (1974). Foundations of behavioural research. New York, NY: Holt, Rinehart
and Winston.
Kiili, K. (2005). Educational game design: Experiential gaming model revised (Research
Report 4, pp. 1–12). Pori, Finland: Tampere University of Technology. Retrieved
from http://amc.pori.tut.fi/publications/EducationalGameDesign.pdf
King, K. (2002). Educational technology professional development as transformative
learning opportunities. Computers & Education, 39(3), 283–297.
Klabbers, J. G. (2009). Terminological ambiguity: Game and simulation. Simulation &
Gaming, 40(4), 446–463.
Koh, E., Kin, Y., Wadhwa, B., & Lim, J. (2011). Teacher perceptions of games in
Singapore schools. Retrieved from http://sag.sagepub.com/content/early/2011/04/28/
1046878111401839
Kotrlik, J., & Redmann, D. (2009). Analysis of teachers’ adoption of technology for use
in instruction in seven career and technical education programs. Career and Technical
Education Research, 34(1), 47–77.
Lean, J., Moizer, J., Towler, M., & Abbey, C. (2006). Simulations and games: Use and
barriers in higher education. Active Learning in Higher Education, 7(3), 227–242.
Lim, C. (2008). Global citizenship education, school curriculum, and games: Learning
mathematics, English and science as a global citizen. Computers & Education,
51(2008), 1073–1093.
Mayer, I., Warmelink, H., & Bekebrede, G. (2012). Learning in a game-based virtual
environment: A comparative evaluation in higher education. European Journal of
Engineering Education, 38(1), 85–106.
McCall, J. (2012). Navigating the problem space: The medium of simulation games in the
teaching of history. History Teacher, 46(1), 9–28.
Moizer, J., Lean, J., Towler, M., & Abbey, C. (2009). Simulations and games:
Overcoming the barriers to their use in higher education. Active Learning in Higher
Education, 10(3), 207–224.
Niederhauser, D. S., & Stoddart, T. (2001). Teachers’ instructional perspectives and use
of educational software. Teaching and Teacher Education, 17(1), 15–31.
Noy, C. (2008). Sampling knowledge: The hermeneutics of snowball sampling
in qualitative research. International Journal of Social Research Methodology, 11(4),
327–344.
Nunnally, J. (1978). Psychometric theory. New York, NY: McGraw-Hill.
Oblinger, D. (2004). The next generation of educational engagement. Journal of
Interactive Media in Education, 2004(8), 1–18.
Padilla-Walker, L., Nelson, L., Carroll, J., & Jensen, A. (2010). More than just a game:
Video game and internet use during emerging adulthood. Journal of Youth and
Adolescence, 39, 103–110.
Penrod, J., Preston, D., Cain, R. E., & Starks, M. T. (2003). A discussion of chain referral
as a method of sampling hard-to-reach populations. Journal of Transcultural Nursing,
14(2), 100–107.
Prensky, M. (2001). Digital game based learning. New York, NY: McGraw-Hill.
Rabiee, F. (2004). Focus-group interview and data analysis. Proceedings of the Nutrition
Society, 63, 655–660.
Reese, D. (2007). First steps and beyond: Serious games as preparation for future learn-
ing. Journal of Educational Multimedia and Hypermedia, 16(3), 283–300.
Rice, J. (2007). New media resistance: Barriers to implementation of computer video
games in the classroom. Journal of Educational Multimedia and Hypermedia, 16(3),
249–261.
Rieber, L. P. (1996). Seriously considering play: Designing interactive learning environ-
ments based on the blending of microworlds, simulations, and games. Educational
Technology Research & Development, 44(2), 43–45.
Rieber, L. P., & Noah, D. (2008). Games, simulations, and visual metaphors in educa-
tion: Antagonism between enjoyment and learning. Educational Media International,
45(2), 77–92.
Ritzhaupt, A., Gunter, G., & Jones, G. (2010). Survey of commercial off-the-shelf video
games: Benefits and barriers in formal educational settings. International Journal of
Instructional Technology and Distance Learning, 7, 45–55.
Roberts, D., & Foehr, U. (2008). Trends in media use. The Future of Children, 18(1),
11–37.
Robertson, J. (2012). Making games in the classroom: Benefits and gender concerns.
Computers & Education, 59(2012), 385–398.
Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.
Rosas, R., Nussbaum, M., Cumsille, P., Marianov, V., Correa, M., Flores, P., . . . Salinas,
M. (2003). Beyond Nintendo: Design and assessment of educational video games for
first and second grade students. Computers & Education, 40(2003), 71–94.
Royle, K., & Colfer, S. (2010). Computer games and learning – Where next? The breadth
and scope of the use of computer games in education. Walsall, England: CeDARE:
University of Wolverhampton.
Russell, C., & Shepherd, J. (2010). Online role-play environments for higher education.
British Journal of Educational Technology, 41(6), 992–1002.
Sanchez, J. (2009). Barriers to student learning in second life. Library Technology Reports,
45(2), 29–34.
Sanford, R., Ulicsak, M., Facer, K., & Rudd, T. (2006). Teaching with games:
Using commercial off-the-shelf computer games in formal education. Bristol, UK:
Futurelab.
Sauvé, L., Renaud, L., Kaufman, D., & Marquis, J. (2007). Distinguishing between
games and simulations: A systematic review. Journal of Educational Technology &
Society, 10(3), 247–256.
Schrum, L., Burbank, M., Engle, J., Chambers, J., & Glassett, K. (2005). Post-secondary
educators’ professional development: Investigation of an online approach to enhan-
cing teaching and learning. The Internet and Higher Education, 8(4), 279–289.
Shaffer, D. (2006). How computer games help children learn. New York: Palgrave
Macmillan.
Shaffer, D., Squire, K., Halverson, R., & Gee, J. (2005). Video games and the future of
learning. Phi Delta Kappan, 87(2), 104–111.
Simpson, E., & Stansberry, S. (2008). Video games and teacher development: Bridging the
gap in the classroom. In K. McFerrin, et al. (Eds), Proceedings of society for
Author Biographies
Lenora Jean Justice is an assistant professor at Morehead State University
where she teaches graduate educational technology courses and chairs doctoral
student research committees. Additionally, she is a consultant to several career
pathway and educational training organizations. Her research interests include
using games and simulations in education and training, mobile learning, flipped
instruction, and using the backchannel in education and training.