
Article

Journal of Educational Technology Systems
2015, Vol. 44(1) 86–125
© The Author(s) 2015
Reprints and permissions: sagepub.com/journalsPermissions.nav
DOI: 10.1177/0047239515588161
ets.sagepub.com

Identifying the Barriers to Games and Simulations in Education: Creating a Valid and Reliable Survey

Lenora Jean Justice1 and Albert D. Ritzhaupt2

Abstract
The purpose of this study was to create a valid and reliable instrument to measure
teacher perceived barriers to the adoption of games and simulations in education.
Previous research, interviews with educators, a focus group, a think-aloud protocol,
and an expert review were used to design a survey instrument. After finalization, the
survey was made available for trial on the Internet to a group of educators
(N = 255). A portion of the survey required respondents to rate the degree to which
32 potential barriers were perceived as an impediment to the adoption of games and
simulations. Some of the highest rated barriers included cost of equipment, lack of
time to plan and implement, inability to try before purchasing, lack of balance
between entertainment and education, lack of available lesson plans/examples, lack
of alignment to state standards/standardized testing, inability to customize a game/
simulation, and inability to track student progress within the game/simulation. An
exploratory factor analysis identified seven factors that accounted for 67% of the
variability in the respondents’ rankings. Several factors were found to have significant
interactions with other questions on the survey. Implications of these results, as well
as suggestions for future research, are discussed.

1 Morehead State University, KY, USA
2 School of Teaching and Learning, College of Education, University of Florida, Gainesville, FL, USA

Corresponding Author:
Albert D. Ritzhaupt, School of Teaching and Learning, College of Education, University of Florida, 2423 Norman Hall, PO Box 117048, Gainesville, FL 32611, USA.
Email: [email protected]

Keywords
games, simulations, teachers, adoption, barriers, education

Introduction
Presently, technology-based games and simulations have been identified as a
potential learning tool (Aldrich, 2005; Annetta, Mangrum, Holmes, Collazo, &
Meng-Tzu, 2009; Barko & Sadler, 2013; Gee, 2003; Halverson, 2005; Hamlen,
2010; Mayer, Warmelink, & Bekebrede, 2012; Prensky, 2001; Shaffer, 2006;
Shaffer, Squire, Halverson, & Gee, 2005; Squire, 2006; Warburton, 2009).
For example, Rieber (1996) points out that play and imitation are natural
learning strategies; therefore, students of all ages can play games to accommo-
date and assimilate extensive critical thinking and problem-solving skills.
A large portion of research has focused on identifying the benefits of using
games and simulations in education (Ke, 2008; Koh, Kin, Wadhwa, & Lim,
2011; Reese, 2007; Ritzhaupt et al., 2010; Shaffer et al., 2005; Sliney,
O’Mullane, & Murphy, 2009; Squire, 2005). Some of the identified benefits
include increasing student motivation and engagement; enhancing problem-
solving skills, peer learning, and collaboration; facilitating language acquisition;
stimulating information assimilation and retention; improving the integra-
tion of concepts and thinking cross-functionally; and learning in a failsafe
environment (Ferdig & Boyer, 2007; Gee, 2003; Koh et al., 2011; Reese,
2007; Rosas et al., 2003; Royle & Colfer, 2010; Torrente, del Blanco,
Marchiori, Moreno-Ger, & Fernandez-Manjon, 2010; Vos & Brennan, 2010).
Additionally, technology-based games, in particular, have become integral parts
of our social and cultural environment (Oblinger, 2004). For all of these rea-
sons, educators have been looking for technology-based games and simulations
to facilitate the learning experience by creating a new learning culture that
better corresponds with students’ habits and interests (Kiili, 2005; Prensky,
2001; Sanford et al., 2006; Warburton, 2009).
Despite the popularity of the concept of incorporating games and simulations
in education, as well as all of the potential benefits, some researchers have
found that game and simulation adoption in education has been slow
(Barko & Sadler, 2013; Gee, 2003; Gee & Levine, 2008; Kenny & Gunter,
2011; Koh et al., 2011; Prensky, 2001). For this reason, several researchers
have tried to identify the obstacles in the adoption of games and simulations
in education (Baek, 2008; Becker & Jacobsen, 2005; Boyle, Connolly, Hainey,
& Boyle, 2012; Egenfeldt-Nielsen, 2004; Kebritchi, 2010; Kenny & Gunter,
2011; Lean, Moizer, Towler, & Abbey, 2006; Moizer, Lean, Towler, &
Abbey, 2009; Rice, 2007; Ritzhaupt et al., 2010; Simpson & Stansberry,
2008). However, at present, it has been suggested that researchers are not
taking a broad enough approach to identifying the barriers to the adoption
of games and simulations in formal education (Bourgonjon, Valcke, Soetaert,
de Wever, & Schellens, 2011). Additionally, Jensen and De Castell (2010) suggest that no technology should be assumed as value-neutral, or, in other words,
no technology should be indiscriminately used regardless of identity factors
such as gender, race, nationality, or class. Also, research suggests that teaching
level, years of experience, teaching subject, and personal experience with games
and simulations affect the potential adoption of games and simulations by a
teacher (Charsky & Mims, 2008; De Aguilera & Mendiz, 2003; Hamlen, 2010;
Kenny & McDaniel, 2011; Koh et al., 2011; Lim, 2008; Ritzhaupt et al., 2010).
Furthermore, games and simulations have been linked since their introduction,
and confusion between them is still apparent (Jones, 1998; Klabbers, 2009;
Rieber & Noah, 2008; Sauve, Renaud, Kaufman, & Marquis, 2007; Wolfe &
Crookall, 1998). For example, many educators may not understand the differ-
ence between games, simulations, simulation games, serious games, and gaming
simulations and, consequently, may be too intimidated to incorporate any of them
into practice.
There is no widely accepted valid and reliable instrument to measure the
barriers that educators identify in the use of games and simulations in their
classrooms. For an instrument to become widely accepted, we believe that a
more comprehensive survey is needed: a survey that distinguishes barriers across
grade categories, teacher demographics, and teacher game and simulation
inexperience, and that distinguishes whether the identified barriers are general to
the adoption of any new technology or are specific to games and simulations.
This type of instrument may discern the actual barriers to the adoption of games
and simulations in education.

Purpose
Because of the lack of a widely accepted instrument to measure the barriers to
the use of games and simulations in the classroom, the purpose of this study is to
create a valid and reliable comprehensive tool to discover the barriers recognized
by educators in the use of games and simulations in their classrooms. More
specifically, this survey is comprehensive in that teacher demographic informa-
tion (i.e., gender, age, ethnicity, highest degree earned) and teacher unfamiliarity
with games and simulations are taken into consideration when identifying the
perceived barriers of adoption of games and simulation in formal education.
Therefore, while designing our comprehensive tool, we kept the following
research questions in mind:

1. What are the barriers to adopting games and simulations in education?


2. Are there any barriers related to the instructor’s demographic (i.e., gender,
age, ethnicity, highest degree earned) characteristics or the instructor’s inex-
perience with games and simulations?

Conceptual Framework
Technology-based games and simulations can be considered a newer, more
innovative technology. Unfortunately, education has been especially resistant
to change and this has become more obvious in the adoption, or lack of adop-
tion, of instructional technology (Germanne & Sasse, 1997). Innovation causes
change; resistance to change is a natural reaction to the uncertainty that any
transformation creates (Rogers, 2003). One of the best known and well-
respected attempts to describe the adoption of new ideas (or technology)
through cultures is the theory of Diffusion of Innovation put forth by Everett
Rogers (2003). The theory is complex and its full spectrum is beyond the scope
of this article, but a brief synopsis, using the barriers of game and simulation
adoption as the innovation, is included to address this concept in Table 1.
As research indicates, the adoption of any new technology is not an easy
proposition for educators (Taylor, 2008). Identifying the barriers to adopting
games and simulations will not only give insight into the adoption process but
also may assist teachers, instructional designers, administrators, and policy
makers in understanding how to successfully adopt this instructional technol-
ogy. Additionally, those individuals who create and market games and simula-
tions may find this information useful as well.

Method
To create this instrument, we combined the research already conducted with
interviews of educators to design a comprehensive survey of the barriers to
the adoption of games and simulations in curriculum. Using these educator
interviews and previously identified barriers from research, we created a draft
of the instrument. We used a focus group, expert review, and think-aloud proto-
col to increase the accuracy and efficacy of the survey instrument (American
Educational Research Association, American Psychological Association, and
National Council on Measurement in Education [AERA, APA, & NCME],
1999; Beatty, 2004; Chioncel, Van Der Veen, Wildemeersch, & Jarvis, 2003;
Grant & Davis, 1997; Jones & Hunter, 1995; Rabiee, 2004; Van Someren,
Barnard, & Sandberg, 1994; Vogt, King, & King, 2004). Upon the finalization
of the survey, we tested the survey by distributing it to a group of educators. We
then analyzed the results to ensure that the data gathered corresponded to the
intent for which the survey instrument was designed and to determine the tea-
cher perceived barriers to the use of games and simulations in formal education.
Figure 1 visualizes the survey development process.

Interviews With Educators


For the interviews, we spoke with educators to help develop an unbiased set of
survey questions. We interviewed 20 educators in several disciplines and

Table 1. A Synopsis of the Characteristics of an Innovation, Using Games and Simulations as the Innovation, That Can Increase the Probability of Adoption According to Rogers' (2003) Theory of Diffusion of Innovations.

Relative advantage—the innovation is seen as advantageous (the benefits outweigh the costs): Potential benefits of using G & S include encouraging active learning (learning by doing), enhancing learning of complex subject matter, increasing motivation and engagement, fostering collaboration among learners, and encouraging systematic ordering and solving of problems (Egenfeldt-Nielsen, 2010; Gee, 2003; Ke, 2008; Royle & Colfer, 2010; Torrente et al., 2009, 2010). Potential costs include parents' and teachers' fears that students will develop aggressive tendencies from the violence in games or become addicted to playing them (Koh et al., 2011).

Compatibility—the innovation is compatible with existing norms: The educational system, an organization that changes quite slowly and appears inflexible, may see compatibility as a particularly important issue (Rogers, 2003). Therefore, compatibility of the old and new methodologies is important in educators' adoption of games and simulations (Becker & Jacobsen, 2005; McCall, 2012; Simpson & Stansberry, 2008; Torrente et al., 2010).

Complexity—the innovation has a low level of complexity (i.e., it is not too difficult to harness the relative advantage of the innovation): G & S may be considered complex by some potential adoptees in education. For instance, the complexity of G & S requires extra time for instructors to incorporate them into lessons, and game complexity requires extended playing time from students (Baek, 2008; Egenfeldt-Nielsen, 2004; Koh et al., 2011; McCall, 2012; Moizer et al., 2009; Rice, 2007; Sanchez, 2009; Squire, 2006; Torrente et al., 2010).

Trialability—the innovation can easily be experimented with and tried out: G & S may not be easy to experiment with in schools. Cost of equipment, lack of specific methodologies, negative opinions about gaming, and cultural resistance lessen the trialability of games and simulations in education (Baek, 2008; Becker & Jacobsen, 2005; Kenny & Gunter, 2011; Ritzhaupt et al., 2010; Royle & Colfer, 2010).

Observability—potential adoptees can see the benefits of using the innovation: Observability may be difficult in public schools since, as Egenfeldt-Nielsen (2010) points out, many teachers essentially work in a vacuum and do not see firsthand what other teachers do in their own classrooms.

Note. G & S: games and simulations.

Figure 1. Survey development process.

contexts. Interview participants included educators from all grade levels (i.e.,
elementary, presecondary, secondary, postsecondary, adult education) and from
all learner levels (i.e., low-level learners, general learners, gifted learners).
Interviews were done in person or by phone, depending on the location and
schedule of the interviewee. All interviewees were either public or private school
educators who were teaching during the 2011–2012 school year. We began with
a few educators that were acquaintances who were interested in being inter-
viewed. We then began the snowball or chain referral sampling technique
where each interviewee was given an opportunity to suggest another educator
to be interviewed (Biernacki & Waldorf, 1981; Noy, 2008; Penrod, Preston,
Cain, & Starks, 2003). We stopped the interview process when the data collected
from interviewees started becoming repetitious (Biernacki & Waldorf, 1981).
Due to the nature of this type of sampling, there are some limitations. For
example, respondents largely controlled who was being interviewed since they
suggested potential participants; therefore, many interviewees may have similar
views or social opinions since they belonged to the same social circles.
Additionally, we could not verify some interview information since we could
not observe the interviewees teaching. For instance, when asking about their
usage of games and simulations in their curriculum, interviewees may, due to
social reasons or to please the interviewer, have inflated their usage. Similarly,
interviewees without gaming experience may have biased their answers due to
their lack of knowledge about this type of media, and those whose experiences
with video games were negative may have biased their responses accordingly.
The interviews contained questions to determine demographic information
(i.e., age, gender, ethnicity, education). Additionally, educators were asked
about their experience with and opinion of games and simulations since these
characteristics may influence their opinions (Kenny & Gunter, 2011; Ritzhaupt
et al., 2010). Given that a key component of the research is the conceptual
framework of the theory of Diffusion of Innovation, we composed several inter-
view questions with this in mind. For example, understanding the importance of
seeing the benefits of adoption from other adopters in the Diffusion of
Innovation, we asked, “Have you seen a co-worker successfully using games
and simulations in his or her classes?” (Rogers, 2003). With the sampling limi-
tations in mind and the potential partiality of interview responses, we combined
the results of the interviews with the corresponding research to build the foun-
dation of the survey instrument for a broader perspective and reduction in bias.
For instance, by using the interview responses to generic barrier questions, along
with researcher-suggested barriers, we were able to determine a more precise list
of potential barriers.

Focus Group
After we compiled the information to help design a draft of the survey instru-
ment, but before finalizing and distributing the survey, we gathered a focus
group of educators to validate the content of the survey instrument draft
(Chioncel et al., 2003; Grant & Davis, 1997; Rabiee, 2004; Vogt et al., 2004).
Participants in the focus group were eight previously interviewed educators who
were able to meet and discuss the survey at one time (availability was the only
limitation on participation). The focus group consisted of two men and six women. The purpose
of the focus group was to check the information, especially the list of potential
barriers in the adoption of games and simulations, for accuracy and to help
clarify any information that was found confusing during the interview process.
Upon arrival to the meeting, participants were given a copy of the survey
draft. Each question had a blank to the side so that participants could grade
each item (Jones & Hunter, 1995). Since all participants were educators, they were
very familiar with the traditional A to F grading scale. We asked them to read
through the survey, independently, and grade each item. When everyone had
completed the grading process, we discussed any item that received a grade
lower than an "A." Participants were asked to grade questions based on grammar,
clarity, and general opinion. When a question received a lower grade, we
discussed why it did not receive an "A" and what the participant did not like
about the survey item. Other participants were encouraged
process with each survey item.

Expert Review
Following the incorporation of the comments of the focus group, the survey
instrument was ready for an expert review. The expert review consisted of three
educational technology faculty members holding terminal degrees with extensive
teaching and research experience in the field. One of these faculty members was also
an expert in research methodology and psychometrics. These professionals each
reviewed an electronic copy of the survey, separately, and returned their elec-
tronic edits and comments. The expert review helped to further clarify the
survey. The professionals’ comments helped to reduce potential confusion of
survey respondents. Since these experts had not previously seen our survey,
they had a fresh insight into each survey item. For instance, one professional
suggested the addition of mobile devices to the question about use of technology
in curriculum. Also, because of their expertise, these professionals could easily
identify questions that were not worded thoroughly enough to capture the mean-
ing of the question. For example, one professional suggested changing the
“College/University/Technical” category to “Postsecondary” and to include
examples for each category to eliminate confusion with the “Adult
Education” category.

Think-Aloud Protocol
After incorporating the suggestions from the expert review, the survey draft
underwent a think-aloud protocol (Van Someren et al., 1994). We utilized
this type of usability test to review the survey for clarity and intent. Three
people agreed to perform the think-aloud protocol. Participants were educa-
tors with varying experience with games and simulations (i.e., used them in
class, played them out of class, had no experience whatsoever) and varying
experience with this research (i.e., interview participation, focus group par-
ticipation, no other participation). The participants were instructed to say
everything they were thinking out loud. Frequently, during the process of
answering the item, they would fall silent in thought. At that point, we
would remind them that we needed to hear what they were thinking. A
few times during each protocol, we would again remind them of this if
they fell silent.

Survey Design
After using the expert review, the focus group, and the think-aloud protocol
techniques to increase the accuracy and efficacy of the survey draft, the survey
was considered complete (AERA, APA, & NCME, 1999; Beatty, 2004; Van
Someren et al., 1994). To test that the survey gathered the information for
which it was designed, namely the teacher-perceived barriers to the use of
games and simulations in formal education, the survey instrument was distributed
to a large number of educators so that enough responses could be acquired
and measured (Johanson & Brooks, 2010).
The survey consisted of 18 questions, two of which were open-ended, six of
which had the option of writing in an answer for clarification, and one contained
a list of 32 potential barriers for the respondent to rank (see Appendix A).
Respondents could rate each potential barrier on a scale from 0 to 4, where
0 was not a barrier, 2 was somewhat a barrier, and 4 was a definite barrier. If
respondents could not relate their experience to a potential barrier, they could
instead mark don't know. Respondents were also given the option of
writing in a barrier to the adoption of games and simulations in the classroom in
case their barrier(s) was not on the list. Additionally, five questions were
intentionally designed to have multiple responses. For example, respondents
were asked their opinion about how games or simulations could be useful for
educational purposes. The respondents were asked to mark all the answers that
apply, and there was a space available for them to write in an option if they
wanted to add anything.
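The scoring scheme above can be sketched in code. This is an illustrative example, not part of the study: the helper name and the sample ratings are invented, and don't know responses are treated as missing rather than as numeric values.

```python
# Hypothetical coding of the 0-4 barrier ratings, with "don't know"
# excluded from numeric summaries (illustrative; not the study's code).

DONT_KNOW = "don't know"

def item_mean(responses):
    """Mean of the 0-4 ratings for one item, ignoring 'don't know'."""
    rated = [r for r in responses if r != DONT_KNOW]
    return sum(rated) / len(rated) if rated else None

# six invented responses to one barrier item; one respondent marked don't know
cost_ratings = [4, 3, DONT_KNOW, 2, 4, 0]
print(item_mean(cost_ratings))  # → 2.6
```

Treating don't know as missing, rather than as 0, keeps respondent uncertainty from being conflated with "not a barrier."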

Procedures to Test the Survey


The survey was opened to educators of all school levels (i.e., elementary, pre-
secondary, secondary, postsecondary, adult education). An invitation to partici-
pate, including a hyperlink to the survey, was sent out by the Chair of NCPN
(National Career Pathways Network), a group of adult educators who are inter-
ested in the promotion of Career Pathways (an education plan that helps stu-
dents determine a career and then plan their education to achieve the certificate
or degree needed for that chosen career); the Chair of the ISTE (International
Society for Technology in Education) special interest group for games and simu-
lations; and the Chair of the RCCPN (Research Coast Career Pathways
Network), the local chapter of the Career Pathways Network. The survey was
open to participants from May 3, 2012 to May 30, 2012. Educators from all
grade levels (i.e., elementary, presecondary, secondary, postsecondary, adult
education) and learning levels (i.e., low-level learners, general learners, gifted
learners), worldwide, were eligible to participate in this study. The survey was
made available in a web-based format using Survey Monkey. There were a few
mistakes in the setup of the survey in this electronic format that were not noticed
until after the data were collected. These mistakes only affected the ability to
analyze the data and did not affect any actual data. More information can be
found in the Limitations and Delimitations section. There were 275 individuals
who opened the instrument and answered at least the first question, the
Informed Consent question.

Participants in the Test of the Survey


Different numbers of respondents completed different parts of the survey,
ranging from 184 who completed every item to 255 who completed at least one
item. We felt it was best to include as many participants in the analysis as
possible, even though several did not complete
the entire survey. Sixty-nine percent were female and 31% were male. Seventy-
eight percent of those respondents were between the ages of 31 and 60.
Additionally, 87.1% of those respondents identified themselves as White/
Caucasian. The majority, 55.6%, of respondents had masters degrees. The
grade levels taught were fairly equally distributed except for adult education
(only 3.8%). Elementary, middle, high, postsecondary, and other ranged
between 16% and 21%. Eighty-six percent identified that they play some type
of game on a weekly basis, and 100% of them thought that games or simulations
could be useful for educational purposes. Approximately 96% considered games
and simulations compatible with their teaching practices.
When asked how respondents use technology in their curriculum, over
90% of participants identified that they use electronic presentations (i.e.,
PowerPoint, Prezi, SlideRocket) and Internet searches/research in their cur-
riculum. Only 10.3% of respondents identified that they use gaming platforms
(i.e., Wii, Xbox, PlayStation). Approximately 96% of the respondents indi-
cated that games and simulations were not too complex for students to learn
an intended lesson. Only 7.5% of participants regarded games and simula-
tions as difficult to experiment with in a lesson. Interestingly, 66.5% of the
participants had seen very few or no coworkers using games or simulations
in their classrooms.

Data Analysis for the Test of the Survey


The completed survey instruments were sorted by demographic information
(i.e., gender, ethnicity, age) to better understand the group of participants.
Descriptive statistics were calculated on the item level data to explore the
responses to individual items. To explore the underlying relationships between
the barriers of the adoption of games and simulations, we conducted an explora-
tory factor analysis. This procedure helped to reduce the data to a smaller set of
variables that were easier to compare. To understand the statistical correlation
between the reduced variables, a standard multivariate analysis of variance
(MANOVA) was used. If significance was detected, a follow-up one-way ANOVA was performed.
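The final step of this pipeline can be illustrated with a minimal one-way ANOVA computed from the standard between-group and within-group sums of squares. The function and the group data below are invented for illustration; the study itself would have used a statistical package.

```python
# One-way ANOVA F statistic from first principles (illustrative sketch).

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups of scores."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# e.g., invented factor scores for three demographic groups
f_stat = one_way_anova_f([[1, 2, 3], [2, 3, 4], [4, 5, 6]])  # ≈ 7.0
```

A large F relative to the F distribution with (k - 1, n - k) degrees of freedom indicates that at least one group mean differs from the others.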

Results for the Test of the Survey


Exploratory Factor Analysis
To examine the underlying structure of the data of the 32 barriers, we conducted
an exploratory factor analysis. Bartlett's test of sphericity for these data yielded
χ² = 3573.77 (p < .001). The Kaiser-Meyer-Olkin measure of sampling adequacy
was .897, well above the recommended minimum of .5 (Kaiser, 1974). The
participant-to-item ratio was approximately 5:1, which is below the 10:1 ratio for
factor analysis suggested by Kerlinger (1974) but above the thresholds described
as more than adequate by some researchers for maintaining factor stability
(Arrindell & Van der Ende, 1985; Guadagnoli & Velicer, 1988). Therefore,
these data appeared to be well suited for factor analysis. All models were
executed using an oblique (promax) rotation, as the factors were anticipated to be
related.
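Bartlett's test reported above can be sketched directly from its formula: chi-square = -((n - 1) - (2p + 5)/6) · ln|R|, with p(p - 1)/2 degrees of freedom, where R is the p x p correlation matrix and n the sample size. The small correlation matrix below is invented for illustration; it is not the study's 32 x 32 matrix.

```python
import math

def determinant(matrix):
    """Determinant via Gaussian elimination (assumes nonzero pivots)."""
    m = [row[:] for row in matrix]
    n, det = len(m), 1.0
    for i in range(n):
        det *= m[i][i]
        for j in range(i + 1, n):
            ratio = m[j][i] / m[i][i]
            for k in range(i, n):
                m[j][k] -= ratio * m[i][k]
    return det

def bartlett_sphericity(corr, n):
    """Chi-square and degrees of freedom for Bartlett's test of sphericity."""
    p = len(corr)
    chi_sq = -((n - 1) - (2 * p + 5) / 6) * math.log(determinant(corr))
    return chi_sq, p * (p - 1) // 2

r = [[1.0, 0.6, 0.3],      # invented 3 x 3 correlation matrix
     [0.6, 1.0, 0.4],
     [0.3, 0.4, 1.0]]
chi_sq, df = bartlett_sphericity(r, n=255)  # chi_sq > 0, df = 3
```

An identity correlation matrix (no inter-item correlation at all) gives a chi-square of 0; the study's large chi-square therefore indicates the items are correlated enough to factor.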
The initial unconstrained model resulted in seven factors explaining approxi-
mately 67% of the variability based on a Kaiser’s criterion (i.e., eigenvalues
greater than 1) and a review of the Scree plot (see Figure 2). The model con-
verged after eight rotations. After reviewing the correlation matrix, it was deter-
mined that no unusual correlations were detected (e.g., negative correlations
between items since all were positively stated). Further, the pattern matrix coefficients exhibited a reasonably simple structure, with each item loading on an
associated factor and very few cross-loadings. As can be gleaned, the seven-factor
model appears to be a reasonable representation of these data (see Table 2).
Thus, we decided to use this model to explain these data. The full pattern matrix
can be observed in Appendix C. Those seven factors were identified as follows:
Issues With Negative Potential Student Outcomes, Technology Issues, Issues
Specific to Games and Simulations, Teacher Issues, Issues With Games and
Simulations in Education, Student Issues, and Incorporation Issues (see
Table 2).
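The retention rule can be made concrete. Under Kaiser's criterion, factors with eigenvalues greater than 1 are kept, and the variance they explain is their share of the eigenvalue total (which equals the number of items, 32, for a correlation matrix of standardized items). The eigenvalues below are invented to mirror the reported outcome, seven factors explaining about 67% of the variance; they are not the study's actual eigenvalues.

```python
# Kaiser's criterion on an invented set of 32 eigenvalues (illustrative).

def kaiser_retained(eigenvalues):
    """Return (number of factors retained, proportion of variance explained)."""
    retained = [e for e in eigenvalues if e > 1]
    return len(retained), sum(retained) / sum(eigenvalues)

eigs = [10.2, 3.1, 2.4, 1.9, 1.5, 1.2, 1.1] + [10.6 / 25] * 25  # sums to 32
n_factors, explained = kaiser_retained(eigs)  # 7 factors, about 0.67 explained
```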

Issues With Negative Potential Student Outcomes


Issues With Negative Potential Student Outcomes was defined as a category of
the barriers to the addition of games and simulations in education that addresses
concerns about the effect the addition has on the student. This category
contained six individual barriers, as illustrated in Table 4. For example, some
teachers are concerned that using a game or simulation in a lesson may cause
behavioral problems or addiction. Additionally, another potential outcome of
using a game or simulation in a lesson is that the students'
Figure 2. Scree plot for 32-item instrument.

Table 2. The Seven Identified Factors From the Model.

1. Issues With Negative Potential Student Outcomes: M = 1.12, SD = 1.08, skewness = .97, kurtosis = .18, α = .93, 6 items
2. Technology Issues: M = 2.04, SD = .95, skewness = .22, kurtosis = .61, α = .80, 6 items
3. Issues Specific to Games and Simulations: M = 2.05, SD = .91, skewness = .16, kurtosis = .32, α = .75, 4 items
4. Teacher Issues: M = 1.65, SD = .87, skewness = .05, kurtosis = .67, α = .79, 6 items
5. Issues With Games and Simulations in Education: M = 1.55, SD = 1.10, skewness = .23, kurtosis = .95, α = .87, 5 items
6. Student Issues: M = .83, SD = .88, skewness = 1.00, kurtosis = .36, α = .73, 2 items
7. Incorporation Issues: M = 1.75, SD = 1.03, skewness = .33, kurtosis = .76, α = .65, 3 items

level of education will be negatively affected (i.e., not learning the intended
lesson or not learning as well from a game as from other educational resources).
This category had the highest Cronbach's alpha (α = .93), which suggests
high internal consistency among the six ranked barriers (Nunnaly, 1978).
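The Cronbach's alpha figures reported for each factor follow the standard formula alpha = k/(k - 1) · (1 - sum of item variances / variance of total scores). A minimal sketch, with an invented three-item, four-respondent data set:

```python
# Cronbach's alpha from first principles (illustrative; data are invented).

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of 0-4 scores per survey item, aligned by respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

ratings = [[4, 3, 2, 0],   # item 1, respondents 1-4
           [4, 2, 2, 1],   # item 2
           [3, 3, 1, 0]]   # item 3
alpha = cronbach_alpha(ratings)  # high: the items rise and fall together
```

When items move together across respondents, the total-score variance dominates the summed item variances and alpha approaches 1, which is the sense in which α = .93 indicates high internal consistency.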
98 Journal of Educational Technology Systems 44(1)

Table 3. Correlation Matrix for the Model.

1 2 3 4 5 6 7

1 1
2 .364** 1
3 .293** .474** 1
4 .482** .582** .458** 1
5 .675** .543** .389** .508** 1
6 .429** .480** .523** .500** .504** 1
7 .400** .408** .292** .470** .415** .388** 1
**Correlation is significant at the .01 level (two-tailed).

Table 4. Issues With Negative Potential Student Outcomes Item Statistics.

The opinion that games and simulations cause problems with classroom management and in-class student behavior: M = 1.01, SD = 1.301
The perception that games may cause student behavioral problems (i.e., violence or aggression): M = 0.85, SD = 1.217
The perception that games may cause student obsession or addiction: M = 0.92, SD = 1.213
The concern that students will not learn the intended lesson using the game/simulation: M = 1.36, SD = 1.302
The opinion that students learn more from a teacher than from a game or simulation: M = 1.17, SD = 1.267
The opinion that other learning strategies are more effective than using games or simulations: M = 1.41, SD = 1.295

Although these data appear approximately normal, with a mean of 1.12 and a
standard deviation of 1.08, they are moderately positively skewed and
slightly leptokurtic (skewness = .97). The highest ranked barrier in this category
was the opinion that other learning strategies are more effective than games and
simulations. The lowest ranked barrier was the perception that games and simu-
lations may cause student aggression or behavioral problems.
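The skewness and kurtosis figures used to characterize each factor's distribution can be computed as in the sketch below. The data here are hypothetical (a right-skewed sample standing in for low-rated barrier scores), not the survey responses.

```python
import numpy as np
from scipy.stats import skew, kurtosis

# Hypothetical factor-score data; the real survey ratings are not reproduced here.
rng = np.random.default_rng(0)
scores = rng.exponential(scale=1.1, size=200)  # right-skewed, like a low-rated barrier

g1 = skew(scores)      # > 0 indicates a longer right tail (positive skew)
g2 = kurtosis(scores)  # excess kurtosis: > 0 leptokurtic, < 0 platykurtic
```

Note that `scipy.stats.kurtosis` reports *excess* kurtosis (Fisher's definition), so zero corresponds to a normal distribution, matching the leptokurtic/platykurtic language used in this section.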

Technology Issues
Technology Issues was defined as a category of the barriers to the addition of
games and simulations in education that concerns problems with technology.

Table 5. Technology Issues Item Statistics.

Item                                                                      M     SD
Lack of games and simulations for disabled students
  (i.e., access, equipment, game/simulation options)                     1.69  1.344
Cost/expense of games/simulations/equipment                              2.62  1.234
Inability to try a game or simulation before purchase                    2.41  1.330
Lack of access to games and simulations outside of school                1.85  1.377
Lack of technical support (for teachers and students)                    1.93  1.424
Lack of technology reliability                                           1.71  1.372

This category includes difficulties with the technology itself, such as the cost, the
inability to preview, and the lack of accessibility to disabled students.
Additionally, this category also reflects the usage of the technology: for instance,
the reliability of the technology, the amount of technical support available, and
whether the technology can be easily accessed outside of school. The item level statistics
are illustrated in Table 5.
This category had a Cronbach's alpha of .80, which suggests a good relationship
among the six barriers in this group (Nunnally, 1978). These data are also
approximately normally distributed, with a mean of 2.04 and a standard deviation of .95,
appearing approximately symmetric (slight negative skew) and somewhat
platykurtic (skewness = -.22). The highest rated barrier in this category was the
cost or expense of the game or simulation. The lowest rated barrier was the lack
of game and simulation options for students with disabilities.

Issues Specific to Games and Simulations


Issues Specific to Games and Simulations was described as a category of the
barriers to the addition of games and simulations in education that addresses
concerns about games and simulations in general. Some examples include the
lack of games and simulations that balance education and entertainment, the
lack of customizability or adaptability, the lack of the ability to track student
progress, and the perception that games and simulations are too easy
and simplistic for students. The item level statistics are illustrated in Table 6.
This category had a Cronbach's alpha of .75, which suggests a good relationship
among this set of four barriers (Nunnally, 1978). The data are approximately
normally distributed, with a mean of 2.05, a standard deviation of .91, and approximate
symmetry (slight negative skew) with a slightly flattened curve (skewness = -.16).
The highest ranked barrier in this category is “edutainment” or the lack of
games with a balance between entertainment and education. The lowest
ranked barrier is the perception that games are too simple for students.

Table 6. Issues Specific to Games and Simulations Item Statistics.

Item                                                                      M     SD
Lack of games and simulations with a good balance
  between education and entertainment                                    2.34  1.167
Simplicity (too easy) of games and simulations for my students           1.62  1.203
Lack of customizability or adaptability in a game or simulation
  (i.e., inability to modify game/simulation subjects, goals,
  or objectives)                                                         2.18  1.188
Lack of the ability to track and assess student progress
  within a game/simulation                                               2.08  1.219

Table 7. Teacher Issues Item Statistics.

Item                                                                      M     SD
Lack of time (i.e., find a game or simulation, learn the game
  or simulation, incorporate a game or simulation into the lesson)       2.52  1.337
Lack of knowledge about how to use games and simulations
  appropriately                                                          1.60  1.336
Lack of games/simulations that are aligned to state standards
  or standardized testing                                                2.20  1.352
Lack of examples and available lesson plans using games and
  simulations                                                            2.22  1.347
Lack of your own motivation to use games and simulations
  in lessons                                                             0.71  0.960
Lack of my own technology abilities                                      0.63  1.002

Teacher Issues
Teacher Issues was identified as a category of the barriers to the addition of
games and simulations in education that concerns problems that teachers may
face. Some of these barriers include time (i.e., to plan and implement the use of a
game or simulation), finding games or simulations that match state standards or
standardized testing, the lack of available lesson plans or examples of game and
simulation incorporation, and characteristics of the teacher (i.e., not motivated
to use games and simulations, not very tech savvy, lack of knowledge about
games and simulations). The item statistics are shown in Table 7.

Table 8. Issues With Games and Simulations in Education Item Statistics.

Item                                                                      M     SD
The perception of the term game (rather than the term
  educational simulation, for instance)                                  1.55  1.448
Lack of evidence to support the use of games and simulations
  in education                                                           1.46  1.239
Lack of parental and community support for the use of games
  and simulations in classrooms/lessons                                  1.58  1.360
Lack of clear expectations, by administrators, for teacher usage         1.55  1.321
Lack of administrative support                                           1.57  1.425

The Teacher Issues category had a Cronbach's alpha of .79, suggesting good
internal consistency among the six barriers in this group (Nunnally, 1978). The
data are approximately normally distributed, with a mean of 1.65, a standard deviation of .87,
approximate symmetry, and a moderately flattened curve (kurtosis = -.67). The
highest ranked barrier in this category is time to plan and implement. The lowest
ranked barrier in this category is the respondent's own technology abilities.

Issues With Games and Simulations in Education


The Issues With Games and Simulations in Education category was defined as a
group of barriers specific to using games and simulations in education. For
example, teachers may feel more comfortable about using games and simula-
tions if there were more evidence that they were helpful to student learning.
Additionally, if there were clearly outlined expectations by administrators, it
may be easier for teachers to add them. Also, teachers may feel that using
the term game in a lesson promotes the belief that students are playing rather
than learning. Furthermore, if parental and community support were openly
displayed, teachers might also feel more inclined to incorporate games and
simulations in their lessons. These items are shown in Table 8.
This category had a Cronbach's alpha of .87, which suggests a very close
relationship among the five barriers in this group (Nunnally, 1978). Although
these data are approximately normally distributed, with a mean of 1.55, a standard deviation of
1.10, and approximate symmetry, the distribution curve appears quite flat
(kurtosis = -.95). The highest rated barrier in this category is the lack of
parental and community support. The lowest rated barrier was the lack of evidence
to support the use of games and simulations in education.

Student Issues
Student Issues was defined as a category of barriers to the adoption of games
and simulations in education that are specific to students, as shown in Table 9.

Table 9. Student Issues Item Statistics.

Item                                                                      M     SD
Lack of student motivation to use games and simulations in lessons
  (i.e., students do not seem interested in games/simulations)           0.61  0.942
Varying student abilities (i.e., technology skills, learning ability)    1.05  1.038

Table 10. Incorporation Issues Item Statistics.

Item                                                                      M     SD
Lack of customizability or adaptability in a game or simulation
  (i.e., inability to modify game/simulation subjects, goals,
  or objectives)                                                         2.18  1.188
Length of class period                                                   1.57  1.429
Class size                                                               1.50  1.404

For instance, students have a wide range of technical abilities, which makes it
difficult for a teacher since some students may need extra help and other students
may become bored while waiting for others to catch up. Additionally, some
students may not be very motivated to use a game or simulation in class. For
example, the student may have had a previous bad experience with a game or
simulation and thus may have decided not to try any more games or simulations.
This category had a Cronbach's alpha of .73, just above the .70 threshold for
acceptability, which suggests a relationship between the two barriers in this group
(Nunnally, 1978). As with the subsequent category, Incorporation Issues,
Student Issues may have a low Cronbach's alpha because there are
only two barriers in the group. These data were approximately normally distributed, with a
mean of .83 and a standard deviation of .88, but moderately positively skewed
and leptokurtic (skewness = 1.00). Since there were only two barriers in this
category, one is the highest (varying student abilities) and one is the lowest
(lack of student motivation).

Incorporation Issues
Incorporation Issues was described as a category of barriers to the adoption of
games and simulations in education that reflect some of the specific issues in the
integration of this technology into the classroom. For example, many games and
simulations are too complex to fit into one class period or there may be too
many students in the class for the teacher to help each student effectively. The
item level statistics for this construct are shown in Table 10.

The Incorporation Issues category had the lowest Cronbach's alpha (α = .65).
These data showed an approximately normal distribution with a mean of 1.75, a standard
deviation of 1.03, and approximate symmetry with moderate platykurtosis
(skewness = .33). The highest ranked barrier in this category is the lack of
customizability or adaptability in a game or simulation. The lowest barrier in
this category is class size.

Multivariate Analysis of Variance


Since one of our research objectives included consideration of demographic
characteristics of the instructor and instructor experience with games and
simulations, we used the factor groups to look for relationships among these
more comprehensive instructor characteristics. Consequently, a one-factor, between-
subjects MANOVA was conducted for each of the demographic questions (i.e.,
gender, age, ethnicity, highest degree earned, grade level currently taught) and
the question about the average amount of time the respondent spends gaming on
the survey. Each of the seven categories of barriers determined by the explora-
tory factor analysis served as the dependent variables in each analysis. Each
question on the survey comprised the independent variable for that particular
MANOVA. Results from the MANOVAs included those that were statistically
significant and those approaching significance at the a priori significance level of
.05. Four dependent variables (i.e., barrier categories) showed significant effects, at
the .05 level, with two independent variables (i.e., gender and respondent game-
playing frequency). Approaching significance was defined as any probability
between .05 and .10, since the relationship would have been considered significant
had α been set at the .10 level. Both levels were taken into consideration since
this would be a decision of the researcher who uses the survey. Appendix B
contains the results for each comprehensive characteristic in relation to the factors.
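For readers replicating this design, a one-factor between-subjects MANOVA reduces to comparing the within-groups and between-groups sums-of-squares-and-cross-products (SSCP) matrices. The sketch below computes the Wilks' lambda statistic from scratch with NumPy; the two "gender" groups of seven factor scores are invented for illustration.

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks' lambda for a one-factor MANOVA.

    `groups` is a list of (n_g x p) arrays, one per level of the factor,
    where p is the number of dependent variables (barrier categories).
    """
    all_obs = np.vstack(groups)
    grand_mean = all_obs.mean(axis=0)
    p = all_obs.shape[1]
    E = np.zeros((p, p))   # within-groups (error) SSCP
    H = np.zeros((p, p))   # between-groups (hypothesis) SSCP
    for g in groups:
        m = g.mean(axis=0)
        centered = g - m
        E += centered.T @ centered
        d = (m - grand_mean).reshape(-1, 1)
        H += len(g) * (d @ d.T)
    # Lambda near 1 means group means barely differ; near 0, they differ strongly.
    return np.linalg.det(E) / np.linalg.det(E + H)

# Hypothetical data: two gender groups, seven factor scores each.
rng = np.random.default_rng(1)
females = rng.normal(1.7, 0.9, size=(40, 7))
males = rng.normal(1.4, 0.9, size=(25, 7))
lam = wilks_lambda([females, males])
```

In practice one would convert lambda to an approximate F statistic to obtain the p values reported below; statistical packages perform that transformation automatically.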

Significant Results. Gender, an independent variable, had a significant relationship
with three dependent variables: Issues With Negative Potential Student
Outcomes (p = .011), Technology Issues (p = .005), and Teacher Issues
(p = .040). The respondents' game play frequency, an independent variable,
had a significant relationship with one dependent variable, Technology Issues
(p = .009), at the .05 level.

Approaching Significant Results. Age, an independent variable, had an approaching
significant relationship with Issues Specific to Games and Simulations (p = .055),
since it would have been significant had α been set at .10. The independent variable
ethnicity had an approaching significant relationship with Issues With Games
and Simulations in Education (p = .098). The independent variable of gameplay
frequency had an approaching significant relationship with the dependent
variable Teacher Issues (p = .061). And finally, the independent variable
grade level taught by the respondent had two approaching significant relation-
ships: Technology Issues (p = .061) and Teacher Issues (p = .058).

Analysis of Variance
ANOVAs for each dependent variable were conducted as follow-up tests to the
MANOVA results that were significant at the .05 level. The four follow-up
analyses included two independent variables, gender and frequency of game
play. The ANOVA using gender and Issues With Negative Potential Student
Outcomes resulted in a significant difference between males and females
(F = 3.286, p = .030). Females had a mean of 0.99 (SD = 0.97, n = 145) and
males had a mean of 1.38 (SD = 1.26, n = 72). These results suggest that, on
average, males rated the barriers in this category, Issues With Negative
Potential Student Outcomes, as more of a barrier than did females.
Also, the ANOVA using gender and Technology Issues resulted in a significant
difference between males and females (F = 6.164, p = .032). Females had a mean
of 2.16 (SD = 0.93, n = 146) and males had a mean of 1.80 (SD = 0.96, n = 69).
These results suggest that technology issues were, on average, more of a barrier
for females than for males. An additional significant difference between genders
surfaced in the ANOVA examining gender and Teacher Issues (F = 3.393,
p = .031). Females had a mean of 1.75 (SD = 0.85, n = 145) and males had a
mean of 1.47 (SD = 0.87, n = 71). These results suggest that teacher issues were,
on average, more of a barrier for females than for males.
A follow-up ANOVA between the independent variable of respondents' game
play frequency and the dependent variable of Technology Issues yielded results
that were not significant at the .05 level (F = 2.030, p = .076). However, by using
the definition previously applied, these results are approaching significance since
the results would be considered significant at the .10 level.
Game play frequency was based on the average amount of time the respondent
played games in one week. There were six categories: 0 hours per week
(M = 2.28, SD = 0.95, n = 29), 0–2 hours per week (M = 2.16, SD = 0.92,
n = 90), 2–5 hours per week (M = 1.93, SD = 0.92, n = 53), 5–10 hours per
week (M = 1.97, SD = 1.05, n = 23), 10–25 hours per week (M = 1.68,
SD = 0.97, n = 17), and >25 hours per week (M = 1.00, SD = 0.67, n = 3).
These results suggest that individuals who had less experience playing games
rated items in the Technology Issues category as more of a barrier.
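A follow-up one-way ANOVA of this kind can be sketched with SciPy. The group sizes below mirror the six frequency bands above, but the factor scores themselves are invented stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical Technology Issues factor scores by weekly game-play band.
rng = np.random.default_rng(7)
bands = {
    "0 hrs":     rng.normal(2.3, 0.95, 29),
    "0-2 hrs":   rng.normal(2.2, 0.92, 90),
    "2-5 hrs":   rng.normal(1.9, 0.92, 53),
    "5-10 hrs":  rng.normal(2.0, 1.05, 23),
    "10-25 hrs": rng.normal(1.7, 0.97, 17),
    ">25 hrs":   rng.normal(1.0, 0.67, 3),
}

# One-way ANOVA across the six frequency groups.
F, p = f_oneway(*bands.values())
```

Note the very small highest-frequency group (n = 3 in the study as well); heavily unbalanced group sizes like this weaken the test's power and are one reason the reported result only approached significance.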

Discussion of Survey Test Results


A large portion of the survey comprised 32 potential barriers that respondents
rated according to how much (or how little) they perceived each item as
a barrier to the adoption of games and simulations into their curriculum.
An exploratory factor analysis helped to illuminate the underlying structure by
identifying seven factors that accounted for most of the variability in the
respondents' rankings. These seven categories of barriers were as follows:
Issues With Negative Potential Student Outcomes, Technology Issues, Issues
Specific to Games and Simulations, Teacher Issues, Issues With Games and
Simulations in Education, Incorporation Issues, and Student Issues. The
factor analysis supported the construct validity of our survey for these data.
Additionally, we conducted internal consistency reliability analyses and demonstrated an
internally consistent survey for these data.
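The factor-retention logic behind a scree plot (Figure 2) and eigenvalue inspection can be sketched as follows. A simulated 32-item response matrix with seven latent factors stands in for the survey data; the Kaiser criterion (retain eigenvalues greater than 1) shown here is one common rule, not necessarily the exact extraction method used in the study.

```python
import numpy as np

# Simulated 32-item responses driven by 7 latent factors plus noise;
# in practice this would be the (respondents x 32) survey rating matrix.
rng = np.random.default_rng(3)
loadings = rng.normal(size=(32, 7))
latent = rng.normal(size=(250, 7))
responses = latent @ loadings.T + rng.normal(scale=2.0, size=(250, 32))

R = np.corrcoef(responses, rowvar=False)        # 32 x 32 item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # eigenvalues, largest first (scree order)
n_factors = int((eigvals > 1.0).sum())          # Kaiser criterion: eigenvalues > 1
```

Plotting `eigvals` against their rank reproduces the scree plot: the "elbow" where the curve flattens, together with the eigenvalues-greater-than-one count, guides how many factors to retain.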

Issues With Negative Potential Student Outcomes


Since all six of these barriers are teacher concerns that are specific to games and
simulations, it is reasonable that they are contained within one construct. For
example, overstimulation, aggression, and addiction have been identified as con-
cerns by some educators and parents who worry about games and simulations
being used as educational components (Barko & Sadler, 2013; Kenny & Gunter,
2011; Koh et al., 2011; Rice, 2007; Rosas et al., 2003).
Gender was the only demographic characteristic that had a significant
relationship with this barrier category: gender was a significant (p = .030)
factor in ranking the individual barriers in the Issues With Negative Potential
Student Outcomes factor. On average, male educators tended to rank
these individual barriers as more of a barrier than did female educators. This
is very interesting since much of the research shows that males tend to enjoy
playing games and simulations more, and prefer learning through games
and simulations more, than females (Greenberg, Sherry, Lachlan, Lucas, &
Holmstrom, 2010; Hainey, Connolly, Stansfield, & Boyle, 2011;
Hamlen, 2010; Robertson, 2012); however, since these data were
acquired through a nonrandom sample of the population, speculation about
these results falls outside the intent of this article, which is to create a valid and
reliable survey.

Technology Issues
All of these individual barriers can be thought of as barriers for any technology,
not just games and simulations. For example, researchers have identified some serious
problems with top-quality equipment and software in schools: the cost of the
technology, its availability and accessibility to staff and students, and the
inability to try out products, like simulations, before buying (Becker & Jacobsen,
2005; Egenfeldt-Nielsen, 2004; Koh et al., 2011; Lean et al., 2006; Moizer
et al., 2009; Rice, 2007; Rosas et al., 2003; Royle & Colfer, 2010; Russell &
Shepherd, 2010; Summers, 2004; Torrente, Moreno-Ger, Martinez-Ortiz, &
Fernandez-Manjon, 2009).

The gender of the respondent also significantly (p = .032) influenced how the
individual barriers in the Technology Issues category were ranked. In general,
female educators ranked individual technology items as more of a barrier than
male educators did. Given some of the research on females and technology, this
result is provocative. For example, one study suggests that technology is a
male domain since males have positive attitudes toward technology, report
fewer problems with technology, and can integrate technology smoothly into
lessons (Bourgonjon et al., 2011). Additionally, Abbiss (2008) broadly characterized
technology as a male domain and a female deficit. Conversely, many
researchers claim that females use technology just as much as males, or that
females use technology in different ways than males; therefore, their motivations
and level of confidence may not be the same as males' (Annetta et al., 2009;
Jensen & De Castell, 2010; Joiner et al., 2011; Padilla-Walker, Nelson, Carroll,
& Jensen, 2010; Wilson, 2006). These starkly opposing sides to the question of
gender and technology, along with this study’s significant results, indicate this
subject deserves a closer look; however, this falls outside the purpose of this
study to create a valid and reliable survey.

Teacher Issues
This factor contains individual barriers that, although they originate from
different places (i.e., administrator pressure, state standardization, understanding
complicated technology), ultimately present a perceived barrier to the teacher.
Although many researchers claim that a major barrier to
adopting games and simulations in the classroom is the characteristics of the
teacher, perhaps, at least in this case, it is more about the beliefs and perceptions
of the teacher, rather than the characteristics of the teacher (Egenfeldt-Nielsen,
2004; Niederhauser & Stoddart, 2001; Ritzhaupt et al., 2010; Rosas et al., 2003;
Royle & Colfer, 2010; Simpson & Stansberry, 2008; Taylor, 2008; Virvou,
Katsionis, & Manos, 2005). For example, Simpson and Stansberry (2008) sug-
gest that political mechanisms, such as high-stakes testing, cause pressure on
teachers to improve their students’ test scores. Unless a technology is perceived
as a guaranteed improvement of students’ scores, teachers may not risk an
unknown technology, like games and simulations, which could take time away
from technologies and lessons that have been proven to increase test scores.
Another example is teachers' perceived confidence in their knowledge of
games and simulations. Since researchers consistently identify training as a bar-
rier to the adoption of any instructional technology, this could be a perceived
problem to teachers who have had little to no training in the use of games and
simulations (King, 2002; Koh et al., 2011; Kotrlik & Redmann, 2009;
Niederhauser & Stoddart, 2001; Royle & Colfer, 2010; Simpson & Stansberry,
2008; Smarkola, 2007).

Again, gender was a significant (p = .031) factor in ranking these individual
barriers. In this study, female educators ranked these items as more of a barrier
than their male counterparts did. Some recent research may help explain these
results. For instance, Hamlen (2010) concluded that although females initially
have the same ability as males, their lack of motivation to use technology
leads to less overall experience and thereby lowers their confidence. In other
words, because females do not receive the same feelings of reward for using
technology that males do, females may not be motivated to continue using
technology. Interestingly, Lim (2008) suggested that a major barrier to designing
successful learning environments was a lack of motivation by teachers coupled
with a focus on standards, grades, and measured outcomes. Unfortunately,
gender difference was not a focus of Lim’s (2008) study, but these results are
very interesting when coupled with Hamlen’s (2010) results. In general, to under-
stand how these individual barriers are specifically associated with gender, more
research is required.

Issues Specific to Games and Simulations


These barriers seem to be problems with the construct of the game or simulation
itself. For instance, many educators, who are not opposed to using games and
simulations in class, cite the problem of “edutainment” (i.e., a game that does
not balance entertainment and education) in finding a game to use in a lesson
(Rosas et al., 2003; Torrente et al., 2010). Another example is the inability to
monitor and evaluate each student’s progression through the game or simulation
(Russell & Shepherd, 2010; Sliney et al., 2009; Torrente et al., 2009, 2010).
Without this ability, it is almost impossible for teachers to assign a student a
grade based on progress or improvement within the game or simulation without
creating additional assignments outside of it.
There were no significant interactions with this barrier construct, although
age approached a significant (p = .055) interaction. Unfortunately, not
much research has been done regarding age and game and simulation
usage. Kotrlik and Redmann (2009) did find that older teachers have less
confidence in technology and in their ability to use that technology, but this
finding is not specific to games and simulations. The main concern regarding age and
technology use is the generation gap (i.e., digital divide) between technology-
savvy users and those who are not very savvy with technology (Buckingham,
2003; Tapscott, 1998). Perhaps, there is a difference in the understanding and
use of game and simulation technology between younger teachers, who are
familiar with game and simulation technology, and older teachers, who are
not familiar with game and simulation technology, as they stand on either
side of the digital divide. Clearly, without further research, this is just
speculation.

Issues With Games and Simulations in Education


This is another category in which two of the barriers may be considered
specific to games, while the other three may be considered specific to
integration into school. For example, the term game, which is some-
times associated with playing rather than learning, and other negative percep-
tions of games and simulations may cause the need for extra evidence to justify
use in education (Kenny & Gunter, 2011; Koh et al., 2011; Rice, 2007; Rosas
et al., 2003; Wexler et al., 2007). Additionally, school cultural resistance, espe-
cially support from administrators, parents, and community, has been cited by
several researchers as a barrier to the adoption of games and simulations in
education (Koh et al., 2011; Royle & Colfer, 2010; Sliney et al., 2009).
This barrier category had no significant interactions, although ethnicity had an
approaching significant (p = .098) interaction with it. As with age,
there has been little research on differences between ethnicities with regard to
game and simulation usage. Although Roberts and Foehr (2008) found that
African American males played games longer than other ethnicities, it is
interesting that most video games today offer very few characters of ethnicities
other than Caucasian. Some of the video games that do offer minority characters
may not present the most positive character roles, or those games may be strictly for
entertainment purposes (i.e., Grand Theft Auto). Perhaps as game designers
create more positive character roles for ethnicities other than Caucasian, the
possible interaction between ethnicity and Issues With Games and
Simulations in Education will decrease. Of course, this is just conjecture and
outside of the purpose of this study, which is to create a reliable and valid survey.

Student Issues
This barrier construct contained two individual barriers: the lack of student
motivation and the variation in student abilities (i.e., technology skills, learning
abilities). Neither of these barriers is specific to games and simulations but could
potentially be a barrier for any new technology. Both a lack of student motivation
and a wide range of student skill and experience would make it difficult for an
educator to keep all students on task, because some students may be disinterested,
or familiar with the technology and bored, whereas others will be lost
and need extra help (Schrum, Burbank, Engle, Chambers, & Glassett, 2005;
Vos & Brennan, 2010). Interestingly, there were no significant or approaching
significant interactions with this barrier category. One barrier did have the
lowest averaged ranking (i.e., the lack of student motivation to use games and
simulations). The other barrier was not rated very highly either (i.e., varying
student abilities), which suggests that perhaps this category is not much of a
perceived barrier overall to the adoption of games and simulations in education
in this nonrandom sample.

Incorporation Issues
This category contains three barriers: class size, complexity, and class period
length. Class size can be considered as more of a barrier typical of integration
into school, since many teachers have no control over the number of students in
their classrooms. Complexity of games and simulations can be thought of as a
game-specific barrier since they are difficult to play in one day; consequently,
most students will have little recollection from the previous day and will essen-
tially start from scratch each day the game or simulation is played (Egenfeldt-
Nielsen, 2004; Squire, 2006). Interestingly, the length of the class period could be
considered both school-based and game-specific barriers. For example, the
length of a class period is not controlled by the teacher but is dictated by the
school; thus, it could be considered a school-based barrier. Additionally, because
of the way games and simulations are constructed, a typical game or simulation
cannot be completed within one class period. Hence, it could be
considered a game-specific barrier.
Another identified problem with using games and simulations in the class-
room is the amount of class time needed for this complex software (Kebritchi,
2010; Koh et al., 2011; Lean et al., 2006; Royle & Colfer, 2010; Sanchez, 2009;
Torrente et al., 2010). It is difficult to learn to play a game within one class
period and then continue that play a day or two later. Additionally, class size
can be a barrier for the introduction of any new technology. With the use of
games and simulations in education, Egenfeldt-Nielsen (2004) cites larger class
sizes as a barrier to adoption. Incorporation Issues had no significant or
approaching significant interactions.

Limitations and Delimitations


Several of the limitations of this study center on the respondents. For example,
since the survey was distributed via the Internet, it can be assumed that the respondents
are comfortable using technology. Another assumption was that all
respondents were proficient enough with the English language to understand the
terminology and questions in the survey. Additionally, the respondents may not
represent a random sample of the whole population of educators since the invi-
tations to the Internet survey were sent to two groups of educators: members of
a special interest group of ISTE (International Society for Technology in
Education) that is a proponent of the use of games and simulations in class-
rooms and members of two different groups of career pathway organizations
that are primarily composed of various types of educators from Adult Education
(i.e., ESL, ABE/GED, AHS). As a result, most of the ISTE respondents can be
considered biased in that they are comfortable with technology and approve
of the use of technology, games and simulations in particular,
in formal education. As for those respondents in Adult Education, they may be

biased toward issues within their own educational level. However, since the
intent was to test the survey, a completely random sample was not absolutely
necessary for our purposes. Another limitation is researcher bias and its
impact on the interpretation of the data collected during this study, especially
the interpretation of the interviews and the definition of the seven categories of
barriers created by the exploratory factor analysis.
An additional limitation of this study was not discovered until the analysis of
the data. A few of the questions were not placed correctly into the Internet
survey host to permit their use in the data analyses. For example, a question
about which grade levels would benefit from the addition of games and simula-
tions included an option for the respondent to choose more than one grade level.
The option for multiple answers made this question ineligible for statistical
analysis. Had these data been collected correctly, this information
could have been included in the statistical analyses, which would have led to a
broader approach to identifying the barriers to the adoption of games and simu-
lations in education.
A delimitation of this study is that the survey results may not generalize
to all learning with games and simulations. For example, the results of this study
may not be applicable to a company that trains its employees through the use
of computer simulations or a branch of the military that trains its soldiers with
computer simulations. Also, the results of this study may not be completely
generalizable to online educational programs, since those barriers (i.e., access
to the Internet) may be quite different from brick and mortar schools. Since the
respondents were comfortable with technology, schools that have little
technology may face more severe technology problems (i.e., access, comfort
with technology) than is reflected in this study.

Recommendations for Future Use


With a few modifications, we believe the survey instrument to be valid, reliable
for the data it obtains, and able to capture the data for which it was intended,
the teacher perceived barriers to using games and simulations in formal educa-
tion. For example, because of the limitation of a multiple response on the ques-
tion about technology use in curriculum, these data could not be used in the
statistical analysis; however, we believe it would have been interesting to com-
pare the amount of technology a respondent used to the ranking of barriers. For
instance, if a respondent used a lot of technology, he or she may have ranked the
individual barriers in the Technology Issues category lower. It would also have
been interesting to look more in depth at the particular technologies used and
barrier rankings.
Additionally, this survey was tested on a nonrandom sample of educators to
ensure that the information it collected corresponded to the intent for which the
survey instrument was designed. By using a more random sample of educators,
future researchers may obtain more valid results. A larger, more random
population and a corrected survey instrument (i.e., broader scope) may reveal
new interactions and provide more understanding of the interactions that this
study found significant or approaching significance. Future study is imperative to
understanding the teacher perceived barriers so that these barriers may be over-
come and, subsequently, that games and simulations are successfully introduced
into formal education. A final consideration is conducting a confirmatory factor
analysis of this instrument on a new sample to verify the factor structure.

Conclusions
Because, at present, very few, if any, studies take a broad, comprehensive look at
potential barriers to the adoption of games and simulations in formal education,
the purpose of this study was to create an all-inclusive survey to discern these
barriers. A comprehensive survey that distinguishes whether teacher perceived
barriers vary by grade category, teacher demographics, and teacher game and
simulation inexperience, and whether the identified barriers are general to the
adoption of any new technology or specific to games and simulations, may be more likely
to become a widely accepted, valid, and reliable instrument in ascertaining the
barriers to the adoption of games and simulations in formal education.
To achieve this goal, we combined previously conducted research with
interviews of educators to design a draft survey of the potential barriers to the
adoption of games and simulations in curriculum. We used a focus group, expert
review, and think-aloud protocol to increase the accuracy and efficacy of the
survey instrument (AERA, APA, & NCME, 1999; Beatty, 2004; Chioncel et al.,
2003; Grant & Davis, 1997; Rabiee, 2004; Van Someren et al., 1994; Vogt et al.,
2004). This process in particular helped establish the content validity of this survey.
Then, we transferred the survey onto the Internet so that we could test it by
distributing it to a group of educators. We then analyzed the results, using an
exploratory factor analysis, to ensure that the data gathered corresponded to
the intent for which the survey instrument was designed and to determine the
teacher perceived barriers to the use of games and simulations in formal
education.
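To illustrate where a variance-explained figure like the one reported below comes from, the following sketch computes the proportion of total item variance captured by the largest eigenvalues of an item correlation matrix. The data here are synthetic; only the item count (32) and factor count (7) mirror this instrument, and this is not the study's code or data.

```python
import numpy as np

def variance_explained(responses, n_factors):
    """Proportion of total variance captured by the n_factors largest
    eigenvalues of the item correlation matrix -- the figure an
    exploratory factor analysis reports for a retained solution."""
    corr = np.corrcoef(responses, rowvar=False)   # items as columns
    eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # sorted descending
    return eigenvalues[:n_factors].sum() / eigenvalues.sum()

# Synthetic example: 200 respondents rating 32 barrier items,
# generated from 7 underlying factors plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 7))
loadings = rng.normal(size=(7, 32))
items = latent @ loadings + rng.normal(scale=2.0, size=(200, 32))

print(f"7 factors explain {variance_explained(items, 7):.0%} of the variance")
```

With real Likert-scale responses, the number of retained factors and the rotation would be chosen from the data rather than fixed in advance; a pattern matrix such as the one in Appendix C typically comes from a rotated solution.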
The exploratory factor analysis led to the discovery of seven barrier cate-
gories: Issues With Negative Potential Student Outcomes, Technology Issues,
Issues Specific to Games and Simulations, Teacher Issues, Issues With Games
and Simulations in Education, Incorporation Difficulties, and Student Ability.
These categories accounted for approximately 67% of the variance in the results.
By using a MANOVA and then a follow-up ANOVA on significant results, we
found that gender had a significant interaction with three barrier categories:
Issues With Negative Potential Student Outcomes, Technology Issues, and
Teacher Issues. Upon reviewing the means, it appears that males are more con-
cerned with individual barriers, like negative student behavioral outcomes and
negative student learning outcomes, in the Issues With Negative Potential Student
Outcomes category. Female educators ranked individual barriers in the
Technology Issues category (i.e., technical support, technology reliability,
accessibility outside of school) as greater barriers to the adoption of games and
simulations in their curriculum than male educators did. And finally, female
educators rated individual barriers in the Teacher Issues category (i.e., time to
plan and implement, matching to standards or standardized testing) as greater
barriers than male educators did. Another significant interaction was Respondent
Game Play Frequency and Technology Issues. After reviewing the means, it
appears that individuals who are inexperienced with playing games and
simulations ranked the individual barriers in the Technology Issues category
(i.e., technical support, technology reliability, accessibility outside of school)
as greater barriers. All of these results from the test of the survey can be explained and
supported by research, which further suggests that we have successfully created a
valid and reliable comprehensive tool for identifying the teacher perceived barriers
to using games and simulations in education.
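The follow-up test described above can be sketched as a one-way ANOVA F statistic computed on the factor scores of the groups being compared. The snippet below is a minimal illustration with invented group labels, sizes, and scores; it is not the study's data or analysis code.

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_scores = np.concatenate(groups)
    grand_mean = all_scores.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical example: Technology Issues factor scores by gender,
# where one group rates the barriers somewhat higher on average.
rng = np.random.default_rng(1)
female = rng.normal(loc=2.4, scale=0.8, size=90)
male = rng.normal(loc=2.0, scale=0.8, size=60)
print(round(one_way_anova_f(female, male), 2))
```

In practice the F statistic would be compared against the F distribution with the corresponding degrees of freedom to obtain the p values reported in Appendix B.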
All of our research helped to create a valid and reliable survey instrument to
discern teacher perceived barriers to the adoption of games and simulations in
formal education. This survey instrument, with slight improvements (i.e., chan-
ging the multiple response questions to single response, as seen in the limitations
section), can be used on a larger scale with a more random set of educators to
more definitively ascertain the barriers that teachers identify as preventing them
from using games and simulations in their curriculum.

Appendix A. Survey Instrument


1) Do you agree to the terms of this study?
2) Gender
Female
Male
3) Age
0–20
21–30
31–40
41–50
51–60
61–65
Older than 65
4) Ethnicity (please check all that apply)
Asian
Black/African American
Hawaiian/Pacific Islander
Hispanic/Latino/Caribbean Islander
White/Caucasian
Native American
Other
5) Highest Degree Earned
Associates
Bachelors
Masters
Specialist
Doctorate
Other (please specify)
6) What grade level do you currently teach?
Elementary
Middle school
High school
Postsecondary (i.e., college, university, technical)
Adult education (i.e., ABE/GED, ESL/ESOL, adult high school)
Other (please specify)
7) How do you use technology in your curriculum? Please check all that apply.
Electronic presentations (i.e., PowerPoint, Prezi, SlideRocket, and so on)
Digital programs included with textbooks
District programs (i.e., Discovery Education, Standardized Test Prep pro-
grams, and so on)
Learning/Course management systems (i.e., BlackBoard, Angel, WebCT
and so on)
Mobile digital devices
Internet searches/Research
Internet/Specific websites
Electronic meeting place (i.e., Elluminate, Wimba, and so on)
Gaming platforms (i.e., Wii, Xbox, PlayStation, and so on)
Computer games/simulations (i.e., software, Internet, mobile application)
Teacher-created digital media for lesson
Students create digital media
Other (please explain)
8) How often do you play games (board, card, Internet, software, gaming plat-
form, mobile application, and so on)? Please check the box next to the number
of hours per week you play games.
0 hours per week
0–2 hours per week
2–5 hours per week
5–10 hours per week
10–25 hours per week
More than 25 hours per week
9) How do you think games or simulations could be useful for educational
purposes? Please check all that apply.
Games are NOT useful for education
Review of material
Motivating and engaging students
Applying learning styles and varied learning
Immediate feedback and self-correction
Building hand-eye coordination
Problem solving and critical thinking
Differentiated (personalized) learning
Peer learning opportunities
Pretest for current skills to assign lessons
Posttest for learned skills
Foster good-natured competition among students
Approximate real-life situations
As a reward for students
Other (please explain)
10) In general, how compatible (complementary/well-suited) are educational
games and simulations with your own teaching practices?
Not at all compatible
Somewhat compatible
Mostly compatible
11) In general, are educational games or simulations too complex (challenging/
time consuming) for your students to learn the intended lesson?
Too complex
Somewhat complex
Not at all complex
12) In general, how easy do you think it would be to experiment with an educa-
tional game or simulation for one of your lessons?
Not at all easy
Somewhat easy
Mostly easy
13) How many co-workers have you seen using games or simulations in their
classroom, teaching practices, curriculum, or lesson plans?
Many
Some
Very few
None
14) Which grade level(s) do you think would benefit from the addition of educa-
tional games and simulations? Please check all that apply.
None
Elementary
Middle school
High school
Postsecondary (i.e., college, university, technical)
Adult education (i.e., ABE/GED, ESL/ESOL, Adult High School)
Other (please explain)
15) What learner level(s) do you think would benefit from the addition of educa-
tional games and simulations? Please check all that apply.
None
Low-level learners
General (intermediate) learners
Gifted (high-level) learners
Mixed learners (two or more groups combined in one class)
Other (please explain)
16) Please rate each potential barrier according to your opinion of how
much the item may be an obstacle to your use of educational games
and simulations in your classroom, teaching practices, lesson plans, or curriculum.
In other words, how much does each of these potential barriers prevent you
from using games and simulations?
Scale: 0 (not a barrier), 1, 2 (somewhat a barrier), 3, 4 (definitely a barrier)
1) Lack of time (i.e., find a game or simulation, learn the game or simulation,
incorporate a game or simulation into the lesson)
2) Lack of games and simulations for disabled students (i.e., access, equipment,
game/simulation options)
3) Lack of games and simulations with a good balance between education and
entertainment (i.e., game/simulation is entertaining but with little learning, or it
has enough learning but has little entertainment)
4) Complexity (too difficult) of games and simulations for my students
5) Simplicity (too easy) of games and simulations for my students
6) Lack of customizability or adaptability in a game or simulation (i.e., inability
to modify game/simulation subjects, goals, or objectives)
7) Lack of the ability to track and assess student progress within a game/
simulation
8) Lack of knowledge about how to use games and simulations appropriately
9) The opinion that games and simulations cause problems with classroom
management and in-class student behavior
10) The perception that games may cause student behavioral problems (i.e.,
violence or aggression)
11) The perception that games may cause student obsession or addiction
12) The concern that students will not learn the intended lesson using the game/
simulation
13) The opinion that students learn more from a teacher than from a game or
simulation
14) The opinion that other learning strategies are more effective than using
games or simulations
15) Lack of games/simulations that are aligned to state standards or standardized testing
16) Lack of examples and available lesson plans using games and simulations
17) The perception of the term game (rather than the term educational simula-
tion, for example)
18) Lack of evidence to support the use of games and simulations in education
19) Lack of parental and community support for the use of games and simula-
tions in classrooms/lessons
20) Lack of your own motivation to use games and simulations in lessons
21) Lack of student motivation to use games and simulations in lessons (i.e.,
students do not seem interested in games/simulations)
22) Varying student abilities (i.e., technology skills, learning ability)
23) Lack of clear expectations, by administrators, for teacher usage
24) Cost/expense of games/simulations/equipment
25) Inability to try a game or simulation before purchase
26) Lack of access to games and simulations outside of school
27) Lack of technical support (for teachers and students)
28) Lack of technology reliability
29) Lack of my own technology abilities
30) Lack of administrative support
31) Length of class period
32) Class size
17) Are there any barriers missing from the list on the previous page?
In other words, is there something not previously listed that is preventing you
from using educational games and simulations in your lessons, curriculum,
teaching practices, or classroom?
18) Do you have any general comments or concerns about this survey and study?

Appendix B. Results From MANOVA

(Comprehensive characteristics' relationships to the factors)

Demographic     Factor                                            Significant?   Wilks' Λ   F       p
Gender          Issues With Negative Potential Student Outcomes   Yes            .835       6.577   .011
Gender          Technology Issues                                 Yes            .835       8.050   .005
Gender          Teacher Issues                                    Yes            .835       4.293   .040
Gender          Issues Specific to Games and Simulations          No             .835       0.017   .896
Gender          Issues With Games and Simulations in Education    No             .835       0.990   .321
Gender          Incorporation Issues                              No             .835       0.644   .423
Gender          Student Issues                                    No             .835       0.007   .931
Age             Issues With Negative Potential Student Outcomes   No             .835       0.704   .647
Age             Technology Issues                                 No             .835       0.414   .869
Age             Teacher Issues                                    No             .835       0.291   .941
Age             Issues Specific to Games and Simulations          Approaching    .835       2.106   .055
Age             Issues With Games and Simulations in Education    No             .835       0.615   .718
Age             Incorporation Issues                              No             .835       0.694   .654
Age             Student Issues                                    No             .835       1.314   .253
Ethnicity       Issues With Negative Potential Student Outcomes   No             .835       0.896   .410
Ethnicity       Technology Issues                                 No             .835       1.099   .336
Ethnicity       Teacher Issues                                    No             .835       0.289   .749
Ethnicity       Issues Specific to Games and Simulations          No             .835       0.919   .401
Ethnicity       Issues With Games and Simulations in Education    Approaching    .835       2.351   .098
Ethnicity       Incorporation Issues                              No             .835       1.346   .263
Ethnicity       Student Issues                                    No             .835       1.743   .178
Highest degree  Issues With Negative Potential Student Outcomes   No             .835       1.461   .205
Highest degree  Technology Issues                                 No             .835       0.255   .937
Highest degree  Teacher Issues                                    No             .835       1.137   .343
Highest degree  Issues Specific to Games and Simulations          No             .835       0.792   .557
Highest degree  Issues With Games and Simulations in Education    No             .835       1.675   .143
Highest degree  Incorporation Issues                              No             .835       0.565   .727
Highest degree  Student Issues                                    No             .835       0.347   .883
Grade taught    Issues With Negative Potential Student Outcomes   No             .835       1.214   .305
Grade taught    Technology Issues                                 Approaching    .835       2.157   .061
Grade taught    Teacher Issues                                    Approaching    .835       2.183   .058
Grade taught    Issues Specific to Games and Simulations          No             .835       0.979   .432
Grade taught    Issues With Games and Simulations in Education    No             .835       1.493   .194
Grade taught    Incorporation Issues                              No             .835       1.181   .320
Grade taught    Student Issues                                    No             .835       1.420   .219
Gameplay        Issues With Negative Potential Student Outcomes   No             .835       1.002   .418
Gameplay        Technology Issues                                 Yes            .787       3.163   .009
Gameplay        Teacher Issues                                    Approaching    .835       2.159   .061
Gameplay        Issues Specific to Games and Simulations          No             .835       0.549   .739
Gameplay        Issues With Games and Simulations in Education    No             .835       0.583   .713
Gameplay        Incorporation Issues                              No             .835       1.317   .259
Gameplay        Student Issues                                    No             .835       0.741   .594

Appendix C. Results From Pattern Matrix

        Factor
Item    1       2       3       4       5       6       7
1       .199    .009    .326    .383    .072    .11     .185
2       .32     .415    .274    .146    .405    .162    .083
3       .102    .076    .693    .107    .07     .057    .147
4       .017    .092    .308    .106    .15     .581    .271
5       .165    .094    .846    .17     .062    .1      .026
6       .052    .019    .857    .031    .028    .035    .097
7       .17     .038    .617    .085    .124    .036    .079
8       .145    .056    .068    .778    .059    .013    .005
9       .832    .096    .039    .075    .064    .033    .058
10      .916    .02     .059    .005    .065    .039    .047
11      .859    .012    .022    .073    .062    .143    .008
12      .651    .034    .155    .096    .157    .003    .018
13      .81     .017    .012    .01     .227    .239    .049
14      .756    .012    .026    .107    .185    .108    .009
15      .009    .21     .225    .459    .027    .104    .128
16      .049    .067    .282    .615    .109    .113    .162
17      .362    .006    .042    .185    .507    .011    .133
18      .16     .068    .05     .136    .662    .059    .046
19      .268    .007    .013    .035    .67     .099    .03
20      .026    .034    .242    .647    .072    .133    .541
21      .079    .005    .038    .116    .05     .029    .842
22      .006    .072    .216    .026    .168    .144    .673
23      .028    .131    .026    .114    .71     .058    .201
24      .048    .747    .15     .115    .204    .236    .021
25      .149    .589    .239    0       .246    .202    .131
26      .023    .75     .104    .122    .068    .124    .004
27      .055    .754    .128    .295    .001    .051    .132
28      .046    .614    .105    .045    .124    .149    .08
29      .076    .206    .258    .724    .321    .084    .229
30      .054    .281    .159    .359    .722    .175    .094
31      .035    .037    .161    .004    .257    .892    .015
32      .026    .032    .114    .025    .121    .898    .043

Declaration of Conflicting Interests


The authors declared no potential conflicts of interest with respect to the research,
authorship, and/or publication of this article.

Funding
The authors received no financial support for the research, authorship, and/or publication
of this article.
References
Abbiss, J. (2008). Rethinking the “problem” of gender and IT schooling: Discourses in
literature. Gender and Education, 20, 153–165.
Aldrich, C. (2005). Learning by doing: A comprehensive guide to simulations, computer
games, and pedagogy in e-learning and other educational experiences. San Francisco,
CA: Pfeiffer.
American Educational Research Association, American Psychological Association, and
National Council on Measurement in Education. (1999). Standards for educational and
psychological testing. Washington, DC: American Psychological Association.
Annetta, L., Mangrum, J., Holmes, S., Collazo, K., & Meng-Tzu, C. (2009). Bridging
realty to virtual reality: Investigating gender effect and student engagement on learn-
ing through video game play in an elementary school classroom. International Journal
of Science Education, 31(8), 1091–1113.
Arrindell, W. A., & Van der Ende, J. (1985). Cross-sample invariance of the structure of
self-reported distress and difficulty in assertiveness. Advances in Behavior Research and
Therapy, 7, 205–243.
Baek, Y. K. (2008). What hinders teachers in using computer and video games in the
classroom? Exploring factors inhibiting the uptakes of computer and video games.
CyberPsychology & Behavior, 11(6), 665–671.
Barko, T., & Sadler, T. (2013). Practicality in virtuality: Finding student meaning in video
game education. Journal of Science Education and Technology, 22, 124–132.
Beatty, P. (2004). The dynamics of cognitive interviewing. In S. Presser, J. Rothgeb &
M. Couper, et al. (Eds.), Methods for testing and evaluating survey questionnaires.
New York, NY: Wiley.
Becker, K., & Jacobsen, D. (2005). Games for learning: Are schools ready for what’s to
come? Proceedings of DIGRA 2005 Conference, 16–20 June 2005. Vancouver, Canada.
Biernacki, P., & Waldorf, D. (1981). Snowball sampling: Problems and techniques of
chain referral sampling. Sociological Methods and Research, 10(2), 141–163.
Boyle, E., Connolly, T., Hainey, T., & Boyle, J. (2012). Engagement in digital
entertainment games: A systematic review. Computers in Human Behavior, 28(2012),
771–780.
Bourgonjon, J., Valcke, M., Soetaert, R., de Wever, B., & Schellens, T. (2011).
Parental acceptance of digital game-based learning. Computers & Education, 57(2011),
1434–1444.
Buckingham, D. (2003). Media education: Literacy, learning, and contemporary culture.
Cambridge, England/Malden, MA: Polity Press/Blackwell Publishing.
Charsky, D., & Mims, C. (2008). Integrating commercial off-the-shelf video games into
school curriculums. Tech Trends, 52(5), 38–44.
Chioncel, N. E., Van Der Veen, R. G. W., Wildemeersch, D., & Jarvis, P. (2003). The
validity and reliability of focus groups as a research method in adult education.
International Journal of Lifelong Education, 22(5), 495–517.
De Aguilera, M., & Mendiz, A. (2003). Video games and education. ACM Computers in
Entertainment, 1(1), 1–14.
Egenfeldt-Nielsen, S. (2004). Practical barriers in using educational computer games. On
the Horizon, 12(1), 18–21.
Egenfeldt-Nielsen, S. (2010). The challenges to diffusion of educational computer games.
Proceedings of ECGBL 2010, The 4th European Conference on Games Based
Learning, Copenhagen, Denmark.
Ferdig, R., & Boyer, J. (2007). Can game development impact academic achievement?
Retrieved from http://thejournal.com/articles/2007/10/25/can-game-development-impact-academic-achievement.aspx?sc_lang=en
Gee, J. (2003). What video games have to teach us about learning and literacy. New York,
NY: Palgrave Macmillan.
Gee, J., & Levine, M. (2008, September 17). Let's get over the slump. Education Week,
28(4), 28–32.
Germann, P., & Sasse, C. (1997). Variations in concerns and attitudes of science teachers
in an educational technology development program. The Journal of Computers in
Mathematics and Science, 16(2–3), 405–423.
Grant, J., & Davis, L. (1997). Focus on quantitative methods: Selection and use
of content experts for instrument development. Research in Nursing & Health, 20,
269–274.
Greenberg, B., Sherry, J., Lachlan, K., Lucas, K., & Holmstrom, A. (2010).
Orientations to video games among gender and age groups. Simulations & Gaming,
41(2), 238–259.
Guadagnoli, E., & Velicer, W. F. (1988). Relation of sample size to the stability of com-
ponent patterns. Psychological Bulletin, 103, 265–275.
Hainey, T., Connolly, T., Stansfield, M., & Boyle, E. (2011). The differences in motiv-
ations of online game players and offline game players: A combined analysis of three
studies at higher education level. Computers & Education, 57(2011), 2197–2211.
Halverson, R. (2005). What can K-12 school leaders learn from video games and
gaming? Innovate, 1(6). Retrieved from http://www.innovateonline.info/index.php?view=article&id=81
Hamlen, K. (2010). Re-examining gender differences in video game play: Time spent and
feelings of success. Journal of Educational Computing Research, 43(3), 293–308.
Jensen, J., & De Castell, S. (2010). Gender, simulation, and gaming: Research review and
redirections. Simulation Gaming, 41(1), 51–71.
Johanson, G., & Brooks, G. (2010). Initial scale development: Sample size for pilot
studies. Educational and Psychological Measurement, 70(3), 394–400.
Joiner, R., Iacovides, J., Owen, M., Gavin, C., Clibbery, S., Darling, J., . . . Drew, B.
(2011). Digital games, gender and learning in engineering: Do females benefit as
much as males? Journal of Science Education and Technology, 20, 178–185.
Jones, K. (1998). What are we talking about? Simulation & Gaming, 29(3), 314–320.
Jones, J., & Hunter, D. (1995). Consensus methods for medical and health services
research. BMJ, 311, 376–380.
Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39, 31–36.
Ke, F. (2008). A case study of computer gaming for math: Engaged learning from game-
play? Computers & Education, 51(4), 1609–1620.
Kebritchi, M. (2010). Factors affecting teachers’ adoption of educational computer
games: A case study. British Journal of Educational Technology, 41(2), 256–270.
Kenny, R., & Gunter, G. (2011). Factors affecting adoption of video games in the class-
room. Journal of Interactive Learning Research, 22(2), 259–276.
Kenny, R., & McDaniel, R. (2011). The role teachers’ expectations and value assessments
of video games play in their adopting and integrating them into their classrooms.
British Journal of Educational Technology, 42(2), 197–213.
Kerlinger, F. (1974). Foundations of behavioural research. New York, NY: Holt, Rinehart
and Winston.
Kiili, K. (2005). Educational game design: Experiential gaming model revised (Research
Report 4, pp. 1–12). Pori, Finland: Tampere University of Technology. Retrieved
from http://amc.pori.tut.fi/publications/EducationalGameDesign.pdf
King, K. (2002). Educational technology professional development as transformative
learning opportunities. Computers & Education, 39(3), 283–297.
Klabbers, J. G. (2009). Terminological ambiguity: Game and simulation. Simulation &
Gaming, 40(4), 446–463.
Koh, E., Kin, Y., Wadhwa, B., & Lim, J. (2011). Teacher perceptions of games in
Singapore schools. Retrieved from http://sag.sagepub.com/content/early/2011/04/28/1046878111401839
Kotrlik, J., & Redmann, D. (2009). Analysis of teachers’ adoption of technology for use
in instruction in seven career and technical education programs. Career and Technical
Education Research, 34(1), 47–77.
Lean, J., Moizer, J., Towler, M., & Abbey, C. (2006). Simulations and games: Use and
barriers in higher education. Active Learning in Higher Education, 7(3), 227–242.
Lim, C. (2008). Global citizenship education, school curriculum, and games: Learning
mathematics, English and science as a global citizen. Computers & Education,
51(2008), 1073–1093.
Mayer, I., Warmelink, H., & Bekebrede, G. (2012). Learning in a game-based virtual
environment: A comparative evaluation in higher education. European Journal of
Engineering Education, 38(1), 85–106.
McCall, J. (2012). Navigating the problem space: The medium of simulation games in the
teaching of history. History Teacher, 46(1), 9–28.
Moizer, J., Lean, J., Towler, M., & Abbey, C. (2009). Simulations and games:
Overcoming the barriers to their use in higher education. Active Learning in Higher
Education, 10(3), 207–224.
Niederhauser, D. S., & Stoddart, T. (2001). Teachers’ instructional perspectives and use
of educational software. Teaching and Teacher Education, 17(1), 15–31.
Noy, C. (2008). Sampling knowledge: The hermeneutics of snowball sampling
in qualitative research. International Journal of Social Research Methodology, 11(4),
327–344.
Nunnally, J. (1978). Psychometric theory. New York, NY: McGraw-Hill.
Oblinger, D. (2004). The next generation of educational engagement. Journal of
Interactive Media in Education, 2004(8), 1–18.
Padilla-Walker, L., Nelson, L., Carroll, J., & Jensen, A. (2010). More than just a game:
Video game and internet use during emerging adulthood. Journal of Youth and
Adolescence, 39, 103–110.
Penrod, J., Preston, D., Cain, R. E., & Starks, M. T. (2003). A discussion of chain referral
as a method of sampling hard-to-reach populations. Journal of Transcultural Nursing,
14(2), 100–107.
Prensky, M. (2001). Digital game based learning. New York, NY: McGraw-Hill.
Rabiee, F. (2004). Focus-group interview and data analysis. Proceedings of the Nutrition
Society, 63, 655–660.
Reese, D. (2007). First steps and beyond: Serious games as preparation for future learn-
ing. Journal of Educational Multimedia and Hypermedia, 16(3), 283–300.
Rice, J. (2007). New media resistance: Barriers to implementation of computer video
games in the classroom. Journal of Educational Multimedia and Hypermedia, 16(3),
249–261.
Rieber, L. P. (1996). Seriously considering play: Designing interactive learning environ-
ments based on the blending of microworlds, simulations, and games. Educational
Technology Research & Development, 44(2), 43–45.
Rieber, L. P., & Noah, D. (2008). Games, simulations, and visual metaphors in educa-
tion: Antagonism between enjoyment and learning. Educational Media International,
45(2), 77–92.
Ritzhaupt, A., Gunter, G., & Jones, G. (2010). Survey of commercial off-the-shelf video
games: Benefits and barriers in formal educational settings. International Journal of
Instructional Technology and Distance Learning, 7, 45–55.
Roberts, D., & Foehr, U. (2008). Trends in media use. The Future of Children, 18(1),
11–37.
Robertson, J. (2012). Making games in the classroom: Benefits and gender concerns.
Computers & Education, 59(2012), 385–398.
Rogers, E. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.
Rosas, R., Nussbaum, M., Cumsille, P., Marianov, V., Correa, M., Flores, P., . . . Salinas,
M. (2003). Beyond Nintendo: Design and assessment of educational video games for
first and second grade students. Computers & Education, 40(2003), 71–94.
Royle, K., & Colfer, S. (2010). Computer games and learning – Where next? The breadth
and scope of the use of computer games in education. Walsall, England: CeDARE:
University of Wolverhampton.
Russell, C., & Shepherd, J. (2010). Online role-play environments for higher education.
British Journal of Educational Technology, 41(6), 992–1002.
Sanchez, J. (2009). Barriers to student learning in second life. Library Technology Reports,
45(2), 29–34.
Sanford, R., Ulicsak, M., Facer, K., & Rudd, T. (2006). Teaching with games:
Using commercial off-the-shelf computer games in formal education. Bristol, UK:
Futurelab.
Sauvé, L., Renaud, L., Kaufman, D., & Marquis, J. (2007). Distinguishing between
games and simulations: A systematic review. Journal of Educational Technology &
Society, 10(3), 247–256.
Schrum, L., Burbank, M., Engle, J., Chambers, J., & Glassett, K. (2005). Post-secondary
educators’ professional development: Investigation of an online approach to enhan-
cing teaching and learning. The Internet and Higher Education, 8(4), 279–289.
Shaffer, D. (2006). How computer games help children learn. New York: Palgrave
Macmillan.
Shaffer, D., Squire, K., Halverson, R., & Gee, J. (2005). Video games and the future of
learning. Phi Delta Kappan, 87(2), 104–111.
Simpson, E., & Stansberry, S. (2008). Video games and teacher development: Bridging the
gap in the classroom. In K. McFerrin, et al. (Eds.), Proceedings of Society for
Information Technology & Teacher Education International Conference 2008
(pp. 5326–5331). Chesapeake, VA: AACE.
Sliney, A., O’Mullane, J., & Murphy, D. (2009). Do the benefits of educational games and
virtual patient finally outweigh the drawbacks? Eurographics Ireland 2009. Retrieved
from http://gv2.cs.tcd.ie/egirl09/papers/05.pdf
Smarkola, C. (2007). Technology acceptance predictors among student teachers and experi-
enced classroom teachers. Journal of Educational Computing Research, 37(1), 65–82.
Squire, K. (2005). Changing the game: What happens when video games enter the
classroom? Innovate: Journal of Online Education. Retrieved from http://www.innovateonline.info/index.php?view=article&id=82
Squire, K. (2006). From content to context: Videogames as designed experience.
Educational Researcher, 35(8), 19–29.
Summers, G. (2004). Today’s business simulation industry. Simulation & Gaming, 35(2),
208–241.
Tapscott, D. (1998). Growing up digital: The rise of the net generation. New York, NY:
McGraw-Hill.
Taylor, T. (2008). Video game play and receptivity to using virtual worlds for language
learning (unpublished master’s thesis). George Mason University, Fairfax, VA.
Torrente, J., del Blanco, A., Marchiori, E., Moreno-Ger, P., & Fernandez-Manjon, B.
(2010, April). Introducing educational games in the learning process. Paper presented at
IEEE EDUCON Education Engineering 2010, Madrid, Spain.
Torrente, J., Moreno-Ger, P., Martinez-Ortiz, I., & Fernandez-Manjon, B. (2009). Integration
and deployment of educational games in e-learning environments: The learning object
model meets educational gaming. Educational Technology & Society, 12(4), 359–371.
Van Someren, M., Barnard, Y., & Sandberg, J. (1994). The think aloud method: A practical
guide to modelling cognitive processes. London, England: Academic Press.
Virvou, M., Katsionis, G., & Manos, K. (2005). Combining software games with education:
Evaluation of its educational effectiveness. Educational Technology & Society, 8(2), 54–65.
Vogt, D. S., King, D. W., & King, L. A. (2004). Focus groups in psychological assess-
ment: Enhancing content validity by consulting members of the target population.
Psychological Assessment, 16(3), 231–243.
Vos, L., & Brennan, R. (2010). Marketing simulation games: Student and lecturer
perspectives. Marketing Intelligence & Planning, 28(7), 882–897.
Warburton, S. (2009). Second Life in higher education: Assessing the potential for and
the barriers to deploying virtual worlds in learning and teaching. British Journal of
Educational Technology, 40(3), 414–426.
Wexler, S., Aldrich, C., Johannigman, J., Oehlert, M., Quinn, C., & van Barneveld, A.
(2007). Immersive learning simulations: The demand for, and demands of, simulations,
scenarios, and serious games. Santa Rosa, CA: The eLearning Guild.
Wilson, B. (2006). Gender differences in types of assignments preferred: Implications for
computer science instruction. Journal of Educational Computing Research, 34(3),
245–255.
Wolfe, J., & Crookall, D. (1998). Developing a scientific knowledge of simulation/
gaming. Simulation & Gaming, 29(1), 7–19.
Author Biographies
Lenora Jean Justice is an assistant professor at Morehead State University
where she teaches graduate educational technology courses and chairs doctoral
student research committees. Additionally, she is a consultant to several career
pathway and educational training organizations. Her research interests include
using games and simulations in education and training, mobile learning, flipped
instruction, and using the backchannel in education and training.

Albert D. Ritzhaupt is an associate professor of educational technology in the
School of Teaching and Learning at the University of Florida. His primary
research areas focus on the design and development of technology-enhanced
learning environments and technology integration in education. His publications
have appeared in multiple venues.
