Beyond the Traditional: A Systematic Review of Digital Game-Based Assessment for Students’ Knowledge, Skills, and Affections
Abstract
1. Introduction
1.1. Traditional SA Methods
1.2. Digital Game-Based Assessment
1.3. Previous Reviews and the Present Study
2. Method
2.1. Database and Search Terms
2.2. Inclusion and Exclusion Criteria
2.3. Study Selection
2.4. Data Analysis
3. Results
3.1. Overview of the Participants in the Included Studies
3.1.1. Participants’ Country Regions
3.1.2. Numbers of Participants
3.1.3. Educational Levels of the Participants
3.2. Characteristics of the Games Used in the Included Studies
3.2.1. Platforms of the Games
3.2.2. Genres of the Games
3.2.3. Commercial Access to the Games
3.3. Assessment Contents and Methods of the Included Studies
3.3.1. Content of Assessment
3.3.2. Method of Assessment
3.4. Data Analysis Techniques and Results of the Included Studies
3.4.1. Data Analysis Techniques
3.4.2. Data Analysis Results
4. Discussion
4.1. Current States of DGBA Studies
4.1.1. The Overview of the Participants in the DGBA Studies
4.1.2. The Characteristics of the Games Used in the DGBA Studies
4.1.3. The Assessment Contents and Methods Used in the DGBA Studies
4.1.4. The Data Analysis Techniques and Results Reported in the DGBA Studies
4.2. Recommendations for Future Studies
4.3. Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- OECD. PISA 2018 Assessment and Analytical Framework; OECD: Paris, France, 2019. [Google Scholar]
- Puntambekar, S.; Hubscher, R. Tools for scaffolding students in a complex learning environment: What have we gained and what have we missed? Educ. Psychol. 2005, 40, 1–12. [Google Scholar] [CrossRef]
- Roschelle, J.; Dimitriadis, Y.; Hoppe, U. Classroom orchestration: Synthesis. Comput. Educ. 2013, 69, 523–526. [Google Scholar] [CrossRef]
- Min, W.; Frankosky, M.H.; Mott, B.W.; Rowe, J.P.; Smith, P.A.M.; Wiebe, E.; Boyer, K.E.; Lester, J.C. DeepStealth: Game-Based Learning Stealth Assessment with Deep Neural Networks. IEEE Trans. Learn. Technol. 2019, 13, 312–325. [Google Scholar] [CrossRef] [Green Version]
- Zhu, S.; Guo, Q.; Yang, H.H. Digital Game-Based Assessment on Student Evaluation: A Systematic Review. In Proceedings of the International Conference on Blended Learning, Nagoya, Japan, 10–13 August 2022; Springer: Cham, Switzerland, 2022; pp. 85–96. [Google Scholar]
- Shute, V.J. Stealth assessment in computer-based games to support learning. Comput. Games Instr. 2011, 55, 503–524. [Google Scholar]
- Gomez, M.J.; Ruipérez-Valiente, J.A.; Clemente, F.J.G. A systematic literature review of digital game-based assessment empirical studies: Current trends and open challenges. arXiv 2022, arXiv:2207.07369. [Google Scholar]
- Alonso-Fernández, C.; Calvo-Morata, A.; Freire, M.; Martínez-Ortiz, I.; Fernández-Manjón, B. Evidence-based evaluation of a serious game to increase bullying awareness. Interact. Learn. Environ. 2020, 1–11. [Google Scholar] [CrossRef]
- Cameron, L.; Wise, S.L.; Lottridge, S.M. The Development and Validation of the Information Literacy Test. Coll. Res. Libr. 2007, 68, 229–237. [Google Scholar] [CrossRef] [Green Version]
- Bodmann, S.M.; Robinson, D.H. Speed and Performance Differences among Computer-Based and Paper-Pencil Tests. J. Educ. Comput. Res. 2004, 31, 51–60. [Google Scholar] [CrossRef]
- Laumer, S.; Eckhardt, A.; Weitzel, T. Online gaming to find a new job–examining job seekers’ intention to use serious games as a self-assessment tool. Ger. J. Hum. Resour. Manag. 2012, 26, 218–240. [Google Scholar] [CrossRef]
- Eysenck, M.W.; Derakhshan, N.; Santos, R.; Calvo, M. Anxiety and cognitive performance: Attentional control theory. Emotion 2007, 7, 336–353. [Google Scholar] [CrossRef] [Green Version]
- Cassady, J.; Johnson, R.E. Cognitive Test Anxiety and Academic Performance. Contemp. Educ. Psychol. 2002, 27, 270–295. [Google Scholar] [CrossRef] [Green Version]
- Chen, F.; Cui, Y.; Chu, M.-W. Utilizing Game Analytics to Inform and Validate Digital Game-based Assessment with Evidence-centered Game Design: A Case Study. Int. J. Artif. Intell. Educ. 2020, 30, 481–503. [Google Scholar] [CrossRef]
- Lu, K.; Yang, H.H.; Shi, Y.; Wang, X. Examining the key influencing factors on college students’ higher-order thinking skills in the smart classroom environment. Int. J. Educ. Technol. High. Educ. 2021, 18, 1. [Google Scholar] [CrossRef]
- Lu, K.; Yang, H.; Xue, H. Investigating the four-level inquiry continuum on college students’ higher order thinking and peer interaction tendencies. Int. J. Innov. Learn. 2021, 30, 358–367. [Google Scholar] [CrossRef]
- Oakleaf, M. Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches. Libr. Acad. 2008, 8, 233–253. [Google Scholar] [CrossRef]
- Bertling, M.; Jackson, G.T.; Oranje, A.; Owen, V.E. Measuring argumentation skills with game-based assessments: Evidence for incremental validity and learning. In Proceedings of the International Conference on Artificial Intelligence in Education, Madrid, Spain, 22–26 June 2015; Springer: Cham, Switzerland, 2015; pp. 545–549. [Google Scholar]
- Kim, Y.J.; Almond, R.G.; Shute, V.J. Applying evidence-centered design for the development of game-based assessments in physics playground. Int. J. Test. 2016, 16, 142–163. [Google Scholar] [CrossRef]
- Margolis, A.; Gavrilova, E.; Kuravsky, L.; Shepeleva, E.; Voitov, V.; Ermakov, S.; Dumin, P. Measuring Higher-Order Cognitive Skills in Collective Interactions with Computer Game. Cult. Psychol. 2021, 17, 90–104. [Google Scholar] [CrossRef]
- Song, Y.; Sparks, J.R. Building a game-enhanced formative assessment to gather evidence about middle school students’ argumentation skills. Educ. Technol. Res. Dev. 2019, 67, 1175–1196. [Google Scholar] [CrossRef]
- Slimani, A.; Elouaai, F.; Elaachak, L.; Yedri, O.B.; Bouhorma, M.; Sbert, M. Learning Analytics Through Serious Games: Data Mining Algorithms for Performance Measurement and Improvement Purposes. Int. J. Emerg. Technol. Learn. iJET 2018, 13, 46–64. [Google Scholar] [CrossRef] [Green Version]
- Qian, M.; Clark, K.R. Game-based Learning and 21st century skills: A review of recent research. Comput. Hum. Behav. 2016, 63, 50–58. [Google Scholar] [CrossRef]
- Weiner, E.J.; Sanchez, D.R. Cognitive ability in virtual reality: Validity evidence for VR game-based assessments. Int. J. Sel. Assess. 2020, 28, 215–235. [Google Scholar] [CrossRef]
- Kiili, K.; Moeller, K.; Ninaus, M. Evaluating the effectiveness of a game-based rational number training-in-game metrics as learning indicators. Comput. Educ. 2018, 120, 13–28. [Google Scholar] [CrossRef]
- Hautala, J.; Heikkilä, R.; Nieminen, L.; Rantanen, V.; Latvala, J.M.; Richardson, U. Identification of reading difficulties by a digital game-based assessment technology. J. Educ. Comput. Res. 2020, 58, 1003–1028. [Google Scholar] [CrossRef]
- DeRosier, M.E.; Thomas, J.M. Establishing the criterion validity of Zoo U’s game-based social emotional skills assessment for school-based outcomes. J. Appl. Dev. Psychol. 2018, 55, 52–61. [Google Scholar] [CrossRef]
- Chuang, T.-Y.; Liu, E.Z.-F.; Shiu, W.-Y. Game-based creativity assessment system: The application of fuzzy theory. Multimedia Tools Appl. 2015, 74, 9141–9155. [Google Scholar] [CrossRef]
- Shute, V.J.; Wang, L.; Greiff, S.; Zhao, W.; Moore, G. Measuring problem solving skills via stealth assessment in an engaging video game. Comput. Hum. Behav. 2016, 63, 106–117. [Google Scholar] [CrossRef]
- Acquah, E.O.; Katz, H.T. Digital game-based L2 learning outcomes for primary through high-school students: A systematic literature review. Comput. Educ. 2019, 143, 103667. [Google Scholar] [CrossRef]
- Tokac, U.; Novak, E.; Thompson, C.G. Effects of game-based learning on students’ mathematics achievement: A meta-analysis. J. Comput. Assist. Learn. 2019, 35, 407–420. [Google Scholar] [CrossRef]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. Ann. Intern. Med. 2009, 151, 264–269. [Google Scholar] [CrossRef] [Green Version]
- Cheng, S.-C.; Hwang, G.-J.; Lai, C.-L. Critical research advancements of flipped learning: A review of the top 100 highly cited papers. Interact. Learn. Environ. 2020, 30, 1751–1767. [Google Scholar] [CrossRef]
- Buford, C.C.; O'Leary, B.J. Assessment of Fluid Intelligence Utilizing a Computer Simulated Game. Int. J. Gaming Comput. Simul. 2015, 7, 1–17. [Google Scholar] [CrossRef]
- Krebs, E.; Jaschek, C.; Von Thienen, J.; Borchart, K.P.; Meinel, C.; Kolodny, O. Designing a video game to measure creativity. In Proceedings of the 2020 IEEE Conference on Games (CoG), Osaka, Japan, 24–27 August 2020; IEEE Press: New York, NY, USA, 2020; pp. 407–414. [Google Scholar]
- Halverson, R.; Owen, V.E. Game-based assessment: An integrated model for capturing evidence of learning in play. Int. J. Learn. Technol. 2014, 9, 111. [Google Scholar] [CrossRef]
- Irava, V.; Pathak, A.; DeRosier, M.; Singh, N.C. Game-Based Socio-Emotional Skills Assessment: A Comparison Across Three Cultures. J. Educ. Technol. Syst. 2019, 48, 51–71. [Google Scholar] [CrossRef]
- Delgado-Gómez, D.; Sújar, A.; Ardoy-Cuadros, J.; Bejarano-Gómez, A.; Aguado, D.; Miguelez-Fernandez, C.; Blasco-Fontecilla, H.; Peñuelas-Calvo, I. Objective Assessment of Attention-Deficit Hyperactivity Disorder (ADHD) Using an Infinite Runner-Based Computer Game: A Pilot Study. Brain Sci. 2020, 10, 716. [Google Scholar] [CrossRef]
- Peters, H.; Kyngdon, A.; Stillwell, D. Construction and validation of a game-based intelligence assessment in minecraft. Comput. Hum. Behav. 2021, 119, 106701. [Google Scholar] [CrossRef]
- Ventura, M.; Shute, V. The validity of a game-based assessment of persistence. Comput. Hum. Behav. 2013, 29, 2568–2572. [Google Scholar] [CrossRef]
- Quiroga, M.; Escorial, S.; Román, F.J.; Morillo, D.; Jarabo, A.; Privado, J.; Hernández, M.; Gallego, B.; Colom, R. Can we reliably measure the general factor of intelligence (g) through commercial video games? Yes, we can! Intelligence 2015, 53, 1–7. [Google Scholar] [CrossRef]
- Wang, D.; Liu, H.; Hau, K.-T. Automated and interactive game-based assessment of critical thinking. Educ. Inf. Technol. 2022, 27, 4553–4575. [Google Scholar] [CrossRef]
- Delgado, M.T.; Uribe, P.A.; Alonso, A.A.; Díaz, R.R. TENI: A comprehensive battery for cognitive assessment based on games and technology. Child Neuropsychol. 2016, 22, 276–291. [Google Scholar] [CrossRef]
- Cutumisu, M.; Chin, D.B.; Schwartz, D.L. A digital game-based assessment of middle-school and college students’ choices to seek critical feedback and to revise. Br. J. Educ. Technol. 2019, 50, 2977–3003. [Google Scholar] [CrossRef]
- Craig, A.B.; DeRosier, M.E.; Watanabe, Y. Differences between Japanese and US children’s performance on “Zoo U”: A game-based social skills assessment. Games Health J. 2015, 4, 285–294. [Google Scholar] [CrossRef]
- Shute, V.J.; Rahimi, S. Stealth assessment of creativity in a physics video game. Comput. Hum. Behav. 2020, 116, 106647. [Google Scholar] [CrossRef]
- Mislevy, R.J.; Haertel, G.D. Implications of Evidence-Centered Design for Educational Testing. Educ. Meas. Issues Prac. 2006, 25, 6–20. [Google Scholar] [CrossRef]
- Van Laar, E.; van Deursen, A.J.; van Dijk, J.A.; de Haan, J. The relation between 21st-century skills and digital skills: A systematic literature review. Comput. Hum. Behav. 2017, 72, 577–588. [Google Scholar] [CrossRef]
- Westera, W.; Nadolski, R.; Hummel, H. Serious Gaming Analytics: What Students´ Log Files Tell Us about Gaming and Learning. Int. J. Serious Games 2014, 1, 35–50. [Google Scholar] [CrossRef] [Green Version]
- Auer, E.M.; Mersy, G.; Marin, S.; Blaik, J.; Landers, R.N. Using machine learning to model trace behavioral data from a game-based assessment. Int. J. Sel. Assess. 2022, 30, 82–102. [Google Scholar] [CrossRef]
- Kim, Y.J.; Knowles, M.A.; Scianna, J.; Lin, G.; Ruipérez-Valiente, J.A. Learning analytics application to examine validity and generalizability of game-based assessment for spatial reasoning. Br. J. Educ. Technol. 2022, 54, 355–372. [Google Scholar] [CrossRef]
- Kim, Y.J.; Ifenthaler, D. Game-based assessment: The past ten years and moving forward. In Game-Based Assessment Revisited; Springer: Cham, Switzerland, 2019; pp. 3–11. [Google Scholar]
- Kinzie, M.B.; Joseph, D.R.D. Gender differences in game activity preferences of middle school children: Implications for educational game design. Educ. Technol. Res. Dev. 2008, 56, 643–663. [Google Scholar] [CrossRef]
- Breuer, J.S.; Bente, G. Why so serious? On the relation of serious games and learning. Eludamos J. Comput. Game Cult. 2010, 4, 7–24. [Google Scholar] [CrossRef]
Inclusion Criteria | Exclusion Criteria |
---|---|
The research must be in the field of education or educational psychology. | All non-educational fields, such as medical, workplace, etc. |
The study must be written in English. | Studies not written in English. |
The use of digital games must be for the purpose of SA. | Other research purposes, such as game-based learning, game design, etc. |
The participants of the study must be students. | The participants are not students, such as adults in the workplace, patients in medical settings, teachers, etc. |
The study must have conducted an empirical inquiry using a digital game. | Other forms of research, such as framework proposal, qualitative study, case study, content analysis, etc. |
The data collected were derived from the player’s click interactions with the game. | Data obtained outside the game, such as questionnaires, physiological (such as eye movement) or neurological (such as electroencephalogram) data. |
Game Genre | Definition of Game Genre | Example Studies |
---|---|---|
Adventure | Explore the unknown and resolve riddles using narrative hints. | Buford & O’Leary, 2015 [34]; Min et al., 2019 [4] |
Simulation | Attempt to simulate as closely as possible a variety of real-world situations. | Slimani et al., 2018 [22]; Weiner & Sanchez, 2020 [24] |
Strategy | Establish a setting that encourages complex problem-solving and analysis while giving players full control over how they interact with, manage, and deploy game characters and objects. | Krebs et al., 2020 [35]; Halverson & Owen, 2014 [36]
Role-playing | Provide players the opportunity to engage with the people in the game’s scenario while taking on the roles of individuals living in a fictional world. | Irava et al., 2019 [37]; DeRosier & Thomas, 2018 [27] |
Educational | Designed for a specific discipline or subject, with explicit knowledge-learning content. | Kiili et al., 2018 [25]; Chen et al., 2020 [14]
Puzzle | Exercise players’ logic and perception by engaging their eyes, hands, and minds. | Delgado-Gómez et al., 2020 [38]; Chuang et al., 2015 [28]
Content of Assessment | Definition of the Assessment Content | Example Studies |
---|---|---|
Discipline-specific knowledge | Specific knowledge in a particular subject in school | Hautala et al., 2020 [26]; Kiili et al., 2018 [25] |
Affective/psychological states | Involves attitude, awareness, perception, control, and emotion | Alonso-Fernández et al., 2020 [8]; Ventura & Shute, 2013 [40] |
Contemporary competencies | Higher-order skills necessary for students in the 21st century | Shute et al., 2016 [29]; Song & Sparks, 2019 [21]
Cognitive ability | The processing, storage, and retrieval of information by the human brain | Quiroga et al., 2015 [41]; Delgado-Gómez et al., 2020 [38]
Method of Assessment | Definition of the Assessment Method | Example Studies |
---|---|---|
Summative assessment using final scores | Use the game’s final scores, including game coins and game score, as a gauge for the assessment content. | Song & Sparks, 2019 [21]; Wang et al., 2022 [42]
Summative assessment using process data | Utilize process information as indicators for direct assessment, such as playtime or the number of correct replies. | Kiili et al., 2018 [25]; Tenorio Delgado et al., 2016 [43] |
Formative assessment using process data | Calculate indicators based on the player’s process data through the game’s built-in formula. | Cutumisu et al., 2019 [44]; Craig et al., 2015 [45] |
Formative assessment modeling with process data | Mine feature variables using process data to build prediction models. | Chen et al., 2020 [14]; Shute & Rahimi, 2021 [46] |
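The last category above, formative assessment modeling with process data, can be sketched in a few lines: hypothetical process indicators mined from game logs serve as features in a supervised model that predicts an external skill score. The feature names and the data-generating rule below are invented for illustration, and a plain least-squares regression stands in for the richer models (random forests, Bayesian networks, deep neural networks) used in the reviewed studies.

```python
# Illustrative sketch only: synthetic "process data" (playtime, wrong
# attempts, level retries) predicting an external skill score. All names
# and the data-generating rule are invented, not from any reviewed study.
import numpy as np

rng = np.random.default_rng(42)
n = 300

# Hypothetical in-game process indicators mined from click logs.
playtime = rng.uniform(5, 60, n)          # minutes played
wrong_attempts = rng.integers(0, 20, n)   # number of incorrect answers
retries = rng.integers(0, 10, n)          # number of level restarts

# Synthetic external score: fewer errors and retries -> higher score.
score = 100 - 2.0 * wrong_attempts - 1.5 * retries + rng.normal(0, 3, n)

# Design matrix with an intercept column; fit by ordinary least squares.
X = np.column_stack([np.ones(n), playtime, wrong_attempts, retries])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
predicted = X @ coef

# Report fit quality as R^2, one of the metrics reported in the review.
ss_res = np.sum((score - predicted) ** 2)
ss_tot = np.sum((score - score.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

In the reviewed studies the target variable is typically an external validated test, and the features are engineered from fine-grained click logs rather than three summary counts.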
Method of Assessment | Number of Studies Using the Supervised Model Technique | Number of Studies Using the Statistical Analysis Technique |
---|---|---|
Summative assessment using final scores | 0 | 18 |
Summative assessment using process data | 0 | 12 |
Formative assessment using process data | 0 | 12 |
Formative assessment modeling with process data | 18 | 11 |
Data Analysis Technique | Number of Studies Using the Technique | Percentage of the Total Studies |
---|---|---|
Supervised models | |
Linear regression | 5 | 10%
Elastic net regression | 1 | 2%
Bayesian ridge regression | 1 | 2%
Mixed linear model | 1 | 2%
Logistic regression | 2 | 4%
K-nearest neighbor | 1 | 2%
Decision tree | 3 | 6%
Random forest | 4 | 8%
Gradient boosting decision tree | 2 | 4%
AdaBoost | 1 | 2%
Support vector machine | 4 | 8%
Conditional random field | 1 | 2%
Naïve Bayes | 3 | 6%
Bayesian network | 2 | 4%
Dynamic Bayesian network | 1 | 2%
Deep neural network | 3 | 6%
Long short-term memory | 2 | 4%
Statistical analysis | |
Correlation analysis | 29 | 58%
Variance analysis | 11 | 22%
EM clustering | 2 | 4%
K-means clustering | 1 | 2%
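Correlation analysis, by far the most frequently used technique above (58% of studies), typically correlates an in-game indicator with an external test score. A minimal illustration on synthetic data (every number here is made up):

```python
# Minimal illustration: Pearson correlation between a synthetic in-game
# indicator and a synthetic external test score. Values are invented.
import numpy as np

rng = np.random.default_rng(7)
n_students = 50

external_score = rng.normal(70, 10, n_students)
# In-game indicator constructed to correlate with the external score.
in_game_indicator = 0.8 * external_score + rng.normal(0, 6, n_students)

r = np.corrcoef(in_game_indicator, external_score)[0, 1]
print(f"Pearson r = {r:.3f}")
```

A moderate-to-strong positive r is the kind of validity evidence the reviewed studies report (see the "Correlation with external tests" rows in the metrics table).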
Assessment Metric | Range of the Metric | Number of Studies |
---|---|---|
R² | 0.260–0.431 | 4
 | >1 | 1
MAE | 0.540–0.640 | 2
 | >1 | 1
RMSE | 0.506–0.770 | 2
Accuracy | 0.637–0.715 | 2
 | 0.900–0.980 | 2
Recall | 0.980 | 1
Sensitivity | 0.620 | 1
 | 0.925 | 1
Specificity | 0.540 | 1
False positive rate | 0.310 | 1
Correlation with external tests | 0.203–0.410 | 4
 | 0.530–0.670 | 3
 | 0.970 | 1
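The metrics tabulated above follow standard definitions and are easy to compute from prediction vectors. A sketch on made-up numbers (the vectors below are purely illustrative):

```python
# Standard definitions of the metrics in the table above, computed on
# made-up prediction vectors for illustration.
import numpy as np

# Regression metrics: MAE, RMSE, R^2.
y_true = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_pred = np.array([2.8, 5.4, 6.5, 9.3, 10.6])

mae = np.mean(np.abs(y_true - y_pred))
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
ss_res = np.sum((y_true - y_pred) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Classification metrics on made-up binary labels.
labels = np.array([1, 1, 0, 0, 1])
preds = np.array([1, 0, 0, 0, 1])
accuracy = np.mean(labels == preds)
# Recall (a.k.a. sensitivity): true positives over actual positives.
recall = np.sum((labels == 1) & (preds == 1)) / np.sum(labels == 1)

print(f"MAE={mae:.3f} RMSE={rmse:.3f} R2={r2:.4f} "
      f"acc={accuracy:.2f} recall={recall:.3f}")
```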
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Zhu, S.; Guo, Q.; Yang, H.H. Beyond the Traditional: A Systematic Review of Digital Game-Based Assessment for Students’ Knowledge, Skills, and Affections. Sustainability 2023, 15, 4693. https://doi.org/10.3390/su15054693