
Computers and Education: Artificial Intelligence 8 (2025) 100392

Contents lists available at ScienceDirect

Computers and Education: Artificial Intelligence


journal homepage: www.sciencedirect.com/journal/computers-and-education-artificial-intelligence

Integrating AI-based adaptive learning into the flipped classroom model to enhance engagement and learning outcomes

Jozsef Katona a,b,c,d,*, Klara Ida Katonane Gyonyoru e

a Department of Software Development and Application, Institute of Computer Engineering, University of Dunaujvaros, Dunaujvaros, Hungary
b Institute of Electronics and Communication Systems, Kandó Kálmán Faculty of Electrical Engineering, Obuda University, Budapest, Hungary
c GAMF Faculty of Engineering and Computer Science, John Von Neumann University, Kecskemet, Hungary
d Department of Applied Quantitative Methods, Faculty of Finance and Accountancy, Budapest University of Economics and Business, Budapest, Hungary
e Department of Organizational Development and Communication Science, Institute of Social Sciences, University of Dunaujvaros, Dunaujvaros, Hungary

A R T I C L E  I N F O

Keywords:
AI-based adaptive learning
Flipped classroom
Programming education
Personalized learning
Mixed-methods research

A B S T R A C T

This study explores the integration of an AI-powered adaptive learning system within the flipped classroom model in the context of programming education. The increasing demand for the acquisition of programming skills in the digital era drives the development of innovative pedagogical approaches to help students overcome the difficulties in mastering programming concepts. The study investigates how AI-based adaptive feedback influences student engagement, motivation, and learning outcomes. The study was conducted for 13 weeks with two groups of undergraduate students. In one of the groups, the traditional flipped classroom model was applied, while in the other group, AI-driven adaptive learning tools were introduced. In the research methodology, a mixed-methods approach was adopted, integrating quantitative analyses with qualitative data. These showed that the experimental group demonstrated significant improvements in learning results and motivation, underscoring the value of real-time, personalized feedback provided through the AI system. The qualitative findings revealed that with the AI system there was an increase in student autonomy; thus, there was increased responsibility among students to collaborate during in-class participation, which helped foster a more dynamic and cooperative learning environment. On the whole, these findings reveal important implications of the integration of AI into flipped classrooms, while at the same time showing how the modernization of programming education can achieve a better overall student learning experience.

* Corresponding author. Department of Software Development and Application, Institute of Computer Engineering, University of Dunaujvaros, Dunaujvaros, Hungary.
E-mail addresses: [email protected] (J. Katona), [email protected] (K.I.K. Gyonyoru).
https://doi.org/10.1016/j.caeai.2025.100392
Received 29 December 2024; Received in revised form 9 March 2025; Accepted 9 March 2025; Available online 10 March 2025
2666-920X/© 2025 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

1. Introduction

The role of the subjects of computer science and programming has been constantly increasing in the field of education. Digital literacy, to which the competency of programming belongs, has become one of the basic preconditions for succeeding in the labour market. The rapid evolution of information technology has a remarkable impact on economies as well as society and has made programming one of the fundamental competencies enabling creative problem-solving, innovation, and technological development. Apart from software development, programming skills have now become a significant requirement in several fields, including engineering, science, finance, and many other important areas of our everyday lives.

One of the most innovative pedagogical changes of the last few decades is the flipped classroom model. It revolutionizes traditional teaching as students can acquire theoretical input outside the classroom through videos, online resources, or some interactive platforms, while full classroom time is spent on practical applications, problem-solving collectively, and detailed topic discussions (Bishop & Verleger, 2013; Freeman et al., 2014; Hodges, 2020). The flipped classroom model is based on an active learning approach, which facilitates student engagement and deeper apprehension of the course material through direct student participation in in-person classroom activities (Odum et al., 2021; Prince, 2004). However, it is effective under certain organizational preconditions, such as the availability of technological resources, students with self-regulation skills regarding their learning habits, and the readiness of educators to ensure flexibility and adaptability for curriculum design (Abeysekera & Dawson, 2015; Akçayır & Akçayır, 2018).

The importance of computer science and programming in education is expanding rapidly, and having programming skills is regarded as a crucial competency in the contemporary labour market. Due to the integration of automation, artificial intelligence and data science, critical thinking and problem-solving skills are essential to be competitive in the workforce. However, programming education is often faced with student disengagement, lack of personalized support, and difficulties with abstraction. Recent research on enhancing programming education has examined the effectiveness of AI-based adaptive learning systems in meeting these challenges. According to the results, by providing students with real-time feedback and scaffolding, AI-driven systems can lead to an increase in student engagement and motivation and to improved learning outcomes (Heffernan et al., 2016; Holmes et al., 2019; Yilmaz & Yilmaz, 2023).

In programming education, the implementation of AI-based adaptive learning systems within the flipped classroom model can be effective in allowing more flexibility, where students engage with the theoretical material and foundational concepts independently at their own learning pace, while actual classroom sessions are devoted to addressing specific programming challenges and providing immediate feedback and group problem-solving (Bishop & Verleger, 2013; Bergmann, 2012; O'Flaherty & Phillips, 2015; Lopez, 2022). Programming education requires extensive practice to master complex logical structures and abstractions, and there must be a balance of individualized guidance with practical application, which can be addressed effectively in the flipped classroom model (Hendrik & Hamzah, 2021; Kim et al., 2014; Lameras et al., 2012).

1.1. Role of artificial intelligence in education

AI is not only a tool for learning, but it has become a significant component of education through its integration with adaptive learning systems. These systems automatically manage content, pace, and feedback according to the dynamic progress and needs of individual learners, providing customized learning experiences accompanied by real-time guidance (Heffernan et al., 2016; Holmes et al., 2019). Adaptive learning technologies can be applied well to subjects like programming, where students usually have very different levels of pre-knowledge and require individualized support in handling abstract concepts and complex problem-solving tasks (Fernandes et al., 2023, pp. 1–6; Seo et al., 2021). AI-driven adaptive learning systems enhance traditional models of education through real-time adaptability, scaffolding, and feedback. They align with the readiness levels and diverse motivations of the students by ensuring that each learner makes optimum progress. In the case of programming education, this involves providing immediate scaffolding for challenges, for example, debugging syntax errors or designing algorithms, and providing customized learning paths to close understanding gaps (Gligorea et al., 2019; Wang et al., 2023).

1.2. Integration of flipped classroom and AI

The integration of AI-driven adaptive learning into the flipped classroom model leads to a hybrid approach that draws an advantage from both perspectives. While the concept of the flipped classroom works to facilitate active learning with cooperation, the adaptive learning systems introduce personalization and scalability to meet varied learning needs (Almassri & Zaharudin, 2023). It is assumed by research that such integration can enhance engagement, improve learning outcomes, and provide an inclusive educational experience (Seo et al., 2021; Strielkowski et al., 2024). This hybrid model leverages the structure of the flipped classroom in facilitating collaborative problem-solving, while the adaptability of AI is utilized to address individual challenges and optimize learning trajectories.

1.3. Challenges and purpose of the study

The benefits of this hybrid model come with challenges. Students have to take more responsibility for independent learning, especially in processing theoretical material without a teacher. Accessible technological resources include reliable internet and digital devices, while sufficient self-regulation skills are needed to use them for learning purposes (Abeysekera & Dawson, 2015; Akçayır & Akçayır, 2018). Educators must face new demands as well, for example, deeper pedagogical preparation, flexibility in curriculum design, and the ability to provide differential support during in-class sessions (Brown, 2018; Lo, 2017).

The integration of AI-driven adaptive learning into the flipped classroom model also raises some theoretical challenges and opportunities that should be explored in the context of student engagement and motivation. The most challenging issue, however, concerns the balance between the degree of student autonomy and the amount of support the AI system will provide. Even though flipped classrooms promote the independent work of students, an AI system could enhance or suppress this learning method depending on how it is used. While such systems can give timely and individual feedback, they can also make students dependent on automated help, rather than engaging with content in a critical manner (Lo et al., 2017).

Besides, the integration of AI into the flipped classroom changes the pedagogical role of teachers from knowledge dispensers to learning facilitators, which changes the pedagogical demands on educators. AI-enhanced flipped classrooms also create a need for robust technological infrastructure and for strategic approaches to equity issues resulting from students' differing access to technological tools, which can impede learning outcomes (Brown, 2018, pp. 11–21; Er-Rafyg et al., 2024, pp. 329–342).

Despite these challenges, the use of AI in a flipped classroom promises to promote much more personalized and scalable learning environments, allowing students to progress in their learning process at an individual pace while receiving targeted help. This theoretical framework addresses these challenges with attention to teacher adaptability and student agency, and further explores the potential of AI to establish conditions in which students are more motivated through an environment of personalized learning experiences.

The features within an adaptive learning system, such as an AI-based system, ensure real-time feedback on progress through continuous adjustment of the level of difficulty or change of content, using student progress while suggesting an individual learning path. The purpose of these individualized mechanisms is to address every single student's problems in developing better engagement and motivation toward mastering complex concepts. The research focuses on how such systems can work with the flipped classroom model to enhance the effectiveness of learning experiences in the domain of programming education. The adaptation of AI systems in content and feedback provided to students in real time can reinforce active learning and enable essential self-regulated learning behaviour to master programming skills. This study will explore the integration of AI-based adaptive learning systems into the flipped classroom model to achieve better results in programming education. This research will contribute to the growing body of evidence on new pedagogies in programming education by studying an experiment in which two student groups were examined for 13 weeks—one of the groups worked in the conventional flipped classroom model and the other group worked with an AI-powered adaptive system enhancement.

2. Background

Based on the increasing focus on the integration of AI in education through adaptive learning systems, several recent studies have shown the transformative potential of these technologies in the domains of science, technology, engineering, and mathematics (STEM). Xu and Ouyang (2022) discussed a range of AI applications that included learning prediction, intelligent tutoring systems, behaviour analysis, and educational robotics.


These tools improve not only learning performance but also critical higher-order cognitive skills, such as computational thinking and problem-solving. The findings indicate the growing relevance of AI in meeting the increasingly complex needs of learners within the fields of STEM in general and computer science in particular.

Gligorea et al. (2023) examined how much an AI-driven adaptive learning system contributes to an e-learning platform. This paper detailed how an AI-driven learning system offers tailored learning paths and continuous, real-time feedback, significantly enhancing students' academic involvement and achievement. In another study, Kwak et al. (2023) examined a generative AI-based adaptive learning system for programming education. Their proposed system integrates learner data with domain knowledge to dynamically generate personalized learning materials, provide feedback at an individual level, offer targeted tasks, and suit the specific demands of programming education.

Adding more detail, Er-Rafyg et al. (2024, pp. 329–342) examined the potential and challenges of implementing AI-powered adaptive learning systems. Their study highlighted the potential of such technologies to overcome some of the shortcomings of traditional teaching methodologies through immediate feedback, tailored content delivery, and precise tracking of student progress. However, they also emphasized a few critical challenges that should be considered when applying these technologies, for instance, data privacy concerns, high development costs, and the difficulty of integrating AI systems into existing infrastructures. Despite these limitations, the study concluded that if the application is properly designed to address individual learners' needs, adaptive learning can significantly enhance engagement and performance.

The integration of AI-based adaptive learning systems with flipped classroom models ensures the synergistic use of their mutual strengths. This approach provides an ideal framework for active learning, ensuring classroom collaboration and problem-solving activities while complementing the personalized support each student needs to optimize their unique learning trajectory. This combination is promising for programming education, as it aligns well with practice-oriented skill acquisition and iterative problem-solving.

In alignment with these insights, the current study examines the effectiveness of an AI-enhanced adaptive learning system within a flipped classroom context for teaching computer programming. It explores how the combination of these methods can improve learning outcomes, engagement, and motivation to meet the unique challenges in programming education. Additionally, this research contributes to the emerging body of evidence supporting innovative pedagogies in the STEM disciplines. The approach aims to create an inclusive, personalized, and effective learning environment for all students.

2.1. AI-based adaptive learning systems

AI-based adaptive learning systems represent one of the fastest-growing areas in educational technology; they leverage AI algorithms to make the learning process more effective and personalized for each student. These systems automatically adjust the content, pace, and structure in real time based on learners' performance, progress, and interaction. Adaptive learning systems use various AI techniques, including machine learning, natural language processing (NLP), and data analysis, to meet diverse learner needs, offering appropriate challenges and support to individual students (Heffernan et al., 2016; Holmes et al., 2019).

2.1.1. Definition and characteristics

AI-based adaptive learning systems can craft highly personalized, data-driven learning experiences. They monitor the learning behaviour of students during activities in terms of response time, accuracy, and engagement in the learning process, turning those into real-time updates within the learning content (Gligorea et al., 2019). These systems make use of algorithms that follow student progress through a curriculum, detecting deficiencies and areas of difficulty. With this information, the system automatically adjusts task difficulty, provides differential feedback, and shows learners additional learning material relevant to their specific needs. One of the strengths of such systems is their ability to operate in near real time, thus allowing interventions to be timely and relevant (Wang et al., 2023). These systems are adaptive by nature and hence offer personalized learning pathways that alter with progress. This can be helpful in domains like programming education, which presents challenges at different levels, ranging from debugging to mastering difficult algorithms (Fernandes et al., 2023, pp. 1–6; Seo et al., 2021).
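A minimal sketch of the kind of mastery-based adjustment loop described above is shown below; the thresholds, window size, and data structure are illustrative assumptions rather than the mechanism of any particular platform discussed in this section.

from dataclasses import dataclass

@dataclass
class LearnerState:
    topic: str
    difficulty: int          # current difficulty level, 1 (easy) to 5 (hard)
    recent_correct: int      # correct answers within the last window of attempts
    recent_attempts: int     # attempts within the same window

def adjust_difficulty(state: LearnerState) -> LearnerState:
    """Raise or lower task difficulty from a simple rolling accuracy estimate."""
    if state.recent_attempts == 0:
        return state                      # no evidence yet, keep the current level
    accuracy = state.recent_correct / state.recent_attempts
    if accuracy >= 0.8 and state.difficulty < 5:
        state.difficulty += 1             # learner is succeeding: offer harder tasks
    elif accuracy < 0.5 and state.difficulty > 1:
        state.difficulty -= 1             # learner is struggling: step back and remediate
    return state

# Example: a student who solved 9 of the last 10 loop exercises moves up one level.
print(adjust_difficulty(LearnerState("loops", difficulty=2, recent_correct=9, recent_attempts=10)))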
2.1.2. Strengths of AI-based adaptive learning systems

Several strong points concerning AI-based adaptive learning systems make them particularly effective in educational settings. One of the key advantages is their ability to provide real-time feedback and interventions. These systems analyse the progress made by students, providing them with immediate feedback on certain tasks and allowing learners to understand where mistakes are committed by offering advice on how to correct errors. Personalized feedback can help learners acquire knowledge more deeply, since students are provided with the exact amount of support in the areas that need to be improved (Heffernan et al., 2016; Holmes et al., 2019).

Another strength of adaptive learning systems is their potential to address individual learning needs. While classic classroom settings involve all students going through the curriculum at the same pace, AI systems tailor the learning experience based on the different capacities of students (Gligorea et al., 2023). In this way, students who need more time to understand the learning material or individually challenging areas are not left behind. What is more, such systems can accelerate the learning process for talented students. This scalability enables wide application in different settings, from K-12 to higher education, and thus the systems can be useful for both small and large classrooms (Fernandes et al., 2023, pp. 1–6).

Additionally, AI-driven systems can promote more active and independent learning. While allowing students more freedom in their learning paths, these systems can foster self-directed learning and critical thinking. Indeed, the ability for independent problem-solving and computational thinking has been seen as an important competence for successful learning in programming education (Fernandes et al., 2023, pp. 1–6; Seo et al., 2021).

2.1.3. Application in education

AI-based adaptive learning systems find their niche within educational settings, particularly concerning the most complicated areas of programming, where students often have difficulties with misconceptions and completing complex tasks. An AI-driven adaptive learning system can fill the gap between purely theoretical knowledge and its practical application by guiding a student through real-time debugging, syntax errors, the design of algorithms, and other difficult areas (Gligorea et al., 2019; Wang et al., 2023). For example, the adaptive learning system will assign customized exercises and practice activities within the range of the current competency of the student in programming classes. As a consequence, students can continue to be challenged without becoming overwhelmed (Xu & Ouyang, 2022).

Apart from programming, such systems have been applied in mathematics, science, and even language learning. The flexibility of these AI-based systems allows application in different curricula, adapting the level of difficulty and the content to the progress of the learner. This is particularly effective in large classrooms where it may be challenging to offer personalized instruction.


AI, therefore, presents a promising model, especially in the case of flipped classrooms, which are associated with improved engagement and learning outcomes. Students can use out-of-class time to learn new material by themselves by engaging with the content at their own learning pace, facilitating in-class active learning. With the support of adaptive learning systems, flipped classrooms can provide individual students with personalized, real-time assessments of their performance that might keep them on track in the course. The synergistic power of AI and the flipped classroom concept addresses one of the pivotal challenges in modern education: making learning more personal and scalable to meet the diverse needs of students. In support, it has been shown that this development addresses the significant challenge of providing personalized and scalable learning experiences that fulfil varied learner needs (Seo et al., 2021; Strielkowski et al., 2024).

2.1.4. Challenges and limitations

There are also some challenges facing AI-based adaptive learning systems. First, quality assurance of the data used to personalize learning experiences must be considered. An AI system makes adjustments in real time by using precise and comprehensive data as input; inaccuracies or gaps in that data void the effectiveness of the system. Besides, the development and integration of AI-based systems are very expensive and require considerable time and investment in technology and teacher training (Er-Rafyg et al., 2024, pp. 329–342).

Though these systems provide an individual learning path, sometimes they do not consider the social and emotional sides of learning. The engagement of a student does not depend on cognitive aspects only; peer interaction, motivation, and self-confidence also affect it. So, the design of AI systems has to incorporate not only cognitive data but also emotional and social data in order to make learning holistic (Wang et al., 2023).

2.2. The role of the flipped classroom in enhancing programming education

The flipped classroom has become one of the most visible pedagogical methods as an innovative approach. In particular, it is suited to fields like programming education that require active problem-solving and extensive hands-on practice. It differs from traditional lecture-based teaching in that it flips the use of in-class and out-of-class time. Students go through instructional materials, including videos and readings, before coming to class, while time in the classroom is used for practical applications, group activities, and problem-solving exercises (Mok, 2014). Such readjustment of classroom dynamics bodes particularly well for programming education, in which mastery of complex concepts often requires immediate feedback, peer interaction, and sustained practice (Amresh et al., 2013).

Several research works have reiterated the benefits that flipped classrooms bring about in programming education. Programming encompasses activities that are intrinsically hard, abstract, and time-consuming (Hayashi et al., 2015, pp. 487–489). The flipped classroom lets students go through theoretical materials at their own pace, multiple times for difficult concepts (Kay et al., 2019). This self-paced learning approach is especially beneficial for students who need more time to process new information, allowing them to be better prepared for in-class applications and discussions (Hsu & Lin, 2016; Maher et al., 2015). For instance, in the study by Edgcomb et al. (2017), the use of web-based interactive materials in introductory programming courses enhanced students' engagement and preparation, increasing the effectiveness of in-class activities and hands-on learning.

In addition, the flipped classroom changes students from passive recipients of knowledge to active participants in their learning process. Traditional lectures often limit interaction and leave students uncertain about how to apply theoretical concepts (Amresh et al., 2013). In flipped classrooms, class time is used for practice, collaborative problem-solving, and real-time feedback from instructors and peers (Vihavainen et al., 2012, pp. 171–176). Evidence from various research indicates that such active learning and hands-on experiences create a better opportunity to learn because students can get direct solutions to questions and problems right in the classroom (AlJarrah et al., 2018). Also, flipped learning environments tend to be more effective than traditional lecture-based environments for the development of computational thinking and coding concepts (Zha et al., 2019). Flipped classrooms in conjunction with project-based learning methods can also significantly enhance student engagement and performance in computer science courses (Morais et al., 2021).

Several issues associated with the flipped classroom model somewhat offset its many advantages. One major issue pertains to student procrastination, in that most students tend to delay accessing the preparatory materials until the last minute, hence undermining the effectiveness of the model (AlJarrah et al., 2018). It has, however, been postulated that interactive in-video quizzes and real-time feedback tools can make students active participants within specified time frames and keep them up to pace (Cummins et al., 2015). Learning Management Systems (LMSs) can provide instructors with some assistance in managing student engagement and in identifying at-risk learners early in the class (Lacher & Lewis, 2015). For instance, Clark et al. (2016) reported that integrating these tools into flipped engineering courses increased student preparation and class engagement.

It could be added that this model of the flipped classroom, supported by various interactive tools and technologies, can contribute greatly to programming education. It enables students to shift away from passive absorption and actively engage in the process of creating solutions to programming assignments, thus supporting a deeper understanding and mastery of complex coding concepts. However, this approach presents many challenges, for example, procrastination and the need for self-discipline when students are learning independently and going through a course by themselves. These issues can be mitigated with the help of adaptive technologies, which can provide real-time feedback on the current state and knowledge of students.

2.3. Theoretical framework for integrating AI with flipped classrooms

The integration of AI-based adaptive learning systems with flipped classrooms can, therefore, be considered one of the most promising approaches to increasing student engagement and motivation, as it can ensure better learning performance. To understand how these two pedagogical approaches complement each other, developing a theoretical rationale that justifies the benefits and impact of their integration is necessary. Personalized learning, self-regulated learning, and active learning provide the theoretical framework grounding the synergy between AI and flipped classrooms in educational settings.

Personalized learning, tailored to individual students' needs, has recently gained remarkable attention. AI-based adaptive learning systems effectively support customized learning by providing timely feedback, making adjustments in learning paths, and responding to the unique challenges of individual student performance (Xu & Ouyang, 2022). Personalized learning is grounded in, for instance, Vygotsky's (1978) concept of the Zone of Proximal Development, which emphasizes that every learner should be offered personal scaffolding. The notion is aligned with the development of modern AI systems that extend this intention. The current and the next sections examine other efficient alternatives for computer-assisted learning to occur dynamically, ensuring that the modifications are considered individually to suit each student's needs (Heffernan et al., 2016).

The flipped classroom concept supports active learning by allowing for the consumption of theoretical material outside the class, leaving time in class for dynamic implementation. Self-regulated learning (SRL), which involves setting learning aims, monitoring progress, and adjusting strategies as necessary, is also reinforced through AI systems. AI provides constant adaptive feedback that sustains students in regulating their learning behaviour or motivation (Schunk & Zimmerman, 2008).


In a flipped classroom, that is, when students often enter the class in an independent mode of work, AI tools grant support for SRL through personalized prompts, monitor the process of learning, and then proffer tailored suggestions for further improvement of the learning process (Jin et al., 2023).

Regarding the pedagogical approach, active learning, with its emphasis on being engaged in activities, is central to the flipped classroom model. In this case, AI implemented in this specific context enhances the degree of activity amongst students through the provision of immediate and personalized feedback, with the objective of constructing an interactive session of learning (Holmes et al., 2019). The theory of active learning, as discussed by Bonwell and Eison (1991), posits that learning is most effective when students actively construct knowledge through engagement in problem-solving and critical-thinking tasks. AI-enhanced flipped classrooms are supportive of this, promoting real-time feedback and offering students active problem-solving during class time.

While both AI-driven learning systems and flipped classrooms are well-established approaches, their combination represents a novel contribution. Most of the previous studies consider either one of these two approaches separately. For example, AI-based systems have been found to increase learning performance through personalized instruction (Gligorea et al., 2023), while flipped classrooms have been found to enhance engagement and retention in programming education (Mithun & Evans, 2018). Little research, however, has been done on their combined effect. This paper fills this lacuna by presenting evidence that AI-enhanced flipped classrooms can enhance engagement and motivation while at the same time fostering deeper learning and mastery of complex programming concepts. In general, integrating these two approaches produces a synergistic effect that can address different students' learning needs in technical subjects through personalized and adaptive learning.

However, some challenges in AI-enhanced flipped classrooms remain unaddressed. These approaches are equally affected by a host of issues, including gaps in access to technology, teacher training and quality, and students' preparedness for self-regulated learning (Er-Rafyg et al., 2024, pp. 329–342). Also, while AI systems can provide personalized feedback, their design must ensure that the feedback is actionable, helping students progress toward developing complex skills (Fernandes et al., 2023, pp. 1–6). Some of the challenges that require thoughtful implementation and further research to position AI within educational contexts are discussed in this study.

2.4. Research questions

The flipped classroom model has gained increasing prominence in programming education, and how this relatively new approach can be further refined to meet the diverse needs of students is examined here. Programming is inherently dynamic, as students need to engage deeply with complex problem-solving, logic, and abstract thinking; hence, it is an ideal field for pedagogical models that emphasize active learning and personalized modes of instruction. One of the most promising innovations in the area is fully integrating AI-based adaptive learning systems within an overall flipped classroom model. These have the potential to offer real-time feedback, adjust learning materials based on individual student progress, and provide a personalized learning experience that may significantly improve student performance.

The following research questions are proposed to explore the effectiveness of the integration of AI-driven adaptive learning systems in the flipped classroom setting concerning learning outcomes, engagement, and the potential to address the needs of each student within programming education.

RQ1 How does the integration of AI-based adaptive learning systems into the flipped classroom model impact learning outcomes and the mastery of complex programming concepts compared to traditional flipped classroom models?
RQ2 How do student engagement and motivation in programming education change when exposed to a flipped classroom model enhanced by AI-based adaptive learning systems compared to a traditional flipped classroom model?

3. Materials and methods

The next section elaborates on the experimental design, participants, learning materials, methods for data collection, data storage and management strategies, ethical concerns, and analysis techniques applied to examine the integration of AI-based adaptive learning systems in flipped classroom programming education.

3.1. Participants

In the baseline experiment, the test subjects were undergraduate students enrolled in an introductory programming course. The sample size was 120 students. The Experimental Group (EG; N = 60) followed an enhanced flipped classroom model developed through the use of an AI-based adaptive learning system, while the Control Group (CG; N = 60) received the traditional flipped classroom teaching approach without the integration of AI.

The experiment presented in this paper was conducted in Hungary, where compulsory lectures and laboratories are generally provided for any student attending higher education. As part of the introductory programming course, students attended weekly lectures and participated in practical lab-session activities reinforcing the theoretical lectures. The course is credit-based and requires compulsory attendance for a specified minimum number of sessions each term to meet graduation requirements.

3.1.1. Demographic information

The participants' age range was between 18 and 25 years (M = 19.5, SD = 1.8). The cohort was gender-unbalanced, with 70 % male and 30 % female test subjects. All participants were studying Computer Science or related fields; therefore, the content of the programming course was relevant to their background. This demographic profile aligns with the target audience that usually attends introductory programming courses.

3.1.2. Randomization process

A randomization process was followed to reduce selection bias and make the two groups comparable. A random number generator assigned the participants to either the EG or the CG. This random assignment ensured an equal distribution of participants according to gender, age, and prior programming experience. This enabled an increase in the internal validity of the study. Another important feature of randomization in this research is the control of possible biases due to self-selection or non-random assignment.
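The paper does not specify the exact assignment procedure; the sketch below shows one way a seeded random number generator could produce equally sized groups balanced on gender and prior experience, as described in Section 3.1.2 (the strata and field names are illustrative assumptions).

import random
from collections import defaultdict

def assign_groups(participants, seed=42):
    """Randomly split participants into EG/CG, balancing within each stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    # Stratify by gender and a coarse prior-experience level (e.g. survey score band).
    for p in participants:
        strata[(p["gender"], p["prior_experience"])].append(p)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        for i, p in enumerate(members):
            # Alternate within the shuffled stratum so EG and CG stay the same size.
            assignment[p["id"]] = "EG" if i % 2 == 0 else "CG"
    return assignment

participants = [
    {"id": 1, "gender": "M", "prior_experience": "low"},
    {"id": 2, "gender": "M", "prior_experience": "low"},
    {"id": 3, "gender": "F", "prior_experience": "high"},
    {"id": 4, "gender": "F", "prior_experience": "high"},
]
print(assign_groups(participants))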
3.1.3. Verification of prior experience and skills

All participants completed a pre-study questionnaire before the experiment on prior programming experience, experience with AI-integrated IDE tools, and general digital literacy. The items included multiple-choice and Likert-scale items, such as:

• "How familiar are you with programming languages (e.g., Python, Java)?" (1 = Not familiar, 5 = Very familiar)
• "Rate your proficiency in using learning management systems (e.g., Moodle, Blackboard)." (1 = Low, 5 = High)

This baseline survey also helped confirm that the two groups were comparable concerning base-level skills and experience, so that any measured variation in the results could be said to have been the result of the intervention.

To this end, and to control for any prior use of AI tools or online learning systems as a confounding variable, specific questions about participants' familiarity and experience with such technologies were added. This was checked with a statistical analysis, which verified that no significant differences were present between the EG and the CG either in prior programming experience or in familiarity with AI-based learning platforms.
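The paper does not report which statistical test was used for this baseline check; a sketch of one common choice, an independent-samples t-test on the pre-study familiarity scores, is given below with illustrative data.

from scipy import stats

# Hypothetical pre-study familiarity scores (1-5 Likert averages) per group.
eg_scores = [2.8, 3.1, 2.5, 3.4, 2.9, 3.0]
cg_scores = [2.7, 3.2, 2.6, 3.3, 3.0, 2.8]

# Independent-samples t-test: a non-significant result (p >= 0.05) supports
# the claim that the groups started from comparable baseline familiarity.
t_stat, p_value = stats.ttest_ind(eg_scores, cg_scores)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value >= 0.05:
    print("No significant baseline difference detected between EG and CG.")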
3.1.4. Exclusion criteria

Participants were excluded if they did not meet the inclusion criteria, which required attending at least 80 percent of the sessions and completing both the pre- and post-tests. These criteria were followed to the letter to ascertain the reliability of the data. Also, participants who for any reason did not comply with the interaction envisaged by the AI system, did not perform the compulsory activities, or did not interact with the adaptive learning tools were filtered out to secure the comparability of the conditions.

3.1.5. Control of external factors

Control of extraneous variables was a key concern in order to ensure the validity of the study results. For example, both groups were taught the same programming concepts, such as loops, conditionals, and object-oriented programming, with identical materials to maintain consistency in instructional content. Also, all sessions were conducted by the same instructor to avoid any potential instructor bias. The experiment was conducted in a controlled classroom environment designed to eliminate distractions, while at the same time guaranteeing equal access to core resources, including computers and internet connectivity.

To make sure that external factors, such as differences in familiarity with certain AI tools or other external influences, would not impact the results of the study, several controls were implemented. For example, before the experiment, participants were asked whether they had used AI-based learning tools before, and this fact was considered while analysing the results. Furthermore, it was verified that participants adhered to the study protocols and did not use external help during the study period, to ensure that the results were attributable to the experimental conditions.

In addition, for the EG, the logs of the AI system recording the completion of videos, the number of interactions, and the time spent on exercises provided rich detail. This approach allowed the research team to monitor participants regarding adherence to the study protocol and consistency across both groups. These logs were monitored in real time for anomalies or patterns that might have an impact on the outcomes of the study. Controlling these factors created equal conditions for both the experimental and control groups, reducing the effects that could have been caused by any confounding variable. Such an effort allowed for the rigorous testing of the effectiveness of the AI-enhanced flipped classroom model, confirming that any observed differences across groups were due to the intervention itself, rather than to external or unconsidered variables.

3.2. Experimental design

A quasi-experimental design was adopted to investigate the differences in effectiveness between the AI-enhanced flipped classroom model and the traditional flipped classroom model. The experiment lasted for 13 weeks, that is, a whole semester. The same course content, assignments, and assessment methods were implemented in both groups to minimize variability and focus on the intervention itself. Fig. 1 shows the basic phases of the experimental design.

The AI-based adaptive learning system used in this study was designed to facilitate both outside-classroom and in-class learning activities. While students worked through self-paced exercises and received personalized feedback from the AI system outside of the class, the insights gained from their interactions were also used to enhance in-class activities. In-class time was structured to address students' specific weaknesses identified by the system, facilitating collaborative problem-solving and real-time feedback during peer-to-peer and student-instructor interactions. For instance, students who struggled with specific programming concepts were given more targeted practice problems during in-class sessions, and instructors used the AI-generated data to guide focused discussions that addressed common issues encountered outside of the class.

The system continuously logs students' progress through the self-paced learning modules, changing the degree of difficulty based on how they are performing in real time. It gives every student individualized hints, exercises, and feedback, which are used later in tailoring in-class discussions and problem-solving exercises. The system supports students outside the classroom in mastering complex programming concepts while simultaneously preparing them for active participation in collaborative problem-solving activities.

Fig. 1. Phases of experimental design.
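As an illustration of how the AI-generated data could drive in-class planning as described in Section 3.2, the sketch below aggregates exercise logs by topic and flags the topics with the highest error rates for the next session; the log format shown is a hypothetical example, not the actual export of the platforms used.

from collections import defaultdict

# Hypothetical log records exported from the adaptive system.
logs = [
    {"student": "s01", "topic": "loops", "correct": False},
    {"student": "s02", "topic": "loops", "correct": False},
    {"student": "s01", "topic": "conditionals", "correct": True},
    {"student": "s03", "topic": "oop", "correct": False},
    {"student": "s02", "topic": "oop", "correct": True},
]

def topics_for_next_session(logs, max_topics=2):
    """Rank topics by error rate so the instructor can focus in-class time on them."""
    attempts, errors = defaultdict(int), defaultdict(int)
    for entry in logs:
        attempts[entry["topic"]] += 1
        if not entry["correct"]:
            errors[entry["topic"]] += 1
    error_rates = {t: errors[t] / attempts[t] for t in attempts}
    return sorted(error_rates, key=error_rates.get, reverse=True)[:max_topics]

print(topics_for_next_session(logs))  # e.g. ['loops', 'oop']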


Fig. 2 presents a flowchart depicting the sequence of learning activities for the EG. It provides a snapshot of students' interaction with the AI system both outside and inside the classroom, ranging from individualized practice assignments at home to interactive instructor-led problem-solving in class. This flowchart clearly illustrates that information from activities completed outside of the classroom drives the planning and development of the in-class session, as part of a seamless process in moving from the outside-class learning phase to inside-class learning.

3.3. AI tools and adaptive feedback

In this experiment, the EG used the AI-powered adaptive learning platforms Knewton and Assessment and Learning in Knowledge Spaces (ALEKS) to track student behaviour and create personalized, real-time feedback. These were chosen due to their previous effectiveness in providing adaptive learning experiences, especially for domains that require dynamic feedback and personalized learning paths (Kasinathan, 2017; Kumor, 2024).

Knewton is a well-known adaptive learning platform that uses data to adjust the difficulty and sequence of exercises based on students' individual progress. It collects data from student interactions with the platform, such as answers to practice problems, the time spent on each task, and patterns of success or struggle, in order to provide immediate feedback. Knewton's adaptive feedback includes offering hints when students struggle with problems, adjusting the complexity of tasks, and suggesting new learning activities based on students' needs (Kasinathan, 2017).

Another adaptive learning system used in this study is ALEKS. This system assigns personalized learning pathways to each student by identifying students' strengths and weaknesses and proposing exercises to fill their gaps in knowledge. ALEKS continuously revises the learning path as students progress and provides immediate feedback on their responses to the exercises, which helps students understand their mistakes and highlights the concepts that they perhaps had not learned well (Kumor et al., 2024; Lang, 2017; VanLehn, 2011).

These tools enabled an active learning classroom where interventions could be provided for students when necessary. They helped students improve their understanding of programming concepts and stay more engaged. Instructors utilized real-time data from these systems for insights into how students were performing, hence determining points of frustration and altering pedagogical methods.

These adaptive learning tools ensured that the EG received highly personalized and responsive learning, providing better insight into how AI-enhanced flipped classrooms work in programming education. This adaptive feedback approach aligns with the findings and calls to action of several scholars, such as Schlimbach et al. (2022) and Janson et al. (2020), for increased student inclusion through feedback tailored to changing individual conditions. With such continuous updates during their interaction, these AI facilities scaffold the learning process efficiently, helping students move ahead through difficult programming concepts at an individual learning pace.

Fig. 2. Learning functions within the integration of AI-based adaptive learning systems and the flipped classroom model, both in the classroom and home learning
environments.
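To make the adaptive behaviour described in Section 3.3 concrete, the sketch below implements a simplified rule-based feedback loop of the kind attributed to these platforms (hints after repeated failures, difficulty adjustment, remedial suggestions). It is an illustrative approximation written for this discussion, not the vendors' actual logic or API.

def adaptive_feedback(failed_attempts: int, seconds_on_task: float, difficulty: int):
    """Return a (message, new_difficulty) pair from simple interaction signals."""
    if failed_attempts == 0:
        # Success: acknowledge and raise the challenge one step (capped at 5).
        return "Correct! Moving to a harder exercise.", min(difficulty + 1, 5)
    if failed_attempts == 1:
        return "Not quite. Re-check the loop's termination condition.", difficulty
    if failed_attempts == 2 or seconds_on_task > 300:
        # Repeated failure or a long stall: give a targeted hint.
        return "Hint: trace the variable values for the first two iterations.", difficulty
    # Persistent struggle: lower the difficulty and point back to the video lesson.
    return "Let's revisit the prerequisite video on loops first.", max(difficulty - 1, 1)

message, level = adaptive_feedback(failed_attempts=2, seconds_on_task=120, difficulty=3)
print(message, level)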


The addition of these tools helps maintain transparency regarding the tasks handled by the AI-driven adaptive feedback and provides a better understanding of their role in student learning motivation within the flipped classroom.

3.4. Learning materials

The core teaching materials were a set of video lessons that introduced the major concepts of programming: loops, conditionals, and object-oriented principles. The EG received video quizzes with questions adjusted by the AI system based on student performance. Supportive resources included coding manuals, textbooks, and online articles related to the class material provided in the experiment. Recommendations for further readings from the AI system were based on students' progress and areas of difficulty. The treatment group received an LMS integrated with a variety of AI-powered tools: personalized feedback, progress tracking, and adaptive support.

3.5. Data collection methods and research instruments

A multi-method data collection approach was implemented to comprehensively obtain data on the effects of the AI-based flipped classroom model, for a robust understanding of both quantitative and qualitative outcomes. This determined the level of students' knowledge in programming, problem-solving ability, engagement, and motivation in programming education.

Programming knowledge and problem-solving skills were assessed with a mix of coding tasks and multiple-choice questions (MCQs). The programming test had a total of 30 MCQs that tested students on their theoretical knowledge of the basics of programming: loops, conditionals, and object-oriented principles. For example, one MCQ could be: "What is the output of the following Python code snippet?", followed by a small code snippet. The rest of the test was a practical programming session where students were required to put the theoretical knowledge into practice by solving problems, such as writing a function that checks whether a given number is prime or debugging a given piece of code. The test for programming knowledge totalled 100 points, where the MCQs carried 30 points and the rest was divided between the practical programming tasks. A pre-test established the baseline of prior knowledge amongst participants, and a post-test measured the learning outcomes and the skills acquired after the intervention.
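One of the practical tasks mentioned above asked students to check whether a given number is prime; a reference solution of the kind expected could look like the following (the exact wording and grading of the task are not reproduced in the paper).

def is_prime(n: int) -> bool:
    """Return True if n is a prime number, False otherwise."""
    if n < 2:
        return False
    # Only divisors up to the square root of n need to be tested.
    divisor = 2
    while divisor * divisor <= n:
        if n % divisor == 0:
            return False
        divisor += 1
    return True

print([x for x in range(20) if is_prime(x)])  # [2, 3, 5, 7, 11, 13, 17, 19]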
The questionnaires designed for the study were adapted from existing scales, reflecting the context of the flipped classroom and AI integration. An example of the Likert-scale items in this respect was: "Rate your level of motivation during the course", ranging from 1 to 10, aimed at assessing cognitive, emotional, and behavioural components of engagement as well as intrinsic and extrinsic motivation. Responses were then averaged over 50 questions to create a balanced and fair view of those aspects. The averaged scores ensure a more representative measure of both student engagement and motivation. This approach provided a consistent assessment without bias towards unusually high or very low scores, thus representing all the elements of students' experience of engagement. The highest possible score for engagement and motivation was 10 points, reflecting the most accurate overall view of their degree of involvement in the learning process.

The use of "> 8/10" and "< 5/10" thresholds to categorize the level of motivation and engagement is common in both secondary and higher education and thus enables clear categorization of students' engagement. A score above 8/10 indicates high motivation, where students show continuous interest in the subject and make substantial contributions to learning, as recent studies and teaching experience indicate. A score below 5/10 reveals low motivation, denoting low interest, poor or no participation, and little interaction with learning materials. These motivation and engagement thresholds describe learning experiences and participation patterns among students. They have been adopted in educational practice and research because they provide categories that make the measurement of different levels of engagement clearer to interpret. These categories therefore make it easier to assess the quality of student involvement and the areas in which improvement is needed. Such thresholds are not novel in motivation and engagement measurement: many studies have used a score above 8/10 to identify high motivation and engagement and a score below 5/10 to define low motivation. For example, Ryan and Deci (2000), in their Self-Determination Theory, elaborated that students with low motivation often display low levels of interest and apathy toward school activities. Similarly, in educational research such as Schunk and Zimmerman (2008), similar scales were used to categorize student motivation levels into distinct categories. These thresholds are consistent with good methodological practice in educational research, particularly helping to ensure that differing levels of motivation can be clearly and reliably distinguished. Such a scale allows the effectiveness of education to be measured, particularly by comparing groups using an AI-enhanced adaptive system versus traditional teaching methods.

The questionnaires were adapted from previously validated scales of engagement and motivation to make sure they were valid and reliable. The instruments' internal consistency was measured, and a pilot study was conducted to refine the survey items for clarity and relevance to the flipped classroom model. Reliability analysis using Cronbach's alpha for the questionnaires on engagement and motivation yielded 0.87 and 0.85, respectively.

The instructors conducted classroom observations, especially of the students' interaction and engagement level, regarding how effective the collaborative problem-solving activities were. The observation notes were systematically recorded and coded for key themes: "peer interaction," "active participation," and "instructor feedback." These were examined independently by two researchers in order to ensure that the qualitative data analysis was both robust and reliable (Cohen's kappa = 0.87).

Further insights into students' progress were obtained from the data logs of the AI system, which detailed the usage of AI tools through completion rates on videos, the frequency of interaction, and the types of adaptive feedback. This report was used to assess how AI supported both out-of-class and in-class learning, and it enabled instructors to make adjustments in their teaching in response to real-time performance.

Quantitative data were collected through pre- and post-tests, while qualitative data were based on surveys, classroom observational notes, and AI system logs. This integration provides a holistic understanding of students' advances in programming knowledge, engagement, and motivation due to the AI-enhanced model of flipped classroom instruction. The integration of multi-sourced data ensures that the findings are anchored both in measurable outcomes of the treatment and in rich qualitative experiences that enhance the robustness of the analysis. Therefore, a convergent parallel mixed-methods approach was applied to fit the detailed academic and behavioural evidence to the wider questions about the impact of integrating flipped classrooms and AI-based adaptive learning into traditional programming education. The data collection process was designed to provide insights into the efficiency assessment within the framework of the pedagogical model discussed in the literature.

3.6. Data storage and management

Data collected in the experiment (test scores, survey responses, system logs) were kept in an encrypted database with data protection measures in mind. Participants were assigned unique identifiers so their data could be anonymized. The data were accessible only to the research team, to ensure this information remained confidential throughout the study. Backup copies of the data were created weekly to prevent loss and maintain the integrity of the research.


3.7. Ethical considerations

An informed consent form describing the purpose of the study, the kind of data to be collected, and how the information would be used was obtained from participants prior to data collection. Written informed consent was obtained from all participants, and students were assured that participation was voluntary and could be withdrawn at any time. Collected data were anonymized to protect the identity of the students. The study collected both quantitative and qualitative data to investigate not only the measurable outcomes of AI-based adaptive programming education but also the nuanced experiences and perspectives of socially disadvantaged students. All participants consented to take part in the study; their only request was that the name of the university where they were studying not be mentioned, and the names of universities are therefore omitted from the study.

3.8. Data analysis

In the present study, Analysis of Covariance (ANCOVA) was employed to examine differences in learning outcomes (learning gains), engagement, and motivation between the experimental group (EG) and the control group (CG). ANCOVA was selected for its ability to control for baseline differences in pre-test scores, thereby facilitating a more accurate evaluation of the intervention's effect on post-test results. By incorporating pre-test scores as a covariate, ANCOVA adjusts post-test scores for initial disparities, enabling a clearer assessment of the intervention's true impact while minimizing the influence of extraneous variables.

Learning gain, generally understood as the improvement in a learner's outcomes, was quantified as the change in scores from the pre-test to the post-test. Learning gains provide insight into the effectiveness of the intervention over time: whereas pre-test and post-test scores are individual points in time, learning gains represent the relative change, or growth, between the two measurements.

First, the data were checked for normality with the Shapiro-Wilk test, which showed that the distributions of the dependent variables approximated the normal distribution; parametric analysis was therefore appropriate. In addition, Levene's test for equality of variances was used to check whether the assumption of homogeneity of variances was met for the EG and CG. Together, these tests supported the use of parametric procedures, namely ANCOVA, for testing post-test differences while controlling for pre-test scores, thereby enhancing the robustness of the analysis.

Effect sizes were estimated with partial eta-squared (η²) to quantify the magnitude of the differences observed between groups. Effect size is a measure of practical importance: it expresses the proportion of variance in the dependent variable that can be attributed to the experimental manipulation, and it is essential for judging the real-life relevance of effects, which cannot be inferred from statistical significance alone.

Descriptive statistics, including means, standard deviations, and confidence intervals, were depicted using boxplots and line charts with error bars, presenting the adjusted means and the variability of both groups. This display clarified central tendency and distribution, supporting interpretation of the intervention's effect on engagement, motivation, and learning outcomes. In addition to the analysis of covariance, data from survey questionnaires and AI system logs were integrated to offer a multidimensional understanding of the intervention's impact. The survey data were used to assess baseline engagement and motivation through descriptive statistics reflecting students' self-reported experiences. The system logs are a rich source of objective measures of student interaction with the tools, such as the number of times a tool is reused, how much time students spend on particular exercises, and how a student responded to the adaptive feedback. These logs gave an objective measure of how individual students engaged with the AI system and thus allowed analysis of how different adaptive interventions influence learning behaviours. Taken together, these sources enriched the analysis by combining subjective and objective measures, allowing students' learning experiences with the AI-based adaptive learning system in programming education to be described in detail.

4. Results

This chapter presents an in-depth analysis of the effectiveness of the AI-enhanced flipped classroom model compared with the traditional flipped classroom model over the 13-week programming course. Quantitative and qualitative data on learning outcomes, engagement, motivation, and overall effectiveness are assessed. The statistical analysis uses ANCOVA, controlling for baseline differences in pre-test scores, which makes the comparison of the two groups rigorous and yields a more accurate estimate of the effect of the intervention.

4.1. Learning outcomes

Learning outcomes were measured by comparing pre- and post-test scores for both the CG and the EG. The main objective was to determine whether learning outcomes under the AI-enhanced flipped classroom model significantly outperform those under the traditional flipped classroom model.

4.1.1. Pre-test and post-test scores

The pre-test and post-test results are summarized in Table 1 below. The line chart with error bars (Fig. 3a) illustrates the mean scores of the CG and EG in the pre-test and post-test phases. For the CG, the mean score rose from 56 in the pre-test to 70.1 in the post-test. The standard deviations for the CG were 8.4 in the pre-test and 9.5 in the post-test, indicating moderate variability that was slightly higher in the post-test. In the EG, the mean score showed a substantial increase, from 55.2 in the pre-test to 82.3 in the post-test. The standard deviation for the EG was low in the pre-test at 1.5, suggesting little variability among scores, but rose to 8.2 in the post-test, indicating increased variability as scores improved. Overall, the chart reveals that while both groups improved, the EG demonstrated a larger gain in mean score than the CG. The error bars suggest that scores in the EG became more variable in the post-test, whereas the CG's variability remained relatively stable across both tests.

The boxplot (Fig. 3b) visualizes the score distributions of both groups in the pre-test and post-test. In the CG, pre-test scores ranged from 40 to 70, with a median around 56 and a narrow interquartile range, indicating low variability. The post-test scores for the CG ranged from 52 to 88, with a median near 70.1 and a slightly wider distribution, suggesting a moderate increase in variability. In contrast, the EG's pre-test scores ranged from 42 to 72, with a median near 55.2 and low variability, as indicated by the narrow interquartile range. After the intervention, the EG's scores spanned from 64 to 95, with a median around 82.3 and a much broader interquartile range. The boxplot indicates that while both groups improved from the pre-test to the post-test, the EG achieved a higher median score with more variation in the post-test results, reflecting both greater improvement and increased variability in performance.
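Before turning to the inferential results, the analysis pipeline described in Section 3.8 can be sketched in code. The sketch below assumes a pandas DataFrame with hypothetical columns pre, post, and group, and uses simulated placeholder values; it illustrates the Shapiro-Wilk and Levene checks, the ANCOVA specification, and the partial eta-squared formula, and is not the authors' actual analysis script.

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Placeholder data shaped like the study: 60 students per group, pre/post scores.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "group": ["CG"] * 60 + ["EG"] * 60,
    "pre": np.concatenate([rng.normal(56, 8, 60), rng.normal(55, 8, 60)]),
})
df["post"] = df["pre"] + np.where(df["group"] == "EG", 27, 14) + rng.normal(0, 6, 120)

# Normality per group (Shapiro-Wilk) and homogeneity of variances (Levene).
for g, sub in df.groupby("group"):
    print(g, stats.shapiro(sub["post"]))
print(stats.levene(df.loc[df.group == "CG", "post"],
                   df.loc[df.group == "EG", "post"]))

# ANCOVA: post-test scores by group with the pre-test as covariate (Type II SS).
model = smf.ols("post ~ pre + C(group)", data=df).fit()
table = anova_lm(model, typ=2)

# Partial eta-squared for each effect: SS_effect / (SS_effect + SS_error).
effects = table.drop(index="Residual")
effects["partial_eta_sq"] = effects["sum_sq"] / (
    effects["sum_sq"] + table.loc["Residual", "sum_sq"]
)
print(effects)
```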


Table 1
Summary of the pre-test and post-test results.

            Pre-test                          Post-test
Group       Mean    SD     Min     Max        Mean    SD     Min     Max
CG a        56.00   8.40   40.00   70.00      70.10   9.50   52.00   88.00
EG a        55.20   1.50   42.00   72.00      82.30   8.20   64.00   95.00

a N = 60.

Fig. 3. a) The mean scores of the CG and EG in the pre-test and post-test phases. b) A visualization of score distributions for both the CG and EG in the pre-test and post-test.

Table 2
Shapiro-Wilk and Levene's test results for normality and homogeneity of variances.

            Pre-test                                      Post-test
Group       Shapiro-Wilk (W, p)    Levene (F, p)          Shapiro-Wilk (W, p)    Levene (F, p)
CG a        0.972, 0.341 b         0.760, 0.387 b         0.969, 0.283 b         0.601, 0.436 b
EG a        0.975, 0.408 b                                0.970, 0.311 b

a N = 60. b 2-tailed.

The normality of the data was confirmed using the Shapiro-Wilk test, with p-values greater than 0.05 for all groups in both the pre-test and post-test (Table 2). These results indicate that the data do not deviate significantly from normality. Additionally, Levene's test for equality of variances was conducted to check the homogeneity of variances between the groups. The results demonstrated that the variances of the pre-test and post-test scores did not differ significantly between the CG and EG. These findings validate the use of parametric tests, such as ANCOVA, for comparing post-test scores between the experimental and control groups while controlling for pre-test scores as a covariate.

An analysis of covariance was therefore conducted on the post-test results of the EG versus the CG, controlling for pre-test scores. The ANCOVA results are summarized in Table 3. The main effect of the intervention was significant, as confirmed by the adjusted means showing the EG outperforming the CG.

The ANCOVA results showed that the treatment effect, that is, the group difference at post-test, constitutes a statistically significant main effect: the EG outperformed the CG. The strength of this effect is indicated by the large F-value (25.67) and small p-value (0.001), with a partial η² of 0.301 meaning that the group factor accounted for about 30 % of the variance in post-test scores.

Pre-test scores, entered as a covariate, were also significantly predictive of post-test performance (F = 10.87, p = 0.002). The partial η² of 0.156 suggests that pre-test scores explain a moderate amount of variance in the post-test results. These results signal both the efficiency of the AI-enhanced flipped classroom model and the need to consider baseline knowledge when estimating the effect of the intervention.

Table 3
ANCOVA results for post-test scores.

                        SS        df    MS        F-value   p-value    Partial η²
Group (CG vs EG)        1284.20   1     1284.20   25.67     0.001      0.301
Pre-test (Covariate)    542.80    1     542.80    10.87     0.002 a    0.156
Error                   4760.30   117   40.68

a 2-tailed.

Table 4
Results of ANOVA for learning gain comparison between CG and EG.

                               CG (Mean)   CG (SD)   EG (Mean)   EG (SD)   ANOVA (F-value)   p-value
Learning gain comparison a     14.1        9.0       27.1        7.6       12.35             0.01 b

a N = 60. b 2-tailed.

4.1.2. Learning gains analysis

Learning gains were calculated as the difference between pre-test and post-test scores for each group. Based on these gains, a one-way ANOVA was conducted to establish whether the gains differed significantly between the groups.
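The learning-gain comparison can be sketched as follows. The paper does not spell out how "percentage improvement" was computed, so the sketch assumes (post − pre) / pre × 100, and the column names (group, pre, post) are hypothetical; this is an illustration of the procedure, not the authors' code.

```python
import pandas as pd
from scipy import stats

def learning_gain_summary(df: pd.DataFrame) -> None:
    """One-way ANOVA on learning gains plus a Table 5-style bracket count.

    `df` is assumed to hold one row per student with columns group, pre, post.
    """
    df = df.copy()
    df["gain"] = df["post"] - df["pre"]                    # absolute learning gain
    df["pct_improvement"] = 100 * df["gain"] / df["pre"]   # assumed definition

    # One-way ANOVA comparing the gains of the two groups (cf. Table 4).
    cg = df.loc[df["group"] == "CG", "gain"]
    eg = df.loc[df["group"] == "EG", "gain"]
    f_val, p_val = stats.f_oneway(cg, eg)
    print(f"ANOVA on learning gains: F = {f_val:.2f}, p = {p_val:.3f}")

    # Frequency of students per improvement bracket (cf. Table 5).
    bins = [0, 10, 20, 30, 40, 50, float("inf")]
    labels = ["0-10 %", "11-20 %", "21-30 %", "31-40 %", "41-50 %", ">50 %"]
    df["bracket"] = pd.cut(df["pct_improvement"], bins=bins, labels=labels,
                           include_lowest=True)
    print(df.groupby("group")["bracket"].value_counts().unstack(fill_value=0))

# Usage (hypothetical): learning_gain_summary(scores_df)
```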


Table 4 presents the outcome of the ANOVA (p = 0.01), which shows that learning gains differed significantly between the two groups, confirming the relative effectiveness of the AI-enhanced flipped classroom model.

To further show the variation in gain levels, the learning gains were categorized into percentage-improvement brackets, and the number of students falling into each category was counted for both groups. Table 5 and Fig. 4 below show that the EG gained significantly more, with a larger number of students exhibiting more than 50 % improvement compared with the CG.

Fig. 4 below presents a bar chart displaying the distribution of students by percentage improvement for each group. The chart, using shades of gray to distinguish the groups, reveals that the AI-supported approach led to more substantial improvements, with over a third of the EG achieving gains above 50 %. This pattern contrasts with the CG, where only a small fraction of students reached comparable levels of improvement.

Hence, the ANOVA and the percentage distribution both showed considerably higher learning gains in the EG under the AI-enhanced flipped classroom model. This consolidates the advantage of the personalized support offered by the AI-enhanced model, which helped a higher percentage of students reach remarkable improvement levels in their programming skills.

Table 5
Learning gains categorized by percentage improvement for CG and EG.

Learning gain category    CG a    EG a
0–10 %                    10      2
11–20 %                   18      5
21–30 %                   12      8
31–40 %                   8       10
41–50 %                   6       15
>50 %                     6       20

a N = 60.

4.2. Engagement levels

Engagement levels are critical in evaluating the effectiveness of the intervention, especially in capturing students' active participation in, and interaction with, the learning materials and in-class activities. In the present study, engagement was measured through multiple methods: standardized surveys, classroom observations, and, for the EG, AI system logs. A pre-test engagement survey was conducted at the start to ensure comparability at baseline; this provided initial engagement data, while the post-test measured the effect of the intervention.

Accordingly, survey questionnaires asked students to rate their level of engagement on a 1-to-10 scale. Scores from 1 to 4 were regarded as reflecting low engagement, that is, little or passive engagement with the activities. Scores between 5 and 7 indicated moderate engagement, showing active but less sustained participation. Scores of 8–10 were considered high engagement, involving sustained and proactive participation in the learning activities.

This classification is based on studies of student-level engagement and was applied throughout the report to describe the distribution of students across categories. The threshold values align with those commonly used in broader engagement research, allowing the data to be examined in a transparent and comparable manner.

4.2.1. Survey-based engagement scores

The pre-test scores confirmed the participants' engagement levels before the intervention, thus providing a baseline for comparison. Post-test scores captured the changes in engagement achieved under the respective teaching models. The pre-test and post-test engagement results are summarized in Table 6.

The line graph with error bars (Fig. 5a) depicts the mean engagement scores across the pre-test and post-test phases for the CG and EG. In the CG, the mean score increased slightly from 6.1 in the pre-test to 6.4 in the post-test. The CG had a standard deviation of 1.1 in the pre-test and 1.2 in the post-test; thus, variability was moderate in both tests. In the EG, the mean score showed more considerable growth, from 6.2 in the pre-test to 8.5 in the post-test. The standard deviation for the EG was 1.0 in the pre-test and decreased slightly to 0.9 in the post-test, reflecting reduced variability as scores improved. Overall, the chart shows that while both groups improved, the EG saw a much greater increase in engagement scores than the CG. The error bars show that engagement in the EG converged around the mean, indicating consistent improvement among participants.

The boxplot (Fig. 5b) visualizes the distribution of engagement scores for both groups in the pre-test and post-test phases. For the CG, pre-test scores ranged from 4 to 8, with a median of about 6.1 and a relatively narrow interquartile range, showing low variability. The CG post-test scores ranged from 4 to 9, with a median near 6.4 and a similar interquartile range, meaning that the variability of the distribution is consistent across both phases for this group. The pre-test scores of the EG ranged from 4 to 8, with a median around 6.2 and a narrow interquartile range, reflecting low variability. After the intervention, the EG's scores varied between 6 and 10, with a median of approximately 8.5, reflecting a higher median engagement score and an increased interquartile range. This wider spread in post-test scores for the EG indicates a significant increase in engagement levels, with participants reaching higher levels of engagement than in the CG.

The Shapiro-Wilk test further confirmed that the data were normally distributed, as all groups in the pre-test and post-test phases had p-values above 0.05, as shown in Table 7 below; the data therefore did not deviate seriously from a normal distribution. Additionally, the equality of pre-test and post-test score variances between the CG and EG was verified by Levene's test. These findings confirm the assumption of homogeneity of variances, permitting parametric ANCOVA analyses of post-test scores with pre-test scores as a covariate.

Engagement levels were assessed through pre-test and post-test surveys, and the ANCOVA of the post-test engagement scores is shown in Table 8. The analysis controlled for pre-test scores as a covariate to determine the impact of the intervention on post-test engagement levels. The main effect of the group factor (CG versus EG) is statistically significant (F = 121.55, p < 0.001); that is, there was a statistically significant difference in engagement scores between the AI-enhanced flipped classroom model and the traditional flipped classroom model. The partial eta-squared (η²) of the group effect is 0.510, corresponding to a very large effect size. This suggests that a large amount of the variation in engagement scores is accounted for by the difference between the two instructional models.

The F-value of the pre-test score covariate is 0.86, with a p-value of 0.355; hence, pre-test scores do not significantly influence post-test engagement outcomes. The partial eta-squared for this covariate is 0.007, an extremely small effect size. This means that students' engagement level before the intervention does not affect later performance, which makes the intervention even more important. The error term, representing the unexplained variance in this model, has a sum of squares of 112.91 on 117 degrees of freedom; accordingly, the mean square error (MS) is 0.97, reflecting relatively low unexplained variance in engagement scores beyond the group effect and the covariate.
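For transparency, the partial eta-squared values reported for the engagement ANCOVA can be reproduced from the sums of squares in Table 8 with the usual formula, SS_effect / (SS_effect + SS_error). The snippet below is only a worked check of that formula using the reported values, not part of the authors' analysis code.

```python
# Partial eta-squared from the Table 8 sums of squares.
ss_group, ss_covariate, ss_error = 117.30, 0.83, 112.91

eta_group = ss_group / (ss_group + ss_error)              # group factor
eta_covariate = ss_covariate / (ss_covariate + ss_error)  # pre-test covariate

print(f"partial eta^2 (group)     = {eta_group:.3f}")     # ~0.510
print(f"partial eta^2 (covariate) = {eta_covariate:.3f}") # ~0.007
```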


Fig. 4. Learning gains by percentage improvement for CG and EG.

All in all, the results indicate the effectiveness of the AI-enhanced flipped classroom model for enhancing engagement. The large effect size associated with the group factor shows that the instructional model played a major role in shaping the post-test engagement outcome, irrespective of initial engagement levels. This underlines the potential of AI-based adaptive learning systems to foster higher engagement in educational settings.

Table 6
Summary of the pre-test and post-test results.

            Pre-test                      Post-test
Group       Mean   SD    Min   Max        Mean   SD    Min   Max
CG a        6.1    1.1   3     9          6.4    1.2   4     8
EG a        6.2    1.0   4     8          8.5    0.9   7     10

a N = 60.

Fig. 5. a) The mean scores of the CG and EG in the pre-test and post-test phases. b) A visualization of score distributions for both the CG and EG in the pre-test and post-test.

Table 7
Shapiro-Wilk and Levene's test results for normality and homogeneity of variances.

            Pre-test                                      Post-test
Group       Shapiro-Wilk (W, p)    Levene (F, p)          Shapiro-Wilk (W, p)    Levene (F, p)
CG a        0.986, 0.738 b         0.601, 0.440 b         0.973, 0.356 b         0.135, 0.714 b
EG a        0.981, 0.611 b                                0.978, 0.417 b

a N = 60. b 2-tailed.

Table 9 provides a detailed breakdown of student engagement levels in the CG and EG, categorized into high engagement (scores >8/10) and low engagement (scores <5/10) during the pre-test and post-test phases. This classification allows a clearer understanding of the shifts in engagement brought about by the intervention. The table below shows the pre- and post-test distribution of students' engagement and disengagement under the AI-enhanced flipped classroom model.
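The scoring and categorization behind this breakdown can be illustrated with a short sketch. It assumes a hypothetical survey export with columns item_1 … item_50 (Likert items scored 1–10), group ('CG'/'EG'), and phase ('pre'/'post'); the 50 items are averaged into a single score, and the >8/10 and <5/10 thresholds are applied to obtain a Table 9-style count. This is an illustration of the procedure described in the methods, not the authors' scoring code.

```python
import pandas as pd

def engagement_breakdown(survey: pd.DataFrame) -> pd.DataFrame:
    """Average the 50 Likert items into a 1-10 score and count students
    scoring above 8/10 and below 5/10 per group and phase.
    """
    items = [f"item_{i}" for i in range(1, 51)]           # hypothetical column names
    scored = survey.assign(score=survey[items].mean(axis=1))
    scored["high"] = scored["score"] > 8                  # high engagement (>8/10)
    scored["low"] = scored["score"] < 5                   # low engagement (<5/10)
    return scored.groupby(["group", "phase"])[["high", "low"]].sum()

# Usage (hypothetical): engagement_breakdown(survey_df) yields counts comparable to Table 9.
```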


Table 8
ANCOVA results for post-test scores.

                        SS       df    MS       F-value   p-value     Partial η²
Group (CG vs EG)        117.30   1     117.30   121.55    <0.001 a    0.510
Pre-test (Covariate)    0.83     1     0.83     0.86      0.355 a     0.007
Error                   112.91   117   0.97

a 2-tailed.

Table 9
Engagement level distribution in pre-test and post-test phases for CG and EG.

            Pre-test                            Post-test
Group       Students >8/10   Students <5/10     Students >8/10   Students <5/10
CG a        5                35                 18               42
EG a        8                30                 42               2

a N = 60.

These figures are important for assessing the effectiveness of the intervention on students' participation and interaction with the learning materials.

Fig. 6 shows the distribution of students who reported scores higher than 8/10 and lower than 5/10 in both the CG and EG in the pre-test and post-test phases. Each bar is divided into two segments: the black portion represents students with scores above 8/10, while the white portion represents students with scores below 5/10. The chart shows that, in the pre-test phase, the majority of students in both groups reported scores below 5/10, while only a few scored above 8/10. In the post-test phase, by contrast, there is a large shift in the EG, with many students scoring above 8/10 (42) and only a few below 5/10 (2). The CG showed only a more modest improvement in the post-test phase, with 18 students scoring above 8/10 and 42 below 5/10. The chart thus illustrates the dramatic improvement in the EG due to the intervention, compared with the limited progress observed in the CG.

4.2.2. Engagement during at-home learning

The EG's self-study motivation was further captured from the logs of the AI system, which recorded metrics on task completion rates, exercise time, and interaction frequencies. Results showed that, on average, 85 % of the students in this group finished all the tasks, spent 2.8 h per week on exercises, and interacted with the AI system 15 times per week. These metrics point to a substantial amount of sustained motivation and on-task engagement, which can be attributed to the personalized support and adaptive feedback provided by the AI.

4.2.3. Classroom observation data

Classroom observation also yielded qualitative and quantitative data on students' in-class behaviours and interactions, such as frequency of participation, peer interaction, and student-instructor interaction during classes. The engagement metrics were derived through structured observation protocols using a standardized rubric to classify behaviours.

In the CG, engagement was found to be moderate, typified by spurts of participation when students contributed or asked for help. In the EG, engagement was high, with consistent involvement in collaborative and independent learning activities.

Interaction rates, covering both peer exchanges and student-instructor exchanges, were defined as the number of meaningful exchanges per hour of instruction. These exchanges included problem-solving discussions, collaborative activities, and substantive feedback-seeking behaviours, capturing the depth and quality of interactions. This approach ensured that engagement levels were interpreted consistently and validated the CG versus EG comparisons summarized in Table 10.

Table 10
Classroom observation data on engagement metrics for CG and EG.

Engagement metric                       CG          EG
Survey engagement score                 Moderate    High
Peer interaction rate                   3.1/hour    5.4/hour
Student-instructor interaction rate     2.3/hour    4.8/hour

In the CG, participation frequency was observed to be moderate: although students took part in lesson activities, they rarely discussed or shared their work. Participation frequency in the EG was rated high, as students took part in exercises, discussions, and problem-solving activities continuously throughout the class sessions; the personalized support and real-time feedback from the AI system likely provided sufficient motivation for this active participation.

The CG demonstrated a peer interaction rate of 3.1/hour, indicative of some effective collaboration, although it was largely limited to low-level activities. In contrast, the EG, which used the AI-enhanced learning systems, demonstrated a markedly higher peer interaction rate of 5.4/hour, suggesting that these systems fostered collaborative learning and problem-solving among students and a more interactive classroom environment.

Fig. 6. Distribution of engagement levels (>8/10 and < 5/10) for CG and EG.
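The at-home engagement metrics reported in Section 4.2.2 (completion rate, weekly exercise time, weekly interactions) can be derived from raw system logs with an aggregation of the following kind. The log schema (student_id, week, minutes, completed) is hypothetical and does not reflect the actual export format of the adaptive platform; the sketch only illustrates the aggregation step.

```python
import pandas as pd

def weekly_engagement_metrics(logs: pd.DataFrame) -> pd.DataFrame:
    """Aggregate raw AI-system log events into per-student self-study metrics.

    `logs` is assumed to hold one row per logged event with hypothetical
    columns: student_id, week, minutes (time on task), completed (bool for
    task-completion events).
    """
    per_week = logs.groupby(["student_id", "week"]).agg(
        interactions=("student_id", "size"),            # hits on the AI system per week
        hours=("minutes", lambda m: m.sum() / 60),       # exercise time per week
        completion_rate=("completed", "mean"),           # share of tasks completed
    )
    # Average the weekly figures per student over the 13-week course.
    return per_week.groupby("student_id").mean()
```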


This also means that, whereas some students did seek guidance, instructor engagement was limited in the CG. In the EG, the student-instructor interaction rate reached 4.8/hour, showing that students were proactive in seeking assistance and feedback from their instructors. The AI system's personalized recommendations appear to increase the likelihood of students reaching out to instructors more frequently in order to improve their learning.

Fig. 7 below presents the interaction rates with peers and instructors in both groups, showing a marked increase in the EG. The improvement in the quantity of interaction testifies to the collaborative, supportive atmosphere effectively promoted by the AI-enhanced flipped classroom model.

The higher rates of peer and student-instructor interaction in the EG reflect a more engaging and collaborative learning environment fostered by the AI-driven flipped classroom model, which can give immediate feedback and tailor support to the needs of the students. These findings suggest that students are more engaged, as shown by the higher frequency of interaction with peers and instructors compared with students in the CG.

Overall, students in the EG were much more engaged in in-class activities, in both peer-to-peer and student-instructor interactions. For outside-class activities, the real-time adaptations of the AI system identified each student's learning gaps and provided specific feedback. The adaptive system thus allowed students to focus their in-class efforts on solving problems directly relevant to their individual learning needs, improving classroom participation.

4.3. Motivation levels

Motivation is a critical determinant in education, and maintaining it in programming courses is a crucial issue, as students must engage with abstract concepts and complex problem-solving tasks that require continuous effort and persistence. Motivation levels were measured through surveys, classroom observation, and, for the EG, AI system logs. A pre-test motivation survey at the beginning of the study provided baseline data, enabling comparison with post-test results following the intervention.

In the motivation survey, students were asked to rate their motivational level from 1 to 10. Scores between 1 and 4 were classified as low motivation, signalling that little effort or interest had been invested; scores between 5 and 7 indicated moderate motivation, pointing to occasional rather than consistent effort; and scores from 8 to 10 indicated high motivation, reflecting consistent enthusiasm and substantial effort. This classification follows earlier studies of motivation measurement and supports a more thorough analysis of motivation in both the CG and EG.

4.3.1. Survey-based motivation scores

Motivation levels were measured with both pre-test and post-test questionnaires to indicate the effect of the intervention. The pre-test scores provided baseline data for both the CG and EG, while the post-test scores showed the variation in motivation due to the intervention. Descriptive statistics, that is, means and standard deviations together with minimum and maximum values for the motivation scores, are shown in Table 11 below.

Fig. 8a shows the mean motivation scores of the CG and EG in the pre-test and post-test phases (see Table 12). The CG improved slightly in the post-test phase, with the average score increasing from 6.2 in the pre-test to 6.8 in the post-test. The standard deviations for the CG were 1.3 in the pre-test and 1.5 in the post-test, indicating consistent variability across the two phases. In contrast, the EG showed a more substantial improvement in the mean score, from 6.3 in the pre-test to 8.9 in the post-test. The standard deviation of the EG decreased slightly from 1.2 in the pre-test to 1.0 in the post-test, showing less dispersion as students attained higher motivation levels. The large gap between the results of the two groups underlines the effectiveness of the AI-enhanced flipped classroom model in driving higher motivation.

Table 11
Summary of the pre-test and post-test results.

            Pre-test                      Post-test
Group       Mean   SD    Min   Max        Mean   SD    Min   Max
CG a        6.2    1.3   4     8          6.8    1.1   5     8
EG a        6.3    1.2   4     9          8.9    1.0   7     10

a N = 60.

Fig. 7. Peer and student-instructor interaction rates for CG and EG.
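The exchanges-per-hour metric behind Table 10 and Fig. 7 can be computed from structured observation tallies as sketched below. The session schema (group, duration_hours, peer_exchanges, instructor_exchanges) is hypothetical; the sketch simply shows how tallies of meaningful exchanges are normalized by observed instruction time, and is not the authors' observation-processing code.

```python
import pandas as pd

def interaction_rates(observations: pd.DataFrame) -> pd.DataFrame:
    """Convert per-session observation tallies into exchanges-per-hour rates.

    `observations` is assumed to hold one row per observed session with
    hypothetical columns: group, duration_hours, peer_exchanges,
    instructor_exchanges (counts of meaningful exchanges in that session).
    """
    totals = observations.groupby("group")[
        ["duration_hours", "peer_exchanges", "instructor_exchanges"]
    ].sum()
    totals["peer_rate_per_hour"] = totals["peer_exchanges"] / totals["duration_hours"]
    totals["instructor_rate_per_hour"] = (
        totals["instructor_exchanges"] / totals["duration_hours"]
    )
    return totals[["peer_rate_per_hour", "instructor_rate_per_hour"]]
```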


Table 12
Shapiro-Wilk and Levene's test results for normality and homogeneity of variances.

            Pre-test                                      Post-test
Group       Shapiro-Wilk (W, p)    Levene (F, p)          Shapiro-Wilk (W, p)    Levene (F, p)
CG a        0.980, 0.441 b         0.474, 0.492 b         0.978, 0.353 b         0.126, 0.724 b
EG a        0.981, 0.488 b                                0.987, 0.766 b

a N = 60. b 2-tailed.

Fig. 8. a) The mean scores of the CG and EG in the pre-test and post-test phases. b) A visualization of score distributions for both the CG and EG in the pre-test and post-test.

Fig. 8b shows the distribution of motivation scores in the two groups at pre- and post-test as a boxplot. The CG's pre-test scores ranged between 4 and 8, with a median of approximately 6.2 and a small interquartile range, reflecting moderate variability. At post-test, the CG's results ranged from 5 to 8 with a median of about 6.8, indicating a slightly right-shifted distribution with similar dispersion. The EG's pre-test results fell between 4 and 9, with an approximate median of 6.3 and a very small interquartile range, indicating coherent performance. The EG's post-test scores ranged from 7 to 10, with a median of around 8.9 and a somewhat larger interquartile range, reflecting the higher median together with greater variation in performance. The dramatic gain in the motivation of the EG is visible in the boxplot as a decided shift of the entire distribution toward the top of the scale relative to the CG.

The Shapiro-Wilk test confirmed that the motivation scores did not deviate from normality, since all p-values in both the pre-test and post-test phases were above 0.05 for all groups. Additionally, Levene's test for equality of variances was conducted to verify the homogeneity of variances between the groups; the results demonstrated no significant differences in variances for either the pre-test or post-test scores across the CG and EG. These findings validate the assumptions required for parametric tests, such as ANCOVA, to compare post-test motivation scores while accounting for pre-test scores as a covariate.

Motivation was measured with questionnaires both before and after the intervention, and the results are summarized in Table 13. In the ANCOVA analysis, pre-test scores were used as a covariate to adjust for pre-existing differences between the groups in order to evaluate the effect of the intervention on post-test motivation levels.

Table 13
ANCOVA results for post-test scores.

                        SS       df    MS       F-value   p-value     Partial η²
Group (CG vs EG)        132.52   1     132.52   121.98    <0.001 a    0.509
Pre-test (Covariate)    0.76     1     0.76     0.70      0.404 a     0.003
Error                   127.10   117   1.09

a 2-tailed.

Table 13 presents the detailed statistical analysis of post-test motivation scores, controlling for pre-test motivation levels as a covariate. The analysis found a significant main effect of the group factor (control versus experimental): F = 121.98, p < 0.001, meaning that post-test motivation scores significantly favoured the EG over the CG. The partial η² of 0.509 is a large effect size, suggesting that a great amount of the variation in post-test motivation scores is accounted for by the instructional model used.

The covariate, pre-test motivation, has an F-value of 0.70 and a p-value of 0.404, showing that pre-test scores did not contribute much to the post-test results. The partial eta-squared of 0.003 for the covariate is negligible, suggesting that baseline motivation had a negligible influence on differences in post-test motivation scores.

The error term, with a sum of squares (SS) of 127.10 distributed across 117 degrees of freedom (df), results in a mean square error (MS) of 1.09. This relatively low error variance indicates that the model provides a robust explanation of the differences in motivation scores between the groups.

The ANCOVA results show that the AI-enhanced flipped classroom model effectively enhanced students' motivation. The large effect size of the group factor testifies to the impact of the instructional model beyond initial motivation, underlining its potential to ensure higher and more homogeneous motivation among students. Table 14 presents the distribution of motivation in both the CG and EG at the pre-test and post-test phases.
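The covariate-adjusted group means referred to in the ANCOVA results can be obtained from the fitted model by predicting each group's post-test score at the grand mean of the pre-test covariate. The sketch below reuses the model specification from the Section 3.8 sketch and assumes the same hypothetical column names (group, pre, post); it is illustrative rather than the authors' implementation.

```python
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_posttest_means(df: pd.DataFrame) -> pd.Series:
    """Covariate-adjusted post-test means per group: model predictions for
    each group with the pre-test covariate held at its overall mean.
    """
    model = smf.ols("post ~ pre + C(group)", data=df).fit()
    grid = pd.DataFrame({
        "group": sorted(df["group"].unique()),
        "pre": df["pre"].mean(),          # hold the covariate at its grand mean
    })
    return pd.Series(model.predict(grid).values, index=grid["group"])
```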


Table 14
Motivation distribution in pre-test and post-test phases for CG and EG.

            Pre-test                            Post-test
Group       Students >8/10   Students <5/10     Students >8/10   Students <5/10
CG a        8                30                 18               35
EG a        10               28                 42               3

a N = 60.

The categories in Table 14 classify students into two broad groups, above 8 out of 10 (high motivation) and below 5 out of 10 (low motivation), based on their reported motivation scores. This categorization allows a clear comparison of the shifts in motivation levels resulting from the intervention. By comparing the distribution of pre-test and post-test results for each group, the table indicates the impact of the AI-enhanced flipped classroom model on improving motivation and lessening disengagement among students.

Fig. 9 compares students' motivation levels in the CG and EG for both the pre- and post-test phases. Black indicates the number of students scoring above 8/10, while white indicates those scoring below 5/10.

During the pre-test, the majority of students in both groups reported low levels of motivation, with 30 participants in the CG scoring below 5 out of 10 and, similarly, 28 in the EG. Few scored above 8/10: 8 students in the CG and 10 in the EG. That is, both groups appear similar in baseline motivation before entering the intervention.

At post-test, the EG had improved significantly, with 42 students scoring above 8/10 and only 3 students scoring below 5/10. The changes in the CG were less pronounced: 18 students reported scores above 8/10, while 35 students remained in the low category. These results reflect the efficiency of the intervention in raising motivation in the EG, compared with the modest progress of the CG. The chart highlights the marked contrast between the two groups, signalling that the intervention had a remarkable effect on students in the EG.

4.3.2. Classroom observation data

Motivational behaviours were also observed during classroom sessions. Observations focused on metrics such as proactive student behaviour, completion rates of in-class activities, and frequency of self-initiated questions. Table 15 summarizes the differences between the CG and EG.

In the CG, proactive behaviour, that is, students volunteering for tasks or starting a discussion, was rated as moderate. Some students were involved, but the overall classroom dynamic did not reach the level observed in the EG. In the EG, proactive behaviour was rated as high: students frequently volunteered for coding challenges, were involved in group activities, and were generally enthusiastic in class.

During in-class activities, CG students completed 70 % of the exercises. Although this indicates that the majority participated, it is evident that a quarter to a third of the class did not complete all exercises. The corresponding figure in the EG was 90 %, showing higher motivation and greater commitment to accomplishing the set tasks. This can be attributed in part to the adaptive support provided by the AI system, which gave students just the right level of support to succeed in those activities.

On average, CG participants asked 2.5 self-initiated questions per hour, indicating a moderately curious and motivated engagement with the subject matter. By comparison, the EG averaged 5.2 questions per hour, reflecting a higher motivation to engage. Students in the latter group were more active in asking for clarification and support, owing to the personalized learning approach and real-time feedback.

Table 15
Classroom observation data on motivation metrics for CG and EG.

Motivation metric                        CG          EG
Proactive student behaviour              Moderate    High
Completion rate of in-class activities   70 %        90 %
Frequency of self-initiated questions    2.5/hour    5.2/hour

4.4. System data analysis

The logs generated by the AI system for the EG are analyzed below to draw inferences about student behavioural patterns and about the frequency and nature of the adaptive interventions and the successive modifications of personalized learning pathways in this study. These logs provide an objectively rich source of data illustrating how students used the adaptive learning modes of the AI and the adaptive modifications occurring in each case. The analysis focused on key metrics representing the extent and types of interactions with the AI, as well as the frequency of personalized interventions.

Fig. 9. Distribution of motivation levels (>8/10 and < 5/10) for CG and EG.
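A summary in the format of Table 16 (mean, minimum, and maximum of the per-student interaction metrics) can be produced with a brief aggregation like the one below. The column names are hypothetical stand-ins for the per-student metrics derived from the system logs; the sketch illustrates the summarization step only.

```python
import pandas as pd

def interaction_summary(metrics: pd.DataFrame) -> pd.DataFrame:
    """Summarize per-student AI-interaction metrics into mean/min/max rows.

    `metrics` is assumed to hold one row per EG student with hypothetical
    numeric columns: weekly_interactions, adaptive_interventions,
    customized_exercises.
    """
    summary = metrics[["weekly_interactions",
                       "adaptive_interventions",
                       "customized_exercises"]].agg(["mean", "min", "max"])
    return summary.T.round(1)   # rows: metric; columns: mean / min / max
```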


Table 16 summarizes the key metrics from the AI system logs on the frequency and type of interactions students had with these tools over the course. These metrics provide insight into how the systems adapted the learning experience for individual students, making sure that each student received the right amount of support to optimize their learning outcomes (see Table 17).

The data show that the students of the EG worked with these systems regularly; adaptive interventions and customized exercises therefore occurred at a high frequency. This means that the two platforms, Knewton and ALEKS, were actively providing personalized feedback to the students, letting them progress at their desired pace while immediately addressing their learning needs.

These findings are complemented by data on the frequency and nature of interactions, reflecting students' active use of the AI system and of the customized adaptive support during study. Weekly interactions with the system ranged between 10 and 25, depending on student engagement and the nature of the interventions. On average, the system provided 7 adaptive interventions per student, which underlines its responsiveness to individual learning needs.

In addition, the AI system generated an average of 8 personalized exercises per student. The range shows that the system provided tailored learning materials that took into account the advancement or reinforcement each student required. These interventions were designed to optimize the learning process through immediate feedback, precisely targeted exercises, and changes in task difficulty matched to each student's pace of learning.

Table 16
AI system data on student interaction metrics in the EG.

Metric                          Mean            Min             Max
Weekly interactions             15              10              25
Adaptive interventions          7 per student   3 per student   12 per student
Customized exercises provided   8 per student   5 per student   10 per student

The data show that the AI platform indeed supported students' learning through immediate feedback and resources. The adaptive system facilitated real-time interventions, contributed to greater engagement, and promoted individualized learning. By tracking interactions and interventions, the data provide evidence of the effectiveness of AI-based adaptive learning within a flipped classroom model and demonstrate that AI can support traditional teaching in enhancing student learning outcomes.

4.5. Qualitative analysis

In this study, thematic analysis was applied as the qualitative approach to explore the experiences of students and instructors in both the CG and EG. The emphasis was on exploring how the AI-enhanced flipped classroom model influences changes in students' learning behaviour, motivation, and classroom dynamics. Data collection relied on classroom observations, open-ended survey responses, and AI system logs. The analyses were conducted with due rigour to ensure that the identification of key themes and patterns was appropriate, reliable, and consistent.

Data familiarization was the first step of the analysis, in which the data collected from the various sources were read repeatedly to comprehend students' experiences. The data were then coded in detail; the identified codes included "engagement," "real-time feedback," "peer collaboration," and "autonomy." These themes are considered important for understanding the effect of the AI-enhanced flipped classroom model. Inductive coding was used to ensure consistency and transparency. First, codes such as "autonomy," "peer collaboration," and "real-time feedback" captured how students described their responses to the interactions with the AI system, to their peers, and to adaptive learning. After this initial coding phase, the researchers organized the codes into larger themes. The most pervasive themes were AI-driven support, student engagement and autonomy, and classroom dynamics. These themes represent critical elements of the learning process and were corroborated through an iterative process of constant checking against the data.

The frequency of key statements and the contexts in which they occurred were also recorded for further clarity (Table 17), and the themes are discussed below in order of their salience.

Table 17
Excerpts from qualitative data supporting key themes in AI-enhanced flipped classroom.

Theme: AI-driven Support
  "The AI felt like having a tutor with me all the time, giving me personalized help whenever I needed it. It kept me engaged and motivated." (Frequently mentioned by EG students)
  "Without personalized feedback, I often didn't know how to proceed when I got stuck." (Mentioned by CG students)
Theme: Student Engagement and Autonomy
  "I felt free to learn at my own pace, which helped me grasp the programming concepts better. I could revisit difficult topics whenever I needed to." (Common in EG responses)
  "I didn't know if I was doing things correctly because there was no immediate feedback. This made me hesitant to tackle harder problems." (Common in CG responses)
Theme: Classroom Dynamics
  "In the EG, students were much more likely to ask questions and engage in higher-order tasks. The AI's immediate feedback seemed to empower them to participate more actively." (Observed in classroom interactions in the EG)

The most salient theme was AI-driven support: according to the students in the EG, the real-time personalized feedback given by the AI system kept them on track and allowed them to be confident that their learning was on target. One student stated, "The AI felt like having a tutor with me all the time, giving me personalized help whenever I needed it. It kept me engaged and motivated." This statement was typical of many in the EG, who consistently emphasized the value of immediate support. In contrast, a student in the CG commented, "Without personalized feedback, I often didn't know how to proceed when I got stuck."

The second most discussed theme concerned student engagement and autonomy. Many students in the EG mentioned a feeling of increased autonomy. For instance, one student said, "I felt free to learn at my own pace, which helped me grasp the programming concepts better. I could revisit difficult topics whenever I needed to." This feeling of being in control appeared in several students' reports and became a key driver for them. This contrasts with the frustration reported by several students in the CG over the lack of autonomy and immediate feedback; as one noted, "I didn't know if I was doing things correctly because there was no immediate feedback. This made me hesitant to tackle harder problems."

The last theme, which emerged from the classroom observations, was classroom dynamics. The EG showed greater participation and cooperation in the in-class activities: these students took part in more group discussions, asked for clarification, and worked with peers during problem-solving activities. As a result, students in the EG were found to have more confidence in undertaking challenging tasks, and they often referred to the support they received from the AI system. One observer commented, "In the EG, students were much more likely to ask questions and engage in higher-order tasks. The AI's immediate feedback seemed to empower them to participate more actively."


Students in the CG, by contrast, were less keen to engage in challenging activities or to ask the teacher for help during the lesson, because they did not have individual support when studying alone.

The qualitative findings indicate that, in a flipped classroom model, integrating AI enhances student autonomy and increases engagement by creating more interactivity and, in turn, more collaboration among learners. Real-time feedback and personalized support from the AI system help to form a dynamic learning atmosphere in which students are able, and motivated, to complete more complex tasks.

The classroom dynamics in the EG confirmed this as well, with students working collaboratively and showing more engagement with the material. The qualitative results support the quantitative results and provide further evidence of how effective the integration of AI with the flipped classroom model was. Real-time adaptive feedback provided by the AI system facilitated a more personalized learning experience that greatly influenced students' motivation, engagement, and overall learning outcomes. These insights highlight the potential of AI-enhanced flipped classrooms to transform programming education by offering more tailored, effective learning environments.

5. Discussion

These findings show the transformation that can result from integrating AI-based adaptive learning systems into flipped classrooms in programming education: they demonstrate positive developments in student engagement, motivation, and learning outcomes and offer a promising way to address persistent challenges in programming instruction. Adaptive learning systems enable students to work at a pace that suits them by offering individualized feedback and customized learning paths, making the process inclusive and effective. The discussion below sets the findings in the context of relevant prior research and reflects critically on the outcomes with respect to the research questions.

RQ1 How does the integration of AI-based adaptive learning systems into the flipped classroom model impact learning outcomes and the mastery of complex programming concepts compared to traditional flipped classroom models?

The integration of AI-based adaptive learning effectively enhanced learning outcomes, as revealed by the ANCOVA-backed pre- and post-test analyses. Students in the EG demonstrated a larger increase in concept mastery than those in the CG, as their post-test scores significantly outperformed those of the CG. These findings are supported by literature showing that adaptive learning technologies can satisfy the specific learning needs of students more effectively than traditional methods alone (Heffernan et al., 2016; Holmes et al., 2019). For instance, Xu and Ouyang (2022) showed that AI platforms enhance academic achievement by offering superior learning paths, which is consistent with this paper's results.

The adaptive interventions that the AI system made possible in facilitating individualized learning experiences are significant. Through personalized feedback and real-time adjustments of the learning materials, students were able to work on their weaknesses and build confidence in solving programming challenges. This is in agreement with the results of Fernandes et al. (2023, pp. 1–6), who established that AI systems dynamically scaffold learning to meet the needs of each student.

RQ2 How do students' engagement and motivation in programming education change when exposed to a flipped classroom model enhanced by AI-based adaptive learning systems compared to a traditional flipped classroom model?

The integration of AI-based adaptive systems into the flipped classroom increases students' activity and learning outcomes through real-time feedback and adaptive learning paths. Whereas in traditional flipped classrooms students engage with the material independently, the AI system tailors the content to learners' needs and thus ensures better and more focused preparation. Individualized support helps to address knowledge gaps that prevent students from making progress and makes in-class activities more meaningful and productive. The ability to intervene immediately also further supports the development of student motivation, as reflected in the substantially higher post-test motivation scores seen in the EG.

The integration of AI-based adaptive learning significantly enhanced the motivation and engagement of the students in the EG. The post-test survey revealed higher motivation scores and a larger percentage of highly engaged students (>8/10) in the EG than in the CG. These findings were further supported by the observational data, which evidenced more peer-to-peer and student-instructor interactions in the EG. These findings agree with the work of other researchers; for example, Gligorea et al. (2023) reported that AI-driven adaptive systems enhance engagement through immediate feedback and a personal touch in learning.

The real-time guidance given through the AI system played a significant role in promoting a more interactive and participatory learning environment. According to the collected feedback, students felt more confident and interested in learning with immediate support from the system, as also discussed by Holmes et al. (2019). The results of this study support the contribution of Zha et al. (2019), to the effect that active learning methodologies, complemented with adaptive support, lead to a significant increase in engagement and understanding among students in programming courses.

This study reinforces the dual benefits of joining AI-based adaptive learning to a flipped classroom model. The flipped classroom structure enables students to engage with theoretical content outside the class, while the AI system supports self-study by providing adaptive feedback and personalized recommendations. This synergy addresses one of the most important problems in programming education: how to bridge the gap between theoretical understanding and practical application. Kay et al. (2019) further reported that flipped classrooms combined with technology facilitate deeper learning and student autonomy.

These findings have implications for practice across diverse educational settings. AI-enhanced flipped classrooms make it easier for instructors of large classes to shift from being lecturers to serving as learning facilitators. This is consistent with research by Maher et al. (2015) on technology-enhanced learning in the pursuit of efficiency and equity. The challenges such systems face in implementation, according to Er-Rafyg et al. (2024, pp. 329–342), concern equal access to technology and teacher training.

Moreover, the flipped classroom model integrated with the AI system not only accommodated personalized learning paths outside class but also played a key role in reinforcing engagement within the classroom environment. By equipping instructors with timely data on student performance and difficulties, the AI pinpointed aspects that needed focused work during class, resulting in sharper and more effective peer interaction. This increases the quality of peer-to-peer collaboration, with students engaged in working out problems that relate directly to them.

5.1. Addressing the novelty and contribution of the study

While existing studies acknowledge that AI-based adaptive learning systems and the flipped classroom model each offer certain advantages, this paper contributes to the field by providing new insights into the synergistic effects of integrating the two approaches. AI-driven learning systems and the flipped classroom model have largely been studied separately, with limited research examining their integration, particularly in programming education.


integration, particularly in programming education. Their integration should be considered as a way to improve learning outcomes, as it not only enhances personalized learning through AI but also facilitates the self-regulated, student-centred approach promoted by the flipped classroom model. Thus, integrating AI within the flipped classroom model generates effects beyond the individual contributions of each approach to engagement and motivation, through dynamic, real-time feedback that may affect learning outcomes. Building on the strengths of both methodologies, this study provides new evidence on how AI-supported flipped classrooms can tailor learning experiences to individual needs by providing scaffolding for diverse learners. As a result, their integration can contribute to the development of problem-solving skills, foster persistence in completing complex tasks, and deepen understanding of programming concepts.

The integration of AI in the pre-lesson phase, particularly in self-paced learning, significantly improves student engagement and motivation during the in-lesson phase. While conventional flipped classroom models require students to engage with learning materials before lessons, AI integration guarantees personalization, where each student can identify and address individual knowledge gaps before encountering complex concepts in the classroom, which enhances comprehension and learning effectiveness. This personalized pre-class preparation ensures that students approach classroom activities with a higher level of confidence and a better understanding of the material, leading to more productive in-class engagement and participation. Previous studies (Xu & Ouyang, 2022; Zha et al., 2019) support this notion, demonstrating that AI's tailored interventions help scaffold learning experiences, resulting in increased engagement and participation in the classroom.

The study not only confirms the theoretical foundations of the flipped classroom model and AI-driven adaptive learning systems but also demonstrates how these two distinct approaches can be effectively integrated to enhance personalized learning experiences. Correspondingly, within the frameworks of Self-Determination Theory (Ryan & Deci, 2000) and Constructivism (Vygotsky, 1978), motivation and involvement arise through the interplay of autonomy, competence and personalized learning. Viewed through these frameworks, integrating AI into the flipped classroom model shows that adaptive feedback and individualized learning paths significantly improve students' intrinsic motivation to engage with and learn complex subjects such as programming.

5.2. Expanding on the role of AI in flipped classrooms

In a flipped classroom setting, as highlighted, AI provides more than simply adaptive feedback; it offers personalized learning pathways tailored to the specific needs of individual learners, facilitating effective skill development. Furthermore, this study addresses a critical gap in research by measuring both motivation and engagement, along with the effects on long-term skill acquisition in programming. While existing studies examine separate aspects of either AI or flipped classrooms, little research has examined how their combined strengths can lead to more robust learning outcomes, particularly in technical domains that require deep understanding and a strategic problem-solving approach.

This study further addresses the challenge of ensuring equity in learning outcomes, especially for students who may struggle with foundational concepts. The role of AI is to ensure that interventions are given in real time, so that students get help precisely when they need it. This is particularly helpful in programming education, where concepts build on one another sequentially and students may experience varying levels of understanding. The AI system tracks how students interact with the content and modifies that content to provide an environment that sustains student engagement and persistence through challenging portions.

5.3. Addressing potential concerns and new challenges

A deeper integration of the AI system into flipped classrooms can also raise some concerns. Among them, over-reliance on AI for immediate feedback may unintentionally decrease the amount of student-teacher interaction. This can, however, be mitigated: because the AI system supports students' autonomous learning, instructors can become more involved in organizing collaborative and problem-solving activities during class time.

Moreover, while AI systems can enhance student engagement and learning results, there may be problems related to the accessibility of such systems in educational contexts with poor technological resources. In addition, teachers should be professionally trained in how to use AI tools effectively in the classroom. These challenges highlight the need for targeted policy interventions and ongoing teacher professional development to be put in place if AI-enhanced flipped classrooms are to be integrated successfully.

5.4. The role of AI-based feedback in fostering collaboration and critical thinking

AI-based feedback provided effective scaffolding for student collaboration in the EG. Adaptive learning systems such as Knewton and ALEKS can provide real-time, customized feedback; hence, students can collaborate in trying to resolve an issue. When some tasks within the programming assignments turned out to be difficult for students, the AI system offered hints or reduced the difficulty level of the task, which in turn triggered discussion among students about the most appropriate way to solve a particular problem.

Rather than prompting purely individual work, the feedback usually led to multiple discussions among the students in which they compared solutions or probed alternative approaches. For example, when students encountered problems debugging a piece of code, the AI feedback typically offered a hint, which sparked peer discussions to clarify what went wrong, explore the logic behind it, and develop the best possible solution. In this way, the AI feedback served not only as an individual learning tool but also as a basis for discussion and group problem-solving.

Critical thinking was further enhanced as the AI feedback encouraged students to discuss the reasons behind their mistakes and the rationale for the feedback they received. This approach encourages deep thinking about concepts, rather than focusing solely on finding solutions, thereby promoting a deeper understanding of the material.

Upon receiving feedback and discussing it among peers, students consolidated their understanding of the programming concepts and of strategies for solving problems as a group. In fact, discussions among students were an integral part of the learning process, contributing to both the cognitive and the social engagement that fostered collaborative learning.

The adaptive feedback of the AI system adjusted the difficulty of the tasks in real time and enabled students to work progressively on complex problems as a team. In solving these tasks together, they could combine their strengths, share strategies and develop a deeper understanding of the programming concepts being taught. This active learning approach, supported through personalized feedback, fostered more interactive engagement with the learning material and resulted in collaborative learning.

The basic setup of this study aligns with the findings of Gligorea et al. (2023), who emphasize that an adaptive learning system can increase collaboration and critical thinking by providing feedback that prompts reflection on students' learning processes and discussion, which in turn fosters deeper engagement and more effective learning outcomes.
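To make the mechanism described in this subsection concrete, the short Python sketch below illustrates one way an adaptive platform might decide, from a student's recent attempts, whether to keep a task's difficulty, lower it, or surface a hint. It is purely illustrative: the Attempt record, the thresholds and the adapt function are hypothetical and do not describe the authors' system or the internals of platforms such as Knewton or ALEKS.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Attempt:
    task_id: str
    correct: bool

def adapt(recent: List[Attempt], difficulty: int,
          window: int = 5, low: float = 0.4, high: float = 0.8) -> Tuple[int, bool]:
    """Return (new_difficulty, show_hint) based on the last `window` attempts."""
    if len(recent) < window:
        return difficulty, False            # not enough evidence yet
    success_rate = sum(a.correct for a in recent[-window:]) / window
    if success_rate < low:
        # Struggling: ease the task one step and surface a hint.
        return max(1, difficulty - 1), True
    if success_rate > high:
        return difficulty + 1, False        # coasting: raise the challenge
    return difficulty, False                # productive range: leave the task as it is

# Example: one success in the last five attempts on a loops exercise
history = [Attempt("loops-3", c) for c in (False, False, True, False, False)]
print(adapt(history, difficulty=3))         # -> (2, True)
```

The point at which such a rule eases a task or surfaces a hint corresponds to the moments, described above, when peer discussion about a failing solution tended to start; keeping the rule this simple also makes its behaviour easy to explain to students, which supports the kind of reflection on feedback discussed in this subsection.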

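Before turning to the practical implications, the learning-outcome comparison that opened this discussion (an ANCOVA of post-test scores with the pre-test as covariate) can also be sketched in a few lines of Python. The snippet is an illustration of the method only: the file name scores.csv, the column names group, pre_score and post_score, and the use of pandas and statsmodels are assumptions, not the authors' actual analysis pipeline.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical data: one row per student with group ("EG"/"CG"),
# pre-test score and post-test score.
df = pd.read_csv("scores.csv")

# Post-test score explained by group membership while controlling for the pre-test.
model = ols("post_score ~ C(group) + pre_score", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # F-test for the covariate-adjusted group effect

# Covariate-adjusted group means, evaluated at the overall mean pre-test score.
grid = pd.DataFrame({"group": ["EG", "CG"], "pre_score": df["pre_score"].mean()})
print(grid.assign(adjusted_post=model.predict(grid)))
```

A significant group term in such a model is the kind of evidence behind the statement that the EG's post-test scores were significantly higher once pre-test differences are taken into account.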

5.5. Implications for educational practice

From a practical point of view, this study provides useful leads that educators can use when integrating technology into their teaching. For instance, it suggests that AI-enhanced flipped classrooms could be particularly helpful with large classes or with diverse learning needs among students. The model is scalable because personalized learning pathways can be created without overloading the instructor with one-on-one interventions. Moreover, the current study also points out that integrating AI-driven adaptive learning with the flipped classroom model can lead to greater improvement in learning outcomes than applying either approach independently.

The study therefore offers helpful guidance for policymakers and educators interested in improving educational equity and student engagement by examining the synergy between AI and flipped classrooms. The findings indicate that the AI-enhanced flipped classroom can offer a flexible and effective approach to personalized learning, which may be especially impactful in large, diverse educational settings. However, the successful implementation of this model requires addressing technology access, equal accessibility for students, data protection and teacher training to achieve its full potential. These issues need careful handling to avoid possible biases in the learning paths; meanwhile, instructors have to be well trained so that they can use such systems effectively. Furthermore, the AI system should be continuously monitored and adjusted to keep pace with students' needs and the curriculum.

The research provides compelling evidence for the effectiveness of integrating AI-based adaptive learning systems into a flipped classroom, especially in programming education. The findings highlight the dual benefits of this integration: enhanced engagement and motivation, as well as significant improvements in learning outcomes. These findings confirm not only the efficacy of AI and flipped classrooms independently but also how their combination provides more personalized, inclusive, and effective learning. The new insights from this study provide a useful foundation for further research into the integration of AI in educational settings.

5.6. Critical perspective on AI-enhanced flipped classrooms in programming education

While the use of AI-based learning systems in flipped classrooms has shown promise in enhancing learners' motivation, engagement, and learning outcomes, it is also important to critically review the challenges and limitations of this approach. AI in education is still a developing field, and even though it has high potential to revolutionize learning, there are some crucial issues that need to be addressed when applying it in different learning contexts.

5.6.1. Over-reliance on AI feedback
One of the shortcomings of AI-based flipped classrooms is excessive reliance on machine feedback. Real-time dynamic feedback is of great utility in providing instant support and guiding learners through challenging assignments, yet it also has the potential to limit learners' engagement with human instructors. With more and more instructional work carried out by AI systems, there is an increasing concern that learners will miss enriching face-to-face interaction with instructors, which is crucial for developing deeper understanding and logical reasoning (Singh et al., 2022; Stevens et al., 2021). The role of the human instructor remains essential not only in conveying knowledge but also in providing emotional support, personalized mentorship, and adaptive guidance between educators and students.

Moreover, although AI feedback is personalized, it remains inherently limited by the data it is trained on. Adaptive systems review and respond to student performance through predefined algorithms and existing datasets. The algorithms used in such systems, however, do not capture everything in individual learning processes, especially when learning is a non-linear process (Ali et al., 2019). As such, overreliance on AI feedback is likely to turn students into passive recipients of knowledge rather than active participants in their learning process.

5.6.2. Potential for inequality in access
As revolutionary as AI-based adaptive learning systems can be, their potential is contingent on having accessible, stable internet connectivity, digital devices, and a supporting technological infrastructure in place. In low-income settings, in particular, students may not always be able to use these tools (Farahani & Ghasemi, 2024). Warschauer (2011) established that digital gaps in accessing technologies can exacerbate existing gaps in education. Unless gaps in technology access are addressed, the potential benefits of AI-based learning will remain inaccessible to a significant percentage of students (Warschauer, 2011).

5.6.3. Teacher training and readiness
The application of AI in classrooms also raises concerns regarding the readiness of educators to effectively integrate these technologies into their teaching practices. The efficacy of AI-assisted flipped classrooms depends on educators' readiness to use such tools in their instructional methods alongside the technologies themselves. Teacher readiness and preparation in using AI tools receive little attention in discussions of AI in classrooms, yet they are a critical factor in the successful implementation and impact of these technologies on student learning outcomes. Educators' beliefs about technology, their comfort in using digital tools, and their instructional approach significantly influence how AI is utilized in classrooms (Dimitriadou & Lanitis, 2023; Ertmer & Ottenbreit-Leftwich, 2010). When educators are not prepared to use AI tools, the tools end up misused or underused, reducing the potential of AI-assisted learning environments.

5.6.4. Ethical considerations and data privacy
AI systems collect large amounts of data on student learning, behaviour, and engagement. Such data, though useful for individualizing instruction to accommodate individual needs and for monitoring student progress, also raise questions regarding data security and protection. The collection of student data needs to be done in strict adherence to principles of privacy to shield students from the exploitation of their personal details (Lacroix, 2019). There is also a concern that AI systems may perpetuate existing gender, race, or socioeconomic biases in the data. Unless such biases are handled adequately, AI systems perpetuate existing educational inequalities (Verma, 2019). Adequate AI design and data protection are crucial to sustaining trust in such systems and ensuring equitable learning outcomes.

6. Limitations and future research

Although the results are promising, there are also some limitations of the study that should be considered. Whereas the sample size is sufficient for a controlled experiment, it may not fully capture the diversity actually present among programming students, especially regarding prior experience and digital literacy. Further research may use larger and more diverse samples to confirm and generalize the findings.

While this study verified the effectiveness of AI-enhanced flipped classrooms, its results are dominated by quantitative aspects such as scores and levels of engagement. Longitudinal intervention effects might also be considered in further studies, to examine whether the benefits of an adaptive learning system persist beyond the duration of the course itself. Moreover, it could be particularly informative for educators and policymakers to see how such systems affect students' attitudes toward programming and their career preparedness.

This is mixed-methods research, adopted to capture quantitative and qualitative data concurrently. On the qualitative side, the research could be extended further in the context of this paper and may include in-depth case studies that probe how students experience and
adapt to AI-enhanced flipped classrooms over time. This would help identify the best ways of implementing these technologies, and further allow for the development of more robust and scalable models through which AI might be integrated into programming education.

7. Conclusion

The integration of AI-based adaptive learning systems with flipped classroom models forms a new pedagogical approach to teaching programming. Indeed, the research has found that integrating the two pedagogies results in increased student learning engagement, motivation, and performance. These findings reinforce the existing literature on the advantages of technology-driven adaptive learning systems in addressing the diverse needs of individual learners, while helping bridge the gap between theory and application.

Experimental results demonstrated that students in the AI-enhanced flipped classroom consistently outperformed their peers in the traditional flipped classroom model. The AI system, by offering personalized feedback and tailored learning pathways, allowed students to progress at their own pace and comfort level through complex programming concepts and thus made learning inclusive and efficient. This finding underlines the potential of adaptive learning to optimize academic performance and improve the acquisition of programming skills, as reflected in the substantial gain in post-test scores among the EG.

Besides, this study documented the significant role of motivation and engagement in educational success. In an interactive and participative learning environment, real-time feedback mechanisms and personalized guidance through the AI system significantly improved learning. This is reflected in the higher degrees of peer-to-peer and instructor interaction and in the significantly improved motivational scores of the EG. These findings suggest that a flipped classroom combined with AI-enhanced learning has an effect not just on overcoming cognitive challenges in learning but also on affective factors such as confidence and enthusiasm, which are believed to be important for mastering programming.

From a practical perspective, these findings provide information for educators and institutions seeking to renew their programming curricula.

The AI-enhanced flipped classroom model offers a significant advantage particularly when operating in large or diverse classrooms, where individualized instruction can be difficult. By automating customized support, AI systems in this model allow educators to focus on facilitating activities that promote collaboration and higher-order learning. The adaptability of the model also caters to a wide variety of learner profiles, from beginners to more advanced students, and encourages scalability, making it applicable in different educational contexts.

However, the implementation of such models faces several hurdles. Firstly, this study highlights some equity issues in relation to the accessibility of the technologies on which AI-driven learning systems depend. It is essential to ensure that all learners, especially those from less developed areas, have access to the necessary digital equipment and a reliable internet connection to be able to use this platform-based approach effectively. Institutional support is crucial in promoting equity within educational systems, through policies that ensure the implementation of appropriate infrastructure and tools for all students.

Another important aspect of this new learning paradigm concerns the role of the educators. The transition from traditional lecturing to facilitation requires significant professional development. Educators have to be trained in understanding and using the data coming from the AI systems to effectively address the emerging needs of students, and in managing student-centred learning environments that foster a collaborative classroom. This involves elaborate training programs and resources that prepare educators for the demands of technology-driven classrooms.

While the findings from this study are promising, they underscore the need for further research and development. The standardization of assessment tools and research across different disciplines and learning environments would enhance the generalizability of these findings. Long-term research into the use of AI-enhanced flipped classrooms regarding retention and real-life application of programming skills could provide further insights into their efficacy. In this way, the integration of AI-based adaptive learning systems into flipped classroom models represents a significant advancement in educational innovation, especially in areas like programming, which require a high degree of individualization and practice. This approach not only enhances academic performance but also fosters a more engaging, inclusive, and supportive learning environment.

By fully addressing the challenges of implementation, the expansion of research will ensure that educators and institutions can fully realize the possibilities of AI-based flipped classrooms to facilitate student success.

CRediT authorship contribution statement

Jozsef Katona: Writing – original draft, Visualization, Data curation, Conceptualization. Klara Ida Katonane Gyonyoru: Writing – review & editing, Validation, Methodology.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

Abeysekera, L., & Dawson, P. (2015). Motivation and cognitive load in the flipped classroom: Definition, rationale, and a call for research. Higher Education Research and Development, 34(1), 1–14. https://doi.org/10.1080/07294360.2014.934336
Akçayır, G., & Akçayır, M. (2018). The flipped classroom: A review of its advantages and challenges. Computers & Education, 126, 334–345. https://doi.org/10.1016/j.compedu.2018.07.021
AlJarrah, A., Thomas, M. K., & Shehab, M. (2018). Investigating temporal access in a flipped classroom: Procrastination persists. International Journal of Educational Technology in Higher Education, 15(1), 1–18. https://doi.org/10.1186/s41239-017-0083-9
Almassri, M. A., & Zaharudin, R. (2023). Effectiveness of flipped classroom pedagogy in programming education: A meta-analysis. International Journal of Instruction, 16(2), 267–290. https://doi.org/10.29333/iji.2023.16216a
Amresh, A., Carberry, A. R., & Femiani, J. (2013). Evaluating the effectiveness of flipped classrooms for teaching CS1. In 2013 IEEE frontiers in education conference (FIE) (pp. 733–735). IEEE.
Bergmann, J. (2012). Flip your classroom: Reach every student in every class every day. International Society for Technology in Education. https://doi.org/10.1109/TLT.2015.2444374
Bishop, J., & Verleger, M. A. (2013). The flipped classroom: A survey of the research. In 2013 ASEE annual conference & exposition (pp. 23–1200). https://doi.org/10.18260/1-2–22585
Bonwell, C., & Eison, J. (1991). Active learning: Creating excitement in the classroom. In ASHE-ERIC higher education report No. 1. Washington, D.C.: The George Washington University.
Brown, A. F. (2018). Implementing the flipped classroom: Challenges and strategies. Innovations in flipping the language classroom: Theories and practices. https://doi.org/10.1007/978-981-10-6968-0_2
Clark, R. M., Besterfield-Sacre, M., Budny, D., Clark, W. W., Norman, B. A., Parker, R. S., … Slaughter, W. S. (2016). Flipping engineering courses: A school wide initiative. Advances in Engineering Education, 5(3), 1–39.
Cummins, S., Beresford, A. R., & Rice, A. (2015). Investigating engagement with in-video quiz questions in a programming course. IEEE Transactions on Learning Technologies, 9(1), 57–66.
Dimitriadou, E., & Lanitis, A. (2023). A critical evaluation, challenges, and future perspectives of using artificial intelligence and emerging technologies in smart classrooms. Smart Learning Environments, 10(1), 12. https://doi.org/10.1186/s40561-023-00231-3
Edgcomb, A., Vahid, F., Lysecky, R., & Lysecky, S. (2017). Getting students to earnestly do reading, studying, and homework in an introductory programming class. In Proceedings of the 2017 ACM SIGCSE technical symposium on computer science education (pp. 171–176). https://doi.org/10.1145/3017680.3017732
Er-Rafyg, A., Zankadi, H., & Idrissi, A. (2024). AI in adaptive learning: Challenges and opportunities. Modern artificial intelligence and data science 2024: Tools, techniques and systems. https://doi.org/10.1007/978-3-031-65038-3_26

Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42(3), 255–284. https://doi.org/10.1080/15391523.2010.10782551
Farahani, M., & Ghasemi, G. (2024). Artificial intelligence and inequality: Challenges and opportunities. Int. J. Innov. Educ, 9, 78–99. https://doi.org/10.32388/7HWUZ2
Fernandes, C. W., Rafatirad, S., & Sayadi, H. (2023). Advancing personalized and adaptive learning experience in education with artificial intelligence. 32nd annual conference of the European association for education in electrical and information engineering. IEEE. https://doi.org/10.23919/EAEEIE55804.2023.10181336
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
Gligorea, I., Cioca, M., Oancea, R., Gorski, A. T., Gorski, H., & Tudorache, P. (2023). Adaptive learning using artificial intelligence in e-learning: A literature review. Education Sciences, 13(12), 1216. https://doi.org/10.3390/educsci13121216
Hayashi, Y., Fukamachi, K., & Komatsugawa, H. (2015). Collaborative learning in computer programming courses that adopted the flipped classroom. Proceedings of the IEEE international conference on advanced learning technologies (ICALT).
Heffernan, N. T., Williams, J. J., & Ostrow, K. S. (2016). The future of adaptive learning: Does the crowd hold the key? International Journal of Artificial Intelligence in Education, 26, 615–644. https://doi.org/10.1007/s40593-016-0094-z
Hendrik, H., & Hamzah, A. (2021). Flipped classroom in programming course: A systematic literature review. International Journal of Emerging Technologies in Learning (iJET), 16(2), 220–236. https://doi.org/10.3991/ijet.v16i02.15229
Hodges, L. C. (2020). Student engagement in active learning classes. Active learning in college science: The case for evidence-based practice, 27–41. https://doi.org/10.1007/978-3-030-33600-4_3
Holmes, W., Bialik, M., & Fadel, C. (2019). Artificial intelligence in education: Promises and implications for teaching and learning. Center for Curriculum Redesign.
Hsu, W. C., & Lin, H. C. K. (2016). Impact of applying WebGL technology to develop a web digital game-based learning system for computer programming course in flipped classroom. In 2016 international conference on educational innovation through technology (EITT) (pp. 64–69). IEEE. https://doi.org/10.1109/EITT.2016.20
Janson, A., Sӧllner, M., & Leimeister, J. M. (2020). Ladders for learning: Is scaffolding the key to teaching problem-solving in technology-mediated learning contexts? The Academy of Management Learning and Education, 19(4), 439–468. https://doi.org/10.5465/amle.2018.0078
Jin, S. H., Im, K., Yoo, M., Roll, I., & Seo, K. (2023). Supporting students' self-regulated learning in online learning using artificial intelligence applications. International Journal of Educational Technology in Higher Education, 20(1), 37. https://doi.org/10.1186/s41239-023-00406-5
Kasinathan, V., Mustapha, A., & Medi, I. (2017). Adaptive learning system for higher learning. In 2017 8th international conference on information technology (ICIT) (pp. 960–970). IEEE. https://doi.org/10.1109/ICITECH.2017.8079975
Kay, R. H., Macdonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education, 31(3), 449–471. https://doi.org/10.1007/s12528-018-9197-x
Kim, M. K., Kim, S. M., Khera, O., & Getman, J. (2014). The experience of three flipped classrooms in an urban university: An exploration of design principles. The Internet and Higher Education, 22, 37–50. https://doi.org/10.1016/j.iheduc.2014.04.003
Kumor, T., Uribe-Flórez, L., Trespalacios, J., & Yang, D. (2024). ALEKS in high school mathematics classrooms: Exploring teachers' perceptions and use of this tool. TechTrends, 1–14. https://doi.org/10.1016/j.heliyon.2021.e07345
Kwak, M., Jenkins, J., & Kim, J. (2023). Adaptive programming language learning system based on generative AI. Issues in Information Systems, 24(3). https://doi.org/10.48009/3_iis_2023_119
Lacher, L. L., & Lewis, M. C. (2015). The effectiveness of video quizzes in a flipped class. In Proceedings of the 46th ACM technical symposium on computer science education (pp. 224–228). https://doi.org/10.1145/2676723.2677302
Lacroix, P. (2019). Big data privacy and ethical challenges. In M. Househ, A. Kushniruk, & E. Borycki (Eds.), Big data, big challenges: A healthcare perspective. Lecture notes in bioengineering. Cham: Springer. https://doi.org/10.1007/978-3-030-06109-8_9
Lameras, P., Levy, P., Paraskakis, I., & Webber, S. (2012). Blended university teaching using virtual learning environments: Conceptions and approaches. Instructional Science, 40, 141–157. https://doi.org/10.1007/s11251-011-9170-9
Lang, C., Siemens, G., Wise, A., & Gasevic, D. (Eds.). (2017). Handbook of learning analytics (p. 23). New York: SOLAR, Society for Learning Analytics and Research. https://doi.org/10.18608/hla17
Lo, C. K., Hew, K. F., & Chen, G. (2017). Toward a set of design principles for mathematics flipped classrooms: A synthesis of research in mathematics education. Educational Research Review, 22, 50–73. https://doi.org/10.1016/j.edurev.2017.08.002
Lopez, S. (2022). Book review – Flip your classroom: Reach every student in every class every day by Jonathan Bergmann & Aaron Sams. Electronic Journal of Social and Strategic Studies, 3, 258–264. https://doi.org/10.47362/EJSSS.2022.3208
Maher, M. L., Latulipe, C., Lipford, H., & Rorrer, A. (2015). Flipped classroom strategies for CS education. In Proceedings of the 46th ACM technical symposium on computer science education (pp. 218–223). https://doi.org/10.1145/2676723.2677252
Mithun, S., & Evans, N. (2018). Impact of the flipped classroom on students' learning and retention in teaching programming. In 2018 ASEE annual conference & exposition. https://doi.org/10.18260/1-2–30608
Mok, H. N. (2014). Teaching tip: The flipped classroom. Journal of Information Systems Education, 25(1), 7–11.
Morais, P., Ferreira, M. J., & Veloso, B. (2021). Improving student engagement with project-based learning: A case study in software engineering. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, 16(1), 21–28. https://doi.org/10.1109/RITA.2021.3052677
O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education, 25, 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002
Odum, M., Meaney, K., & Knudson, D. V. (2021). Active learning classroom design and student engagement: An exploratory study. Journal of Learning Spaces, 10(1), 27–42.
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55(1), 68–78. https://doi.org/10.1037/0003-066X.55.1.68
Schlimbach, R., Rinn, H., Markgraf, D., & Robra-Bissantz, S. (2022). A literature review on pedagogical conversational agent adaptation. In Pacific Asia conference on information systems (Vol. 1).
Schunk, D. H., & Zimmerman, B. J. (Eds.). (2008). Motivation and self-regulated learning: Theory, research, and applications. Routledge.
Seo, K., Tang, J., Roll, I., Fels, S., & Yoon, D. (2021). The impact of artificial intelligence on learner–instructor interaction in online learning. International Journal of Educational Technology in Higher Education, 18(1), 1–23. https://doi.org/10.1186/s41239-021-00292-9
Singh, J., Evans, E., Reed, A., Karch, L., Qualey, K., Singh, L., & Wiersma, H. (2022). Online, hybrid, and face-to-face learning through the eyes of faculty, students, administrators, and instructional designers: Lessons learned and directions for the post-vaccine and post-pandemic/COVID-19 world. Journal of Educational Technology Systems, 50(3), 301–326. https://doi.org/10.1177/00472395211063754
Stevens, G. J., Bienz, T., Wali, N., Condie, J., & Schismenos, S. (2021). Online university education is the new normal: But is face-to-face better? Interactive Technology and Smart Education, 18(3), 278–297. https://doi.org/10.1108/ITSE-08-2020-0181
Strielkowski, W., Grebennikova, V., Lisovskiy, A., Rakhimova, G., & Vasileva, T. (2024). AI-driven adaptive learning for sustainable educational transformation. Sustainable Development. https://doi.org/10.1002/sd.3221
VanLehn, K. (2011). The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4), 197–221. https://doi.org/10.1080/00461520.2011.611369
Verma, S. (2019). Weapons of math destruction: How big data increases inequality and threatens democracy. Vikalpa, 44(2), 97–98. https://doi.org/10.1177/0256090919853933
Vihavainen, A., Luukkainen, M., & Kurhila, J. (2012). Multi-faceted support for MOOC in programming. Proceedings of the 13th annual conference on innovation and technology in computer science education. https://doi.org/10.1145/2380552.2380603
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (Vol. 86). Harvard University Press.
Wang, S., Christensen, C., Cui, W., Tong, R., Yarnall, L., Shear, L., & Feng, M. (2023). When adaptive learning is effective learning: Comparison of an adaptive learning system to teacher-led instruction. Interactive Learning Environments, 31(2), 793–803. https://doi.org/10.1080/10494820.2020.1808794
Warschauer, M. (2011). Learning in the cloud. Teachers College Press.
Xu, W., & Ouyang, F. (2022). The application of AI technologies in STEM education: A systematic review from 2011 to 2021. International Journal of STEM Education, 9(1), 59. https://doi.org/10.1186/s40594-022-00377-5
Yilmaz, R., & Yilmaz, F. G. K. (2023). The effect of generative artificial intelligence (AI)-based tool use on students' computational thinking skills, programming self-efficacy and motivation. Computers and Education: Artificial Intelligence, 4, Article 100147. https://doi.org/10.1016/j.caeai.2023.100147
Zha, S., Jin, Y., Moore, P., & Gaston, J. (2019). Hopscotch into coding: Introducing pre-service teachers computational thinking. TechTrends, 64, 17–28. https://doi.org/10.1007/s11528-019-00423-0

