Generative AI and Higher Education
www.ipp-jcs.org
© Institute of Public Policy Lisbon | September 2024 – All rights reserved
Reports
Reports by the Institute of Public Policy aim to support the public debate with comprehensive research and detailed empirical analysis.
The authors
Steffen Hoernig is a Professor of Economics at Nova School of Business and Economics. André Ilharco is a researcher at the Institute
of Public Policy. Paulo Trigo Pereira is a Professor at ISEG, University of Lisbon, and President of the Institute of Public Policy. Regina
Pereira is a researcher at the Institute of Public Policy.
Contents
1. Introduction
2. Educators: Teaching and Evaluation Methods
2.1. Findings from the Literature
2.2. Use Cases
2.3. Main Opportunities and Threats
3. Students: Learning and Autonomy
3.1. Findings from the Literature
3.2. Use Cases
3.3. Main Opportunities and Threats
4. Higher Education Institutions: Organizing and Planning
4.1. Findings from the Literature
4.2. Use Cases
4.3. Main Opportunities and Threats
5. How to Minimise Risks While Tapping the Potential of GenAI
5.1. Educators
5.2. Students
5.3. Higher Education Institutions
6. Conclusions and Recommendations
References
1. Introduction1
Generative AI (GenAI) became a worldwide sensation in late 2022, with the public
release of ChatGPT. Soon students, educators, teaching assistants, and university
administrators grasped the potential of instruments such as chatbots in higher education
(HE). While some foresaw the end of classical teaching via substitution by AI assistants,
others understood the transformative potential of GenAI in shaping and enhancing
education to everybody’s benefit.
GenAI can provide students with quick summaries of educational materials, help
teachers prepare their classes and grade their students' exams, as well as reorganise the
entire academic curriculum and teaching approach for future students based on an
analysis of previous and incoming students’ data. However, GenAI brings other
possibilities to the table. It does not simply curate information or summarise many sources
into one usable and easily accessible document for humans; it also generates new content.
This content may assume the form of human-like text, image, video, sound, software, or
anything the GenAI technology can collect data about. It is indeed the GenAI technology
of Large Language Models (LLMs) that is behind every AI chatbot such as OpenAI’s
ChatGPT, Google’s Gemini, or Microsoft’s Copilot. Other types of AI (“big data”) will also
play their part in developing higher education, namely due to their capabilities to analyse
large data sets and find patterns. For instance, an AI-powered analysis of student data,
such as their age, gender, or previous school grades, among other performance-related
data, can help universities better grasp the roots of old educational problems such as
dropouts, unequal opportunities arising from external factors, or outdated curricula.
If left to chance, these new possibilities may also lead to abuse. There is a double need
to prevent the misuse of GenAI and to use it to enhance students’, teachers’ and every HE
actor’s performance and integration. The European Union has already produced various
policy documents. The Living guidelines on the responsible use of generative AI in
research (European Commission, 2024)2 sets out guidelines for research, which can be extended to higher education more generally. Examples include the obligation for researchers
to mention the use of AI in their research (similar to in-class activities/evaluations carried
out by teachers using AI); avoiding the predominant use of AI in tasks that impact other
researchers (e.g. peer reviews) may be justified on the same basis as, for example, the obligation for teachers not to subject their students to 100% AI-managed evaluations.
1 This report is a work in progress and comments are welcome. We would like to thank Google for supporting this research, and ISEG (University of Lisbon) for supporting the Institute of Public Policy. The cover image was produced by ChatGPT.
2 https://research-and-innovation.ec.europa.eu/document/2b6cf7e5-36ac-41cb-aab5-0d32050143dc_en
The Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and
learning for Educators (European Commission, 2022)3 refers to four main domains where
AI and Data can be used in Education (student teaching; student supporting; teacher
supporting; and system supporting) and provides thirty-six Guidance Questions for
Educators regarding human agency and oversight; transparency; diversity, non-
discrimination and fairness; societal and environmental well-being; privacy and data
governance; technical robustness and safety; accountability.
Other institutions have produced policy documents (OECD 2023; UNESCO 2023, 2023a) addressing the impact of AI on education and research. As GenAI
develops, the number of papers published on the topic increases exponentially, and so
does the number of higher education institutions that produce either regulations on AI
use or guidelines for educators and students.
This report aims to give a brief overview of the impact of GenAI on higher education, namely in three main dimensions: educators (Section 2), students (Section 3) and higher education institutions (Section 4). Our focus is on the main opportunities and threats associated with GenAI, with a view to minimising the risks while taking full advantage of the opportunities.
We faced two difficulties in composing this report. One is that GenAI is a moving target. We know what it is today and what its available applications are (LLMs, image / sound / video generators, editing tools such as Grammarly, bibliography management tools such as SciSpace), but we do not yet know its future incarnations and uses. The other difficulty is the need to refer to a shared understanding of the mission and core values of Higher Education Institutions (HEIs). For example, plagiarism is certainly a threat because academic integrity is a shared core value. The identification of, and trade-offs between, threats and opportunities depend on our focus on certain core values.
The first difficulty is insurmountable. The only certainty we have is that the questions
we address in this paper will have different answers in a few years as GenAI develops and
new applications emerge. As for the second, we will mention a set of core values
throughout the paper, raise questions, and present our answers and recommendations in
Sections 5 and 6.
3 https://op.europa.eu/en/publication-detail/-/publication/d81a0d54-5348-11ed-92ed-01aa75ed71a1/language-en
The following questions are among the most relevant raised by the development of
GenAI:
It is beyond the scope of this report to answer the first question, which is a fundamental
existential question of higher education institutions not much addressed in the literature.
However, we will address all the other questions in this report.
2. Educators: Teaching and Evaluation Methods
“One recent study found that in the 20 occupations most exposed to AI language
modelling, there are 14 teaching subcategories including English language and literature,
foreign language and literature, and history teachers” (Felten et al., 2023).
At the level of opportunities for educators, one can identify three main domains: the
preparation of class materials, teaching attractiveness, and evaluation of students’ work,
where the latter faces new challenges due to students’ (mis)use of AI. Concerning the risks, the quality of AI system outputs is not guaranteed: outputs may not be in line with educational goals or may contain biases. Educators will be responsible for overseeing and avoiding these issues.
The rapid integration of GenAI into educational settings is reshaping curriculum design
and pedagogical approaches. In a recent study, Lee et al. (2024) formed a Community of
Practice (CoP) among the University of Adelaide’s staff and students to collectively discuss
issues surrounding AI. Half of the participants indicated they had modified their course
designs due to GenAI. A prevalent change involved assessments, with many participants
mentioning the use of varied assessment methods, the introduction of new assessment
questions, and tasks focused on the application of knowledge. Additional adjustments
included altering essay topics to emphasise fieldwork and using AI to create low-quality
drafts that students were then asked to improve.
Although most AI-enabled adaptive learning systems and frameworks are still in an
experimental phase, they have seen growing interest and use since the pandemic (Kabudi
et al., 2021), particularly to improve pedagogy. These systems can enhance teaching and
offer individualised assistance by accurately matching the specific learning goals and success criteria of the course or instructional unit, allowing educators to “tailor
educational content to the distinct needs, interests, and learning preferences of each
student, offering personalised learning materials and activities” (Labadze et al., 2023).
Concerning exam preparation and correction, GenAI has been found to transform educators’ workloads and capabilities in both tasks. Yang et al. (2021) reported that it can generate questions and create multiple-choice tests. Additionally, Lu et al. (2021) studied the use of natural language processing to build a system that automatically creates tests and found that AI technologies can generate highly reliable short-answer questions. Although not GenAI per se, AI-powered automatic assessment reduces grading time
(Crompton & Burke, 2023; Rutner & Scott, 2022). Nonetheless, GenAI's capabilities are
essential to provide detailed and personalised feedback on those assessments.
After the launch of ChatGPT in November 2022, both students and educators were struck by the quality of the chatbot’s text outputs. Many educators were also
immediately concerned with the need to weed out GenAI-driven plagiarism. The literature
on this issue stresses the fragility of the existing mechanisms to detect AI-generated text, arising from the difficulty of distinguishing the latter from human-generated text (Elkhatat
et al., 2023). Not only do AI-generated texts (such as those produced by ChatGPT) go
unnoticed by AI detection tools (Weber-Wulff et al., 2023), but studies have found that
these detection tools have too many false positives and false negatives (Dalalah & Dalalah,
2023), which may undermine HE core values such as trust and integrity.
From the educators’ perspective, GenAI can play a pivotal role in preparing courses
and establishing learning goals. GenAI tools, such as AI chatbots and intelligent tutoring systems (ITS), offer numerous
advantages that can significantly enhance the educational process. For instance, ITS can
assist educators by measuring students' comprehension, proposing suitable teaching
approaches and techniques, and offering assistance and direction to both students and
educators (Zawacki-Richter et al., 2019).
Well-designed GenAI systems can further aid various aspects of course preparation by
offering a wealth of resources, making learning materials easily accessible, and offering
expert guidance on challenging subjects (Ilieva et al., 2023; U.S. Department of
Education, 2023; UNESCO, 2023a). Additionally, they can assist with routine
administrative tasks associated with university courses (OECD, 2023). For example, if
correctly trained, AI chatbots could efficiently manage common student inquiries,
schedule office hours, and distribute course materials (Ilieva et al., 2023).
By managing these tasks, GenAI tools could allow educators to focus more on critical
aspects of teaching. Indeed, according to Ilieva et al. (2023), the most important decisions related to instructional design, assessment methods, and overall course management must be retained by the educator. Human supervision helps guarantee that the course content remains relevant and accurate. Educators can use insights from AI tools to make
informed decisions while maintaining their essential role in the educational process
(Felix, 2020).
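As an illustration of the routine-inquiry handling mentioned above, the sketch below grounds an LLM's answers in the course syllabus so that replies about deadlines, office hours, and materials stay consistent with what the educator has published. It is a minimal example assuming an OpenAI-style chat-completions client; the model name, the SYLLABUS text, and the answer_student_query helper are illustrative placeholders rather than parts of any specific product.

```python
# Minimal sketch of a syllabus-grounded course chatbot (illustrative only).
# Assumes the openai>=1.0 Python client and an API key in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical course information supplied by the educator.
SYLLABUS = """
Course: Introduction to Public Economics
Office hours: Tuesdays 14:00-16:00, room 2.17 (booking required)
Midterm exam: 4 November, covering weeks 1-6
Slides and problem sets: posted on the course Moodle page each Friday
"""

def answer_student_query(question: str) -> str:
    """Answer a routine administrative question using only the syllabus."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a course assistant. Answer only from the syllabus "
                    "below. If the answer is not in it, tell the student to "
                    "contact the educator.\n" + SYLLABUS
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer_student_query("When are office hours and do I need to book?"))
```

Constraining the answer to educator-approved text is one simple way to reduce the risk of hallucinated replies discussed later in this report.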
In the context of class preparation and study material, GenAI can be an asset. For
example, educators can use it to record classroom sessions and obtain metrics related to
student engagement (U.S. Department of Education, 2023). If students spend a significant
amount of class time talking to each other, the educator can analyse this behaviour to
understand its underlying causes: Is the content too difficult? Is it too boring? By
identifying the reasons, educators can develop targeted approaches to address these
specific issues, enhancing the overall effectiveness of their teaching.
In addition, GenAI tools can be used throughout the class itself. According to OECD
(2023), AI tools can be used to augment cognitive engagement, by supporting educators
in (i) generating multiple representations, (ii) adapting to more meaningful contexts and
(iii) providing numerous opportunities to practice.
Specifically, in (i), the educator can resort to text, graphics, or models to help students
make deeper connections and achieve a thorough understanding of the content. GenAI
can support this by providing diverse explanations, allowing students to evaluate and
contrast different viewpoints, which can enhance their comprehension. In the context of
(ii), educators can create a more adaptable bank of instructional resources, generating
them spontaneously to address the interests of distinct groups of students. This would
ensure that the learning materials are relevant and engaging to all students. Finally, in
(iii), through repetitive practice, students develop fluency and the ability to perform tasks
quickly and effectively by repeatedly engaging in the same computations and processes.
Educators can use GenAI to produce unlimited, customised practice exercises, offering
targeted practice opportunities that address the specific difficulties students might face in
each topic.
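To make point (iii) concrete, the following sketch asks an LLM to generate a small batch of practice exercises on a topic, at a difficulty level chosen by the educator. It assumes the same OpenAI-style client as above; the prompt wording, topic, and parameters are illustrative and would need to be adapted and reviewed by the educator before classroom use.

```python
# Illustrative sketch: generating customised practice exercises with an LLM.
from openai import OpenAI

client = OpenAI()

def generate_exercises(topic: str, difficulty: str, n: int = 5) -> str:
    """Ask the model for n practice exercises, with solutions, on a topic."""
    prompt = (
        f"Write {n} {difficulty}-level practice exercises on '{topic}' for "
        "undergraduate students. After each exercise, give a short model "
        "solution. Vary the contexts so the exercises are not repetitive."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example: targeted practice for a topic students found difficult.
if __name__ == "__main__":
    print(generate_exercises("price elasticity of demand", "intermediate"))
```

Because the output is generated text, the educator remains responsible for checking its correctness before distributing the exercises, in line with the oversight role discussed in Section 2.3.2.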
An adaptive tutoring system can also optimally reintroduce old content to refresh memory (Zawacki-Richter et al., 2019; OECD, 2023). Beyond supporting specific topics, it can guide students' learning trajectories, indicating when they are ready to advance.
An example of a tool designed to produce a feedback loop between teaching statistics and assessing students’ progress is Stat-Knowlab, as described by de Chiusole et
al. (2020). It adapts each learning experience to individual students' competencies and
creates optimal learning paths. Applications such as this will help educators ensure that
their students engage with educational activities suited to their current level of readiness.
Kurtz et al. (2024) emphasise the transformative potential of GenAI in the domain of
student assessment. According to the authors, educational institutions still rely on
traditional testing methods focusing on memorization and recall, neglecting the practical
application of knowledge and skills. Moreover, while the call for transformative
approaches in teaching and assessment is not solely a consequence of GenAI's emergence,
the latter underscores the urgent need for these educational reforms.
One approach is through "AI-proof" assignments that use best practice teaching
principles, such as breaking them into smaller tasks, designing authentic assignments with
real-world value, incorporating space for student reflection and metacognition, and
creating connections in content to experiences that AI lacks (e.g., recent events, classroom
discussions). For instance, “AI-proof” assignments that have been put into practice include generating sample texts or code and asking students to fact-check, critique, and improve them.
Integrating AI into evaluation methods themselves can take multiple forms. According
to Ouyang et al. (2022), various studies predating the launch of LLMs had already shown
the benefits of automated assessment in online higher education. For example, Hooshyar
et al. (2016) created an ITS Tic-tac-toe Quiz, designed to provide formative assessments
for students' programming and problem-solving skills. Their research found that this
system improved students' interest in learning, fostered positive attitudes, increased
technology acceptance, and enhanced problem-solving capabilities. Similarly, Aluthman
(2016) introduced an automated essay evaluation system that offers immediate feedback,
assessment, and scoring for students in an online English learning environment. The study
concluded that it significantly enhanced the writing performance of undergraduate
students.
GenAI tools can further revolutionise formative assessments by offering advanced
analysis and feedback capabilities (Celik et al., 2022). GenAI can evaluate student-created
graphs or models, provide immediate feedback on complex skills, and manage simpler
grading tasks, allowing educators to focus on more complex evaluations (U.S. Department
of Education, 2023).
2.3.1. Opportunities
Large class sizes make it challenging to assess students’ level of understanding in real time. This is not a question of evaluation, but of keeping and fostering students’ engagement with the content being taught. As mentioned in Section 2.2.3., GenAI may help educators by providing quizzes and diagnostic exercises, which can, in turn, provide more granular and precise reporting on what each student has learned or is learning during a class. GenAI’s monitoring reach and precision outperform what an individual educator can achieve (OECD, 2023). Building on this, GenAI can become an asset for measuring students’ level of understanding at two levels: the individual level and the class level.
At the individual level, educators can identify students who may be struggling with specific topics and intervene right away (Celik et al., 2022; Lim et al., 2023). For example, if a student persistently answers questions incorrectly, the AI tool can alert the educator, who can then provide additional support or adjust explanations to address the student's difficulties.
At the class level, GenAI can aggregate data to provide insights into overall class
performance (Celik et al., 2022; U.S. Department of Education, 2023). Educators can use
such insights to adjust their strategies, allocate more time to topics that students find
challenging, and identify trends in student understanding. This could present a great
opportunity to develop group activities and assessments, by pairing students who have
mastered specific topics with others who have not yet reached the same level of
understanding.
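A hedged sketch of how quiz results might be aggregated at the two levels just described: flagging individual students who persistently answer a topic incorrectly, and surfacing topics the class as a whole finds difficult. The data layout, names, and thresholds are invented for illustration; a real system would sit on top of the institution's learning-management data.

```python
# Illustrative aggregation of quiz results at the individual and class level.
from collections import defaultdict

# Hypothetical records: (student, topic, answered_correctly)
quiz_results = [
    ("ana", "elasticity", False), ("ana", "elasticity", False),
    ("ana", "taxation", True), ("bruno", "elasticity", True),
    ("bruno", "taxation", False), ("carla", "elasticity", False),
    ("carla", "taxation", True),
]

def success_rates(results):
    """Return correct-answer rates per (student, topic) and per topic."""
    per_student, per_topic = defaultdict(list), defaultdict(list)
    for student, topic, correct in results:
        per_student[(student, topic)].append(correct)
        per_topic[topic].append(correct)
    share = lambda xs: sum(xs) / len(xs)
    return ({k: share(v) for k, v in per_student.items()},
            {k: share(v) for k, v in per_topic.items()})

student_rates, topic_rates = success_rates(quiz_results)

# Individual level: alert the educator about struggling students (threshold is arbitrary).
for (student, topic), rate in student_rates.items():
    if rate < 0.5:
        print(f"Alert: {student} is struggling with {topic} ({rate:.0%} correct)")

# Class level: topics to revisit in the next session.
for topic, rate in topic_rates.items():
    if rate < 0.6:
        print(f"Class-level: consider revisiting {topic} ({rate:.0%} correct)")
```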
2.3.1.2. Teaching materials better suited to students' abilities and educational levels
Students differ not only in their knowledge level, areas of interest or grades: they also differ in the educational approach under which they learn best. While some require more direct attention, others work best as self-learners. While some prefer to follow the standard order of the contents, others find it better to have them reorganised. These differences are, of course, not the cause of all academic failure, but it would be naïve to think that they do not exist, or that a one-size-fits-all approach to teaching is best, in which all students, regardless of their capabilities and motivation, must learn each piece of content at the same time, at the same pace, and with the same activities.
In addition, GenAI may also support educators’ judgement of each student’s development by providing learning analytics. Data gathered in real time on each student's performance can inform the educator about what the student should study next and which set of exercises will help them achieve the expected learning goals. Each time this analysis is performed, the educator gains more insights and proposals on how to plan next year’s curriculum.
GenAI can train and test students in unscripted and highly realistic situations. Take
the case of finance or economics. With GenAI, an educator could simulate a life-like real-
time market situation where students could interact with each other and the GenAI
mechanism in more ways than a classical game situation where all participants have a
limited set of options. In medical studies, GenAI technology could reproduce a real-time
simulation of a human liver (possibly complemented by virtual reality) that reacts with a
high level of accuracy to the medical student’s actions. In both cases, GenAI would provide
an unprecedented level of interactivity.
This capability allows educators to make their teaching more effective and responsive
to the individual needs of students. For instance, GenAI can convert complex texts into
simpler versions for students with different comprehension levels, or create multilingual
glossaries to support language learners. Moreover, in the future, it is also expected that
more specific adaptations of educational materials for various disabilities will be created,
covering a broader number of physical, sensory, cognitive, or learning disabilities (Pierrès
et al., 2024).
2.3.1.5. Less time spent on administrative tasks
“Schools and teachers can use software to perform many repetitive and time-
consuming tasks such as timetabling, attendance control, and enrolment. Automating
such tasks can allow teachers to spend less time on routine tasks and more time with their
students [or preparing class materials or other educational approaches]” (European
Commission, 2022).
2.3.2. Threats
The development and implementation of GenAI tools must ensure that they enhance
learning while not inadvertently harming students (Felix, 2020). If these tools are not
used correctly or exacerbate existing issues in education, their use could significantly
undermine trust in educators, both by students and their parents. Over-reliance on AI may
even lead students to prefer AI tools to their educators or peers, especially if the application seems to be more competent and to provide more immediate feedback than the educator or their colleagues (Felix, 2020).
This potential shift in trust could have several implications. Strong relationships
between educators and students are crucial for effective learning. If students start to rely
more on AI tools, the personal connection that educators build with their students will
weaken, leading to a less engaging and supportive learning environment. Additionally,
over-reliance on AI reduces opportunities for students to develop critical thinking and
interpersonal skills (Abbas et al., 2024). Educators not only impart knowledge but also
foster discussions, debates, and interactions that are essential for holistic development.
AI tools, while efficient, may not be able to replicate the nuanced guidance and
encouragement that a human educator provides.
Karl Popper observed long ago that “every solution of a problem raises new unsolved
problems” (Popper, 1963). While AI may solve many current problems in higher education, it has also created several new ones, and hard-to-detect plagiarism is certainly one of them. If AI serves both to detect and to avoid detection of AI authorship at the same time and with equal effectiveness, it will be a threat to higher education. It may not just
be a threat, but a transformative condition. As stated in the literature, AI tools and
mechanisms to detect AI-generated text are far from fulfilling their mission, as there are many ways to avoid detection. Ibrahim et al. (2023) report that it takes little to mislead ChatGPT-based plagiarism detection tools; a few typos or extra full stops can do the trick. The studies mentioned above (Dalalah & Dalalah, 2023; Elkhatat et al.,
2023; Weber-Wulff et al., 2023) strengthen the idea that AI will not be enough to deter
its use for plagiarism. Certainly, HE educators will have to adapt to this new condition.
Until this happens, AI must be considered a threat to academic integrity and the
achievement of the educational goals of HE in learning and research.
The ethical dimension of GenAI use in the evaluation and assessment of students’
learning and achievement is especially important. AI systems’ tendency towards hallucinations and biases is well known and justifies requiring the educator to remain “in the loop” whenever an AI tool participates in student evaluations. Additionally, GenAI systems
may overlook and disregard other human factors that matter in the assessment of a given
student, namely the learning process, the student’s effort, and their participation in course
work and classes. In other words, it seems clear that AI grading overlooks one educational
output that has been crucial to HE for a long time, which is the student's ability to “learn
to learn.” This does not mean that educators must therefore be forbidden from making reasonable use of any form of automatic evaluation. Rather, they will have to have the final word regarding the overall evaluation. We return to this issue in Chapter 5.
As mentioned in the Introduction, recent research by Felten et al. (2023) explores how
LLMs such as ChatGPT will impact various occupations, identifying post-secondary
educators, such as those in English and foreign languages and literature, and history,
among the most affected. These teaching roles are among the top 20 occupations exposed
to AI-enabled advances in language modelling capabilities.
Indeed, attention should be given to the fact that GenAI has the potential to automate certain teaching positions, particularly those involving repetitive or less specialised tasks. Maintaining the role of human educators is crucial due to concerns about the quality of education. Human educators bring empathy, creativity, and a deep understanding of student needs, qualities that are difficult for AI to replicate (Chan
& Tsi, 2023). Reducing contact with human educators could result in a more transactional
form of education, diminishing the relational aspects that foster student engagement,
critical thinking, and interpersonal skills.
One group that may be at particular risk is teaching assistants (TAs), who play a crucial
role by aiding educators in teaching, grading, and administrative tasks in close contact
with students. TA positions also serve as a significant source of financial aid, as well as
valuable teaching and research experience, for graduate students pursuing advanced
degrees. With GenAI systems' increasing capabilities to manage tasks typically carried out
by TAs, there is a concern that these positions may become scarce. The replacement of
TAs with GenAI has the potential to reduce the number of graduate students who can
support their education through TA roles (Pence, 2019). This shift could have broader
effects on the academic community, potentially shrinking the group of prospective
educators and researchers.
According to UNESCO (2023), “Artificial Neural Networks are usually black boxes;
that is, […] their inner workings are not open to inspection. As a result, ANNs are not
transparent or explainable, and it is not possible to ascertain how their outputs were
determined.” This lack of transparency makes it challenging to detect biases in GenAI
models. Furthermore, UNESCO (2023) emphasises that “If users do not understand how
a GenAI system arrived at a specific output, they are less likely to be willing to adopt it or
use it ….”
Two concerns reflected in policy documents regarding GenAI for higher education are
transparency and explainability. These concerns are especially important if one considers
the educators’ suggested role of oversight of AI systems used for classes. How can one
oversee a “black box”? And what do educators need to know regarding GenAI to act upon
these systems? For example, one may think of cases where biases are detected in the
middle of a course with GenAI incorporated in its educational approach. How will
educators be able to interact with or correct the GenAI system? Skills for working with AI systems will have to enter the European Framework for the Digital Competence of Educators4 of 2017, which is described as a “general reference framework to support the development of educator-specific digital competences in Europe”. As with the other threats mentioned above, the role of educators will have to be rethought, both in terms of desired educational goals and of the capabilities educators need.
3. Students: Learning and Autonomy
Since the release of ChatGPT, students have leveraged the capacity of GenAI to
produce well-written text, often to bypass the effort of writing essays and assignments by
themselves (Beck et al., 2023). While this might save time, it defeats the purpose of the
evaluation exercise, which is to make students practice their writing skills and to assess whether they have developed their command of the relevant subject matter. Too often, grading
has focused on the quality of the text rather than the depth of understanding.
4 https://joint-research-centre.ec.europa.eu/digcompedu_en
Nevertheless, students’ use of GenAI capabilities should not be viewed only in negative
terms. Using GenAI to produce ideas, structure and summarise arguments, and produce
better and more imaginative text can be a vital skill set. Thus the challenge lies in finding
the right trade-off and persuading students that relying solely on AI to complete their
work short-changes their learning experience. Students must understand that while AI
can assist in the learning process, the ultimate goal is to develop their analytical and
critical thinking skills.
The way students perceive their learning environment, their self-assessed abilities, and
the instructional methods employed are important in shaping their learning strategies
(Biggs, 1999). These perceptions can subsequently affect their educational outcomes
(Chan & Hu, 2023): When students view their learning environment positively – covering
aspects such as curriculum content, instructional methods, assessment techniques,
learning resources, and support services – and feel confident in their abilities, they are
more likely to engage in a deep learning approach. This approach involves striving for
understanding and linking concepts. Conversely, students who perceive their learning
environment negatively or lack confidence in their abilities tend to adopt a surface
learning approach, which is characterised by memorization and meeting basic
requirements (Biggs, 2011).
tools benefit the learning process. If students perceive GenAI as a valuable resource and
feel confident in their ability to use it, they are likely to engage deeply with the
technology, enhancing their learning experience (Kurtz et al., 2024). However, if they
have concerns about access or unclear usage policies, they may adopt a more superficial
or improper approach to using GenAI. This can limit its potential benefits in their
education and lead to other consequences, such as reduced skill development due to
overreliance (Abbas et al., 2024), academic integrity issues (Sullivan et al., 2023; Cotton
& Cotton, 2023; Tlili et al., 2023; Memarian & Doleck, 2023a), inequality in access
(UNESCO, 2023a), and overdependence on technology.
Despite these concerns, the literature shows that GenAI tools can significantly enhance
educational experiences across various domains. For instance, these applications can
potentially deepen students' knowledge and comprehension by providing tailored
educational experiences and helping them develop their writing skills (Beck et al., 2023;
Kaharuddin, 2021).
Moreover, as discussed in the previous Section, these tools adapt to individual learning
styles, making educational progress more personalised and effective. They provide
continuous, real-time feedback, which is crucial for performance improvement.
Furthermore, GenAI promotes active participation in the learning process, improves the
capacity to assess information critically, and encourages inquisitive learning,
strengthening critical thinking skills (OECD, 2023).
2018), which promotes autonomous learning and enhances the learning process and
outcomes for students.
Kim and Bennekin (2016), as reported in Crompton & Burke (2023), conducted a
study on Alex, an AI assistant used in a college mathematics course. Alex interacted with
students by asking diagnostic questions and providing support tailored to student needs.
This support was organised into four stages: goal initiation (“Want it”), goal formation
(“Plan for it”), action control (“Do it”), and emotion control (“Finish it”). Alex offered
help based on the specific needs of students in different subject areas, promoting
perseverance in their academic pursuits and enhancing their academic results. This study
highlighted the role of AI in ensuring timely support and adapting to students’ academic
abilities, preferences, and optimal support strategies.
GenAI tools for course assistance can also revolutionise the way students organise
course materials and prepare for their classes. They can organize lecture notes, schedule
study sessions, and provide reminders for assignments and exams (Sajja et al., 2023).
Given that these tools analyse students' learning patterns, they can also offer tailored
recommendations to optimise study habits. For example, platforms like Mindgrasp
provide comprehensive GenAI-driven solutions that curate relevant study materials,
highlight key concepts, and answer challenging questions in real time.
AI course assistants also facilitate communication and information retrieval, allowing
students to ask questions related to the syllabus, such as exam dates, upcoming class
materials, homework assignments, attendance, and grade and course expectations (Sajja
et al., 2023). For instance, tools like Quizlet’s Q-Chat and Course Hero’s AI assistant offer
interactive features that enable students to receive instant, accurate responses to their
queries, enhancing their preparedness and engagement with the course content.
GenAI-driven course assistance tools are also valuable for homework and exam
preparation (Labadze et al., 2023). These tools can generate summaries of lecture notes,
making it easier for students to quickly review key concepts. In addition, these tools can highlight the parts of a lesson that students did not understand well and offer extra materials and explanations, helping students grasp difficult topics better.
Flashcard creation is another significant feature of GenAI tools. By converting lecture
notes and textbooks into flashcards, these tools facilitate active recall, which is a proven
technique for enhancing memory and learning efficiency (Roediger & Butler, 2011).
Moreover, the interactive and personalised feedback offered by these tools can also be
very useful for homework preparation, helping students improve their practice problems
and/or assignments.
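As an illustration of the flashcard feature described above, the sketch below turns a snippet of lecture notes into question-and-answer flashcards using an LLM and parses them into a simple list for an active-recall drill. The note text, prompt, and JSON format are assumptions for the example, not features of any particular tool.

```python
# Illustrative sketch: converting lecture notes into active-recall flashcards.
import json
from openai import OpenAI

client = OpenAI()

LECTURE_NOTES = """
Active recall is the practice of retrieving information from memory.
Spaced repetition schedules reviews at increasing intervals.
Both techniques improve long-term retention compared with rereading notes.
"""

def make_flashcards(notes: str, n: int = 3) -> list[dict]:
    """Ask the model for n flashcards and return them as a list of dicts."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{
            "role": "user",
            "content": (
                f"Create {n} flashcards from the notes below. Reply only with "
                'a JSON array of objects with "question" and "answer" keys.\n'
                + notes
            ),
        }],
    )
    # Note: a robust version should validate the reply, since the model may
    # occasionally add extra text around the JSON.
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    for card in make_flashcards(LECTURE_NOTES):
        print("Q:", card["question"])
        print("A:", card["answer"], "\n")
```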
Regarding exam preparation, students can use GenAI tools to quiz themselves,
reinforcing their knowledge (Choi et al., 2023). In addition, GenAI tools can predict
potential exam questions based on the course content. This feature could help students
focus their study efforts on the most relevant material. For instance, AI algorithms can
analyse past exams and current course materials to generate exam questions, providing
students with a targeted and efficient study guide.
GenAI tools can aid in developing various skills essential for academic and professional
success (Labadze et al., 2023). According to the authors, these are (i) enhanced writing
skills, (ii) problem-solving abilities, (iii) group collaboration, and (iv) critical thinking and
analytical skills.
In terms of (i), GenAI tools offer syntactic and grammatical corrections, stylistic
suggestions, and vocabulary enhancements. This helps students produce clearer writing
(Kaharuddin, 2021). For instance, tools like Grammarly provide real-time feedback on
grammar and style, allowing students to learn and improve their writing proficiency over
time.
GenAI tools can also contribute to developing critical thinking and analytical skills
(Kasneci et al., 2023). By analysing large datasets and generating insights, AI can assist
students in conducting thorough research and developing well-supported arguments. This
exposure to advanced data analysis techniques prepares students for future academic and
professional challenges, enhancing their overall intellectual capability.
3.3.1. Opportunities
3.3.1.1. Advanced study support: tutoring sessions, autonomous exercise creation, and mock oral
examinations
As GenAI continues to evolve, its ability to provide precise and insightful feedback will
improve. By analysing larger and more complex datasets, AI can detect subtle patterns in
student performance, offering highly targeted recommendations for improvement. Areas
that can significantly benefit from these advancements are (i) personalised tutoring
sessions, (ii) autonomous creation of exercises, and (iii) mock oral examinations.
Concerning (ii), it can be expected that, in the future, GenAI tools will autonomously create even more personalised and varied exercises, integrating multimedia elements to meet
different learning preferences. Advanced data analytics might also allow AI to predict
learning trajectories and proactively adjust study plans to optimise long-term retention
and understanding.
In terms of (iii), GenAI can simulate real exam scenarios, ask questions, evaluate
responses, and offer feedback. Enhanced feedback mechanisms, including detailed
analysis of verbal and non-verbal communication skills, could help students refine their
presentation and public speaking abilities. Furthermore, augmented or virtual reality tools could allow the creation of immersive, lifelike exam environments, offering an even
more comprehensive preparation experience. These technologies could be especially
useful for preparing presentations and thesis defences.
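To illustrate (iii), the sketch below runs a short, text-based mock oral examination loop in which the model plays the examiner, asks follow-up questions, and gives feedback at the end. It again assumes an OpenAI-style client; the topic, number of turns, and prompts are illustrative, and a real deployment would add speech interfaces and the richer feedback described above.

```python
# Illustrative mock oral examination loop (text-based, examiner played by an LLM).
from openai import OpenAI

client = OpenAI()

TOPIC = "fiscal policy"  # illustrative exam topic
messages = [{
    "role": "system",
    "content": (
        f"You are an examiner conducting a short oral exam on {TOPIC}. "
        "Ask one question at a time, follow up on weak answers, and after "
        "the final answer give concise feedback with strengths and gaps."
    ),
}]

def examiner_turn() -> str:
    """Get the examiner's next message and keep it in the conversation history."""
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=messages,
    ).choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    for _ in range(3):  # three question-answer rounds
        print("Examiner:", examiner_turn())
        answer = input("Your answer: ")
        messages.append({"role": "user", "content": answer})
    messages.append({"role": "user", "content": "Please give your final feedback now."})
    print("Examiner:", examiner_turn())
```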
Specifically, the advancement of GenAI systems will allow the provision of precise and
insightful feedback, helping students to identify their strengths and weaknesses with
higher accuracy. This will empower them to take active steps in their learning path,
adjusting their strategies and efforts to achieve better outcomes. Moreover, students will
be able to identify their knowledge and readiness more accurately for exams or
assignments, allowing them to be better prepared and to develop deeper knowledge and
competencies.
3.3.1.3. Enhancing research skills
As students become more familiar with these tools and express positive perceptions about their potential to revolutionise academic research and generate new insights (Al-Zahrani, 2023), they will use them for more purposes than those described earlier in this Section. For literature reviews, a tool such as SciSpace, which applies GenAI to a vast set of papers published in peer-reviewed journals, is of great advantage. It allows both professional and young researchers, such as master’s and PhD students, to efficiently select relevant literature, process large datasets, identify patterns, and draw comparisons that would be too time-consuming to perform manually.
Consequently, students can focus more on interpreting results and deriving meaningful
conclusions, enhancing their studies' depth and precision.
Furthermore, GenAI can automate the collection and synthesis of information from
numerous sources for compilatory studies, enabling students to quickly compile
comprehensive literature reviews and meta-analyses, identifying key findings and gaps in
existing research, saving time, and improving the quality of their reviews (UNESCO,
2023). GenAI’s predictive capabilities can guide students toward emerging research
trends, helping them align their studies with future developments in their field (Al-
Zahrani, 2023).
3.3.1.4. Boost to asynchronous education and student self-motivation
The above-mentioned features of GenAI allow for asynchronous learning, avoiding the
one-size-fits-all trap and making learning more engaging and effective. These
technologies present a significant opportunity in education: increasing students’ self-
motivation (Memarian & Doleck, 2023).
In this context, Memarian and Doleck (2023) stated that “such effects [in student self-
motivation] need to be studied in the long term and efforts need to be made to study
student motivation changes based on their learning and demographic backgrounds”.
From a different perspective, another opportunity arising from the deployment of GenAI is its potential to bridge gaps for students worldwide, allowing a democratisation of knowledge in which every student has access, independently of their location or economic status.
3.3.2. Threats
In fact, if students become too dependent on GenAI for answers, they will struggle to develop their own problem-solving and idea-generation skills. The convenience of having an AI system readily provide information and solve problems could lead to a passive learning attitude, where students might accept AI-generated solutions without questioning them, especially under time pressure. This dependency can result in a decline in original thinking and in the ability to generate unique ideas independently (Kurtz et al., 2024; Abbas et al., 2024).
5 We conducted a test with ChatGPT. Initially, we asked in English for a syllabus for a public economics course, and it produced a well-designed syllabus based on existing and reputable bibliography. Subsequently, we asked the same question in Portuguese, specifying that we needed Portuguese bibliography. ChatGPT produced a similar syllabus in Portuguese, but all five references were invented. One of the authors of this report discovered that he had a "new" book attributed to him that he had never written.
We described above how ITSs can provide feedback on learning progress and nudge
students towards better learning outcomes. If these are not implemented with
moderation, students could become quickly overwhelmed by the amount of feedback they
receive in multiple courses simultaneously, possibly even from different systems and with
multiple deadlines to respond to. This can lead to rejection and abandonment, and in the worst case permanently harm students' confidence in their own abilities.
Students may also become dependent on the constant stimuli provided in gamified
learning environments, neglecting other important activities not covered by the system6
and becoming incapable of being autonomous when left on their own. Students need to
learn how to set their priorities and allocate their time accordingly, without tutoring
systems telling them what to do and when to do it.
While GenAI creates opportunities for group discussions and debates, by offering
structured discussion frameworks and real-time feedback (Labadze et al., 2023), it can
also lead to social isolation. As GenAI systems provide tailored content and instant
feedback, students might find themselves spending more time interacting with machines
rather than engaging with peers and educators (Felix, 2020).
This shift could result in reduced opportunities for collaborative learning, which is a central reason for attending an HEI and which is vital for developing empathy, communication, and teamwork skills. The lack of social interaction can hinder students'
ability to work effectively in group settings and diminish their overall social competence
(Marrone et al., 2022).
3.3.2.4. Hallucinations and reinforcement of biases
There are other potential dangers associated with GenAI in this area: GenAI systems
can make errors (“hallucinations”), which are due to their nature as statistical prediction
machines – they are not databases of “facts”.
Both hallucinations and biases are particularly problematic in the educational context,
where students should rely on accurate information to learn and develop their
understanding. This issue creates a critical need for students to be taught how to double-
check and verify the information provided by GenAI systems. In Chapter 5, mechanisms
and recommendations to address this issue will be provided.
4. Higher Education Institutions: Organizing and Planning
One driving force behind the adoption of GenAI by HEIs is the goal of democratising higher education and accommodating the international student market, which creates pressures due to the large number of enrolled students (Novoselova, 2023; Popenici & Kerr, 2017). These pressures drive the need for efficient tools that help manage large student populations and improve
educational outcomes. Moreover, as the usage of GenAI technologies spreads across the
various levels of education – teaching, learning and even research –, HEIs are obliged to
rethink their role, governance, and management processes.
From the student administration perspective, these technologies offer substantial value
by enhancing efficiency and providing comprehensive support throughout students'
academic path, from critical phases such as admissions and financial aid to continuous
assistance in student services and proactive measures aimed at reducing dropout rates
(UNESCO, 2023a). For admissions and financial aid, chatbots can be used to clarify
students’ questions and help them get the information they need in a much faster and
personalised way. Furthermore, AI assistants are also being used to offer ongoing student support in various modalities: tutoring, organisational help, social and campus
life, and career guidance. Finally, sophisticated data analysis of academic records and
performance generates early warning reports, identifying at-risk students and enabling
timely interventions to prevent dropouts (Ge & Hu, 2020; Crompton & Burke, 2023; Nagy
& Molontay, 2023).
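As a hedged illustration of such early-warning analysis, the sketch below fits a simple logistic-regression classifier on a handful of invented student records (attendance, average grade, credits completed) and flags enrolled students whose predicted dropout risk exceeds a threshold. Real systems use far richer data and must respect the data-protection concerns noted below; the features, numbers, and threshold here are purely illustrative.

```python
# Illustrative early-warning sketch: flagging students at risk of dropping out.
from sklearn.linear_model import LogisticRegression

# Invented historical records: [attendance_rate, average_grade, credits_completed]
X_train = [
    [0.95, 15.2, 60], [0.40, 9.8, 18], [0.80, 12.5, 48],
    [0.30, 8.1, 12], [0.90, 14.0, 54], [0.55, 10.2, 30],
]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = dropped out, 0 = graduated or continued

model = LogisticRegression().fit(X_train, y_train)

# Currently enrolled students (hypothetical identifiers and data).
current = {"student_17": [0.45, 9.5, 21], "student_42": [0.88, 13.1, 51]}

for student_id, features in current.items():
    risk = model.predict_proba([features])[0][1]  # estimated probability of dropout
    if risk > 0.5:  # arbitrary threshold for triggering an intervention
        print(f"{student_id}: high dropout risk ({risk:.0%}) - contact student support")
    else:
        print(f"{student_id}: low dropout risk ({risk:.0%})")
```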
Apart from these applications, GenAI tools in student administration can leverage
additional data to enhance the student experience further. According to Ge and Hu
(2020), universities can gather online consumption data, credit payment information, and
campus card usage to improve administrative decisions. This data can be used to optimise
dormitory assignments and roommate matching, enhancing communication and creating
a positive learning and living environment. Evidently, care should be taken due to the
highly personal nature of this data.
Over the past few decades, HEIs have increasingly implemented performance
management practices, with more data being collected and more sophisticated indicators
being computed, drawing on big data techniques. Moreover, the amount and type of data collected have been increasing, and the purposes for which the data is used are constantly changing, reflecting the shift of priorities over the years (Beerkens, 2021). Specifically,
performance indicators are used at three levels: to reflect on the performance of the
system, on that of individual universities, and on that of sub-units within organisations.
In this sense, GenAI can play a significant role by enhancing the capacity to collect,
analyse, and use data for performance management. GenAI tools can automate the data
collection process, identify patterns and trends, and present the output in a form that is
adequate for its users and their needs. For instance, GenAI can assist in planning and
evaluating degree programs by predicting future trends in student enrolment and industry
needs. Additionally, it can evaluate the effectiveness of current programs by analysing
student performance data, graduation rates, and employment outcomes, so that
institutions can adapt their programs accordingly.
Finally, GenAI can assist in drafting degree programs and designing courses, including
setting learning goals and conducting consistency checks. It can also enhance the
complementarity between courses by analysing all existing syllabi and constructing new
ones based on this information.
The process of student recruitment includes many steps, two of the most important being (i) thoroughly answering students’ queries regarding the degree programs and the university, and (ii) advertising, either through social media platforms or through direct and personalised communication to prospective students.
In the case of (i), the most common GenAI tools to answer students’ questions are chatbots (UNESCO, 2023a; Labadze et al., 2023), as they are designed to mimic human
conversation using text to provide information in a conversational manner. In fact,
according to Ilieva et al. (2023), “intelligent chatbots can execute a wide range of business
functions, including sales and marketing, personal assistance, and information retrieval.”
This makes chatbots particularly attractive for universities aiming to enhance their
recruitment processes, as they can provide instant responses to frequent questions, guide
prospective students through application procedures, and even offer personalised
program recommendations based on student interests.
For (ii), advertisement campaigns can be designed and optimised for specific student
groups by using GenAI tools to create social media content and commercial scripts with
higher click-through rates, as well as develop personalised communication (emails and
messages). The data used for those communication purposes can be of various forms:
demographic and geographic data, behavioural data (e.g., website visits, interaction with
previous marketing content and with social media platforms), previous communications
(e.g., email inquiries and chat conversations), survey responses, and interactions from
social media platforms.
This approach not only reduces marketing costs but also increases engagement. For
example, Element451 uses AI to enhance student engagement through personalised
communication, analysing data to identify prospective students and tailor recruitment
messages. Additionally, other tools facilitate peer-to-peer engagement, allowing potential
applicants to ask questions and gain insights into the student experience.
An innovative tool that potentially enhances recruitment processes is the E360 Tailor’s
University Virtual Campus Tour. By integrating GenAI, it analyses campus data, prioritises
elements, selects relevant locations, and crafts tour content accordingly. This creates a
multi-stop campus tour tailored to the candidate’s interests and needs, providing an
engaging virtual experience for each prospective student.
With time, similar innovations are expected to transform student recruitment even
further, offering increasingly personalised and immersive experiences tailored to the
specific interests and needs of prospective students, and making the process more
engaging and effective.
Even before the breakthrough in GenAI, universities were already evolving their
processes to provide more instant and intuitive assistance to students. In 2016, researchers at Georgia Tech, building on IBM's Watson platform, developed Jill Watson, an AI-powered virtual teaching assistant designed to support individual courses by answering student questions like a human teaching assistant, with ChatGPT incorporated in 2022. Jill Watson answers basic student course questions and
helps students with course content, such as explaining topics and assisting with
assignments.
For instance, the University of Michigan developed two AI tools: a general AI assistant
(U-M GPT) and a personalised AI tool (U-M Maizey). The first consists of a versatile
assistant available to all campus members, capable of answering any academic or
university-specific questions, summarizing general information, and producing written
work. Users can choose between AI models like GPT-3.5, GPT-4, and Llama 2, and
students can access it at no cost, generating up to 25 prompts per hour without using
user-specific data for training to protect privacy. U-M Maizey, on the other hand, connects
to personal accounts on Google and Canvas to provide customised answers and insights.
It allows students to upload unstructured texts, adjust the AI tool's "temperature" for
output sensitivity, and use or share their customised tools. While U-M Maizey was free
through 2023, it required a monthly fee afterward.
It is expected that AI student assistant tools will move from chatbots to omnipresent
tools that provide multimodal support, available anytime and on any kind of device. These
advanced tools may even incorporate voice functions to make student interaction easier
and more intuitive.
Various technological solutions have already been developed to assist educators. For
instance, researchers from the University of Surrey developed KEATH.AI, an AI system
providing rapid and accurate essay assessments with an 80% baseline accuracy. It allows
educators to modify scores, customise grading rubrics, and generate detailed analysis
reports on student performance. Similarly, GRAIDE, created by PhD students at the
University of Birmingham, enhances grading by learning an assessor's style. It accepts
written and digital submissions, reducing grading times by 87% and offering seven times
more feedback.
4.2.5. AI tools for the back office: profiling, prediction, and dropout reduction
According to Pierrès et al. (2024), an initial stage of such analysis involves predicting student performance before admission based on their CVs, while a subsequent stage focuses on
identifying students at risk of dropping out from courses, study programs, or the
institution, as well as determining their learning profiles. These types of analysis involve
gathering information, such as knowledge levels (or knowledge gaps), skills and strengths
(Nagy & Molontay, 2023), learning styles or preferences, interests (Ouyang et al., 2022),
grades, and class attendance.
This information can then be used to create metrics and real-time dashboards, as well
as to provide feedback and guidance on content-related issues throughout the learning
process, enhancing student retention and success (Zawacki-Richter et al., 2019).
4.3.1. Opportunities
Currently, HEIs can already benefit from AI platforms that analyse data sets (including
educator availability, course requirements, student enrolments, and classroom resources)
to minimise conflicts and optimise resource use. They dynamically adjust to real-time
changes, providing flexibility and responsiveness that traditional methods lack.
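A minimal sketch of the kind of conflict-avoidance logic such scheduling platforms build on: a greedy assignment that places each course in the first time slot and room where neither the educator nor the room is already booked. Production timetabling uses far more sophisticated optimisation (and, increasingly, AI-based prediction of demand); the course, slot, and room data here are invented for illustration.

```python
# Illustrative greedy timetabling sketch: avoid educator and room conflicts.

courses = [  # (course, educator, expected_enrolment) - invented data
    ("Microeconomics I", "Prof. Silva", 120),
    ("Econometrics", "Prof. Costa", 80),
    ("Public Economics", "Prof. Silva", 60),
]
slots = ["Mon 09:00", "Mon 11:00", "Tue 09:00"]
rooms = {"Aud. 1": 150, "Room 2.14": 90}  # room -> seating capacity

booked = set()   # (slot, educator) and (slot, room) pairs already taken
timetable = []

for course, educator, enrolment in courses:
    placed = False
    for slot in slots:
        if (slot, educator) in booked:
            continue  # educator already teaches in this slot
        for room, capacity in rooms.items():
            if capacity >= enrolment and (slot, room) not in booked:
                booked.update({(slot, educator), (slot, room)})
                timetable.append((course, slot, room))
                placed = True
                break
        if placed:
            break
    if not placed:
        print(f"Could not place {course}: add slots or rooms")

for course, slot, room in timetable:
    print(f"{course}: {slot}, {room}")
```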
Still, GenAI can further enhance these benefits, as future advancements in such tools
could focus on developing more intuitive and integrated platforms that seamlessly
combine various administrative functions. Additional enhancements could include
advanced predictive analytics to anticipate scheduling conflicts before they arise and
machine learning algorithms that continually improve scheduling efficiency based on
historical data. Moreover, incorporating user-friendly interfaces and real-time
collaboration tools can facilitate better communication and coordination among
departments, reducing the manual workload and enhancing overall institutional
productivity.
Another significant opportunity arising from the deployment of GenAI is the potential
to break down remaining cultural, ethnic and disability barriers, which would foster a
more inclusive and accessible learning and living environment. According to Pence
(2019), AI agents will soon start to participate in collaboration and team building,
resulting in more ethnically and culturally diverse teams.
On the disabilities front, Pierrès et al. (2024) mention that HE should be flexible and
tailored to the diverse needs of students. AI may help provide the necessary
personalization and adaptability, enabling individuals with disabilities to pursue higher
education. If students with disabilities are involved not just in the development and adoption of AI educational technologies, but also in the discussions surrounding the conception and implementation of these tools, they may benefit from an adapted pace of
learning and flexible course schedules (García-González et al., 2020). Furthermore, AI-
driven accessibility tools can provide support to students with disabilities by guaranteeing
equitable access to educational materials (OECD, 2023).
4.3.2. Threats
“Members of the Harvard College community commit themselves to producing
academic work of integrity – that is, work that adheres to the scholarly and intellectual
standards of accurate attribution of sources, appropriate collection and use of data, and
transparent acknowledgement of the contribution of others to their ideas, discoveries,
interpretations, and conclusions. Cheating on exams or problem sets, plagiarizing or
misrepresenting the ideas or language of someone else as one’s own, falsifying data, or
any other instance of academic dishonesty violates the standards of our community, as
well as the standards of the wider world of learning and affairs.”7
This Code of Honor fits well with pre-GenAI academia, where intellectual property
rights were clearly defined, but seems out of step with the post-GenAI environment.
Sources cannot be attributed accurately if these property rights are not properly defined, yet at present no one holds property rights over the output of large language models or image generators – neither the owners of the training data nor the GenAI user. The term
“plagiarism” is no longer adequate to capture the use of the outputs of these GenAI
applications.8 But it is still relevant that, for the sake of academic integrity, “there should
be an acknowledgement of the contribution of others” even though the “others” may be,
in certain cases, GenAI applications.
The increasing use of LLMs and other GenAI tools in subjects that rely heavily on
written outputs, such as essays, presents unique challenges. HEIs are currently in an
experimental phase, exploring various strategies to effectively manage and integrate these
tools into their academic processes. According to UNESCO (2023a), some of the strategies
so far implemented consist of (i) banning ChatGPT in assessments or entirely; (ii) using
software to detect AI-generated text; (iii) shifting to oral, handwritten, or invigilated
exams; (iv) employing assessments that AI struggles to produce, such as podcasts, lab
activities, and group work; (v) establishing policies and guidelines for the ethical and
transparent use of AI in teaching, learning, and research (for example, allowing a
considered use of ChatGPT); and (vi) creating new assessment forms using ChatGPT
explicitly.
However, these measures do not seem to be sufficient, nor widely implemented. For
instance, regarding (v), some universities have set explicit guidelines for AI use in
classrooms (e.g., University of Southern California, Nova School of Business and
Economics) and others have defined expectations about AI usage at both syllabus and
individual levels (e.g., Ohio State University), but, in general, students and
educators still ask for more clarity on the rules. Malmström et al. (2023) found that
although Swedish students are familiar with and have positive perceptions of using AI
tools like ChatGPT and other AI language tools in education, most were unaware of any
institutional guidelines for responsible AI use.
Moreover, inequalities may also be present at the learning level. The content created
by GenAI tools may not follow established pedagogical methods. Take, for instance, the
case of students with impairments: European Commission (2022) questions whether
GenAI systems provide appropriate interaction modes for learners with disabilities or
special education needs. This includes addressing whether the AI system is designed to treat learners respectfully and to adapt to their individual needs.
Both educators and HEIs have a role in this matter, by ensuring human agency and oversight when implementing GenAI. In particular, HEIs must put monitoring systems in place to detect and eliminate these types of occurrences.
The implementation of GenAI in HEIs raises sustainability concerns from two angles: the environmental and the economic. From the environmental perspective, academic and research use of AI is expected to drive up energy consumption. In fact, according to Ray (2023), “the large size and complexity of
ChatGPT models require significant computing resources, which can have negative
environmental impacts. Improving the energy efficiency of ChatGPT models is an
important challenge that needs to be addressed.” In this sense, as AI and machine learning
models become more sophisticated, the demand for computational power and energy
increases, leading to higher carbon footprints.
From the economic standpoint, GenAI opens new opportunities for research and infrastructure. However, developing, implementing, and maintaining
advanced AI systems is expensive. As stated above, significant financial resources are
required for computational power, data storage, skilled personnel, and ongoing research
and development. The direct consequence is that such high costs could strain the budgets
of HEIs, especially those with limited financial resources (UNESCO, 2023a). Additionally,
given the high costs associated with AI technologies, only major, well-funded organizations will have the financial means to enter the AI race in the first place. This could lead to the concentration of AI knowledge and resources within a small
number of institutions, widening the gap between these and smaller or less prosperous
universities. Consequently, this may lead to an even more tilted playing field in higher
education, with leading institutions drawing an ever-higher number of students, research
opportunities, and financial backing.
5. How to Minimise Risks While Tapping the Potential of GenAI
This chapter discusses how the risks of GenAI in higher education can be managed
while harnessing its potential. The preceding chapters highlighted the risks associated
with implementing GenAI for educators, students, and higher education institutions.
Here, we propose measures and programs to address and mitigate each of these identified
risks, ensuring a safer and more effective integration of GenAI in educational
environments.
5.1. Educators
As GenAI technologies become part of HE systems, educators are not just expected to
adapt to these technologies, but crucially must be empowered to do so. This
empowerment should allow them to focus more on guiding new learning approaches and on detecting and monitoring biases, ensuring that learning methods are effective and free from
unfair biases that might affect students' outcomes. With this shift, educators’ priorities
will change, with more time spent planning how to teach effectively rather than creating
the actual teaching materials (Celik et al., 2022). The burden of administrative tasks will
also be reduced – allowing educators to concentrate on the more critical aspects of
teaching – and evaluation methods will evolve towards more practical approaches,
instead of relying mostly on traditional written assignments (Abbas et al., 2024).
However, these opportunities come with challenges of their own. It is thus crucial that educators are prepared to confront and overcome them.
5.1.1. Risks “Shift in students’ trust: Educators vs. GenAI” and “Job Displacement: the potential
of GenAI to automate teaching positions”
Additionally, educators must maintain student engagement. In this sense, it is crucial to find ways to help educators boost student engagement beyond what GenAI tools alone can achieve (U.S. Department of Education, 2023). This means providing educators with the
freedom to bring themselves and their personal experience to the lecture room even in
the presence of AI support. It can also mean providing educators with tools that allow
them to better track their teaching performance by delivering metrics on student
engagement and behaviours during class (U.S. Department of Education, 2023). This
information will then help educators adapt their teaching methods.
To prevent a decline in the number of graduate students financing their advanced
degrees through TA roles, HEIs need to reconsider the role of TAs in GenAI-based
teaching and determine how it can evolve alongside the technology. This involves
identifying which new essential tasks have arisen. It is possible that TA roles will require
more human interaction, such as mentoring students and facilitating group discussions.
TAs might also help educators conduct in-depth analyses of students' performances and
behaviours. However, if TA roles diminish, it is crucial to establish alternative financial
support methods for graduate students, such as research grants and fully funded PhD
fellowships.
In Section 2.3.2.2. we pointed out that AI detection tools are ineffective in identifying
plagiarism due to their lack of accuracy. There are various approaches to address this
issue. First, it is crucial for educators to modify their assessment techniques (UNESCO,
2023a) in a way that fosters personalization and interaction, such as oral exams,
presentations, and in-class writing tasks; and which enables continuous evaluation, tracking students' ongoing progress through projects, portfolios, and
regular quizzes (OECD, 2023). Educators can also promote original work (by designing
assignments that require original thought, critical analysis, and personal reflection), and
employ project-based learning as well as problem-solving tasks that require original
solutions and creative thinking (OECD, 2023).
5.1.3. Risk “Ethical challenges in automating student evaluations”
Educators using AI in their grading tasks will still have to oversee and participate in
the assessment of each student. They should also be transparent about whether they are
using AI in their grading, which is fair since that same transparency is required from
students. However, there may be reasonable exceptions, such as in the case of short-
answer questions. Similar to traditional multiple-choice questions, a quick assessment of
answers may be amenable to AI grading without posing significant ethical concerns. This
will not be the case for other types of assignments, specifically those that are designed for
longer open answers.
Still, an extra challenge needs to be addressed in the context of evaluations under more
personalized teaching. If GenAI allows educators to adapt their educational approaches
to different students, should different educational methods be combined with a common
grading scale? In other words, the trade-off between rewarding the absolute level of skills
(“what does the student know?”) and learning progress (“how much have they learned?”)
may have to change. In this scenario, it may become necessary for educators to prioritise
evaluating fundamental skills, while guaranteeing fairness and consistency in
assessments. Combining formative assessments and performance-based assessments
(such as projects and presentations) can also be beneficial, as the former are tailored to
individual learning paths and contribute to a comprehensive understanding of student
progress, while the latter allow students to apply their knowledge and skills in authentic contexts, providing a more accurate measure of their abilities. The overall learning achievement of the individual may be more important in school or undergraduate
environments, as it demonstrates long-term progress and proficiency, whereas the level
of knowledge and ability is more relevant for the final years of university, where the
emphasis is frequently on the immediate use of knowledge and skills. Regardless, it is
important to communicate the evaluation criteria, methods, and expectations clearly and
transparently to students so that they understand how they will be evaluated.
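As a purely hypothetical illustration of such a communicated scheme, the weights and grading scale below are invented; the point is only that the combination rule is fixed, simple, and known to students in advance.

```python
# Hypothetical weighting scheme announced in the syllabus (0-20 grading scale).
weights = {"formative_progress": 0.30, "project": 0.40, "presentation": 0.30}

def final_grade(scores: dict[str, float]) -> float:
    """Combine component scores into a single, transparent course grade."""
    return round(sum(weights[c] * scores[c] for c in weights), 1)

print(final_grade({"formative_progress": 16, "project": 14, "presentation": 18}))  # 15.8
```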
Educators should also receive AI literacy training to better deal with the lack of
transparency and explainability of most AI models (Lee et al., 2024; UNESCO, 2023a),
covering areas such as the operation of AI models, common sources of bias, and methods
for interpreting AI outputs. These training sessions should also equip educators with the
skills to identify and report biases, ensuring they can effectively intervene when biases
are detected during the course.
5.2. Students
Incorporating GenAI technologies into higher education will lead to various shifts in
students' roles and expectations. Assessments will need to focus more on students'
application of knowledge rather than just memorization, with a higher priority on critical
thinking, problem-solving, and analytical skills (OECD, 2023). Additionally, students will
have to develop strong abilities in organizing and evaluating information, including
collecting, arranging, and carefully assessing its significance, accuracy, and dependability.
Furthermore, it is important that all students, irrespective of their field of study, acquire a foundational understanding of AI.
approach will prepare students for the diverse impacts of AI in various professions and
industries (Pence, 2019), and it should be accompanied by providing fundamental
knowledge about the functioning of AI systems, including GenAI (Farrelly & Baker, 2023).
Understanding the essential principles of machine learning, data management, and
algorithm-based decision-making is crucial for understanding the capabilities and
constraints of these technologies.
Finally, to prevent students from becoming dependent on GenAI tools, these tools must
not offer immediate answers to students. Instead, these applications should suggest hints or possible actions that students can take to solve exercises or find information, leaving the final decision to the student. Furthermore, providing step-by-step feedback should
allow students to comprehend their mistakes and improve, instead of instantly
pinpointing what they are missing. This approach will help students understand how to
enhance their learning.
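A minimal sketch of this hint-first behaviour is given below, implemented as a system instruction wrapped around a language-model call via the OpenAI Python SDK; the policy text, model choice, and example exercise are assumptions rather than a reference to any particular tutoring product.

```python
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

TUTOR_POLICY = (
    "You are a study tutor. Never give the final answer. Respond with "
    "(1) one hint pointing to the relevant concept, "
    "(2) a question that helps the student find the next step themselves, and "
    "(3) brief feedback on what is right or wrong in their attempt so far."
)

def tutor_reply(exercise: str, student_attempt: str) -> str:
    """Return scaffolded guidance instead of a ready-made solution."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        temperature=0.3,
        messages=[
            {"role": "system", "content": TUTOR_POLICY},
            {"role": "user",
             "content": f"Exercise: {exercise}\nMy attempt so far: {student_attempt}"},
        ],
    )
    return response.choices[0].message.content

print(tutor_reply("Show that demand slopes downward under diminishing marginal utility.",
                  "I wrote down MU = dU/dq but I am stuck."))
```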
In the same spirit, gamified environments should be used with caution. Within each course, the number of nudges should be limited. Moreover, there needs to be coordination between the different courses in a degree program to avoid a tug-of-war between courses that leaves students exasperated (similar coordination is already common for out-of-class activities such as homework and group projects).
The second issue with gamification is that students may become dependent on nudges
to decide on how to allocate their time, crowding out their own initiative and ability to
take decisions by themselves. The first step to avoid this issue is again to refrain from
exaggerating the number and frequency of nudges. HEIs should also make sure that their
students learn time-management skills and become autonomous in organizing their
workload.
Uncritical use of GenAI output propagates the system's inherent biases and fabricated information (hallucinations). It is important to educate students to recognise and confront both. This includes providing them with knowledge about
how biases arise and the ability to critically analyse AI-generated content (Farrelly &
Baker, 2023). For instance, this can be achieved by including AI ethics, data science, and
critical thinking courses in the curriculum. Students must also learn how to assess and
verify AI-generated results. Additional measures to avoid and combat biases are provided
in Section 5.3.2.
5.3. Higher Education Institutions
HEIs should train their staff to ensure the efficient utilization of advanced AI technologies and to detect potential risks and constraints (UNESCO, 2023a).
To diminish the risk of accentuating biases and inequality in the context of student recruitment, HEIs need to (i) establish strong mechanisms to monitor and analyse the results of AI-based recruitment procedures (George, 2023), as illustrated in the sketch after this paragraph; (ii) evaluate whether AI systems impartially assess applications from students with disabilities and offer the necessary assistance during the recruitment process; (iii) make use of AI to acquire data-driven insights into the efficiency of recruitment approaches and pinpoint areas where algorithmic discrimination may arise; (iv) leverage AI to proactively identify students who may require additional support, such as tutoring, mentoring, or financial aid; and (v) incorporate the perspectives of students with disabilities to address ethical concerns and promote greater accessibility and inclusivity (Pierrès et al., 2024). This
information can be used by HEIs to create scholarship programs that cover tuition fees
and other associated costs to prevent dropouts. The qualification requirements for these
scholarships should encompass a variety of skills beyond traditional academic
evaluations.
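The sketch below illustrates point (i): a simple selection-rate audit over the outcomes of an AI-assisted admissions step. The data, group labels, and the 0.8 threshold (a common rule of thumb for disparate impact) are illustrative assumptions, not a prescribed standard.

```python
import pandas as pd

# Hypothetical audit extract: one row per applicant processed by the AI-assisted step.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B", "A", "B", "A"],
    "admitted": [ 1,   0,   1,   0,   0,   1,   0,   1,   0,   1],
})

rates = decisions.groupby("group")["admitted"].mean()   # selection rate per group
ratio = rates.min() / rates.max()                       # disparate-impact style ratio

print(rates)
print(f"Selection-rate ratio: {ratio:.2f}")
if ratio < 0.8:   # rule-of-thumb threshold; any flag goes to human review
    print("Flag for human review: possible algorithmic discrimination.")
```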
In terms of student learning, it is essential to ensure that all students have equal access
to AI tools (UNESCO, 2023a), as well as that they are provided with all the required
support and resources, regardless of their socioeconomic and demographic backgrounds.
Also, HEIs must ensure that GenAI educational tools are tailored to recognise and tackle
the difficulties encountered by underrepresented groups (Pierrès et al., 2024), allowing
equitable learning among all students. Moreover, to prevent biases in student evaluation,
it is important to implement measures comparable to those recommended for addressing
biases in student admissions.
It is essential that HEIs develop AI systems that are designed to protect student data
(George, 2023; Kurtz et al., 2024). This should be accompanied by other types
of measures. For instance, HEIs should implement robust encryption methods, conduct
regular security audits, and train staff and students on data privacy practices. Ensuring
transparency in data collection and usage policies is also crucial, including compliance
with data protection regulations to maintain students’ trust.
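As one small, hedged example of what "robust encryption methods" can mean in practice, the sketch below uses the Fernet recipe from the widely used Python cryptography package to encrypt a student record before storage; key management, storage paths, and the record format are assumptions and would be handled by proper infrastructure in a real deployment.

```python
from cryptography.fernet import Fernet

# Illustrative only: in practice the key lives in a managed key store, never in code.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"student_id": "s123456", "support_plan": "extended exam time"}'
token = fernet.encrypt(record)    # ciphertext that is safe to store or transmit
restored = fernet.decrypt(token)  # recoverable only with the key

assert restored == record
print(token[:40])                 # unreadable without the key
```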
Furthermore, it is important to establish standards and frameworks for the ethical use
of GenAI, ensuring compliance with evolving regulations and ethical considerations while
maintaining transparency and accountability in data handling practices (Malmström et
al., 2023; Johnston et al., 2024; Lim et al., 2023).
As mentioned in Section 4.3.2.4, the sustainability issues associated with GenAI take
two forms: environmental and economic. Regarding the former, universities should
explore more sustainable practices, such as opting for systems that save energy and
tapping into renewable energy sources to run their data centres (UNESCO, 2023a). Also,
HEIs should participate in developing sustainable, energy-efficient hardware and software
solutions, promoting recycling and responsible disposal of electronic waste, and
considering the life cycle of products to address the environmental impact of computer
science (Ray, 2023). Moreover, regulatory pressures and the push for greener
technologies are expected to drive HEIs to innovate and adopt energy-efficient AI
solutions. In this sense, establishing agreed caps on energy consumption could
help ensure fair competition between HEIs.
In terms of economic sustainability, given the financial restrictions that most public
HEIs are facing, one possibility is to encourage the establishment of consortia or
partnerships to share financial resources and make advanced AI systems more accessible
while also promoting collaborations between HEIs and industry partners to finance AI
research and development efforts jointly (UNESCO, 2023a). Additionally, HEIs will need
to advocate for increased government funding and grants specifically targeted for AI
research and infrastructure in HEIs. Lastly, lending support to open-source AI projects
that offer free and accessible tools for educational institutions can help reduce barriers to
entry for institutions with limited budgets.
6. Conclusions and Recommendations
GenAI possesses significant potential to enhance the experiences of both educators and
students within the realm of higher education and improve the functioning of higher
education institutions (HEIs), attributable to its ability to deliver personalised learning
experiences, optimise administrative processes, augment accessibility, and facilitate
innovative pedagogical strategies. It can customise educational materials to cater to the
unique requirements of individual students; assist educators in accommodating different
learning styles and paces; automate routine responsibilities such as assessment grading;
improve accessibility by offering adaptive learning resources; and empower educators to
explore novel and creative pedagogical methodologies.
At the institutional level, GenAI can transform various operational and academic
procedures, increasing efficiency and efficacy. It can refine the allocation of resources
through the analysis of patterns and the forecasting of requirements; can facilitate
recruitment and the admissions process by catering communication to different groups of
potential candidates and by automating the assessment of applications; can enhance
student support via AI-driven chatbots and virtual advisors; can manage, analyse, and
present substantial volumes of data, allowing for data-driven decision-making that
enhances internal processes.
However, GenAI poses risks to all three actors mentioned (educators, students, and HEIs) that must be carefully considered and managed. Such risks include biases in AI algorithms; privacy and data security concerns; lack of transparency or explainability; ethical and
plagiarism issues; sustainability issues; reduced critical thinking, creativity, and social
skills in students; and workforce replacement.
It is possible to prevent and minimise these risks – if HEIs are aware of their existence
and willing to confront them. Higher education institutions should take a proactive and
comprehensive approach by:
Approving guidelines for the use of GenAI within each institution. More than general
regulations at the university level, these should be adapted to the respective area of
knowledge, since each faces its own specific issues (e.g., engineering, literature studies,
data analytics, economics, or sociology).
Monitoring and updating AI systems so that they remain fair, transparent, effective,
safe, and secure. This implies investing in both technology and the IT workforce.
Fostering a culture of ethical awareness among educators and students, ensuring that
the use of GenAI aligns with the core values of education in general and in each academic
community.
Increasing the digital and AI literacy of students, particularly regarding those applications that are most useful to their academic success, in order both to improve learning and to eliminate
the effect of socio-economic factors on pre-university access to digital technologies.
Investing in regular training and professional development for educators and staff to
equip them with the skills and knowledge needed to effectively integrate AI into teaching
and administrative tasks while maintaining the human-centered aspects of education.
Of course, HEIs do not exist in a vacuum – students will have gone through at least a
decade of schooling before they reach university. Educational policies at the high school
level should already include an exposure to GenAI and discussions about its capabilities,
strengths and weaknesses, and ethical issues. The coming generations of students will
have experience with GenAI regardless of whether school helps them use it responsibly.
Educators and students of HEIs should proactively adapt to the increasing use of GenAI
in teaching, learning and assessments:
Students will use GenAI extensively, and this is beneficial if it is done with
transparency and respecting academic integrity and honesty. Educators should assist
them and promote open conversations about risks and rules of AI usage.
Educators should focus their teaching on promoting critical thinking (in general and
concerning the outputs of GenAI), make it interactive, and teach students how to learn.
At the same time, educators should be aware that students should not be subjected to
too many messages and nudges from AI-driven tutoring systems, especially those that
involve a gamified environment, to avoid cognitive overload, a drop in self-esteem if they cannot keep up, and a growing dependence on external stimuli to drive their choices.
Students should learn how to use GenAI tools productively and correctly, to improve
their learning and prepare themselves for future careers in the labour market or in
academic research.
Educators should give more weight to student evaluations that reward independent
and critical thinking, problem solving, the ability to work in groups and present their work
in public, at the expense of traditional evaluation methods based on reproducing facts or
composing essays.
If the number of students is small, which tends to be the case at the master’s level,
educators should organize group presentations; if the number of students is high,
such as at the bachelor level, more written in-class examinations are warranted. Either of
these implies that educators will have to spend more time evaluating students, though
group presentations can be an important part of a course designed to be interactive.
When educators use AI tools to perform the assessment of students’ work, this should
be transparent (indicated on the syllabus), and students should receive feedback,
including the possibility to contest the grade.
References
Abbas, M., Jam, F. A., & Khan, T. I. (2024). Is it harmful or helpful? Examining the causes
and consequences of generative AI usage among university students. International
Journal of Educational Technology in Higher Education, 21(10).
https://doi.org/10.1186/s41239-024-00444-7
Beck, S. W., & Levine, S. R. (2023). Backtalk: ChatGPT: A powerful technology tool for
writing instruction. Phi Delta Kappan, 105(1), 66-67.
https://doi.org/10.1177/00317217231197487
Berendt, B., Littlejohn, A., & Blakemore, M. (2020). AI in education: Learner choice and
fundamental rights. Learning, Media and Technology, 45(3), 312–324. Available at:
https://doi.org/10.1080/17439884.2020.1786399.
Biggs, J. (1999). What the Student Does: teaching for enhanced learning. Higher
Education Research & Development, 18(1), 57–75.
https://doi.org/10.1080/0729436990180105
Biggs, J. B. (2011). Teaching for quality learning at university: What the student does.
McGraw-Hill Education (UK).
Chan, C., & Hu, W. (2023). Students’ voices on generative AI: perceptions, benefits, and
challenges in higher education. International Journal of Educational Technology in
Higher Education, 20(43). https://doi.org/10.1186/s41239-023-00411-8
Chan, C. K. Y., & Tsi, L. H. Y. (2023). The AI Revolution in Education: Will AI Replace or
Assist Teachers in Higher Education? abs/2305.01185.
https://doi.org/10.48550/arXiv.2305.01185
Celik, İ., Dindar, M., Muukkonen, H., & Järvelä, S. (2022). The promises and challenges
of artificial intelligence for teachers: A systematic review of research. TechTrends, 66,
616–630. https://doi.org/10.1007/s11528-022-00715-y
Cingillioglu, I., Gal, U., & Prokhorov, A. (2024). AI-experiments in education: An AI-
driven randomized controlled trial for higher education research. Education and Information Technologies.
https://doi.org/10.1007/s10639-024-12633-y
Choi, J. H., Hickman, K. E., Monahan, A., & Schwarcz, D. (2023). ChatGPT goes to law
school. Journal of Legal Education, 71(2022), 387.
http://dx.doi.org/10.2139/ssrn.4335905
Cotton, D. R. E., Cotton, P. A., & Shipway, J. R. (2023). Chatting and cheating: Ensuring
academic integrity in the era of ChatGPT. Innovations in Education and Teaching
International, 61(2), 228–239.
https://doi.org/10.1080/14703297.2023.2190148
Crompton, H., & Burke, D. (2023).
Artificial intelligence in higher education: The state of the field. International Journal of
Educational Technology in Higher Education, 20(22). https://doi.org/10.1186/s41239-
023-00392-8
Dai, W., Lin, J., Jin, H., Li, T., Tsai, Y. S., Gasevic, D., & Chen, G. (2023). Can large
language models provide feedback to students? A case study on ChatGPT. In N.-S. Chen,
G. Rudolph, D. G. Sampson, M. Chang, R. Kuo, & A. Tlili (Eds.), Proceedings - 2023 IEEE
International Conference on Advanced Learning Technologies, ICALT 2023 (pp. 323-
325). IEEE, Institute of Electrical and Electronics Engineers.
https://doi.org/10.1109/ICALT58122.2023.00100
Dalalah, D. & Dalalah, O. (2023). The false positives and false negatives of generative AI
detection tools in education and academic research: The case of ChatGPT. The
International Journal of Management Education 21(2).
https://doi.org/10.1016/j.ijme.2023.100822
De Wit, K. & Broucker, B. (2017). The governance of big data in higher education. In Data
Analytics Applications in Education. Auerbach Publications.
Elkhatat, A. M., Elsaid, K., & Almeer, S. (2023). Evaluating the efficacy of AI content
detection tools in differentiating between human and AI-generated text. International
Journal for Educational Integrity, 19(17). https://doi.org/10.1007/s40979-023-00140-5
European Commission (2018). The impact of Artificial Intelligence on learning, teaching,
and education, Joint Research Centre Publications Office.
European Commission (2022). Ethical guidelines on the use of artificial intelligence (AI)
and data in teaching and learning for educators, Directorate-General for Education,
Youth, Sport and Culture. https://data.europa.eu/doi/10.2766/153756
Farrelly, T., & Baker, N. (2023). Generative artificial intelligence: Implications and
considerations for higher education practice. Education Sciences, 13(11), 1109.
https://doi.org/10.3390/educsci13111109
Felix, C.V. (2020), "The Role of the Teacher and AI in Education", Sengupta, E.,
Blessinger, P. and Makhanya, M.S. (Ed.) International Perspectives on the Role of
Technology in Humanizing Higher Education (Innovations in Higher Education Teaching
and Learning, Vol. 33), Emerald Publishing Limited, Leeds, pp. 33-48.
https://doi.org/10.1108/S2055-364120200000033003
Felten, E. W., Raj, M., & Seamans, R. (2023). How will Language Modelers like ChatGPT
Affect Occupations and Industries? https://doi.org/10.2139/ssrn.4375268
García-González, J. M., Gutiérrez Gómez-Calcerrada, S., Solera Hernández, E., & Ríos-
Aguilar, S. (2020). Barriers in higher education: perceptions and discourse analysis of
students with disabilities in Spain. Disability & Society, 36(4), 579–595.
https://doi.org/10.1080/09687599.2020.1749565
Ge, Z., & Hu, Y. (2020). Innovative Application of Artificial Intelligence (AI) in the
Management of Higher Education and Teaching. Journal of Physics: Conference Series, 1533(3), 032089.
https://doi.org/10.1088/1742-6596/1533/3/032089
Hooshyar, D., Ahmad, R., Yousefi, M., Fathi, M., Horng, S.-J., & Lim, H. (2016). Applying
an online game-based formative assessment in a flowchart-based intelligent tutoring
system for improving problem-solving skills. Computers & Education, 94.
https://doi.org/10.1016/J.COMPEDU.2015.10.013
Ibrahim, H., Asim, R., Zaffar, F., Rahwan, T., & Zaki, Y. (2023, March/April). Rethinking
homework in the age of artificial intelligence. IEEE Intelligent Systems, 24-27.
Ilieva, G., Yankova, T., Klisarova-Belcheva, S., Dimitrov, A., Bratkov, M., & Angelov, D.
(2023). Effects of generative chatbots in higher education. Information, 14(9), 492.
https://doi.org/10.3390/info14090492
Johnston, H., Wells, R., Shanks, E., Boey, T., & Parsons, B. (2024). Student perspectives
on the use of generative artificial intelligence technologies in higher education.
International Journal for Educational Integrity. https://doi.org/10.1007/s40979-024-
00149-4
Kabudi, T., Pappas, I., & Håkon Olsen, D. (2021). AI-enabled adaptive learning systems:
A systematic mapping of the literature. Computers and Education: Artificial Intelligence,
2. https://doi.org/10.1016/j.caeai.2021.100017
Kaharuddin, K. (2021). Assessing the effect of using artificial intelligence on the writing
skill of Indonesian learners of English. Linguistics and Culture Review.
Kamalov, F., Santandreu Calonge, D., & Gurrib, I. (2023). New era of artificial intelligence
in education: Towards a sustainable multifaceted revolution. Sustainability, 15(12451).
https://doi.org/10.3390/su151612451
Kasneci, E., Seßler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., …
Kasneci, G. (2023, January 30). ChatGPT for Good? On Opportunities and Challenges of
Large Language Models for Education. https://doi.org/10.35542/osf.io/5er8f
Kim, C., & Bennekin, K. (2016). The effectiveness of volition support (VoS) in promoting
students’ effort regulation and performance in an online mathematics course.
Instructional Science, 359-377. Retrieved from
https://link.springer.com/article/10.1007/s11251-015-9366-5
Kurtz, G., Amzalag, M., Shaked, N., Zaguri, Y., Kohen-Vacs, D., Gal, E., Zailer, G., & Barak-
Medina, E. (2024). Strategies for integrating generative AI into higher education:
Navigating challenges and leveraging opportunities. Education Sciences, 14(5), 503.
https://doi.org/10.3390/educsci14050503
Labadze, L., Grigolia, M., & Machaidze, L. (2023). Role of AI chatbots in education:
Systematic literature review. International Journal of Educational Technology in Higher
Education, 20(1), 56. https://doi.org/10.1186/s41239-023-00426-1
Lee, D., Arnold, M., Srivastava, A., Plastow, K., Strelan, P., Ploeckl, F., Lekkas, D., &
Palmer, E. (2024). The impact of generative AI on higher education learning and
teaching: A study of educators’ perspectives. Computers and Education: Artificial
Intelligence, 6, 100221. https://doi.org/10.1016/j.caeai.2024.100221
Liang, J., Wang, L., Luo, J., Yan, Y., & Fan, C. (2023). The relationship between student
interaction with generative artificial intelligence and learning achievement: Serial
mediating roles of self-efficacy and cognitive engagement. Frontiers in Psychology, 14.
https://doi.org/10.3389/fpsyg.2023.1285392
Lim, W.M., Gunasekara, A.N., Pallant, J.L., Pallant, J.I., & Pechenkina, E. (2023).
Generative AI and the future of education: Ragnarök or reformation? A paradoxical
perspective from management educators. The International Journal of Management
Education.
Lin, C.-C., Huang, A. Y. Q., & Yang, S. J. H. (2023). A review of AI-driven conversational
chatbots implementation methodologies and challenges (1999–2022). Sustainability,
15(5), 4012. https://doi.org/10.3390/su15054012
Lu, O., Huang, A., Tsai, D., & Yang, S. (2021). Expert-Authored and Machine-Generated
Short-Answer Questions for Assessing Students Learning Performance. Educational
Technology & Society, 159-173.
Malmström, H., Stöhr, C., & Ou, A. W. (2023). Chatbots and other AI for learning: A
survey of use and views among university students in Sweden. (Chalmers Studies in
Communication and Learning in Higher Education 2023:1).
https://doi.org/10.17196/cls.csclhe/2023/01
Memarian, B., & Doleck, T. (2023). ChatGPT in education: Methods, potentials, and
limitations. Computers in Human Behavior: Artificial Humans, 1(2), 100022.
https://doi.org/10.1016/j.chbah.2023.100022
Memarian, B., & Doleck, T. (2023a). Fairness, Accountability, Transparency, and Ethics
(FATE) in Artificial Intelligence (AI) and higher education: A systematic review.
Computers and Education: Artificial Intelligence, 5, 100152.
https://doi.org/10.1016/j.caeai.2023.100152
Maravanyika, M., Dlodlo, N., & Jere, N. (2017). An adaptive recommender-system based
framework for personalised teaching and learning on e-learning platforms. 2017 IST-
Africa Week Conference (IST-Africa). Windhoek, Namibia: Institute of Electrical and
Electronics Engineers (IEEE). Retrieved from
https://ieeexplore.ieee.org/abstract/document/8102297/references
Marrone, R., Taddeo, V., & Hill, G. (2022). Creativity and artificial intelligence—A
student perspective. Journal of Intelligence, 10(3), 65.
https://doi.org/10.3390/jintelligence10030065
Mousavinasab, E., Zarifsanaiey, N., Kalhori, S., Rakhshan, M., Keikha, L., & Saeedi, M.
(2018). Intelligent tutoring systems: a systematic review of characteristics, applications,
and evaluation methods. Interactive Learning Environments, 29, 142–163.
Nagy, M., & Molontay, R. (2023). Interpretable dropout prediction: Towards XAI-based
personalized intervention. International Journal of Artificial Intelligence in Education, 1–
2. https://doi.org/10.1007/s40593-023-00331-8
Ouyang, F., Zheng, L., & Jiao, P. (2022). Artificial intelligence in online higher education:
A systematic review of empirical research from 2011 to 2020. Education and Information
Technologies, 27(1). https://doi.org/10.1007/s10639-022-10925-9
Padron-Rivera, G., Joaquin-Salas, C., Patoni-Nieves, J.-L., & Bravo-Perez, J.-C. (2018).
Patterns in Poor Learning Engagement in Students While They Are Solving Mathematics
Exercises in an Affective Tutoring System Related to Frustration. Mexican Conference on
Pattern Recognition (pp. 169–177). Springer International Publishing AG.
Pence, H. E. (2019). Artificial intelligence in higher education: New wine in old
wineskins? Journal of Educational Technology Systems, 48(1), 5-13.
https://doi.org/10.1177/0047239519865577
Pierrès, O., Christen, M., Schmitt-Koopmann, F.M., & Darvishy, A. (2024). Could the Use
of AI in Higher Education Hinder Students With Disabilities? A Scoping Review. IEEE
Access, 12, 27810-27828. https://doi.org/10.1109/access.2024.3365368
Polyportis, A. (2024). A longitudinal study on artificial intelligence adoption:
Understanding the drivers of ChatGPT usage behavior change in higher education.
Frontiers in Artificial Intelligence, 6. https://doi.org/10.3389/frai.2023.1324398
Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on
teaching and learning in higher education. Research and Practice in Technology
Enhanced Learning, 12(22). https://doi.org/10.1186/s41039-017-0062-8
Popper, K. (1963). Conjectures and Refutations: The Growth of Scientific Knowledge.
Routledge.
Roediger, H. L., III, & Butler, A. C. (2011). The critical role of retrieval practice in long-
term retention. Trends in Cognitive Sciences, 15(1), 20-27.
https://doi.org/10.1016/j.tics.2010.09.003
Rutner, S., & Scott, R. (2022). Use of Artificial Intelligence to Grade Student Discussion
Boards: An Exploratory Study. Information Systems Education Journal, 4-18.
Sajja, R., Sermet, Y., Cwiertny, D., et al. (2023). Platform-independent and curriculum-
oriented intelligent assistant for higher education. International Journal of Educational
Technology in Higher Education, 20(42). https://doi.org/10.1186/s41239-023-00412-7
Sullivan, M., Kelly, A., & McLaughlan, P. (2023). ChatGPT in higher education:
Considerations for academic integrity and student learning. Journal of Applied Learning
and Teaching, 6(1), 1-10. https://doi.org/10.37074/jalt.2023.6.1.17
Tlili, A., Shehata, B., Adarkwah, M. A., et al. (2023). What if the devil is my guardian
angel: ChatGPT as a case study of using chatbots in education. Smart Learning
Environments, 10(15). https://doi.org/10.1186/s40561-023-00237-x
UNESCO (2023). Guidance for generative AI in education and research (F. Miao & W.
Holmes, Authors). UNESCO. https://doi.org/10.54675/EWZM9535
Sabzalieva, A. Valentini, D. Vieira do Nascimento, & C. Yerovi, Authors). UNESCO
IESALC. https://unesdoc.unesco.org/ark:/48223/pf0000386670
Weber-Wulff, D., Anohina-Naumeca, A., Bjelobaba, S., Foltýnek, T., Guerrero-Dib, J.,
Popoola, O., Waddington, L. (2023). Testing of detection tools for AI-generated text.
International Journal for Educational Integrity, 19. https://doi.org/10.1007/s40979-
023-00146-z
Yang, A., Chen, I., Flanagan, B., & Ogata, H. (2021). Automatic Generation of Cloze Items
for Repeated Testing to Improve Reading Comprehension. Educational Technology &
Society, 147-158.
Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review
of research on artificial intelligence applications in higher education – where are the
educators? International Journal of Educational Technology in Higher Education, 16(39).
https://doi.org/10.1186/s41239-019-0171-0
REPORT 11
Generative AI and Higher Education: Challenges and Opportunities
Authors: Steffen Hoernig, André Ilharco, Paulo Trigo Pereira, Regina Pereira
ISSN: 2183-9360
September 2024
Institute of Public Policy Lisbon – Rua Miguel Lupi 20, 1249-078 Lisboa PORTUGAL
www.ipp-jcs.org – email: [email protected] – tel.: +351 213 925 986 – NIF: 510654320
The views and information set out herein are those of the authors and do not necessarily reflect those of the Institute of Public Policy, the
University of Lisbon, or any other institution which either the authors or IPP may be affiliated with. Neither Institute of Public Policy nor
any person acting on its behalf can be held responsible for any use which may be made of the information contained herein. This report
may not be reproduced, distributed, or published without the explicit previous consent of its authors. Citations are authorized, provided
the original source is acknowledged.
www.ipp-jcs.org
© Institute of Public Policy Lisbon | September 2024 – All rights reserved