
REPORT 11

Generative AI and Higher Education: Challenges and Opportunities

Steffen Hoernig, André Ilharco, Paulo Trigo Pereira, Regina Pereira


[email protected], [email protected], [email protected], [email protected]

www.ipp-jcs.org
© Institute of Public Policy Lisbon | September 2024 – All rights reserved
Reports
Reports by the Institute of Public Policy aim to support public debate with comprehensive research and detailed empirical analysis.

The authors

Steffen Hoernig is a Professor of Economics at Nova School of Business and Economics. André Ilharco is a researcher at the Institute
of Public Policy. Paulo Trigo Pereira is a Professor at ISEG, University of Lisbon, and President of the Institute of Public Policy. Regina
Pereira is a researcher at the Institute of Public Policy.

About the Institute of Public Policy


The Institute of Public Policy is an independent Portuguese academic think tank. Its mission is to contribute to the continuous
improvement of the analysis and public debate of institutions and public policies, with an emphasis on Portugal and Europe, through
the creation and dissemination of relevant research.

Contents

1. Introduction........................................................................................................................... 5
2. Educators: Teaching and Evaluation Methods ...................................................................... 8
2.1. Findings from the Literature ......................................................................................... 8
2.2. Use Cases..................................................................................................................... 10
2.3. Main Opportunities and Threats ................................................................................. 13
3. Students: Learning and Autonomy...................................................................................... 19
3.1. Findings from the Literature ....................................................................................... 20
3.2. Use Cases..................................................................................................................... 21
3.3. Main Opportunities and Threats ................................................................................. 24
4. Higher Education Institutions: Organizing and Planning ..................................................... 29
4.1. Findings from the Literature ....................................................................................... 30
4.2. Use Cases..................................................................................................................... 31
4.3. Main Opportunities and Threats ................................................................................. 35
5. How to Minimise Risks While Tapping the Potential of GenAI ........................................... 40
5.1. Educators..................................................................................................................... 40
5.2. Students ...................................................................................................................... 44
5.3. Higher Education Institutions ...................................................................................... 46
6. Conclusions and Recommendations ................................................................................... 49
References................................................................................................................................... 53

1. Introduction1
Generative AI (GenAI) became a worldwide sensation in late 2022, with the public
release of ChatGPT. Soon students, educators, teaching assistants, and university
administrators grasped the potential of instruments such as chatbots in higher education
(HE). While some foresaw the end of classical teaching, replaced by AI assistants, others recognised the transformative potential of GenAI to shape and enhance education to everybody's benefit.

GenAI can provide students with quick summaries of educational materials, help
teachers prepare their classes and grade their students' exams, as well as reorganise the
entire academic curriculum and teaching approach for future students based on an
analysis of previous and incoming students' data. GenAI brings even more possibilities to the table: it does not simply curate
information or summarise many sources into one usable, easily accessible document; it also generates new content. This content may
take the form of human-like text, images, video, sound, software, or any other modality for which the technology can collect data.
Large Language Models (LLMs) are the GenAI technology behind AI chatbots such as OpenAI's ChatGPT, Google's Gemini, and
Microsoft's Copilot. Other types of AI ("big data" analytics) will also play their part in developing higher education, thanks to their
ability to analyse large data sets and find patterns. For instance, an AI-powered analysis of student data,
such as their age, gender, or previous school grades, among other performance-related
data, can help universities better grasp the roots of old educational problems such as
dropouts, unequal opportunities arising from external factors, or outdated curricula.

If left to chance, these new possibilities may also lead to abuse. There is thus a double need: to prevent the misuse of GenAI, and to
use it to enhance the performance and integration of students, teachers, and every other HE actor. The European Union has already
produced various policy documents. The Living guidelines on the responsible use of generative AI in
research (European Commission, 2024)2 proposes guidelines for research, which can be
extended to higher education more generally. Examples include the obligation for researchers
to disclose the use of AI in their research (similar to teachers disclosing the use of AI in in-class activities and evaluations), and the
recommendation to avoid the predominant use of AI in tasks that affect other

1 This report is a work in progress and comments are welcome. We would like to thank Google for supporting this research, and
ISEG (University of Lisbon) for supporting the Institute of Public Policy. The cover image was produced by ChatGPT.
2 https://research-and-innovation.ec.europa.eu/document/2b6cf7e5-36ac-41cb-aab5-0d32050143dc_en

researchers (e.g. peer reviews); the latter can be justified on the same grounds as, for example, the
obligation for teachers not to subject their students to 100% AI-managed evaluations.

The Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and
learning for Educators (European Commission, 2022)3 refers to four main domains where
AI and Data can be used in Education (student teaching; student supporting; teacher
supporting; and system supporting) and provides thirty-six Guidance Questions for
Educators regarding human agency and oversight; transparency; diversity, non-
discrimination and fairness; societal and environmental well-being; privacy and data
governance; technical robustness and safety; and accountability.

Other institutions have produced policy documents (OECD 2023, UNESCO 2023,
2023a) addressing the impact of AI on education and research. As GenAI
develops, the number of papers published on the topic increases exponentially, and so
does the number of higher education institutions that produce either regulations on AI
use or guidelines for educators and students.

This report aims to give a brief overview of the impact of GenAI on higher education
along three main dimensions: educators (Section 2), students (Section 3) and higher
education institutions (Section 4). Our focus is on the main opportunities and threats
associated with GenAI, with a view to minimising the risks while taking full advantage of the opportunities.

We faced two difficulties in composing this report. One is that GenAI is a moving
target. We know what it is today and which applications are currently available (LLMs, image/sound/video
generators, editing tools such as Grammarly, bibliography management tools such as SciSpace), but we do
not yet know its future incarnations and uses. The other difficulty is that we would like to appeal to a
shared understanding of the mission and core values of Higher Education Institutions (HEIs). For example,
plagiarism is a threat precisely because academic integrity is a shared core value. The identification of, and
trade-offs between, threats and opportunities depend on which core values we focus on.

The first difficulty is insurmountable. The only certainty we have is that the questions
we address in this paper will have different answers in a few years as GenAI develops and
new applications emerge. As for the second, we will mention a set of core values
throughout the paper, raise questions, and present our answers and recommendations in
Sections 5 and 6.

3 https://op.europa.eu/en/publication-detail/-/publication/d81a0d54-5348-11ed-92ed-01aa75ed71a1/language-en

The following questions are among the most relevant raised by the development of
GenAI:

• Will GenAI change the mission of higher education institutions?


• How to safeguard the irreplaceable value of human educators even under widespread use of GenAI?
• How is GenAI changing the role of teachers and teaching assistants?
• What changes in assessment techniques are needed to keep assessments effective, ensure integrity and discourage plagiarism?
• Can we expect new, effective AI applications that detect plagiarism?
• What are reasonable and unreasonable uses of GenAI in the assessment of students, assuming transparency about its use in evaluation?
• If GenAI allows teaching to be personalised to each student's progress, should or can a common grading scale be maintained?
• How to promote the ethical and responsible use of AI outputs by the academic community?
• Should we introduce AI literacy in all curricula, irrespective of the field of study, but adapted to each field?
• How to best convey to students how data biases influence AI outputs?
• How to reduce individualism and promote social skills and critical thinking?
• Should guidelines for the ethical and conscious use of GenAI be set at the university level, the school/department level, or both?
• How to protect students' data and privacy?
• How to promote equal access to AI tools for students from more vulnerable socioeconomic backgrounds?
• What is the optimal level of stimuli for students (e.g., notifications and nudges) to promote engagement and motivation, while avoiding loss of autonomy and self-esteem?

It is beyond the scope of this report to answer the first question, which is a fundamental,
existential question for higher education institutions that has received little attention in the literature.
However, we address all the other questions in this report.

2. Educators: Teaching and Evaluation Methods

“One recent study found that in the 20 occupations most exposed to AI language
modelling, there are 14 teaching subcategories including English language and literature,
foreign language and literature, and history teachers” (Felten et al., 2023).

The impact of GenAI on teaching is expected to be immense and diverse. It will
certainly depend on the learning area, as the above citation indicates. Still, regardless of
the area, it will undoubtedly transform the possibilities educators have to engage with
their students and motivate them to learn.

In terms of opportunities for educators, one can identify three main domains: the
preparation of class materials, the attractiveness of teaching, and the evaluation of students' work,
where the latter faces new challenges due to students' (mis)use of AI. Concerning the
risks, the quality of AI system outputs is not guaranteed: outputs may not be in line with
educational goals or may contain biases. Educators will be responsible for overseeing AI use and
avoiding these issues.

Additionally, educators will have to develop evaluation methods consistent with
students' use of GenAI and remain permanently aware of its growing capabilities. HEIs and
their educators have indeed started to respond to the availability of GenAI. This
adaptation first consisted of replacing evaluations in which students can misuse AI without
detection, such as take-home exams, with classical real-time evaluations, such as written
exams or class presentations. However, this is a purely defensive move and will often not
be feasible due to time constraints. The next step is to actively exploit the capabilities of
GenAI to improve both teaching and learning. This cannot be left to each educator’s
initiative and resources: HEIs must offer active support.

2.1. Findings from the Literature

The rapid integration of GenAI into educational settings is reshaping curriculum design
and pedagogical approaches. In a recent study, Lee et al. (2024) formed a Community of
Practice (CoP) among the University of Adelaide’s staff and students to collectively discuss
issues surrounding AI. Half of the participants indicated they had modified their course
designs due to GenAI. A prevalent change involved assessments, with many participants
mentioning the use of varied assessment methods, the introduction of new assessment
questions, and tasks focused on the application of knowledge. Additional adjustments

included altering essay topics to emphasise fieldwork and using AI to create low-quality
drafts that students were then asked to improve.

Although most AI-enabled adaptive learning systems and frameworks are still in an
experimental phase, they have seen growing interest and use since the pandemic (Kabudi
et al., 2021), particularly to improve pedagogy. These systems can enhance teaching and
offer individualised assistance closely matched to the specific learning goals and success
criteria of the course or instructional unit, allowing educators to "tailor
educational content to the distinct needs, interests, and learning preferences of each
student, offering personalised learning materials and activities” (Labadze et al., 2023).

Two prominent applications of GenAI in education are Intelligent Tutoring Systems
(ITS) and chatbots. The former use AI techniques and educational approaches customised
to students' specific characteristics and requirements (Mousavinasab et al., 2018). The
latter are AI-driven conversational agents that interact with students through text or
voice interfaces. They can answer questions, provide explanations, and guide students
through various learning activities (Lin et al., 2023; Memarian & Doleck, 2023).

GenAI may promote learner engagement through adaptive, personalised
recommendations (Maravanyika et al., 2017). Padron-Rivera et al. (2018) propose the
affective tutoring system Tamaxtil, designed to detect when students become frustrated
and confused and adapt the learning experience to the student's emotional state.

Concerning exam preparation and correction, GenAI has been found to transform both
educators' workloads and their capabilities. Yang et al. (2021) reported that it can
generate questions and create multiple-choice tests. Additionally, Lu et al. (2021) used
natural language processing to build a system that automatically creates tests,
and found that AI technologies can generate highly reliable short-answer questions.
Although not pure GenAI per se, AI-powered automatic assessment reduces grading time
(Crompton & Burke, 2023; Rutner & Scott, 2022). Nonetheless, GenAI's capabilities are
essential to provide detailed and personalised feedback on those assessments.
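
To make this concrete, the following is a minimal, purely illustrative sketch of the kind of automatic question generation described above. It assumes access to the OpenAI Python SDK and uses a placeholder model name; neither the prompt nor the parameters are taken from the studies cited.

```python
# Illustrative sketch: generating multiple-choice questions with an LLM,
# in the spirit of the automated test-creation systems cited above.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name is a placeholder, not a recommendation.
from openai import OpenAI

client = OpenAI()

def generate_mcq(topic: str, n_questions: int = 3) -> str:
    """Ask the model for multiple-choice questions on a course topic."""
    prompt = (
        f"Write {n_questions} multiple-choice questions on '{topic}' for a "
        "first-year university course. Give four options (A-D) per question "
        "and mark the correct answer with an asterisk."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(generate_mcq("price elasticity of demand"))
```

In any such workflow, the educator must of course review and correct the generated items before using them, in line with the human oversight discussed throughout this report.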

After the launch of ChatGPT in November 2022, both students and educators were
struck by the quality of the chatbot's text outputs. Many educators were also
immediately concerned with the need to weed out GenAI-driven plagiarism. The literature
on this issue stresses the fragility of existing mechanisms for detecting AI-generated text,
which arises from the difficulty of distinguishing it from human-generated text (Elkhatat
et al., 2023). Not only do AI-generated texts (such as those produced by ChatGPT) often go
unnoticed by AI detection tools (Weber-Wulff et al., 2023), but studies have also found that

these detection tools have too many false positives and false negatives (Dalalah & Dalalah,
2023), which may undermine HE core values such as trust and integrity.
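
To see why false positives matter at scale, consider a simple back-of-the-envelope calculation. The rates below are hypothetical, chosen only for illustration; they are not taken from the studies cited.

```python
# Hypothetical illustration of why detector error rates matter at scale.
# The numbers are invented for the example, not taken from the cited studies.
n_submissions = 500        # essays submitted in a course, all human-written
false_positive_rate = 0.03 # assumed share of human-written texts flagged as AI

falsely_accused = n_submissions * false_positive_rate
print(f"Expected number of students falsely flagged: {falsely_accused:.0f}")
# Even a seemingly low 3% false-positive rate flags about 15 innocent students,
# which is enough to erode the trust and integrity mentioned above.
```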

2.2. Use Cases

2.2.1 Course administration

From the educators’ perspective, GenAI can play a pivotal role in preparing courses
and establishing learning goals. GenAI tools, such as AI chatbots and ITS, offer numerous
advantages that can significantly enhance the educational process. For instance, ITS can
assist educators by measuring students' comprehension, proposing suitable teaching
approaches and techniques, and offering assistance and direction to both students and
educators (Zawacki-Richter et al., 2019).

Well-designed GenAI systems can further aid various aspects of course preparation by
providing a wealth of resources, making learning materials easily accessible, and offering
expert guidance on challenging subjects (Ilieva et al., 2023; U.S. Department of
Education, 2023; UNESCO, 2023a). Additionally, they can assist with routine
administrative tasks associated with university courses (OECD, 2023). For example, if
correctly trained, AI chatbots could efficiently manage common student inquiries,
schedule office hours, and distribute course materials (Ilieva et al., 2023).

By managing these tasks, GenAI tools could allow educators to focus more on critical
aspects of teaching. In fact, according to Ilieva et al. (2023), the most important decisions
related to instructional design, assessment methods, and overall course management
must be retained by the educator. Human supervision guarantees that the course content
remains pertinent and accurate. Educators can use insights from AI tools to make
informed decisions while maintaining their essential role in the educational process
(Felix, 2020).
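
As a simple illustration of the routine-inquiry handling mentioned above, the sketch below shows a minimal keyword-based course assistant. It is a toy example (the questions, answers, and matching rule are all invented) and does not describe any specific product; a real assistant would be backed by an LLM and the course syllabus, but the fallback to the educator reflects the human oversight stressed by Ilieva et al. (2023).

```python
# Toy sketch of a course-administration assistant that answers routine
# student inquiries from a small FAQ; all entries are invented examples.
FAQ = {
    "office hours": "Office hours are listed on the course page; slots can be booked online.",
    "exam date": "The final exam date is announced on the institutional calendar.",
    "slides": "Lecture slides are uploaded to the learning platform after each class.",
}

def answer(question: str) -> str:
    """Return the FAQ entry whose keyword appears in the question, if any."""
    q = question.lower()
    for keyword, reply in FAQ.items():
        if keyword in q:
            return reply
    return "This question is forwarded to the course instructor."

print(answer("When are your office hours?"))
print(answer("Can I get an extension on my thesis?"))  # falls back to the educator
```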

2.2.2 Preparation of classes and study material

In the context of class preparation and study material, GenAI can be an asset. For
example, educators can use it to record classroom sessions and obtain metrics related to
student engagement (U.S. Department of Education, 2023). If students spend a significant
amount of class time talking to each other, the educator can analyse this behaviour to
understand its underlying causes: Is the content too difficult? Is it too boring? By

identifying the reasons, educators can develop targeted approaches to address these
specific issues, enhancing the overall effectiveness of their teaching.

In addition, GenAI tools can be used throughout the class itself. According to OECD
(2023), AI tools can be used to augment cognitive engagement, by supporting educators
in (i) generating multiple representations, (ii) adapting to more meaningful contexts and
(iii) providing numerous opportunities to practice.

Specifically, in (i), the educator can draw on text, graphics, or models to help students
make deeper connections and achieve a thorough understanding of the content. GenAI
can support this by providing diverse explanations, allowing students to evaluate and
contrast different viewpoints, which can enhance their comprehension. In the context of
(ii), educators can create a more adaptable bank of instructional resources, generating
them spontaneously to address the interests of distinct groups of students. This would
ensure that the learning materials are relevant and engaging to all students. Finally, in
(iii), through repetitive practice, students develop fluency and the ability to perform tasks
quickly and effectively by repeatedly engaging in the same computations and processes.
Educators can use GenAI to produce unlimited, customised practice exercises, offering
targeted practice opportunities that address the specific difficulties students might face in
each topic.

2.2.3. Personalization of learning goals and study materials

One of the most significant impacts of GenAI is on instructional design through
personalised learning experiences. According to OECD (2023), educators can use GenAI
to enhance various strategies for eliciting student thinking, providing feedback, and
tailoring instruction to student needs. For instance, GenAI can quickly develop quizzes to
assess understanding, offering detailed insights into student learning and helping with
targeted feedback and interventions. It can also streamline feedback by generating
comprehensive suggestions on assignments, identifying gaps in understanding, and
suggesting improvements.

Additionally, GenAI (through learning analytics) can support differentiated learning
by providing extra help to students who need it and simultaneously enable advanced
learners to delve further into topics through self-directed learning (Kamalov et al., 2023;
Zawacki-Richter et al., 2019; OECD, 2023). Specifically, learning analytics uses real-time
data to evaluate students' progress, understand their learning methods, and track their
self-regulated learning. It assesses performance, adapts the pace of new concepts, and

optimally reintroduces old content to refresh memory (Zawacki-Richter et al., 2019;
OECD, 2023). Beyond supporting specific topics, it can guide students' learning
trajectories, indicating when they are ready to advance.
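
A minimal sketch of the kind of pacing logic described above is shown below; the threshold and data are invented for illustration, and real learning-analytics systems are considerably more sophisticated.

```python
# Illustrative sketch of mastery-based pacing: advance a student when their
# recent accuracy on a topic crosses a threshold, otherwise schedule review.
# Threshold and data are invented for the example.
from statistics import mean

MASTERY_THRESHOLD = 0.8   # assumed share of correct answers needed to advance

def next_step(recent_scores: dict[str, list[int]]) -> dict[str, str]:
    """Map each topic to 'advance' or 'review' based on recent quiz results."""
    plan = {}
    for topic, scores in recent_scores.items():
        plan[topic] = "advance" if mean(scores) >= MASTERY_THRESHOLD else "review"
    return plan

student = {
    "derivatives": [1, 1, 1, 0, 1],   # 80% correct -> ready to advance
    "integrals":   [0, 1, 0, 0, 1],   # 40% correct -> needs review
}
print(next_step(student))  # {'derivatives': 'advance', 'integrals': 'review'}
```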

An example of a tool that was designed to produce a feedback loop between teaching
statistics and assessing students' progress is Stat-Knowlab, as described by de Chiusole et
al. (2020). It adapts each learning experience to individual students' competencies and
creates optimal learning paths. Applications such as this will help educators ensure that
their students engage with educational activities suited to their current level of readiness.

2.2.4. Creation of exams and other forms of evaluation

Kurtz et al. (2024) emphasise the transformative potential of GenAI in the domain of
student assessment. According to the authors, educational institutions still rely on
traditional testing methods focusing on memorization and recall, neglecting the practical
application of knowledge and skills. Moreover, while the call for transformative
approaches in teaching and assessment is not solely a consequence of GenAI's emergence,
the latter underscores the urgent need for these educational reforms.

One approach is to design "AI-proof" assignments that apply best-practice teaching
principles, such as breaking assignments into smaller tasks, designing authentic assignments with
real-world value, incorporating space for student reflection and metacognition, and
creating connections in content to experiences that AI lacks (e.g., recent events, classroom
discussions). For instance, "AI-proof" assignments already put into practice include
generating sample texts or code and asking students to fact-check, critique, and improve them.

Integrating AI into evaluation methods themselves can take multiple forms. According
to Ouyang et al. (2022), various studies predating the launch of LLMs had already shown
the benefits of automated assessment in online higher education. For example, Hooshyar
et al. (2016) created an ITS called Tic-tac-toe Quiz, designed to provide formative assessments
for students' programming and problem-solving skills. Their research found that this
system improved students' interest in learning, fostered positive attitudes, increased
technology acceptance, and enhanced problem-solving capabilities. Similarly, Aluthman
(2016) introduced an automated essay evaluation system that offers immediate feedback,
assessment, and scoring for students in an online English learning environment. The study
concluded that it significantly enhanced the writing performance of undergraduate
students.

GenAI tools can further revolutionise formative assessments by offering advanced
analysis and feedback capabilities (Celik et al., 2022). GenAI can evaluate student-created
graphs or models, provide immediate feedback on complex skills, and manage simpler
grading tasks, allowing educators to focus on more complex evaluations (U.S. Department
of Education, 2023).

2.3. Main Opportunities and Threats

2.3.1. Opportunities

2.3.1.1. More granular assessments of students' level of understanding

Large class sizes can be challenging when it comes to assessing students’ level of
understanding in real time. It is not a question of evaluation, but of keeping and fostering
the engagement of students with the content being taught. As mentioned in Section 2.2.3.,
GenAI may help educators by providing quizzes and diagnostic exercises, which can, in
turn, provide more granular and precise reporting on what each student has learned
or is learning during a class. GenAI's monitoring reach and precision exceed what an
educator can achieve alone (OECD, 2023). In the future, GenAI can thus become
an asset for measuring students' level of understanding at two levels: the individual level
and the class level.

At the individual level, educators can identify students who may be struggling with
specific topics and intervene right away (Celik et al., 2022; Lim et al., 2023). For example,
if a student persistently answers questions incorrectly, the AI tool can
alert the educator, who can then provide additional support or adjust explanations to
address the student's difficulties.

At the class level, GenAI can aggregate data to provide insights into overall class
performance (Celik et al., 2022; U.S. Department of Education, 2023). Educators can use
such insights to adjust their strategies, allocate more time to topics that students find
challenging, and identify trends in student understanding. This could present a great
opportunity to develop group activities and assessments, by pairing students who have
mastered specific topics with others who have not yet reached the same level of
understanding.
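
As an illustration of the class-level aggregation described above, the following sketch flags the question topics a class found most difficult; the data are invented for the example, and a real system would draw on exports from a quiz or learning-management platform.

```python
# Illustrative sketch: aggregate per-question quiz results across a class
# to flag topics that deserve more class time. Data are invented.
from collections import defaultdict

# (student, question_topic, correct?) triples, e.g. exported from a quiz tool
results = [
    ("ana", "supply and demand", True), ("ana", "elasticity", False),
    ("bruno", "supply and demand", True), ("bruno", "elasticity", False),
    ("carla", "supply and demand", False), ("carla", "elasticity", True),
]

totals = defaultdict(lambda: [0, 0])  # topic -> [correct, attempts]
for _student, topic, correct in results:
    totals[topic][0] += int(correct)
    totals[topic][1] += 1

# List topics from lowest to highest success rate and flag the weakest ones.
for topic, (correct, attempts) in sorted(totals.items(), key=lambda t: t[1][0] / t[1][1]):
    rate = correct / attempts
    flag = "  <- revisit in class" if rate < 0.5 else ""
    print(f"{topic}: {rate:.0%} correct{flag}")
```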

2.3.1.2. Teaching materials better suited to students' abilities and educational levels

Students differ not only in their knowledge level, areas of preference or grades: they
also differ in the educational approach under which they learn best. While some require
more direct attention, others work best as self-learners. While some prefer to follow the
standard order of the contents, others do better when the contents are reorganised. This, of
course, is not the cause of all academic failure, but it would be naïve to think that these
differences do not exist, or that a one-size-fits-all approach to teaching, in which all students,
regardless of their capabilities and motivation, must learn each piece of content at the same time,
at the same pace and with the same activities, is best.

Challenging this conception, GenAI can provide permanent adaptation to each student,
according to their skills, motivation, and the educator's educational goals.
The anxiety of being left behind while not understanding, or the boredom of
waiting for the class to move on to the next topic, may soon be a thing of the past.

In addition, GenAI may also support educators' judgement of each student's
development by providing learning analytics. The data gathered in real time on each
student's performance can inform the educator about what the student should study next and which
exercises will help them achieve the expected learning goals. Each time this analysis
is performed, the educator progressively gains more insights and proposals on how to
plan next year's curriculum.

2.3.1.3. Classes and exercises more interesting to students

As noted in European Commission (2022), "AR creates opportunities for teachers to
help students grasp abstract concepts through interaction and experimentation with
virtual materials. This interactive learning environment provides opportunities to
implement hands-on learning approaches that increase engagement and enhance the
learning experience.” This rather dry description points to the benefits of supplementing
academic abstraction with a plethora of real-life examples, to ensure not only that
students learn, but also that they enjoy learning.

GenAI can train and test students in unscripted and highly realistic situations. Take
the case of finance or economics: with GenAI, an educator could simulate a lifelike, real-
time market situation in which students interact with each other and with the GenAI
system in more ways than in a classical classroom game, where all participants have a
limited set of options. In medical studies, GenAI technology could reproduce a real-time
simulation of a human liver (possibly complemented by virtual reality) that reacts with a

high level of accuracy to the medical student’s actions. In both cases, GenAI would provide
an unprecedented level of interactivity.

2.3.1.4. Educational materials for different languages or needs

By instantly translating and creating educational materials in different languages, as
well as adapting them to specific students' needs, GenAI technology allows educators
to reach a more diverse group of students, making education more inclusive and
accessible (OECD, 2023). Educators can quickly provide translated texts, lecture notes,
and other multimedia learning resources, ensuring that students who speak different
languages can access the same high-quality content as their peers (Farrelly & Baker,
2023). Similarly, educators can use GenAI to tailor educational materials to meet specific
needs, such as converting text into audio for students with visual impairments and
simplifying language for younger learners or those with learning disabilities (Pierrès et
al., 2024).

This capability allows educators to make their teaching more effective and responsive
to the individual needs of students. For instance, GenAI can convert complex texts into
simpler versions for students with different comprehension levels, or create multilingual
glossaries to support language learners. Moreover, in the future, it is also expected that
more specific adaptations of educational materials for various disabilities will be created,
covering a broader number of physical, sensory, cognitive, or learning disabilities (Pierrès
et al., 2024).

In the previous Section, we mentioned that a combination with VR could be used to
make challenges and exercises more interesting to students. However, such a combination can also create
immersive and interactive learning experiences that are particularly beneficial for
students with disabilities. For example, virtual field trips can be an excellent resource for
students with mobility impairments. Additionally, augmented reality can be used to
provide visual and auditory enhancements that assist students with sensory processing
disorders.

Finally, these technological advancements may also facilitate more collaborative
learning and teaching across linguistic and cultural barriers by integrating global
perspectives into course curricula (Kurtz et al., 2024). This integration can enrich the
learning experience for a broader range of students in the classroom.

2.3.1.5. Less time spent on administrative tasks

“Schools and teachers can use software to perform many repetitive and time-
consuming tasks such as timetabling, attendance control, and enrolment. Automating
such tasks can allow teachers to spend less time on routine tasks and more time with their
students [or preparing class materials or other educational approaches]” (European
Commission, 2022).

Looking ahead, automating administrative tasks offers significant opportunities to
enhance the educational experience. As AI technology advances, it will manage complex
administrative functions more efficiently and accurately (Dai et al., 2023). This will free
up educators' time, allowing them to focus on essential activities that directly impact
student learning and development (Celik et al., 2022).

The reduction in administrative burdens also creates opportunities for more
personalised learning experiences. As previously mentioned, educators can dedicate more
time to understanding individual student needs and tailoring their instructional methods
accordingly. Additionally, with more time to collaborate, educators can work together to
develop interdisciplinary curricula that integrate diverse perspectives and skills, further
enriching the educational landscape. By reducing the time spent on routine tasks,
educators can focus on what truly matters: fostering a supportive, engaging, and dynamic
learning environment for all students.

2.3.2. Threats

2.3.2.1. Shift in students’ trust: Educators vs. GenAI

As GenAI tools become more integrated into educational environments, it is crucial to
consider their impact on the dynamics between students, educators, and the technology
itself. While these tools offer significant potential to enhance learning, they must be
implemented carefully to avoid weakening the essential role of educators.

The development and implementation of GenAI tools must ensure that they enhance
learning while not inadvertently harming students (Felix, 2020). If these tools are not
used correctly or exacerbate existing issues in education, their use could significantly
undermine trust in educators, both by students and their parents. Over-reliance on AI may
even lead students to prefer AI tools over their educators or peers, especially if the
application seems more competent and provides more immediate feedback than
the educator or their classmates (Felix, 2020).

This potential shift in trust could have several implications. Strong relationships
between educators and students are crucial for effective learning. If students start to rely
more on AI tools, the personal connection that educators build with their students will
weaken, leading to a less engaging and supportive learning environment. Additionally,
over-reliance on AI reduces opportunities for students to develop critical thinking and
interpersonal skills (Abbas et al., 2024). Educators not only impart knowledge but also
foster discussions, debates, and interactions that are essential for holistic development.
AI tools, while efficient, may not be able to replicate the nuanced guidance and
encouragement that a human educator provides.

2.3.2.2. Insufficient technological means to detect plagiarism

Karl Popper observed long ago that “every solution of a problem raises new unsolved
problems” (Popper, 1963). If it is true that AI could solve many current problems in higher
education, it created several new ones, and hard-to-detect plagiarism is certainly one of
them. If AI serves to both detect and avoid detection of AI authorship at the same time
and with equal effectiveness, it will be a threat to higher education. It may even not just
be a threat, but a transformative condition. As stated in the literature, AI tools and
mechanisms to detect AI-generated text are far from fulfilling their mission accordingly,
as there are many ways to avoid detection. Ibrahim et al. (2023) report that it does not
need much to mislead ChatGPT-based tools to detect plagiarism, a few typos or full-stops
can do the trick. The studies mentioned above (Dalalah & Dalalah, 2023; Elkhatat et al.,
2023; Weber-Wulff et al., 2023) strengthen the idea that AI will not be enough to deter
its use for plagiarism. Certainly, HE educators will have to adapt to this new condition.
Until this happens, AI must be considered a threat to academic integrity and the
achievement of the educational goals of HE in learning and research.

2.3.2.3. Ethical challenges in automating student evaluations

The ethical dimension of GenAI use in the evaluation and assessment of students'
learning and achievement is especially important. AI systems' well-known tendency towards
hallucinations and biases justifies requiring the educator to remain "in the loop" for any
AI tool participating in student evaluations. Additionally, GenAI systems may disregard
other human factors that matter in the assessment of a given student, namely the learning
process, the student's effort, and their participation in course work and classes. In other
words, AI grading overlooks one educational output that has long been crucial to HE: the
student's ability to "learn to learn." This does not mean that educators must be forbidden
from making reasonable use of automatic evaluation; rather, they will have to have the
final word regarding the overall evaluation. We return to this issue in Section 5.

2.3.2.4. Job Displacement: the potential of GenAI to automate teaching positions

As mentioned in the Introduction, recent research by Felten et al. (2023) explores how
LLMs such as ChatGPT will impact various occupations, identifying post-secondary
educators, such as those in English and foreign languages and literature, and history,
among the most affected. These teaching roles are among the top 20 occupations exposed
to AI-enabled advances in language modelling capabilities.

Particular attention should therefore be given to the fact that GenAI has the potential to
automate certain teaching positions, particularly those involving repetitive or less
specialised tasks. Maintaining the role of human educators is crucial for the quality
of education: human educators bring empathy, creativity, and a
deep understanding of student needs, qualities that are difficult for AI to replicate (Chan
& Tsi, 2023). Reducing contact with human educators could result in a more transactional
form of education, diminishing the relational aspects that foster student engagement,
critical thinking, and interpersonal skills.

One group that may be at particular risk is teaching assistants (TAs), who play a crucial
role by aiding educators in teaching, grading, and administrative tasks in close contact
with students. TA positions also serve as a significant source of financial aid, as well as
valuable teaching and research experience, for graduate students pursuing advanced
degrees. With GenAI systems' increasing capabilities to manage tasks typically carried out
by TAs, there is a concern that these positions may become scarce. The replacement of
TAs with GenAI has the potential to reduce the number of graduate students who can
support their education through TA roles (Pence, 2019). This shift could have broader
effects on the academic community, potentially shrinking the group of prospective
educators and researchers.

2.3.2.5. Lack of transparency or explainability in AI systems

According to UNESCO (2023), “Artificial Neural Networks are usually black boxes;
that is, […] their inner workings are not open to inspection. As a result, ANNs are not
transparent or explainable, and it is not possible to ascertain how their outputs were

determined.” This lack of transparency makes it challenging to detect biases in GenAI
models. Furthermore, UNESCO (2023) emphasises that “If users do not understand how
a GenAI system arrived at a specific output, they are less likely to be willing to adopt it or
use it ….”

Two concerns reflected in policy documents regarding GenAI for higher education are
transparency and explainability. These concerns are especially important if one considers
the educators’ suggested role of oversight of AI systems used for classes. How can one
oversee a “black box”? And what do educators need to know regarding GenAI to act upon
these systems? For example, one may think of cases where biases are detected in the
middle of a course with GenAI incorporated in its educational approach. How will
educators be able to interact with or correct the GenAI system? AI-related skills will have to enter
the European Framework for the Digital Competence of Educators4 of 2017, which is
described as a "general reference framework to support the development of educator-
specific digital competences in Europe". As with the other threats mentioned above, the
role of educators will have to be rethought, both in terms of desired educational goals and
in terms of educational capabilities.

3. Students: Learning and Autonomy

The impact of AI on education is twofold, affecting both teaching and learning. Here we
explore how AI can enhance students' learning experiences. AI systems, particularly
GenAI, have the potential to revolutionise learning by providing highly original and
tailored responses to user prompts. Text-to-text AI generators like ChatGPT can assist
students with their writing by enabling them to brainstorm ideas, receive feedback, and
improve their drafts (Beck et al., 2023). Similarly, text-to-image AI generators such as
DALL-E and Stable Diffusion are useful for conveying technical and artistic ideas,
especially within the realms of art and design (Chan & Hu, 2023).

Since the release of ChatGPT, students have leveraged the capacity of GenAI to
produce well-written text, often to bypass the effort of writing essays and assignments by
themselves (Beck et al., 2023). While this might save time, it defeats the purpose of the
evaluation exercise, which is to make students practise their writing skills and to assess whether
they have mastered the relevant subject matter. Too often, grading
has focused on the quality of the text rather than the depth of understanding.

4 https://joint-research-centre.ec.europa.eu/digcompedu_en

Nevertheless, students’ use of GenAI capabilities should not be viewed only in negative
terms. Using GenAI to generate ideas, structure and summarise arguments, and produce
better and more imaginative text can be a vital skill set. Thus, the challenge lies in finding
the right trade-off and persuading students that relying solely on AI to complete their
work short-changes their learning experience. Students must understand that while AI
can assist in the learning process, the ultimate goal is to develop their analytical and
critical thinking skills.

3.1. Findings from the Literature

The way students perceive their learning environment, their self-assessed abilities, and
the instructional methods employed are important in shaping their learning strategies
(Biggs, 1999). These perceptions can subsequently affect their educational outcomes
(Chan & Hu, 2023): When students view their learning environment positively – covering
aspects such as curriculum content, instructional methods, assessment techniques,
learning resources, and support services – and feel confident in their abilities, they are
more likely to engage in a deep learning approach. This approach involves striving for
understanding and linking concepts. Conversely, students who perceive their learning
environment negatively or lack confidence in their abilities tend to adopt a surface
learning approach, which is characterised by memorization and meeting basic
requirements (Biggs, 2011).

Understanding the broader implications of GenAI for student learning requires
considering existing research on students' perspectives and experiences with these
technologies. Recent studies have begun to shed light on how students perceive and use
GenAI in the context of higher education. On the one hand, research has shown that
students welcome the use of GenAI in their learning practices and future careers,
while highly valuing its perceived usefulness in providing unique insights and
personalised feedback (Chan & Hu, 2023). The same study reports a positive correlation
between students’ perceived willingness to use GenAI technologies and both knowledge
of GenAI and frequency of use. On the other hand, students are concerned about the lack
of clear policies on GenAI usage, AI bans in their institution, and unequal access to
technology (Johnston et al., 2024).

Applying these initial perceptions and learning approaches to GenAI in higher
education reveals that students' views, concerns, and experiences with GenAI can
significantly impact how they use these tools and, consequently, the extent to which these

tools benefit the learning process. If students perceive GenAI as a valuable resource and
feel confident in their ability to use it, they are likely to engage deeply with the
technology, enhancing their learning experience (Kurtz et al., 2024). However, if they
have concerns about access or unclear usage policies, they may adopt a more superficial
or improper approach to using GenAI. This can limit its potential benefits in their
education and lead to other consequences, such as reduced skill development due to
overreliance (Abbas et al., 2024), academic integrity issues (Sullivan et al., 2023; Cotton
& Cotton, 2023; Tlili et al., 2023; Memarian & Doleck, 2023a), inequality in access
(UNESCO, 2023a), and overdependence on technology.

A recent longitudinal study on AI adoption in higher education conducted by Polyportis
(2024) offers additional insights. The study tracked 222 Dutch students over eight months
and found a significant decline in ChatGPT usage. Key factors influencing this change
were trust (confidence in the technology's reliability, accuracy, and ability to perform),
emotional creepiness (discomfort with human-like AI), and perceived behavioural control
(belief in self-ability to perform tasks using the tool). Trust and perceived control over
the technology positively impacted usage, while emotional creepiness had a negative
effect. The study also underscores the importance of these factors in the sustained
adoption of AI tools in educational settings.

Despite these concerns, the literature shows that GenAI tools can significantly enhance
educational experiences across various domains. For instance, these applications can
potentially deepen students' knowledge and comprehension by providing tailored
educational experiences and helping them develop their writing skills (Beck et al., 2023;
Kaharuddin, 2021).

Moreover, as discussed in the previous Section, these tools adapt to individual learning
styles, making educational progress more personalised and effective. They provide
continuous, real-time feedback, which is crucial for performance improvement.
Furthermore, GenAI promotes active participation in the learning process, improves the
capacity to assess information critically, and encourages inquisitive learning,
strengthening critical thinking skills (OECD, 2023).

3.2. Use Cases

3.2.1. Personalised learning

The individualised approach provided by GenAI ensures that students receive
appropriate support and challenges based on their proficiency (Mousavinasab et al.,

2018), which promotes autonomous learning and enhances the learning process and
outcomes for students.

One notable example of an AI tutor system is Khan Academy's Khanmigo, a
conversational AI tutor powered by GPT-4 and trained on Khan Academy's learning
content. Khanmigo supports real-time, one-on-one tutoring tailored to students' needs. It
assists students by coaching writing, serving as a debate partner, aiding with coding, and
even allowing conversations with historical figures. For students, this means having access
to a tutor who never sleeps and who provides constant support and encouragement.
Additionally, Khanmigo incorporates safeguards to prevent it from giving direct answers,
ensuring students learn through guidance rather than rote memorisation.

Kim and Bennekin (2016), as reported in Crompton & Burke (2023), conducted a
study on Alex, an AI assistant used in a college mathematics course. Alex interacted with
students by asking diagnostic questions and providing support tailored to student needs.
This support was organised into four stages: goal initiation (“Want it”), goal formation
(“Plan for it”), action control (“Do it”), and emotion control (“Finish it”). Alex offered
help based on the specific needs of students in different subject areas, promoting
perseverance in their academic pursuits and enhancing their academic results. This study
highlighted the role of AI in ensuring timely support and adapting to students’ academic
abilities, preferences, and optimal support strategies.

Therefore, these AI technologies enhance the accessibility, involvement, and efficiency
of learning, enabling students to maintain their engagement and motivation (UNESCO,
2023a). The consistent presence of AI mentors ensures that students can access assistance
whenever necessary, without having to wait for office hours or scheduled tutoring
sessions. This instant support aids students in staying on course and promptly addressing
any doubts.

3.2.2. Course assistance

GenAI tools for course assistance can also revolutionise the way students organise
course materials and prepare for their classes. They can organize lecture notes, schedule
study sessions, and provide reminders for assignments and exams (Sajja et al., 2023).
Given that these tools analyse students' learning patterns, they can also offer tailored
recommendations to optimise study habits. For example, platforms like Mindgrasp
provide comprehensive GenAI-driven solutions that curate relevant study materials,
highlight key concepts, and answer challenging questions in real time.

AI course assistants also facilitate communication and information retrieval, allowing
students to ask questions related to the syllabus, such as exam dates, upcoming class
materials, homework assignments, attendance, and grade and course expectations (Sajja
et al., 2023). For instance, tools like Quizlet’s Q-Chat and Course Hero’s AI assistant offer
interactive features that enable students to receive instant, accurate responses to their
queries, enhancing their preparedness and engagement with the course content.

3.2.3. Homework and exam preparation

GenAI-driven course assistance tools are also valuable for homework and exam
preparation (Labadze et al., 2023). These tools can generate summaries of lecture notes,
making it easier for students to quickly review key concepts. Combined with their ability
to highlight parts of a lesson that students did not understand well and to
offer extra materials and explanations, this can help students grasp difficult topics better.
Flashcard creation is another significant feature of GenAI tools. By converting lecture
notes and textbooks into flashcards, these tools facilitate active recall, which is a proven
technique for enhancing memory and learning efficiency (Roediger & Butler, 2011).
Moreover, the interactive and personalised feedback offered by these tools can also be
very useful for homework preparation, helping students improve their practice problems
and/or assignments.
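
The active-recall mechanism behind flashcards can be made concrete with a small Leitner-style scheduler; the sketch below is illustrative only, and the cards, box count and review intervals are invented assumptions rather than a description of any specific tool.

```python
# Illustrative sketch of Leitner-style spaced repetition for AI-generated
# flashcards: correct answers move a card to a less frequent box, mistakes
# send it back to box 1. Cards and review intervals are invented examples.
REVIEW_EVERY_N_DAYS = {1: 1, 2: 3, 3: 7}  # box number -> review interval

cards = [
    {"question": "Define price elasticity of demand.", "box": 1},
    {"question": "State the law of diminishing returns.", "box": 2},
]

def update_box(card: dict, answered_correctly: bool) -> None:
    """Promote the card on success (up to box 3), demote to box 1 on failure."""
    card["box"] = min(card["box"] + 1, 3) if answered_correctly else 1

update_box(cards[0], answered_correctly=True)   # moves to box 2
update_box(cards[1], answered_correctly=False)  # back to box 1

for card in cards:
    print(card["question"], "-> review in", REVIEW_EVERY_N_DAYS[card["box"]], "day(s)")
```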

Regarding exam preparation, students can use GenAI tools to quiz themselves,
reinforcing their knowledge (Choi et al., 2023). In addition, GenAI tools can predict
potential exam questions based on the course content. This feature could help students
focus their study efforts on the most relevant material. For instance, AI algorithms can
analyse past exams and current course materials to generate exam questions, providing
students with a targeted and efficient study guide.

3.2.4. Skills development

GenAI tools can aid in developing various skills essential for academic and professional
success (Labadze et al., 2023). According to the authors, these are (i) enhanced writing
skills, (ii) problem-solving abilities, (iii) group collaboration, and (iv) critical thinking and
analytical skills.

In terms of (i), GenAI tools offer syntactic and grammatical corrections, stylistic
suggestions, and vocabulary enhancements. This helps students produce clearer writing
(Kaharuddin, 2021). For instance, tools like Grammarly provide real-time feedback on

grammar and style, allowing students to learn and improve their writing proficiency over
time.

Regarding (ii), GenAI can enhance problem-solving abilities by providing detailed
solutions to complex problems, which not only helps students arrive at the correct answer
but also enhances their understanding of the underlying concepts and methodologies.
Concerning (iii), the structured discussion frameworks provided by GenAI also support
group discussions and debates, while simultaneously offering real-time feedback to
participants. These tools can suggest discussion topics, outline structured arguments, and
highlight key points, improving the quality and effectiveness of group interactions. GenAI-
driven platforms can support debate preparation and execution, enabling students to
engage in more meaningful and organised discussions.

GenAI tools can also contribute to developing critical thinking and analytical skills
(Kasneci et al., 2023). By analysing large datasets and generating insights, AI can assist
students in conducting thorough research and developing well-supported arguments. This
exposure to advanced data analysis techniques prepares students for future academic and
professional challenges, enhancing their overall intellectual capability.

3.3. Main Opportunities and Threats

3.3.1. Opportunities

3.3.1.1. Advanced study support: tutoring sessions, autonomous exercise creation, and mock oral
examinations

As GenAI continues to evolve, its ability to provide precise and insightful feedback will
improve. By analysing larger and more complex datasets, AI can detect subtle patterns in
student performance, offering highly targeted recommendations for improvement. Areas
that can significantly benefit from these advancements are (i) personalised tutoring
sessions, (ii) autonomous creation of exercises, and (iii) mock oral examinations.

Regarding (i), GenAI-powered tutoring systems already provide real-time,
24/7 support tailored to the student's profile. In the future, these systems could become
even more adaptive, making interactions more seamless and intuitive. Additionally,
enhanced emotional recognition could allow AI tutors to adjust their teaching strategies
based not only on students' performance but also on their emotional states, providing
more empathetic and effective support (Padron-Rivera et al., 2018).

Concerning (ii), it can be expected that, in the future, GenAI tools will autonomously create
even more personalised and varied exercises, integrating multimedia elements to meet
different learning preferences. Advanced data analytics might also allow AI to predict
learning trajectories and proactively adjust study plans to optimise long-term retention
and understanding.

In terms of (iii), GenAI can simulate real exam scenarios, ask questions, evaluate
responses, and offer feedback. Enhanced feedback mechanisms, including detailed
analysis of verbal and non-verbal communication skills, could help students refine their
presentation and public speaking abilities. Furthermore, augmented or virtual reality
tools can allow the creation of immersive, lifelike exam environments, offering an even
more comprehensive preparation experience. These technologies could be especially
useful for preparing presentations and thesis defences.

3.3.1.2. Enhanced self-learning management

GenAI holds substantial promise for the future of learning management by
transforming how students themselves can evaluate, assess, and regulate their learning
processes (Kasneci et al., 2023; Liang et al., 2023). Looking ahead, GenAI will provide
tools that enable students to gain deeper insights into their learning behaviours and
progress, which can foster greater autonomy and effectiveness in their education.

Specifically, advances in GenAI systems will allow more precise and
insightful feedback, helping students to identify their strengths and weaknesses with
higher accuracy. This will empower them to take active steps in their learning path,
adjusting their strategies and efforts to achieve better outcomes. Moreover, students will
be able to gauge their knowledge and their readiness for exams or
assignments more accurately, allowing them to be better prepared and to develop deeper knowledge and
competencies.

Furthermore, AI will be able to predict potential learning challenges and suggest
interventions before issues become significant (UNESCO, 2023a). For example, AI could
alert a student at risk of falling behind or recommend alternative study methods to
enhance comprehension and retention. This proactive support ensures students receive
timely, targeted help, leading to more successful learning outcomes.

3.3.1.3. Enhancing research skills

According to Al-Zahrani (2023), GenAI enables researchers to automate certain tasks,
augment their capabilities, and explore new possibilities in data analysis, knowledge
discovery, and problem-solving, enhancing research productivity, enabling discoveries,
and accelerating scientific progress in various fields. GenAI “can be a tool for the quick
and easy generation of data samples for various types of research, based on patterns and
structures”; such tools “can also be used as an analysis tool” or “as a writing
assistant for research reports.”

A recent example is the fully automated experiment for double-blind online randomised
controlled trials by Cingillioglu et al. (2024). This system autonomously
manages participant interactions and group allocations. It demonstrated AI's efficiency in
data collection, establishing causal relationships, and producing reliable results,
highlighting its potential to improve the efficiency, rigour, and ethics of educational
research.

In this sense, as students become more familiar with these tools and express positive
perceptions about their potential to revolutionise academic research and generate
new insights (Al-Zahrani, 2023), they will use them for more purposes than those described
earlier in this Section. For literature reviews, a tool such as SciSpace, which applies GenAI
to an enormous set of papers published in peer-reviewed journals, is of great
advantage. This will allow both professional and young researchers, such as master’s and
PhD students, to efficiently select relevant literature and process large datasets, identify
patterns, and draw comparisons that would be too time-consuming to perform manually.
Consequently, students can focus more on interpreting results and deriving meaningful
conclusions, enhancing their studies' depth and precision.

Furthermore, GenAI can automate the collection and synthesis of information from
numerous sources for compilatory studies, enabling students to quickly compile
comprehensive literature reviews and meta-analyses, identifying key findings and gaps in
existing research, saving time, and improving the quality of their reviews (UNESCO,
2023). GenAI’s predictive capabilities can guide students toward emerging research
trends, helping them align their studies with future developments in their field (Al-
Zahrani, 2023).

3.3.1.4. Boost to asynchronous education and student self-motivation

The above-mentioned features of GenAI allow for asynchronous learning, avoiding the
one-size-fits-all trap and making learning more engaging and effective. These
technologies present a significant opportunity in education: increasing students’ self-
motivation (Memarian & Doleck, 2023).

Student self-motivation can be further enhanced through gamified learning (Johnston
et al., 2024). By integrating game elements into educational content, students can benefit
from a more engaging and enjoyable learning process. For instance, a student studying
math might complete levels and earn rewards as they master different concepts, while a
history student could embark on virtual quests to explore ancient civilizations. These
gamified elements can make learning more fun and interactive, enhancing student
motivation and retention of information.

In this context, Memarian and Doleck (2023) stated that “such effects [in student self-
motivation] need to be studied in the long term and efforts need to be made to study
student motivation changes based on their learning and demographic backgrounds”.
From a distinct perspective, another opportunity arising from the deployment of GenAI
is its potential to bridge gaps for students worldwide, allowing the democratization of
knowledge, where every student has access regardless of their location or economic
status.

3.3.2. Threats

3.3.2.1. Lack of critical thinking and creativity

In terms of critical thinking, there are conflicting effects of GenAI on students. As
mentioned above, GenAI could promote critical thinking by exposing students to step-by-
step explanations and exercises that challenge their understanding and encourage deeper
analysis. However, tools (such as ChatGPT) “may also lead to the students’ [over-]reliance
on software and result in reduced self-assessment and critical thinking among students”
(Memarian and Doleck, 2023).5

5 We conducted a test with ChatGPT. Initially, we asked in English for a syllabus for a public economics course, and it produced a
well-designed syllabus based on existing and reputable bibliography. Subsequently, we asked the same question in Portuguese,
specifying that we needed Portuguese bibliography. ChatGPT produced a similar syllabus in Portuguese, but all five references were
invented. One of the authors of this report discovered that he had a "new" book attributed to him that he had never written.

In fact, if students become too dependent on GenAI for answers, they will struggle to
develop their own problem-solving and idea-generation skills. The convenience of having an
AI system readily provide information and solve problems could lead to a passive learning
attitude, where students might accept AI-generated solutions without questioning,
especially under time pressure. This dependency can result in a decline in original
thinking and the ability to generate unique ideas independently (Kurtz et al., 2024; Abbas
et al., 2024).

3.3.2.2. Stimulus overload and dependence

We described above how ITSs can provide feedback on learning progress and nudge
students towards better learning outcomes. If these are not implemented with
moderation, students could become quickly overwhelmed by the amount of feedback they
receive in multiple courses simultaneously, possibly even from different systems and with
multiple deadlines to respond to. This can lead to disengagement, abandonment, and, in the
worst case, permanent harm to students’ self-confidence in their own abilities.

Students may also become dependent on the constant stimuli provided in gamified
learning environments, neglecting other important activities not covered by the system6
and becoming incapable of being autonomous when left on their own. Students need to
learn how to set their priorities and allocate their time accordingly, without tutoring
systems telling them what to do and when to do it.

3.3.2.3. Increased individualism and reduced social skills

While GenAI creates opportunities for group discussions and debates, by offering
structured discussion frameworks and real-time feedback (Labadze et al., 2023), it can
also lead to social isolation. As GenAI systems provide tailored content and instant
feedback, students might find themselves spending more time interacting with machines
rather than engaging with peers and educators (Felix, 2020).

This shift could result in reduced opportunities for collaborative learning, which is a
central purpose of attending an HEI and which is vital for developing empathy,
communication, and teamwork skills. The lack of social interaction can hinder students'
ability to work effectively in group settings and diminish their overall social competence
(Marrone et al., 2022).

6 This is a classical topic discussed in the economics of incentive provision.

3.3.2.4. Hallucinations and reinforcement of biases

There are other potential dangers associated with GenAI in this area: GenAI systems
can make errors (“hallucinations”), which are due to their nature as statistical prediction
machines – they are not databases of “facts”.

Furthermore, while AI can reduce bias in decision-making by minimizing human
subjective interpretation of data—leading to more objective and data-driven outcomes—
it can also magnify it (UNESCO, 2023a). The underlying cause is that biases present in
the training data are passed down and perpetuated in the model structure. The lack of
diversity among engineers and researchers developing GenAI products can exacerbate
this issue. The gender gap in the science, technology, engineering, and math (STEM)
fields, including AI research, significantly contributes to this problem. Moreover, GenAI
can magnify societal issues extending beyond gender bias to include racial, social, and
cultural discrimination, which increases its societal impact and perpetuates
unfairness.

Both hallucinations and biases are particularly problematic in the educational context,
where students should rely on accurate information to learn and develop their
understanding. This issue creates a critical need for students to be taught how to double-
check and verify the information provided by GenAI systems. In Chapter 5, mechanisms
and recommendations to address this issue will be provided.

4. Higher Education Institutions: Organizing and Planning

AI has arisen from an information society undergoing a transformation marked by the
expansion of new technologies aimed at digitalisation, global real-time networking and
production, and automation of productive processes (European Commission, 2018). As
these technologies advance, they increasingly find applications beyond traditional
industrial sectors. The development and implementation of tools that enhance time and
task efficiency are becoming increasingly prominent in education systems, particularly
within the higher education sector.

One driving force behind this trend is the goal of democratising higher education and
accommodating the international student market, creating pressures due to the large
number of enrolled students (Novoselova, 2023; Popenici & Kerr, 2017). These drive the
need for efficient tools that help manage large student populations and improve
educational outcomes. Moreover, as the usage of GenAI technologies spreads across the
various levels of education – teaching, learning and even research –, HEIs are obliged to
rethink their role, governance, and management processes.

Hence, incorporating AI into higher education requires addressing technical and
organisational aspects, such as hardware resources, software needs, data management
approaches, staffing and expertise, as well as security and privacy issues (UNESCO,
2023a). The increasing development of AI is likely to affect many administrative roles
within HEIs that typically require significant human and financial resources, such as IT
services, admissions, student assistance, libraries, and marketing and accounting
departments.

4.1. Findings from the Literature

The implementation of GenAI in higher education encompasses a broad spectrum of
applications, including institutional and student administration. At the institutional
administration level, before the breakthrough of GenAI, AI tools were already used in
various capacities, particularly in data governance (Beerkens, 2021; UNESCO, 2023a).
These tools collect, process, and analyse large volumes of data to facilitate decision-
making processes, by providing valuable insights that support effective governance and
strategic planning. Such insights are crucial for transparent and evidence-based decision-
making (De Wit & Broucker, 2017).

More recently, GenAI applications have been developed to further improve
institutional administration, by reducing the existing high volume of administrative tasks
and liberating resources from low-value activities to core operations. For example, GenAI
is being used to execute class scheduling and educator allocation. Additionally, public-
facing tasks such as managing student inquiries and providing campus information are
being streamlined through AI-powered chatbots and virtual assistants (Labadze et al.,
2023; Popenici & Kerr, 2017).

From the student administration perspective, these technologies offer substantial value
by enhancing efficiency and providing comprehensive support throughout students'
academic path, from critical phases such as admissions and financial aid to continuous
assistance in student services and proactive measures aimed at reducing dropout rates
(UNESCO, 2023a). For admissions and financial aid, chatbots can be used to clarify
students’ questions and help them get the information they need in a much faster and
personalised way. Furthermore, AI assistants are also being used to offer ongoing student
support across various modalities: tutoring, organizational help, social and campus
life, and career guidance. Finally, sophisticated data analysis of academic records and
performance generates early warning reports, identifying at-risk students and enabling
timely interventions to prevent dropouts (Ge & Hu, 2020; Crompton & Burke, 2023; Nagy
& Molontay, 2023).

Apart from these applications, GenAI tools in student administration can leverage
additional data to enhance the student experience further. According to Ge and Hu
(2020), universities can gather online consumption data, credit payment information, and
campus card usage to improve administrative decisions. This data can be used to optimise
dormitory assignments and roommate matching, enhancing communication and creating
a positive learning and living environment. Evidently, care should be taken due to the
highly personal nature of this data.

In exploring specific use cases of AI in higher education administration, notable
applications include enhancing student recruitment strategies and implementing
comprehensive support systems for both students and educators.

4.2. Use Cases

4.2.1. AI tools for planning and evaluating degree programs

Over the past few decades, HEIs have increasingly implemented performance
management practices, with more data being collected and more sophisticated indicators
being computed, drawing on big data techniques. Moreover, the amount and type of data
collected have been increasing, and the purposes for which the data is used are constantly
changing, reflecting the shift of priorities over the years (Beerkens, 2021). Specifically,
performance indicators are used at three levels: to reflect on the performance of the
system, on that of individual universities, and on that of sub-units within organisations.

In this sense, GenAI can play a significant role by enhancing the capacity to collect,
analyse, and use data for performance management. GenAI tools can automate the data
collection process, identify patterns and trends, and present the output in a form that is
adequate for its users and their needs. For instance, GenAI can assist in planning and
evaluating degree programs by predicting future trends in student enrolment and industry
needs. Additionally, it can evaluate the effectiveness of current programs by analysing
student performance data, graduation rates, and employment outcomes, so that
institutions can adapt their programs accordingly.
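
As a simple illustration of the aggregation step that such tools automate, the sketch below computes programme-level indicators from a small, invented table of graduate records. The column names and figures are purely hypothetical; a GenAI layer would typically sit on top of aggregates like these to summarise them in natural language for decision-makers.

```python
import pandas as pd

# Hypothetical graduate records; in practice these would come from the
# institution's student information system.
records = pd.DataFrame({
    "programme": ["Economics", "Economics", "Law", "Law", "Law"],
    "graduated_on_time": [True, False, True, True, False],
    "employed_within_12m": [True, True, True, False, True],
    "final_grade": [14.5, 12.0, 16.0, 13.5, 15.0],
})

# Programme-level indicators that could feed the planning and evaluation
# of degree programs (graduation rates, employment outcomes, mean grades).
indicators = records.groupby("programme").agg(
    students=("programme", "size"),
    on_time_graduation_rate=("graduated_on_time", "mean"),
    employment_rate=("employed_within_12m", "mean"),
    mean_final_grade=("final_grade", "mean"),
)
print(indicators.round(2))
```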

Finally, GenAI can assist in drafting degree programs and designing courses, including
setting learning goals and conducting consistency checks. It can also enhance the
complementarity between courses by analysing all existing syllabi and constructing new
ones based on this information.

4.2.2. AI tools for student recruitment

The process of student recruitment includes many steps, two of the most important
being (i) thoroughly answering students’ queries regarding the degree programs and the
university, and (ii) advertising, either through social media platforms or through direct
and personalised communication to prospective students.

In the case of (i), the most common GenAI tools for answering students’ questions are
chatbots (UNESCO, 2023a; Labadze et al., 2023), as they are designed to mimic human
conversation using text to provide information in a conversational manner. In fact,
according to Ilieva et al. (2023), “intelligent chatbots can execute a wide range of business
functions, including sales and marketing, personal assistance, and information retrieval.”
This makes chatbots particularly attractive for universities aiming to enhance their
recruitment processes, as they can provide instant responses to frequent questions, guide
prospective students through application procedures, and even offer personalised
program recommendations based on student interests.
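
To make the retrieval step behind such chatbots concrete, the minimal sketch below matches an incoming question against a small, invented FAQ list using fuzzy string matching from Python's standard library. Production systems would instead combine an LLM with retrieval over the institution's own documents and hand unresolved queries to staff.

```python
from difflib import get_close_matches

# Invented FAQ entries for illustration only.
FAQ = {
    "what are the tuition fees": "Tuition fees are listed on each programme's admissions page.",
    "when is the application deadline": "Applications for the fall intake close on 30 June.",
    "do you offer scholarships": "Merit- and need-based scholarships are available via the financial aid office.",
}

def answer(question: str) -> str:
    # Find the stored question whose wording is closest to the incoming one.
    match = get_close_matches(question.lower().strip("?! "), FAQ.keys(), n=1, cutoff=0.5)
    if match:
        return FAQ[match[0]]
    return "I could not find an answer; a staff member will follow up by e-mail."

print(answer("When is the application deadline?"))
```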

For (ii), advertisement campaigns can be designed and optimised for specific student
groups by using GenAI tools to create social media content and commercial scripts with
higher click-through rates, as well as develop personalised communication (emails and
messages). The data used for those communication purposes can be of various forms:
demographic and geographic data, behavioural data (e.g., website visits, interaction with
previous market content, as well as in social media platforms), previous communications
(e.g., email inquiries and chat conversations), survey responses, and interactions from
social media platforms.

This approach not only reduces marketing costs but also increases engagement. For
example, Element451 uses AI to enhance student engagement through personalised
communication, analysing data to identify prospective students and tailor recruitment
messages. Additionally, other tools facilitate peer-to-peer engagement, allowing potential
applicants to ask questions and gain insights into the student experience.

An innovative tool that potentially enhances recruitment processes is the E360 Tailor’s
University Virtual Campus Tour. By integrating GenAI, it analyses campus data, prioritises
elements, selects relevant locations, and crafts tour content accordingly. This creates a
multi-stop campus tour tailored to the candidate’s interests and needs, providing an
engaging virtual experience for each prospective student.

With time, similar innovations are expected to transform student recruitment even
further, offering increasingly personalised and immersive experiences tailored to the
specific interests and needs of prospective students, and making the process more
engaging and effective.

4.2.3. AI tools for student accompaniment

Even before the breakthrough in GenAI, universities were already evolving their
processes to provide more instant and intuitive assistance to students. In 2016,
researchers at the Georgia Institute of Technology developed Jill Watson, an AI-powered
virtual teaching assistant built on IBM’s Watson platform and designed to support
individual courses by answering student questions like a human teaching assistant;
ChatGPT was incorporated in 2022. Jill Watson answers basic student course questions and
helps students with course content, such as explaining topics and assisting with
assignments.

Nonetheless, as previously mentioned, student support extends beyond AI tutoring and
course assistance, encompassing all aspects of student life. Universities are implementing
AI-driven solutions to enhance various facets of the student experience (Pence, 2019), to
ensure that students receive timely and personalised assistance, helping them navigate
both academic and non-academic challenges effectively.

For instance, the University of Michigan developed two AI tools: a general AI assistant
(U-M GPT) and a personalised AI tool (U-M Maizey). The first consists of a versatile
assistant available to all campus members, capable of answering any academic or
university-specific questions, summarizing general information, and producing written
work. Users can choose between AI models like GPT-3.5, GPT-4, and Llama 2, and
students can access it at no cost, generating up to 25 prompts per hour without using
user-specific data for training to protect privacy. U-M Maizey, on the other hand, connects
to personal accounts on Google and Canvas to provide customised answers and insights.
It allows students to upload unstructured texts, adjust the AI tool's "temperature" to
control the variability of its output, and use or share their customised tools. While U-M Maizey was free
through 2023, it required a monthly fee afterward.
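
For readers unfamiliar with the "temperature" setting mentioned above: in most language models it is a scaling factor applied to the model's raw scores (logits) before the next token is sampled. The sketch below uses made-up scores for three candidate tokens to show how a low temperature concentrates probability on the most likely token, while a high temperature flattens the distribution and makes the output more varied.

```python
import math

def softmax_with_temperature(scores, temperature):
    # Divide the raw scores by the temperature, then normalise (numerically stable softmax).
    scaled = [s / temperature for s in scores]
    exps = [math.exp(s - max(scaled)) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # made-up scores for three candidate tokens
for t in (0.2, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    print(f"temperature={t}:", [round(p, 3) for p in probs])
```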

It is expected that AI student assistant tools will move from chatbots to omnipresent
tools that provide multimodal support, available anytime and on any kind of device. These
advanced tools may even incorporate voice functions to make student interaction easier
and more intuitive.

4.2.4. AI tools for assisting educators

Generative AI can reduce educators’ workloads by automating tasks such as lesson
planning, grading, feedback provision, and routine administrative duties (OECD, 2023).
This automation manages time-consuming activities, allowing educators to focus on
personalised teaching, facilitate small group activities, and engage more meaningfully
with students. As a result, educators can evolve into mentors and inspirational leaders
within their educational settings (Chan & Tsi, 2023). Additionally, these time savings
enable educators to engage in other significant activities, such as securing research grants
and publishing in prestigious international journals (Felix, 2020).

Various technological solutions have already been developed to assist educators. For
instance, researchers from the University of Surrey developed KEATH.AI, an AI system
providing rapid and accurate essay assessments with an 80% baseline accuracy. It allows
educators to modify scores, customise grading rubrics, and generate detailed analysis
reports on student performance. Similarly, GRAIDE, created by PhD students at the
University of Birmingham, enhances grading by learning an assessor's style. It accepts
written and digital submissions, reducing grading times by 87% and offering seven times
more feedback.

4.2.5. AI tools for the back office: profiling, prediction, and dropout reduction

The implementation of GenAI technologies in HEIs presents a prominent opportunity
for profiling and prediction, facilitating the timely identification of students at risk of
dropping out by flagging student learning status or performance before it is too late (Nagy
& Molontay, 2023; Ouyang et al., 2022). Specifically, predictive GenAI tools can be used
in various steps of the student lifecycle: prior to admission, and then throughout the
learning process (Pierrès et al., 2024).

According to Pierrès et al. (2024), the initial stage involves predicting student
performance before admission based on their CVs, while the subsequent stage focuses on
identifying students at risk of dropping out from courses, study programs, or the
institution, as well as determining their learning profiles. These types of analysis involve
gathering information, such as knowledge levels (or knowledge gaps), skills and strengths
(Nagy & Molontay, 2023), learning styles or preferences, interests (Ouyang et al., 2022),
grades, and class attendance.

This information can then be used to create metrics and real-time dashboards, as well
as to provide feedback and guidance on content-related issues throughout the learning
process, enhancing student retention and success (Zawacki-Richter et al., 2019).
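
A minimal sketch of this predictive step is given below, assuming synthetic data and only two features (attendance rate and average grade). Institutional models would draw on much richer data and would require bias audits and human review before any intervention is taken.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic historical records: attendance rate, average grade (0-20 scale),
# and whether the student eventually dropped out (1) or not (0).
n = 500
attendance = rng.uniform(0.2, 1.0, n)
grades = rng.uniform(8.0, 20.0, n)
dropout = ((attendance < 0.5) & (grades < 12.0)).astype(int)

X = np.column_stack([attendance, grades])
model = LogisticRegression().fit(X, dropout)

# Score current students and flag those above an (arbitrary) risk threshold
# so that support services can reach out early.
current_students = np.array([[0.35, 10.5], [0.90, 16.0]])
risk = model.predict_proba(current_students)[:, 1]
for features, r in zip(current_students, risk):
    status = "at risk" if r > 0.5 else "on track"
    print(features, f"dropout risk={r:.2f} ({status})")
```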

4.3. Main Opportunities and Threats

4.3.1. Opportunities

4.3.1.1. Automation of administrative tasks

An opportunity for GenAI tools at the institutional administration level lies in
optimising the creation of class schedules and allocating educators to class slots. These
are among the most time-consuming and complex tasks within HEIs and still tend to be
manual, involving many steps and relying on various forms of software and administrative
coordination (UNESCO, 2023a).

Typically, departments submit course offerings, prerequisites, and educator
availability, while information on classroom resources is gathered. Timetable planning
involves manually inserting this data into spreadsheets or scheduling software, aligning
course offerings with educator availability and resolving conflicts such as double-booked
rooms and overlapping classes. These processes heavily rely on tools like Microsoft Excel
and specialised software, requiring substantial manual oversight.

Currently, HEIs can already benefit from AI platforms that analyse data sets (including
educator availability, course requirements, student enrolments, and classroom resources)
to minimise conflicts and optimise resource use. They dynamically adjust to real-time
changes, providing flexibility and responsiveness that traditional methods lack.
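
At their core, such platforms perform conflict avoidance over shared resources. The toy sketch below, using invented courses, educators, rooms, and time slots, assigns each course to the first slot in which neither its educator nor its room is already booked; real systems rely on constraint solvers or integer programming rather than a single greedy pass, but the underlying feasibility check is the same.

```python
# Hypothetical course requests: (course, educator, room).
requests = [
    ("Microeconomics", "Prof. A", "Room 101"),
    ("Econometrics",   "Prof. A", "Room 102"),
    ("Public Finance", "Prof. B", "Room 101"),
    ("Statistics",     "Prof. C", "Room 102"),
]
slots = ["Mon 9h", "Mon 11h", "Tue 9h"]

booked = {slot: {"educators": set(), "rooms": set()} for slot in slots}
timetable = {}

for course, educator, room in requests:
    for slot in slots:
        taken = booked[slot]
        # A slot is feasible only if both the educator and the room are free in it.
        if educator not in taken["educators"] and room not in taken["rooms"]:
            taken["educators"].add(educator)
            taken["rooms"].add(room)
            timetable[course] = slot
            break
    else:
        timetable[course] = "unscheduled"  # no conflict-free slot found

for course, slot in timetable.items():
    print(f"{course}: {slot}")
```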

Still, GenAI can further enhance these benefits, as future advancements in such tools
could focus on developing more intuitive and integrated platforms that seamlessly
combine various administrative functions. Additional enhancements could include
advanced predictive analytics to anticipate scheduling conflicts before they arise and
machine learning algorithms that continually improve scheduling efficiency based on
historical data. Moreover, incorporating user-friendly interfaces and real-time
collaboration tools can facilitate better communication and coordination among
departments, reducing the manual workload and enhancing overall institutional
productivity.

4.3.1.2. Breaking the walls of culture and disability in HEI

Another significant opportunity arising from the deployment of GenAI is the potential
to break down remaining cultural, ethnic and disability barriers, which would foster a
more inclusive and accessible learning and living environment. According to Pence
(2019), AI agents will soon start to participate in collaboration and team building,
resulting in more ethnically and culturally diverse teams.

Moreover, as discussed in Section 3, GenAI can offer real-time translation and
culturally adaptive learning materials, facilitating communication and understanding
among students from diverse backgrounds. It can also create culturally responsive and
inclusive content that promotes representation and diversity and fosters a sense of
belonging (OECD, 2023).

On the disabilities front, Pierrès et al. (2024) mention that HE should be flexible and
tailored to the diverse needs of students. AI may help provide the necessary
personalization and adaptability, enabling individuals with disabilities to pursue higher
education. If students with disabilities are considered not just in the development and
adoption of AI educational technologies, but also included in the discussions for the
conception and implementation of these tools, they may benefit from an adapted pace of
learning and flexible course schedules (García-González et al., 2020). Furthermore, AI-
driven accessibility tools can provide support to students with disabilities by guaranteeing
equitable access to educational materials (OECD, 2023).

4.3.2. Threats

4.3.2.1. Plagiarism and intellectual property rights

In Section 2, we discussed the threats of plagiarism from the perspective of educators.
However, it is equally important to address this issue from the standpoint of higher
education institutions, because plagiarism is a problem for the whole academic
community. To understand why GenAI poses additional problems, first in defining
plagiarism and then in detecting it, let us quote Harvard University's “Code of Honor” as
a representative example:

“Members of the Harvard College community commit themselves to producing
academic work of integrity – that is, work that adheres to the scholarly and intellectual
standards of accurate attribution of sources, appropriate collection and use of data, and
transparent acknowledgement of the contribution of others to their ideas, discoveries,
interpretations, and conclusions. Cheating on exams or problem sets, plagiarizing or
misrepresenting the ideas or language of someone else as one’s own, falsifying data, or
any other instance of academic dishonesty violates the standards of our community, as
well as the standards of the wider world of learning and affairs.”7

This Code of Honor fits well with pre-GenAI academia, where intellectual property
rights were clearly defined, but seems out of step with the post-GenAI environment.
Sources cannot be attributed accurately if these property rights are not properly defined.
However, there are no property rights over the output of large language models or image
generators – neither the owners of the training data nor the GenAI user. The term
“plagiarism” is no longer adequate to capture the use of the outputs of these GenAI
applications.8 But it is still relevant that, for the sake of academic integrity, “there should
be an acknowledgement of the contribution of others” even though the “others” may be,
in certain cases, GenAI applications.

The increasing use of LLMs and other GenAI tools in subjects that rely heavily on
written outputs, such as essays, presents unique challenges. HEIs are currently in an
experimental phase, exploring various strategies to effectively manage and integrate these
tools into their academic processes. According to UNESCO (2023a), some of the strategies
so far implemented consist of (i) banning ChatGPT in assessments or entirely; (ii) using
software to detect AI-generated text; (iii) shifting to oral, handwritten, or invigilated
exams; (iv) employing assessments that AI struggles to produce, such as podcasts, lab
activities, and group work; (v) establishing policies and guidelines for the ethical and
transparent use of AI in teaching, learning, and research (for example, allowing a
considered use of ChatGPT); and (vi) creating new assessment forms using ChatGPT
explicitly.

7 From https://oaisc.fas.harvard.edu/honor-code/, accessed in August 2024.

8 In the absence of a better word, we continue using it, meaning that the agent (teacher, researcher, or student) copies an output
generated by AI, without making any significant changes and without mentioning that it was produced by a given application in a
specific part of the text. The difficulty in defining “plagiarism” in this context shows why the concept is no longer appropriate.

However, these measures do not seem to be sufficient, nor widely implemented. For
instance, regarding (v), some universities have set explicit guidelines for AI use in
classrooms (e.g., University of Southern California, Nova School of Business and
Economics) and others have defined expectations about AI usage at both syllabus and
individual levels (e.g., Ohio State University), but in a general manner students and
educators still ask for more clarity on the rules. Malmström et al. (2023) found that
although Swedish students are familiar with and have positive perceptions of using AI
tools like ChatGPT and other AI language tools in education, most were unaware of any
institutional guidelines for responsible AI use.

The issue becomes increasingly problematic as AI evolves rapidly, potentially
outsmarting current educational practices and making tools such as software to detect AI-
generated text less effective. According to Memarian and Doleck (2023), technologies
such as LLMs introduce new forms of plagiarism, which require more advanced detection
methods, creating a vicious circle of needing ever-greater computational power and
advanced hardware to combat academic misconduct. This indicates that relying solely on
detection software is not a sustainable solution to safeguard academic integrity.
Implementing effective policies to address this issue is essential and will be discussed in
detail in Sections 5 and 6.

4.3.2.2. The reinforcement of biases and inequality

The potential of GenAI to transmit human biases can be controversial. As previously
mentioned, GenAI can play a crucial role in establishing a learning environment within
HEIs that is more inclusive and accessible. It has the potential to bridge gaps,
accommodate diverse learning needs, and foster a supportive atmosphere. However,
GenAI can also perpetuate existing human biases, potentially exacerbating inequalities
within HEIs and society at large. If not carefully managed, these tools could reinforce
stereotypes and unfair practices, leading to greater disparity.

Biases can be present in the various steps of student recruitment: targeting,
admissions, and scholarship decisions. For instance, according to UNESCO (2023a), AI
can tailor fees to financial capacity, but the databases powering AI often contain biases
from historical, algorithmic, sampling, and human sources. This can lead to biased
admissions, negatively impacting students' futures (Berendt et al., 2020). Additionally,
evidence suggests that these algorithms tend to reduce scholarship funding (Jaschik,
2021) and do not account for unexpected costs, affecting students' ability to stay in school.
Consequently, AI using data from a majority group can inadvertently exclude minority
groups, perpetuating exclusion and lack of diversity.

Moreover, inequalities may also be present at the learning level. The content created
by GenAI tools may not follow established pedagogical methods. Take, for instance, the
case of students with impairments: European Commission (2022) questions whether
GenAI systems provide appropriate interaction modes for learners with disabilities or
special education needs. This includes addressing whether the AI system is duly designed
to treat learners respectfully and adapts to their individual needs.

Both educators and HEIs have a role in this matter, by assuring human agency and
oversight, when implementing GenAI. Particularly, HEIs must implement monitoring
systems to follow up on and eliminate these types of occurrences.

4.3.2.3. Data leaks and privacy violations

A fundamental requirement for GenAI functionality is a substantial amount of data,
which underscores the critical importance of data security and privacy, especially as the
volume of data increases exponentially, making it a prime target for cyber threats (Kurtz
et al., 2024), such as ransomware. Chatbots with access to private information, such as
those meant to assist students, can often be provoked into leaking this information even
though they were designed not to do so.

These vulnerabilities pose significant challenges in academic settings (George, 2023),
especially since IT personnel may not be trained to deal with cyber threats on these
systems, or not enough personnel may be available.

4.3.2.4. Environmental and economic sustainability

Analysing the implementation of GenAI in HEIs raises sustainability concerns from two
angles: the environmental and the economic standpoints. Regarding the environmental
perspective, it is expected that academic and research use of AI will drive energy
consumption levels. In fact, according to Ray (2023), “the large size and complexity of
ChatGPT models require significant computing resources, which can have negative
environmental impacts. Improving the energy efficiency of ChatGPT models is an
important challenge that needs to be addressed.” In this sense, as AI and machine learning
models become more sophisticated, the demand for computational power and energy
increases, leading to higher carbon footprints.

From an economic perspective, HEIs will have to invest heavily in AI technologies to
attract students, faculty, and funding, leading to a rapid escalation in spending on AI
research and infrastructure. However, developing, implementing, and maintaining
advanced AI systems is expensive. As stated above, significant financial resources are
required for computational power, data storage, skilled personnel, and ongoing research
and development. The direct consequence is that such high costs could strain the budgets
of HEIs, especially those with limited financial resources (UNESCO, 2023a). Additionally,
given the high costs associated with AI technologies, only major, well-funded
organizations will have the financial means to participate in the AI competition to start
with. This could lead to the concentration of AI knowledge and resources within a small
number of institutions, widening the gap between these and smaller or less prosperous
universities. Consequently, this may lead to an even more tilted playing field in higher
education, with leading institutions drawing an ever-higher number of students, research
opportunities, and financial backing.

5. How to Minimise Risks While Tapping the Potential of GenAI

This chapter discusses how the risks of GenAI in higher education can be managed
while harnessing its potential. The preceding chapters highlighted the risks associated
with implementing GenAI for educators, students, and higher education institutions.
Here, we propose measures and programs to address and mitigate each of these identified
risks, ensuring a safer and more effective integration of GenAI in educational
environments.

5.1. Educators

As GenAI technologies become part of HE systems, educators are not just expected to
adapt to these technologies, but crucially must be empowered to do so. This
empowerment should allow them to focus more on guiding new learning approaches and
on monitoring and detecting biases, ensuring that learning methods are effective and free from
unfair biases that might affect students' outcomes. With this shift, educators’ priorities
will change, with more time spent planning how to teach effectively rather than creating
the actual teaching materials (Celik et al., 2022). The burden of administrative tasks will
also be reduced – allowing educators to concentrate on the more critical aspects of
teaching –, and the evaluation methods will evolve towards more practical approaches,
instead of relying mostly on traditional written assignments (Abbas et al., 2024).

However, these opportunities come along with other kinds of issues. It is thus crucial that
educators are prepared to confront and surmount them.

5.1.1. Risks “Shift in students’ trust: Educators vs. GenAI” and “Job Displacement: the potential
of GenAI to automate teaching positions”

To prevent the weakening of the role of educators caused by continuously
implementing GenAI tools, investment must be made in their professional development
(UNESCO, 2023a; Lee et al., 2024). This includes training sessions provided by HEIs
covering tool functionalities, how and when to use those functionalities, their advantages
and disadvantages, student guidance, and the policy guides for their usage.

Additionally, educators must retain student engagement. In this sense, finding ways to
help educators boost student engagement levels beyond what GenAI tools can achieve is
crucial (U.S. Department of Education, 2023). This means providing educators with the
freedom to bring themselves and their personal experience to the lecture room even in
the presence of AI support. It can also mean providing educators with tools that allow
them to better track their teaching performance by delivering metrics on student
engagement and behaviours during class (U.S. Department of Education, 2023). This
information will then help educators adapt their teaching methods.

Safeguarding the unique and irreplaceable value of human educators is crucial. To
achieve this, rote lecturing can be replaced by interactive teaching methods such as
collaborative learning and whole-class discussions (OECD, 2023). In addition, it is
important for educators to make use of their positions to provide individualised support
to students. This may involve guidance beyond the strict context of classes, giving
recommendations regarding career paths and academic pursuits, guiding students around
their personal obstacles, and assisting them in establishing and attaining their personal
objectives. By concentrating on the overall growth of students, educators can tackle
emotional, social, and ethical aspects of learning that AI tools cannot understand.

Finally, it is crucial that educators participate in developing and implementing these
tools in the context of their institution. This cooperation will help design user-friendly
tools, making it easier for educators to engage with and comprehend these systems.
Moreover, educators should have channels to provide ongoing feedback on the
performance and transparency of AI systems. This feedback should be used to improve
and adapt AI tools continuously.

To prevent a decline in the number of graduate students financing their advanced
degrees through TA roles, HEIs need to reconsider the role of TAs in GenAI-based
teaching, to determine how their role can evolve alongside the technology. This involves
identifying which new essential tasks have arisen. It is possible that TA roles will require
more human interaction, such as mentoring students and facilitating group discussions.
TAs might also help educators conduct in-depth analyses of students' performances and
behaviours. However, if TA roles diminish, it is crucial to establish alternative financial
support methods for graduate students, such as research grants and fully funded PhD
fellowships.

5.1.2. Risk “Insufficient technological ability to detect plagiarism”

In Section 2.3.2.2. we pointed out that AI detection tools are ineffective in identifying
plagiarism due to their lack of accuracy. There are various approaches to address this
issue. First, it is crucial for educators to modify their assessment techniques (UNESCO,
2023a) in a way that fosters personalization and interaction, such as oral exams,
presentations, and in-class writing tasks; and which enables continuous evaluation,
allowing the assessment of students’ ongoing progress through projects, portfolios, and
regular quizzes (OECD, 2023). Educators can also promote original work (by designing
assignments that require original thought, critical analysis, and personal reflection), and
employ project-based learning as well as problem-solving tasks that require original
solutions and creative thinking (OECD, 2023).

In addition, it is crucial to offer educators training that assists them in identifying and
dealing effectively with AI plagiarism and instructs them in developing assessments
that reduce the likelihood of plagiarism (European Commission, 2024). For instance,
Cotton and Cotton (2023) provide suggestions to help detect plagiarism, mentioning
strategies such as identifying patterns or irregularities in the language, checking for
sources and citations, checking for originality, and checking grammar and spelling.

Finally, educators can contribute to promoting ethical AI use by incorporating
conversations about the fragilities of AI models and their ethical use into their lectures.
This can help students comprehend the potential consequences of AI-generated
plagiarism and the negative effects on their own learning resulting from excessive use of
such tools. They can also encourage students to use AI tools responsibly and openly,
such as employing them as aids in idea generation instead of replacements for creating
original work.

5.1.3. Risk “Ethical challenges in automating student evaluations"

The ethical implications of automatic categorisation of individuals and the impact of
such categorization on the individual’s well-being, status, or dignity, have been the subject
of scrutiny in recent times. Article 22(1) of the GDPR states that any “data subject shall
have the right not to be subject to a decision based solely on automated processing,9 (…)
which produces legal effects concerning him or her (…)”. More recently, the EU Artificial
Intelligence Act (AI Act) of 2024 has added human oversight as a measure to combat AI
violations of fundamental rights (Article 14) and has established transparency
requirements by requiring AI systems to disclose that AI generates their output.10

Educators using AI in their grading tasks will still have to oversee and participate in
the assessment of each student. They should also be transparent about whether they are
using AI in their grading, which is fair since that same transparency is required from
students. However, there may be reasonable exceptions, such as in the case of short-
answer questions. Similar to traditional multiple-choice questions, a quick assessment of
answers may be amenable to AI grading without posing significant ethical concerns. This
will not be the case for other types of assignments, specifically those that are designed for
longer open answers.

Still, an extra challenge needs to be addressed in the context of evaluations under more
personalized teaching. If GenAI allows educators to adapt their educational approaches
to different students, should different educational methods be combined with a common
grading scale? In other words, the trade-off between rewarding the absolute level of skills
(“what does the student know?”) and learning progress (“how much have they learned?”)
may have to change. In this scenario, it may become necessary for educators to prioritise
evaluating fundamental skills, while guaranteeing fairness and consistency in
assessments. Combining formative assessments and performance-based assessments
(such as projects and presentations) can also be beneficial, as the former are tailored to
individual learning paths and contribute to a comprehensive understanding of student
progress, while the latter allows students to apply their knowledge and skills in authentic
contexts, providing a more accurate measure of their abilities. The overall learning
achievement of the individual may be more important for school or undergrad
environments, as it demonstrates long-term progress and proficiency, whereas the level
of knowledge and ability is more relevant for the final years of university, where the
emphasis is frequently on the immediate use of knowledge and skills. Regardless, it is
important to communicate the evaluation criteria, methods, and expectations clearly and
transparently to students so that they understand how they will be evaluated.

9 https://eur-lex.europa.eu/eli/reg/2016/679/oj
10 https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai

5.1.4. Risk “Lack of transparency or explainability in AI systems”

Educators should also receive AI literacy training to better deal with the lack of
transparency and explainability of most AI models (Lee et al., 2024; UNESCO, 2023a),
covering areas such as the operation of AI models, common sources of bias, and methods
for interpreting AI outputs. These training sessions should also equip educators with the
skills to identify and report biases, ensuring they can effectively intervene when biases
are detected during the course.

Moreover, and to address this issue effectively, it is crucial to implement transparent
AI systems specifically designed for clarity and explainability (UNESCO, 2023a). This
entails selecting AI solutions that provide clear visibility into the decision-making process
and the production of outcomes. In addition, it is necessary to establish and uphold
policies that advocate for ethical and responsible AI utilization in education – which must
define expectations for transparency, accountability, and educator participation in AI
supervision, and comply with international standards and guidelines for ethical AI usage,
to ensure alignment with best practices (UNESCO, 2023a).

5.2. Students

Incorporating GenAI technologies into higher education will lead to various shifts in
students' roles and expectations. Assessments will need to focus more on students'
application of knowledge rather than just memorization, with a higher priority on critical
thinking, problem-solving, and analytical skills (OECD, 2023). Additionally, students will
have to develop strong abilities in organizing and evaluating information, including
collecting, arranging, and carefully assessing its significance, accuracy, and dependability.

5.2.1. Risk “Lack of critical thinking and creativity”

The training of students in using GenAI is as important as training educators.
Educating students to develop AI competencies, including GenAI-related skills, should
encompass both human and technological aspects of AI (UNESCO, 2023). Specifically, it
is crucial to integrate AI literacy into the curriculum to ensure that all students,
irrespective of their field of study, acquire a foundational understanding of AI. This
approach will prepare students for the diverse impacts of AI in various professions and
industries (Pence, 2019), and it should be accompanied by providing fundamental
knowledge about the functioning of AI systems, including GenAI (Farrelly & Baker, 2023).
Understanding the essential principles of machine learning, data management, and
algorithm-based decision-making is crucial for understanding the capabilities and
constraints of these technologies.

In addition, it is important to emphasise the ethical implications and biases in AI
systems (Farrelly & Baker, 2023). Students need to understand how data biases can
influence AI outcomes and the implications this has for fairness and equality. This
understanding will enable them to critically evaluate the information generated by AI
tools and recognise the importance of using these tools thoughtfully.

Finally, to prevent students from becoming dependent on GenAI tools, these tools must
not offer immediate answers to students. Instead, these applications should offer hints or
different actions that students can take to solve exercises or find information, allowing
students to make the final decision. Furthermore, providing step-by-step feedback should
allow students to comprehend their mistakes and improve, instead of instantly
pinpointing what they are missing. This approach will help students understand how to
enhance their learning.

5.2.2 Risk “Stimulus overload and dependence”

Gamified environments, where students constantly receive notifications and nudges,
seem like a great idea to keep students involved with the topic they are studying. But if
they are subject to too many of those stimuli they may simply desist and give up, or
become insensitive, or become convinced that they personally are not good enough to
keep up.

Therefore, gamified environments should be used with caution. Within each course,
the number of nudges should be limited. Moreover, there needs to be coordination
between the different courses in degree programs to avoid a tug-of-war between courses
that leaves students exasperated. (Similar coordination is already common with respect to
out-of-class activities such as homework and group projects.)

The second issue with gamification is that students may become dependent on nudges
to decide on how to allocate their time, crowding out their own initiative and ability to
take decisions by themselves. The first step to avoid this issue is again to refrain from
exaggerating the number and frequency of nudges. HEIs should also make sure that their
students learn time-management skills and become autonomous in organizing their
workload.

5.2.3. Risk “Increased individualism and reduced social skills”

To tackle this problem, it is crucial to foster collaborative learning environments
(OECD, 2023) by designing courses that emphasise teamwork and group projects. The
promotion of interaction and teamwork among students can be achieved through
collaborative assignments that promote communication and collaboration among
students, aiding in developing their social skills.

Other initiatives may include whole-class discussions, focusing on exchanging and
comparing viewpoints, probing and expanding responses, and engaging in evidence-
based arguments and discussions (OECD, 2023). In addition, it is important to facilitate
face-to-face or synchronous virtual interactions, such as live discussions, debates, and
Q&A sessions, to uphold a sense of community and ensure that students have many
opportunities for social interactions.

5.2.4. Risk “Hallucinations and reinforcement of biases”

Uncritical use of GenAI output leads to the promotion of inherent biases and
information made up by the GenAI system (hallucinations). It is important to educate
students about how to recognise and handle both. This includes providing them with knowledge about
how biases arise and the ability to critically analyse AI-generated content (Farrelly &
Baker, 2023). For instance, this can be achieved by including AI ethics, data science, and
critical thinking courses in the curriculum. Students must also learn how to assess and
verify AI-generated results. Additional measures to avoid and combat biases are provided
in Section 5.3.2.

5.3. Higher Education Institutions

The increased development and deployment of GenAI technologies in higher education
will require HEIs to continuously monitor AI advancements and stay informed about their
impacts and applications in education. This involves keeping track of enhancements in AI
algorithms, software, and hardware and their incorporation into educational resources to
ensure the efficient utilization of advanced AI technologies and to detect potential risks
and constraints (UNESCO, 2023a).

Additionally, HEIs need to work together to synchronise the implementation of GenAI
technologies. Through collaboration, they can create consistent policies, exchange
successful methods, and align their approaches, which will contribute to the development
of common ethical standards, promote fair AI usage, and tackle issues such as bias, data
privacy, and security. Equally important is to guarantee that AI policies are uniformly
implemented by conducting regular audits and evaluations to uncover biases and ethical
concerns, which, in turn, allows for the timely resolution of these issues (U.S. Department
of Education, 2023).

5.3.1. Risk “Plagiarism and intellectual property rights”

Plagiarism needs to be addressed by higher education institutions from a more
comprehensive standpoint, looking beyond the perspectives of students and educators.
Addressing the challenges associated with plagiarism effectively requires a comprehensive
intervention acting on multiple fronts. Specifically, HEIs need to establish comprehensive
guidelines that delineate the ethical and conscientious usage of AI tools in education,
academic pursuits, and scholarly investigations (UNESCO, 2023a). These directives
should elucidate the conditions under which GenAI may be used and how this should be
disclosed and supervised. Moreover, it is crucial to ensure accessibility, which means
ensuring that faculty and students can easily reach these guidelines via the university's
website, course outlines, and orientation materials (Cotton & Cotton, 2023). In this
context, it is equally important to evaluate the efficiency of the adopted policies and
strategies by gathering input from both faculty and students, as well as reviewing data
related to plagiarism incidents (European Commission, 2024). This will allow for
necessary adjustments to the policies to tackle any new challenges that may arise.

5.3.2. Risk “The reinforcement of biases and inequality”

To diminish the risk of accentuating biases and inequality in the context of student
recruitment, HEIs need to (i) establish strong monitoring mechanisms to monitor and
analyse the results of AI-based recruitment procedures (George, 2023), (ii) evaluate
whether AI systems impartially assess applications from students with disabilities and
offer the necessary assistance during the recruitment process, (iii) make use of AI to
acquire data-driven insights into the efficiency of recruitment approaches and pinpoint
areas where algorithmic discrimination may arise; (iv) leverage AI to proactively identify
students who may require additional support, such as tutoring, mentoring, or financial
aid; and (v) incorporate the perspectives of students with disabilities to address ethical
concerns and promote greater accessibility and inclusivity (Pierrès et al., 2024). This
information can be used by HEIs to create scholarship programs that cover tuition fees
and other associated costs to prevent dropouts. The qualification requirements for these
scholarships should encompass a variety of skills beyond traditional academic
evaluations.
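
As an illustration of the monitoring mechanisms referred to in point (i), the sketch below computes admission rates per applicant group from invented outcomes and flags any group whose rate falls well below the highest one. The 0.8 ratio is an illustrative threshold (loosely inspired by the "four-fifths" rule of thumb), not a legal standard, and the group labels and data are entirely hypothetical.

```python
from collections import defaultdict

# Invented admission outcomes: (applicant group, admitted?).
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
admitted = defaultdict(int)
for group, was_admitted in outcomes:
    totals[group] += 1
    admitted[group] += int(was_admitted)

rates = {group: admitted[group] / totals[group] for group in totals}
highest = max(rates.values())

THRESHOLD = 0.8  # illustrative value, to be set by institutional policy
for group, rate in rates.items():
    flag = "review" if rate < THRESHOLD * highest else "ok"
    print(f"{group}: admission rate = {rate:.2f} ({flag})")
```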

In terms of student learning, it is essential to ensure that all students have equal access
to AI tools (UNESCO, 2023a), as well as that they are provided with all the required
support and resources, regardless of their socioeconomic and demographic backgrounds.
Also, HEIs must ensure that GenAI educational tools are tailored to recognise and tackle
the difficulties encountered by underrepresented groups (Pierrès et al., 2024), allowing
equitable learning among all students. Moreover, to prevent biases in student evaluation,
it is important to implement measures comparable to those recommended for addressing
biases in student admissions.

To start with, it is essential to implement bias-reducing initiatives during the
development of GenAI tools. Such initiatives should focus on promoting diversity
(including gender, ethnicity, and geography) among the engineers, researchers, and
designers involved in building AI systems (UNESCO, 2023a) and using inclusive and
diverse training data. By doing so, a wide range of perspectives representing diverse
demographics and viewpoints can be incorporated into the development process,
reducing bias (UNESCO, 2023a).

5.3.3. Risk “Data leaks and privacy violations”

It is essential that HEIs develop AI systems that are designed to protect student data
(George, 2023; Kurtz et al., 2024). However, system design should be accompanied by
further measures: HEIs should implement robust encryption, conduct regular security
audits, and train staff and students in data privacy practices. Ensuring transparency in
data collection and usage policies is also crucial, as is compliance with data protection
regulations, in order to maintain students' trust. One concrete illustration is sketched
below.
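
As one such illustration, the sketch below shows how a student-support pipeline might pseudonymise obvious identifiers before a prompt is sent to an external GenAI service. It is a minimal example under assumed field formats and patterns, not a complete de-identification solution, and it would need to be adapted to the institution's systems and to the applicable data protection rules.

```python
# Minimal sketch: pseudonymise obvious identifiers before sending text to an
# external GenAI service. Patterns and replacement tags are illustrative
# assumptions; real deployments need a vetted de-identification pipeline.

import re

# Simple patterns for e-mail addresses and student numbers (assumed 7-9 digits).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
STUDENT_ID = re.compile(r"\b\d{7,9}\b")


def pseudonymise(text: str) -> tuple[str, dict[str, str]]:
    """Replace identifiers with tags and return the mapping for local storage only."""
    mapping: dict[str, str] = {}

    def swap(match: re.Match, prefix: str) -> str:
        tag = f"<{prefix}_{len(mapping)}>"
        mapping[tag] = match.group(0)
        return tag

    text = EMAIL.sub(lambda m: swap(m, "EMAIL"), text)
    text = STUDENT_ID.sub(lambda m: swap(m, "ID"), text)
    return text, mapping


if __name__ == "__main__":
    prompt = "Student 12345678 ([email protected]) asked about late enrolment."
    safe_prompt, key = pseudonymise(prompt)
    print(safe_prompt)  # identifiers replaced by tags before the text leaves the HEI
    # 'key' stays on institutional infrastructure to re-identify the reply if needed.
```

Keeping the re-identification mapping on institutional infrastructure, rather than sending it to the provider, is the design choice that limits the exposure of personal data.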

Furthermore, it is important to establish standards and frameworks for the ethical use
of GenAI, ensuring compliance with evolving regulations and ethical considerations, while
maintaining transparency and accountability in data handling practices (Malmström et
al., 2023; Johnston et al., 2024; Lim et al., 2023).

5.3.4. Risk “Environmental and economic sustainability”

As mentioned in Section 4.3.2.4, the sustainability issues associated with GenAI take
two forms: environmental and economic. Regarding the former, universities should
explore more sustainable practices, such as opting for systems that save energy and
tapping into renewable energy sources to run their data centres (UNESCO, 2023a). Also,
HEIs should participate in developing sustainable, energy-efficient hardware and software
solutions, promoting recycling and the responsible disposal of electronic waste, and
considering the life cycle of products to address the environmental impact of computing
(Ray, 2023). Moreover, regulatory pressures and the push for greener technologies are
expected to drive HEIs to innovate and adopt energy-efficient AI solutions. In this sense,
establishing agreements on caps on energy consumption could help ensure fair
competition between HEIs.

In terms of economic sustainability, given the financial restrictions that most public
HEIs face, one possibility is to encourage the establishment of consortia or partnerships
to share financial resources and make advanced AI systems more accessible, while also
promoting collaborations between HEIs and industry partners to jointly finance AI
research and development (UNESCO, 2023a). Additionally, HEIs will need to advocate
for increased government funding and grants specifically targeted at AI research and
infrastructure. Lastly, supporting open-source AI projects that offer free and accessible
tools for educational institutions can help reduce barriers to entry for institutions with
limited budgets.

6. Conclusions and Recommendations

GenAI has significant potential to enhance the experiences of both educators and
students in higher education and to improve the functioning of higher education
institutions (HEIs), owing to its ability to deliver personalised learning experiences,
optimise administrative processes, augment accessibility, and facilitate innovative
pedagogical strategies. It can customise educational materials to cater to the unique
requirements of individual students; assist educators in accommodating different learning
styles and paces; automate routine responsibilities such as assessment grading; improve
accessibility by offering adaptive learning resources; and empower educators to explore
novel and creative pedagogical methodologies.

At the institutional level, GenAI can transform various operational and academic
procedures, increasing efficiency and efficacy. It can refine the allocation of resources
through the analysis of patterns and the forecasting of requirements; facilitate
recruitment and admissions by tailoring communication to different groups of potential
candidates and by automating the assessment of applications; enhance student support
via AI-driven chatbots and virtual advisors; and manage, analyse, and present substantial
volumes of data, allowing for data-driven decision-making that improves internal
processes.

However, GenAI poses risks to all three of these actors (educators, students, and HEIs)
that must be carefully considered and managed. Such risks include biases in AI
algorithms; privacy and data security breaches; lack of transparency or explainability;
ethical and plagiarism issues; sustainability concerns; reduced critical thinking, creativity,
and social skills in students; and workforce replacement.

It is possible to prevent and minimise these risks, provided HEIs are aware of their
existence and willing to confront them. Higher education institutions should take a
proactive and comprehensive approach by:

Approving guidelines for the use of GenAI within each institution. Rather than general
regulations at the university level, these should be adapted to the respective area of
knowledge (e.g., engineering, literature studies, data analytics, economics, or sociology),
since each faces its own specific issues.

Monitoring and updating AI systems so that they remain fair, transparent, effective,
safe, and secure. This implies investing in both technology and the IT workforce.

Prioritising data security and privacy in the design of AI systems, by implementing
rigorous measures to protect sensitive information.

Fostering a culture of ethical awareness among educators and students, ensuring that
the use of GenAI aligns with the core values of education in general and in each academic
community.

Involving various stakeholders (educators, students, technologists, and ethicists) in the
development and implementation of AI tools.

Increasing the digital and AI literacy of students, particularly regarding the applications
most useful to their academic success, both to improve learning and to eliminate the
effect of socio-economic factors on pre-university access to digital technologies.

Investing in regular training and professional development for educators and staff to
equip them with the skills and knowledge needed to effectively integrate AI into teaching
and administrative tasks while maintaining the human-centered aspects of education.

Of course, HEIs do not exist in a vacuum: students will have gone through at least a
decade of schooling before they reach university. Educational policies at the high school
level should already include exposure to GenAI and discussions about its capabilities,
strengths and weaknesses, and ethical issues. The coming generations of students will
have experience with GenAI regardless of whether school helps them use it responsibly
or not.

Educators and students at HEIs should proactively adapt to the increasing use of GenAI
in teaching, learning, and assessment:

Students will use GenAI extensively, and this is beneficial if it is done transparently and
with respect for academic integrity and honesty. Educators should assist them and
promote open conversations about the risks and rules of AI usage.

Educators should focus their teaching on promoting critical thinking (in general and
concerning the outputs of GenAI), make it interactive, and teach students how to learn.

At the same time, educators should be aware that students should not be subjected to
too many messages and nudges from AI-driven tutoring systems, especially those that
involve a gamified environment, in order to avoid cognitive overload, a drop in self-esteem
if they cannot keep up, and a growing dependence on external stimuli to drive their
choices.

Students should learn how to use GenAI tools productively and correctly, to improve
their learning and prepare themselves for future careers in the labour market or in
academic research.

Educators should give more weight to forms of student evaluation that reward
independent and critical thinking, problem solving, and the ability to work in groups and
present work in public, at the expense of traditional evaluation methods based on
reproducing facts or composing essays.

If the number of students is small, as tends to be the case at the master's level,
educators should organise group presentations; if the number of students is high, as at
the bachelor level, more written in-class examinations are warranted. Either of these
implies that educators will have to spend more time evaluating students, though group
presentations can be an important part of a course designed to be interactive.

When educators use AI tools to assess students' work, this should be transparent
(indicated in the syllabus), and students should receive feedback, including the possibility
of contesting the grade.

Any AI evaluation system should first be piloted in parallel with existing evaluation
schemes, to determine whether its assessments are consistent and unbiased; a minimal
version of such a check is sketched below.
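
The sketch that follows, written in Python, illustrates such a parallel pilot. The data file, column names, and the use of per-group mean differences as a first bias signal are illustrative assumptions; a real pilot would rely on the institution's grading records and a proper statistical review.

```python
# Minimal sketch of a parallel-pilot check comparing AI-assigned and human-assigned
# grades. Data, thresholds, and column names are illustrative assumptions; a real
# pilot would use the institution's grading records and a proper statistical review.

import csv
import statistics


def load_grades(path: str) -> list[dict]:
    """Read rows with hypothetical columns: group, human_grade, ai_grade."""
    with open(path, newline="", encoding="utf-8") as f:
        return [
            {"group": r["group"],
             "human": float(r["human_grade"]),
             "ai": float(r["ai_grade"])}
            for r in csv.DictReader(f)
        ]


def consistency_report(rows: list[dict]) -> None:
    """Print overall and per-group differences between AI and human grades."""
    diffs = [r["ai"] - r["human"] for r in rows]
    print(f"Mean AI-human difference: {statistics.mean(diffs):+.2f}")
    print(f"Std. dev. of differences: {statistics.stdev(diffs):.2f}")
    # Per-group mean difference as a first, coarse bias signal.
    for g in sorted({r["group"] for r in rows}):
        g_diffs = [r["ai"] - r["human"] for r in rows if r["group"] == g]
        print(f"  group {g}: mean difference {statistics.mean(g_diffs):+.2f} (n={len(g_diffs)})")


if __name__ == "__main__":
    consistency_report(load_grades("pilot_grades.csv"))  # hypothetical export
```

Large overall or group-dependent differences in the pilot would argue against relying on the AI system for grading, or at least in favour of keeping a human in the loop.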

More research is needed to understand comprehensively the long-term effects of
artificial intelligence in higher education at all the levels discussed above. Over the next
few years, as new AI systems are rolled out and as policies are designed and implemented,
significant experience will accumulate, which will require rethinking the mission of higher
education institutions and the roles of educators and students.

References

Abbas, M., Jam, F. A., & Khan, T. I. (2024). Is it harmful or helpful? Examining the causes
and consequences of generative AI usage among university students. International
Journal of Educational Technology in Higher Education, 21(10).
https://doi.org/10.1186/s41239-024-00444-7

Al-Zahrani, A. M. (2023). The impact of generative AI tools on researchers and research:


Implications for academia in higher education. Innovations in Education and Teaching
International, 1–15. https://doi.org/10.1080/14703297.2023.2271445
Aluthman, E. S. (2016). The effect of using automated essay evaluation on ESL
undergraduate students’ writing skill. International Journal of English Linguistics, 6(5),
54. https://doi.org/10.5539/ijel.v6n5p54

Beck, S. W., & Levine, S. R. (2023). Backtalk: ChatGPT: A powerful technology tool for
writing instruction. Phi Delta Kappan, 105(1), 66-67.
https://doi.org/10.1177/00317217231197487

Beerkens, M. (2021). An evolution of performance data in higher education governance:


a path towards a ‘big data’ era? Quality in Higher Education, 28(1), 29–49.
https://doi.org/10.1080/13538322.2021.1951451

Berendt, B., Littlejohn, A., & Blakemore, M. (2020). AI in education: Learner choice and
fundamental rights. Learning, Media and Technology, 45(3), 312–324. Available at:
https://doi.org/10.1080/17439884.2020.1786399.

Biggs, J. (1999). What the Student Does: teaching for enhanced learning. Higher
Education Research & Development, 18(1), 57–75.
https://doi.org/10.1080/0729436990180105

Biggs, J. B. (2011). Teaching for quality learning at university: What the student does.
McGraw-Hill Education (UK).

Chan, C., & Hu, W. (2023). Students’ voices on generative AI: perceptions, benefits, and
challenges in higher education. International Journal of Educational Technology in
Higher Education, 20(43). https://doi.org/10.1186/s41239-023-00411-8
Chan, C. K. Y., & Tsi, L. H. Y. (2023). The AI Revolution in Education: Will AI Replace or
Assist Teachers in Higher Education? abs/2305.01185.
https://doi.org/10.48550/arXiv.2305.01185

Celik, İ., Dindar, M., Muukkonen, H., & Järvelä, S. (2022). The promises and challenges
of artificial intelligence for teachers: A systematic review of research. TechTrends, 66,
616–630. https://doi.org/10.1007/s11528-022-00715-y

Cingillioglu, I., Gal, U., & Prokhorov, A. (2024). AI-experiments in education: An AI-
driven randomized controlled trial for higher education research.
https://doi.org/10.1007/s10639-024-12633-y

Choi, J. H., Hickman, K. E., Monahan, A., & Schwarcz, D. (2023). ChatGPT goes to law
school. Journal of Legal Education, 71(2022), 387.
http://dx.doi.org/10.2139/ssrn.4335905

Cotton, D. R. E., Cotton, P. A., & Shipway, J. R. (2023). Chatting and cheating: Ensuring
academic integrity in the era of ChatGPT. Innovations in Education and Teaching
International, 61(2), 228–239. https://doi.org/10.1080/14703297.2023.2190148

Crompton, H., & Burke, D. (2023). Artificial intelligence in higher education: The state of
the field. International Journal of Educational Technology in Higher Education, 20(22).
https://doi.org/10.1186/s41239-023-00392-8

de Chiusole, D., Stefanutti, L., Anselmi, P. & Robusto, E. (2020). Stat-Knowlab.


Assessment and Learning of Statistics with Competence-based Knowledge Space
Theory. Int J Artif Intell Educ 30, 668–700 (2020). https://doi.org/10.1007/s40593-020-
00223-1

Dai, W., Lin, J., Jin, H., Li, T., Tsai, Y. S., Gasevic, D., & Chen, G. (2023). Can large
language models provide feedback to students? A case study on ChatGPT. In N.-S. Chen,
G. Rudolph, D. G. Sampson, M. Chang, R. Kuo, & A. Tlili (Eds.), Proceedings - 2023 IEEE
International Conference on Advanced Learning Technologies, ICALT 2023 (pp. 323-
325). IEEE, Institute of Electrical and Electronics Engineers.
https://doi.org/10.1109/ICALT58122.2023.00100

Dalalah, D. & Dalalah, O. (2023). The false positives and false negatives of generative AI
detection tools in education and academic research: The case of ChatGPT. The
International Journal of Management Education 21(2).
https://doi.org/10.1016/j.ijme.2023.100822

De Wit, K. & Broucker, B. (2017). The governance of big data in higher education. In Data
Analytics Applications in Education. Auerbach Publications.

Elkhatat, A. M., Elsaid, K., & Almeer, S. (2023). Evaluating the efficacy of AI content
detection tools in differentiating between human and AI-generated text. International
Journal for Educational Integrity, 19(17). https://doi.org/10.1007/s40979-023-00140-5
European Commission (2018). The impact of Artificial Intelligence on learning, teaching,
and education, Joint Research Centre Publications Office.
European Commission (2022). Ethical guidelines on the use of artificial intelligence (AI)
and data in teaching and learning for educators, Directorate-General for Education,
Youth, Sport and Culture. https://data.europa.eu/doi/10.2766/153756

European Commission (2024). Living guidelines on the responsible use of generative AI


in research. Directorate-General for Research and Innovation. https://research-and-
innovation.ec.europa.eu/document/download/2b6cf7e5-36ac-41cb-aab5-
0d32050143dc_en?filename=ec_rtd_ai-guidelines.pdf

Farrelly, T., & Baker, N. (2023). Generative artificial intelligence: Implications and
considerations for higher education practice. Education Sciences, 13(11), 1109.
https://doi.org/10.3390/educsci13111109

Felix, C.V. (2020), "The Role of the Teacher and AI in Education", Sengupta, E.,
Blessinger, P. and Makhanya, M.S. (Ed.) International Perspectives on the Role of
Technology in Humanizing Higher Education (Innovations in Higher Education Teaching
and Learning, Vol. 33), Emerald Publishing Limited, Leeds, pp. 33-48.
https://doi.org/10.1108/S2055-364120200000033003

Felten, E. W., Raj, M., & Seamans, R. (2023). How will Language Modelers like ChatGPT
Affect Occupations and Industries? https://doi.org/10.2139/ssrn.4375268
García-González, J. M., Gutiérrez Gómez-Calcerrada, S., Solera Hernández, E., & Ríos-
Aguilar, S. (2020). Barriers in higher education: perceptions and discourse analysis of
students with disabilities in Spain. Disability & Society, 36(4), 579–595.
https://doi.org/10.1080/09687599.2020.1749565

Ge, Z., & Hu, Y. (2020). Innovative Application of Artificial Intelligence (AI) in the
Management of Higher Education and Teaching. Journal of Physics: Conference Series,
1533(3), 032089. https://doi.org/10.1088/1742-6596/1533/3/032089

George, A. S. (2023). The potential of generative AI to reform graduate education.


Partnerships for University International Research Journal, 2, 36–50.
https://doi.org/10.5281/zenodo.10421475

Hooshyar, D., Ahmad, R., Yousefi, M., Fathi, M., Horng, S.-J., & Lim, H. (2016). Applying
an online game-based formative assessment in a flowchart-based intelligent tutoring
system for improving problem-solving skills. Computers & Education, 94.
https://doi.org/10.1016/J.COMPEDU.2015.10.013

Ibrahim, H., Asim, R., Zaffar, F., Rahwan, T., & Zaki, Y. (2023, March/April). Rethinking
homework in the age of artificial intelligence. IEEE Intelligent Systems, 24-27.

Ilieva, G., Yankova, T., Klisarova-Belcheva, S., Dimitrov, A., Bratkov, M., & Angelov, D.
(2023). Effects of generative chatbots in higher education. Information, 14(9), 492.
https://doi.org/10.3390/info14090492

Jaschik, S. (2021). Do Algorithms Lead Admissions in the Wrong Direction?


https://www.insidehighered.com/admissions/article/2021/09/27/critics-algorithms-
push-admissions-wrong-direction

Johnston, H., Wells, R., Shanks, E., Boey, T., & Parsons, B. (2024). Student perspectives
on the use of generative artificial intelligence technologies in higher education.
International Journal for Educational Integrity. https://doi.org/10.1007/s40979-024-
00149-4

Kabudi, T., Pappas, I., & Håkon Olsen, D. (2021). AI-enabled adaptive learning systems:
A systematic mapping of the literature. Computers and Education: Artificial Intelligence,
2. https://doi.org/10.1016/j.caeai.2021.100017
Kaharuddin, K. (2021). Assessing the effect of using artificial intelligence on the writing
skill of Indonesian learners of English. Linguistics and Culture Review.

Kamalov, F., Santandreu Calonge, D., & Gurrib, I. (2023). New era of artificial intelligence
in education: Towards a sustainable multifaceted revolution. Sustainability, 15(12451).
https://doi.org/10.3390/su151612451

Kasneci, E., Seßler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., …
Kasneci, G. (2023, January 30). ChatGPT for Good? On Opportunities and Challenges of
Large Language Models for Education. https://doi.org/10.35542/osf.io/5er8f

Kim, C., & Bennekin, K. (2016). The effectiveness of volition support (VoS) in promoting
students’ effort regulation and performance in an online mathematics course.
Instructional Science, 359-377. Retrieved from
https://link.springer.com/article/10.1007/s11251-015-9366-5

Kurtz, G., Amzalag, M., Shaked, N., Zaguri, Y., Kohen-Vacs, D., Gal, E., Zailer, G., & Barak-
Medina, E. (2024). Strategies for integrating generative AI into higher education:
Navigating challenges and leveraging opportunities. Education Sciences, 14(5), 503.
https://doi.org/10.3390/educsci14050503

Labadze, L., Grigolia, M., & Machaidze, L. (2023). Role of AI chatbots in education:
Systematic literature review. International Journal of Educational Technology in Higher
Education, 20(1), 56. https://doi.org/10.1186/s41239-023-00426-1
Lee, D., Arnold, M., Srivastava, A., Plastow, K., Strelan, P., Ploeckl, F., Lekkas, D., &
Palmer, E. (2024). The impact of generative AI on higher education learning and
teaching: A study of educators’ perspectives. Computers and Education: Artificial
Intelligence, 6, 100221. https://doi.org/10.1016/j.caeai.2024.100221
Liang, J., Wang, L., Luo, J., Yan, Y., & Fan, C. (2023). The relationship between student
interaction with generative artificial intelligence and learning achievement: Serial
mediating roles of self-efficacy and cognitive engagement. Frontiers in Psychology, 14.
https://doi.org/10.3389/fpsyg.2023.1285392

Lim, W.M., Gunasekara, A.N., Pallant, J.L., Pallant, J.I., & Pechenkina, E. (2023).
Generative AI and the future of education: Ragnarök or reformation? A paradoxical
perspective from management educators. The International Journal of Management
Education.
Lin, C.-C., Huang, A. Y. Q., & Yang, S. J. H. (2023). A review of AI-driven conversational
chatbots implementation methodologies and challenges (1999–2022). Sustainability,
15(5), 4012. https://doi.org/10.3390/su15054012
Lu, O., Huang, A., Tsai, D., & Yang, S. (2021). Expert-Authored and Machine-Generated
Short-Answer Questions for Assessing Students Learning Performance. Educational
Technology & Society, 159-173.
Malmström, H., Stöhr, C., & Ou, A. W. (2023). Chatbots and other AI for learning: A
survey of use and views among university students in Sweden. (Chalmers Studies in
Communication and Learning in Higher Education 2023:1).
https://doi.org/10.17196/cls.csclhe/2023/01

Memarian, B., & Doleck, T. (2023). ChatGPT in education: Methods, potentials, and
limitations. Computers in Human Behavior: Artificial Humans, 1(2), 100022.
https://doi.org/10.1016/j.chbah.2023.100022

Memarian, B., & Doleck, T. (2023a). Fairness, Accountability, Transparency, and Ethics
(FATE) in Artificial Intelligence (AI) and higher education: A systematic review.
Computers and Education: Artificial Intelligence, 5, 100152.
https://doi.org/10.1016/j.caeai.2023.100152

Maravanyika, M., Dlodlo, N., & Jere, N. (2017). An adaptive recommender-system based
framework for personalised teaching and learning on e-learning platforms. 2017 IST-
Africa Week Conference (IST-Africa). Windhoek, Namibia: Institute of Electrical and
Electronics Engineers (IEEE). Retrieved from
https://ieeexplore.ieee.org/abstract/document/8102297/references

Marrone, R., Taddeo, V., & Hill, G. (2022). Creativity and artificial intelligence—A
student perspective. Journal of Intelligence, 10(3), 65.
https://doi.org/10.3390/jintelligence10030065

Mousavinasab, E., Zarifsanaiey, N., Kalhori, S., Rakhshan, M., Keikha, L., & Saeedi, M.
(2018). Intelligent tutoring systems: a systematic review of characteristics, applications,
and evaluation methods. Interactive Learning Environments, 29, 142–163.

Nagy, M., & Molontay, R. (2023). Interpretable dropout prediction: Towards XAI-based
personalized intervention. International Journal of Artificial Intelligence in Education, 1–
2. https://doi.org/10.1007/s40593-023-00331-8

Novoselova, O.V. (2023). Virtual Internationalization at Universities: Opportunities and


Challenges. In: Roumate, F. (eds) Artificial Intelligence in Higher Education and Scientific
Research. Bridging Human and Machine: Future Education with Intelligence. Springer,
Singapore. https://doi.org/10.1007/978-981-19-8641-3_5

OECD, 2023. Generative AI in the classroom: From hype to reality? EDU/EDPC(2023)11.


https://one.oecd.org/document/EDU/EDPC(2023)11/en/pdf

Ouyang, F., Zheng, L., & Jiao, P. (2022). Artificial intelligence in online higher education:
A systematic review of empirical research from 2011 to 2020. Education and Information
Technologies, 27(1). https://doi.org/10.1007/s10639-022-10925-9
Padron-Rivera, G., Joaquin-Salas, C., Patoni-Nieves, J.-L., & Bravo-Perez, J.-C. (2018).
Patterns in Poor Learning Engagement in Students While They Are Solving Mathematics
Exercises in an Affective Tutoring System Related to Frustration. Mexican Conference on
Pattern Recognition (pp. 169–177). Springer International Publishing AG.
Pence, H. E. (2019). Artificial intelligence in higher education: New wine in old
wineskins? Journal of Educational Technology Systems, 48(1), 5-13.
https://doi.org/10.1177/0047239519865577

Pierrès, O., Christen, M., Schmitt-Koopmann, F.M., & Darvishy, A. (2024). Could the Use
of AI in Higher Education Hinder Students With Disabilities? A Scoping Review. IEEE
Access, 12, 27810-27828. https://doi.org/10.1109/access.2024.3365368
Polyportis, A. (2024). A longitudinal study on artificial intelligence adoption:
Understanding the drivers of ChatGPT usage behavior change in higher education.
Frontiers in Artificial Intelligence, 6. https://doi.org/10.3389/frai.2023.1324398
Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on
teaching and learning in higher education. Research and Practice in Technology
Enhanced Learning, 12(22). https://doi.org/10.1186/s41039-017-0062-8
Popper, K. (1963). Conjectures and Refutations: The Growth of Scientific Knowledge.
Routledge.

Ray, P. P. (2023). ChatGPT: A comprehensive review on background, applications, key


challenges, bias, ethics, limitations and future scope. Internet of Things and Cyber-
Physical Systems, 3, 121-154. Retrieved from
https://doi.org/10.1016/j.iotcps.2023.04.003

Roediger, H. L., III, & Butler, A. C. (2011). The critical role of retrieval practice in long-
term retention. Trends in Cognitive Sciences, 15(1), 20-27.
https://doi.org/10.1016/j.tics.2010.09.003

Rutner, S., & Scott, R. (2022). Use of Artificial Intelligence to Grade Student Discussion
Boards: An Exploratory Study. Information Systems Education Journal, 4-18.

Sajja, R., Sermet, Y., Cwiertny, D., et al. (2023). Platform-independent and curriculum-
oriented intelligent assistant for higher education. International Journal of Educational
Technology in Higher Education, 20(42). https://doi.org/10.1186/s41239-023-00412-7
Sullivan, M., Kelly, A., & McLaughlan, P. (2023). ChatGPT in higher education:
Considerations for academic integrity and student learning. Journal of Applied Learning
and Teaching, 6(1), 1-10. https://doi.org/10.37074/jalt.2023.6.1.17
Tlili, A., Shehata, B., Adarkwah, M. A., et al. (2023). What if the devil is my guardian
angel: ChatGPT as a case study of using chatbots in education. Smart Learning
Environments, 10(15). https://doi.org/10.1186/s40561-023-00237-x
UNESCO (2023). Guidance for generative AI in education and research (F. Miao & W.
Holmes, Authors). UNESCO. https://doi.org/10.54675/EWZM9535

UNESCO (2023a). Harnessing the era of artificial intelligence in higher education: A
primer for higher education stakeholders (L. L. Bosen, D. Morales, J. Roser-Chinchilla, E.
Sabzalieva, A. Valentini, D. Vieira do Nascimento, & C. Yerovi, Authors). UNESCO
IESALC. https://unesdoc.unesco.org/ark:/48223/pf0000386670

U.S. Department of Education, Office of Educational Technology (2023). Artificial


Intelligence and the Future of Teaching and Learning: Insights and Recommendations.
Washington, DC: U.S. Department of Education. Retrieved from https://tech.ed.gov

Weber-Wulff, D., Anohina-Naumeca, A., Bjelobaba, S., Foltýnek, T., Guerrero-Dib, J.,
Popoola, O., Waddington, L. (2023). Testing of detection tools for AI-generated text.
International Journal for Educational Integrity, 19. https://doi.org/10.1007/s40979-
023-00146-z

Yang, A., Chen, I., Flanagan, B., & Ogata, H. (2021). Automatic Generation of Cloze Items
for Repeated Testing to Improve Reading Comprehension. Educational Technology &
Society, 147-158.
Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review
of research on artificial intelligence applications in higher education – where are the
educators? International Journal of Educational Technology in Higher Education, 16(39).
https://doi.org/10.1186/s41239-019-0171-0

REPORT 11
Generative AI and Higher Education: Challenges and Opportunities
Authors: Steffen Hoernig, André Ilharco, Paulo Trigo Pereira, Regina Pereira

ISSN: 2183-9360

September 2024

Institute of Public Policy Lisbon – Rua Miguel Lupi 20, 1249-078 Lisboa PORTUGAL
www.ipp-jcs.org – email: [email protected] – tel.: +351 213 925 986 – NIF: 510654320

The views and information set out herein are those of the authors and do not necessarily reflect those of the Institute of Public Policy, the
University of Lisbon, or any other institution with which either the authors or IPP may be affiliated. Neither the Institute of Public Policy nor
any person acting on its behalf can be held responsible for any use which may be made of the information contained herein. This report
may not be reproduced, distributed, or published without the explicit prior consent of its authors. Citations are authorized, provided
the original source is acknowledged.

www.ipp-jcs.org
© Institute of Public Policy Lisbon | September 2024 – All rights reserved
