Building Collaborative Learning: Exploring Social Annotation in Introductory Programming

ABSTRACT
The increasing demand for software engineering education presents learning challenges in courses due to the diverse range of topics that require practical applications, such as programming or software design, all of which are supported by group work and interaction. Social Annotation (SA) is an approach to teaching that can enhance collaborative learning among students. In SA, both students and teachers utilize platforms like Feedback Fruits, Perusall, and Diigo to collaboratively annotate and discuss course materials. This approach encourages students to share their thoughts and answers with their peers, fostering a more interactive learning environment. We share our experience of implementing social annotation via Perusall as a preparatory tool for lectures in an introductory programming course aimed at undergraduate students in Software Engineering. We report the impact of Perusall on the examination results of 112 students. Our results show that 81% of students engaged in meaningful social annotation successfully passed the course. Notably, the proportion of students passing the exam tends to rise as they complete more Perusall assignments. In contrast, only 56% of students who did not participate in Perusall discussions managed to pass the exam. We did not enforce mandatory Perusall participation in the course. Yet, the feedback from our course evaluation questionnaire reveals that most students ranked Perusall among their favorite components of the course and that their interest in the subject has increased.

CCS CONCEPTS
• Applied computing → Collaborative learning; Computer-assisted instruction; Interactive learning environments; • Social and professional topics → Computing education.

KEYWORDS
Social Annotation, Educational Technology, Computing Education

ACM Reference Format:
Francisco Gomes de Oliveira Neto and Felix Dobslaw. 2024. Building Collaborative Learning: Exploring Social Annotation in Introductory Programming. In 46th International Conference on Software Engineering: Software Engineering Education and Training (ICSE-SEET ’24), April 14–20, 2024, Lisbon, Portugal. ACM, New York, NY, USA, 10 pages. https://doi.org/10.1145/3639474.3640063

This work is licensed under a Creative Commons Attribution International 4.0 License. ICSE-SEET ’24, April 14–20, 2024, Lisbon, Portugal. © 2024 Copyright held by the owner/author(s). ACM ISBN 979-8-4007-0498-7/24/04. https://doi.org/10.1145/3639474.3640063

1 INTRODUCTION
Programming is one of the first topics taught in many engineering disciplines. Large classes of students typically have their first contact with programming when teachers explain basic programming constructs, such as statements, and they are shown examples of output produced by executing code [21, 29]. Most of that knowledge is not introduced to students before higher-level education. Moreover, most students are unfamiliar with the technological content knowledge associated with teaching and learning programming (e.g., development environments, installation of compilers or interpreters). Exercising those skills already in the first class can easily overwhelm students, particularly those without any prior knowledge of programming [29]. Besides the content itself, students must learn how to explain their algorithms to each other, i.e., explain to peers the steps that they followed to solve a specific problem [25, 28]. This is particularly challenging when students need to collaborate towards a solution in, e.g., a project course.

Teaching approaches focused on flipping the classroom or performing active learning have shown effective results in improving the exam results of students [3, 5]. Particularly, preparing for lectures by reading material or watching videos has been one of the main tools used to allow teachers and students to focus their time together on solving problems and discussing different solutions to the problems. However, those studies did not investigate the collaborative dimension of students working and learning together.

Social Annotation (SA) is a pedagogical approach that fosters collaborative learning among students, enabling them to jointly engage with course materials, discuss concepts, solve problems, and compare their annotations with peers [1, 18]. Recent research highlights the effectiveness of collaborative learning in computer science and programming education, with students showing increased engagement and improved learning outcomes [12, 24]. SA has garnered overwhelmingly positive student responses, underscoring its potential for enhancing the educational experience [4].

Social Annotation requires an online platform facilitating communication and knowledge sharing among students as they interact with course resources, such as textbooks, exercises, and video lectures. Feedback Fruits, Diigo, and Perusall are a few examples of such tools. For instance, in Perusall1, instructors share course materials, allowing students to asynchronously and collaboratively generate annotations by highlighting specific sections within the [...]

1 https://www.perusall.com/
Figure 1: Example of student interaction in Perusall. Three students comment on the highlighted (pink) annotation in the
course material about the lecture on Functions in Java. Students help each other understand the difference between reusable
code for simple tasks (functions) and structural abstractions (classes).
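To make the distinction the students discuss concrete for readers outside the course, a minimal Java sketch could contrast the two ideas. This is our illustration, not an excerpt from the course material; all names are invented for the example.

```java
// Illustrative sketch (not from the course material): a static method is
// reusable code for a simple task, while a class is a structural abstraction
// bundling state (fields) and behaviour (methods).
public class FunctionsVsClasses {

    // A "function" in Java: a static method computing a result from its inputs.
    static int square(int x) {
        return x * x;
    }

    // A class models a concept with state and behaviour.
    static class Rectangle {
        final int width;
        final int height;

        Rectangle(int width, int height) {
            this.width = width;
            this.height = height;
        }

        int area() {
            return width * height;
        }
    }

    public static void main(String[] args) {
        System.out.println(square(5));                  // prints 25
        System.out.println(new Rectangle(3, 4).area()); // prints 12
    }
}
```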
3 CASE COURSE: CONTEXT AND SCOPE
We investigate the impact of Perusall in a course taught in the first study period of an international bachelor program in Software Engineering and Management at the University of Gothenburg (Sweden). The course is on Object-oriented Programming (OOP) and covers the following learning outcomes: (i) basics in procedural programming (e.g., printing, conditionals, loops, arrays and functions), and (ii) core concepts of OOP (e.g., classes, objects, encapsulation, polymorphism). The programming language taught in the course is Java.

Course structure: The course instance took place during 10 weeks in 2022 and had 143 registered students. Students were expected to dedicate 20 hours per week to the course, which would include time in lectures, laboratory sessions (focused on practical exercises), and self-studies at home. Students were offered three 2-hour lecture sessions and three 2-hour lab sessions a week, all of which were non-compulsory. For course completion, the students must submit: (i) three programming assignments done in groups of up to three; and (ii) a final individual written hall exam where students score between 0–100 points. The course was taught on campus by one course responsible and eleven teaching assistants.

Student background: No entry requirements in programming or computer science applied, as this was the students' first programming course in the Program. Nonetheless, students may or may not have had previous programming knowledge (e.g., during their high-school education), leading to a heterogeneous sample of student backgrounds. Students took one other course in parallel in discrete mathematics with the same expected workload.

Perusall and social annotation: For each lecture, students were instructed to prepare by reading the material in the Perusall platform in the form of reading assignments. Students complete a reading assignment by reading the material until the end and engaging with the material (e.g., scrolling, highlighting text, etc.). The reading assignments are the main component of Perusall that fosters collaborative interaction between students before the lecture. Therefore, the completion of reading assignments will be our main metric to measure the social annotation element in the course.

To motivate students to complete reading assignments, the course instructor offered bonus points to students. A maximum of eight bonus points towards the exam was given to students creating meaningful annotations or engaging in discussion in the Perusall material for the eighteen lectures.3 For each lecture, 0.5 bonus points could be obtained. Thus, the maximum number of points was achieved by completing any 16 of the 18 Perusall tasks.

We used Perusall's definition of meaningful annotations and shared examples with the students at course start.4 In short, meaningful annotations are comments or questions that showcase the student's comprehension of the concepts discussed in the material. We exemplify a meaningful student exchange in Perusall through Figure 1, where three students discuss what a Function is. All three students received the bonus for that lecture as they made multiple similar comments throughout that lecture's material.

To cope with the large number of students, we used Perusall's algorithm that automatically assesses the quality of students' annotations based on the content quality, the number of students' replies, length of text, among other features extracted from the annotation [11]. The course instructor chose the holistic scoring strategy defined in Perusall, which considers the annotations created, and whether the student has read the entire material before the lecture.5 The course instructor also determined that students needed to create at least two meaningful annotations to complete

3 Students were also informed that bonus points could not be used to cross the passing threshold of 50 points, i.e., a student could not pass the course through bonus points.
4 https://support.perusall.com/hc/en-us/articles/360034824694-How-is-annotation-
the reading assignment to foster discussion threads and communication between groups of students. The practical exercises and student-teacher interactions happened mainly during lectures. In case a student disagrees with Perusall's automatic score, the course instructor can revise and override Perusall's decision. Fine-tuning Perusall's accuracy is beyond the scope of our study; therefore, we acknowledge and discuss some limitations associated with its automated grading system in our threats to validity.

Lecture format: All lectures were hosted on campus. An average of 80 students showed up to class (55% attendance rate). Each two-hour lecture was divided into two parts with a 10-min break in between. Part one was a Mentimeter6 session with multiple-choice questions regarding the Perusall material. We chose Mentimeter for its simplicity and to allow for anonymity. Students were told that quiz participation had no influence on the grade and was optional. For each question, the answer statistics were presented live to the students, and the teacher initiated a discussion about the students' reasoning, particularly when the answers were not converging to the correct option. The second part of the lecture focused on applying the lecture topics with the help of one or two coding exercises solved together with the class.

Written hall exam: Students did a four-hour written exam with various questions focusing on tracing code, writing small classes or functions, and explaining the application and trade-offs of topics covered in the course. Students received between 0–100 points based on the quality of their answers. We applied the four-level grading scale below. The exams were anonymised by the examination office and graded by the course responsible.

• Fail (U): Assigned to students that scored less than 50 points in the exam.
• Pass (3): Given to students that scored between 50 and 69 points.
• Pass with merit (4): Given to students that scored between 70 and 84 points.
• Pass with distinction (5): This is the highest grade in the scale and is given to students that received 85 points or more.

Course evaluation: At the course's outset, five students volunteered to become student representatives, who are the contact point for all students when the student collective wants to offer feedback regarding the teaching and learning throughout the course. Nonetheless, all students have direct channels to communicate with the course instructor. In the last week of lectures, all students receive a questionnaire following the SEEQ feedback template [17]. The questionnaire is closed before the written exam to reduce the risk of bias introduced by the examination experience. The course responsible and student representatives meet on two occasions: the first time halfway into the course to reflect on the course status and the teaching methods for possible intervention; the second meeting was a retrospective with the presence of the program manager and study administrators, where the SEEQ questionnaire results were discussed. The information collected from those instruments helped the instructor to understand: (i) some of the main obstacles in the usage of Perusall, (ii) the frequency and level of satisfaction from students engaged in peer instruction, (iii) students' reactions to the teaching methods, and (iv) the self-reported impact on their learning.

Prior Knowledge in Programming: One of the main challenges in teaching first-year programming courses is the variance among students regarding their prior knowledge of programming. On one hand, having prior knowledge can lower students' motivation to engage in discussion about topics they are already familiar with. On the other hand, those students can also share their experiences with novice students to help them learn. Typically, programming is not taught in primary or secondary school, even though this might change, given the benefits of introducing students to programming earlier [26, 29]. The instructor estimated the prior knowledge of students by sending them an anonymous questionnaire with various programming-related questions before the first lecture (each question had an option "I do not know/I cannot answer yet"). From the sample of 115 respondents, 34% of students could not answer what a String is, and 67% of students did not know what an if-statement is. Both topics are basic programming constructs taught in the first week of the course. Therefore, we argue that prior programming knowledge is not a prevalent factor influencing our analysis in this instance of the course.

4 RESEARCH METHODOLOGY
To prevent an unfair teaching environment and the risk of favoring or disadvantaging a particular group of students, we chose not to conduct a controlled experiment. Instead, we gave students the choice by making social annotation an optional part of the course. Therefore, to answer our research questions, we used the individual results of the Perusall reading assignments together with the students' exam results. The feedback from the course evaluation questionnaire is used to discuss the qualitative aspects of the students' feedback about using social annotation. We refer to Perusall activity as the outcome of the reading assignments made by each student. Each lecture had a corresponding reading assignment. There were three outcomes for each reading assignment:

• Skipped: The student did not create a single annotation for that reading assignment, or they did not even read the material in Perusall.
• Incomplete: The student created at least one annotation in the material, but the content was not assessed as meaningful by Perusall's algorithm, i.e., the annotation did not convey the student's understanding of the subject covered.
• Completed: The student made at least two comments on annotations that were classified as meaningful according to Perusall's algorithms. These comments can be questions they asked, answers provided to other students, or comments in discussion threads.

We compared those different types of activities in relation to the students' exam results (both points and grade). We analysed the exam results without adding the bonus points from Perusall, since this would otherwise introduce a bias towards passing students. We analysed the results of 112 students considering the intersection between those registering for Perusall during the course instance

6 https://www.mentimeter.com/
[Figure: Number of students per reading-assignment status (Incomplete vs. Completed) for each lecture L01–L18. The accompanying list of lecture topics includes: L01 Variables, Types and Expressions; L03 Conditionals; L04 Loops; L05 Arrays; L08 Encapsulation and immutable objects; L09 Collections - Lists, Sets and Maps.]
OOP concepts. Note that the drop in completion rate did not affect the drop in reading-assignment engagement, as the proportion of incomplete assignments varies roughly between 40–60% throughout [...] as Incomplete.)

“Why is [overriding] risky? I get that it could become a problem if a subclass needs to override a method, but doesn’t. [...]”

The first comment was classified as incomplete because the [...]

[Figure: Distribution of students across exam points (0–90).]

RQ1: More than 50% of students took part in the non-compulsory social annotation activities in Perusall. However, the percent[...]
Of the 62 students who completed fewer than two reading assignments, 64.5% (40) failed the exam. This proportion was almost three times higher than the proportion of failing students who completed at least two assignments (22%). Moreover, the proportion of students in all passing grades is significantly higher in the group of students that completed the expected number of assignments to pass, particularly for the better grades 4 (6 vs. 11) and 5 (1 vs. 6).

RQ2.1: We observed a distinct grade distribution among students who actively participated in social annotation by completing the required number of assignments for passing. Notably, we identified a positive correlation between engagement and the distribution of exam scores and final grades. As students increased their involvement in social annotations within Perusall, we noted a decrease in the number of exam failures. Furthermore, a significant trend emerged, revealing that the majority of students who completed fewer than two assignments ended up failing the exam, while those who completed at least two assignments exhibited proportionally better performance in their exam grades.

Figure 5 contrasts the proportion of passing and failing students based on their corresponding number of completed assignments. We see that the largest proportion of failing students are those who did not complete a single assignment (x = 0), which aligns with our observations above. Moreover, when completing more than 4 assignments, the cumulative number of students passing the exam (25) is much higher than the number of students failing (5). Focusing on the middle range of completed assignments (4–9), few students fail and many more pass in that range. For the students highly engaged in social annotation (above 9 lectures) the correlation is even more apparent, as only 2 (out of 19) students failed the exam.

RQ2.2: The majority of students who failed did not complete any reading assignments. After completing more than 4 assignments, the proportion of students passing the exam (22.5%) is much higher than those that failed (4.5%).

Most students (64%) would often or always read the material available in Perusall, but only three students stated that they create annotations in Perusall at the same frequency. Below, we share the statement from a student reporting that the need to annotate the material added a distraction to their studies, despite seeing the benefits when reading discussions from other students. Perusall has options to hide comments and annotations, but students did not receive a walk-through or demonstration of Perusall's features at course start.

“Personally, I enjoyed being able to ask questions directly in Perusall and receiving answers. However, my personal learning style implies highlighting key concepts I find important, and sometimes in Perusall there were full paragraphs highlighted with a question, and it distracted me from the material”. (Student)

After L12 (Inheritance), student representatives in the course asked the teacher to enable anonymous annotations, which is an option in Perusall. The anonymity allows teachers to see the identity of students creating or replying to comments, but students do not see each other. We see a slight increase in activity from students after that, but this is still lower than some assignments before the anonymity was enabled.

The course evaluation questionnaire also includes two questions about the different teaching practices used in the course: (Q9) “What are the three things you liked the least about the course?”, (Q10) “What are the three things you liked the most about the course?”. Only one student listed Perusall and social annotation as one of the things they liked the least in the course. Particularly, the student was unsatisfied with the amount of time spent during the lecture quizzes (which typically cover the content from Perusall). Also, this student is more interested in a more traditional format of lectures where explanations are delivered predominantly by the lecturer rather than by their colleagues. Related work also reports that students struggle to adopt social annotations and move towards continuous learning [19].
[Figure 5 plot: number of students (y-axis, passing shown above the axis and failing below) per number of completed assignments (x-axis, 0–18).]
Figure 5: Course passing statistics per number of completed assignments. Completing many of the assignments (x-axis) has a large impact on passing the course, while little activity results in a high risk of failing. The “negative” y-axis is used simply to emphasise the difference between students who passed and those who failed.
Table 3: The responses for a subset of questions from the course evaluation questionnaire. 25 students answered the questionnaire
using a Likert scale with 5 levels detailed below. For each question, the median answer is highlighted in bold and blue.
ID Question Description 1 2 3 4 5
1: Never. 2: Rarely. 3: Sometimes. 4: Often. 5: Always.
Q1 How often did you read the material before the lecture (Perusall or offline)? 1 2 6 6 10
Q2 How often did you create or respond to annotations in Perusall? 8 9 5 1 2
Q3 How often did you attend the lectures? 1 0 0 2 22
1:Strongly disagree. 2:Disagree. 3:Neutral. 4:Agree. 5:Strongly Agree
Q4 Reading the material before the lectures helped me better understand the lessons 1 2 6 7 9
Q5 The lecture quizzes made me better understand the concepts being taught. 2 2 4 10 6
Q6 Students are encouraged to ask questions and are given meaningful answers. 0 0 1 9 15
Q7 I have learned something that I consider valuable. 0 1 1 8 15
Q8 My interest in the subject has increased as a consequence of this course. 1 1 2 9 12
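The medians highlighted in Table 3 can be recomputed from the response counts in each row. The following small sketch (ours, not from the paper) derives the median Likert level from a row of counts; method and class names are invented for the example.

```java
// Illustrative sketch: derive the median Likert level (1-5) from per-level
// response counts, as used to identify the median answer per question in Table 3.
public class LikertMedian {

    // counts[i] = number of respondents choosing level i + 1.
    static int medianLevel(int[] counts) {
        int total = 0;
        for (int c : counts) {
            total += c;
        }
        int middle = (total + 1) / 2; // position of the (lower) median respondent
        int cumulative = 0;
        for (int level = 0; level < counts.length; level++) {
            cumulative += counts[level];
            if (cumulative >= middle) {
                return level + 1;
            }
        }
        throw new IllegalArgumentException("empty counts");
    }

    public static void main(String[] args) {
        // Q2 counts from Table 3 (8, 9, 5, 1, 2): median is level 2 ("Rarely").
        System.out.println(medianLevel(new int[]{8, 9, 5, 1, 2})); // prints 2
        // Q1 counts from Table 3 (1, 2, 6, 6, 10): median is level 4 ("Often").
        System.out.println(medianLevel(new int[]{1, 2, 6, 6, 10})); // prints 4
    }
}
```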
“[...] measure of what had been understood or needed more practice, and also led to some nice discussions.” (Student)

Most of our analysis focuses on the correlation between Perusall annotations and exam points, such that we cannot use those measures to confidently infer causation between social annotation and students’ learning. When analyzing the correlation between exam scores and students’ learning, it is important to consider potential confounding factors. For instance, students who engage in social annotation might already be highly motivated and diligent, inherently contributing to their higher exam scores. Additionally, students vary in study habits or access to additional educational resources outside the course, which might influence exam scores independently of the course’s teaching methods. Such an analysis would require other instruments, such as a thematic analysis of students’ annotations throughout the course, as well as exercises or assessments that can show more of a progression throughout the course. We aim to perform those analyses in future work.

For the scope of this paper, we evaluate the learning based on the self-reported satisfaction from the course evaluation. The results suggest that social annotation correlates with the students’ learning satisfaction. Note that 23 students (92%) agree or strongly agree
that they have learned something valuable in the course (Q7) and that, similarly, their interest (84%) in programming increased as a consequence of the course (Q8). Therefore, we summarise our findings and lessons learned in the points below:

• Most students who engaged in social annotation passed the exam and, proportionally, showed higher grades. This is also reported in other areas such as physics [20] or multimedia applications [7].
• Completing more reading assignments in Perusall is correlated with higher passing rates.
• Most students listed Perusall, the lecture quizzes, and class discussions among the three things they liked the most in the course.
• More than 90% of the course evaluation respondents agree that they learned something that they consider valuable, and their interest in programming has increased after the course.
• Social annotation can leverage flipped classroom approaches. Many students read the material before the lecture and were motivated to engage in active learning during classes (e.g., Mentimeter quizzes and discussions).

Based on the experience reported in this paper, we make the following recommendations to instructors interested in introducing social annotation and Perusall to their programming courses:

• Consider material length and scope: Given that students in the analyzed course had approximately 24 hours between lectures to prepare and annotate the associated reading materials, it is crucial for these materials to be concise and focused. In this particular course, the average content for each lecture encompassed approximately 10 pages, comprising text and Java code examples.
• Investigate incentives for social annotation: More than half of the students consistently engaged with Perusall throughout the course. However, the percentage of completed reading assignments dropped over time. Increasing the number of bonus points, or making social annotation compulsory, can encourage engagement.
• Demonstrate the social annotation platform early: Students are not familiar with social annotation platforms in education, which can create initial barriers to engagement and increase the friction of creating annotations in the first weeks of the course. Additionally, they may not be used to articulating their cognitive processes while writing their notes. To mitigate this, demonstrating Perusall, along with illustrative examples of both meaningful and not-so-meaningful annotations, can diminish the learning curve associated with social annotation as a practice.
• Enable anonymous annotations: Initially, students may hesitate to openly express their uncertainties or questions to their peers. We observed a small increase in the number of annotations following the introduction of anonymous annotation, albeit introduced later in the course. It is worth considering that implementing anonymous annotations from the outset could have fostered greater engagement early on.

6.1 Limitations
There are some limitations in our analysis. One of the main construct validity threats is focusing on one instrument to indicate learning performance in the course. Written exams have limitations in conveying the learning of students due to various factors, such as anxiety due to time constraints or cultural biases [6, 22]. On the other hand, exam results or grades provide a consistent way to compare trends across many instances of courses and have been used to evaluate teaching in software engineering education in the literature [5]. Using exam points also allowed us to compare our findings with other results from the literature [7, 13, 20]. We mitigate the limitation in using exam scores by focusing our conclusions on the correlations between points, grades and Perusall activity without inferring a direct causation to learning.

Even though the instructor can override Perusall's automated grading, there are also risks of students receiving bonus points without making substantial comments (false positives). We did not make adjustments for those cases to avoid reducing the grade of the student. In our findings, most students did not earn bonus points, as their comments were deemed insubstantial by Perusall. Fewer than five students requested a score review, and of those, just two had their scores modified. Nonetheless, we plan to explore Perusall's accuracy further in future research.

Moreover, course evaluation feedback is also subject to various factors, such as the student population (e.g., student bias), the impact of exam results, and the phrasing of the questions. We mitigate these factors by using the standardised SEEQ template [17] and by collecting course feedback before the written exam is performed. Moreover, course evaluations can be a useful tool to gather information about students' experiences and can offer insights on how to improve course quality [9].

In our analysis, the heterogeneity of students' background knowledge in programming is a key internal validity threat. To assess this, we conducted an entry questionnaire at the course's outset, revealing that 67% of students were unfamiliar with basic concepts like "conditionals." This suggests a relatively uniform knowledge level among participants. However, our results are not broadly generalizable due to the specific context of our study. Despite this, our large student sample and the range of topics align with those in many university programming courses. Future iterations of this course will incorporate student feedback to refine these teaching activities.

7 CONCLUSIONS AND FUTURE WORK
We report on our experience in introducing optional social annotation in a programming course via reading assignments using Perusall. We analyse whether social annotation has an impact on the students' grades and satisfaction. Our findings suggest that many (in our case, the majority of) students actively engage with the optional material and that a significant correlation to passing grades can be observed. However, only a subset of the annotations