Charting the Future of AI in Project-Based Learning: A Co-Design Exploration with Students

Chengbo Zheng, Kangyu Yuan, Bingcan Guo, Reza Hadi Mogavi, Zhenhui Peng, Shuai Ma, and Xiaojuan Ma
University of Waterloo, Waterloo, Canada; Sun Yat-sen University, Zhuhai, China; The Hong Kong University of Science and Technology, Hong Kong, China
Xiaojuan Ma: [email protected]
ABSTRACT
The increasing use of Artificial Intelligence (AI) by students in learning presents new challenges for assessing their learning outcomes in project-based learning (PBL). This paper introduces a co-design study to explore the potential of students' AI usage data as a novel material for PBL assessment. We conducted workshops with 18 college students, encouraging them to speculate an alternative world where they could freely employ AI in PBL while needing to report this process to assess their skills and contributions. Our workshops yielded various scenarios of students' use of AI in PBL and ways of analyzing these uses, grounded in students' vision of education goal transformation. We also found that students with different attitudes toward AI exhibited distinct preferences in how to analyze and understand the use of AI. Based on these findings, we discuss future research opportunities on student-AI interactions and understanding AI-enhanced learning.

CCS CONCEPTS
• Human-centered computing → Empirical studies in HCI; User studies; • Computing methodologies → Artificial intelligence; • Applied computing → Education.

KEYWORDS
AI for education, project-based learning, co-design, qualitative study, generative AI

ACM Reference Format:
Chengbo Zheng, Kangyu Yuan, Bingcan Guo, Reza Hadi Mogavi, Zhenhui Peng, Shuai Ma, and Xiaojuan Ma. 2024. Charting the Future of AI in Project-Based Learning: A Co-Design Exploration with Students. In Proceedings of ACM Conference (Conference'17). ACM, New York, NY, USA, 19 pages. https://doi.org/10.1145/nnnnnnn.nnnnnnn

1 INTRODUCTION
The advance of Artificial Intelligence (AI), especially recent breakthroughs in generative AI (GenAI) and foundation models [98], has a foreseeable impact on higher education [27, 50, 70]. This is evidenced by the increasing use of AI tools by students to assist in their learning tasks [11, 22, 37]. Students use AI, such as ChatGPT [57], to resolve confusion and assist with time-consuming, tedious tasks, such as debugging and documentation, allowing them to focus more on essential learning tasks [22]. Despite the benefits of using AI in students' learning, this shift also creates new challenges for education practitioners. One critical question that often arises is how to fairly evaluate students' learning outcomes when AI contributes to the completion of learning tasks [4]. It is undesirable that assessments end up measuring the capabilities of AI rather than reflecting the students' acquisition and application of skills.
To tackle this challenge of assessments, some researchers and education practitioners have suggested exercising more in-class or oral tests [70]. While this approach may adequately evaluate a student's low-level learning outcomes, such as remembering course knowledge, it falls short in measuring students' high-level learning outcomes, such as creative thinking and metacognition, that are highly anticipated by educators in project-based learning (PBL) [12, 35].
In PBL, students tackle authentic problems and generate artifacts such as reports or models as solutions [9, 67]. Use of technology is usually encouraged in PBL [9, 32]. Thus, it is conceivable that AI tools will be increasingly adopted by students in PBL with instructors' permission, if not already. Artifacts produced in PBL usually serve as key indicators of students' learning outcomes [75]. However, the increasing use of AI tools in producing these artifacts raises questions about their reliability as accurate measures of student learning [27, 65, 70].
One alternative strategy is to base the assessment on detailed documentation and reports of the PBL process data, possibly in the form of presentations or learning journals [9, 75]. The learning process data can provide insights into key higher-order cognitive processes (e.g., decision-making) that students undergo and qualities (e.g., critical thinking and creative thinking) they exhibit throughout the project. Details about how students leverage AI assistance in their projects can be an integral part of future students' learning process data. This addition could potentially provide educators with valuable information about the extent to which students' efforts, rather than AI capabilities, contribute to the project outcomes. However, a significant gap remains in our understanding of how students might want to report their AI involvement in the PBL process and how such a report could support future assessments of students' learning outcomes in PBL.
In this paper, we aim to take the first step towards filling this gap. One challenge to our investigation arises from the immature state of AI tool adoption among students. Despite the popularity of commercial AI products like ChatGPT, many students may lack the necessary skills, such as prompt engineering [22, 93], to use AI in the way they want. Besides, many AI services are still evolving; they have yet to reach a state of being truly usable and suitable for students in their actual learning activities. Even if students have desired AI tools in mind, they may not have adequate access to them due to paywalls, regional restrictions, or concerns about academic integrity when using such aids. Nevertheless, the Human-Computer Interaction (HCI) community has a long history of exercising design research methods, such as design probes, co-design workshops, and design fiction, to explore the impact of emerging technologies on different groups of stakeholders and society in futuristic scenarios [40, 56]. Inspired by previous design research in the field [14, 17, 43, 80, 86], we designed and operated a co-design workshop study to engage college students in actively exploring the future practices of documenting AI usage in PBL. The workshop participants were encouraged to speculate a future PBL scenario where they have the freedom to leverage the assistance of any AI capabilities, whether such capabilities currently exist or are yet to be developed, but where the assessment of students would be largely affected by students' submitted AI usage reports documenting how they have used AI.
Our workshop includes three innovative activities: AI-involved PBL journey speculation, Imagine the ideal student, and AI usage report design. In the first activity, participants imagined how they might utilize AI in the process if they were to conduct a previous course project again. This activity echoes the principles of design fiction [40] but situates the speculation not in the distant future but in an "alternative present" [14]. In the second activity, "Imagine the ideal student," participants envisioned the traits of a future ideal student, such as "creative" and "self-driven," which they believe should be reflected in their AI-assisted PBL. Lastly, the "AI Usage Report Design" activity invited participants to craft components of a process report of their re-envisioned course projects specifically related to AI usage, aiming to help with the assessments of the traits (2nd activity) through analyzing students' AI usage behavior (1st activity).
We organized seven separate iterations of a three-hour co-design workshop, with a total of 18 college students, to explore the potential future of reporting AI usage in PBL. We performed qualitative analysis on the collected data and validated our findings using source, investigator, and theory triangulation [19], ensuring they are rooted in students' experiences and accurately reflect their attitudes. During our workshops, students produced a broad range of both existing and envisioned AI usages based on their previous learning journeys. Grounded in these envisioned AI usages, participants suggested multiple methods for analyzing students' interaction with AI, aiming to yield valuable insights for evaluating learning outcomes. Post-workshop interviews revealed that students' various attitudes towards AI led to distinct preferences for how their interaction with AI should be represented in reports. However, some participants voiced reservations about the evaluation of their human-AI interactions, citing concerns about the potential for ambiguous interpretation. Due to our study's qualitative nature, statistical generalizability to other scenarios or populations is limited [74]. However, it provides detailed insights into students' views on AI-enhanced PBL, encouraging further research in this vital field.
In summary, this paper contributes to the HCI community by presenting 1) various future AI usage scenarios, education goal transformations, and possible analyses of students' use of AI in PBL from college students, along with a nuanced understanding of the reasons behind them; and 2) a discussion of future research opportunities on student-AI interaction as well as tracking and sensemaking of students' use of AI based on our workshop findings.

2 RELATED WORK

2.1 Project-Based Learning
Project-based learning (PBL) is a student-centered pedagogy widely adopted in higher education [31], which stems from the learning theory of active construction [32, 67]. Constructivists propose that students learn superficially when receiving information from teachers or computers passively. In contrast, deeper understanding is achieved when students actively "construct and reconstruct" knowledge through "experience and interaction in the world" [32]. To this end, in PBL, students usually work on a project for an extended period, gaining hands-on experience by creating artifacts, such as reports, models, and videos, to answer a driving question [9, 31]. PBL is also associated with situated learning theory, which suggests learning is more effective in authentic contexts [32]. Thus, the driving questions in PBL often relate to real-world challenges. Previous educational research highlights many benefits of PBL, such as better mastery of subject matter [12], promotion of self-regulated learning [31], sparking students' motivation [7, 24], and improving students' higher-level cognitive skills such as creative thinking [12, 21].
Adopting PBL also presents several challenges, including generating driving questions that are both authentic and relevant to the subject knowledge [62, 76]; time management [76]; and balancing instructor-led guidance and students' self-directed learning [31, 62]. Another important challenge is evaluating students' learning. Education researchers argue that the assessment of PBL should also be "authentic" [7, 31]. Traditional tests that can only capture students' low-level understanding of knowledge cannot provide a comprehensive evaluation of students [9]. The artifacts produced by students are frequently used for assessment, but this approach is critiqued for neglecting the process [31, 62, 75]. As a complement, students are often required to provide in-class presentations, learning journals, portfolios, and self-reflections to show their learning process for assessment [9, 62].
This paper explores the impact of AI on future PBL and its implications for student learning assessment. The integration of technology in PBL is an important research topic [31]. Technologies are often described as "cognitive tools" [76], indicating they help students collect, process, and synthesize information and better engage in higher-order thinking. Additionally, technologies empower students to undertake tasks previously beyond their capabilities [32], thereby boosting motivation in learning [9]. Besides benefits, Blumenfeld et al. [9] raise concerns about over-reliance on technology potentially leading to a decline in students' skills, and about the need to define appropriate roles for teachers and technology. Previous studies have discussed various technological tools in PBL, such as search engines, project management software, documentation tools, and error diagnosis tools [9, 12, 24, 75]. Yet, the impact of AI has been less scrutinized. Significantly differing from other technologies, AI now demonstrates capabilities that rival or even surpass human intelligence, positioning it as more than just a cognitive tool for students (we discuss this more in Sec. 2.2). In this paper, we investigate how students might use AI in future PBL, how their learning goals might shift, and how the assessments in PBL should be empowered. By exploring these questions, we aim to provide insights into designing future PBL instruction and support tools.

2.2 AI Tools to Support Learning Tasks & Generative AI (GenAI)
Extensive research exists on employing AI to support the student learning process. This includes designing AI to partially replace the teacher's role. Intelligent tutoring systems, such as those for language and algebra, provide adaptive feedback and problem-solving scaffolding [59, 60, 85]. AI is also deployed to handle class logistics and respond to student inquiries [82]. Furthermore, there is a growing interest in how AI can collaborate with students during learning. For example, Jonsson and Tholander [26] studied students working with a code generation model for creative programming. They found that errors in AI generation can confuse but also encourage reflection and exploration. Similarly, Kazemitabaar et al. [28] investigated code generation assistance in introductory programming learning, and they found AI could speed up coding without hindering learning.
While the aforementioned AI tools are designed or selected by teachers specifically for student learning, in PBL, students have the flexibility to use AI tools not intended for educational purposes. This includes AI tools for brainstorming [5], data science work [79], and creating presentation slides [96]. Recently, the rapid development of GenAI, particularly large language models (LLMs), has made AI assistance more accessible to students in the PBL context [16, 22]. LLMs, with their large model scales [83] and prompting techniques like chain-of-thought [84] and multi-step chaining [88], demonstrate remarkable proficiency in a wide range of tasks, even achieving success in college-level exams [13, 33]. Moreover, various LLM-based and other GenAI-based tools, such as ChatGPT [57] and Midjourney [46], are readily available on the market and accessible to students. According to the content analysis of social media platforms by Hadi Mogavi et al. [22], many college students nowadays have used GenAI tools in their learning, including but not limited to generating review flashcards, creating or editing essays, and assisting peer review.
Students' use of GenAI in their learning has raised concerns among educators about potential harm. Generally, it is found challenging to implement responsible AI adoption [78, 81]. In education, teachers are concerned that students might use GenAI to cheat on assignments [36, 77]. This concern is amplified by the challenge of distinguishing between human-written and AI-generated content [20], despite efforts to develop "AI detectors" [58]. Researchers [18, 49, 66] and OpenAI's Educator FAQs [58] also highlight that GenAI could provide inaccurate, misleading, or biased information, potentially impacting students' learning negatively. Consequently, some renowned institutions have banned the use of GenAI tools as an interim solution [51–53].
Despite the concerns, education practitioners also widely recognize the benefits of AI and GenAI tools in learning, such as quick, personalized feedback [48, 54, 66]. Many foresee a near future where AI usage in student learning becomes a norm [36, 58, 66], leading to a transformation in educational assessments, such as emphasizing the learning process rather than just the outcomes [54, 55] and evaluating students' AI literacy [42, 54, 66]. People call for more research to explore responsible ways to apply AI in the education field [49]. Moreover, it is widely agreed that transparency in how students use AI is vital for evaluating their learning in the future [18, 36, 54, 58], not only to judge misconduct but also to understand their critical thinking and problem-solving abilities, and to foster students' self-reflection [18, 54, 58]. Despite this importance, to our knowledge, there is a lack of research investigating this level of transparency and relevant analysis in an AI-enhanced educational context. To this end, we investigate a speculative AI-rich PBL setting, focusing on students' needs to collect and analyze their interactions with AI to transparently communicate their AI-enhanced learning process with others.

2.3 Tracking and Sensemaking of Learning Process
Much HCI research has been devoted to studying how to track students' learning processes and how to use techniques, such as learning analytics (LA) dashboards, to help students and instructors understand the tracked data and the learning process. Much of the learning data is collected from learning-support platforms. For example, in classroom environments, VisProg [94] collects students' programming
data on a Python learning platform and visualizes each student's coding progress to empower instructors to provide in-time feedback; Yang et al. [91] proposed a tool named Pair-Up to track students' learning on digital systems and display students' learning status, such as idling and making errors, to teachers to support in-class orchestration. Collecting video data from remote teaching tools, Glancee [44] recognizes students' learning status to support instructors' teaching. In a non-classroom context, students' learning behavior data on learning-support platforms, such as question pool websites, has also been studied in previous work to, for example, predict student dropout [47] and support student metacognition [90].
Besides auto-collecting student data from learning-support digital environments, previous research also studies data from students' self-tracking or instructors' observations. Rong et al. [69] present a qualitative study regarding how Chinese students utilize a data tracking application to self-record various qualitative learning data to support self-directed learning. Kharrufa et al. [29] design Group Spinner, an instructor-facing data tracking and visualization tool. Instructors can record their observations of students' learning, including the use of technology and outcomes, through Group Spinner, which then presents student data in radar charts to support teachers in the classroom, such as by improving communication with students. In PBL, due to its student-centered nature, students often take the responsibility of tracking their learning data. For example, Sterman et al. [75] developed a documentation tool for students in design courses to document their intermediate outcomes in a design project. Their user studies found that despite the benefits, such as supporting metacognition, students also encounter challenges, such as the tension between "creation" and "documentation."
In this paper, we are interested in a learning context that little research has investigated but that could become increasingly common in future education. This involves students learning by doing a project over an extended period of time in non-classroom environments, with AI playing a pivotal role in the learning. Specifically, students are assisted by powerful AI tools during learning, and their learning goals include ones that might be important in an AI-rich future, such as AI literacy. While prior works also involve collecting data on students' interactions with AI, their purpose is not to assist education practitioners in understanding the learning process but to answer their unique research questions. For example, in Kazemitabaar et al. [28]'s study about students collaborating with Codex, they collected data on students' AI usage, such as the count of prompts per task and the "AI-generated code ratio" (of the final submitted code), to understand whether novices can use AI code generators. Instead, we study from students' perspectives how their AI interaction data can be leveraged to understand their learning process and evaluate their learning outcomes.
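To make this contrast concrete, the minimal Python sketch below computes two descriptive metrics of the kind prior work logged: prompts issued per task and a crude share of AI-generated content in a final submission. The log format, field names, and values are hypothetical illustrations, not the instrumentation of any cited study.

    from collections import Counter

    # Hypothetical interaction log: one record per prompt a student sent to an AI tool.
    interaction_log = [
        {"task": "data cleaning", "prompt": "write a loop to drop empty rows", "ai_chars_kept": 320},
        {"task": "data cleaning", "prompt": "fix the off-by-one error", "ai_chars_kept": 45},
        {"task": "report writing", "prompt": "explain this regression output", "ai_chars_kept": 0},
    ]

    def prompts_per_task(records):
        """Count how many prompts the student issued for each project task."""
        return Counter(r["task"] for r in records)

    def ai_generated_ratio(records, total_chars_submitted):
        """Character-based share of the submission that originated from AI output.
        A deliberately crude proxy; attributing edited or AI-inspired content is harder."""
        kept = sum(r["ai_chars_kept"] for r in records)
        return kept / total_chars_submitted if total_chars_submitted else 0.0

    print(prompts_per_task(interaction_log))                    # Counter({'data cleaning': 2, 'report writing': 1})
    print(round(ai_generated_ratio(interaction_log, 1200), 2))  # 0.3

Counts like these answer researchers' questions about whether novices can use AI; the rest of this paper asks what students themselves would want derived from such logs for assessment.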
3 CO-DESIGN WORKSHOP STUDY
We designed a co-design workshop to investigate the potential of analyzing students' AI usage in PBL for future assessment and invited college students to participate.
In our investigation, we chose to focus on students rather than instructors for several reasons. First, students also play the role of assessors in PBL, including self-assessment [12, 76] and peer assessment [61]. Second, our study was situated in a future context where AI is both advanced and widely accessible. We envisioned that the higher education sector will emphasize responsible usage of AI in learning but cannot constrain how individual students interact with established and emerging AI products and services [45, 65]. In such a scenario, students tend to have a more accurate description of how they may personally leverage AI in PBL than instructors do. Third, there are likely other stakeholders in interpreting future AI usage data, such as potential employers, who may evaluate students' qualifications based on their presentation of learning portfolios (e.g., past course projects). Students have the agency to analyze their learning data and craft the reporting of their AI usage data in these scenarios. Lastly, prior research emphasizes the importance of involving students in analyzing their learning data, as they are central stakeholders in their own educational journeys [2, 71]. We carefully considered the alternative of inviting instructors to participate in the co-design workshop alongside students but ultimately turned down the idea. For one thing, the inherent power dynamics between instructors and students could impact the latter's design thinking [72]. For another, we sought to include students with diverse PBL experiences for generalizability, and operational challenges arose in simultaneously recruiting students and instructors whose past PBL activities align closely. In summary, for the purpose of maintaining a focused scope in this paper, we have limited our exploration to students' experiences and perspectives. Nevertheless, we hope to incorporate teachers' views in our future work.
In the rest of this section, we first elaborate on the participant recruitment and the study setup, then introduce the three key activities involved in the workshop, which are inspired by previous literature as well as a series of pilot studies. Lastly, we present the analysis process of the accumulated data.

3.1 Participants
We recruited participants by disseminating recruitment messages with registration forms through various channels, including social media, word-of-mouth, and posters at four higher education institutions in East Asia. Following this, we received 68 applications for the workshops. We carefully screened the applications and filtered out candidates with inadequate experience in PBL. For instance, we received five applications from first-year undergraduate students; however, the projects they described, such as building a personal web page or learning a programming language, were not solving real-world problems. Thus, we considered that they lacked PBL experience and did not include them in the workshops. Additionally, we required participants to have experience in using AI tools.
We recruited 18 participants (female=13, male=5) from diverse backgrounds. Our qualitative study's sample size was determined by reaching theoretical saturation [19]. This was evidenced by no new insights emerging from the last two workshops, indicating a sufficient data breadth for our research objectives. Our participants consisted of both undergraduates (12) and graduate students (6), and their majors ranged across Computer Science (2), Engineering (2), Data Science (4), Design (3), Psychology (1), Literature (3), and Language (3). In terms of AI tool experience, all participants had interacted with ChatGPT, while a subset (8) also used other AI tools, such as Notion AI.
Table 1: An overview of workshop participants’ demographics and experience with projects and AI tools
Most participants were frequent users of AI tools, with 7 using them weekly and 6 using them daily. When asked to self-assess their ability to apply AI to solve real-world problems on a 5-point Likert scale, participants' average score was 3.44 (SD=0.90). Additionally, 14 participants reported having utilized AI in their recent course projects. For instance, P07 used ChatGPT to "explore the pros and cons of various neural network designs," while P04 used ChatGPT for "data analysis tasks because it is good at dealing with data". The detailed demographic information of our participants is presented in Table 1. Each participant was compensated with $10 per hour for their participation.

3.2 Workshop Setup
Our workshops took place in the summer of 2023 and allowed participants to join in person or virtually. In-person attendance was restricted to three cities where the institutions at which we had disseminated recruitment messages are based. However, due to challenges arising from the disparate geographical locations of the participants in each available time slot, all workshop sessions eventually took place virtually. We utilized a video conferencing software¹ as well as a virtual collaboration whiteboard tool, Miro², to conduct all co-design workshops. To ensure the smooth execution of these workshops, at least one day prior to each session we sent the necessary materials, such as the consent form, the workshop guidance, and the link to the Miro whiteboard to be used in that workshop session, to the participants. The workshop guidance included instructions on pre-workshop preparations (detailed in Sec. 3.3.1) and a brief guide to using the Miro platform.
Our pilot studies suggested that our workshop generally lasted around 3 hours. Each workshop was divided into two parts on the same day to mitigate potential fatigue and optimize engagement. Our workshops had three key activities (see Sec. 3.3); the first part, which took place in the late afternoon and lasted for two hours, covered activity 1, activity 2, and the first step of activity 3. Participants reconvened for the second part in the evening, which lasted an hour and was dedicated to completing the second step of activity 3.

3.3 Workshop Design
Our workshop design was finalized through four rounds of pilot workshop studies. The supplementary material provides our design and reflection for each pilot study.
Our overarching vision informs the framework of our final workshop. That is, in a future where AI plays a significant role in student learning, both the learning process and the focus of assessments will undergo considerable transformation. Under this assumption, our workshop first engaged participants in picturing how AI might influence their learning processes (activity 1), then explored potential changes in the assessments of PBL due to the integration of AI in learning (activity 2). We posit that materials for measuring student performance will also need to be updated in response to assessment transformation. Students' AI usage could serve as a valuable data addition to this evolving assessment landscape. Hence, in our workshop, grounded in the insights from activities 1 & 2, participants were encouraged to conceptualize their own methods for analyzing and reporting AI usage for assessment purposes (activity 3). Figure 1 illustrates all the activities in our workshop. In the following subsections, we detail the design and the techniques we used in each activity.

3.3.1 Activity 1: AI-involved Project Journey Speculation. In this activity, we encouraged participants to freely envision future AI capabilities they might leverage in PBL. Initially, we planned to employ the Futuristic Autobiographies (FABs) method often used in design fiction studies [14, 15, 86]. Specifically, we provided a PBL context that included one project topic given by researchers (e.g., "lung cancer prediction") and a futuristic education context (e.g., "teachers encourage AI use and any AI tools you want are ready"). However, this approach fell short in our pilot studies; participants struggled to generate concrete ideas about the needs and challenges they might face in the hypothetical project, which in turn limited their creative thoughts about AI use.

¹ https://voovmeeting.com/
² https://miro.com/
Figure 1: Our co-design workshop involved three activities (AC). In AC1, students mapped their learning journey (1) and envisioned AI integration in projects with the help of AI Capability Cards (2). AC2 involved identifying key traits for student assessment in a PBL context (3), AI's impact on these assessments (4), and emergent traits necessary in an AI-rich future (5). In the last activity (AC3), based on the outcomes of the prior two activities, students first considered what data should be covered in a report on their AI usage in PBL (6) and then visually designed the report (7).
To overcome this limitation, we adopted the concept of the "alternative present" from design fiction literature [3, 14] in the final workshop design. Instead of imagining a future project, participants were asked to recall a recent or memorable PBL experience they actually had. They were then prompted to re-imagine these projects in a world where AI technology is ten years more advanced than today. This approach allowed participants to ground their speculations in concrete past experiences, enhancing the richness and feasibility of their envisioned AI applications.
Operationally, after signing the consent form, participants were asked to revisit and document a memorable or recent PBL experience using our learner journey template prior to the workshop. Mapping the learner journey is a co-design technique that captures and communicates the essential phases and overarching flow of a student's learning experience [63, 64]. The template prompted them to detail each step of their PBL process, from objectives to actions and specific outcomes. Participants were allowed to adapt the template to better fit their individual experiences (Fig. 1 (1)).
At the beginning of the workshop, we first introduced the background of our study, asked all participants to introduce themselves, and then invited participants to share their mapped-out learner journeys. Then, they were tasked with a 15-minute brainstorming session, envisioning how AI technology – presumed to be ten years more advanced – could augment their past PBL processes. We used the "AI capability cards" presented by Yildirim et al. [92] as props to foster creative thinking. These cards categorize AI functionalities into eight types³. Participants were introduced to each AI capability category with examples in education contexts. They could play with and customize the AI capability cards at will and put down new AI capabilities not captured by the existing categories on "wild cards" (Fig. 2 (1)).
After integrating proposed AI usage into their learning journey maps, participants took turns sharing their revised project journeys (Fig. 1 (2)). We encouraged the audience to actively contribute additional AI ideas for each other's projects. Participants took a 10-minute break before going into activity 2.

3.3.2 Activity 2: Imagine the Ideal Student. After exploring the potential uses of AI in PBL, the next objective of our workshop was to identify future challenges and needs related to assessing students' PBL performance. This would inform the subsequent design of AI usage reports tailored to these assessments. Specifically, this activity involved two key steps: identifying traits that are challenging to assess due to AI incorporation and postulating future traits of students that may emerge with more prevalent AI usage in PBL (Fig. 1 (4, 5)). Here, we use the term "trait" to refer to the qualifications and qualities a student could exhibit through their learning processes, such as critical thinking and creativity, that might be desired by instructors and others to assess.

³ Including "Estimate", "Forecast", "Compare", "Detect", "Identify", "Discover", "Generate", and "Act" [92].
Figure 2: The workshop utilized the Miro platform for collaborative activities. The facilitator shared the Miro board screen
throughout (1). In the first activity, students described their project journey in our provided learning journey table (1). The
second activity involved identifying current and future PBL traits using colored sticky notes—orange for current PBL importance,
purple for future needs, and orange with ticks for traits affected by AI’s rise (2). In the third activity, students linked traits
from the second activity to AI usage, using sticky notes for detailed notes (3(a)). For the visual report, they could use Miro’s
tools for design mockups (e.g., P16), incorporate external materials (e.g., P02), or sketch using preferred tools (3(b)).
Identifying traits whose assessments are challenged by AI incorporation: In the first step, participants reviewed the assessment goals of their past PBL experiences recollected in Activity 1. These goals could be self-defined, considering the autonomous nature of PBL [9, 32], or set by external stakeholders like instructors. Participants reflected on what traits were deemed essential in those assessments (Fig. 1 (3)) and how they were measured in the past. Subsequently, they were asked to critically examine how their AI usage, as speculated in Activity 1, might affect the assessment of these traits.
Imagining future traits needed: In the second step, we asked participants to take 10 minutes to brainstorm new traits, such as the ability to apply diverse AI tools in disciplinary tasks, that might become desirable when AI plays a pivotal role in both society and education. This aligns with literature suggesting that technological advancements can reshape educational paradigms [68].
To facilitate this speculative thinking, we employed an adapted persona-building technique commonly used in participatory design to analyze user needs [71]. Unlike traditional persona-building, which is rooted in past experiences, our revised approach asked participants to envision "ideal future students" and their traits related to AI experiences. This encouraged participants to think beyond existing educational frameworks and consider emerging needs and challenges.
After brainstorming the traits of an "ideal student," we extended the discussion by asking participants to take 5 minutes to consider how these traits might be valued differently depending on assessor identity: instructors, the students themselves for self-assessment, and future employers. This exercise enabled a comprehensive understanding of what learning outcomes students would want to present to different stakeholders in an AI-aided PBL environment. At the end of activity 2, participants were invited to discuss the traits they had thought of and to comment on others' ideas.

3.3.3 Activity 3: AI Usage Report Design. In this activity, participants designed their desired report of AI usage to better understand their learning behavior in PBL. The report should surface their interaction with AI (activity 1) and aim to align with the traits they value (activity 2). This activity was divided into two steps: brainstorming what AI usage data needs to be covered in the report and visually crafting an AI usage report.
Brainstorm the Data Coverage of an AI Usage Report: We asked each participant to choose one to two traits from activity 2 and produce ideas about what interactions with AI that appeared in their speculated AI-aided PBL (activity 1) could be relevant to these traits (Fig. 1 (6)). The goal was to prompt participants' reflection on what aspects of student-AI interactions in PBL may be worth being analyzed, curated, and presented in their AI usage reports for learning assessment purposes. Besides, we wanted participants to recap their work in the first two activities so that their later designs could be grounded in their previous thoughts. In particular, participants were encouraged to map any speculated AI usages from activity 1 to the relevant traits brought up in activity 2. To facilitate individual brainstorming (15 minutes), we provided several example types of interaction data about one specific AI usage: the type of AI, at what stages in the project this AI usage occurs, the input to the AI, any customization of the AI tools, and how the AI output is handled. Participants used sticky notes to add comments on how each type of interaction data about a specific AI usage can be aligned or misaligned with the trait(s) they want to project (Fig. 2). They were encouraged to consider not only their own selected traits but also those proposed by other participants. Once done, they took turns sharing their results and giving feedback to other students.
Visually Depicting an AI Usage Report: In the second step of activity 3, we described the concept of a hypothetical "magic project studio", which could capture all students' interactions with AI and generate an AI usage report based on students' needs. Participants were given 30 minutes to design the AI usage reports that they desired this "magic project studio" to create (Fig. 1 (7)). Using this concept, we hoped participants would bypass implementation details and focus on brainstorming what can be presented in such a report to fully explore the potential of students' AI usage data as an assessment material. During the design process, we encouraged participants to iterate on their outcomes from prior steps, such as the brainstormed data coverage of the AI usage report (activity 3 step 1), whenever a new idea came to their minds.
As introduced in Sec. 3.2, this step was conducted in the evening of the workshop day. Before introducing this step, we took 5 minutes to recap the content of the first part of the workshop. Then, participants were instructed to consider how the reports should be framed in the scenario of submitting them to their course instructors. They were also encouraged to consider how such reports could be modified for other contexts, including self-reflection and job seeking. We provided visual components, including charts, tables, and dialogue bubbles, on the Miro whiteboard as design material. We also encouraged participants to use any other methods to showcase their design ideas, such as sketching or text descriptions.
Lastly, each participant was asked to share their report design. Besides introducing the report design itself, they were tasked with demonstrating what traits they believed their designs could be used to assess, how the chosen human-AI interaction data relate to their assessment goals, and any other design ideas they had in mind but had difficulty illustrating. After the participants of one workshop session had all presented their designs, they were invited to comment on others' proposals, including the pros and cons, and to have an immediate discussion among themselves.
After participants completed activity 3, we also conducted a 15-minute follow-up focus group interview with them, with questions including "What do you think of the importance of leveraging students' AI usage data in future education?", "From a student's perspective, how would you like the AI usage report, such as the one you finally designed, to be produced and delivered to others, e.g., your instructor?", and "What is your general experience of the workshop; were there any confusing moments?"

3.4 Data Analysis
We recorded all workshop sessions, accumulating approximately 24 hours of audio footage. These recordings were initially auto-transcribed and subsequently manually verified for accuracy. We employed the Inductive Thematic Analysis method to analyze the data [19]. The inductive approach offers flexibility in uncovering the nuances of the data, which is particularly beneficial in studies like ours that explore relatively uncharted territories [10, 19]. Three researchers – including a facilitator and an assistant who participated in all workshops – engaged in the data analysis. They first familiarized themselves with the data by reviewing the recordings several times. Then, the coders independently coded the transcripts. They met frequently during the analysis to discuss any discrepancies.
We integrated a dynamic approach into the analysis, simultaneously analyzing early workshops while running later ones. This concurrent analysis allowed us to triangulate data effectively. We asked participants in later workshops about their views on findings from earlier ones, enriching our understanding and validating our results. This method, coupled with cross-referencing the data with the Miro boards used in each workshop (as source triangulation), ensured a robust, iterative analysis.
After completing the analysis, we adopted member checking to validate our findings, which involved inviting participants to review our findings and assess their alignment with their intentions
and experiences [19]. Specifically, we followed synthesized member checking [8]. Our preliminary analysis was summarized in a concise five-page report. This document encapsulated the main themes, essential codes, and representative quotes. We approached participants for their assistance in reviewing this report, and 13 consented. These participants were provided with the report, and we requested them to annotate and provide feedback on any aspects they found either reflective of or inconsistent with their intentions and experiences. Of these participants, 11 returned annotated reports. We carefully compared the feedback from these reports with our existing codes. This comparison allowed us to refine our analysis, ensuring it reflected the participants' perspectives and experiences more accurately.

4 FINDINGS
In this section, we present six primary themes that emerged from our analysis. The first three themes are aligned with the three activities conducted during our workshop, while the latter three themes surfaced from participants' comments and discussions.

4.1 Speculated AI Usages in PBL by Students
Our seven workshops with 18 participants in total resulted in over 100 student-desired AI usages. Through our analysis, six subthemes regarding the purposes of these usages emerged.
• Automating Repetitive and Time-consuming Tasks. All students desire AI to improve process efficiency by automating activities perceived as time-consuming, monotonous, and laborious, such as collecting data, documenting the implementation, and debugging.
• Supporting Divergent Thinking. Students hope AI can stimulate creative and out-of-the-box thinking by providing diverse ideas ("AI has randomness, and the results it comes up with each time may be different, and I may let it give me strategy ideas multiple times for more perspectives") and filtering ideas ("Let AI exclude published and commercially available related application ideas" (P01)).
• Supporting Selection from Alternatives. AI is expected to aid students in choosing the most effective ideas through analysis and comparison. For example, P03 wanted AI to compare his ideas for algorithms, and P05 would like AI to compare different literature to extract the "most correct conclusion" to use in her project.
• Drafting or Directly Implementing Solutions. Students hope AI can take their solution ideas from conception to realization. This would involve coding or creating prototypes based on the student's foundational concept. For example, P18 said, "I might express my ideas to the AI after I've formed a solution for myself and let it do this last step of visualization for me."
• Feedback on Solutions. This involves AI evaluating a solution's effectiveness and offering suggestions for improvement. Participants mentioned 15 times that they hoped AI could help them evaluate a proposed solution based on its effectiveness, feasibility, rigorousness, etc.
• Guiding Students to Learn. This involves AI's educational capabilities, from teaching new concepts to evaluating students' knowledge readiness to do the project.

4.2 Students' Envisioned Future Assessment Transformation
Participants had diverse and sometimes conflicting ideas when considering how their speculated AI usage might impact existing assessment methods. Participants also developed novel traits they believed were needed for an ideal future student. We introduce these visions from students, which serve to contextualize the purposes of students' analyses of AI usage data introduced in the next section.

4.2.1 Old Traits Made Different by AI. Most participants felt that traditional assessment methods, such as those based on artifacts, are inadequate for evaluating traits like creative thinking and efficiency, given the generative capabilities of AI. However, some participants (P02, P06, P14, P16) argued that artifacts should still hold significant weight in assessments. This belief was rooted in their assumptions about AI's limitations, such as its ability to offer only coarse-grained analyses and its inability to tailor solutions to a specific project context.
Interestingly, while most participants considered logical and critical thinking vital skills, P04 and P16 commented that these skills might not be critical in the future, given AI's growing reasoning capabilities. P17 opposed this view, stating: "How you choose to talk to the AI, what you ask it to clarify or expand on—that all takes some serious critical thinking."

4.2.2 New Traits Needed Due to Students' AI Adoption. These traits are not necessarily assessed currently but are believed by participants to be very important due to AI usage.
• Efficacy in Using AI. This is the most frequently mentioned trait, through which participants highlight that an ideal student should use AI with clear purposes (P06, P10, P16) and clearly communicate intents to AI (P11, P13, P15), such that, as a result, the output from AI leads to efficiency or performance increases.
• Leadership in Project Direction. Participants envisioned the ideal student as someone who retains control over the project's direction, relegating AI to a "supporting actor" who executes tasks as directed by the human leader.
• Symbiotic Learning between Students and AI. Some participants (P04, P07, P08, P16, P18) appreciated the notion of a mutually beneficial relationship between humans and AI: AI contributes valuable knowledge or capabilities, while students, in turn, refine AI functionalities to better suit the project's needs.
• Judgment and Discernment. Participants, including those with design backgrounds (P15, P16, P18), argued that traditional design skills may become less critical as AI becomes powerful in designing. Instead, the ability to judge quality and make wise selections from alternatives could be essential.

4.3 Students' Designs of Reporting of AI Usages
This section introduces the participants' proposed ideas for mining their AI usage data. In activity 3, participants brainstormed what data insights could be extracted from the AI usage data, which we refer to as usage analyses.
Figure 3: A subset of designs of the reporting of students’ AI usage resulted from step 2 of activity 3 of the co-design workshop.
We explain the designs when we mention them in the main text.
They also tried to visually depict these insights, which we refer to as framing ideas. Nevertheless, not every usage analysis has a corresponding data framing, possibly limited by participants' data and visualization literacy, but these are worth future research.
In the following subsections, we introduce the themes of usage analysis we discovered. For each theme, if applicable, we discuss the key usage analyses underneath the theme with associated framing ideas. We elaborate on why participants came up with each usage analysis, which is tightly related to their imagined AI usages and envisioned future assessment transformations. Note that we do not claim that the resulting categories of usage analysis and data framing are exclusive or representative. Instead, we aim to open up the discussion space of the potential value of analyzing students' AI usage data through these categories.

4.3.1 Task Allocation between Students and AI. Many students wish to differentiate tasks handled by humans from those executed by AI in their reports. An important design consideration stems from the students' view that different tasks within the project have different weights in nurturing and showcasing various skills. Therefore, presenting who – human or AI – conducted specific tasks implicitly indicates skill development or mastery. This notion is exemplified in P16's task allocation framing, depicted in Fig. 2. Using a Gantt chart-style visualization, P16 illustrates the division of tasks between humans and AI, represented by differently colored blocks with task annotations. The chart also includes the time spent on each task (x-axis) and the level of creativity required (y-axis). P16 rationalized this design by stating, "The project is a learning journey, and key to that learning is the execution of tasks that cultivate specific skills like creative thinking."
One very different consideration is that humans and AI have advantages in different tasks. Thus, for complementary performance, students want to allocate specific tasks to the party that is good at them. For example, P08 liked to use a flow chart to show that he uses AI mainly for Automating Repetitive and Time-consuming Tasks, as he believed the advantage of AI is to perform these tasks fast; P01 would like to show the time she spent on tasks that can be easily and quickly done by AI, such as debugging. On the other hand, P10 would like to show that students themselves are handling creative tasks, although without a concrete framing, and mentioned that "When creative ideas are needed, I don't think AI is helpful even if it lists a lot of data out and make a perfect analysis. Because the spark of inspiration needs a particular moment."
Table 2: Summary of key codes under various themes of students' designs of AI usage analysis. The frequency column presents how frequently the themes were mentioned by participants in the workshops (W) and resonated in the member checking (M).

Theme: Quantifying students' development through human-AI interaction (W=6, M=9)
– Discover students' behavior changes in interacting with AI to show whether students develop their skills in using AI throughout the process.
– Quantify students' learning based on how they have delivered tasks to AI.

Theme: Students' ethical awareness in using AI (W=5, M=8)
– Analyzing whether students' AI usage obeys regulations and respects people's privacy.
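Framings like P16's task-allocation chart presuppose structured records of who did which task. The following minimal Python sketch shows one possible shape of such a record, with the owner of each task, the time spent (the x-axis in P16's design), and the creativity required (the y-axis). The schema, field names, and values are hypothetical illustrations, not data collected in our workshops.

    from dataclasses import dataclass

    @dataclass
    class TaskRecord:
        """One block in a P16-style task-allocation chart (hypothetical schema)."""
        name: str
        owner: str       # "student" or "AI"
        hours: float     # time spent on the task
        creativity: int  # level of creativity required, 1 (routine) to 5 (creative)

    journey = [
        TaskRecord("literature search", "AI", 2.0, 1),
        TaskRecord("research question framing", "student", 3.5, 5),
        TaskRecord("data cleaning", "AI", 1.0, 1),
        TaskRecord("interpreting results", "student", 4.0, 4),
    ]

    # Hours the student spent on the creative tasks that nurture assessed skills.
    creative_student_hours = sum(
        t.hours for t in journey if t.owner == "student" and t.creativity >= 4
    )
    print(creative_student_hours)  # 7.5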
It is interesting to know that although some students would like to show the task allocation, in their real actions they did not want to really clearly allocate the tasks, which often originated from the belief that they and AI were working as a team. P16 designed a table that clearly shows what the AI did and what she did, but she stated that:

    Although AI is responsible for a certain part, my input is involved. As a team, we do not need to divide so clearly. [...] I'm not using this chart to illustrate my specific actions but to show it to the teacher to easily assess my abilities.

4.3.2 Engagement of AI in the Project. Besides differentiating human and AI labor discretely at the task level, many participants would like to quantify to what extent AI has engaged with humans in the project. The most common usage analysis proposed by students was to analyze the percentage of AI's contribution versus the students' in the project, often framed as a pie chart (e.g., Fig. 3, P11), at either the project level or a detailed task level. The motivation for such an analysis is sometimes to complement artifact-based assessments, under which students expect the chart to show their contribution as much larger than the AI's; thus, the artifacts can represent the students' own skills. But sometimes, the motivation is to show students' efficacy in using AI, and participants (P02, P16) expect the work done by AI to be much larger than that done by humans, showing they could leverage AI to assist them in many parts of the work.
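As a rough illustration of where such pie-chart percentages could come from, the Python sketch below derives contribution shares from a hypothetical log of effort split between student and AI. The log format and numbers are invented for illustration, and, as participants note next, real human-AI interaction resists such simple accounting.

    # Hypothetical effort log: (task, party, minutes) entries from one project.
    events = [
        ("debugging", "AI", 30), ("debugging", "student", 10),
        ("ideation", "student", 50), ("ideation", "AI", 15),
    ]

    def contribution_shares(records):
        """Percentage of logged effort contributed by each party, i.e., the
        numbers a pie-chart framing such as P11's would plot."""
        totals = {"student": 0, "AI": 0}
        for _task, party, minutes in records:
            totals[party] += minutes
        grand_total = sum(totals.values())
        return {party: round(100 * m / grand_total, 1) for party, m in totals.items()}

    print(contribution_shares(events))  # {'student': 57.1, 'AI': 42.9}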
in projects (P14), and the number of problems AI solved (P18). How- progress with our project or make a breakthrough.
ever, considering the complex human-AI interaction, some (P04, [...] It would tell me how to talk to AI better, which is
P16, P17) questioned the objectivity of these metrics. P17 pointed valuable.
out that even if 80% of a report is AI-generated, the human contri-
4.3.4 The Process of Students Incorporating AI’s Suggestions into
bution should be valued more, considering significant human input
the Project. Students hypothesized that using AI could bolster their
to AI and testing with AI.
While such percentages could be too abstract to understand and hard to measure, some students proposed connecting AI usages to the artifacts to represent the AI engagement more concretely, further complementing the artifact-based assessment. Participants suggested using a citation-reference style to annotate any places where AI has made a difference. Moreover, P15 proposed detailing the types of AI engagement (Fig. 3, P15), including directly using AI output, student-edited AI output, and human content inspired by AI. With a similar idea of differentiating kinds of AI engagement, P18 expected to annotate the AI engagement along the timeline of the project (Fig. 2, P18), mainly aiming to assess students' efficacy in using AI. She differentiated four types of AI engagement: AI facilitates the project process, AI stalls the progress, AI sparks the idea, and AI causes conflicts.
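As an illustration of this citation-style linking, the sketch below encodes P15's three engagement types as annotations that tie artifact spans back to turns in the interaction log. The `Annotation` structure and all field names are hypothetical; no participant proposed a concrete data format.

```python
from dataclasses import dataclass
from enum import Enum

class Engagement(Enum):
    """P15's three categories of AI engagement with an artifact."""
    DIRECT_AI_OUTPUT = "directly using AI output"
    EDITED_AI_OUTPUT = "student-edited AI output"
    AI_INSPIRED = "human content inspired by AI"

@dataclass
class Annotation:
    """A citation-reference style marker tying an artifact span to an AI usage."""
    artifact_span: tuple[int, int]  # character offsets in the artifact
    engagement: Engagement
    chat_turn_id: int               # back-reference into the interaction log

def engagement_profile(annotations: list[Annotation]) -> dict[str, int]:
    """Counts per engagement type, i.e., the breakdown P15 proposed to report."""
    profile = {e.value: 0 for e in Engagement}
    for a in annotations:
        profile[a.engagement.value] += 1
    return profile

report = [
    Annotation((0, 120), Engagement.DIRECT_AI_OUTPUT, chat_turn_id=3),
    Annotation((121, 480), Engagement.EDITED_AI_OUTPUT, chat_turn_id=7),
    Annotation((481, 700), Engagement.AI_INSPIRED, chat_turn_id=12),
]
print(engagement_profile(report))
```

Ordering the same annotations by the timestamps of their chat turns would yield the timeline view P18 described.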
4.3.3 Effectiveness of Student-AI Interaction. Usage analysis under this theme mainly aims to assess whether a student exhibits the trait of "efficacy of using AI" introduced in Sec. 4.2.2. Students consider whether their designed questions could effectively prompt AI to obtain the desired assistance to be an important indicator of their mastery of using AI. P11 wants to present her question design process as well as the final question; P02 would like a video recording of her series of questions to AI highlighting her skills, for example, how she scaffolds the questions: "I give it a broad requirement and then see if it generates a good result and if not, I refine the question step by step." Students (P11, P17) also consider metrics such as the number of questions and the time needed in questioning and answering with AI as indicators of whether students could effectively use AI, believing that fewer questions and less time spent questioning AI for one specific task indicate better AI usage skills. Besides, P02, P09, and P18 also want to compare their AI usage data with their peers' to show their mastery.
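A minimal sketch of how these two indicators could be derived from a timestamped chat log follows. The log schema is assumed for illustration, since participants named the metrics but not a recording format.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical chat log entries: (timestamp, task, speaker).
chat_log = [
    (datetime(2024, 3, 1, 10, 0), "debugging", "student"),
    (datetime(2024, 3, 1, 10, 2), "debugging", "ai"),
    (datetime(2024, 3, 1, 10, 5), "debugging", "student"),
    (datetime(2024, 3, 1, 10, 6), "debugging", "ai"),
    (datetime(2024, 3, 1, 11, 0), "report outline", "student"),
    (datetime(2024, 3, 1, 11, 1), "report outline", "ai"),
]

def rounds_and_time(log):
    """Per task: number of student questions and elapsed wall-clock time,
    the two indicators P11 and P17 mentioned."""
    stats = defaultdict(lambda: {"rounds": 0, "start": None, "end": None})
    for ts, task, speaker in log:
        s = stats[task]
        s["rounds"] += speaker == "student"
        s["start"] = ts if s["start"] is None else min(s["start"], ts)
        s["end"] = ts if s["end"] is None else max(s["end"], ts)
    return {task: (s["rounds"], s["end"] - s["start"]) for task, s in stats.items()}

print(rounds_and_time(chat_log))
```

Note that, as Sec. 5.2.3 discusses, the same numbers admit opposite readings: fewer rounds can signal mastery of AI, while more rounds can signal critical engagement with it.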
Several students considered presenting the changes in the AI's behavior resulting from students' continuous input, to show whether they successfully guided AI in their desired direction and thereby demonstrate mastery, although they offered no concrete data framing ideas. P13 expressed that:

Let's say at the beginning, AI is just a basic general AI, but I feed it some papers, and it gradually understands the stuff that I might be trying to do, and then it can give some matching help.

Another usage analysis is mining the project outcome improvements due to students' AI adoption, which is often framed as a comparison between students' original work and the work improved by AI. This usage analysis is used not only to showcase students' AI mastery but also to foster students' reflection so they can improve that mastery. For example, P17 suggested that:

I'd like to mark some points of the conversation where asking questions or keywords in the back-and-forth dialogue [between AI and me] made it possible to progress with our project or make a breakthrough. [...] It would tell me how to talk to AI better, which is valuable.

4.3.4 The Process of Students Incorporating AI's Suggestions into the Project. Students hypothesized that using AI could bolster their decision-making capabilities by supporting divergent thinking, selecting alternatives, and providing feedback on solutions. However, many students knew that AI might provide incorrect information, introduce bias, and misguide their decision-making; thus, students must avoid adopting AI's ideas without caution. Such concerns could turn into opportunities for assessing students. Students believe examining their behavior in incorporating AI's suggestions can offer key insights for assessing traits such as critical thinking, creativity, and leadership in human-AI interactions.

One straightforward approach to understanding how students incorporate AI-generated ideas involves asking them to articulate their perceptions of and reflections on the AI's suggestions. Beyond this subjective analysis, several participants expressed a desire to demonstrate their process of filtering, editing, and questioning AI's suggestions. For example, P16 created a funnel chart (Fig. 3, P16) to visualize how AI-generated insights undergo multiple layers of scrutiny. P17 depicted a diagram that showcases the bidirectional information exchange between humans and AI (Fig. 3, P17(2)). Although not visually represented, P17 expressed interest in tracing which AI-provided inputs progressed to subsequent stages and their ultimate impact. Similarly, P07 constructed a bar chart to display the frequency with which she questioned AI's suggestions, asserting that a higher frequency of questioning indicated more critical thinking (Fig. 3, P09).

Moreover, some students emphasized their wish to highlight the diversity of opinions considered when incorporating AI's suggestions. For example, P14 stated, "I want to show that I am synthesizing multiple AIs' suggestions. For example, I use ChatGPT for initial ideas and then turn to New Bing for additional perspectives."

Finally, some students emphasized the need to display the iterative process of blending AI suggestions with student inputs, which demonstrates mutual enhancement in projects. For example, P07 used a flowchart (Fig. 3, P07) to show how students and AI collaboratively refine a model, with neither party's ideas being used without the other's feedback.

4.3.5 Quantifying Student Development through Human-AI Interaction. Our participants thought that data from student-AI interactions could offer a valuable understanding of how students' skills evolve throughout the project. One aspect examined is the development of students' skills in using AI. P02 expressed a desire to demonstrate how she got useful assistance from AI through step-by-step inquiries, illustrating her gradual mastery of effective AI usage.

Moreover, some students were aware that relying too heavily on AI for specific tasks could potentially hinder their skill development; as a result, they desired metrics to quantify such effects. P17 used a bar chart to capture the accumulated negative impact of adopting AI in the project across time (Fig. 3, P17(1)). She explained that "there would be scores for skills such as creative thinking, and whenever the student chooses to complete some tasks using AI, there would be some deduction [to the scores]." P11 designed a similar chart, but instead of a deduction, she would like the score to increase whenever the student did something manually or had rich interaction with AI, such as many follow-up questions, on a certain task.
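The two designs differ only in which events move the score and in which direction. The sketch below renders both as running traces over a shared event stream; the event names, weights, and starting score are invented for illustration, as neither participant specified numbers.

```python
# P17: deduct from a skill score whenever a task is delegated to AI.
P17_DEDUCTIONS = {"delegated_to_ai": -1.0}
# P11: credit manual work and rich interaction with AI instead.
P11_CREDITS = {"manual_work": +1.0, "followup_question": +0.5}

def score_trace(events, weights, start=10.0):
    """Accumulate a per-skill score over project events, suitable for a chart."""
    score, trace = start, []
    for event in events:
        score += weights.get(event, 0.0)
        trace.append(score)
    return trace

events = ["manual_work", "delegated_to_ai", "followup_question",
          "delegated_to_ai", "manual_work"]
print("P17-style:", score_trace(events, P17_DEDUCTIONS))
print("P11-style:", score_trace(events, P11_CREDITS))
```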
4.3.6 AI Impact on Student-Student Collaboration. PBL often involves teamwork, and several participants indicated that the integration of AI might affect collaboration, warranting assessment. For example, P03 and P18 suggested that the ease of communicating with AI might discourage students from actively communicating with human teammates, leaving them fewer chances to practice their collaboration skills. In member checking, three other participants found this point resonated with their experience. P04 added, "I prefer asking AI for assistance first, then share the results with teammates for discussion." P03 and P18 recommended an analysis that contrasts the frequency and quality of student-student communication against student-AI communication.

Additionally, P01 suspected that divergent attitudes toward AI within teams would result in conflicts, which should be identified in the analysis. In member checking, P04 and P16 indicated they had experienced such conflicts in their projects. P16 mentioned that:

We generally agree to use AI for topic selection and framework building. However, some team members disagree with using AI to generate content due to quality and integrity concerns.

Despite the need for analysis, participants did not develop a framing idea for exposing the AI impact on collaboration, which is worth future research.

4.3.7 Students' Ethical Awareness in Using AI. A few participants suggested that whether students used AI responsibly was worth analyzing, for example, obeying regulations (P12) and respecting people's privacy (P10). The framing idea was mainly to post documentation of the AI that students use. P16 described a framing idea: "Suppose I used AI to draw a picture, but the AI's training data that support its drawing were from several painters; it would be nice to have a tree diagram of the source of this intellectual property."
4.4 Different Envisioned Roles of AI
We noticed distinct differences among participants regarding their design goals and final reporting frameworks for analyzing their AI usage. Upon analyzing their rationales during the workshop and the member-check results, we identified three beliefs students hold about the role of AI, each of which significantly influenced how participants analyzed and framed AI usage data:

• AI as a tool. Some participants viewed AI as a mechanism to augment human abilities. Statements like "AI should not replace humans in execution" (P05, P14) and "AI should only handle trivial tasks" (P09) were shared among this group. These individuals were generally interested in highlighting their "leadership in directing the project," often through lower levels of AI engagement.
• AI as a teammate. Another group of participants (e.g., P02, P04, P13, P18) saw AI more as a collaborator. For them, the overarching goal was to complete the project effectively as a team. As such, they questioned the necessity of separating human traits from AI interaction and considered the final artifacts of the project to be weighted much more than the student-AI interaction process. P13 commented: "I think a good human-AI relationship should involve a blended, mutual engagement, so differentiating our work from AI's may not be necessary or desirable."
• AI as an expert. A third group (e.g., P11, P16) saw AI more as an expert resource they could consult, albeit one whose advice could be subjective, biased, or misleading. P11 noted:

When AI becomes almost perfect, it develops its own 'thoughts' or 'goals.' [...] As a result, I could end up losing my original focus.

For these participants, traits like critical thinking were essential. They believed reports must assess how cautiously students integrated suggestions from these AI experts.

4.5 Impacts of Scenarios
In Activities 2 & 3 of our workshop, we encouraged participants to reflect on how AI usage might be analyzed across three contexts: instructor assessment, job-seeking evaluations, and self-assessment. Participants generally advocated for a holistic, in-depth analysis of AI usage for instructor assessments to inform learning assessments. In contrast, when considering job-seeking, the emphasis shifted towards showcasing efficiency in leveraging AI technologies.

For self-assessment, the focus generally turned to empowering reflection. P16 categorized his AI interactions by purpose, i.e., facilitating learning versus merely serving project goals, and included only the former in his self-assessment report. P07, meanwhile, advocated for integrating AI usage data with personal metrics like emotions and heart rate, arguing that this would enrich reflective practices, which aligns with previous research on fostering self-reflection [38, 69].

4.6 Concerns of Reporting Students' AI Usage to Enable Assessment
Most participants acknowledged the value of analyzing students' interaction with AI for assessment purposes. However, two concerns were raised.

First, participants were concerned about the fairness of such evaluation add-ons. Based on prior experiences with GenAI tools, participants pointed out that students faced difficulties in critically evaluating suggestions from powerful AI. P07 noted that students might not be "thoughtlessly accepting AI's suggestions" but could be settling for them because the suggestions are good enough and better alternatives are lacking. However, if instructors rely only on student-AI interactions for assessment, such students may receive unfairly low scores on critical thinking under the assumption that they over-rely on AI. P17 raised an additional concern that modifying AI-generated content could be mistakenly attributed to a student's critical and creative thinking when the modification only reflects "the student's external, contextual knowledge that the AI lacks." The inherent limitations in AI's sensing and understanding could inadvertently lead to unwarranted accolades for students without careful inspection.

Second, some participants (P04, P17, P18) considered that students might "hack" their way to a "beautiful" report of AI usage. P17 mentioned that students were likely to change their learning behavior to cater to the kind of AI usage report that instructors recognize as better.
5 DISCUSSION
Our workshops provided valuable insights into students' use of AI in future Project-Based Learning (PBL). We found various ways students might use AI and, from the students' perspective, potential shifts in learning goals. Our participants generally believed that whether students can effectively use AI would be an important future assessment criterion. More importantly, participants suggested that student-AI interaction data can not only augment traditional assessments, through approaches such as linking project artifacts to specific AI usage, but also offer a window into students' higher-order thinking skills and their skills in effectively using AI. However, our analysis also revealed nuances, such as varied student beliefs about AI's role in learning, which in turn influence their engagement with the technology. Students also raised practical concerns regarding analyzing students' use of AI to understand student learning, including fairness and the potential for hacking behavior by students. In this section, by triangulating these findings with existing literature, we identify new research opportunities for education and HCI researchers in student-AI interaction and in the tracking and sensemaking of students' use of AI. This section also discusses the generalizability of our results, the limitations of our study, and our future work.

5.1 Research Opportunities on Student-AI Interaction
5.1.1 How do students' perceptions of AI roles influence educational interactions? Our research revealed a diversity of opinions among participants regarding the roles AI should assume, ranging from a tool to a teammate or an expert. These roles significantly influence their conjectures on AI utilization and the subsequent analysis of such use. Previous HCI research has explored various potential roles for AI, including those of an "assistant" [30, 96], "mediator" [25, 30], or "equal decision-maker" [97]. However, that discussion focuses primarily on the implications of the designers' framing of AI roles for end-users. With the evolution of AI towards serving more general purposes [83], users have much more autonomy in using AI in their desired way. Our findings suggest that users' beliefs about what roles AI should play also matter, which deserves future research on the broader impacts. For example, in the educational contexts examined in this study, mismatched beliefs about AI's role between students or between students and teachers may create conflicts or result in ineffective pedagogical designs.
5.1.2 How can we tailor AI for students to use in PBL? In our workshops, we encouraged students to envision utilizing any AI tools in PBL. However, another potential future learning environment involves students using AI that is specifically fine-tuned for education purposes, as suggested by the development of domain-specific GenAIs [87, 95]. Our study findings reveal potential friction when students use powerful general-purpose AI and suggest directions for fine-tuning future student-facing AI tools for PBL. Participants expressed concerns that powerful AI threatens the fairness of assessments based on student-AI interaction data, since students might have limited abilities to judge AI's outputs and may merely accept them without question. Our findings echo the call for adapting AI for educational usage [49]. Future work can explore more student-centered designs of AI, for example, personalized student-facing AI tools that align with students' capabilities, or AI systems that scaffold responses based on the student's skill level, offering guidance or direct assistance as appropriate.

5.1.3 How can we support self-regulated learning in AI-enhanced environments? Self-regulated learning (SRL), defined as learners actively controlling their learning process [99], is integral to PBL and other problem-based learning activities [31, 100]. While AI tools might offer valuable feedback, there is a risk that students' over-reliance on AI could impede critical SRL steps such as self-assessment and the independent adjustment of learning strategies [99]. Acknowledging this, our study participants suggested emphasizing the analysis of how students incorporate AI's outputs to assess whether students are using AI inappropriately (Sec. 4.3.4). They also suggested monitoring students' interactions with AI (Sec. 4.3.5), which is relevant to the self-monitoring concept in SRL [99]. Similar to previous research [69, 75], our participants' designs aim to promote documentation and learning analysis practices that support SRL. Future research should empirically examine the impacts of AI-enhanced environments on SRL and investigate the effects of documentation and learning analytics on students' AI reliance and autonomy in learning.

5.1.4 How should education practitioners balance the goals of effective use of AI and active learning in future PBL? PBL engages students in solving real-world problems, but there is a risk that students may fall into a situation where the "doing" of a project takes precedence over "doing with understanding" [6]. In previous PBL, these two goals have had the potential to complement each other, as succeeding in practical tasks generally requires students to develop specific skill sets. However, the advent of AI technologies adds a layer of complexity. Many participants considered practicing and demonstrating skills in effectively utilizing AI important for future PBL (Sec. 4.2), and they considered that tasks AI can do better should be delegated to AI. These ideas echo previous research on effective human-AI collaboration in the workplace [34, 73, 96]. In this way, the growing capability of AI suggests students would take an oversight position for many tasks in PBL, including some that require creativity and critical thinking and that would help students understand knowledge better. However, PBL's foundation lies in constructivist learning theories, where students learn through active engagement [9, 32]. Task delegation to AI can bypass these critical active learning steps; to this end, the goal of effectively using AI could harm students' active learning. Future research should investigate how to balance these two goals. One opportunity is to instruct students to use AI in ways that let them actively construct knowledge. For example, many participants mentioned that students should spend time carefully crafting inputs and guiding AI to get effective assistance. Future research can study whether, in this input crafting and engagement process, students can "construct and reconstruct" knowledge mentally and actively learn from the process.

5.1.5 What are the impacts of AI on communication in education? Some participants wanted to use AI to partially, if not totally, replace the instructor's position in PBL, such as providing feedback on solutions and guiding students to learn. Such AI usage might not be appropriate: although PBL is student-centered, instructors still play a significant role in it [31]. Without adequate student-instructor communication, students might learn in a direction that does not match the curriculum and instructors' teaching plans. Besides, AI could also impact student-student communication (Sec. 4.3.6). Future research should consider more comprehensively examining the effects of AI on educational communication, especially with longitudinal study designs.
5.2 Research Opportunities on Tracking and Sensemaking of Students' Use of AI
5.2.1 How can we support the collection of data around students' use of AI? The first step in analyzing students' AI usage is to collect relevant data. While the interaction log between students and AI serves as the most direct data, our findings point to several other types of data worth collecting from students' AI usage, including the following (a minimal record format bundling these types is sketched at the end of this subsection):

• Contexts when using AI. Our study shows the analysis needs to differ based on when and why students use AI. For example, when AI is used for automating tasks, the focus would be on the types of learning tasks managed by AI and the student's proficiency with AI. When AI is used for feedback on solutions, participants expect to examine the detailed process of how students incorporate AI suggestions.
• Students' thoughts and actions around AI's suggestions. The analysis theme favored by our participants, "the process of incorporating AI's ideas into the project," requires examination of students' thoughts and actions.
• The lineage from students' AI usage to their solutions. The solutions that students come up with, such as artifacts in PBL, are still considered important assessment materials. Connecting students' AI usage to the corresponding parts of the solution might help education practitioners understand students' contributions.

It is non-trivial to collect these data. The first two types might need input from students. Previous HCI research has studied how students document their artifact-based learning data [75] or multi-modal learning data [69] and how interactive tools might help with the documentation. Future research can look into how students document their motivations and thoughts when using AI during learning, what specific challenges students encounter, and what designs of documentation tools can be helpful. Moreover, research on information provenance through interactions [23, 41] can provide insights for collecting the third data type. For example, future research can explore how to reify the transformation from AI outputs to solutions.
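As a concrete, purely illustrative example, the sketch below bundles the three data types into one log record; every field name here is our own assumption rather than a format suggested in the workshops.

```python
from dataclasses import dataclass, field

@dataclass
class AIUsageRecord:
    """One logged AI usage, bundling the three data types discussed above.
    All field names are illustrative assumptions, not a proposed standard."""
    # (1) Context: when and why the student turned to AI
    timestamp: str
    task: str
    purpose: str                 # e.g., "automate", "feedback", "ideation"
    # (2) Thoughts and actions around the AI's suggestion
    student_reflection: str      # free-text rationale, self-documented
    action: str                  # e.g., "adopted", "edited", "rejected"
    # (3) Lineage: where the output ended up in the solution
    artifact_spans: list[tuple[str, int, int]] = field(default_factory=list)

record = AIUsageRecord(
    timestamp="2024-03-01T10:05",
    task="prototype coding",
    purpose="automate",
    student_reflection="Asked AI for boilerplate; rewrote the error handling.",
    action="edited",
    artifact_spans=[("main.py", 40, 120)],  # file, start line, end line
)
print(record)
```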
5.2.2 How can we make sense of AI's contribution based on student-AI interaction data? Students considered making sense of AI's contribution to the project essential. Some participants provided various ideas on methods for computing AI's contribution, but others suspected that the evolving, complex human-AI interaction would make it difficult to disentangle the contributions of the two parties. These findings echo an early discussion of human-AI symbiosis: Licklider [39] conjectures that it is difficult to separate the contributions of humans and AI in decision-making, but also notes that, overall, humans should provide the leading contributions by doing tasks such as goal setting and judgment. Future research should further explore signals of whether students are in the leading position when collaborating with AI. We believe these signals should not necessarily be single values, such as percentages, as many participants imagined (e.g., Fig. 3, P11). One might study how to gather qualitative and quantitative evidence from student-AI interaction data on whether students are leading their projects relative to AI, and invite education practitioners to engage in more comprehensive sensemaking of students' and AI's contributions.

5.2.3 How can we support sensemaking of students' use of AI from multiple perspectives? Our study provides insights into the diverse lenses one can adopt to analyze students' use of AI. Moreover, our study reveals intriguing complexities regarding the values students attach to using AI, which significantly impact the sensemaking of the analysis results. The diversity of values aligns with and amplifies the "one chart, many meanings" consideration in learning analytics [1].

For example, the analysis of question-and-answer rounds and time spent communicating with AI serves divergent purposes for different student groups. One faction sees fewer rounds and shorter time as evidence of students' efficient mastery over AI. Conversely, another group interprets more rounds and longer time as indicative of careful, critical engagement with AI's suggestions. Likewise, students understand the pie charts showing AI's impact on project results differently. Some participants, like P09 and P12, aim for a smaller AI contribution arc to highlight their significant human-led efforts. Others, such as P02 and P16, aspire to demonstrate a larger AI contribution to showcase their ability to fully leverage AI capabilities.

Another nuanced example is found in the analysis of task allocation between students and AI (see Sec. 4.3.1). One group of students aims to analyze whether focusing on certain tasks leads to better learning. Another group uses the data to show that humans and AI are suited to different tasks. As a result, while both groups agree on using AI for repetitive tasks and reserving creative and decision-making roles for humans, the first group values this division for its educational benefits, while the second sees it as a practicality arising from AI's current limits. However, as AI evolves to become more personalized, context-sensitive, and creative, the perspective of the second group suggests that roles involving critical thinking, decision-making, and creativity may increasingly be transferred to AI (P04 and P16 already show such a tendency), which conflicts with the educational ideals of the first group.

In the above cases, we do not seek to adjudicate which values are more "correct" or beneficial. However, such diversity underscores the need for education practitioners to interpret students' use of AI carefully. Future research should examine the interpretation space and how to support fair and comprehensive sensemaking of students' use of AI. For example, one may study how to better involve students themselves in the interpretation, considering that students' self-assessments are widely considered essential for successful PBL [76].

5.2.4 How can we motivate students to document their use of AI and report it honestly? Our participants admitted that if their use of AI were considered one of the ways to assess their learning, they would probably hack up a nice report of AI usage to get a higher grade. This matches teachers' expectations that students might not tell how they use AI honestly [36]. Future research can study how we can motivate students to document and report their use of AI faithfully. The literature provides some potential directions. First, education practitioners might leverage various methods to communicate to students the benefits of faithful documentation and reporting of AI usage. Xia et al. [89] proposed using visualization to nudge students to reflect on their behavior of "gaming the system." In our case, one might communicate to students how their AI usage might negatively impact their learning, how documentation might counterbalance that, and how an honest report can help instructors provide better instruction. Second, the assessment of students should not be based only on their reports of AI usage. Instructors might emphasize that such reports are used to understand students' learning and that the final assessment is made by synthesizing multiple factors. Overall, we propose that the documentation of students' use of AI be framed as helping students learn better rather than as a grading tool, to motivate them in documenting and reporting.
5.3 Limitation & Future Work
This paper presents a qualitative investigation into the potential future of students' use of AI in PBL based on workshops with 18 college students. Our participants come from four East Asian institutions and have diverse major backgrounds. Given the qualitative nature of our study, we cannot assert that our findings generalize to a broad range of scenarios (e.g., PBL in courses not engaged by our participants) or to a larger population (e.g., students from other institutions) in a statistical-probabilistic sense, nor can we ensure their applicability over extended periods [74]. Nevertheless, the qualitative approach of this paper lets us dive into a learning scenario growing in importance with the rapid development of AI (i.e., AI-enhanced PBL and its assessment), provide in-depth insights into students' beliefs and needs, and motivate relevant future research. Future research could build upon our work by quantitatively examining our findings, including the effectiveness of various analytic designs, students' anticipated roles for AI, and the influence of scenarios on students' needs, using larger student samples and more extended study periods.

Our study also presents several additional limitations. First, the format of our investigation was limited to 3-hour workshops, while PBL usually extends over weeks or even months. While we prompted participants to draw upon their prior long-term PBL experiences for our activities, a longitudinal study involving actual PBL settings is a promising next step. Such a study could yield deeper insights into how students would like to interact with AI and to analyze and present AI usage data. Second, the co-design activities in our study were based on hypothetical AI usage, driven by our aim for generalizability in light of rapidly advancing technology. However, hands-on experience with AI in PBL is invaluable for generating more nuanced perspectives on how AI can be leveraged. As an extension of our current work, we envision encouraging students to employ existing AI tools in the aforementioned long-term study while speculating on desired future capabilities. Lastly, our workshops primarily focused on eliciting student perspectives. Incorporating the viewpoints of educators by exposing them to the findings of our workshops could provide a more comprehensive understanding and assessment of students' AI usage suggestions.

6 CONCLUSION
In conclusion, this paper presented a co-design study exploring the potential of utilizing students' AI usage data to understand student learning in project-based learning (PBL). The study provided insights into the opportunities and challenges of analyzing students' AI usage data. Participants envisioned how they would use AI in future PBL and highlighted the impact of AI on assessment transformation. They proposed various designs for analyzing students' use of AI to examine students' skills, decision-making processes, and ethical awareness in using AI. We also found that different students hold different beliefs about the role AI should play in their projects, from a tool that augments their abilities to a teammate or expert, and such beliefs impact how they want to use AI and report their AI usage. This research contributes to the HCI community by offering insights into future practices related to AI usage in education and by informing the design of AI education systems, project documentation tools, and learning analytics systems. It advances our understanding of how AI can shape student learning and assessment in PBL contexts.

ACKNOWLEDGMENTS
This work is supported by the 30 for 30 Research Initiative Scheme (project no. 3030_003) from the Hong Kong University of Science and Technology. Zhenhui Peng is supported by the Young Scientists Fund of the National Natural Science Foundation of China under Grant No. 62202509. We are grateful to the anonymous reviewers for their insightful feedback and to our workshop participants for their essential role in facilitating this research. Last but not least, we appreciate Lennart Nacke's insightful input during the revision process and Cayley MacArthur and Marvin Pafla's insightful review of our work at the University of Waterloo's HCI group meeting.
REFERENCES
[1] June Ahn, Fabio Campos, Maria Hays, and Daniela DiGiacomo. 2019. Designing in Context: Reaching beyond Usability in Learning Analytics Dashboard Design. Journal of Learning Analytics 6, 2 (2019), 70–85.
[2] Carlos Prieto Alvarez, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2020. LA-DECK: A card-based learning analytics co-design tool. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge. 63–72.
[3] James Auger. 2010. Alternative Presents and Speculative Futures: Designing fictions through the extrapolation and evasion of product lineages. Negotiating Futures–Design Fiction 6 (2010), 42–57.
[4] Benjamin Bach, Mandy Keck, Fateme Rajabiyazdi, Tatiana Losev, Isabel Meirelles, Jason Dykes, Robert S Laramee, Mashael AlKadi, Christina Stoiber, Samuel Huron, et al. 2023. Challenges and Opportunities in Data Visualization Education: A Call to Action. arXiv preprint arXiv:2308.07703 (2023).
[5] Suyun Sandra Bae, Oh-Hyun Kwon, Senthil Chandrasegaran, and Kwan-Liu Ma. 2020. Spinneret: Aiding creative ideation through non-obvious concept associations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–13.
[6] Brigid JS Barron, Daniel L Schwartz, Nancy J Vye, Allison Moore, Anthony Petrosino, Linda Zech, and John D Bransford. 1998. Doing with understanding: Lessons from research on problem- and project-based learning. Journal of the Learning Sciences 7, 3-4 (1998), 271–311.
[7] Stephanie Bell. 2010. Project-based learning for the 21st century: Skills for the future. The Clearing House 83, 2 (2010), 39–43.
[8] Linda Birt, Suzanne Scott, Debbie Cavers, Christine Campbell, and Fiona Walter. 2016. Member checking: a tool to enhance trustworthiness or merely a nod to validation? Qualitative Health Research 26, 13 (2016), 1802–1811.
[9] Phyllis C Blumenfeld, Elliot Soloway, Ronald W Marx, Joseph S Krajcik, Mark Guzdial, and Annemarie Palincsar. 1991. Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist 26, 3-4 (1991), 369–398.
[10] Virginia Braun and Victoria Clarke. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3, 2 (2006), 77–101.
[11] Cecilia Ka Yuk Chan and Wenjie Hu. 2023. Students' Voices on Generative AI: Perceptions, Benefits, and Challenges in Higher Education. arXiv preprint arXiv:2305.00290 (2023).
[12] Cheng-Huan Chen and Yong-Cih Yang. 2019. Revisiting the effects of project-based learning on students' academic achievement: A meta-analysis investigating moderators. Educational Research Review 26 (2019), 71–81.
[13] Zhutian Chen, Chenyang Zhang, Qianwen Wang, Jakob Troidl, Simon Warchol, Johanna Beyer, Nils Gehlenborg, and Hanspeter Pfister. 2023. Beyond Generating Code: Evaluating GPT on a Data Visualization Course. arXiv preprint arXiv:2306.02914 (2023).
[14] EunJeong Cheon, Stephen Tsung-Han Sher, Šelma Sabanović, and Norman Makoto Su. 2019. I beg to differ: Soft conflicts in collaborative design using design fictions. In Proceedings of the 2019 on Designing Interactive Systems Conference. 201–214.
[15] EunJeong Cheon and Norman Makoto Su. 2018. Futuristic autobiographies: Weaving participant narratives to elicit values around robots. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. 388–397.
[16] Nassim Dehouche and Kullathida Dehouche. 2023. What's in a text-to-image prompt? The potential of stable diffusion in visual arts education. Heliyon (2023).
[17] Chris Elsden, Bettina Nissen, Andrew Garbett, David Chatting, David Kirk, and John Vines. 2016. Metadating: exploring the romance and future of personal data. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 685–698.
[18] Joel E Fischer. 2023. Generative AI Considered Harmful. (2023).
[19] Greg Guest, Kathleen M MacQueen, and Emily E Namey. 2011. Applied Thematic Analysis. Sage Publications.
[20] Biyang Guo, Xin Zhang, Ziyuan Wang, Minqi Jiang, Jinran Nie, Yuxuan Ding, Jianwei Yue, and Yupeng Wu. 2023. How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection. arXiv:2301.07597 [cs.CL]
[21] Pengyue Guo, Nadira Saab, Lysanne S Post, and Wilfried Admiraal. 2020. A review of project-based learning in higher education: Student outcomes and measures. International Journal of Educational Research 102 (2020), 101586.
[22] Reza Hadi Mogavi, Chao Deng, Justin Juho Kim, Pengyuan Zhou, Young D. Kwon, Ahmed Hosny Saleh Metwally, Ahmed Tlili, Simone Bassanelli, Antonio Bucchiarone, Sujit Gujar, Lennart E. Nacke, and Pan Hui. 2024. ChatGPT in education: A blessing or a curse? A qualitative study exploring early adopters' utilization and perceptions. Computers in Human Behavior: Artificial Humans 2, 1 (2024), 100027. https://doi.org/10.1016/j.chbah.2023.100027
[23] Han L Han, Junhang Yu, Raphael Bournet, Alexandre Ciorascu, Wendy E Mackay, and Michel Beaudouin-Lafon. 2022. Passages: interacting with text across documents. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–17.
[24] Avneet Hira and Emma Anderson. 2021. Motivating online learning through project-based learning during the 2020 COVID-19 pandemic. IAFOR Journal of Education 9, 2 (2021), 93–110.
[25] Maurice Jakesch, Megan French, Xiao Ma, Jeffrey T Hancock, and Mor Naaman. 2019. AI-mediated communication: How the perception that profile text was written by AI affects trustworthiness. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–13.
[26] Martin Jonsson and Jakob Tholander. 2022. Cracking the code: Co-coding with AI in creative programming education. In Proceedings of the 14th Conference on Creativity and Cognition. 5–14.
[27] Enkelejda Kasneci, Kathrin Seßler, Stefan Küchemann, Maria Bannert, Daryna Dementieva, Frank Fischer, Urs Gasser, Georg Groh, Stephan Günnemann, Eyke Hüllermeier, et al. 2023. ChatGPT for good? On opportunities and challenges of large language models for education. Learning and Individual Differences 103 (2023), 102274.
[28] Majeed Kazemitabaar, Justin Chow, Carl Ka To Ma, Barbara J Ericson, David Weintrop, and Tovi Grossman. 2023. Studying the effect of AI Code Generators on Supporting Novice Learners in Introductory Programming. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–23.
[29] Ahmed Kharrufa, Sally Rix, Timur Osadchiy, Anne Preston, and Patrick Olivier. 2017. Group Spinner: recognizing and visualizing learning in the classroom for reflection, communication, and planning. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. 5556–5567.
[30] Taenyun Kim, Maria D Molina, Minjin Rheu, Emily S Zhan, and Wei Peng. 2023. One AI Does Not Fit All: A Cluster Analysis of the Laypeople's Perception of AI Roles. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–20.
[31] Dimitra Kokotsaki, Victoria Menzies, and Andy Wiggins. 2016. Project-based learning: A review of the literature. Improving Schools 19, 3 (2016), 267–277.
[32] Joseph S Krajcik and Phyllis C Blumenfeld. 2006. Project-based learning. na.
[33] Tiffany H Kung, Morgan Cheatham, Arielle Medenilla, Czarina Sillos, Lorie De Leon, Camille Elepaño, Maria Madriaga, Rimel Aggabao, Giezel Diaz-Candido, James Maningo, et al. 2023. Performance of ChatGPT on USMLE: Potential for AI-assisted medical education using large language models. PLOS Digital Health 2, 2 (2023), e0000198.
[34] Vivian Lai, Samuel Carton, Rajat Bhatnagar, Q Vera Liao, Yunfeng Zhang, and Chenhao Tan. 2022. Human-AI collaboration via conditional delegation: A case study of content moderation. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–18.
[35] Darren Hayes Lamb. 2003. Project based learning in an applied construction curriculum. (2003).
[36] Sam Lau and Philip J Guo. 2023. From "Ban It Till We Understand It" to "Resistance is Futile": How University Programming Instructors Plan to Adapt as More Students Use AI Code Generation and Explanation Tools such as ChatGPT and GitHub Copilot. (2023).
[37] Yen-Fen Lee, Gwo-Jen Hwang, and Pei-Ying Chen. 2022. Impacts of an AI-based chatbot on college students' after-class review, academic performance, self-efficacy, learning attitude, and motivation. Educational Technology Research and Development 70, 5 (2022), 1843–1865.
[38] Ian Li, Anind Dey, Jodi Forlizzi, Kristina Höök, and Yevgeniy Medynskiy. 2011. Personal informatics and HCI: design, theory, and social implications. In CHI'11 Extended Abstracts on Human Factors in Computing Systems. 2417–2420.
[39] Joseph CR Licklider. 1960. Man-computer symbiosis. IRE Transactions on Human Factors in Electronics 1 (1960), 4–11.
[40] Joseph Lindley and Paul Coulton. 2015. Back to the future: 10 years of design fiction. In Proceedings of the 2015 British HCI Conference. 210–211.
[41] Siân E Lindley, Gavin Smyth, Robert Corish, Anastasia Loukianov, Michael Golembewski, Ewa A Luger, and Abigail Sellen. 2018. Exploring new metaphors for a networked world through the file biography. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–12.
[42] Duri Long and Brian Magerko. 2020. What is AI literacy? Competencies and design considerations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–16.
[43] Michal Luria. 2023. Co-Design Perspectives on Algorithm Transparency Reporting: Guidelines and Prototypes. In Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency. 1076–1087.
[44] Shuai Ma, Taichang Zhou, Fei Nie, and Xiaojuan Ma. 2022. Glancee: An adaptable system for instructors to grasp student learning status in synchronous online classes. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–25.
[45] David Mhlanga. 2023. Open AI in education, the responsible and ethical use of ChatGPT towards lifelong learning. Education, the Responsible and Ethical Use of ChatGPT Towards Lifelong Learning (February 11, 2023) (2023).
[46] Midjourney. 2023. Midjourney. https://www.midjourney.com/
[47] Reza Hadi Mogavi, Xiaojuan Ma, and Pan Hui. 2021. Characterizing student engagement moods for dropout prediction in question pool websites. arXiv preprint arXiv:2102.00423 (2021).
[48] Ethan Mollick and Lilach Mollick. 2023. Assigning AI: Seven Approaches for Students, with Prompts. arXiv preprint arXiv:2306.10052 (2023).
[49] Emiliana Murgia, Zahra Abbasiantaeb, Mohammad Aliannejadi, Theo Huibers, Monica Landoni, and Maria Soledad Pera. 2023. ChatGPT in the Classroom: A Preliminary Exploration on the Feasibility of Adapting ChatGPT to Support Children's Information Discovery. In Adjunct Proceedings of the 31st ACM Conference on User Modeling, Adaptation and Personalization. 22–27.
[50] Michael Neumann, Maria Rauschenberger, and Eva-Maria Schön. 2023. "We Need To Talk About ChatGPT": The Future of AI and Higher Education. (2023).
[51] University of Cambridge. 2023. Retrieved September 11, 2023 from https://www.plagiarism.admin.cam.ac.uk/what-academic-misconduct/artificial-intelligence
[52] The University of Hong Kong. 2023. Retrieved September 11, 2023 from https://tl.hku.hk/2023/02/about-chatgpt/
[53] University of Oxford. 2023. Unauthorised use of AI in exams and assessment. Retrieved September 11, 2023 from https://academic.admin.ox.ac.uk/article/unauthorised-use-of-ai-in-exams-and-assessment
[54] The Hong Kong University of Science and Technology. 2023. Retrieved September 11, 2023 from https://cei.hkust.edu.hk/en-hk/education-innovation/generative-ai-education/guidelines-and-policies
[55] University of Washington. 2023. ChatGPT and other AI-based tools. Retrieved September 11, 2023 from https://teaching.washington.edu/course-design/chatgpt/
[56] Judith S Olson and Wendy A Kellogg. 2014. Ways of Knowing in HCI. Vol. 2. Springer.
[57] OpenAI. 2023. ChatGPT. https://openai.com/chatgpt
[58] OpenAI. 2023. Educator FAQ. Retrieved September 11, 2023 from https://help.openai.com/en/collections/5929286-educator-faq
[59] Zachary A Pardos, Matthew Tang, Ioannis Anastasopoulos, Shreya K Sheel, and Ethan Zhang. 2023. OATutor: An Open-source Adaptive Tutoring System and Curated Content Library for Learning Sciences Research. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–17.
[60] Zhenhui Peng, Xingbo Wang, Qiushi Han, Junkai Zhu, Xiaojuan Ma, and Huamin Qu. 2023. Storyfier: Exploring Vocabulary Learning Support with Text Generation Models. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology. 1–16.
[61] Beatriz Pérez and Ángel L Rubio. 2020. A project-based learning approach for enhancing learning skills and motivation in software engineering. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education. 309–315.
[62] Lara Piccolo, Daniel Buzzo, Martin Knobel, Prasanna Gunasekera, and Tina Papathoma. 2023. Interaction Design as Project-Based Learning: Perspectives for Unsolved Challenges. In Proceedings of the 5th Annual Symposium on HCI Education. 59–67.
[63] Carlos G Prieto-Alvarez, Roberto Martinez-Maldonado, and Theresa Dirndorfer Anderson. 2018. Co-designing learning analytics tools with learners. In Learning Analytics in the Classroom. Routledge, 93–110.
[64] Carlos Gerardo Prieto-Alvarez, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2018. Mapping learner-data journeys: Evolution of a visual co-design tool. In Proceedings of the 30th Australian Conference on Computer-Human Interaction. 205–214.
[65] Junaid Qadir. 2023. Engineering education in the era of ChatGPT: Promise and pitfalls of generative AI for education. In 2023 IEEE Global Engineering Education Conference (EDUCON). IEEE, 1–9.
[66] Tareq Rasul, Sumesh Nair, Diane Kalendra, Mulyadi Robin, Fernando de Oliveira Santini, Wagner Junior Ladeira, Mingwei Sun, Ingrid Day, Raouf Ahmad Rather, and Liz Heathcote. 2023. The role of ChatGPT in higher education: Benefits, challenges, and future research directions. Journal of Applied Learning and Teaching 6, 1 (2023).
[67] Dan Richardson and Ahmed Kharrufa. 2020. We are the greatest showmen: Configuring a framework for project-based mobile learning. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–12.
[68] Danae Romrell, Lisa Kidder, and Emma Wood. 2014. The SAMR model as a framework for evaluating mLearning. Online Learning Journal 18, 2 (2014).
[69] Ethan Z Rong, Mo Morgana Zhou, Ge Gao, and Zhicong Lu. 2023. Understanding Personal Data Tracking and Sensemaking Practices for Self-Directed Learning in Non-classroom and Non-computer-based Contexts. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–16.
[70] Jürgen Rudolph, Samson Tan, and Shannon Tan. 2023. ChatGPT: Bullshit spewer or the end of traditional assessments in higher education? Journal of Applied Learning and Teaching 6, 1 (2023).
[71] Juan Pablo Sarmiento and Alyssa Friend Wise. 2022. Participatory and co-design of learning analytics: An initial review of the literature. In LAK22: 12th International Learning Analytics and Knowledge Conference. 535–541.
[72] Douglas Schuler and Aki Namioka. 1993. Participatory Design: Principles and Practices. CRC Press.
[73] Chuhan Shi, Yicheng Hu, Shenan Wang, Shuai Ma, Chengbo Zheng, Xiaojuan Ma, and Qiong Luo. 2023. RetroLens: A Human-AI Collaborative System for Multi-step Retrosynthetic Route Planning. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–20.
[74] Brett Smith. 2018. Generalizability in qualitative research: Misunderstandings, opportunities and recommendations for the sport and exercise sciences. Qualitative Research in Sport, Exercise and Health 10, 1 (2018), 137–149.
[75] Sarah Sterman, Molly Jane Nicholas, Janaki Vivrekar, Jessica R Mindel, and Eric Paulos. 2023. Kaleidoscope: A Reflective Documentation Tool for a User Interface Design Course. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–19.
[76] John Thomas. 2000. A Review of Research on Project-Based Learning. (01 2000).
[77] Yao Tian, Chengwei Tong, Lik-Hang Lee, Reza Hadi Mogavi, Yong Liao, and Pengyuan Zhou. 2023. Last Week with ChatGPT: A Weibo Study on Social Perspective regarding ChatGPT for Education and Beyond. arXiv preprint arXiv:2306.04325 (2023).
[78] Rama Adithya Varanasi and Nitesh Goyal. 2023. "It is currently hodgepodge": Examining AI/ML Practitioners' Challenges during Co-production of Responsible AI Values. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–17.
[79] Dakuo Wang, Justin D Weisz, Michael Muller, Parikshit Ram, Werner Geyer, Casey Dugan, Yla Tausczik, Horst Samulowitz, and Alexander Gray. 2019. Human-AI collaboration in data science: Exploring data scientists' perceptions of automated AI. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (2019), 1–24.
[80] Qiaosi Wang, Shan Jing, and Ashok K Goel. 2022. Co-Designing AI Agents to Support Social Connectedness Among Online Learners: Functionalities, Social Characteristics, and Ethical Challenges. In Designing Interactive Systems Conference. 541–556.
[81] Qiaosi Wang, Michael Madaio, Shaun Kane, Shivani Kapania, Michael Terry, and Lauren Wilcox. 2023. Designing Responsible AI: Adaptations of UX Practice to Meet Responsible AI Challenges. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–16.
[82] Qiaosi Wang, Koustuv Saha, Eric Gregori, David Joyner, and Ashok Goel. 2021. Towards mutual theory of mind in human-AI interaction: How language reflects what students perceive about a virtual teaching assistant. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–14.
[83] Jason Wei, Yi Tay, Rishi Bommasani, Colin Raffel, Barret Zoph, Sebastian Borgeaud, Dani Yogatama, Maarten Bosma, Denny Zhou, Donald Metzler, et al. 2022. Emergent abilities of large language models. arXiv preprint arXiv:2206.07682 (2022).
[84] Jason Wei, Xuezhi Wang, Dale Schuurmans, Maarten Bosma, Fei Xia, Ed Chi, Quoc V Le, Denny Zhou, et al. 2022. Chain-of-thought prompting elicits reasoning in large language models. Advances in Neural Information Processing Systems 35 (2022), 24824–24837.
[85] Daniel Weitekamp, Erik Harpstead, and Ken R Koedinger. 2020. An interaction design for machine teaching to develop AI tutors. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–11.
[86] Jordan Wirfs-Brock, Sarah Mennicken, and Jennifer Thom. 2020. Giving voice to silent data: Designing with personal music listening history. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–11.
[87] Shijie Wu, Ozan Irsoy, Steven Lu, Vadim Dabravolski, Mark Dredze, Sebastian Gehrmann, Prabhanjan Kambadur, David Rosenberg, and Gideon Mann. 2023. BloombergGPT: A large language model for finance. arXiv preprint arXiv:2303.17564 (2023).
[88] Tongshuang Wu, Michael Terry, and Carrie Jun Cai. 2022. AI chains: Transparent and controllable human-AI interaction by chaining large language model prompts. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–22.
[89] Meng Xia, Yuya Asano, Joseph Jay Williams, Huamin Qu, and Xiaojuan Ma. 2020. Using information visualization to promote students' reflection on "gaming the system" in online learning. In Proceedings of the Seventh ACM Conference on Learning@Scale. 37–49.
[90] Meng Xia, Mingfei Sun, Huan Wei, Qing Chen, Yong Wang, Lei Shi, Huamin Qu, and Xiaojuan Ma. 2019. PeerLens: Peer-inspired interactive learning path planning in online question pool. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–12.
[91] Kexin Bella Yang, Vanessa Echeverria, Zijing Lu, Hongyu Mao, Kenneth Holstein, Nikol Rummel, and Vincent Aleven. 2023. Pair-Up: Prototyping Human-AI Co-orchestration of Dynamic Transitions between Individual and Collaborative Learning in the Classroom. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–17.
[92] Nur Yildirim, Changhoon Oh, Deniz Sayar, Kayla Brand, Supritha Challa, Violet Turri, Nina Crosby Walton, Anna Elise Wong, Jodi Forlizzi, James McCann, et al. 2023. Creating Design Resources to Scaffold the Ideation of AI Concepts. In Proceedings of the 2023 ACM Designing Interactive Systems Conference. 2326–2346.
[93] JD Zamfirescu-Pereira, Richmond Y Wong, Bjoern Hartmann, and Qian Yang. 2023. Why Johnny can't prompt: how non-AI experts try (and fail) to design LLM prompts. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–21.
[94] Ashley Ge Zhang, Yan Chen, and Steve Oney. 2023. VizProg: Identifying Misunderstandings By Visualizing Students' Coding Progress. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–16.
[95] Hongbo Zhang, Junying Chen, Feng Jiang, Fei Yu, Zhihong Chen, Jianquan Li, Guiming Chen, Xiangbo Wu, Zhiyi Zhang, Qingying Xiao, et al. 2023. HuatuoGPT, towards Taming Language Model to Be a Doctor. arXiv preprint arXiv:2305.15075 (2023).
[96] Chengbo Zheng, Dakuo Wang, April Yi Wang, and Xiaojuan Ma. 2022. Telling stories from computational notebooks: AI-assisted presentation slides creation for presenting data science work. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–20.
[97] Chengbo Zheng, Yuheng Wu, Chuhan Shi, Shuai Ma, Jiehui Luo, and Xiaojuan Ma. 2023. Competent but Rigid: Identifying the Gap in Empowering AI to Participate Equally in Group Decision-Making. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–19.
[98] Ce Zhou, Qian Li, Chen Li, Jun Yu, Yixin Liu, Guangjing Wang, Kai Zhang, Cheng Ji, Qiben Yan, Lifang He, et al. 2023. A comprehensive survey on pre-trained foundation models: A history from BERT to ChatGPT. arXiv preprint arXiv:2302.09419 (2023).
[99] Barry J Zimmerman. 2002. Becoming a self-regulated learner: An overview. Theory into Practice 41, 2 (2002), 64–70.
[100] Barry J Zimmerman and Magda Campillo. 2003. Motivating self-regulated problem solvers. The Psychology of Problem Solving (2003), 233–262.