The document is a project report for an AI Virtual Mouse developed by Jatin Kaushal and his team at Allenhouse Public School as part of their Class XII curriculum. The project utilizes eye tracking technology to control a computer mouse through eye movements, enhancing accessibility for individuals with physical disabilities. The report includes acknowledgments, team roles, project planning, and a communication plan detailing the collaborative efforts of the team members throughout the project.


CAPSTONE PROJECT (AI VIRTUAL MOUSE)
CLASS XII

Project Report
Submitted by:
Jatin Kaushal
Board Roll No -___________________

All India Senior Secondary Certificate Examination


Session 2024-25

Guided By
Mr. Vikas Tiwari
CERTIFICATE
This is to certify that Jatin Kaushal of Class XII, a student of
Allenhouse Public School, Rooma, Kanpur, has successfully
completed his Artificial Intelligence Capstone Project on the topic
"AI Virtual Mouse" under the guidance of Mr. Vikas Tiwari
during the year 2024-25.
I am satisfied with his initiative and efforts in the completion of the
project file as a part of the curriculum of the CBSE Class XII Examination.

SIGNATURE SIGNATURE
(PRINCIPAL) (SUBJECT TEACHER)

SIGNATURE
(EXAMINER)
ACKNOWLEDGEMENT

I would like to extend my sincere and heartfelt gratitude towards all
those who have helped me in making this project. Without their
active guidance, cooperation and encouragement, I would not have
been able to present this project on time.

I extend my sincere gratitude to my Artificial Intelligence teacher,
Mr. Vikas Tiwari, and Principal Mrs. Karuna Gupta Sejpal for their
moral support and guidance during the tenure of my project.

I also acknowledge, with a deep sense of reverence, my gratitude
towards my parents and my friends for the valuable suggestions they
gave me in completing the project.

Date: __________
AI PROJECT LOGBOOK

 PROJECT NAME: VIRTUAL MOUSE

 SCHOOL NAME: ALLENHOUSE PUBLIC SCHOOL, ROOMA

 YEAR/CLASS: 2024-25 / XII

 TEACHER NAME: Mr. VIKAS TIWARI

 TEACHER EMAIL: [email protected]

TEAM MEMBER NAMES AND GRADES:

 Sanya Gautam – XII A
 Nanshi Pal – XII A
 Sukhdeep Singh – XII A
 Arnav Singh – XII A
 Jatin Kaushal – XII A
 Manali Singh – XII C
1. INTRODUCTION

The AI Virtual Mouse using eye tracking is an innovative project
developed in Python, combining computer vision and gaze
tracking technologies to control the mouse pointer through eye
movements. By utilizing specialized libraries such as OpenCV and
MediaPipe, the system detects the user's eye position and
translates it into cursor movement on the screen. This technology
enhances accessibility for individuals with physical disabilities
and provides a hands-free interaction method for all users. The
system tracks eye gaze with high precision, offering an intuitive
and efficient way to control a computer, improving user
experience and interaction without traditional input devices.
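The core loop described above (detect the eye position, convert it to screen coordinates, smooth the result) can be sketched in a few lines of Python. This is an illustrative sketch rather than the project's actual code: it assumes the tracker already provides a normalized iris position in the 0 to 1 range, as MediaPipe's face-mesh landmarks do, and the function name and smoothing factor are our own choices.

```python
def gaze_to_screen(iris_x, iris_y, screen_w=1920, screen_h=1080,
                   prev=None, alpha=0.3):
    """Map a normalized iris position (0..1) to screen pixels.

    `prev` is the previous cursor position; when given, exponential
    smoothing damps the natural jitter of eye movements.
    """
    x, y = iris_x * screen_w, iris_y * screen_h
    if prev is not None:
        x = alpha * x + (1 - alpha) * prev[0]
        y = alpha * y + (1 - alpha) * prev[1]
    return x, y

# Looking at the centre of the camera frame lands mid-screen:
print(gaze_to_screen(0.5, 0.5))  # (960.0, 540.0)
```

In a live system the smoothed coordinates would then be passed to a mouse-control library such as pyautogui or pynput to move the operating-system cursor.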
2.TEAM ROLES

2.1 Who is in your team and what are their roles?

Role | Role description | Team member
PROJECT LEADER | Helps define the project scope, manages and resolves issues as they arise, and develops progress reports. | Sanya
PROJECT MANAGER | Distributes tasks among team members and organizes and manages the logbook. | Nanshi
INFORMATION RESEARCHER | Accesses and defines data; works with end users to understand their problems. | Sukhdeep
DATA EXPERT | Decides upon the data and the type of training model to be used; collects data, draws the required insights, performs analysis, and analyzes trends in the data. | Arnav
DESIGNER | Creates the design, decides the flow of the problem statement, and makes sure its visuals are impactful. | Jatin
STATISTICIAN | Takes the final concluding decisions on data numbers; decides and points out patterns, outliers, and metric summaries. | Manali
2.2 Project plan

Phase: Preparing for the project
- Coursework readings | Planned: 15-07-24 to 21-07-24 (20-30 hrs) | Actual: 15-07-24 to 21-07-24 (20-30 hrs) | Responsible: All team members | Remarks: Collaborative work
- Set up a team folder on a shared drive | Planned: 15-07-24 to 21-07-24 (1 hr) | Actual: 15-07-24 to 21-07-24 (1 hr) | Responsible: All team members

Phase: Defining the problem
- Background reading | Planned: 18-07-24 to 20-07-24 (1 hr) | Actual: 19-07-24 to 20-07-24 (1 hr) | Responsible: All team members | Remarks: Collaborative work
- Research issues in our community | Planned: 22-07-24 to 24-07-24 (2 hrs) | Actual: 22-07-24 to 23-07-24 (2 hrs) | Responsible: All team members | Remarks: Collaborative work
- Team meeting to discuss issues and select an issue for the project | Planned: 22-07-24 to 25-07-24 (3 hrs) | Actual: 23-07-24 to 24-07-24 (2 hrs) | Responsible: All team members | Remarks: Collaborative work
- Complete section 3 of the Project Logbook | Planned: 26-07-24 to 28-07-24 (1 hr) | Actual: 27-07-24 to 29-07-24 (1 hr) | Responsible: All team members
- Rate yourselves: 3/3

Phase: Understanding the users
- Identify users | Planned: 29-07-24 to 31-07-24 (1 hr) | Actual: 29-07-24 to 30-07-24 (1 hr) | Responsible: All team members | Remarks: Collaborative work
- Meeting with users to observe them | Planned: 03-08-24 to 05-08-24 (2 hrs) | Actual: 03-08-24 to 04-08-24 (2 hrs) | Remarks: Collaborative work
- Interview with user (1) | Planned: 05-08-24 (1 hr) | Actual: 05-08-24 (1 hr) | Responsible: All team members
- Interview with user (2) | Planned: 06-08-24 (1 hr) | Actual: 06-08-24 (1 hr) | Responsible: All team members
- Interview with user (3) | Planned: 07-08-24 (1 hr) | Actual: 07-08-24 (1 hr) | Responsible: All team members
- Interview with user (4) | Planned: 08-08-24 (1 hr) | Actual: 08-08-24 (1 hr) | Responsible: All team members
- Complete section 4 of the Project Logbook | Planned: 08-08-24 to 09-08-24 (2 hrs) | Actual: 08-08-24 to 09-08-24 (2 hrs) | Responsible: All team members | Remarks: Collaborative work
- Rate yourselves: 3/3

Phase: Brainstorming
- Team meeting to generate ideas | Planned: 09-08-24 to 10-08-24 (2-3 hrs) | Actual: 09-08-24 to 10-08-24 (3 hrs) | Responsible: All team members | Remarks: Collaborative work
- Complete section 5 of the Project Logbook | Planned: 10-08-24 to 12-08-24 (4 hrs) | Actual: 10-08-24 to 12-08-24 (4 hrs) | Responsible: All team members
- Rate yourselves: 3/3

Phase: Designing your solution
- Team meeting to design the solution | Planned: 12-08-24 to 14-08-24 (4 hrs) | Actual: 12-08-24 to 15-08-24 (3 hrs) | Responsible: All team members
- Complete section 6 of the Logbook | Planned: 16-08-24 to 18-08-24 (3 hrs) | Actual: 16-08-24 to 18-08-24 (2 hrs) | Responsible: All team members
- Rate yourself: 3/3

Phase: Collecting and preparing data
- Team meeting to discuss data requirements | Planned: 19-08-24 to 21-08-24 (2 hrs) | Actual: 19-08-24 to 21-08-24 (2 hrs) | Responsible: All team members
- Data collection | Planned: 04-10-24 to 06-10-24 (2 hrs) | Actual: 04-10-24 to 07-10-24 (3 hrs) | Responsible: All team members
- Data preparation and labeling | Planned: 09-10-24 to 11-10-24 (4 hrs) | Actual: 09-10-24 to 12-10-24 (4 hrs) | Responsible: All team members
- Complete Section 7 of the Project Logbook | Planned: 13-10-24 to 14-10-24 (2 hrs) | Actual: 13-10-24 to 14-10-24 (2 hrs) | Responsible: All team members

Phase: Prototyping
- Team meeting to plan the prototyping phase | Planned: 15-10-24 (1 hr) | Actual: 15-10-24 (1 hr) | Responsible: All team members
- Train your model with the input data set | Planned: 16-10-24 to 18-10-24 (3 hrs) | Actual: 16-10-24 to 19-10-24 (4 hrs) | Responsible: All team members
- Test your model and keep training with more data until you think your model is accurate | Planned: 20-10-24 (4 hrs) | Actual: 20-10-24 (5 hrs) | Responsible: All team members
- Write a program to initiate actions based on the result of your model | Planned: 21-10-24 (3 hrs) | Actual: 21-10-24 (3 hrs) | Responsible: All team members
- Complete Section 8 of the Project Logbook | Planned: 23-10-24 (4 hrs) | Actual: 23-10-24 (3 hrs) | Responsible: All team members
- Rate yourself: 3/3

Phase: Testing
- Team meeting to discuss the testing plan | Planned: 24-10-24 (3 hrs) | Actual: 24-10-24 (3 hrs) | Responsible: All team members | Remarks: Collaborative work
- Invite users to test your prototype | Planned: 02-11-24 (2 hrs) | Actual: 02-11-24 (2 hrs) | Responsible: All team members
- Conduct testing with users | Planned: 03-11-24 to 04-11-24 (5 hrs) | Actual: 03-11-24 to 04-11-24 (5 hrs) | Responsible: All team members
- Complete section 9 of the Project Logbook | Planned: 05-11-24 (4 hrs) | Actual: 05-11-24 (4 hrs) | Responsible: All team members
- Rate yourselves: 3/3

Phase: Creating the video
- Team meeting to discuss video creation | Planned: 08-11-24 (2 hrs) | Actual: 08-11-24 (3 hrs) | Responsible: All team members
- Write your script | Planned: 09-11-24 (2 hrs) | Actual: 09-11-24 (2 hrs) | Responsible: All team members
- Film your video | Planned: 09-11-24 (4 hrs) | Actual: 09-11-24 (3 hrs) | Responsible: All team members | Remarks: Collaborative work
- Edit your video | Planned: 10-11-24 (2 hrs) | Actual: 10-11-24 (3 hrs) | Responsible: All team members

Phase: Completing the logbook
- Reflect on the project with your team | Planned: 02-12-24 (3 hrs) | Actual: 02-12-24 (2 hrs) | Responsible: All team members | Remarks: Collaborative work
- Complete sections 10 and 11 of the Project Logbook | Planned: 04-12-24 to 05-12-24 (2 hrs) | Actual: 04-12-24 to 05-12-24 (2 hrs) | Responsible: All team members
- Review your Project Logbook and video | Planned: 05-12-24 (2 hrs) | Actual: 05-12-24 (2 hrs) | Responsible: All team members | Remarks: Collaborative work

Phase: Submission
- Submit your entries on the IBM platform | Planned: 10-12-24 (30 min) | Actual: 10-12-24 (30 min) | Responsible: All team members
2.3 Communications plan

 Will you meet face-to-face, online or a mixture of each to communicate?
➢ A mixture of both: face-to-face whenever possible, and online for late hours.
 How often will you come together to share your progress?
➢ As often as possible, ideally 2-3 times a week.
 Who will set up online documents and ensure that everyone is contributing?
➢ Sanya Gautam is responsible for setting up online documents, along with ensuring appropriate contribution.
 What tools will you use for communication?
➢ Google Meet, WhatsApp, Google Drive and a shared MS Word file for collaboration.

2.4 Team meeting minutes (Meeting 1)

 Date of meeting: 12 July 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: Research on the topic, discussion of ideas and division of roles
 Items discussed:
1. Define the purpose and objective of the AI virtual mouse.
2. Determine specific use cases for individuals with disabilities.
 Things to do (what, by whom, by when):
1. Assign roles, oversee the entire project, set the timelines and track progress. (Sanya)
2. Investigate and evaluate eye tracking technologies and AI algorithms. (Sukhdeep)
2.4.1 Team Meeting (Meeting 2)
 Date of meeting: 22 July 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: To discuss solutions for the problem
 Items discussed: A computer-vision-powered image model responsible for eye tracking.
 Things to do: Research the best AI-powered framework to help us build our model
2.4.2 Team Meeting (Meeting 3)
 Date of meeting: 12 August 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: To design the solution
 Items discussed: Organized and listed all the requirements for constructing and designing the model, and decided to opt for an image-based recognition approach.
 Things to do: Start with data collection and preparation

2.4.3 Team Meeting (Meeting 4)

 Date of meeting: 19 August 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: To discuss data requirements
 Items discussed: The minimum amount of data required for good accuracy and precision, as well as the reliable sources we can collect it from.
 Things to do: Collect data and check it for patterns and statistics

2.4.4 Team Meeting (Meeting 5)

 Date of meeting: 15 October 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: Team meeting to discuss the prototyping phase
 Items discussed: Deciding on the different types of models we will make
 Things to do: Input the data and obtain a desirable prototype

2.4.5 Team Meeting (Meeting 6)

 Date of meeting: 24 October 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: Team meeting to discuss the testing plan
 Items discussed: Decided how to proceed with testing and how to calculate accuracy, F1 score and precision. Adding diversity to ideas with the help of Sanya, Nanshi, Sukhdeep, Jatin, Arnav, Manali and the end users.
 Things to do: Testing by team members, followed by end users.

2.4.6 Team Meeting (Meeting 7)

 Date of meeting: 8 November 24
 Who attended: Sanya, Nanshi, Sukhdeep, Jatin, Arnav and Manali
 Who wasn't able to attend: None
 Purpose of meeting: Team meeting to discuss video creation.
 Items discussed:
1. Which editing and creation software will be used?
2. The overall theme and script of the video
3. Whether any voice-over is required
 Things to do: Film and edit the video and make the necessary changes.

3. PROBLEM DEFINITION
3.1 List important local issues faced by your school or community.
➢ An important local issue in our community is that people with disabilities,
especially those unable to use their hands, struggle to access and use technology.

3.2 Which issues matter to you and why?


➢ Gaze-tracking mouse control can transform how individuals with mobility
impairments interact with technology. For those unable to use traditional input
devices like a mouse or keyboard, this model enables seamless control
through eye movement alone. It helps unlock their ability to communicate,
work, and create, empowering them to contribute their skills and potential
without limitations. By bridging this accessibility gap, we ensure technology
becomes inclusive and supportive of everyone’s abilities.
3.3 Which issue will you focus on?
➢ We will focus on accessibility for people with disabilities.
Gaze-tracking enables hands-free device control for individuals with mobility
impairments. This helps them access education, work, and communication,
unlocking their potential and promoting independence.

3.4 Write your team’s problem statement in the format below.


➢ How can we enable individuals with mobility impairments to interact with
technology hands-free using gaze tracking, ensuring greater accessibility,
reducing reliance on physical input devices, and unlocking their full potential?

4.THE USERS

4.1 Who are the users and how are they affected by the problem?
➢ The users are individuals with mobility impairments who cannot use
traditional input devices like a mouse or keyboard. They are affected by
limited access to technology, making it difficult to communicate, work, or
perform daily tasks independently.
4.2 What have you actually observed about the users and how the problem
affects them?
➢ We have observed that users with mobility impairments face challenges in
interacting with technology due to their inability to use traditional input
devices. This limits their independence, reduces opportunities for education
and employment, and hinders their ability to fully utilize digital tools.

4.3 Record your interview questions here as well as responses from users.
 Interviewer: Can you explain how eye tracking works in your AI Virtual Mouse
project?
➢ User: It uses computer vision (OpenCV, MediaPipe) to detect and track eye
movements, translating gaze positions into screen coordinates to control the
mouse pointer.
 Interviewer: What libraries or tools did you use to implement this AI Virtual
Mouse?
➢ User: I used OpenCV for image processing, MediaPipe for facial landmarks,
pynput for simulating mouse actions, and NumPy for mathematical
calculations.
 Interviewer: What challenges did you face in implementing gaze tracking and
how did you overcome them?
➢ User: Lighting conditions and head movements were challenging. I improved
accuracy by refining facial landmark detection and using techniques to stabilize
gaze-to-screen mapping.
 Interviewer: How does it handle click events?
➢ User: Eye blinks are used for click detection.
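The blink-based clicking the user describes is commonly implemented with an eye aspect ratio (EAR): the eye's vertical opening divided by its width, which drops sharply when the lids close. The sketch below is a hedged illustration, not the project's code; the landmark ordering (P1 to P6) and the 0.2 threshold are conventional assumptions rather than values taken from the report.

```python
import math

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR = (|P2-P6| + |P3-P5|) / (2 * |P1-P4|).

    P1/P4 are the eye corners; P2/P6 and P3/P5 are upper/lower
    lid points. The value falls toward 0 as the eye closes.
    """
    vertical = math.dist(p2, p6) + math.dist(p3, p5)
    horizontal = math.dist(p1, p4)
    return vertical / (2.0 * horizontal)

def is_blink(ear, threshold=0.2):
    return ear < threshold

# Synthetic landmarks for an open and a nearly closed eye:
open_eye = eye_aspect_ratio((0, 0), (1, 1), (3, 1), (4, 0), (3, -1), (1, -1))
closed_eye = eye_aspect_ratio((0, 0), (1, 0.1), (3, 0.1), (4, 0), (3, -0.1), (1, -0.1))
print(open_eye, is_blink(open_eye))      # 0.5 False
print(closed_eye, is_blink(closed_eye))  # 0.05 True
```

In practice a blink-click would only fire when the EAR stays below the threshold for a few consecutive frames, to avoid triggering on natural blinks.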

4.4 Empathy Map

SAYS
 It's amazing that I can control my computer just with my eyes.
 It's a bit challenging to control precise actions like dragging and dropping.
 I wish it could be more accurate in low light conditions.

THINKS
 Will it be tiring for my eyes to use this for long?
 How can I improve my accuracy and speed with practice?
 How would it handle rapid movements or sudden changes in gaze?

DOES
 Uses the system to move the cursor using eye gaze.
 Calibrates the system regularly for improved accuracy.
 May take breaks to avoid eye strain or fatigue.

FEELS
 Feels like they have more control, especially with limited mobility.
 Interested in the technology but concerned about its effectiveness in real-life use.
 Eager to try an innovative way of interacting with technology.

4.5 What are the usual steps that users currently take related to the
problem and where are the difficulties?
 Voice control: Often imprecise and struggles in noisy environments.
 External switches: Expensive, require setup, and can be physically challenging.
 Specialized hardware: High cost and limited availability make it inaccessible for many users.

General difficulties: Limited accuracy, affordability, and accessibility.

4.6 Write your team’s problem statement in the format below.


➢ Individuals with mobility impairments are facing difficulties in interacting with
technology due to limited access to affordable and accurate assistive devices,
hindering their independence and potential.

5.BRAINSTORMING
5.1 Ideas

 AI Idea #1: Use AI to accurately track eye movements and translate them into precise cursor control, enabling hands-free interaction.
 AI Idea #2: AI-driven speech recognition to assist users in controlling devices through voice commands.
 AI Idea #3: Machine learning to interpret specific head or facial gestures as input for device interaction.
 AI Idea #4: AI to adapt the gaze tracking system to individual user needs, enhancing accuracy and comfort.
 AI Idea #5: AI-powered guidance and tutorials for users to learn how to use assistive technologies effectively.

5.2 Priority Grid

 High value to users, easy to create: Speech-to-Text
 High value to users, hard to create: Gaze Tracking
 Low value to users, easy to create: Accessibility Tutorials
 Low value to users, hard to create: Gesture Recognition

5.3 Based on the priority grid, which AI solution is the best fit for your
users and for your team to create and implement?
➢ Our approach is to create a gaze tracking model that we have trained using custom
computer vision techniques to track eye movements. By leveraging MediaPipe and
Python, we built the entire system from scratch, enabling hands-free interaction
with technology. This approach allows individuals with mobility impairments to
control their devices more efficiently, improving accessibility. We developed and
trained the model to accurately track and translate gaze into cursor movement,
ensuring a seamless user experience.

6.DESIGN

6.1 What are the steps that users will now do using your AI solution to address the
problem?
➢ Calibration: Users will first calibrate the system by focusing on specific
points on the screen to align the gaze tracker with their eye movements.
➢ Eye Tracking: Once calibrated, the system will continuously track the user’s
eye movements in real-time.
➢ Cursor Control: Users will move the cursor on the screen by looking at
different areas, with the system translating gaze direction into precise cursor
movement.
➢ Interaction: Users can interact with their device by gazing at buttons, icons, or
text, allowing for hands-free clicking and navigation.
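The calibration step listed above can be realized by fitting a simple per-axis linear map from raw gaze readings to known screen positions, using the points the user fixates during calibration. The least-squares approach and the function name below are our illustration of one plausible implementation, not necessarily what the project used.

```python
def fit_axis(raw, target):
    """Least-squares fit of target = a * raw + b for one screen axis.

    `raw` holds gaze readings recorded while the user looked at known
    calibration points; `target` holds those points' pixel coordinates.
    """
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(target) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, target))
    var = sum((r - mean_r) ** 2 for r in raw)
    a = cov / var
    b = mean_t - a * mean_r
    return a, b

# Three horizontal calibration points: left edge, centre, right edge.
a, b = fit_axis([0.2, 0.5, 0.8], [0, 960, 1920])
# The fitted line maps the calibration samples back to their targets:
print(round(a * 0.2 + b, 6), round(a * 0.8 + b, 6))
```

Fitting the x and y axes separately like this keeps the model trivially simple; a real system might instead fit a 2-D homography to also absorb head tilt.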

7. DATA

7.1 What data will you need to train your AI solution?


➢ To train the gaze tracking AI solution, we need eye movement data with
corresponding gaze points, face and eye landmarks for accurate detection, and
user behavior data to account for individual differences. Additionally,
environmental data in various lighting conditions will ensure the system
works reliably in different settings.

7.2 Where or how will you source your data?

Data needed | Where will the data come from? Who owns the data? | Do you have permission to use the data?
Have | Custom data collection, eye tracking setups (owned by the team) | Yes
Want/Need | Public domain datasets (e.g., from research papers or online resources) | Yes
Nice to have | User-contributed eye tracking data and environmental images (from the school and volunteers) | Yes

8. PROTOTYPE

8.1 Which AI tool(s) will you use to build your prototype?


➢ We will use MediaPipe for eye and face landmark detection, OpenCV for
image processing, and Python for implementing the gaze tracking model.

8.2 Which AI tool(s) will you use to build your solution?


➢ We will use MediaPipe for real-time eye and face tracking, OpenCV for
image processing and manipulation, and Python to implement the gaze
tracking algorithms and integrate them into a functional solution.
8.3 What decisions or outputs will your tool generate and what further
action needs to be taken after a decision is made?


➢ The tool will detect the user's gaze location and translate it into cursor
movement on the screen. After detecting the gaze, the system will move
the cursor accordingly, allowing the user to interact with the device.
Further actions include clicking or selecting items by focusing on them,
with occasional recalibration or adjustments for improved accuracy.
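Selecting items "by focusing on them", as described above, is often implemented as dwell selection: a click fires once the gaze has stayed inside a small radius for long enough. A minimal self-contained sketch follows; the class name, radius, and dwell time are illustrative assumptions rather than values from the report.

```python
import math

class DwellClicker:
    """Fire a click when gaze dwells within `radius` px for `dwell_time` s."""

    def __init__(self, radius=40.0, dwell_time=1.0):
        self.radius = radius
        self.dwell_time = dwell_time
        self.anchor = None   # where the current dwell started
        self.start_t = None  # when it started

    def update(self, x, y, t):
        """Feed one gaze sample; return True when a click should fire."""
        if self.anchor is None or math.dist((x, y), self.anchor) > self.radius:
            self.anchor = (x, y)  # gaze moved: restart the dwell timer
            self.start_t = t
            return False
        if t - self.start_t >= self.dwell_time:
            self.anchor = None    # fire once, then require a fresh dwell
            return True
        return False

d = DwellClicker()
print(d.update(100, 100, 0.0), d.update(103, 99, 0.5), d.update(101, 102, 1.0))
# False False True
```

When the method returns True, the real system would issue the click through its mouse-control library and could also trigger the recalibration mentioned above if accuracy drifts.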

9. TESTING

9.1 Who are the users who tested the prototype?


➢ The users who tested the prototype are individuals with mobility
impairments, including those with limited hand or arm movement, as well
as volunteers from schools or communities who have experience with
assistive technology.

9.2 List your observations of your users as they tested your solution.
➢ Upon testing, we found that the model performed as expected, with an
accuracy of 95%. The accuracy and F1 score for our model were 91% and
0.8, respectively, demonstrating strong performance in gaze tracking and
cursor control.
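For context on the F1 score quoted above: it is the harmonic mean of precision and recall, computed from true positives (tp), false positives (fp) and false negatives (fn). The counts below are made-up numbers chosen only to illustrate how an F1 of 0.8 can arise; they are not the project's actual confusion-matrix values.

```python
def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 80 correct detections, 20 false alarms, 20 misses:
print(round(f1_score(80, 20, 20), 3))  # 0.8
```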

9.3 Complete the user feedback grid

What works?
 Ease of setup, hands-free control and an innovative approach.
 Customizable options for blink sensitivity and cursor speed.

What needs to change?
 Address discomfort from prolonged use and enhance the click method.
 Improve gaze tracking in low light settings and with fast head movements.

Questions?
 How can we improve the click interaction for ease of use?
 How can we optimize the system for poor lighting conditions?

Ideas
 Improve system performance in different environments with better sensors or algorithms.
 Allow for real-time customization and alternative clicking methods for an enhanced user experience.

9.4 Refining the prototype: Based on user testing, what needs to be acted on
now so that the prototype can be used?
➢ User testing made us realize that the background color was affecting the
model's decisions, so we decided to place the material on a white background
while using the model for image classification.
9.5 What improvements can be made later?
➢ Upon testing, we found that the model was not fully reliable on some
plastic materials that look similar. After discussing this with the team,
we decided to update our training set with more diverse images,
especially images taken from usage by real-life people.

10.TEAM COLLABORATION

10.1 How did you actively work with others in your team and with stakeholders?
➢ We collaborated with our stakeholders by investigating the troubles they face
on a daily basis and understanding them to help us make a project that aims
to solve their troubles with full efficiency. Their inputs guided us throughout
our journey.
➢ As co-members, we created a group chat to stay in touch and communicate
with each other efficiently at any time of the day. We made sure
to have open ears for each other's suggestions and did not let any member's
efforts go unrecognized.
➢ We organized online meetings amongst the group members for
brainstorming ideas.
➢ Sessions with mentors and stakeholders to test the efficiency of the
prototype.

11.INDIVIDUAL LEARNING REFLECTIONS

11.1. Team Reflections


Team member name: Sanya
➢ As a team leader, I learned the importance of effective communication and
collaboration. Managing diverse challenges like calibration and accuracy taught
me to prioritize user experience while fostering innovation.

Team member name: Nanshi


➢ This project showed me my weaknesses and strengths and made me work on
them and enhance my skills. I learnt how to work in a team and how to be time
bound. It helped bring change to my work ethic and taught me new skills. I
spent so much time on this project and, all through it, I learnt that if I set my
goals, I can achieve anything. Hence this project gave me a new perspective
on things.

Team member name: Jatin


➢ The project has shaped my personality in a way that now I know what to
expect when people say they are looking for people with skill and passion to
work with, not just a degree holder. Working with Arnav and
Sukhdeep, who are the most passionate people to work with, has taught me
how to have fun without worrying too much, contributing not only to my team
building skills but also to my personality. The sense of responsibility I
experienced in making sure that everything about the project was perfect was
fun, not a burden.

Team member name: Sukhdeep


➢ This project helped me improve my time and stress management, showed me
the positives and negatives of working in a group, and taught me how to handle
different ideas and ideologies on a particular topic.

Team member name: Arnav


➢ I gained valuable experience in processing and analyzing gaze tracking data. I
learned the significance of refining algorithms for accuracy and optimizing
performance across different users and environments.
Team member name: Manali

➢ I deepened my understanding of data analysis and statistical modeling. I
learned how to interpret gaze tracking data patterns, optimize accuracy and
apply statistical methods to improve system performance.

12. VIDEO LINK

Enter the URL of your team video: https://www.youtube.com/watch?v=lMPMV3l-Sew

Appendix
Recommended Assessment Rubric (for Teachers)

LOGBOOK AND VIDEO CONTENT

Problem definition
3 points: A local problem which has not been fully solved before is explained in detail with supporting research.
2 points: A local problem which has not been fully solved before is described.
1 point: A local problem is described.

The Users
3 points: Understanding of the user group is evidenced by completion of all of the steps in Section 4 The Users and thorough investigation.
2 points: Understanding of the user group is evidenced by completion of most of the steps in Section 4 The Users.
1 point: The user group is described but it is unclear how they are affected by the problem.

Brainstorming
3 points: A brainstorming session was conducted using creative and critical thinking. A compelling solution was selected with supporting arguments from Section 5 Brainstorming.
2 points: A brainstorming session was conducted using creative and critical thinking. A solution was selected with supporting arguments in Section 5 Brainstorming.
1 point: A brainstorming session was conducted. A solution was selected.

3 points: The use of AI is a good fit for the solution. The new user experience is clearly documented showing how users will be better served than they are today.
2 points: The use of AI is a good fit for the solution and there is some documentation about how it meets the needs of users.
1 point: The use of AI is a good fit for the solution.

Data
3 points: Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. There is evidence that the dataset is balanced, and that safety and privacy have been considered.
2 points: Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. There is evidence that the dataset is balanced.
1 point: Relevant data to train the AI model have been identified as well as how the data will be sourced or collected.

Prototype
3 points: A prototype for the solution has been created and successfully trained to meet users' requirements.
2 points: A prototype for the solution has been created and trained.
1 point: A concept for a prototype shows how the AI model will work.

Testing
3 points: A prototype has been tested with a fair representation of users and all tasks in Section 9 Testing have been completed.
2 points: A prototype has been tested with users and improvements have been identified to meet user requirements.
1 point: A concept for a prototype shows how it will be tested.

Team collaboration
3 points: Effective team collaboration and communication among peers and stakeholders is clearly documented in Section 10 Team collaboration.
2 points: Team collaboration among peers and stakeholders is clearly documented in Section 10 Team collaboration.
1 point: There is some evidence of team interactions among peers and stakeholders.

Individual learning
3 points: Each team member presents a reflective and insightful account of their learning during the project.
2 points: Each team member presents an account of their learning during the project.
1 point: Some team members present an account of their learning during the project.

VIDEO PRESENTATION

Points given: 3 = excellent, 2 = very good, 1 = satisfactory

Communication: The video is well-paced and communicated, following a clear and logical sequence.
Illustrative: Demonstrations and/or visuals are used to illustrate examples, where appropriate.
Accurate: The video presents accurate science and technology and uses appropriate language.
Passion: The video demonstrates passion from team members about their chosen topic/idea.
Sound and image quality: The video demonstrates good sound and image quality.
Length: The content is presented in the video within a 3-minute timeframe.