
AI PROJECT LOGBOOK

Resource for Students


(Adapted from “IBM EdTech Youth Challenge – Project Logbook” developed by
IBM in collaboration with Macquarie University, Australia and Australian Museum)

KEY PARTNERS

INDIA IMPLEMENTATION PARTNERS

GLOBAL PARTNERS

PROJECT NAME: GENERAL CHATBOT

SCHOOL NAME: MAHARISHI VIDYA MANDIR

YEAR/CLASS: 2024-25/XII

TEACHER NAME: K REVATHI

TEACHER EMAIL:

TEAM MEMBER NAMES AND GRADES:

1. Vishwajith G

2. Naren Ragunandhan

3. A K Jeevesh

4. S Lokeshwaran

5. G Nithish
1. Introduction
This document is your Project Logbook, and it will be where you record your ideas,
thoughts and answers as you work to solve a local problem using AI.

Make a copy of the document in your shared drive and work through it digitally with your
team. You can also print a copy of the document and submit a scanned copy once you have
completed the Project Logbook. Feel free to add pages and any other supporting material to
this document.

Refer to the AI Project Guide for more details about what to do at each step of your project.

2. Team Roles
Who is in your team and what are their roles?

Role | Role description | Team member name
Leader | Schedules and allocates tasks among team members, fills in the logbook, and acts as a link between the teacher and the team members. | Vishwajith G
Designer | Works with the team to design the project prototype. | Naren Ragunandhan
Information researcher | Collects questions from the team, finds answers, and forwards them to the team leader. | A K Jeevesh
Data expert | Decides on what type of data to work with to train an AI model. | S Lokeshwaran
Tester | Works with the users to test the prototype and gets feedback from the users. | G Nithish
Project plan

The following table is a guide for your project plan. You may use this or create your own
version using a spreadsheet which you can paste into this section. You can expand the
‘Notes’ section to add reminders, things that you need to follow up on, problems that need
to be fixed urgently, etc.

Phase | Task | Planned start | Planned end | Planned duration | Actual start | Actual end | Actual duration | Who is responsible | Notes/Remarks
Preparing for the project | Coursework, readings | 15/5/2024 | 18/5/2024 | 2 hrs | 15/5/2024 | 18/5/2024 | 2 hrs | Naren | -
Preparing for the project | Set up a team folder on a shared drive | 15/5/2024 | 18/5/2024 | 15 min | 15/5/2024 | 18/5/2024 | 25 min | Jeevesh | -
Defining the problem | Problem definition | 19/5/2024 | 19/5/2024 | 20 min | 19/5/2024 | 19/5/2024 | 30 min | Lokesh | -
Defining the problem | Research issues in our community | 19/5/2024 | 19/5/2024 | 30 min | 19/5/2024 | 19/5/2024 | 40 min | Lokesh | -
Defining the problem | Team meeting to discuss issues and select an issue for the project | 22/5/2024 | 22/5/2024 | 20 min | 22/5/2024 | 22/5/2024 | 20 min | All team members | -
Defining the problem | Complete section 3 of the Project Logbook; rate yourselves | 25/5/2024 | 25/5/2024 | 30 min | 25/5/2024 | 25/5/2024 | 40 min | Vishwajith, Nithish | -
Understanding the users | Identify users | 27/5/2024 | 27/5/2024 | 2 hrs | 27/5/2024 | 27/5/2024 | 2 hrs | Vishwajith | -
Understanding the users | Meeting with users to observe them | 29/5/2024 | 29/5/2024 | 1 day | 29/5/2024 | 29/5/2024 | 1 day | Jeevesh, Nithish | -
Understanding the users | Interview with user (1) | 30/5/2024 | 30/5/2024 | 2 hrs | 30/5/2024 | 30/5/2024 | 2 hrs | Naren | -
Understanding the users | Interview with user (2), etc. | 30/5/2024 | 30/5/2024 | 2 hrs | 30/5/2024 | 30/5/2024 | 2 hrs | Jeevesh | -
Understanding the users | Complete section 4 of the Project Logbook; rate yourselves | 10/6/2024 | 10/6/2024 | 3 hrs | 10/6/2024 | 10/6/2024 | 2 hrs 30 min | Vishwajith | -
Brainstorming | Team meeting to generate ideas for a solution | 10/6/2024 | 10/6/2024 | 1 hr | 10/6/2024 | 10/6/2024 | 1 hr | All team members | -
Brainstorming | Complete section 5 of the Project Logbook; rate yourselves | 11/6/2024 | 11/6/2024 | 2 hrs | 11/6/2024 | 11/6/2024 | 2 hrs | Nithish | -
Designing our solution | Team meeting to design the solution | 11/6/2024 | 11/6/2024 | 2 hrs | 11/6/2024 | 11/6/2024 | 2 hrs | Vishwajith, Jeevesh | -
Designing our solution | Complete section 6 of the logbook; rate yourselves | 15/6/2024 | 15/6/2024 | 8 hrs | 15/6/2024 | 15/6/2024 | 10 hrs | Lokesh | -
Collecting and preparing data | Team meeting to discuss data requirements | 15/6/2024 | 15/6/2024 | 30 min | 15/6/2024 | 15/6/2024 | 30 min | Vishwajith | -
Collecting and preparing data | Data collection | 15/6/2024 | 15/6/2024 | 1 hr | 15/6/2024 | 15/6/2024 | 1 hr | Jeevesh | -
Collecting and preparing data | Data preparation and labeling | 15/6/2024 | 15/6/2024 | 1 hr | 15/6/2024 | 15/6/2024 | 1 hr | Naren | -
Collecting and preparing data | Complete section 6 of the Project Logbook | 15/6/2024 | 15/6/2024 | 30 min | 15/6/2024 | 15/6/2024 | 45 min | Lokesh | -
Prototyping | Team meeting to plan prototyping phase | 15/6/2024 | 15/6/2024 | 15 min | 15/6/2024 | 15/6/2024 | 15 min | Nithish | -
Prototyping | Train your model with input dataset | 20/6/2024 | 21/6/2024 | 4 hrs | 20/6/2024 | 21/6/2024 | 4 hrs | Naren | -
Testing | Test your model and keep training with more data until you think your model is accurate | 21/6/2024 | 21/6/2024 | 4 hrs | 21/6/2024 | 21/6/2024 | 4 hrs | Vishwajith | -
Testing | Complete section 8 of the Project Logbook; rate yourselves | 21/6/2024 | 21/6/2024 | 1 hr | 21/6/2024 | 21/6/2024 | 1 hr | Lokesh | -
Testing | Team meeting to discuss testing plan | 24/6/2024 | 24/6/2024 | 30 min | 24/6/2024 | 24/6/2024 | 30 min | All members | -
Completing the logbook | Reflect on the project with your team | 24/6/2024 | 24/6/2024 | - | 24/6/2024 | 24/6/2024 | - | All members | -
Completing the logbook | Complete sections 10 and 11 of the Project Logbook | 25/6/2024 | 25/6/2024 | - | 25/6/2024 | 25/6/2024 | - | All members | -
Completing the logbook | Review your Project Logbook and video | 25/6/2024 | 25/6/2024 | - | 25/6/2024 | 25/6/2024 | - | All members | -
Submission | Submit your entries on IBM | - | - | - | - | - | - | All members | -

Communications plan

● How will you plan to meet for discussion?

Online and Offline modes

● How often will you come together to share your progress?

Two to three times a week.

● Who will set up online documents and ensure that everyone is contributing?

Naren

● What tool will you use for communication?


Face-to-face, Google Drive, WhatsApp, Gmail

Team meeting minutes (create one for each meeting held)

● Date of meeting: 15/5/2024. Who attended: Everyone

● Purpose of meeting: To decide roles and responsibilities.

Topics discussed:
1. Project Topic - chatbot

2. Team Roles

3. Problem Definition

4. Communication plans
3. Problem Definition

1. Problem Overview
● Description: Define the primary reason for creating the chatbot. What business or user challenge does
it aim to solve?
Example: “Users face delays in getting customer support, leading to dissatisfaction and lost business
opportunities.”

2. Objective
● Goal: Clearly state the intended outcome of the chatbot.
Example: “The chatbot will automate customer service to provide instant responses to common
inquiries, thereby improving user experience and reducing support costs.”

3. Target Audience
● Who will use it?: Identify the end-users (e.g., customers, employees, students).
Example: “The chatbot is aimed at customers seeking technical support for software products.”

4. Key Challenges
● Current Problems: List existing pain points that the chatbot will alleviate.
Example:
○ Long response times from human agents.
○ Repetitive questions consuming agent time.
○ Users needing help outside business hours.

5. Scope and Requirements


● Features: Outline the functionality the chatbot must have to address the problem.
Example:
○ 24/7 availability.
○ Answer FAQs about products and services.
○ Handoff to human agents for complex issues.
○ Multilingual support.

6. Success Metrics
● How will you measure success?
Example:
○ Reduction in average response time by 70%.
○ 50% reduction in customer service emails.
○ 90% of inquiries resolved without human intervention.

7. Technical Constraints
● What limitations might impact the project?
Example:
○ Integration with existing customer support platforms.
○ NLP accuracy for understanding user queries.
○ Data privacy and security requirements.
Write your team’s problem statement in the format below.
By defining the problem clearly, you ensure that the chatbot is developed with a focused objective and
measurable success outcomes, addressing both user needs and business goals.

4. The Users
Who are the users and how are they affected by the problem?
Everyone who needs quick answers to common questions: the chatbot is intended to make their work easier by handling routine queries instantly.

What have you actually observed about the users and how the problem affects them?

1. Delayed Response Times

● Observation: Users often wait too long for a response to their inquiries, especially during peak hours or
outside business hours.
● Impact: This leads to frustration, reduced customer satisfaction, and users abandoning inquiries or
escalating issues through alternative, costlier support channels (e.g., phone calls).

2. Repetitive Queries

● Observation: A large number of user queries are repetitive, such as basic troubleshooting steps, account
management, or common product FAQs.
● Impact: Support agents are overwhelmed with answering the same questions multiple times, which limits
their availability to address more complex, high-priority issues. Users feel frustrated when they experience
delays for simple information.

3. Limited Availability

● Observation: Customer support is only available during specific hours, and users who need assistance
outside of this timeframe are left without help.
● Impact: Users who require assistance in different time zones or during off-hours are unable to get
immediate answers, leading to dissatisfaction and negative brand perception.

4. User Preferences for Instant Help

● Observation: Users prefer self-service options and instant responses for simple issues rather than waiting
for human agents.
● Impact: The inability to provide quick, automated solutions for basic queries results in lost opportunities for
quick resolutions and user engagement drops.

5. Complex Queries Needing Human Intervention

● Observation: While some users have simple queries that can be automated, others have more complex
issues that require human intervention.
● Impact: Users get frustrated when their complex queries are handled poorly by automated systems,
resulting in negative experiences and the need for proper escalation processes to human agents.

6. Inconsistent Customer Experience Across Channels


● Observation: Users experience different levels of service and response quality depending on the channel
(e.g., email vs. live chat vs. phone support).
● Impact: This inconsistency creates confusion and leads to a lack of trust in the support system, as users
expect the same level of service regardless of the channel they use.

Record your interview questions here as well as responses from users.

Empathy Map

Map what the users say, think, do and feel about the problem in this table

What our users are saying:
- "It takes too long to get a response."
- "I just need a simple answer, why is this so difficult?"
- "I can't get help outside of business hours."

What our users are thinking:
- "Why do I have to wait for something that should be quick?"
- "Is there a faster way to get this information?"
- "Why can't I resolve this on my own?"

What our users are doing:
- Repeatedly sending inquiries or switching between support channels (e.g., email, live chat, phone).
- Abandoning inquiries if not addressed quickly.
- Trying to find answers through self-help resources like FAQs or forums.

How our users feel:
- Frustrated by long wait times.
- Impatient when seeking answers to simple questions.
- Dissatisfied when they can't reach support outside of work hours.
- Anxious or stressed when they can't get immediate help for urgent issues.
What are the usual steps that users currently take related to the problem and where are the
difficulties?

1. Step 1: Identifying the Issue


○ Action: Users encounter a problem with a product or service (e.g., technical issue, billing
question).
○ Difficulty: Users may not know where to start or what support channel to use, causing
confusion and delays.
2. Step 2: Searching for Help
○ Action: Users often search for help on their own, typically by browsing FAQs, help articles, or
forums.
○ Difficulty: These resources may be outdated, hard to navigate, or not specific enough to
address their issue. Users may also find it challenging to locate the right information, especially
if their query is complex.
3. Step 3: Contacting Support
○ Action: When self-help fails, users contact customer support through email, chat, or phone.
○ Difficulty:
■ Long wait times: During peak hours or outside business hours, users experience
significant delays in getting responses.
■ Repetitive processes: Users often have to provide the same information multiple times
(e.g., through chatbots or forms) before reaching a human agent.
4. Step 4: Interacting with Support
○ Action: Users engage with customer service agents or automated systems (e.g., chatbot or
IVR).
○ Difficulty:
■ Inefficient chatbot interactions: When interacting with a chatbot, users may experience
poor responses to complex questions, forcing them to escalate to a human.
■ Lack of personalization: Automated responses may feel generic and not tailored to the
user’s specific situation, leading to frustration.
5. Step 5: Resolution or Escalation
○ Action: If the problem is resolved, the user leaves the support process. If not, they may
escalate the issue to a higher-level agent.
○ Difficulty:
■ Delayed escalations: If the chatbot or lower-level support fails to resolve the issue, users
experience further delays as the problem is passed to human agents.
■ Lack of follow-up: In some cases, users feel abandoned if they don’t receive timely
updates on their escalated issue.
6. Step 6: Post-Support Feedback
○ Action: Users may be asked to rate their experience after the issue is resolved.
○ Difficulty: Negative experiences with long wait times, ineffective chatbots, or unresolved issues
lead to lower feedback scores and dissatisfaction.


Key Difficulties:

● Difficulty in finding relevant information quickly (especially in self-help).


● Long response times from human agents or inefficient chatbot handling.
● Repetitive steps, like re-entering the same information across different channels.
● Inadequate chatbot capability for complex issues, leading to the need for human
intervention.
● Unavailability of support during off-hours, which leaves users without immediate
assistance.

5. Brainstorming
Ideas

How might you use the power of AI/machine learning to solve the users’ problem by increasing
their knowledge or improving their skills?

AI Idea #1: AI-driven chatbots can instantly answer common questions using natural language processing (NLP) to interpret user queries and provide accurate, context-aware responses (a minimal sketch of this idea appears after this list).

AI Idea #2: Machine learning algorithms can analyze past user interactions and tailor responses based on individual preferences and previous queries.

AI Idea #3: AI can diagnose user issues by analyzing patterns in their behavior or input (e.g., error messages, troubleshooting steps). The chatbot can suggest proactive solutions or provide interactive guides for resolving issues in real time.

AI Idea #4: AI chatbots can use ML to continuously learn from user feedback and new data sources, adapting their responses and improving their problem-solving capabilities over time.

AI Idea #5: AI can provide step-by-step interactive tutorials or guides that help users develop their skills in using a product or solving recurring issues. For instance, the chatbot can walk users through advanced product features or troubleshooting methods.
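To make AI Idea #1 concrete, here is a minimal sketch of intent-based question answering using TF-IDF features and a logistic-regression classifier (scikit-learn). The intents, example phrases, and canned answers are illustrative placeholders, not our real training data; an NLP platform such as Dialogflow or Rasa would normally replace this hand-rolled classifier.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Illustrative placeholder intents, phrases, and answers (not our real data).
training_phrases = [
    "how do I reset my password", "I forgot my password",
    "what are your opening hours", "when are you open",
    "how can I contact a human agent", "talk to a person",
]
intents = [
    "reset_password", "reset_password",
    "opening_hours", "opening_hours",
    "human_handoff", "human_handoff",
]
answers = {
    "reset_password": "You can reset your password from the login page.",
    "opening_hours": "Support is available 24/7 through this chatbot.",
    "human_handoff": "Connecting you to a human agent now.",
}

# Turn phrases into TF-IDF features and train a simple intent classifier.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(training_phrases)
model = LogisticRegression(max_iter=1000).fit(X, intents)

def reply(query: str) -> str:
    """Predict the intent of a query and return the mapped canned answer."""
    intent = model.predict(vectorizer.transform([query]))[0]
    return answers[intent]

print(reply("i cannot remember my password"))  # expected: the reset_password answer
```

The same idea scales by adding more intents and more example phrases per intent; this classification step is exactly what the NLP tools listed later automate.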
6. Design

What are the steps that users will now do using your AI solution to address the problem?
7. Data

What data will you need to train your AI solution?

1. Historical customer queries (chat logs, emails, etc.); see the data-preparation sketch after this list.
2. Knowledge base content (FAQs, guides, documentation).
3. User profiles and preferences for personalization.
4. Chatbot feedback and ratings to improve performance.
5. Escalation patterns to learn when to hand off to humans.
6. Product/service usage data to predict and solve common issues.
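As a minimal sketch of how the chat-log data listed above could be prepared for training, the snippet below assumes an exported CSV with "question" and "intent" columns (the file name and column names are placeholders). It cleans the text, removes duplicates, and holds out a test split.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Assumed export of historical chat logs; file and column names are placeholders.
df = pd.read_csv("chat_logs.csv")                         # columns: question, intent
df["question"] = df["question"].str.lower().str.strip()   # basic text cleaning
df = df.dropna(subset=["question", "intent"]).drop_duplicates()

# Hold out 20% of the examples to check the model on unseen queries later.
train_df, test_df = train_test_split(
    df, test_size=0.2, random_state=42, stratify=df["intent"]
)
print(f"{len(train_df)} training examples, {len(test_df)} test examples")
```

Stratifying the split by intent keeps rare intents represented in the test set, which matters when checking whether the dataset is balanced.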

Where or how will you source your data?

Data needed | Where will the data come from? | Who owns the data? | Do you have permission to use the data? | Ethical considerations
Have: past records | Public dataset | - | Yes | Should be authentic
Want/Need: identification data | Public dataset | - | Yes | Should be accurate and authentic
Nice to have: AI models | - | - | - | -
8. Prototype

Which AI tool(s) will you use to build your prototype?

To build the prototype, we would use:

1. Dialogflow or Rasa for natural language understanding and chatbot development.
2. TensorFlow or PyTorch for machine learning model training.
3. Amazon Lex or Microsoft Bot Framework for integration and deployment.

Which AI tool(s) will you use to build your solution?

To build the solution, we would use:

1. Dialogflow or Rasa for natural language processing and the chatbot framework.
2. TensorFlow or PyTorch for developing and training machine learning models (a small training sketch follows this list).
3. Amazon Lex or Microsoft Bot Framework for integration and deployment.
4. MongoDB or Firebase for user data storage and management.
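As a rough sketch of the model-training step with TensorFlow (one of the tools listed above), the snippet below trains a tiny Keras intent classifier on a few placeholder phrases. The phrases, labels, and layer sizes are illustrative only; platforms such as Dialogflow, Rasa, Amazon Lex, or the Bot Framework would handle this training internally.

```python
import tensorflow as tf

# Illustrative placeholder phrases and integer intent labels (not our real data).
texts = tf.constant(["reset my password", "when are you open", "talk to an agent"])
labels = tf.constant([0, 1, 2])     # 0=reset_password, 1=opening_hours, 2=handoff

# Turn raw strings into TF-IDF feature vectors.
vectorize = tf.keras.layers.TextVectorization(output_mode="tf_idf")
vectorize.adapt(texts)
features = vectorize(texts)

# A tiny feed-forward classifier: one softmax output per intent.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(features, labels, epochs=30, verbose=0)

# Score a new query: the highest probability is the predicted intent.
print(model.predict(vectorize(tf.constant(["i forgot my password"]))))
```

PyTorch could play the same role; the point is simply that training produces a model that maps a user query to a score for each intent.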

What decisions or outputs will your tool generate and what further action needs to be taken
after a decision is made?

The AI tool will generate:

1. User Intent Identification: Classifying user queries to determine the intent.


○ Further Action: Provide relevant responses or escalate to a human agent if necessary.
2. Suggested Solutions: Offering troubleshooting steps or articles based on the identified issue.
○ Further Action: Users can follow the provided guidance or request additional help.
3. Feedback Analysis: Collecting user feedback on responses for continuous improvement.
○ Further Action: Update the training data and adjust algorithms based on feedback trends.
4. Escalation Triggers: Deciding when to hand off complex issues to human support (see the escalation sketch after this list).
○ Further Action: Notify human agents with context about the user’s issue for seamless support.
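A minimal sketch of how the escalation trigger could work, assuming the intent model exposes per-intent confidence scores (as the scikit-learn classifier sketched in the Brainstorming section does). The threshold value and the returned fields are illustrative choices, not part of any particular framework.

```python
# Assumes the scikit-learn intent classifier and vectorizer sketched earlier;
# the threshold and returned fields are illustrative choices.
CONFIDENCE_THRESHOLD = 0.6

def handle_query(query, model, vectorizer, answers):
    """Reply when the intent is confidently recognised; otherwise escalate."""
    scores = model.predict_proba(vectorizer.transform([query]))[0]
    best = scores.argmax()
    intent, confidence = model.classes_[best], scores[best]

    if confidence >= CONFIDENCE_THRESHOLD and intent in answers:
        return {"action": "reply", "text": answers[intent]}

    # Low confidence: hand off with context so the human agent does not have
    # to ask the user to repeat themselves.
    return {
        "action": "escalate",
        "context": {"query": query,
                    "best_guess": str(intent),
                    "confidence": float(confidence)},
    }
```

In a deployed chatbot, the "escalate" branch would create a ticket or live-chat handoff that carries this context to the human agent.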
9. Testing
Who are the users who tested the prototype?

Internal Team Members: Employees from customer support, product management, and
development teams who provide initial feedback on usability and functionality.

Beta Testers: Selected users from the target audience who volunteer to try the chatbot and
share their experiences, focusing on ease of use and effectiveness.

List your observations of your users as they tested your solution.

Ease of Use: Most users found the chatbot interface intuitive and easy to navigate.
Response Time: Users appreciated the quick responses but noted occasional delays with
complex queries.
Query Understanding: Some users experienced frustration when the chatbot misinterpreted
their questions, highlighting the need for improved NLP.
Helpfulness of Suggestions: Many found the provided solutions helpful, while others felt the
information was too generic for specific issues.
Escalation Process: Users were generally satisfied with the escalation process but wanted
clearer communication on response times when handed off to human agents.

Complete the user feedback grid

What works:
- Intuitive interface
- Quick response times for simple queries
- Helpful suggestions for common issues
- Smooth escalation process
- Positive feedback on overall experience

What needs to change:
- Improve NLP for better understanding of queries
- Reduce response time for complex queries
- Make solutions more tailored to specific problems
- Provide clearer communication during escalation
- Simplify the feedback mechanism

Questions:
- How can we further personalize responses?
- What types of queries are most frustrating?
- How do users feel about the escalation process?
- What additional features would enhance user experience?
- Are users aware of all features available?

Ideas:
- Add a guided tutorial for first-time users
- Implement a suggestion feature for rephrasing queries
- Create a feedback loop for continuous improvement
- Introduce interactive troubleshooting guides
- Gamify the feedback process to encourage user participation
Refining the prototype: Based on user testing, what needs to be acted on now so that
the prototype can be used?

Enhance NLP: Improve understanding of user queries to reduce misinterpretations.


Tailor Solutions: Provide more personalized responses based on user history.
Speed Up Responses: Optimize processing to minimize delays, especially for complex
queries.
Improve Escalation Communication: Clarify notifications about the escalation process and
expected response times.
Simplify Feedback: Redesign the feedback mechanism for easier user participation.
Introduce Guided Tutorials: Create onboarding tutorials to help users interact effectively.

What improvements can be made later?

Advanced Personalization: Incorporate deeper user profiling for more tailored interactions.
Multi-Language Support: Add capabilities for multiple languages to reach a broader
audience.
Voice Interaction: Implement voice recognition for hands-free user engagement.
Analytics Dashboard: Create an analytics feature for tracking user behavior and chatbot
performance.
Integration with Other Tools: Connect with third-party applications for seamless user
experiences (e.g., CRM systems).

Rate yourself: 3

Testing
1 point – A concept for a prototype shows how it will be tested.
2 points – A prototype has been tested with users and improvements have been identified to meet user requirements.
3 points – A prototype has been tested with a fair representation of users and all tasks in this section have been completed.
10. Team collaboration
How did you actively work with others in your team and with stakeholders?

Regular Meetings: Held weekly meetings to discuss progress, gather feedback, and align on
goals and priorities.
Cross-Functional Workshops: Conducted brainstorming sessions involving team members
from different functions (e.g., development, marketing, customer support) to gather diverse
insights and ideas.
User Testing Collaboration: Engaged team members and stakeholders in the user testing
process to observe interactions firsthand and collect varied feedback.
Feedback Loops: Established channels for ongoing feedback, such as shared documents or
platforms (e.g., Slack or Trello), allowing team members to contribute observations and
suggestions in real time.
Stakeholder Presentations: Presented prototype updates and findings to stakeholders to
keep them informed, gather their input, and ensure alignment with business objectives.
Iterative Design Reviews: Facilitated regular design reviews to evaluate progress and
incorporate suggestions from both team members and stakeholders before moving to the next
phase.
Collaborative Documentation: Used collaborative tools for documenting requirements,
decisions, and changes, ensuring everyone had access to the latest information and could
contribute.

Rate yourself: 3

Team collaboration
1 point – There is some evidence of team interactions among peers and stakeholders.
2 points – Team collaboration among peers and stakeholders is clearly documented in this section.
3 points – Effective team collaboration and communication among peers and stakeholders is clearly documented in this section.
11. Individual learning reflection
11.1. Team Reflections

A good way to identify what you have learned is to ask yourself what surprised you during
the project. List the things that surprised you and any other thoughts you might have on
issues in your local community.

Team member name:

Team member name:

Team member name:

Team member name:


Team member name:

12. Video link

Enter the URL of your team video:


Appendix
Recommended Assessment Rubric (for Teachers)

LOGBOOK AND VIDEO CONTENT


Steps | 3 points | 2 points | 1 point | Points given
Problem definition | A local problem which has not been fully solved before is explained in detail with supporting research. | A local problem which has not been fully solved before is described. | A local problem is described. |
The Users | Understanding of the user group is evidenced by completion of all of the steps in Section 4 The Users and thorough investigation. | Understanding of the user group is evidenced by completion of most of the steps in Section 4 The Users. | The user group is described but it is unclear how they are affected by the problem. |
Brainstorming | A brainstorming session was conducted using creative and critical thinking. A compelling solution was selected with supporting arguments from Section 5 Brainstorming. | A brainstorming session was conducted using creative and critical thinking. A solution was selected with supporting arguments in Section 5 Brainstorming. | A brainstorming session was conducted. A solution was selected. |
Design | The use of AI is a good fit for the solution. The new user experience is clearly documented showing how users will be better served than they are today. | The use of AI is a good fit for the solution and there is some documentation about how it meets the needs of users. | The use of AI is a good fit for the solution. |
Data | Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. There is evidence that the dataset is balanced, and that safety and privacy have been considered. | Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. There is evidence that the dataset is balanced. | Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. |
Prototype | A prototype for the solution has been created and successfully trained to meet users' requirements. | A prototype for the solution has been created and trained. | A concept for a prototype shows how the AI model will work. |
Testing | A prototype has been tested with a fair representation of users and all tasks in Section 9 Testing have been completed. | A prototype has been tested with users and improvements have been identified to meet user requirements. | A concept for a prototype shows how it will be tested. |
Team collaboration | Effective team collaboration and communication among peers and stakeholders is clearly documented in Section 10 Team collaboration. | Team collaboration among peers and stakeholders is clearly documented in Section 10 Team collaboration. | There is some evidence of team interactions among peers and stakeholders. |
Individual learning | Each team member presents a reflective and insightful account of their learning during the project. | Each team member presents an account of their learning during the project. | Some team members present an account of their learning during the project. |

Total points:
VIDEO PRESENTATION
Points: 3 = excellent, 2 = very good, 1 = satisfactory

Criteria | Description | Points given
Communication | The video is well-paced and communicated, following a clear and logical sequence. |
Illustrative | Demonstrations and/or visuals are used to illustrate examples, where appropriate. |
Accurate language | The video presents accurate science and technology and uses appropriate language. |
Passion | The video demonstrates passion from team members about their chosen topic/idea. |
Sound and image quality | The video demonstrates good sound and image quality. |
Length | The content is presented in the video within a 3-minute timeframe. |

Total points:
