Class 9 AI Supplement 1
CLASS – IX
Total Marks: 100 (Theory-50 + Practical-50)
LEARNING OUTCOMES:
Learners will be able to:
1. Identify and appreciate Artificial Intelligence and describe its applications in daily life.
2. Relate, apply and reflect on Human-Machine Interactions to identify and interact with the three domains of AI: Data, Computer
Vision and Natural Language Processing, and undergo assessment to analyse their progress towards acquired AI-readiness skills.
3. Imagine, examine and reflect on the skills required for futuristic job opportunities.
4. Unleash their imagination towards smart homes and build an interactive story around it.
5. Understand the impact of Artificial Intelligence on Sustainable Development Goals to develop responsible citizenship.
6. Research and develop awareness of skills required for jobs of the future.
7. Gain awareness about AI bias and AI access and describe the potential ethical considerations of AI.
8. Develop effective communication and collaborative work skills.
9. Get familiar and motivated towards Artificial Intelligence and identify the AI Project Cycle framework.
10. Learn problem scoping and ways to set goals for an AI project and understand the iterative nature of problem scoping in the AI
project cycle.
11. Brainstorm on the ethical issues involved around the problem selected.
12. Foresee the kind of data required and the kind of analysis to be done, identify data requirements and find reliable sources to
obtain relevant data.
13. Use various types of graphs to visualize acquired data.
14. Understand types of modelling.
15. Understand the importance of Math for AI.
16. Learn the concept of data literacy and generative AI.
17. Acquire introductory Python programming skills in a very user-friendly format.
SKILLS TO BE DEVELOPED
[Skills diagram: AI Concepts, AI Readiness, Technical Skills for AI, and Life Skills from AI; applying concepts, learning technical skills, and developing life skills through concept building.]
SCHEME OF STUDIES:
This course is a planned sequence of instructions consisting of units meant for developing employability and vocational
competencies of students of Class IX opting for skill subject along with other education subjects.
The unit-wise distribution of hours and marks for class IX & X is as follows.
Employability Skills
10 2
Total 50 10
Theory Practical
Total 160 40
Practical Work
Practical Examination
• Simple programs using input and output function
PART C
Viva Voce 5
Total 35
Project Work / Field Visit / Student Portfolio
15
PART D
Total 15
EMPLOYABILITY SKILLS
UNIT LEARNING OUTCOMES THEORY PRACTICAL
COMMUNICATION SKILLS – I
1. Demonstrate knowledge of various methods of communication
   Theory: 1. Methods of communication • Verbal • Non-verbal • Visual
   Practical: 1. Writing pros and cons of written, verbal and non-verbal communication; 2. Listing do's and don'ts for avoiding common body language mistakes
… helps in building self-confidence
   Theory: … cultural, and physical factors; 2. Self-confidence building tips – getting rid of the negative thoughts, thinking positively, staying happy with small things, staying clean, hygienic and smart, chatting with positive people, etc.
   Practical: 2. Use of positive metaphors/words; 3. Positive stroking on wakeup and before going to bed; 4. Helping others and working for community
4. Demonstrate basic computer skills
   Theory: 1. Primary operations on a computer system – input, process, storage, output, communication networking, etc.
   Practical: 1. Identification of the various input and output units and explanation of their purposes
UNIT: ENTREPRENEURIAL SKILLS
LEARNING OUTCOMES THEORY PRACTICAL
1. Identify various types of business activities
   Theory: 1. Types of businesses – service, manufacturing, hybrid; 2. Types of businesses found in our community (business activities around us)
   Practical: … adopted by small businesses in a local community; 3. Best out of waste; 4. Costing of the product made out of waste; 5. Selling of items made from waste materials; 6. Prepare a list of businesses that provide goods and services in exchange for money
2. Demonstrate the knowledge of distinguishing characteristics of entrepreneurship
   Theory: 1. Meaning of entrepreneurship development; 2. Distinguishing characteristics of entrepreneurship; 3. Role and rewards of entrepreneurship
   Practical: 1. Prepare charts showing advantages of entrepreneurship over wages; 2. Group discussions on role and features of entrepreneurship; 3. Lectures/presentations by entrepreneurs on their experiences and success stories; 4. Identify core skills of a successful entrepreneur
… natural resource conservation
   Theory: … ecosystem and factors causing imbalance; 3. Natural resource conservation; 4. Environment protection and conservation
   Practical: 2. Prepare posters showing environment conservation; 3. Discussion on various factors that influence our environment
2. Describe the importance of green economy and green skills
   Theory: 1. Definition of green economy; 2. Importance of green economy
   Practical: 1. Discussion on the benefits of green skills and importance of green economy; 2. Prepare a poster showing the importance of green economy with the help of newspaper/magazine cuttings
Session: Problem Scoping
Learn problem scoping and ways to set goals for an AI project.
Activity: Brainstorm around the theme provided and set a goal for the AI project.
● Discuss various topics within the given theme and select one.
● Fill in the 4Ws problem canvas and a problem statement to learn more about the problem identified in the community/society.
● List down or draw a mind map of problems related to the selected topic and choose one problem to be the goal for the project.
Session: Modelling
Understand modelling (Rule-based & Learning-based).
● Introduction to modelling and types of models (Rule-based & Learning-based)
Session: Evaluation
Understand various evaluation techniques. Learners will understand the new terms:
● True Positive
● False Positive
● True Negative
● False Negative
Session: Ethics
To understand and reflect on the ethical issues around AI.
Video Session: Discussing AI Ethics
Recommended Activity: Ethics Awareness
● Students play the role of major stakeholders, and they have to decide what is ethical and what is not for a given scenario.
● Students explore the Moral Machine (https://www.moralmachine.net/) to understand more about the impact of ethical concerns.
Session: Acquiring Data, Processing, and Interpreting Data
Learning outcomes:
● Determine the best methods to acquire data.
● Classify different types of data and enlist different methodologies to acquire it.
● Define and describe data interpretation.
● Enlist and explain the different methods of data interpretation.
● Recognize the types of data interpretation.
● Realize the importance of data interpretation.
Theory:
● Types of data
● Data Acquisition/Acquiring Data
● Best Practices for Acquiring Data
● Features of data and Data Preprocessing
● Data Processing and Data Interpretation
● Types of Data Interpretation
● Importance of Data Interpretation
Recommended Activities:
● Trend analysis
● Visualize and Interpret Data
Session: Introduction to Probability
Understand the concept of Probability in real life and explore various types of events.
● How to calculate the probability of an event
● Types of events
● Understand the concept of Probability using a relatable example.
Exercise: Identify the type of event.
Session: Applications of Probability
Probability applications in various real-life scenarios:
● Sports
● Weather Forecast
● Traffic Estimation
Exercise: Revision time
Session:
● Types of Generative AI
● Examples of Generative AI
Session:
● Benefits of using Generative AI
● Limitations of using Generative AI
Recommended Activities:
● Applying Generative AI tools to create content
● Hands-on Activity: GAN Paint
● Generative AI tools
Session:
● Applying Generative AI tools to create content.
● Ethical considerations of using Generative AI
PART-C: PRACTICAL WORK
UNIT 5: INTRODUCTION TO PYTHON: Suggested Program List
PRINT
● To print personal information like Name, Father’s Name, Class, School Name.
● To print the following patterns using multiple print commands-
● Create a list in Python of children selected for a science quiz with the following names: Arjun, Sonakshi, Vikram, Sandhya, Sonal, Isha,
Kartik. Perform the following tasks on the list in sequence-
○ Print the whole list
○ Delete the name “Vikram” from the list
○ Add the name “Jay” at the end
○ Remove the item which is at the second position.
● Create a list num=[23,12,5,9,65,44]
○ print the length of the list
○ print the elements from second to fourth position using positive indexing
○ print the elements from position third to fifth using negative indexing
● Create a list of first 10 even numbers, add 1 to each list item and print the final list.
● Create a list List_1=[10,20,30,40]. Add the elements [14,15,12] using extend function. Now sort the final list in ascending order and
print it.
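A minimal sketch of the list programs above, written in Python 3 (the sample values come from the exercises themselves; output can be checked in any Python interpreter):

# Science quiz list: print, delete, append and remove items
children = ["Arjun", "Sonakshi", "Vikram", "Sandhya", "Sonal", "Isha", "Kartik"]
print(children)                 # print the whole list
children.remove("Vikram")       # delete the name "Vikram" from the list
children.append("Jay")          # add the name "Jay" at the end
children.pop(1)                 # remove the item at the second position
print(children)

# Positive and negative indexing on a list of numbers
num = [23, 12, 5, 9, 65, 44]
print(len(num))                 # length of the list
print(num[1:4])                 # elements from second to fourth position (positive indexing)
print(num[-4:-1])               # elements from third to fifth position (negative indexing)

# First 10 even numbers, with 1 added to each item
evens = [2 * i for i in range(1, 11)]
print([n + 1 for n in evens])

# Extend List_1, then sort it in ascending order and print it
List_1 = [10, 20, 30, 40]
List_1.extend([14, 15, 12])
List_1.sort()
print(List_1)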
IF, FOR, WHILE
● To print sum of first 10 natural numbers
● Program to find the sum of all numbers stored in a list
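A minimal sketch of these two loop programs (the list of numbers in the second program is assumed for illustration):

# Sum of the first 10 natural numbers
total = 0
for i in range(1, 11):
    total = total + i
print("Sum of first 10 natural numbers:", total)

# Sum of all numbers stored in a list
numbers = [4, 11, 7, 32, 9]     # sample list; any values will do
list_sum = 0
for n in numbers:
    list_sum = list_sum + n
print("Sum of the list:", list_sum)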
Important Links
● https://cbseacademic.nic.in/web_material/Curriculum21/publication/secondary/Python_Content_Manual.pdf
● https://drive.google.com/drive/folders/1qRAckDculA5i164OUFDlilxb8mT65MMb
Suggested Projects
2. Choose an issue that pertains to the objectives of sustainable development and carry out the actions listed below.
○ To understand more about the problem identified, create a 4Ws problem canvas.
○ Identify the data features and create a system map to understand the relationships between them
○ Visualize the data collected graphically (spreadsheet software to be used to store and visualize the data)
Visit an industry, an IT company or any other place that is creating or using AI applications and present a report on the visit.
The visit can be in physical or virtual mode.
Maintaining a record of all AI activities and projects (for example, Letter to Future Self, Smart Home Floor Plan, Future Job
Advertisement, Research Work on AI for SDGs and AI in Different Sectors, 4Ws canvas, System Map). (Minimum 5 Activities)
Content from Existing Book
UNIT-1
AI REFLECTION, PROJECT CYCLE AND ETHICS
Learning Outcomes
Artificial Intelligence (AI) is the simulation of human intelligence by machines, particularly computer systems. It
involves algorithms that enable computers to perform tasks such as learning, reasoning, problem-solving, and
understanding language. AI systems can improve their performance over time through machine learning. AI is
used in various applications, from virtual assistants to autonomous vehicles.
Project Cycle
The AI project cycle is the process of solving real-life problems by converting them into an AI-based model. The
project cycle framework is designed to help project managers guide their projects successfully from start to end.
The purpose of the project life cycle is to create an easy-to-follow framework to guide projects. The AI project
cycle provides us with an appropriate framework which can lead us towards our goal.
[AI Project Cycle: Problem Scoping → Data Acquisition → Data Exploration → Modelling → Evaluation → Deployment]
Reboot
1. List down the steps of AI Project Cycle.
Reboot
1. Explain the term Problem scoping in AI project Cycle.
Importance of Evaluation
Evaluation is a process that critically examines a program. It involves collecting and analysing information about
a program’s activities, characteristics, and outcomes. Its purpose is to make judgments about a program, to
improve its effectiveness, and/or to inform programming decisions. Following are some of the advantages
of evaluating a model:
•• Evaluation ensures that the model is operating correctly and optimally.
•• Evaluation is an initiative to understand how well it achieves its goals.
•• Evaluations help to determine what works well and what could be improved in a program.
An AI-based prediction model is deployed in schools. The model is supposed to predict whether the students
of grade 12 will be taking board exams in the coming year or not.
There are two important parameters which are used for the Evaluation of a model. These are:
•• Prediction: It is the output given by the AI model using a Machine Learning algorithm.
•• Reality: It is the real scenario of the situation for which the prediction has been made.
Let’s look at the various combinations that can be considered for the above scenario.
Due to COVID-19 things are becoming unpredictable. Even the conducting of the board exams totally depends
on the number of active cases. We need an AI model which can predict whether the board exams will be
conducted or not so that the students can timely plan their preparation and schedule to study as per the date
sheet.
Consider the possibility of board exams for grade 12 students. The model predicts a Yes, which means the board
exams will be conducted. The prediction matches the reality (Yes); therefore, this
condition is called True Positive.
Prediction: No, Reality: No (True Negative)
There are no board exams as the number of COVID-19 cases has increased, hence the Reality is No. In this case, the
machine too has predicted it correctly as a No. Therefore, this condition is termed as True Negative.
Here the reality is that there are no board exams to be conducted as they got cancelled due to the number of
COVID-19 cases increasing drastically. But the machine has incorrectly predicted that there will be board exams
for the students of grade 12. This case is termed as False Positive.
Here, the board has decided to conduct examinations for grade 12 students because of which the Reality is Yes
but the machine has incorrectly predicted it as a No which means the machine predicts that there will be no
board exams. Therefore, this case becomes False Negative.
Confusion Matrix
                     Reality: Yes            Reality: No
Prediction: Yes      True Positive (TP)      False Positive (FP)
Prediction: No       False Negative (FN)     True Negative (TN)
Reboot
1. Differentiate between Prediction and Reality.
•• False Positive Rate is the proportion of actual negative cases that are incorrectly classified as positive.
FPR = FP / (FP + TN)
[ROC curve: TP Rate (y-axis, 0 to 1) plotted against FP Rate (x-axis, 0 to 1), showing the TP vs FP rate at one decision threshold and at another decision threshold.]
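To illustrate how these terms can be counted in practice, the sketch below compares a hand-made list of predictions with reality and computes the False Positive Rate; the sample lists are assumptions for illustration, not data from the text:

# Hypothetical predictions and reality for the board-exam scenario ("Yes" = exams held)
prediction = ["Yes", "No", "Yes", "No", "Yes", "No"]
reality    = ["Yes", "No", "No",  "No", "Yes", "Yes"]

TP = FP = TN = FN = 0
for p, r in zip(prediction, reality):
    if p == "Yes" and r == "Yes":
        TP += 1      # True Positive
    elif p == "Yes" and r == "No":
        FP += 1      # False Positive
    elif p == "No" and r == "No":
        TN += 1      # True Negative
    else:
        FN += 1      # False Negative

FPR = FP / (FP + TN)  # False Positive Rate = FP / (FP + TN)
print("TP:", TP, "FP:", FP, "TN:", TN, "FN:", FN, "FPR:", FPR)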
The AI Project Cycle Mapping Template presents how the different stages are related to each other and how the
functions performed in every phase form an input for the next phase.
The task performed at one stage forms the root for the next stage.
AI Project: Customer Churn Prediction (identifying at-risk customers who are likely to cancel their subscriptions
or close/abandon their accounts.)
•• Problem Scoping:
✶✶ Identify the problem: The telecommunications company wants to reduce customer churn rates.
•• Data Acquisition:
✶✶ Gather data sources: Collect customer demographics, usage patterns, service history, and churn status data
from the company's databases.
✶✶ Ensure data quality: Clean the data, handle missing values, and remove duplicates.
•• Data Exploration:
✶✶ Explore the data: Analyse customer demographics, usage patterns, and churn rates through visualisations
and statistical summaries.
✶✶ Preprocess data: Simplify numerical features, convert categorical variables, and create new metrics like
customer tenure.
•• Modelling:
✶✶ Select techniques: Choose machine learning algorithms suitable for classification tasks, such as logistic
regression, decision trees, and random forests.
✶✶ Train models: Use the prepared data to train multiple models, adjusting hyperparameters and performing
cross-validation to optimise performance.
•• Evaluation:
✶✶ Evaluate models: Assess the performance of each model using metrics like accuracy, precision, recall, and
F1-score.
✶✶ Compare models: Compare the performance of different models to select the best-performing one for
deployment.
Problem Scoping: The telecommunications company wants to reduce customer churn rates.
Data Acquisition: Gather customer demographics, usage patterns, service history, and churn status data from company databases.
Data Exploration: Analyse customer demographics, usage patterns, and churn rates with visualisations and statistical summaries.
Modelling: Select machine learning algorithms for classification, like logistic regression, decision trees, and random forests.
Evaluation: Evaluate each model's performance using accuracy, precision, recall, and F1-score.
Deployment: Integrate the model to predict new customer churn risk.
Examples: Ethics include medical ethics, business ethics and legal ethics; morals include personal beliefs about honesty, integrity and kindness.
Enforcement: Ethics are enforced by external bodies (e.g., professional organisations, legal systems); morals are self-governed and enforced by individual conscience.
Flexibility: Ethics can change over time to reflect new norms or societal changes; morals are more stable over time, but can evolve with personal growth.
[AI ethics principles: Inclusion, Human Rights, Privacy, and Bias.]
By adhering to these AI ethics principles, developers and organisations can contribute to the creation of AI
solutions that are not only technically robust but also ethically sound, socially responsible, and aligned with the
values and interests of society.
At a Glance
•• AI project cycle is the process of converting the real-life problem into an AI-based model.
•• Data exploration is the first step of Data Visualisation. It refers to the techniques and tools used to visualise data
through complex statistical methods.
•• Evaluation: This stage is the testing of the system, where we check if the model can achieve required goals or not.
•• An AI project cycle is essential because it provides a structured framework for developing, deploying, and maintaining
AI systems.
•• Model Evaluation is the last stage of the AI Project development cycle.
•• Evaluation of an AI model can be done using various terminologies. It is used to determine the best model for
deployment.
•• Confusion matrix is a tabular structure which helps in measuring the performance of an AI model using the test
data.
•• The Receiver Operating Characteristic (ROC) curve is a graphical representation that illustrates the diagnostic ability
of a binary classifier system as its discrimination threshold is varied.
•• The deployment phase of the AI project cycle is when the AI model is put into use in a real-world setting.
•• AI Project Cycle Mapping Template presents how different stages are related with each other and how the functions
performed in every phase forms an input for the next phase.
•• Ethics and morals are related concepts often used interchangeably, but they have distinct meanings and applications.
4. What is the primary purpose of the confusion matrix while evaluating an AI model?
a. To measure the execution time of the model
b. To identify the best algorithm for the model
c. To measure the performance of an AI model using the test data
d. To optimise the hyper parameters of the model
Unsolved Questions
SECTION A (Objective Type Questions)
Quiz
A. Tick ( ) the correct option.
1. What is covered by the ethical principle in terms of handling personal data?
a. Privacy b. Consent
c. Transparency d. Data Security
2. Which ethical principle places the most focus on disclosing information about data collection methods and making it
clear what data is collected and how it will be used?
a. Keeping data collection practices secret
b. Collecting data without informing individuals
c. Obtaining clear and explicit permission from individuals before collecting or using their data
d. Sharing data with third parties without notification
3. Which of the following is NOT a fundamental human right that AI solutions should respect?
a. Freedom of expression b. Right to a fair trial
c. Right to own slaves d. Privacy
2. The ………………………. curve is a graphical representation that illustrates the diagnostic ability of a binary classifier system
as its discrimination threshold is varied.
3. Protecting individuals' personal data and their right to control how it's used is a core principle of AI ethics called
………………………. .
4. Ensuring that AI solutions are accessible and beneficial for all members of society, regardless of background, is referred
to as ………………………. .
2. The capacity of an organisation to collect and use personal data without harming confidentiality is referred
to as privacy in data ethics. ……….……
3. Data security is the use of strong security measures to prevent unauthorised access and breaches of
personal data. ……….……
4. AI systems should only be designed by human rights experts to ensure they are ethical. ……….……
2. Why are tools for recording, reporting, and monitoring so important throughout the AI project's deployment phase?
In Life
A robotic vacuum cleaner, sometimes called a robovac or a roomba as a generic trademark, is an autonomous robotic
vacuum cleaner which has a limited vacuum floor cleaning system combined with sensors and robotic drives with
programmable controllers and cleaning routines. Do robot mops work? Can they be replaced by our conventional house
helps?
Critical Thinking
Computation Lab
Answers
Quiz
1. c 2. b 3. b 4. c
Exercise
B. 1. AI project cycle 2. data acquisition 3. Data exploration 4. Model Evaluation
C. 1. True 2. False 3. True 4. True
3. List any two examples of True Positive cases that exist in our day-to-day life.
4. Explain the term "Inclusion" as one of the principles of AI Ethics. Give an example.
UNIT-2
DATA LITERACY
Learning Outcomes
Data refers to any collection of raw facts, figures, or statistics that can be stored and processed by a computer.
It can be in different forms like numbers, text, images, audio, and video etc.
Data literacy means knowing how to understand, work with, and talk about data. It's about being able to
collect, analyse, and show data in ways that make sense.
Data literacy is essential because it enables individuals to make informed decisions, think critically, solve
problems, and innovate.
Data Literacy
Video Session
Scan the QR code or visit the following link to understand Data Literacy:
https://www.youtube.com/watch?v=yhO_t-c3yJY
1. What are the takeaways from the given video?
2. What is the definition of data and data literacy, and how do these concepts impact our
ability to understand and use information effectively?
The Data Pyramid is a conceptual model that illustrates the hierarchical structure of data processing, depicting
the progressive transformation of raw data into actionable wisdom. It starts with raw data, which initially has
no use. Through processing and analysis, this data evolves into meaningful information, then knowledge, and
ultimately wisdom. This transformation enables informed decision-making and a deeper understanding of the
world around us.
[Data Pyramid: Data (e.g., "Red, Traffic_Light_1") answers When? and Where?; Information answers Who? and What?; Knowledge answers How?; Wisdom answers Why?]
Let us see another example of Creating a Data Pyramid for planning a picnic as a student after receiving the
official letter from school:
Information: Researching the picnic location to learn about its amenities, facilities, and nearby attractions;
checking the weather forecast for the picnic day.
Task
Can you try another example of creating a Data Pyramid for preparing a speech on the occasion of “Technology
Day”?
Wisdom
Knowledge
Information
Data
Data literacy can equip individuals with skills and knowledge to improvise in a tech and data driven world. There
are countless reasons why data literacy is critical to an organisation. Some of these are:
● Data literacy enhances decision making ability in individuals
based on evidence. Based on sources of data, emerging trends
and interpretations, individuals can make decisions that are data-
driven.
● Data literacy cultivates critical thinking skills to understand and
explore data's implications by questioning assumptions, reaching
logical conclusions, identifying patterns, and evaluating evidence
and data accuracy.
Task
Let us do an activity to understand the impact of Data Literacy
Impact of News Articles (Select any trending news)
Session Preparation Logistics:
For a class of 40 Students [Pair Activity]
Materials Required:
Item Quantity
Online Data Sources Clues NA
Computers 20
Purpose:
The purpose of this activity is to engage participants in various scenarios that involve collecting data and
analysing its sources. Emphasising the importance of validating data sources, the aim is to instill the concept
of data literacy. By understanding how authentic data sources contribute to reliable and unbiased decision
making, participants will develop critical skills for navigating and interpreting data effectively.
Brief: [Pair Activity]
Participants will search the internet for data sources, extracting key information to support their decisions.
Author of the Source | Weblink to the Source | How was the situation described by the Source | Key figures in the source
You have to rank the sources of the news articles from most accurate to least accurate, and state reasons for your choice.
So, we can conclude that every dataset tells a story, but we must be careful before believing the story.
A data literate person is someone who can interact with data to understand the world around them and derive meaningful
information from data. Some key points that help you to become data literate are as follows:
•• Data identification and sourcing: Identify the source of data to find whether the data is reliable or not.
•• Understand the basics: Learn the concepts of data, types of data and how it can be used as not all data is
suitable for every kind of analysis.
•• Learn data analysis tools: There are many data analysis apps available that can be learned in order to understand
the impact of the right data. Analysis involves using statistical tools or software to interpret the data. This can range from
calculating simple averages to more complex tasks.
•• Gain statistical knowledge: Statistics play a vital role in data literacy. It is one of the vital components that must
be learned before you dive into the data-driven world (a small Python example follows this list).
•• Use data visualisation: Understand data visualisation techniques such as graphics and charts. Tools like Tableau,
Matplotlib and Python can be used effectively.
•• Learn data manipulation: Understanding how to manipulate data to meet the requirements is also one of the
key factors. Methods like filtering, sorting, grouping and omitting are essential for extracting insights from large
data sets.
•• Practise data cleaning: Learning to remove data redundancy and data inaccuracy is essential to be data literate.
This may include dealing with missing values, removing outliers, or transforming data into a format suitable for
analysis.
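A minimal sketch of the simple statistical measures mentioned above, using Python's built-in statistics module (the marks are assumed values chosen only for illustration):

import statistics

marks = [56, 72, 64, 89, 45, 78, 66, 91, 58, 70]   # hypothetical marks of ten students
print("Mean:", statistics.mean(marks))
print("Median:", statistics.median(marks))
print("Standard deviation:", round(statistics.stdev(marks), 2))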
Task
Let us do an activity:
Scenario: Buying a video game online.
Data literacy helps people research about products while shopping over the Internet.
How do you decide the following things when we are shopping online?
1. Which is the cheapest product available?
Here are the typical levels of awareness in a Data Literacy Process Framework:
Plan
Planning sets a clear roadmap and structured approach to enhance data
literacy across different levels of awareness. You can:
● define the specific and measurable goals.
● develop a timeline and milestones for achieving these goals.
● identify and allocate resources needed (e.g., budget, tools, personnel).
Communication
Clear and consistent communication about data literacy will ensure an
efficient data literacy framework within an organisation. This involves:
● designing a well formulated communication plan to explain the
purpose of the goal and requesting commitment from the team
towards it.
● sharing stories and case studies that demonstrate the positive
impact of data literacy. It will help in motivating and encouraging
the team.
● monitoring the effectiveness of communication efforts continuously and making adjustments as needed. It will
help in minimising the risk of any associated costs.
Assess
Introducing participants to data literacy assessment tools and finding
out how comfortable they are with data is crucial in a data literacy
program. This process will help to:
● see what skills they already have
● identify what they need to learn
● create customised training plans for them
Prescribe Learning
By implementing a prescriptive learning approach, organisations
can provide a set of diverse resources that align with individual
learning styles. This approach ensures that there is:
● customised learning journeys tailored to different people (for example, different educational backgrounds), based
on individual needs and preferences.
● a variety of learning materials that cater to different learning styles and help in easier grasping of concepts.
● enough leverage or advantage to the learners to progress at their own pace, accommodating their schedules
and learning speeds.
● an environment that makes learners feel comfortable while gaining new skills, supports continuous learning and
encourages self-directed exploration.
● each participant can choose the materials and methods that work best for them, leading to more effective
learning and greater improvement in data literacy skills over time.
Evaluate
Designing an evaluation metric for the data literacy program involves
creating a structured framework to assess participants’ progress and
the effectiveness of the program overall. It helps to:
● improve participants’ overall data literacy skills.
● establish clear criteria to measure the success of the data literacy
program and individual participant growth.
● establish a schedule for assessing participant progress to monitor
their development over time.
•• Learning
✶✶ Learning is the initial stage where individuals acquire new knowledge and skills related to data literacy.
✶✶ Individuals engage in various learning activities such as formal training sessions, online courses, reading
materials, and hands-on workshops to gain insights into data concepts, tools, and methodologies.
•• Application
✶✶ Application involves putting acquired knowledge and skills into practice in real-world contexts.
✶✶ Individuals apply what they have learned to analyse real datasets, solve data-related problems, and make
informed decisions.
✶✶ They are engaged in data projects, experiments, or simulations to gain practical experience and develop a
deeper understanding of data concepts.
•• Refinement
✶✶ Refinement focuses on reflecting on past experiences, identifying areas for improvement, and enhancing
data literacy skills over time.
✶✶ Feedback from peers, mentors, supervisors, and outcomes of data-related activities informs the refinement
process, guiding individuals to adjust their practices accordingly.
Reboot
1. List down the levels of Data Literacy Process Framework.
Data Security: It is protecting data from attackers who might want to misuse it.
Terms of Service
● Access Controls: Access controls refer to the security measures and protocols to
restrict access to sensitive data, ensuring that only authorised individuals or
entities can view, modify, or interact with it. This reduces the risk of unauthorised
access by limiting the number of users who can interact with sensitive data.
•• Data masking: It obscures data so that, even if criminals exfiltrate it, they
can't make sense of what they stole. Unlike encryption, which uses encryption
algorithms to encode data, data masking involves replacing legitimate data
with similar but fake data. This data can also be used by the company in
scenarios where using real data isn't required, such as for software testing or
user training (a small illustration follows this list of measures).
•• Training: Corporates must take up regular data security sessions
for their staff to sensitise them about following the data protection
processes being implemented and the importance of doing so.
Making them conscious of suspicious emails and links that they might
receive, not leaving their devices unlocked when unattended, keeping
software up to date and not sharing passwords are some of the
things that can be taken up.
•• Audits and Testing of Security Systems: Regular audits and testing of security policies, integrated malware
protection, firewalls, Wi-Fi connection security, hardware-based security, checking application security, email
security and compliance also play a very important role in maintaining data privacy and providing data security.
•• Other Basic Preventions: Being aware of surroundings and threats from insiders, and complying with security
regulations which might be shared by entrusted agencies or bodies which track online cyber activities across
the world, are a few other ways to provide cyber security.
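As a small illustration of the data-masking idea described above, the sketch below replaces real values with similar-looking fake ones; the record, field names and masking rules are assumptions chosen only for illustration:

import random

def mask_phone(phone):
    # Replace all but the last two digits with random digits
    fake_part = "".join(str(random.randint(0, 9)) for _ in range(len(phone) - 2))
    return fake_part + phone[-2:]

def mask_name(name):
    # Keep the first letter and hide the rest
    return name[0] + "x" * (len(name) - 1)

record = {"name": "Sonal", "phone": "9876543210"}
masked = {"name": mask_name(record["name"]), "phone": mask_phone(record["phone"])}
print(masked)    # the masked record still looks like real data but is fake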
Data Security in AI
AI systems often rely on vast amounts of data for training and operation. Unauthorised access and tampering
could lead to inaccurate AI models and compromised outcomes. Many AI applications process sensitive data,
such as personal, financial, or health-related information. Strong data security measures can stop data breaches
and unauthorised access.
Data Privacy in AI
Data privacy is central to the ethical use of AI. This ensures that AI systems comply with data privacy laws and
regulations (such as GDPR, CCPA) to help protect individuals’ rights and maintain public trust. AI systems must
ensure that data is collected, shared, and used in ways that users have explicitly consented to, maintaining
transparency and trust.
Cyber security involves protecting computers, servers, mobile devices, electronic systems, networks, and data from
harmful attacks. The best practices for cyber security are constantly evolving to keep up with the cyber threats.
Reference Links:
Video Session
Scan the QR code or visit the following link to understand Internet safety tips for online
security:
https://www.youtube.com/watch?v=aO858HyFbKI
What are some key takeaways from the video session on online security, specifically
regarding best practices for protecting personal information?
Task
Refer to the given link and answer the following questions:
https://www.cbse.gov.in/cbsenew/documents/Cyber%20Safety.pdf
1. What are the key online security practices discussed in the PDF that can help protect personal information
from cyber threats?
2. What essential digital etiquette guidelines were highlighted in the PDF to ensure respectful and effective
online communication?
Do's:
● Use strong, unique passwords with a mix of characters for each account.
● Activate Two-Factor Authentication (2FA) for added security.
● Download software from trusted sources only and scan files before opening.
● Prioritise websites with "https://" for secure logins.
● Keep your browser, OS, and antivirus updated.
● Adjust social media privacy settings for limited visibility to close contacts.
● Always lock your screen when away.
● Connect only with trusted individuals online.
● Use secure Wi-Fi networks.
● Report online bullying to a trusted adult immediately.
● Do use privacy settings on social media sites to restrict access to your personal information.
Don'ts:
● Avoid sharing personal info like real name or phone number.
● Don't send pictures to strangers or post them on social media.
● Don't open emails or attachments from unknown sources.
● Ignore suspicious requests for personal info like bank account details.
● Keep passwords and security questions private.
● Don't copy copyrighted software without permission.
● Avoid cyberbullying or using offensive language online.
● Don't respond to phone calls or emails asking for confidential data.
● Don't leave wireless or Bluetooth turned on when not in use.
Reboot
1. How are Data security and Data Privacy related?
2. List down the practices that can help you ensure data privacy.
Types of Data
In statistics, various types of data are gathered, analysed, interpreted, and presented. These data consist of
individual factual pieces recorded for analysis. Data analysis involves interpretation and presentation, producing
statistics to get some meaningful insight from that data. Data classification and handling are crucial processes
that use multiple tags and labels to define data, ensuring its integrity and confidentiality. Artificial Intelligence is
crucial, with data serving as its foundation. We come across different types of data and information every day.
[Data classification: Data is divided into Textual and Numeric data; Numeric data is further divided into Continuous and Discrete data.]
Data can be broadly classified under Textual data and Numeric Data as explained.
Nominal Data
It consists of categories or names that cannot be ordered or ranked. Nominal data is often used to categorize
observations into groups, and the groups are not comparable. Examples of nominal data include gender (Male
or female), and blood type (A, B, AB, O).
Ordinal Data
It consists of categories that can be ordered or ranked. Ordinal data is often used to measure opinions, where
there is a natural order to the responses. Examples of ordinal data include education level (Elementary, Middle,
High School, College), job position (Manager, Supervisor, Employee), etc.
Examples of numeric data: 100%, 1:3, 123. Examples of textual data: loud behaviour, fair skin, soft quality, and more.
Numeric data can be further classified as Continuous Data and Discrete Data:
Continuous Data:
● Continuous data can take any numeric value within a given range.
● Continuous data is measurable.
● This type of data can be infinitely subdivided and often includes decimal points.
● It is often analysed using statistical techniques such as mean, median, standard deviation, and correlation.
● Examples: dimensions of a classroom, height, weight, temperature, time, etc.
Discrete Data:
● Discrete data refers to distinct single values. It consists of whole numbers without decimal parts that represent distinct categories or values.
● Discrete data is countable.
● Discrete data cannot be subdivided meaningfully.
● It is analysed using frequency distributions, bar charts, and probability distributions.
● Examples: number of girls and boys in a class, number of subjects in class 9, count of anything.
Computer Vision
Computer Vision is a field of artificial intelligence (AI) that uses machine learning and neural networks to teach
computers to derive meaningful information from digital images, videos and other visual inputs. It is like giving
eyes to computers. It helps them look at pictures and videos from the real world and understand what they’re
seeing. With Computer Vision, computers can figure out what’s in a picture or video, just like we do. They can
recognise objects, people, and even actions happening in videos.
Types of data used in Computer Vision include:
● Image Data: Digital images captured by cameras or satellite imagery, medical scans, and surveillance footage.
● Video Data: Video data captured using a camera
Machine Learning
Machine Learning is like teaching computers to learn from examples and make decisions on their own. Imagine if
you showed a computer lots of pictures of dogs and cats, and you told it which ones were dogs and which ones
were cats. After seeing many examples, the computer learns to tell dogs and cats apart on its own. Types of data
used in Machine Learning include:
● Numeric Data: Data taken from tables, Excel sheets, etc.
[Bar chart: example data volumes for the years 2019 to 2022 across CV (Computer Vision), NLP (Natural Language Processing) and SD (Statistical Data).]
Task
Let us now do an exercise to categorise the given data as Textual Data (Qualitative Data) or Numeric Data
(Quantitative Data):
Temperature
Gender
Shoe size
Comment on social media
Favourite colour
Newspaper article
Population number in a state
Email
Heart rate
Weight of a person
Data acquisition, also known as acquiring data, refers to the procedure of gathering data like raw facts, figures or
statistics from relevant sources either for reference or for analysis needed in AI projects. This involves searching
for datasets suitable for training AI models. The process typically comprises three key steps and plays a crucial
role in obtaining and preparing data for analysis.
1. Data Discovery: searching for new datasets
2. Data Augmentation: adding more data to the existing data
3. Data Generation: generating data if data is not available
Let’s say we want to collect data for making a CV model for a self-driving car.
The three key steps involved in Data Acquisition are given below:
Step 1: Data Discovery
Data discovery is about hunting for valuable information in different places, checking if it’s good quality, and
making sense of what we find. In the above example:
● We will require pictures of roads and the objects on roads.
● We can search and download this data from the Internet.
Step 2: Data Augmentation
Data augmentation is the process of increasing the
amount and diversity of data. We do not collect
new data, rather we transform the already present
data. Data augmentation means increasing the
amount of data by adding copies of existing data
with small changes. The image given here does not
change, but we get data on the image by changing
different parameters like colour, rotation, flipping and
brightness. New data is added by slightly changing
the existing data.
In the above example:
● We apply flipping and rotation transformation to create variations of the original images.
● We also simulate occlusions such as objects partially blocking the view to train the model to handle obstructed
scenarios.
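A minimal sketch of the flipping and rotation transformations described above, assuming the Pillow imaging library is installed and an image file named road.jpg exists in the working folder (both are assumptions for illustration only):

from PIL import Image, ImageOps

original = Image.open("road.jpg")       # an existing road image (file name assumed)
flipped = ImageOps.mirror(original)     # horizontal flip creates a new training sample
rotated = original.rotate(15)           # a small rotation creates another sample
flipped.save("road_flipped.jpg")
rotated.save("road_rotated.jpg")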
Step 3: Data Generation
Data generation refers to generating or recording data using sensors. Recording temperature readings of a building
is an example of data generation. Recorded data is stored in a computer in a suitable form.
In the above example of the self-driving car, data generation is done by creating simulated driving scenarios with
different road conditions, traffic patterns, weather, and lighting to cover many possible situations.
Task
Visualise that you are in a big, mysterious forest and searching for hidden treasure.
Write four observations you will be making for finding the treasure and categorise them under the
heads viz: Data discovery, Data augmentation and Data generation.
Observations Categories
1. …………………………..……..………………………..……....……....…….. …………………………..……..………………………..……....……....……..
2. …………………………..……..………………………..……....……....…….. …………………………..……..………………………..……....……....……..
3. …………………………..……..………………………..……....……....…….. …………………………..……..………………………..……....……....……..
4. …………………………..……..………………………..……....……....…….. …………………………..……..………………………..……....……....……..
Some popular sources of datasets:
• UCI Machine Learning Repository: a collection of databases, domain theories, and data generators, in collaboration with the University of Massachusetts.
• .gov datasets: countries like Australia, EU, India, New Zealand, and Singapore are openly sharing datasets on various portals.
• Kaggle: an online community of data scientists where you can access different types of data.
• Google Dataset Search: a toolbox by Google that can search for data by name.
Usability of Data
Let's take an example of completing a school project. You need clear instructions, a neat workspace, and accurate
information. Similarly, using data effectively relies on its clarity, organisation, and accuracy. There are three
primary factors determining the usability of data:
1. Structure of Data: Defines how data is stored. Data needs to have a clear structure. It should be organised
in a way that makes sense so that it can be used effectively.
Like when your mother starts cooking your favourite food she ensures before cooking that all ingredients
are available and are put in order for smooth and organised cooking.
For example:
Marks of students arranged in a spreadsheet.
3. Accuracy: Accuracy is the same as reliability; it indicates how well the data matches real-world values. Accurate
data closely reflects actual values without errors, enhancing the quality and trustworthiness of the dataset.
When your measurement is accurate, it makes your data really good. It’s like having a gold star on your
homework—it shows you did a great job!
In the example given below, we are comparing data gathered for measuring the weight of 12 eggs in a box in
grams.
Task
Open the website https://www.kaggle.com. Kaggle is like a playground for data enthusiasts! It’s an online
platform where people from all over the world come together to play with data, learn new things, and
compete in data science competitions.
Do this
The Titanic competition on Kaggle is a classic and beginner-friendly challenge that introduces you to the
basics of data analysis and machine learning. The goal is to predict whether a passenger survived the Titanic
shipwreck based on factors like age, gender, ticket class, and more.
Explore:
Kaggle provides tutorials and notebooks to help you get started with the Titanic competition. You can find
them under the “Notebooks” tab on the competition page.
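A minimal sketch of a first attempt at this competition, assuming the pandas and scikit-learn libraries are installed and Kaggle's train.csv file has been downloaded (the column names follow the public Titanic dataset):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

data = pd.read_csv("train.csv")                           # Kaggle's Titanic training file (path assumed)
data["Age"] = data["Age"].fillna(data["Age"].median())    # fill missing ages with the median age
data["Sex"] = data["Sex"].map({"male": 0, "female": 1})   # convert text categories to numbers

X = data[["Pclass", "Sex", "Age"]]                        # a few simple input features
y = data["Survived"]                                      # what we want to predict

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = DecisionTreeClassifier().fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))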
Features of Data
Data features are also called the characteristics or properties of the data. They describe each piece of information
in a dataset. They define what each data point represents and help us make sense of the data. For example,
● In a table of student records, features could include things like the student’s name, age, or grade.
● In a photo dataset, features might include properties like the colour present in each image, the resolution,
brightness, or the presence of certain objects.
These features help us understand and analyse the data. In AI models, we need two types of features: Independent
and Dependent.
Independent
Independent variables (sometimes called predictor variables) are those that are used to generate predictions
about or to account for the variation in the dependent variable (the goal). These features are the input to the
model—they’re the information we provide to make predictions.
Dependent
The dependent variable is the variable about which predictions or explanations are being sought. These features
are the outputs or results of the model—they’re what we’re trying to predict. For example, imagine we’re
building an AI model to predict students’ final exam grades based on various factors. The independent features
would include:
[Example: student inputs such as study hours, attendance percentage and hobby (e.g., 2 hr, 80%, Badminton; 3 hr, 90%, NaN) go into the AI model, which outputs grades such as B or A+.]
•• The independent variable is the cause. Its value is independent of other variables in your study.
•• The dependent variable is the effect. Its value depends on changes in the independent variable.
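A minimal sketch of the exam-grade example above, showing how the independent features (inputs) and the dependent feature (the grade to be predicted) could be arranged; all values are assumed for illustration:

# Each row holds independent features: [study hours per day, attendance percentage]
X = [[2, 80],
     [3, 90],
     [1, 60],
     [4, 95]]

# Dependent feature: the final exam grade for each student
y = ["B", "A+", "C", "A+"]

# An AI model would learn the relationship from X to y; here we simply pair them up
for features, grade in zip(X, y):
    print("Inputs:", features, "-> Grade:", grade)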
Data Preprocessing
Data preprocessing is an essential phase in the machine learning process that prepares datasets for effective
machine learning applications. It is the process of detecting and correcting (or removing) corrupt or inaccurate
records from a dataset. It includes multiple processes to clean, transform, reduce, integrate, and normalise data.
[Data preprocessing steps: Data Cleaning, Data Transformation, Data Reduction, Feature Selection, and Data Integration & Normalisation.]
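A minimal sketch of two of these steps (data cleaning and normalisation) using the pandas library; the tiny dataset and column names are assumptions for illustration:

import pandas as pd

# A small dataset containing a duplicate row and a missing value (assumed for illustration)
df = pd.DataFrame({
    "name":  ["Arjun", "Isha", "Isha", "Kartik"],
    "marks": [56, None, None, 91],
})

df = df.drop_duplicates()                               # remove the duplicate record
df["marks"] = df["marks"].fillna(df["marks"].mean())    # fill the missing value with the mean

# Normalise marks to the 0-1 range
df["marks_scaled"] = (df["marks"] - df["marks"].min()) / (df["marks"].max() - df["marks"].min())
print(df)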
● How many pink round candies are there in the image?
Data Processing
Data processing involves tasks to refine raw data for analysis or application, including cleaning, organising,
transforming, and summarising information.
● It ensures data accuracy, relevance, and accessibility for effective decision-making and analysis.
● It is crucial across various sectors like business, science, and technology, facilitating better utilisation of data
assets.
● Data processing helps computers understand raw data.
● Use of computers to perform different operations on data is included under data processing.
Data Interpretation
Data interpretation is the process of making sense of data by analysing it to uncover patterns, trends, and
insights. It involves examining the data to understand its meaning, implications, and significance, helping to
inform decision-making and draw conclusions.
● It is the process of making sense out of data that has been processed.
● The interpretation of data helps us answer critical questions.
These steps make sure that working with data is organised, complete, and useful, so that organisations can
make smart choices based on the data.
Data Interpretation is the process of making sense out of a collection of data that has been processed. This
collection may be present in various forms like bar graphs, line charts and tabular forms and other similar forms.
[Data Interpretation Methods: Quantitative Data Interpretation and Qualitative Data Interpretation.]
● Longitudinal Studies: A type of study conducted over a long time
● Survey: Surveys can be conducted for a large number of people to collect quantitative data.
Task
Let’s do a small activity based on Identifying trends.
•• Visit the link: https://trends.google.com/trends/?geo=IN (Google Trends)
•• Explore the website
•• Check what is trending in the year 2024 – Global
✶✶ Make a list of trending sports (top 5)
✶✶ Make a list of trending movies (top 5)
•• Check what is trending globally in the year 2024
3. Set a code to the data collected: Assign labels or codes to different parts of the data to identify themes,
patterns, or categories. For example, the researcher reads through the interview transcripts and highlights
sections discussing “ease of use,” “technical issues,” and “benefits of the app,” tagging them with corresponding
codes.
4. Analyse your data for insights: Examine the coded data to identify deeper patterns, relationships, and
insights. For example, The researcher groups codes related to “ease of use” and “technical issues” into a
broader theme of “user experience” and analyses how these themes impact overall user satisfaction with the
app.
5. Reporting on insights derived from analysis: Present the findings clearly, using quotes and visual aids to
support your conclusions and recommendations. For example, the researcher writes a report highlighting
the main themes along with positive and negative feedback.
Difference Between Qualitative and Quantitative Data Interpretation
Qualitative Data Interpretation: Categorical; provides insights into feelings and emotions; answers how and why.
Example question: Why do students like attending online classes?
Quantitative Data Interpretation: Numerical; provides insights into quantity; answers when, how many or how often.
Example question: How many students like attending online classes?
Task
Word Puzzle
Instructions:
•• Partner with a person to play the game.
•• There will be three rounds of Word puzzle.
•• After 3 rounds, answer the questions given on the next slide.
Now answer the following questions:
•• Who won round one?
•• Who won round two?
•• Who won round three?
If you answered any of the above questions, you collected data!
Textual DI
Data is put into words, like in a paragraph, which works well for small amounts of data that can be easily
understood. But for larger amounts, this type of presentation may not be the best because it can get too
complicated. For instance, a paragraph might describe how a company's sales went up in the first quarter, and
how many units of each product they sold, as well as improvements in customer satisfaction.
Tabular DI
Data is organised systematically in rows and columns within a table, facilitating structured representation. In the
example given below, the title of the table, "Students Marks Analysis," provides a descriptive overview of the
table's content, summarising the analysis of student marks within the table.
Graphical DI
Some of the graphs include bar graphs, line graphs, pie charts, and scatter plots, which help in visualising trends,
relationships, and distributions within the data.
Bar Graphs
In a Bar Graph, data is represented using vertical and horizontal bars.
Pie Charts
Pie charts resemble pies, with each slice representing a portion of the
whole pie assigned to different categories. These circular charts are divided
into sections, and the size of each section corresponds proportionally to its
value within the dataset.
Line Graphs
A line graph connects data points to illustrate changes in quantity over time, aiding in visualising trends and patterns.
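A minimal sketch of these three graph types using the Matplotlib library mentioned earlier in this unit; the values are assumed for illustration:

import matplotlib.pyplot as plt

subjects = ["English", "Maths", "Science", "AI"]
marks = [72, 85, 78, 90]

plt.figure()
plt.bar(subjects, marks)            # bar graph: marks per subject
plt.title("Students Marks Analysis")

plt.figure()
plt.pie(marks, labels=subjects)     # pie chart: each slice is a share of the total

plt.figure()
months = [1, 2, 3, 4, 5]
sales = [10, 14, 13, 18, 21]
plt.plot(months, sales)             # line graph: change in quantity over time
plt.xlabel("Month")
plt.ylabel("Units sold")

plt.show()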
Brief:
The following are questions for the quiz. You can either go for a Pen/Paper Quiz or you can visit any
open-sourced, free, online portal; one of which is Kahoot, and create your quiz there. For Kahoot:
Go to https://kahoot.com/ and create your login ID. Then, add your own kahoot in it simply by
adding all the given questions into it. Once created, you can initiate the quiz from your ID and
students can participate in it by putting in the Game pin.
Quiz Questions
1. What are the basic building blocks of qualitative data?
a. Individuals b. Units c. Categories d. Measurements
2. Which among these is not a type of data interpretation?
a. Textual b. Tabular c. Graphical d. Raw data
3. Quantitative data is numerical in nature.
a. True b. False
4. A Bar Graph is an example of ………………………. data interpretation.
a. Textual b. Tabular c. Graphical d. None of the above
5. ………………………. relates to the manipulation of data to produce meaningful insights.
a. Data Processing b. Data Interpretation
c. Data Analysis d. Data Presentation
Reduced Cost
Identifying needs can lead to reduction in cost. It means by knowing what’s necessary, we can cut down on
waste. We can use resources more efficiently and not spend money on things that aren’t important.
For example, a restaurant owner could decide to drop or modify some dishes on the menu which aren't popular or
have got bad reviews.
Identifying Needs
We can identify the needs of people by data interpretation. It means understanding what people want or require
by looking at the information we have.
For example, in a Pizza Shop there are possibilities that Veg Farmhouse Pizza is a popular choice among age
group 8-10.
What is Tableau?
Tableau is a powerful data visualisation and business intelligence tool for visualising and analysing data in order
to aid in business choices. It takes in data and produces various charts, graphs, maps, dashboards, and stories.
3. Click on the DOWNLOAD THE APP button to begin with the download process as shown below:
4. After finishing with the downloading of the files, double-click the installer of the Tableau Public Desktop.
The Tableau Public 2024.1 Setup wizard opens.
As soon as the installation process is over the application is ready for use.
You can also open the Tableau application by double-clicking the shortcut icon of the Tableau application on
the Desktop.
Creating a Bar Graph Using Tableau
The steps to draw a Bar Chart in Tableau are as follows:
1. Create an Excel file and save it as student.xlsx with
the following data:
2. Double-click on the Tableau app shortcut icon on the
Desktop.
The Tableau app opens.
3. Select the Microsoft Excel option from the Connect
pane to access the Excel data that is used for visualising
the representation in Tableau. The Open dialog box
appears.
4. Navigate to the location where the Excel file is stored.
5. Select the student.xlsx file.
6. Click on the Open button.
The data of the Excel file is displayed in the Data Source window.
7. Click on Sheet 1 in the Sheet tab.
You can sort the bars in the graph in ascending or descending order by clicking the Ascending or Descending
option in the toolbar.
A colourful Bar Graph is generated as shown below:
Duplicating a Chart
The steps to duplicate a chart are as follows:
1. Right-click the sheet in the Sheet tab whose chart you want to duplicate.
2. Select the Duplicate option from the context menu. A duplicate sheet is added in the Sheet tab.
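If Tableau is not available, a similar bar chart for student.xlsx could also be drawn in Python. The sketch below is only an illustration: it assumes the pandas, openpyxl, and Matplotlib libraries are installed and that the Excel file has columns named Name and Marks (adjust these names to match your own file).

# A minimal sketch (not the Tableau method itself): reading student.xlsx with
# pandas and drawing a bar chart with Matplotlib.
# Assumes columns named "Name" and "Marks"; change them to match your file.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_excel("student.xlsx")                 # needs the openpyxl package
data = data.sort_values("Marks", ascending=False)    # optional: sort the bars

plt.bar(data["Name"], data["Marks"], color="skyblue")
plt.xlabel("Name")
plt.ylabel("Marks")
plt.title("Student Marks")
plt.show()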
Video Session
Scan the QR code or visit the following link to understand about tableau:
https://www.youtube.com/watch?v=NLCzpPRCc7U
Task
Your favourite songs
•• Think about songs! Which songs do you like to listen to? Which songs do you love to sing?
•• Do you have a favorite song, artist, album, or playlist?
•• Let’s start thinking about the different aspects of a song, like instruments played and lyrics.
•• Do your favorite songs have anything in common?
•• Maybe your favorite music falls within the same genre.
✶✶ A genre refers to the different styles of music.
✶✶ Common genres include hip-hop, pop, alternative, and rock.
✶✶ Classifying songs by genre, and other traits allows us to see trends in our favorite music.
✶✶ All of this information is valuable data that we can count, summarise, and present!
Instructions
•• Draw a grid with 6 columns as shown.
•• Title the first column Song Name, then write down the names of 5-10 of your favorite songs.
•• For this activity, we’re going to collect data about the Album, Artist, Genre, Year, and Song Length.
•• Add the headings to your table.
•• Fill out the table by looking up each song on Google, Spotify, or Apple Music.
Let’s visualise
•• Count the number of songs that fall into each genre.
•• Make a bar chart to visualise the number of songs within each genre using your counting. Colour each bar a different colour.
•• You will get a graph as shown in the image.
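As a quick check of your counting, the Genre column of your table could also be tallied with a few lines of Python; the genre values below are only an example of what your own table might contain.

# A small sketch: counting how many favourite songs fall into each genre
# using Python's built-in collections module. Replace the list with the
# genres from your own table.
from collections import Counter

genres = ["pop", "rock", "pop", "hip-hop", "pop", "rock"]
genre_counts = Counter(genres)

for genre, count in genre_counts.items():
    print(genre, count)      # e.g., pop 3, rock 2, hip-hop 1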
At a Glance
•• Data refers to any collection of raw facts, figures, statistics, or information that can be stored and processed by a
computer. It can be in different forms like numbers, text, images, audio, and video etc.
•• Literacy refers to the ability to read, comprehend and use information effectively.
•• To integrate data literacy skills into the organisational culture it is important to make data-driven decision-making
a fundamental part of everyday work.
•• Designing an evaluation metric for the data literacy program involves creating a structured framework to assess
participants' progress and the effectiveness of the program overall.
•• Data privacy, also referred to as information privacy, is concerned with the proper handling of sensitive data including
personal data and other confidential data.
•• Data security is the practice of protecting digital information from unauthorised access, corruption, or theft
throughout its entire lifecycle.
•• A strong password is a combination of at least 8 characters with upper and lower-case letters, numbers, and special
characters that are difficult for unauthorised individuals or automated programs to guess or crack.
•• Encryption is a security technique that transforms readable data (plaintext) into an unreadable format (ciphertext)
using an algorithm and an encryption key.
•• Data disposal refers to the process of securely destroying or deleting data that is no longer needed to prevent
unauthorised access, recovery, and misuse.
•• Using Firewall and Antivirus software can stop and alert users of any suspicious activity happening on their device.
•• Cyber security involves protecting computers, servers, mobile devices, electronic systems, networks, and data from
harmful attacks.
•• Data augmentation is the process of increasing the amount and diversity of data. We do not collect new data, rather
we transform the already present data.
•• The process of collecting data from websites using software is called Data Scraping.
•• Data processing refers to the manipulation and transformation of data into useful information through various
techniques and methods.
•• Data interpretation involves examining the data, identifying patterns, trends, and relationships, and translating the
findings into actionable information or decisions.
•• Data analysis is to examine each component of the data in order to draw conclusions.
•• Tableau is a powerful data visualisation and business intelligence tool for visualising and analysing data in order to
aid in business choices.
Exercise
Solved Questions
SECTION A (Objective Type Questions)
Quiz
A. Tick ( ) the correct option.
1. Which of the following is not an example of data?
a. Audio b. Video
c. Text d. Hardware
2. The ………………………….. illustrates the progressive transformation of raw data into actionable wisdom.
a. Data b. Data literacy
c. Data Pyramid d. Information
5. Designing an ………………………….. metric for the data literacy program involves creating a structured framework.
a. Mathematical b. Logical
c. Skill d. Evaluation
6. ………………………….. is about hunting for valuable information in different places, checking if it's good quality, and making
sense of it.
a. Data discovery b. Data investigation
c. Data quality d. Data literacy
7. ………………………….. can recognise objects, people, and even actions happening in videos.
a. NLP b. Data Science
c. Data d. Computer Vision
8. Which of the following is not the method of data collection in qualitative data interpretation?
a. Record keeping b. Observation
c. Case Studies d. Driving
9. ………………………….. should be organised in a way that makes sense so that it can be used effectively.
a. Data b. Knowledge
c. Privacy d. Ethics
10. ………………………….. should not have duplicates, missing values, outliers, and other anomalies so that its reliability and
usefulness for analysis is not affected.
a. Text Data b. Clean data
c. Visual data d. Ethical
2. ………………………. is a person who can interact with data to understand the world around them.
3. ………………………. can equip individuals with skills and knowledge to improvise in a data driven world.
4. The data literacy ………………………. provides a comprehensive and structured approach to develop the necessary skills for
using data efficiently with all levels of awareness.
5. ………………………. means increasing the amount of data by adding copies of existing data with small changes.
8. In ………………………. DI, data is represented systematically in the form of rows and columns.
10. In the ………………………. data collection method, data is collected on the same data source repeatedly over an extended
period of time.
2. Data analysis is used to examine each component of the data in order to draw conclusions. ……….……
4. Data from people's experience is a data collection method in quantitative data. ……….……
5. In a pie graph, data is represented using vertical and horizontal bars. ……….……
6. Data interpretation helps in making informed decisions by providing a clearer picture of the situation. ……….……
A B C D E F
In the given image typical levels of awareness in a Data Literacy Process Framework are shown.
2. What is prescriptive learning?
Ans. By implementing a prescriptive learning approach, organisations can provide a set of diverse resources that align with
individual learning styles.
7. Why is it important to choose the appropriate measurement scale for data analysis?
Ans. Choosing the correct scale ensures accurate measurement and representation of data, leading to valid analysis results.
8. Describe a scenario where identifying needs through data interpretation can lead to reduced costs.
Ans. A restaurant owner could use customer feedback data to modify or remove unpopular dishes from the menu, reducing
waste and costs.
9. What are some common features of bar graphs and line graphs?
Ans. Both graph types represent data visually, with bar graphs using bars and line graphs using points connected by lines to
show changes over time.
10. Give an example of a situation where record keeping would be a useful data collection method.
Ans. Using library documents as reliable and curated sources of information for data collection.
2. List down the ways that will help you to become data literate.
Ans. Here is a guide to help you become data literate:
● Understand the Basics: Start by learning the concepts of data, types of data and how it can be used.
● Learn Data Analysis Tools: There are many data analysis apps available that can be learned in order to understand
the impact of right data.
● Gain Statistical Knowledge: Statistics play a vital role in data literacy. It's one of the vital components that must be
learned before you dive into the data-driven world.
● Use Data Visualisation: Understand the techniques of data visualisation such as Graphics and Charts. Tools like
Tableau and Python libraries such as Matplotlib can be used effectively for this purpose.
● Learn Data Manipulation: Understanding how to manipulate data to meet the requirements is also one of the key skills.
4. Explain the importance of having a clear structure in data and provide examples of good and poor data structure.
Ans. Clear structure in data ensures it is organised logically, facilitating efficient analysis and interpretation. For example,
marks of students arranged in a spreadsheet is a good structure, whereas a poor structure would be student records
stored in a disorganised manner, with inconsistent naming conventions or missing attributes; this would impede data
analysis and decision-making processes.
6. Explain the term Computer Vision and the types of data used in it.
Ans. Computer Vision is like giving eyes to computers. It helps them look at pictures and videos from the real world and
understand what they're seeing. With Computer Vision, computers can figure out what's in a picture or video, just like
we do with our eyes. They can recognise objects, people, and even actions happening in videos.
Types of data used in Computer Vision include:
● Image Data: Digital images captured by cameras or satellite imagery, and medical scans.
● Video Data: Video data captured using camera, and surveillance footage.
Unsolved Questions
SECTION A (Objective Type Questions)
Quiz
A. Tick ( ) the correct option.
1. Data literacy is able to cultivate ………………………. skills to understand and explore data's implications by questioning
assumptions.
a. critical thinking b. programming
c. awareness d. probability
2. Data literacy fuels ………………………. by providing tools and techniques to explore data from different perspectives.
a. errors b. comprehension
c. innovation d. repetition
3. ………………………. enables users to tackle complex problems and derive meaningful relevance.
a. Mathematics b. Trends
c. project cycle d. Data literacy
5. By implementing a ………………………. learning approach, organisations can provide a set of diverse resources that align
with individual learning styles.
a. modern b. prescriptive
c. planned d. latest
6. The process of collecting data from websites using software is called ………………………. .
a. Data analysis b. Data reference
c. Data literacy d. Data Scraping
8. Digital images captured by cameras or satellite imagery, medical scans, and surveillance footage is ………………………. .
a. Text data b. Numeric data
c. Computer Vision d. Audio data
9. ………………………. is the process of making sense out of a collection of data that has been processed.
a. Data Interpretation b. Data scraping
c. Data validation d. Data Handling
f. Qualitative Data
B. Long answer type questions:
1. What are the ethical concerns while doing data acquisition?
2. Why is Data Security important?
3. Define the term “Data Backup."
4. How is Data Security Related to AI?
5. List any three best practices of cyber security.
6. Explain with example the two types of numeric data.
7. Explain in short the Types of Data used in three domains of AI.
C. Competency-based/Application-based questions:
1. Your teacher has asked students to give the choice of at least 3 co-curricular activities from the given list:
a. Painting e. Dance- Indian
b. Music - Western f. Best out of waste
c. Music - Indian g. English Theatre
d. Dance- Western h. Hindi Theatre
You're provided with a dataset containing errors, duplicates, and missing values. How would you approach organising
and cleaning this data to ensure its reliability and usefulness for analysis?
Outline the steps you would take to organise and clean the dataset, ensuring that it is free from errors, duplicates, and
missing values. Additionally, describe any methods or techniques you would use to address these issues and ensure the
dataset's reliability and usefulness for analysis.
2. The following dataset represents the students' academic performance, identify which features in the dataset would be
considered independent variables and which would be dependent variables in predicting students' final exam grade.
In Life
1. Prepare a questionnaire using Padlet.com, to know how data literacy is helpful in education.
2. Make a presentation to depict the Data Literacy Process Framework.
Deep Thinking
1. In today's digital age, data has become an incredibly valuable resource, much like gold was during the gold rush
era. In the context of Artificial Intelligence (AI), data is the raw material that fuels AI systems. How?
2. Who first said that data is the new gold? Is data more precious than gold? Justify.
Lab
Ask students to collect data about different coloured objects in the Lab and record it in a spreadsheet. Create a
basic bar chart to visualise the collected data using spreadsheet software. Later, ask students to present their
bar charts, followed by a brief discussion on the importance of data quality and ethical considerations in AI.
Answers
AI Quiz Section A (Objective Type Questions)
A. 1. d 2. c 3. b 4. c 5. d 6. a 7. d 8. d 9. a 10. b
Exercise
B. 1. Data 2. Data literate individual 3. Data literacy education 4. Framework
5. Data augmentation 6. Tabular 7. Numeric 8. Longitudinal
C. 1. True 2. True 3. False 4. False
5. False 6. False 7. True 8. True
9. False 10. True
Ready 2
Answer the following questions:
1. Why do you think there’s a need to educate students about data literacy?
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
2. Why is data privacy important? Give an example.
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
3. What are the 3 C's of data literacy?
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
4. What is meant by cyber attack? Give an example of it.
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
Learning Outcomes
Mathematics is crucial to artificial intelligence (AI) because it provides the theoretical foundation and practical
tools for developing, assessing, and optimising AI systems. It is the backbone of AI algorithms and models,
empowering machines to process, analyse, and interpret vast amounts of data.
In general, mathematics guarantees that AI models are precise, effective, and able to resolve challenging issues.
Patterns in Numbers
Number patterns, often simply referred to as patterns, are sets of numbers that follow a specific order or rule.
There are various types of number patterns, including Fibonacci, geometric, algebraic, and arithmetic patterns.
Some tricks are given below for finding the patterns in numbers:
•• Number Sequences: Look for a sequence where each number is related to the one before it in a specific way.
This could be:
✶✶ Adding a constant value (e.g., 1, 4, 7, 10... +3 each time)
✶✶ Multiplying by a constant value (e.g., 2, 4, 8, 16... x2 each time)
✶✶ Following a formula (e.g., 2n + 1: 3, 5, 7, 9...)
•• Series: Analyse a series of numbers to identify the underlying rule for example square series, cube series, etc.
•• Even or Odd: Is the sequence made of even numbers (2, 4, 6...) or odd numbers (1, 3, 5...)?
•• Prime or Composite: Are the numbers prime (only divisible by 1 and itself) or composite (divisible by more
than two numbers)?
•• Fibonacci Sequence: This famous sequence starts with 0 and 1, and each subsequent number is the sum of the
two preceding numbers (0, 1, 1, 2, 3, 5, 8...).
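Such number patterns can also be produced with a short Python program. The sketch below is only an illustration of the three kinds of sequences described above.

# A minimal sketch generating some of the number patterns described above.

# Arithmetic pattern: add a constant value (+3 each time)
arithmetic = [1 + 3 * n for n in range(5)]        # [1, 4, 7, 10, 13]

# Geometric pattern: multiply by a constant value (x2 each time)
geometric = [2 * 2 ** n for n in range(5)]        # [2, 4, 8, 16, 32]

# Fibonacci pattern: each number is the sum of the two before it
fibonacci = [0, 1]
for _ in range(6):
    fibonacci.append(fibonacci[-1] + fibonacci[-2])   # [0, 1, 1, 2, 3, 5, 8, 13]

print(arithmetic)
print(geometric)
print(fibonacci)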
Critical Thinking
Task
1. Can you identify a pattern in the given image?
2. Find the missing numbers in the given series and draw the pattern.
0 1 4 ____ ____ 25 36
Critical Thinking
Task
2. 0, 1, 1, 2, 3, 5, 8,_____ – Can you find the next number in the given series?
3. 3, 6, 12, 24, ___, 96, ___ – Can you fill in the missing number and find the next number in the given series?
4. Identify the highest temperature in the given graph and mention the time of it.
(Line graph: Temperature v/s Time, with temperature on the vertical axis from 30 to 50 and time of day on the horizontal axis.)
Statistics
Statistics is used for collecting, exploring, and analysing the data. It also helps in drawing conclusions from data.
It enables AI systems to detect patterns, identify relationships, and infer conclusions from data.
•• Data is collected from various sources.
•• Data is explored and cleaned to be used.
•• Analysis of data is done to understand it better.
•• Conclusions and decisions can be made from the data.
Let’s consider an example to illustrate these steps:
A school wants to improve the performance of its students and decides to collect data on study habits and
grades.
• Collecting Data: The school conducts a survey where students report the number of hours they
study each week and their grades in various subjects.
• Exploring and Cleaning Data: The school first looks at the data to find patterns, like the
range of study hours and grades.
They also clean the data by fixing any missing or incorrect information (e.g., if some students
didn’t fill in all the fields or gave unrealistic answers).
• Analysing Data: The school summarises the average study hours and grades to get an overview.
They also check if there is a significant relationship between study hours and grades.
• Drawing Conclusions: The analysis might reveal that students who study more hours tend
to have higher grades. Based on this conclusion, the school might decide to implement study
support programs to encourage students to study more.
Thus, we can say statistics helps in transforming the raw data into meaningful insights, enabling better decisions
and strategies in various fields such as business, healthcare, education, and more.
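The same idea can be tried out in Python. The sketch below uses Python's built-in statistics module; the study hours and grades are invented values for illustration (the correlation function needs Python 3.10 or newer).

# A minimal sketch of the school example: summarising study hours and grades
# and checking whether they move together. The numbers are made up.
import statistics

study_hours = [2, 4, 6, 8, 10]      # hours studied per week by five students
grades      = [55, 62, 70, 78, 88]  # their marks

print("Average study hours:", statistics.mean(study_hours))
print("Average grade:", statistics.mean(grades))

# A correlation close to +1 suggests that more study hours go with higher grades.
# statistics.correlation is available in Python 3.10 and later.
print("Correlation:", statistics.correlation(study_hours, grades))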
Brainy Fact
During the COVID-19 pandemic, statistical models were used to predict the spread of the virus and
allocate medical resources efficiently. The World Health Organization (WHO) reported in 2020 that
these models helped reduce the strain on healthcare systems by optimising resource distribution.
Application of Statistics
Statistics can be used in numerous fields because of its ability to extract meaningful insights from data and assist
in decision making. Let us discuss some of the fields where statistics is used.
Education
•• Analysing test scores and grades to evaluate student learning, identify areas for improvement and allocate resources effectively.
•• Using data to identify gaps in the curriculum and areas where students need more support.
•• Analysing how students and teachers use educational technology for future implementations.
•• Statistics helps in determining the average skills of students in a particular school or grade. This information shows which areas need more focus and helps improve education strategies.
Disaster Management
•• Authorities use statistics to analyse existing risks, such as landslips, and alert the citizens.
Weather Forecast
•• Statistical summaries create easy-to-understand weather forecasts for the public.
•• Statistics compare the weather conditions with the information about past seasons and conditions.
•• Using statistical methods to analyse long-term climate data and detect trends related to global warming and climate change.
•• Predicting the future based on data from the past and providing forecasts that express the likelihood of various weather events (e.g., “There is a 70% chance of rain tomorrow”).
Opinions of Voters
• Statistics are used in elections to enhance transparency for voters and encourage voters to participate in elections.
• By surveying a group of voters and analysing the results,
statistics can reveal what the majority of people think about
political candidates, policies, or upcoming elections. This helps
politicians and decision-makers to understand public opinion.
• Statistics play an important role in Election Forecasts, making Campaign Strategies and micro-targeting individuals based on available data.
Reboot
Explain any two applications of statistics in education.
Task
Let’s understand Probability with the following activity.
Purpose: To understand the possibility of occurrence of an event.
Case 1: Varun went to the park to play football. Which pet is he more likely to see, a cat or a dog?
What is the probability of drawing a spade or diamond from a standard deck of 52 cards?
When discussing probability, we often rely on specific terms to describe the likelihood of events occurring.
Here’s an elaboration on each of these terms:
Certain events
These events are guaranteed to happen; there is no doubt about their occurrence. It will have a probability of 1.
For example:
✶✶ If you flip a fair coin, the probability of it landing heads up or tails up is certain, as one of these outcomes is
guaranteed.
✶✶ The occurrence of sunrise and sunset each day is certain.
✶✶ When you flip a light switch, the light bulb will either turn on or off.
✶✶ The act of inhaling and exhaling is certain as long as a person is alive.
✶✶ Time consistently moves forward, and each passing moment is certain.
✶✶ The beating of the heart is a certain event, as long as a person is alive.
✶✶ If it is Sunday today, it is certain, tomorrow is going to be a Monday.
Likely events
These events have a higher probability of occurring as compared to other events. For example:
✶✶ If you roll a fair six-sided die, the likelihood of rolling a number greater than 2 (3, 4, 5, or 6) is higher than
rolling a number less than or equal to 2 (1, or 2).
✶✶ If you visit a store known for carrying a wide variety of brands, it’s likely that you’ll find your favourite brand
among their products.
✶✶ If you study well for an exam, you’re more likely to pass with good grades.
✶✶ If you buy many raffle tickets compared to others, you’re more likely to win a prize.
Unlikely events
These events have a lower probability of occurring compared to other events. For example,
✶✶ If you randomly select a card from a standard deck, the probability of drawing an ace of spades is lower
compared to drawing a card of a different suit.
✶✶ Observing a shooting star in the night sky is an unlikely event compared to seeing regular stars.
(Probability scale: events range from 0, impossible, to 1, certain.)
Case 1
Health insurance companies often use probability to determine how likely it is that certain individuals will spend a
certain amount on healthcare each year and determine risk factors for contracting a disease and for being cured.
For example, a company might use factors like age, existing medical conditions, current health status, etc. to
determine that there’s a 90% probability that a certain individual will spend $10,000 or more on healthcare in a
given year.
Individuals who are likely to spend more on healthcare will be charged higher premiums because the insurance
company knows that they’ll be more expensive to insure.
•• Scenario 1: If the person is young, below 25 years of age, then insurance premium will be very low.
•• Scenario 2: If the person is middle-aged (25–40), then the insurance premium will be of higher value than that in the first category.
•• Scenario 3: If the age of the person is 40-60, then the premium charged will be quite high.
•• Scenario 4: If the person is above 60 years of age then the premium charged is of highest bracket.
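The age-based premium brackets described in these scenarios can be written as a simple set of conditions. The Python sketch below is only an illustration; the bracket names follow the scenarios above and are not real insurance figures.

# A minimal sketch of the premium brackets described in the scenarios above.
# The bracket names are illustrative only.
def premium_bracket(age):
    if age < 25:
        return "very low"
    elif age <= 40:
        return "higher than the first category"
    elif age <= 60:
        return "quite high"
    else:
        return "highest bracket"

for age in [20, 30, 50, 65]:
    print(age, "->", premium_bracket(age))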
Brainy Fact
GPT-3 (Generative Pre-trained Transformer 3), a large language model released by OpenAI in 2020, has 175 billion parameters and uses probabilistic sampling techniques for text generation.
Case 2
Let there be a bet among friends that a person will have to perform a dare if the die rolls a 5 or 6; otherwise, the person has to speak the truth.
•• Scenario 1: When the die rolls any number between 1 and 4, the person answers a question truthfully.
(Faces 1 to 4: Truth; Faces 5 and 6: Dare)
•• Scenario 2: When the die rolls a 5 or 6, the person has to do a dare.
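The chance of a dare in this game can be calculated, and even simulated, with a few lines of Python, as in the small illustrative sketch below.

# A minimal sketch: the probability of a dare (rolling a 5 or 6) is 2/6,
# and a simulation of many rolls should give roughly the same fraction.
import random

favourable = 2          # faces 5 and 6
total = 6               # faces 1 to 6
print("Theoretical probability of a dare:", favourable / total)   # about 0.33

rolls = [random.randint(1, 6) for _ in range(10000)]
dares = sum(1 for r in rolls if r >= 5)
print("Fraction of dares in 10,000 simulated rolls:", dares / 10000)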
Applications of Probability
Probability has a wide range of applications across various fields, making it an essential concept in many areas
of study and professional practice. Here are some key applications of probability.
Sports
•• In cricket, the batting average represents how many runs a batsman would score before getting out. Probability can help in estimating such performance. For instance, if a batsman scored 20 of his 80 runs from boundaries in the last match, there is a chance that he will score about 25% of his runs from boundaries in the next match.
Weather Forecasting
•• Probability is used by weather forecasters to assess how likely it is that there will
be rain, wind, snow, clouds, etc., on a given day in a certain area.
For example, forecasters may say things like “there is a 70% chance of rain today
between 4 PM and 6 PM” to indicate a medium to high likelihood of rain during
certain hours.
•• Probability helps in understanding long-term climate patterns and changes.
For example, estimating the probability of extreme weather events under different
climate scenarios aids in planning and mitigation efforts.
Traffic Estimation
•• People often use probability when they decide to drive somewhere. Vehicular traffic flow is examined, and the vehicle waiting time in each direction is estimated through probability.
•• Based on the time of day, location in the city, weather conditions, etc., people
tend to make probability predictions about how bad traffic will be during a
certain time. For example, if you think there’s a 90% chance that traffic will be
heavy from 6 PM to 7:30 PM in your vicinity then you may decide to wait during
that time.
Finance
•• Probability is used to assess the risk of investments and financial
decisions.
For example, calculating the probability of a stock’s return falling within
a certain range helps investors make informed decisions.
•• Probability models help investors spread their investments across
different assets to reduce risk.
For example, by looking at the chances of returns for various investments,
investors can build a balanced portfolio.
•• Probability is crucial in figuring out insurance costs and how policies are structured by various insurance agencies.
For example, actuaries use probability to predict events like accidents or natural disasters and set insurance
premiums based on those chances.
Education
•• Probability is used to estimate how well students will do in exams
based on different factors.
For example, considering the study habits and attendance can
predict the chances of students getting certain grades.
•• Probability helps create fair and accurate standardised tests.
For example, by figuring out the chances of different scores in tests, the test can better reflect how well students understand the subject taught.
At a Glance
•• Mathematics is crucial to artificial intelligence (AI) because it provides the theoretical foundation and
practical tools for developing, assessing, and optimising AI systems.
•• Mathematics and AI are interrelated fields, where mathematics provides theoretical concepts to many of
the AI algorithms.
•• Artificial Intelligence assists computers to understand and recognise patterns, much like how humans do.
•• Number patterns, often referred to as patterns, are sets of numbers that fit into a particular order.
•• Patterns in images refer to recurring visual elements or structures that can be recognised and analysed
within the context of an image.
•• Probability theory and statistics are among the key fundamentals of many AI algorithms, particularly those involving machine learning.
•• Linear algebra is involved in large scale data processing, playing a vital role in machine learning and AI.
•• Calculus is essential for understanding the best possible solution algorithms used in machine learning.
•• Statistics is used for collecting, exploring, and analysing the data. It also helps in drawing conclusions
from data.
•• Statistics can be used in numerous fields because of its ability to extract meaningful insights from data
and assist in decision making.
•• Probability is a branch of statistics that deals with the likelihood or chance of different outcomes occurring
in a given situation.
7. Picking a random day of the week, and it turns out to be both Monday and Friday at the same time. Identify the type of event.
a. Likely events b. Unlikely events
c. Impossible events d. Certain events
3. Define Statistics.
Ans. Statistics is used for collecting, exploring, and analysing the data. It also helps in drawing conclusions from data.
4. Explain the use of statistics in disease prediction.
Ans. Statistics models can predict the spread and impact of diseases, helping in planning public health measures.
5. What is an impossible event? Provide an example.
Ans. An impossible event has no chance of occurring, such as rolling a number greater than 6 on a fair six-sided die.
6. Why is calculus important for AI models?
Ans. Calculus is crucial for training and improving AI models by understanding the best possible solution algorithms.
7. What is an example of a certain event in probability?
Ans. The occurrence of sunrise and sunset each day is a certain event with a probability of 1.
3. In the upcoming elections, the election commissioner wants to know whether it will be a hung parliament or a party
will have a clear majority. Can this be achieved using Statistics? How?
Ans. Yes, it can be achieved. By surveying a group of voters and analysing the results, statistics can reveal what the majority
of people think about political candidates, policies, or upcoming elections. This helps politicians and decision-makers
understand public opinion.
4. If you throw an arrow to this pie chart, in which colour is the arrow more likely to fall on? [CBSE Handbook]
a. Red b. Blue
c. Orange d. Green
5. If you select a balloon from a bag having equal number of red, green and yellow balloons, how
likely is it that you pick up a blue balloon? [CBSE Handbook]
a. Probable b. Certain
c. Unlikely d. Impossible
6. With one throw of a 6-sided die, what’s the probability of getting an even number? [CBSE Handbook]
a. 1/5 b. 2/5
c. 5/6 d. 1/2
8. GPT-3, released by OpenAI in 2020, has 175 billion parameters and uses ………………………. sampling techniques for text
generation.
a. data b. probabilistic
c. statistical d. pattern
4. Statistics can only be used in the mathematical field because of its ability to extract meaningful insights
from data and assist in decision making.
7. Understanding math will help us to better understand AI and its way of working.
6. If you have 10 red dresses and 3 white dresses, what is the probability of wearing a white dress?
3. Explain the concept of probability with the help of an example of a deck of 52 cards.
4. When discussing probability, we often rely on specific terms to describe the likelihood of events occurring. Explain any
one likelihood of an event with examples.
5. What role does probability play in estimating the traffic on the road? List any three with examples.
6. Identify the likely, unlikely, impossible and equal probability events from the following with proper explanation:
C. Competency-based/Application-based questions:
1. Imagine a student preparing for a math exam that covers five topics. Based on past exams and the instructor’s hints,
the student estimates the probability of each topic appearing on the exam as follows:
Topic A: 0.8
Topic B: 0.6
Topic C: 0.4
Topic D: 0.7
Topic E: 0.5
How will this help him in preparing for the exam to score good marks?
2. Let’s say a company wants to launch a new product—a smartphone—into the market. Before launching the product,
the company conducts market research to understand consumer preferences and potential demand for the new
smartphone. What role will the statistics play in smoothening this process?
3. Predicting earthquakes with precise accuracy is incredibly challenging due to the complex nature of seismic activity.
However, probability can still play an important role in this. Can you find a few of the important applications of
probability in predicting earthquakes?
4. Aman is confused, how probability theory is utilised in artificial intelligence, help Aman by providing two examples to
illustrate its importance.
In Life
Students are introduced to the concept of probability at a very early stage of life; it begins with a real-life situation. For
example, flip a coin and ask what the chances are that it will come out heads. Or, place the coin in one hand, and put
both hands behind your back. Ask one student to guess which hand it is in.
Discuss your findings with your classmates, and illustrate some more examples from real life.
Deep Thinking
The adoption of AI in healthcare could save between 5% and 10% of healthcare costs. AI in healthcare statistics show that 90% of nursing
tasks will still be performed by humans in 2030.
Considering the above facts, create a report on which nursing tasks may be taken over by AI.
Pattern 1:
1 × 9 + 2 = 11
12 × 9 + 3 = 111
123 × 9 + 4 = 1111
1234 × 9 + 5 = 11111
12345 × 9 + 6 = 111111
123456 × 9 + 7 = 1111111
1234567 × 9 + 8 = 11111111
12345678 × 9 + 9 = 111111111
123456789 × 9 + 10 = 1111111111
Pattern 2 (Pascal's triangle):
1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1
1 7 21 35 35 21 7 1
4. Find connections between sets of images and use that to solve problems, think smartly, and grasp
tricky ideas.
Complete the sequence in the left column by identifying the correct missing piece in the right column
out of the given options. [CBSE Handbook]
2.
3.
4.
5.
6.
Data Interpretation
What is the most common colour choice for the residents of this area?
Answers
Exercise (Section A)
A. 1. a 2. d 3. a 4. a 5. c 6. b 7. c 8. a
B. 1. Artificial Intelligence 2. Graph theory 3. Statistics 4. Probability 5. Unlikely
6. Order 7. Probability 8. Fibonacci Series
C. 1. False 2. False 3. True 4. True 5. False 6. False 7. True 8. True
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
2. When you order food online, you receive a link for the feedback. Based on the questions asked by the chatbot, you are
able to give your feedback. What is the probability that the feedback shared by you is not altered by the chatbot?
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
3. How is probability affecting the weather forecast system?
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
……………………….……………………….……………………….……………………….………………………....................………………………………………..
UNIT-4
INTRODUCTION TO
GENERATIVE AI
Learning Outcomes
The simpler it is to understand the context through visualisation (images), the more difficult it is to identify whether the
image is real or artificial. Today, 50% of the images we see online are a result of Generative Artificial Intelligence.
Let us learn more about it.
1. ________________________________________ a. ________________________________________
2. ________________________________________ b. ________________________________________
3. ________________________________________ c. ________________________________________
4. ________________________________________ d. ________________________________________
5. ________________________________________ e. ________________________________________
Brainy Fact
Companies like OpenAI, Google, Facebook, and NVIDIA have invested heavily in research and
development of generative AI technologies.
Supervised Learning
Supervised learning is a machine learning approach where a model is trained on a labelled dataset, implying that each
input data point is associated with a corresponding output label. The goal of supervised learning is to learn the
mapping between input data and output labels, enabling the model to make predictions on new, unseen data.
In simple words, input data is paired with the desired output thus making the machine learn to predict the
output for new input data.
For example, in the given image, the first image is the input and its characteristics are marked as boy and ball, which
can be seen in the centre image. According to supervised learning, the model has to learn the mapping between the
input data and the output labels, which is shown in the last image: it highlights "ball" in red, "boy" in purple and
"boy playing with a ball" in a rectangle.
(Diagram: Input — features of the given image; Output — label for the item.)
In a supervised learning model, a labelled dataset is given to the machine. A labelled dataset is the information
which is tagged with identifiers of data. For example, clothes in a store are marked under various categories of
clothing like Shirts, Trousers, Coats, etc. They are further labelled as per gender and size.
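A tiny sketch of supervised learning is given below. It uses the scikit-learn library purely as an assumption (the chapter does not prescribe any particular library), with a made-up labelled dataset of clothing items described by two measurements.

# A minimal sketch of supervised learning with scikit-learn (assumed installed).
# Each item is described by two features and tagged with a label, and the
# model learns the mapping from features to labels.
from sklearn.tree import DecisionTreeClassifier

# Features: [sleeve length in cm, garment length in cm] (made-up values)
X = [[60, 70], [58, 72], [5, 68], [7, 66], [0, 100], [2, 105]]
# Labels for each item above
y = ["Shirt", "Shirt", "T-shirt", "T-shirt", "Trousers", "Trousers"]

model = DecisionTreeClassifier()
model.fit(X, y)                       # learn from the labelled dataset

print(model.predict([[59, 71]]))      # expected output: ['Shirt']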
Discriminative Modelling
Discriminative modelling is an approach in machine learning where the focus is on learning the boundary or
decision boundary that separates different classes or categories directly from the data. So, if an image contains
a combination of Dogs and Cats, the model is able to tell which is a Dog and which is a Cat.
(Example: for an image containing cats and a dog, the model outputs Cats = 2, Dog = 1.)
In supervised learning, discriminative modelling contrasts with generative modelling, where the goal is to model
the joint probability distribution of both the input features and the output labels. Generative models can be
used to generate new data points that resemble the training data, whereas discriminative models are primarily
focused on classification or regression tasks.
Let us consider an example.
A father has two kids, Kid A and Kid B. Kid A has a special ability: he can learn everything in depth. Kid B has a
different ability: he can only learn the differences between the things he has seen.
One fine day, the father shows his kids (Kid A and Kid B) two kinds of animals, say a dog and a cat. After a few days,
the father showed them an animal and asked both of them, “Is this animal a dog or a cat?”
Kid A drew the images of a dog and a cat on a piece of paper based on what he had seen earlier. He compared both
images with the animal standing before him and, based on the closest match of image and animal, answered: “The
animal is a dog.” Kid B knows only the differences; based on the different properties he learned, he answered: “The
animal is a dog.”
Here, we can see that both of them identify the kind of animal, but the way of learning and the way of finding the
answer is entirely different. In Machine Learning, we generally call Kid A a Generative Model and Kid B a
Discriminative Model.
Unsupervised Learning
(Diagram: Input — an unstructured/unlabelled dataset; Output — an emergent pattern or inherent structure, i.e., an example similar to what is in the dataset.)
Unsupervised learning is a type of machine learning where the model is trained on input data without any
corresponding output labels. The goal of unsupervised learning is to find patterns, structure, or representations
in the data without human intervention. An unsupervised learning approach works on an unlabelled dataset.
This means that the data fed to the machine is random, and the trainer has no prior information about it.
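By contrast, an unsupervised model receives no labels at all. The sketch below, again assuming the scikit-learn library, lets a K-Means model group unlabelled points into clusters on its own.

# A minimal sketch of unsupervised learning: K-Means clustering with
# scikit-learn (assumed installed) on unlabelled, made-up data points.
from sklearn.cluster import KMeans

points = [[1, 2], [1, 3], [2, 2],      # one natural group
          [9, 9], [10, 8], [9, 10]]    # another natural group

model = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = model.fit_predict(points)     # the model finds the groups itself

print(labels)                          # e.g., [0 0 0 1 1 1] (cluster numbers)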
Reboot
Hands on experience
1. Differentiate between supervised learning and unsupervised learning approach.
2. List any two points that can identify the difference between a real image and Gen AI image.
(Diagram: Random noise is fed into a generative model, which produces new data.)
Task
Examine the images and determine whether either of the images is a real image or an AI-generated image. Also, give
reasons for your answer.
Experiential Learning
Video Session
Scan the QR code or visit the following link to watch the video:
https://www.youtube.com/watch?v=26fJ_ADteHo
This video gives you a clear picture of Generative AI. Now answer the following questions:
1. What is your understanding of generative AI?
Brainy Fact
Generative AI is now part of the workforce in the US, UK, Australia, and many other parts of the
world.
Generative AI vs Conventional AI
Generative AI and conventional AI represent two different approaches in the field of artificial intelligence. The
difference between them is given in the following table:
•• Goal: Generative AI creates new content which mimics the original content, including images, text, music, or other forms of media. Conventional AI analyses, processes, and classifies data; it works to improve the accuracy, precision, recall, and speed within the scope of the defined task.
•• Training: Generative AI models are often trained using techniques such as generative adversarial networks (GANs), variational autoencoders (VAEs), or autoregressive models. Conventional AI models are typically trained using supervised, unsupervised, or reinforcement learning techniques.
•• Dataset: Generative AI models typically require large amounts of diverse and representative data to learn effectively; these datasets often contain thousands or even millions of examples across various categories or classes. Conventional AI models rely on smaller, more curated datasets that are tailored to the task at hand.
•• Applications: Generative AI is used in the fields of art, music, literature, gaming, and design. Conventional AI is used in banking, healthcare, image recognition, and language processing.
Generative AI comes in a variety of forms, each with unique advantages and uses. Some of the most typical
varieties are as follows:
•• Generative Adversarial Networks (GANs)
•• Variational Autoencoders (VAEs)
•• Recurrent Neural Networks (RNNs)
•• Autoencoders (AEs)
Generative Adversarial Networks (GANs)
GANs consist of two neural networks, a Generator and a Discriminator. These two networks work together in a cycle
where the Generator tries to create realistic fake data, and the Discriminator tries to identify whether the data is real
or fake. This back-and-forth process helps the Generator improve and produce more convincing data over time.
(Diagram: Generator network — produces the data; Discriminator network — analyses the data and shares feedback.)
Autoencoders (AEs)
These are Neural networks that have been trained to learn a compressed representation of data. They work by
compressing the data into a lower-dimensional form (encoding) and then decompressing it back to its original
form (decoding). This process helps the network learn the most important features of the data.
(Diagram: Input → Encoder → Latent Space → Decoder → Output.)
Similarities
•• Both AE and VAE are neural network architectures that are used for unsupervised learning.
•• Both AE and VAE consist of an encoder and a decoder network. The encoder maps the input data to a latent
representation, and the decoder maps the latent representation back to the original data.
•• Both AE and VAE can be used for tasks such as dimensionality reduction, data generation, and anomaly detection.
Differences
•• Basic Function: An AE is a neural network model that learns to encode input data into a compressed representation and then decode it back to the original data. A VAE is similar to an AE but incorporates probabilistic elements to learn a latent space representation of the input data.
Examples of Generative AI
Generative AI has many applications, from art and music to language and natural language processing.
Let's study about some examples of how generative AI is being used in various fields.
Art
Generative AI can create new artworks by learning styles from famous painters and generating novel pieces in
similar styles. For example:
•• AI artists like "AI Portraits" and "DeepArt" have gained popularity for their ability to create visually stunning images.
•• The Next Rembrandt project used data analysis and 3D printing to create a new painting in the style of Rembrandt.
Portrait Creation
Style Transferring
Music
Generative AI is transforming the music industry by enabling the creation of new music, either through composing
original pieces or remixing existing ones.
One prominent example of this innovation is AIVA, an AI composer capable of creating original music in various genres.
Experiential Learning
Video Session
Scan the QR code or visit the following link to watch the video:
https://www.youtube.com/watch?v=wYb3Wimn01s
This video is on How AI could compose a personalised soundtrack to your life | Pierre Barreau.
Now, answer the following questions:
1. What is the name of AI composer mentioned in the video?
GenAI in Language
Experiential Learning
Video Session
Scan the QR code or visit the following link to watch the video:
https://www.youtube.com/watch?v=BWCCPy7Rg-s
This video is on What is ChatGPT, the AI software taking the internet by storm? – BBC News. Now,
answer the following questions:
1. ChatGPT is being used by masses, what are the concerns arising because of it?
Task
GAN Paint CBSE Handbook
Link: https://ganpaint-v2.vizhub.ai/
● GAN Paint directly activates and deactivates neurons in a deep network trained to create pictures.
● Each left button ("door", "brick", etc.) represents 20 neurons.
● The software shows that the network learns
about trees, doorways, and roofs by drawing.
● Switching neurons directly shows the
network's visual world model.
● To use GAN Paint, you will first need to select
a base image from the website's library. You
can then use the brush tool to add objects
and textures to the image. As you paint, the
GAN network will learn to generate more
realistic images.
● You are encouraged to experiment with GAN
Paint and see what you can create. Have fun!
Artbreeder
Artbreeder is a web-based tool where you can
make and change pictures using advanced
AI technology called generative adversarial
networks (GANs). With Artbreeder, you can mix
different images and text together and adjust
specific features to create completely new and
unique artworks.
You may work on Artbreeder using the given link
https://www.artbreeder.com/
You can use it for free with some limitations or
choose a paid plan for more features and options.
It helps you:
•• create new characters by blending and modifying existing images.
•• generate imaginative and unique artworks to use in stories, games, or movies.
•• experiment with different image combinations and features to discover new artistic possibilities.
Task
Generate Images with Text Prompt
1. Go to artbreeder.com
2. Select Create from the menu bar and click on New Image under the Poser category.
ChatGPT
It is an AI tool created by OpenAI. It generates responses like humans in real-time, based on the user’s input.
It can give natural answers to questions in a conversational tone and can generate stories, essays, and poems.
You may work on ChatGPT using the given link https://chat.openai.com/
It can:
•• answer any type of questions
•• solve maths or scientific problems
•• translate between languages
•• debug and fix code
•• write a story/poem
•• differentiate the things given as input.
•• rephrase text input
Runway ML
Runway ML is a platform for creating, training, and deploying generative models. It provides a user-friendly
interface for building and training various types of generative models, including GANs, VAEs, and image classifiers.
You may work on Runway ML using the given link https://runwayml.com
Brainy Fact
ChatGPT, the world’s most popular genAI platform, had an average of 1.5 billion monthly visits
in 2023.
Gemini
Gemini is a generative multimodal AI model created by Google. Just like ChatGPT, Google Gemini is designed to
understand text, images, audio, video, computer code, and more.
You may work on Gemini using the given link https://gemini.google.com/
It can:
•• answer any type of questions
•• solve maths or scientific problems
•• translate between languages
•• debug and fix code
•• write a story/poem
•• differentiate the things given as input.
4. Try it yourself: Explore some new AI apps which can help ease your work.
Copilot
Copilot is an AI tool designed by Microsoft. It can do all the jobs just like ChatGPT and Gemini but it focuses
more on software development assistance to streamline the coding process, increase productivity, and assist
developers in writing high-quality code faster.
You may work on Copilot using the given link https://copilot.microsoft.com/
2. Give the prompts listed below and see the answer that you get.
Audio
FineShare Boomy AI Playlist AI
Productivity
Briefly AI Socra AI Leexi AI
While Generative AI offers many benefits, there are also several ethical considerations that should be considered
when using this technology.
•• Ownership: There's a gray area concerning who owns content created by generative AI. This is especially
significant in creative fields like music, literature, and art, where AI can produce original works that blur the lines
between human and machine authorship.
•• Human Agency: Generative AI prompts questions about human control and autonomy. As technology advances,
it may become harder to distinguish between content made by humans and that made by machines. This
blurring of lines could result in a loss of human agency and control.
•• Bias: Generative AI learns from the data it's fed, meaning biased data can result in biased AI-generated content.
This bias can have harmful effects, particularly in high-stakes areas like hiring, loan approvals, or criminal justice.
•• Misinformation: Generative AI can be used to create fake news or deep fakes, which can spread misinformation
and sway public opinion. This poses a threat to democracy and undermines trust in authorities.
•• Privacy: There's a risk that generative AI could be used to generate sensitive personal information, such as credit
card numbers or medical records, which could then be exploited for malicious purposes. Protecting privacy is
crucial in preventing such misuse.
a. Fake News: AI can generate convincing fake news articles and social media posts, spreading misinformation quickly and widely.
b. Deepfakes: The term "deepfake" combines "deep learning" and "fake," referring to AI techniques that create realistic but fake videos and audio. These AI-generated videos can mislead people by making it seem like someone said or did something which they didn't, undermining trust in public figures and institutions.
a. Automation: Generative AI can perform tasks traditionally done by humans, such as writing, graphic design, and customer service, potentially leading to significant job displacement.
6. Environmental Impact
a. Training and running large AI models require significant computational resources, contributing to high
energy consumption and environmental impact.
b. Many devices are being exchanged due to outdated hardware leading to increase in e-waste.
Reboot
Hands on experience
1. Provide examples illustrating instances where biases in Generative AI are evident.
All these points ensure the responsible use of Generative AI. By emphasising ethics, creating trust, limiting
negative repercussions, defining legislation, and encouraging innovation, we maximise Generative AI’s potential
and use it in ways that benefit society.
At a Glance
•• AI generated images are created by AI algorithms.
•• Distinguishing between a real image and one generated by AI can be challenging as AI-generated images continue
to become more sophisticated.
•• Artificial intelligence shows inconsistencies if observed closely, although it tries to piece together its creations from
the original work.
•• AI-generated images may include elements that seem unrealistic or improbable, such as impossible perspectives,
mismatched colors, or objects that defy physics.
•• In a supervised learning model, a labelled dataset is given to the machine.
•• A labelled dataset is the information which is tagged with identifiers of data.
•• Discriminative modelling is an approach in machine learning where the focus is on learning the boundary or decision
boundary that separates different classes or categories directly from the data.
•• Unsupervised learning is a type of machine learning where the model is trained on input data without any
corresponding output labels.
•• An unsupervised learning approach works on an unlabelled dataset.
•• In Generative Modelling there is no labelled dataset, and the model can generate structured data from the Random
Noise dataset.
•• A "random noise dataset" typically refers to a collection of data points or samples where each data point is generated
randomly.
•• Generative Artificial Intelligence, also called gen AI, refers to the algorithms that generate new data that resembles human-generated content, such as audio, code, images, text, simulations, and videos.
•• Generative AI is trained with existing data and content, creating the potential for applications such as natural
language processing, computer vision, the metaverse, and speech synthesis.
•• GANs are neural networks that work to produce fresh data.
•• Variational Autoencoders (VAEs) produce fresh data by learning the distribution of the data and then sampling from it.
•• RNNs are a special class of neural networks that excel at handling sequential data, like music or text.
•• Autoencoders are Neural networks that have been trained to learn a compressed representation of data.
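The ideas of a decision boundary and a random-noise dataset from the points above can be tried out in a few lines of Python. The snippet below is only a minimal sketch, assuming the NumPy and scikit-learn libraries are installed; the toy study-hours data and the variable names (X, y, model, noise) are invented purely for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Discriminative modelling: learn the boundary between two classes from labelled data.
# Toy labelled dataset: hours of study (input) tagged with 0 = fail, 1 = pass (labels).
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

model = LogisticRegression()
model.fit(X, y)                       # the model learns the decision boundary
print(model.predict([[3.5], [6.5]]))  # classify new, unseen inputs

# Random noise dataset: every value is generated purely at random, with no labels.
noise = np.random.normal(0, 1, size=(4, 3))  # 4 samples, 3 features of Gaussian noise
print(noise)                                 # a generative model would learn to turn such noise into structured data

With these toy numbers the classifier should label 3.5 hours as "fail" and 6.5 hours as "pass", while the noise array shows no pattern at all.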
2. ………………………. generated images can add modifications, enhancements and even entirely new imaginative details.
a. Computer b. AI
c. Algorithm d. Real
3. In Generative Modelling, the model can generate structured data from the ………………………. dataset.
a. Random Noise b. labelled
c. unlabelled d. Temporary
4. ………………………. refers to AI techniques that create realistic but fake videos and audio.
a. AI b. Deep Fakes
c. Copyright d. Plagiarism
7. ………………………. can be used to generate large volumes of content quickly and efficiently.
a. Supervised learning b. Unsupervised
c. Generative AI d. Deep Fake
8. Name the type of Generative AI used for creating new sentences that mimic the style of Shakespeare or generating
dialogue for a chatbot.
a. Real Data b. GANs
c. VAEs d. RNNs
9. In generative AI, a ………………………. is the initial input or instruction given by the user to the AI model to guide it in
generating the desired content.
a. dataset b. prompt
c. data d. information
2. The classification of data elements into categories or labels was initially taught to the machine learning models by
………………………..
5. Gen AI is gaining popularity because people can use ………………………. to prompt AI.
7. If generative AI is trained on biased or incomplete data, the output may be similarly ………………………..
9. ………………………. is a web-based tool where you can make and change pictures using advanced AI technology called
Generative Adversarial Networks (GANs).
5. GNNs work by compressing the data into a lower-dimensional form (encoding) and then
decompressing it back to its original form (decoding). ……….……
6. “The Next Rembrandt Project used in data analysis” is an example of Generative AI in Music. ……….……
7. The Discriminator in GANs helps to distinguish between real and generated data. ……….……
8. Misinformation created by generative AI does not have any serious social impacts. ……….……
9. Generative AI tools are not capable of personalising content for individual users. ……….……
10. Privacy concerns in generative AI include the risk of generating sensitive personal information. ……….……
f. Banking
3. What are Recurrent Neural Networks ? List its important features with example.
Ans. ● RNNs are a special class of neural networks that excel at handling sequential data, like music or text.
● They excel at tasks where the order of the data points is important, as they can remember previous inputs and use
this information to influence current outputs.
● Example: Generating novel text in the style of a specific author or genre, such as creating new sentences that mimic the style of Shakespeare or generating dialogue for a chatbot (a small Python sketch follows below).
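A minimal sketch of one recurrent step, written in Python with NumPy, makes the idea of "remembering previous inputs" concrete. The weights and the short input sequence are made-up numbers chosen only for illustration and are not taken from any real RNN library.

import numpy as np

W_x = np.array([[0.5]])   # weight applied to the current input (made-up value)
W_h = np.array([[0.8]])   # weight applied to the previous hidden state (made-up value)

def rnn_step(x, h_prev):
    # The new hidden state mixes the current input with the previous hidden state,
    # so earlier inputs keep influencing later outputs.
    return np.tanh(W_x @ x + W_h @ h_prev)

h = np.zeros((1, 1))                          # hidden state starts empty
for t, x_t in enumerate([1.0, 0.0, 0.0]):     # a short input sequence
    h = rnn_step(np.array([[x_t]]), h)
    print("step", t, "hidden state:", round(float(h[0, 0]), 3))
# Even after the input drops back to 0, the hidden state still carries a trace of the
# earlier 1.0; this memory is what lets RNNs model sequences such as text or music.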
2. ………………………. refers to unpredictable fluctuations and disordered data in which no target patterns or relationships can be identified.
a. Deep fake b. Random Noise Dataset
c. Gen AI d. Discriminative Modelling
6. Which generative AI tool is designed to understand text, images, audio, video, and code?
a. ChatGPT b. Runway ML
c. Gemini d. Copilot
9. Which technology introduced the concept of the world's first neural network?
a. Transformers b. Perceptron
c. RNNs d. GAN
4. The ethical consideration of ………………………. in generative AI involves the risk of generating biased or discriminatory
content.
5. One of the primary advantages of autoencoders is their ability to create ……………………….-free images.
6. A major risk associated with generative AI is the potential to create misinformation, such as ………………………. and
deepfakes.
7. Recurrent Neural Networks (RNNs) are especially good at handling ………………………. data like text and music.
8. Generative AI refers to algorithms that create new data resembling ………………………. generated content.
9. The AI tool Artbreeder is used for blending and modifying ………………………. using GANs.
10. ………………………. is a web-based platform for creating and deploying generative models.
3. The Generator in GANs evaluates the generated data to ensure it is realistic. ……….……
6. Runway ML is a platform for creating and deploying artwork and is part of ChatGPT. ……….……
7. Generative AI is not capable of composing new music or remixing existing pieces. ……….……
8. Variational Autoencoders (VAEs) are used to sample new data from learned data distributions. ……….……
9. Generative AI algorithms are primarily used to analyse data rather than create new data. ……….……
10. “Generative AI is a threat to Privacy and has Data Security Risks.” Explain briefly.
3. What do you understand about Generative Artificial Intelligence? Give any two examples.
5. Give one point to support how Generative AI can be helpful in the following fields:
● Architecture
● Coding
● Music
● Content Creation
C. Competency-based/Application-based questions:
1. Sakshi has been assigned a homework essay on the topic, “The Impact of Climate Change on Coral Reefs.” The essay
requires Sakshi to research and explain various aspects of climate change, such as ocean acidification and rising sea
temperatures, and their effects on coral reef ecosystems. A friend suggested using a text generation tool. List
some guidelines for Sakshi to prevent misuse of Generative AI and use it constructively.
2. How do you think generative AI can revolutionise the creative industry, such as art and fashion, by enabling the
generation of unique and innovative designs?
3. Considering the ethical challenges associated with generative AI, what are your thoughts on establishing guidelines or
regulations to ensure responsible use of these technologies? How can we balance the potential benefits and risks?
4. Compare ChatGPT, Gemini, and Copilot on the basis of the following parameters:
● Parameter 1: Human-Like Response.
● Parameter 2: Training Dataset and Underlying Technology.
● Parameter 3: Authenticity of Response.
● Parameter 4: Access to the Internet.
● Parameter 5: User Friendliness and Interface.
● Parameter 6: Text Processing: Summarisation, Paragraph Writing, Etc.
In Life
Generative AI enables media and entertainment companies to streamline content creation processes. These AI models
can generate original scripts, articles, and even music compositions, freeing up human creators to focus on more complex
and creative tasks. Using one of these tools, prepare a skit for your morning class assembly related to the topic chosen.
Deep Thinking
Deep learning is a subset of machine learning that uses multi-layered neural networks, called deep neural networks,
to simulate the complex decision-making power of the human brain. Some form of deep learning powers most of the
artificial intelligence (AI) in our lives today. List some of the recent developments in deep learning.
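Before listing recent developments, it can help to see what "multi-layered" means in code. The sketch below, written in Python with NumPy, passes one input through two small layers; the random weights are untrained and purely illustrative, so it shows only how data flows through a deep network, not a real deep-learning framework.

import numpy as np

def relu(z):
    return np.maximum(0, z)          # a common activation used between layers

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))          # one input sample with 4 features

W1 = rng.normal(size=(4, 3))         # layer 1: 4 inputs -> 3 hidden units
W2 = rng.normal(size=(3, 2))         # layer 2: 3 hidden units -> 2 outputs

hidden = relu(x @ W1)                # the first layer transforms the input
output = hidden @ W2                 # the second layer produces the output
print(output)                        # untrained, so the numbers are meaningless;
                                     # training would adjust W1 and W2 layer by layer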
Computational Thinking Lab
1. Visit the link shared below and determine which image is real and which is AI-generated.
https://britannicaeducation.com/blog/quiz-real-or-ai/
State 5 points you have noted for easily identifying an AI-generated image.
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
Students can use spreadsheet software to clean this data by removing duplicates, filling in missing values,
and correcting errors to make it usable for analysis.
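The same cleaning steps can also be tried in Python using the pandas library. The sketch below works on a small made-up table; the column names and values are invented only to show the idea.

import pandas as pd

# Made-up survey data containing one duplicate row and one missing mark
data = pd.DataFrame({
    "Name":  ["Asha", "Ravi", "Ravi", "Meena"],
    "Marks": [78, 85, 85, None],
})

cleaned = data.drop_duplicates()                               # remove duplicate rows
cleaned = cleaned.fillna({"Marks": cleaned["Marks"].mean()})   # fill the missing value with the average
print(cleaned)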
2. Hands-on Activity
Explore the ChatGPT tool
a. Go to https://chat.openai.com/
b. Give the prompts listed below and see the answer that you get.
Answers
Quiz
1. a 2. b 3. a 4. b 5. b 6. b 7. c 8. d 9. b 10. d
Exercise
B. 1. algorithms 2. humans 3. labelled 4. labelled 5. Natural Language
6. GANs (Generative Adversarial Networks) 7. biased 8. Runway ML
9. Artbreeder 10. Gemini
C. 1. False 2. True 3. False 4. True 5. False
6. False 7. True 8. False 9. False 10. True
D. 1. d 2. a 3. b 4. f 5. c
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
3. List any two jobs in the field of Art that will soon be replaced by AI.
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
……………………….………………………………………….……………………….……………………….……………………….………………………………………..……..
…………………….………………………………………….……………………….……………………….……………………….………………………………………..……..