Unit 1 - Introduction: Artificial Intelligence For Everyone: Categorize The Given Applications Into The Three Domains
Natural Language Processing (NLP) is defined as a branch of Artificial Intelligence that deals with the interaction between computers and humans using natural language. The main objective of NLP is to read, interpret, comprehend, and make sense of human language in a way that creates value for all. Therefore, we can safely conclude that NLP essentially comprises Natural Language Understanding (NLU, human to machine) and Natural Language Generation (NLG, machine to human).
Applications of NLP:
• Speech Recognition
• Google translation
• Chatbots
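The "read and interpret" step of NLP can be illustrated with a toy sketch: splitting a sentence into words (tokens) and counting how often each appears. This is only a made-up illustration of the very first step of understanding text, not a real NLP pipeline; the sentence used here is an assumption for demonstration.

```python
# A toy sketch of a first NLP step: breaking a sentence into
# words (tokens) and counting how often each one appears.
from collections import Counter

sentence = "AI helps computers understand human language and human speech"
tokens = sentence.lower().split()   # split on whitespace into words
counts = Counter(tokens)            # count occurrences of each word

print(counts["human"])  # the word "human" appears 2 times
```

Real NLP systems go far beyond this, handling grammar, meaning, and context, but every pipeline starts by turning raw text into units a program can count and compare.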
• Computer Vision
Computer Vision (CV) is a field of study that enables computers to “see”. It is a subfield of AI that involves extracting information from digital images, such as videos and photographs, and analyzing and understanding their content to produce numerical and symbolic information.
When we take a digital image, it is essentially a grid of tiny colored dots called pixels. Each
pixel represents a tiny portion of the image and contains information about its color and
intensity.
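The grid-of-pixels idea can be sketched in Python. The tiny 2×3 grayscale “image” below is made up for illustration: each number is one pixel's intensity, where 0 is black and 255 is white.

```python
# A digital image as a grid of pixels: a toy 2x3 grayscale image,
# stored as a nested list. Each number is one pixel's intensity
# (0 = black, 255 = white).
image = [
    [0, 128, 255],
    [34, 200, 90],
]

height = len(image)       # number of rows of pixels
width = len(image[0])     # number of pixels per row
print("Size:", width, "x", height, "pixels")
print("Top-left pixel intensity:", image[0][0])
```

A colour image works the same way, except each pixel holds three values (red, green, blue) instead of a single intensity.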
Applications of Computer Vision:
• Object Detection
• Optical Character Recognition
• Fingerprint Recognition
• Facial Recognition
There are primarily four tasks that Computer Vision accomplishes:
1. Semantic Segmentation (Image Classification)
2. Classification + Localization
3. Object Detection
4. Instance Segmentation
• Data Science
Data science is the study of data to extract meaningful insights for business. It is a multidisciplinary approach that combines principles and practices from the fields of mathematics, statistics, artificial intelligence, and computer engineering to analyze large amounts of data.
Data might be facts, statistics, opinions, or any kind of content that is recorded in some format. This could include voices, photos, names, and even dance moves! It surrounds us and shapes our experiences, decisions, and interactions.
For example:
• Your search recommendations and Google Maps history are based on your previous data.
• Amazon's personalized recommendations are influenced by your shopping habits.
• Social media activity, cloud storage, textbooks, and more are all forms of data.
UNIT 2 - UNLOCKING YOUR FUTURE IN AI
Create an empathy map for a student, Kunal, who is planning to buy a new
laptop for educational purposes.

SAYS
• Which brand and latest model is best?
• Which laptop has the best battery for a long backup?
• Which store gives the best prices?

THINKS
• Am I missing anything?
• What budget is best?
• Should I take a decision fast?

DOES
• Comparing some of the best laptops.
• Getting reviews from friends.
• Checking more information about laptops on websites.

FEELS
• I am confused.
• I am not getting enough information.
• Excited for my new laptop.
UNIT 5 - DATA LITERACY
Python programs to demonstrate the use of mean, median, mode, standard
deviation and variance.
1. MEAN
The mean() function is used to calculate the arithmetic mean of the numbers in the list.
import statistics
datasets = [5, 2, 7, 4, 2, 6, 8]
x = statistics.mean(datasets)
print("Mean is :", x)

Output: Mean is : 4.857142857142857
2. MEDIAN
The median() function is used to return the middle value of the numeric data in the list.
import statistics
datasets = [5, 2, 7, 4, 2, 6, 8]
x = statistics.median(datasets)
print("Median is :", x)

Output: Median is : 5
3. MODE
The mode() function returns the most frequently occurring value in the list.

import statistics
datasets = [5, 2, 7, 4, 2, 6, 8]
x = statistics.mode(datasets)
print("Mode is :", x)

Output: Mode is : 2
4. STANDARD DEVIATION
The stdev() function is used to calculate the standard deviation of a given sample, available in the form of a list.

import statistics
datasets = [5, 2, 7, 4, 2, 6, 8]
x = statistics.stdev(datasets)
print("Standard Deviation:", x)

Output: Standard Deviation: 2.340126166724879
5. VARIANCE
Calculate the variance of a dataset using the statistics module's variance() function for
sample variance.
import statistics
data = [10, 12, 23, 23, 16, 23, 21, 16]
sample_variance = statistics.variance(data)
print("Sample Variance:")
print(sample_variance)

Output:
Sample Variance:
27.428571428571427
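As a quick check tying standard deviation and variance together: the standard deviation is simply the square root of the variance. A minimal sketch, reusing the sample data from the variance example above:

```python
import math
import statistics

data = [10, 12, 23, 23, 16, 23, 21, 16]

sample_variance = statistics.variance(data)  # sample variance
sample_stdev = statistics.stdev(data)        # sample standard deviation

# stdev is the square root of variance, so this prints True.
print(math.isclose(sample_stdev, math.sqrt(sample_variance)))
```

This is why the two functions always agree: stdev() and variance() both use the same sample formula, differing only by the final square root.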
UNIT 6 – MACHINE LEARNING ALGORITHMS
IBM's Project Debater is an AI marvel that has captivated audiences with its debating
prowess. Here are some fascinating facts about this innovative system:
Origins and Development: Project Debater was conceptualized in 2011 by Noam Slonim at
IBM's Haifa, Israel lab. It marked IBM's third Grand Challenge, following Deep Blue and
Watson.
First Public Debate: On February 11, 2019, Project Debater made its public debut in San
Francisco, debating the motion "We should subsidize preschools" against Harish Natarajan, a
world-renowned debater. This event showcased AI's potential to engage in complex,
persuasive discussions.
Unique Capabilities: Project Debater combines data-driven speech writing, delivery, and
listening comprehension. It can synthesize vast amounts of information to create structured
arguments and understand human opponents' points to formulate rebuttals. Its use of a
knowledge graph enables it to construct principled arguments on human dilemmas.
Television Feature: The AI system was featured in "That’s Debatable," a TV series that
explored topics like wealth redistribution and the US-China space race. Project Debater
contributed insightful perspectives on these issues.
Applications: Beyond debates, Project Debater's technologies are utilized in business and
academia for tasks like opinion analysis, summarization, and text classification, illustrating
its versatility and practical value.
Offline Operation: Unlike many AI systems, Project Debater doesn't rely on internet access,
operating based on internal data and capabilities.
UNIT 8 – AI ETHICS AND VALUES
Summarize your insights and interpretations from the video "Humans need not
apply.”
The "Humans need not apply” was uploaded by CGP Grey and its discusses quick
development of automation technology and its effects on the labour market are covered in
the film. It contends that the current wave of automation, fueled by robotics and artificial
intelligence, poses a severe threat to both professional and low-skilled employment. It
provides historical instances of how technology has replaced human jobs. The narrator lists
several forms of automation, such as self-driving cars and all purpose robots, and predicts
that a large number of employment in a variety of industries would be greatly impacted. As
automation increases,the film urges proactive conversations on the nature of labour in the
future.
IBM SkillsBuild
Course Completion
Certificates