UNIT I-Part 2
Introduction to Machine
Learning and Supervised
Learning
Code: U18CST7002
Presented by: Nivetha R
Department: CSE
Learning Multiple Classes
• Handling Doubt
• Doubt cases: instances for which no hypothesis predicts 1, or more than one hypothesis predicts 1.
• Rejection: the classifier rejects instances that fall in doubt regions and defers them for human review (see the sketch after this list).
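A minimal sketch of this reject option is given below, assuming one axis-aligned rectangle hypothesis per class as in the family-car example; the class names, rectangle bounds, and test points are all hypothetical.

```python
# Sketch: one hypothesis per class, with rejection in doubt regions.
# All bounds and data points below are hypothetical.

def make_rectangle_hypothesis(p1, p2, e1, e2):
    """h(x) = 1 if the car's (price, engine power) lies inside the rectangle."""
    def h(price, engine_power):
        return int(p1 <= price <= p2 and e1 <= engine_power <= e2)
    return h

# One hypothesis per class (illustrative bounds only).
hypotheses = {
    "family car": make_rectangle_hypothesis(15000, 30000, 100, 200),
    "sports car": make_rectangle_hypothesis(25000, 80000, 250, 500),
}

def classify_with_reject(price, engine_power):
    """Predict a class, or reject when the instance falls in a doubt region."""
    positives = [c for c, h in hypotheses.items() if h(price, engine_power) == 1]
    if len(positives) == 1:
        return positives[0]
    # Doubt: no hypothesis predicted 1, or more than one did.
    return "REJECT (forward for human review)"

print(classify_with_reject(20000, 150))  # exactly one hypothesis fires -> "family car"
print(classify_with_reject(5000, 60))    # no hypothesis fires -> rejected
```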
Example 1
Consider the problem of assigning the label “family car” (indicated by “1”) or “not family car” (indicated by “0”) to cars. Given the following training set for the problem, and assuming that the hypothesis space is as defined by (p1 ≤ price ≤ p2) AND (e1 ≤ engine power ≤ e2), find the version space for the problem.
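A minimal sketch of how the most specific consistent hypothesis S can be computed for this rectangle hypothesis class is shown below; the slide’s actual training set is not reproduced here, so the (price, engine power) points are hypothetical.

```python
# Sketch: most specific hypothesis S for the class
# (p1 <= price <= p2) AND (e1 <= engine power <= e2).
# The (price, engine power) training points below are hypothetical.

positives = [(18000, 120), (22000, 160), (25000, 140)]   # label 1: family car
negatives = [(9000, 60), (40000, 300)]                    # label 0: not family car

# S is the tightest rectangle covering every positive example.
p1 = min(price for price, _ in positives)
p2 = max(price for price, _ in positives)
e1 = min(power for _, power in positives)
e2 = max(power for _, power in positives)

def covers(price, power):
    return p1 <= price <= p2 and e1 <= power <= e2

# S must also exclude every negative example to be consistent.
assert not any(covers(pr, po) for pr, po in negatives)
print(f"S: {p1} <= price <= {p2} AND {e1} <= engine power <= {e2}")

# The version space consists of every rectangle lying between S and the most
# general rectangles G that still exclude all negative examples.
```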
Example 1-solution
Regression
• Dependent Variable (y):
• The variable we are trying to predict or explain.
• Also known as the response variable.
• Linear Relationship:
• The relationship between the dependent and independent variables is modeled as a straight line.
• The general form of the linear equation: y = mx + c
Regression
https://www.youtube.com/watch?v=CtsRRUddV2s
Regression
Regression is a type of supervised learning where the
output is a numeric value, not a Boolean class.
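A minimal sketch of fitting such a line by least squares is shown below; the library call (numpy.polyfit) and the data points are illustrative choices, not part of the slides.

```python
# Sketch: fit y = mx + c to hypothetical data with ordinary least squares.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # independent variable
y = np.array([2.1, 4.3, 6.0, 8.2, 9.9])   # dependent (response) variable

# polyfit with degree 1 returns the slope m and intercept c of the best-fit line.
m, c = np.polyfit(x, y, 1)
print(f"fitted line: y = {m:.2f}x + {c:.2f}")

# Regression outputs a numeric value for a new input, not a class label.
print(f"prediction at x = 6: {m * 6 + c:.2f}")
```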
• Confidence:
• The confidence of an association rule X → Y measures the probability that Y is purchased given that X is purchased.
• Confidence(X → Y) = support(X ∪ Y) / support(X) = P(Y | X)
• Lift:
• Lift (also known as interest) measures the strength of an association rule relative to the random co-occurrence of X and Y.
• Lift(X → Y) = support(X ∪ Y) / (support(X) × support(Y)) = P(X, Y) / (P(X) P(Y)) (a worked sketch follows these definitions)
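A minimal worked sketch of both measures on a hypothetical set of transactions is given below; the item names and the rule {bread} → {milk} are chosen only for illustration.

```python
# Sketch: support, confidence, and lift for the rule X -> Y on hypothetical data.

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk"},
    {"bread", "milk", "eggs"},
]

def support(itemset):
    """Fraction of transactions that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

X, Y = {"bread"}, {"milk"}
confidence = support(X | Y) / support(X)            # P(Y | X)
lift = support(X | Y) / (support(X) * support(Y))   # P(X, Y) / (P(X) P(Y))

print(f"support(X∪Y)    = {support(X | Y):.2f}")
print(f"confidence(X→Y) = {confidence:.2f}")
print(f"lift(X→Y)       = {lift:.2f}")
```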
Apriori algorithm
• The Apriori algorithm is a popular method for mining frequent itemsets and generating association rules (a sketch follows this list). It operates in two main steps:
• Finding Frequent Itemsets:
• The algorithm first identifies itemsets that have sufficient support.
• It prunes the search space using the fact that every subset of a frequent itemset must also be frequent.
• Generating Rules:
• Once the frequent itemsets are identified, the algorithm generates rules by splitting each itemset into an antecedent and a consequent and calculating the rule’s confidence.
• Rules that do not meet the minimum confidence threshold are discarded.
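A minimal sketch of both Apriori steps is shown below, repeating the hypothetical transactions from the previous snippet so the block is self-contained; the support and confidence thresholds are illustrative.

```python
# Sketch: Apriori - frequent-itemset mining followed by rule generation.
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk"},
    {"bread", "milk", "eggs"},
]
min_support, min_confidence = 0.4, 0.6

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Step 1: find frequent itemsets level by level. Only frequent itemsets are
# extended, because every subset of a frequent itemset must itself be frequent.
items = sorted(set().union(*transactions))
level = {frozenset([i]) for i in items if support({i}) >= min_support}
all_frequent = set(level)
while level:
    size = len(next(iter(level))) + 1
    candidates = {a | b for a in level for b in level if len(a | b) == size}
    level = {c for c in candidates if support(c) >= min_support}
    all_frequent |= level

# Step 2: split each frequent itemset into antecedent -> consequent and keep
# only the rules whose confidence meets the threshold.
for itemset in (s for s in all_frequent if len(s) >= 2):
    for r in range(1, len(itemset)):
        for antecedent in map(frozenset, combinations(itemset, r)):
            consequent = itemset - antecedent
            conf = support(itemset) / support(antecedent)
            if conf >= min_confidence:
                print(f"{set(antecedent)} -> {set(consequent)} (confidence = {conf:.2f})")
```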