
INTRO TO MACHINE LEARNING

Assignment 1:

Exercise 1):
Task 1):

a) Supervised Learning: Learning from labeled data, in which the system uses example input-output pairs to predict outputs for new, unseen data.

b) Unsupervised Learning: Learning from unlabeled data, in which the machine detects patterns or structures on its own.

c) Reinforcement Learning: Learning through trial and error while interacting with an environment and receiving feedback in the form of rewards.
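As a rough illustration of the first two paradigms, the sketch below fits a supervised classifier on labeled points and an unsupervised clustering model on the same points without labels. It is a minimal sketch, assuming scikit-learn is installed; the toy arrays are invented purely for illustration.

# Minimal sketch contrasting supervised and unsupervised learning.
# Assumes scikit-learn is installed; the toy data is illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Toy feature matrix: two attributes per object.
X = np.array([[1.0, 2.0], [1.2, 1.8], [8.0, 9.0], [7.5, 9.5]])

# Supervised learning: a label is provided for every training example.
y = np.array([0, 0, 1, 1])
clf = LogisticRegression().fit(X, y)
print(clf.predict([[1.1, 2.1]]))   # predicts a label for fresh data

# Unsupervised learning: no labels; the model finds structure on its own.
km = KMeans(n_clusters=2, n_init=10).fit(X)
print(km.labels_)                  # cluster assignments discovered from X alone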

Task 2)
Paradigm Choices:

1) Sentiment Analysis: Supervised Learning (learns from text tagged with positive or negative sentiment).
2) Data Compression: Unsupervised Learning (reduces redundancy in the data without requiring labels).
3) Self-driving Cars: Reinforcement Learning (the car learns by interacting with its surroundings and improving based on feedback).
4) Personalized Content Recommendation: Supervised Learning (commonly trained on labeled user feedback such as ratings or clicks).
5) Spam Filtering: Supervised Learning (trained on emails tagged as spam or non-spam; see the sketch after this list).
6) Sorting Fruits in a Basket by Type: Supervised Learning (assuming labeled examples of each fruit type are available).
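A hedged sketch of the spam-filtering case from item 5: a supervised text classifier trained on a handful of invented example emails. The CountVectorizer-plus-naive-Bayes pipeline is one common choice, not the only one, and scikit-learn is assumed to be installed.

# Minimal supervised spam-filter sketch; the example emails are invented
# purely for illustration and scikit-learn is assumed to be installed.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

emails = [
    "win a free prize now",                # tagged spam
    "limited offer click here",            # tagged spam
    "meeting moved to 3pm",                # tagged non-spam
    "please review the attached report",   # tagged non-spam
]
labels = ["spam", "spam", "ham", "ham"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)        # bag-of-words features

model = MultinomialNB().fit(X, labels)      # learn from the tagged examples
print(model.predict(vectorizer.transform(["free prize inside"])))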

Exercise 2: Specification of Learning Tasks


For the mushroom classification task, match each item with a symbol:
• A pile of mushrooms: symbol O (the set of objects).
• A table with size, weight, color: symbol X (the feature space).
• A human expert on mushrooms: symbol γ(o) (the mapping from objects to classes).
• A device that measures attributes: symbol α(o) (the feature-extraction function mapping objects to feature vectors).
• The set {Poisonous, Edible}: symbol C (the set of classes).
• The machine learning system: symbol y(x) (the predictive model mapping features to classes).

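To make the correspondence concrete, here is a hedged sketch of how these symbols could map onto code; the mushroom records, measurements, and the toy decision rule inside y are all invented for illustration.

# Hedged sketch of the learning-task components from Exercise 2.
# All concrete values and the toy decision rule are invented for illustration.

# O: the objects themselves (a pile of mushrooms).
mushrooms = ["mushroom_1", "mushroom_2"]

# alpha(o): the measuring device, mapping an object to a feature vector in X.
def alpha(mushroom):
    measurements = {                       # hypothetical (size_cm, weight_g, color)
        "mushroom_1": (4.0, 20.0, "brown"),
        "mushroom_2": (9.0, 55.0, "red"),
    }
    return measurements[mushroom]

# C: the set of classes.
C = {"Poisonous", "Edible"}

# gamma(o): the human expert, assigning the true class to each object.
def gamma(mushroom):
    expert_labels = {"mushroom_1": "Edible", "mushroom_2": "Poisonous"}
    return expert_labels[mushroom]

# y(x): the machine learning system, predicting a class from the features alone.
def y(x):
    size, weight, color = x
    return "Poisonous" if color == "red" else "Edible"   # toy rule, not learned

for o in mushrooms:
    x = alpha(o)
    print(o, x, "expert:", gamma(o), "model:", y(x))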
Exercise 3: Data Annotation and Feature Engineering


1. Annotation: The annotated data is attached separately.
2. Roles and Functions:
o Role of a group member: symbol γ(o) (acting as the human expert from Exercise 2, assigning labels to objects).
o Function built in this exercise: symbol y(x) (the function being constructed to predict or classify from features).

Exercise 5:

Task (a): Apply the Find-S Algorithm


The Find-S algorithm starts with the most specific hypothesis possible and generalizes it step by step so that it covers every positive example (negative examples are ignored).
1. Initialize:
o Start with the most specific hypothesis h = (∅, ∅, ∅, ∅), which covers no examples at all.
2. Update the hypothesis for each example:
o For each positive example (Run-a-red-light = Yes), minimally generalize the hypothesis so that it covers that example; leave it unchanged for negative examples.
Step-by-step updates:
o Example 1 (Positive: Run-a-red-light = Yes):
• The hypothesis becomes (Monday, No, Easygoing, Evening).
o Example 2 (Negative: Run-a-red-light = No):
• No update, as Find-S ignores negative examples.
o Example 3 (Negative: Run-a-red-light = No):
• No update, as Find-S ignores negative examples.
o Example 4 (Positive: Run-a-red-light = Yes):
• Generalize every attribute that differs from the current hypothesis; only the fourth attribute differs, so it becomes "?":
• (Monday, No, Easygoing, ?)
Final hypothesis (Find-S):
o After processing all examples, the final hypothesis is:
h = (Monday, No, Easygoing, ?)
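A hedged Python sketch of Find-S matching the trace above. The two positive rows are reconstructed from the steps (Example 1 is (Monday, No, Easygoing, Evening); Example 4 shares its first three values and differs only in the fourth, for which "Morning" is used as a placeholder); the negative rows are omitted because Find-S ignores them.

# Hedged Find-S sketch. "0" stands for the empty (most specific) value and
# "?" for "any value"; only positive examples drive the generalization.
def find_s(positive_examples):
    n = len(positive_examples[0])
    h = ["0"] * n                      # most specific hypothesis (covers nothing)
    for x in positive_examples:
        for i, value in enumerate(x):
            if h[i] == "0":
                h[i] = value           # first positive example: copy its values
            elif h[i] != value:
                h[i] = "?"             # conflicting values: generalize to "?"
    return tuple(h)

# Positive rows reconstructed from the trace above; the fourth value of the
# second row ("Morning") is only a placeholder for "anything other than Evening".
positives = [
    ("Monday", "No", "Easygoing", "Evening"),
    ("Monday", "No", "Easygoing", "Morning"),
]
print(find_s(positives))   # expected: ('Monday', 'No', 'Easygoing', '?')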

Task (b): Apply the Candidate-Elimination Algorithm


The Candidate-Elimination algorithm maintains two boundary sets of hypotheses: S (the specific boundary) and G (the general boundary).
1. Initialize:
o Start with S = {(∅, ∅, ∅, ∅)} (the most specific hypothesis).
o Start with G = {(?, ?, ?, ?)} (the most general hypothesis).
2. Update S and G for each example:
o Example 1 (Positive: Run-a-red-light = Yes):
• Generalize S to (Monday, No, Easygoing, Evening).
• No change in G, as it is already consistent with this example.
o Example 2 (Negative: Run-a-red-light = No):
• Remove any hypothesis in G that classifies this example as positive.
• Specialize the remaining G hypotheses so that they exclude this negative example. The refined G set becomes:
• (Monday, ?, ?, ?), (?, No, ?, ?), (?, ?, Easygoing, ?), (?, ?, ?, Evening)
o Example 3 (Negative: Run-a-red-light = No):
• Further refine G by removing or specializing hypotheses that would incorrectly classify this example as positive.
• G is updated to:
• (Monday, ?, Easygoing, ?), (?, No, Easygoing, ?)
o Example 4 (Positive: Run-a-red-light = Yes):
• Generalize S as needed so that it is consistent with this positive example:
• S = {(Monday, No, Easygoing, ?)}
• Refine G so that it remains consistent:
• G = {(Monday, ?, Easygoing, ?)}
Final boundary sets:
o S = {(Monday, No, Easygoing, ?)}
o G = {(Monday, ?, Easygoing, ?)}

Task (c): Determine the Version Space H_D

The version space H_D is the set of hypotheses that lie between the boundaries defined by S and G.
• In this case, the only consistent hypothesis that fits within both S and G is:
H_D = {(Monday, No, Easygoing, ?)}
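As a hedged check of Task (c), the sketch below encodes the usual "more general than or equal to" relation between hypotheses and tests whether a candidate lies between the S and G boundaries found above; the helper names are illustrative, not part of the assignment.

# Hedged version-space membership check; helper names are illustrative only.
def more_general_or_equal(h1, h2):
    """True if hypothesis h1 is at least as general as h2 ('?' matches anything)."""
    return all(a == "?" or a == b for a, b in zip(h1, h2))

S = [("Monday", "No", "Easygoing", "?")]
G = [("Monday", "?", "Easygoing", "?")]

def in_version_space(h):
    # h must be at least as general as some member of S and
    # at most as general as some member of G.
    return (any(more_general_or_equal(h, s) for s in S) and
            any(more_general_or_equal(g, h) for g in G))

print(in_version_space(("Monday", "No", "Easygoing", "?")))   # True
print(in_version_space(("?", "?", "?", "?")))                 # False: too general for G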
Summary of Answers
1. Find-S Hypothesis:
o The final hypothesis from Find-S is (Monday, No, Easygoing, ?).
2. Candidate-Elimination Boundary Sets:
o Specific boundary S: {(Monday, No, Easygoing, ?)}
o General boundary G: {(Monday, ?, Easygoing, ?)}
3. Version Space:
o The version space H_D contains the hypothesis (Monday, No, Easygoing, ?).

Exercise 6:

Part a): Yes, a version space H_D can contain hypotheses that are neither in the specific boundary H_S nor in the general boundary H_G.

Part b): Options 3 and 4.


Part c): The Find-S algorithm has a stronger inductive bias than the Candidate-Elimination algorithm, since it additionally commits to a maximally specific consistent hypothesis.
