
ML Problems

The document outlines the Candidate Elimination Algorithm and the Find-S Algorithm, detailing how to adjust hypotheses based on positive and negative training examples. It illustrates the process of extending specific boundaries and retaining consistent hypotheses at generic boundaries through a series of examples. The document concludes with instructions to apply the candidate elimination algorithm to obtain the final version space for given training examples.


Candidate Elimination Algorithm

S0: (ø, ø, ø, ø, ø, ø) Most Specific Boundary

G0: (?, ?, ?, ?, ?, ?) Most Generic Boundary

The first example is positive; the hypothesis at the specific boundary is inconsistent, hence we
extend the specific boundary, and the hypothesis at the generic boundary is consistent, hence we
retain it.

S1: (Sunny, Warm, Normal, Strong, Warm, Same)

G1: (?, ?, ?, ?, ?, ?)

The second example is positive; again the hypothesis at the specific boundary is inconsistent,
hence we extend the specific boundary, and the hypothesis at the generic boundary is consistent,
hence we retain it.

S2: (Sunny, Warm, ?, Strong, Warm, Same)

G2: (?, ?, ?, ?, ?, ?)

The third example is negative; the hypothesis at the specific boundary is consistent, hence we
retain it, and the hypothesis at the generic boundary is inconsistent, hence we write all consistent
minimal specializations by replacing one "?" (question mark) at a time with a specific value.
S3: (Sunny, Warm, ?, Strong, Warm, Same)

G3: (Sunny, ?, ?, ?, ?, ?), (?, Warm, ?, ?, ?, ?), (?, ?, ?, ?, ?, Same)
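The specialization step above (replacing one "?" at a time) can be sketched in Python. The tuple encoding and the function name `specialize` are illustrative assumptions, not part of the original algorithm statement:

```python
# Minimal specialization of the all-"?" hypothesis against the negative
# example: replace one "?" at a time with the corresponding value from
# the specific-boundary hypothesis S2, keeping only replacements that
# actually exclude the negative example.

S2 = ("Sunny", "Warm", "?", "Strong", "Warm", "Same")
negative = ("Rainy", "Cold", "High", "Strong", "Warm", "Change")

def specialize(g, s, neg):
    """Return minimal specializations of g that exclude neg."""
    out = []
    for i, (gv, sv, nv) in enumerate(zip(g, s, neg)):
        # Only positions where S disagrees with the negative example
        # can rule it out.
        if gv == "?" and sv != "?" and sv != nv:
            out.append(g[:i] + (sv,) + g[i + 1:])
    return out

G3 = specialize(("?",) * 6, S2, negative)
```

Running this yields exactly the three hypotheses listed for G3: only the Sky, AirTemp, and Forecast attributes distinguish the negative example from S2.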

The fourth example is positive; the hypothesis at the specific boundary is inconsistent, hence we
extend the specific boundary, and the consistent hypotheses at the generic boundary are retained,
while the inconsistent hypothesis (?, ?, ?, ?, ?, Same) is removed.

S4: (Sunny, Warm, ?, Strong, ?, ?)

G4: (Sunny, ?, ?, ?, ?, ?), (?, Warm, ?, ?, ?, ?)
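The whole trace above can be reproduced with a compact sketch. The encoding of ø as the string "0", the `covers` helper, and the variable names are assumptions made for illustration:

```python
# Candidate elimination on the four EnjoySport examples used above.

data = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def covers(h, x):
    """True if hypothesis h classifies instance x as positive."""
    return all(hv in ("?", xv) for hv, xv in zip(h, x))

S = ("0",) * 6          # "0" stands for the most specific value ø
G = [("?",) * 6]        # most generic boundary

for x, positive in data:
    if positive:
        # Generalize S just enough to cover x; drop G members that fail x.
        S = tuple(xv if sv in ("0", xv) else "?" for sv, xv in zip(S, x))
        G = [g for g in G if covers(g, x)]
    else:
        # Specialize each covering member of G minimally so it excludes x.
        # (In general, S members covering a negative example must also be
        #  removed; on this data S already excludes x, so it is unchanged.)
        newG = []
        for g in G:
            if not covers(g, x):
                newG.append(g)
                continue
            for i, (gv, sv, xv) in enumerate(zip(g, S, x)):
                if gv == "?" and sv != "?" and sv != xv:
                    newG.append(g[:i] + (sv,) + g[i + 1:])
        G = newG
```

After the loop, `S` and `G` match the S4 and G4 boundaries derived step by step above.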


1. Apply the candidate-elimination algorithm to obtain the final version space for the
given training examples. [Dec 2018, Jan 2019, Jul 2019, Aug 2020]

2. Find_S algorithm with example

• The first step of FIND-S is to initialize h to the most specific hypothesis in H


h ← (Ø, Ø, Ø, Ø, Ø, Ø)
• Consider the first training example
x1 = <Sunny, Warm, Normal, Strong, Warm, Same>,+
Observing the first training example, it is clear that hypothesis h is too specific. None
of the "Ø" constraints in h are satisfied by this example, so each is replaced by the next
more general constraint that fits the example
h1 = <Sunny, Warm, Normal, Strong, Warm, Same>
• Consider the second training example
x2 =<Sunny, Warm, High, Strong, Warm, Same>,+
The second training example forces the algorithm to further generalize h, this time
substituting a "?" in place of any attribute value in h that is not satisfied by the new
example
h2 = <Sunny, Warm, ?, Strong, Warm, Same>

• Consider the third training example


x3 =<Rainy, Cold, High, Strong, Warm, Change>,-

Upon encountering the third training example, the algorithm makes no change to h. The
FIND-S algorithm simply ignores every negative example.
h3 = <Sunny, Warm, ?, Strong, Warm, Same>

• Consider the fourth training example


x4 = <Sunny Warm High Strong Cool Change>,+

The fourth example leads to a further generalization of h


h4 = <Sunny, Warm, ?, Strong, ?, ?>
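The four FIND-S steps above can be sketched in a few lines of Python. The "0"-for-ø encoding and the function name `find_s` are assumptions made for this sketch:

```python
# FIND-S on the same four EnjoySport examples.

examples = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def find_s(examples, n_attrs=6):
    h = ("0",) * n_attrs             # most specific hypothesis ("0" = ø)
    for x, positive in examples:
        if not positive:
            continue                 # FIND-S ignores negative examples
        # Keep attributes that agree (or are still ø); generalize the
        # disagreeing ones to "?".
        h = tuple(xv if hv in ("0", xv) else "?" for hv, xv in zip(h, x))
    return h
```

Calling `find_s(examples)` reproduces the final hypothesis h4 derived above.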
3. Apply the candidate elimination algorithm and obtain the version space for the
given training examples. [Jan 2021]
