
Foundations of Machine Learning:

Assignment 1
Instructor: Prof. Xiangyang Ji
Due: 2023.03.21

Problem 1
30pts. (Exercise 2.3 in Foundations of Machine Learning) Concentric circles. Let X = ℝ² and consider the
set of concepts of the form c = {(x, y) : x² + y² ≤ r²} for some real number r. Show that this class can be
(ε, δ)-PAC-learned from training data of size m ≥ (1/ε) log(1/δ).
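Although the exercise only asks for a proof, the learner it implicitly analyzes is easy to make concrete: return the tightest origin-centred circle containing the positive examples. A minimal Python sketch (the function name and data layout are my own, not part of the assignment):

```python
import math

def learn_concentric_circle(sample):
    """Return the radius of the smallest concentric circle consistent
    with the sample.

    `sample` is a list of ((x, y), label) pairs, with label True for
    points inside the target circle. Taking the largest norm among the
    positive points gives a hypothesis that is always consistent and
    never exceeds the target radius r.
    """
    radii = [math.hypot(x, y) for (x, y), label in sample if label]
    return max(radii, default=0.0)
```

Because the learned radius never exceeds the true r, the hypothesis can only err on an annulus just inside the target circle; bounding the probability that a sample of size m misses an annulus of mass ε is the heart of the (1/ε) log(1/δ) bound.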

Problem 2
35pts. (Exercise 2.4 in Foundations of Machine Learning) Non-concentric circles. Let X = ℝ² and consider
the set of concepts of the form c = {x ∈ ℝ² : ‖x − x₀‖ ≤ r} for some point x₀ ∈ ℝ² and real number r.
Gertrude, an aspiring machine learning researcher, attempts to show that this class of concepts can be
(ε, δ)-PAC-learned with sample complexity m ≥ (3/ε) log(3/δ), but she is having trouble with her proof. Her
idea is that the learning algorithm would select the smallest circle consistent with the training data. She
has drawn three regions r₁, r₂, r₃ around the edge of concept c, each having probability ε/3 (see
Figure 1). She wants to argue that if the generalization error is greater than or equal to ε, then one of these
regions must have been missed by the training data, and hence that this event occurs with probability at most
δ.
Can you tell Gertrude whether her approach works? If the answer is no, can you fix it by drawing
more than three regions as above?


Figure 1: Gertrude's regions r₁, r₂, r₃.
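For concreteness, "the smallest circle consistent with the training data" is the minimum enclosing circle of the positive examples. A brute-force sketch, relying on the classical fact that the minimum enclosing circle is determined by two or three of the points (all names are mine, not part of the assignment):

```python
import itertools
import math

def circle_from_two(p, q):
    # Circle with segment pq as diameter.
    cx, cy = (p[0] + q[0]) / 2, (p[1] + q[1]) / 2
    return (cx, cy, math.dist(p, q) / 2)

def circle_from_three(p, q, r):
    # Circumcircle via the standard circumcenter formula.
    ax, ay = p; bx, by = q; cx, cy = r
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None  # collinear points have no circumcircle
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy, math.dist((ux, uy), p))

def contains_all(circle, points, tol=1e-9):
    cx, cy, r = circle
    return all(math.dist((cx, cy), pt) <= r + tol for pt in points)

def smallest_consistent_circle(positives):
    """Naive smallest enclosing circle (cx, cy, r) of the positive points."""
    if not positives:
        return (0.0, 0.0, 0.0)
    if len(positives) == 1:
        return (positives[0][0], positives[0][1], 0.0)
    candidates = [circle_from_two(p, q)
                  for p, q in itertools.combinations(positives, 2)]
    candidates += [c for trio in itertools.combinations(positives, 3)
                   if (c := circle_from_three(*trio)) is not None]
    return min((c for c in candidates if contains_all(c, positives)),
               key=lambda c: c[2])
```

This brute force runs in polynomial time and suffices for the learning argument; faster algorithms (e.g. Welzl's) exist but are beside the point of the exercise.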

Problem 3
35pts. (Exercise 2.9 in Foundations of Machine Learning) Consistent hypotheses. In this chapter, we
showed that for a finite hypothesis set H, a consistent learning algorithm A is a PAC-learning algorithm.
Here, we consider a converse question. Let Z be a finite set of m labeled points. Suppose that you are given
a PAC-learning algorithm A. Show that you can use A and a finite training sample S to find a hypothesis
h ∈ H that is consistent with Z, with high probability.
Note: in this problem, S denotes the training sample and Z denotes the set of all possible examples.
You should focus on showing that S is finite and that h is consistent with Z.
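The intended reduction can be sketched concretely: run A on the uniform distribution over Z with accuracy ε < 1/|Z|, so that any hypothesis with error below ε must classify every point of Z correctly (each point carries mass exactly 1/|Z|). A hypothetical illustration in Python, where `pac_learner` stands in for A and `sample_size` for its sample complexity at that (ε, δ):

```python
import random

def consistent_via_pac(pac_learner, Z, sample_size):
    """Use a PAC learner to find, with high probability, a hypothesis
    consistent with the finite labeled set Z.

    Drawing i.i.d. uniform samples from Z realizes the uniform
    distribution; with probability at least 1 - delta the returned
    hypothesis has error < 1/|Z| and hence misclassifies no point of Z.
    """
    sample = [random.choice(Z) for _ in range(sample_size)]
    return pac_learner(sample)
```

The point of the exercise is then to argue that the required `sample_size` is finite, which follows from A being a PAC learner.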

Problem 4
(Challenge, 20pts) Let H be the set of all half-planes in ℝ². We assume that the underlying concept h ∈ H.
Prove that this class is (ε, δ)-PAC-learnable.
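One natural route (not the only one) is to exhibit a learner that outputs a half-plane consistent with the training data; on linearly separable data the classical perceptron does exactly that. A sketch with labels in {−1, +1} (details and names are my own, not prescribed by the assignment):

```python
def learn_halfplane(sample, max_epochs=1000):
    """Perceptron sketch: find (w1, w2, b) with
    sign(w1*x + w2*y + b) matching every training label, assuming the
    sample is linearly separable with positive margin (guaranteed here,
    since the target concept is itself a half-plane).
    """
    w1 = w2 = b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for (x, y), label in sample:
            if label * (w1 * x + w2 * y + b) <= 0:
                # Standard perceptron update on a mistake.
                w1 += label * x
                w2 += label * y
                b += label
                mistakes += 1
        if mistakes == 0:
            return (w1, w2, b)
    raise ValueError("no consistent half-plane found within max_epochs")
```

By the perceptron convergence theorem the loop terminates on separable data, so the learner is consistent; the PAC guarantee itself must then come from a complexity argument about the class of half-planes.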

