Intro ML Linear Classifier

The document discusses machine learning and describes different machine learning problem classes including supervised learning techniques like classification and regression as well as unsupervised learning techniques like density estimation and clustering. It also covers evaluation criteria, hypothesis classes, learning algorithms, and examples of algorithms like perceptron.

https://www.facebook.com/hcmut.ml.iot.lab

4/2021
Machine learning focuses on making decisions and predictions based on data.

A model in machine learning is a means to the end of making good predictions or decisions.
Applications:
What do we do?
Get data, organize data.
Design a space of possible solutions.
Characterize the objective function.
Choose an algorithm and its parameters.
Run.
Validate the resulting solution.
Learning from data is a problem of induction.

Training data are IID (independent and identically distributed).

In general, we deal with two problems in machine learning:
- Estimation
- Generalization
Problem class:
- Supervised learning:
The learning system is given inputs and told
which specific outputs should be associated
with them.
+ Classification
{(x(1), y(1)), (x(2), y(2)), … , (x(n), y(n))}
where x(i) ∈ Rd, y(i) is a discrete value.
The goal is to predict y(n+1) for a new input x(n+1).
+ Regression
Same as classification, but y(i) is a
continuous value.
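
A minimal NumPy sketch of this supervised data layout; the toy numbers and array names are invented for illustration:

import numpy as np

X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [-1.0, -1.5]])   # each row is one x(i) in R^d, here d = 2
y = np.array([+1, +1, -1])     # discrete labels -> classification
# If y instead held continuous values (e.g. prices), the same layout
# would describe a regression problem.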
Problem class:
- Unsupervised learning:
One is given a data set and generally
expected to find some patterns or structure
inherent in it.
+ Density estimation
+ Clustering
+ Dimensionality reduction
Problem class:
- Reinforcement learning
- Sequence learning
Hypothesis class: A guess about the
relationship between x and y.

h ∈ H
y = h(x; Θ), where Θ is the parameter vector
x ∈ Rd
H is the hypothesis class, a set of candidate hypotheses.

Θ picks out a particular hypothesis from H; h is parameterized by Θ.

x → h(·; Θ) → y
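
A tiny sketch of how a parameter selects one hypothesis from a family; the 1-d threshold family below is only an illustration, not the lecture's H:

def h(x, theta):
    """Member of H = { x -> +1 if x > theta else -1 : theta in R }."""
    return +1 if x > theta else -1

print(h(0.7, theta=0.5))   # +1: theta = 0.5 picks one particular hypothesis
print(h(0.7, theta=1.0))   # -1: a different theta picks a different hypothesis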
Evaluation criteria: 2 levels
- How an individual prediction is scored
- How the overall system is scored

Loss function L(g, a): how much you will be penalized for making a guess g when the answer is actually a.

a. 0-1 loss
L(g, a) = 0 if g = a, 1 otherwise

b. Squared loss
L(g, a) = (g − a)^2

c. Linear loss
L(g, a) = |g − a|
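
A short Python sketch of the three losses (the function names are mine):

def zero_one_loss(g, a):
    return 0 if g == a else 1

def squared_loss(g, a):
    return (g - a) ** 2

def linear_loss(g, a):          # absolute difference
    return abs(g - a)

print(zero_one_loss(+1, -1))    # 1
print(squared_loss(3.0, 1.5))   # 2.25
print(linear_loss(3.0, 1.5))    # 1.5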
Evaluation criteria: 2 levels
- How an individual prediction is scored
- How the overall system is scored

∙ Small loss on training data:

E_train(Θ) = (1/n) * ∑_{i=1..n} L(h(x(i); Θ), y(i))

This is the training error (n is the number of training examples).

∙ Test error:

E_test(Θ) = (1/n') * ∑_{i=n+1..n+n'} L(h(x(i); Θ), y(i))

A small training error is not enough; a small test error is our goal.
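
A sketch of the training error as an average loss; h, loss, X, and y are placeholders for whatever hypothesis, loss function, and data are in use:

def training_error(h, theta, X, y, loss):
    """Average loss of h(.; theta) over the n training examples."""
    n = len(X)
    return sum(loss(h(X[i], theta), y[i]) for i in range(n)) / n
# Test error is the same average taken over n' held-out examples
# that were not used to choose theta.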
Learning algorithm

data → algorithm → h

The algorithm is specified with respect to the hypothesis class H.
Ways to come up with an algorithm:
- Be a clever person
- Use optimization
- Be dumb
H is the set of linear classifiers:
h(x; Θ, Θ0) = sign(Θᵀx + Θ0) = +1 if Θᵀx + Θ0 > 0, −1 otherwise

x ∈ Rd (d×1)
Θ ∈ Rd (d×1)
Θ0 ∈ R (1×1)

Decision boundary (a hyperplane): Θ0 + Θ1x1 + Θ2x2 + … + Θdxd = 0
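
A minimal NumPy sketch of this linear classifier; the function and variable names are mine, and the example vectors are invented:

import numpy as np

def linear_classifier(x, theta, theta_0):
    """Return +1 if theta^T x + theta_0 > 0, else -1."""
    return +1 if theta @ x + theta_0 > 0 else -1

x = np.array([1.0, 2.0])        # x in R^2
theta = np.array([0.5, -1.0])   # theta in R^2
theta_0 = 0.25
print(linear_classifier(x, theta, theta_0))   # -1, since 0.5 - 2.0 + 0.25 < 0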
Be dumb: random linear classifier

Pseudocode:

def random_linear_classifier(D, k, d):
    for j = 1 to k:
        randomly sample (Θ(j), Θ0(j)) from (Rd, R)
    j* = argmin_{j in 1..k} E_train(Θ(j), Θ0(j))
    return (Θ(j*), Θ0(j*))

k is a hyper-parameter.
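
A runnable NumPy sketch of the random linear classifier above; the vectorized data layout and the use of 0-1 training error are my assumptions about the unstated details:

import numpy as np

def random_linear_classifier(X, y, k):
    """X: (n, d) data, y: (n,) labels in {-1, +1}, k: number of random tries."""
    n, d = X.shape
    best = None
    for _ in range(k):
        theta = np.random.randn(d)              # sample theta from R^d
        theta_0 = np.random.randn()             # sample theta_0 from R
        preds = np.where(X @ theta + theta_0 > 0, 1, -1)
        err = np.mean(preds != y)               # 0-1 training error
        if best is None or err < best[0]:
            best = (err, theta, theta_0)
    return best[1], best[2]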
Be a clever human: perceptron (Rosenblatt, 1962)

Pseudocode:

def perceptron(D, k):
    Θ = [0 0 … 0]ᵀ
    Θ0 = 0
    for t = 1 to k:
        for i = 1 to n:
            if y(i)(Θᵀx(i) + Θ0) ≤ 0:
                Θ = Θ + y(i)x(i)
                Θ0 = Θ0 + y(i)
    return (Θ, Θ0)
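
A runnable NumPy sketch of the same perceptron update; the vectorized data layout is my assumption:

import numpy as np

def perceptron(X, y, k):
    """X: (n, d) data, y: (n,) labels in {-1, +1}, k: number of passes."""
    n, d = X.shape
    theta = np.zeros(d)
    theta_0 = 0.0
    for _ in range(k):
        for i in range(n):
            if y[i] * (theta @ X[i] + theta_0) <= 0:   # mistake on example i
                theta = theta + y[i] * X[i]
                theta_0 = theta_0 + y[i]
    return theta, theta_0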
