
ET4248E Chap6 Nonlinear Model and SVM

The document discusses nonlinear models and support vector machines. It introduces nonlinear models and how to perform linear classification and regression with nonlinear features by moving to a new feature space. It then discusses how using many features can become computationally expensive. Finally, it introduces support vector machines as an alternative classification method.

Chapter 6

Nonlinear Models and Support Vector Machine (SVM)
6.1. Nonlinear Models

6.1.1. From linear to nonlinear


• Hypothesis representation
- Linear regression
$h_w(x) = w^T x$
- Logistic regression
$h_w(x) = \sigma(w^T x)$
- In fact, life is often nonlinear
→ how do we learn nonlinear models?
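The two hypotheses above can be sketched in a few lines of NumPy (an illustrative sketch; the weight and input values are made up for the example):

```python
import numpy as np

def h_linear(w, x):
    # Linear regression hypothesis: h_w(x) = w^T x
    return w @ x

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def h_logistic(w, x):
    # Logistic regression hypothesis: h_w(x) = sigma(w^T x)
    return sigmoid(w @ x)

w = np.array([0.5, -1.0])   # example weights (assumed)
x = np.array([2.0, 1.0])    # example input (assumed)
print(h_linear(w, x))       # 0.5*2 - 1.0*1 = 0.0
print(h_logistic(w, x))     # sigma(0) = 0.5
```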

• Classification

6.1.2. Using explicit nonlinear features

• Linear classification with nonlinear features
- Idea: moving to a new space where the examples may be linearly separable

$\phi: \mathbb{R}^{n_x} \to \mathbb{R}^{n'_x}, \quad \phi: x \mapsto x'$

$h_w(x') = \sigma(w^T x')$
• Example

$x \in \mathbb{R}^2 \to x' \in \mathbb{R}^5$

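A degree-2 map is one way to send a 2-dimensional input to five features. A minimal sketch (the exact map shown on the slide is not reproduced here, so this particular choice of monomials is an assumption):

```python
import numpy as np

def phi(x):
    # One plausible degree-2 feature map R^2 -> R^5 (assumed for
    # illustration): [x1, x2, x1^2, x2^2, x1*x2]
    x1, x2 = x
    return np.array([x1, x2, x1**2, x2**2, x1 * x2])

x = np.array([2.0, 3.0])
print(phi(x))  # [2. 3. 4. 9. 6.]
```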
• Linear regression with nonlinear features
- Idea: moving to a new space where the examples may be well fit by a linear model

$\phi: \mathbb{R}^{n_x} \to \mathbb{R}^{n'_x}, \quad \phi: x \mapsto x'$

$h_w(x') = w^T x'$

• Example

$x \in \mathbb{R}^2 \to x' \in \mathbb{R}^5$

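With such a map, ordinary least squares in the new feature space fits a nonlinear function of the original inputs. A minimal sketch, with synthetic data and a degree-2 feature map assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
# Synthetic nonlinear target (assumed for the example)
y = 1.0 + 2.0 * X[:, 0]**2 - 3.0 * X[:, 0] * X[:, 1]

def phi(x):
    # Degree-2 feature map with a bias term (an assumed choice)
    x1, x2 = x
    return np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])

Phi = np.array([phi(x) for x in X])          # map every example to the new space
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # ordinary least squares in feature space
pred = Phi @ w
print(np.max(np.abs(pred - y)) < 1e-6)       # the target lies in the feature span: True
```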
- So great, it seems we are done with this lecture, right?
We know how to perform linear and nonlinear regression/classification
- But what happens if we have many features?
- More generally, for a choice of degree $d$ and an input vector with $n_x$ features, the polynomial transformation produces $\binom{n_x + d}{d} = O(n_x^d)$ features

→ a lot of features to compute for every single example

How many features would that be for this retina image?
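This count is easy to compute. A short sketch (the retina image's dimensions are not given on the slide, so the 100-feature input below is an assumption for illustration):

```python
from math import comb

def num_poly_features(n_x, d):
    # Number of monomials of degree <= d in n_x variables,
    # including the constant term: C(n_x + d, d)
    return comb(n_x + d, d)

print(num_poly_features(2, 2))    # 6: [1, x1, x2, x1^2, x2^2, x1*x2]
print(num_poly_features(100, 5))  # 96560646 -- already ~10^8 for a tiny input
```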

• Using another basis function
- The previous examples used polynomial features, i.e. a polynomial basis function
- We could also use an alternative basis function such as the radial basis function (RBF)

$\phi_j(x) = \exp\left( -\frac{\lVert x - \mu_j \rVert^2}{2\sigma^2} \right)$

where $\sigma$ is the bandwidth, $\mu_j$ is the center of the $j$-th RBF, and $j \in [1, n'_x]$
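A sketch of an RBF feature map of this form (the centers, bandwidth, and test point are made up for the example):

```python
import numpy as np

def rbf_features(x, centers, sigma):
    # phi_j(x) = exp(-||x - mu_j||^2 / (2 sigma^2)), one feature per center
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2 * sigma**2))

centers = np.array([[0.0, 0.0], [1.0, 1.0]])  # assumed centers mu_1, mu_2
x = np.array([0.0, 0.0])
print(rbf_features(x, centers, sigma=1.0))    # [1.0, exp(-1)] ~ [1.0, 0.3679]
```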

- Complexity:
If we make a uniform grid over the input space with $d$ centers along each dimension, then the number of RBF features is $n'_x = d^{n_x}$, which grows exponentially with the input dimension
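This exponential growth is easy to see numerically ($d = 5$ is an assumed value for illustration):

```python
# With d centers per dimension on a uniform grid over n_x input
# dimensions, the number of RBF features is d ** n_x:
d = 5
for n_x in (2, 10, 100):
    print(n_x, d ** n_x)  # 25, then 9765625, then an astronomically large count
```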

6.2. Support Vector Machine (SVM)

6.2.1. Introduction
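The introductory slides here are figures and are not reproduced in the text. As a rough illustration of SVM-style classification, the sketch below trains a linear classifier by sub-gradient descent on the regularized hinge loss; the synthetic data and hyperparameters are assumptions for the example, not taken from the lecture:

```python
import numpy as np

# Minimal linear SVM sketch: minimize
#   (lambda/2) ||w||^2 + (1/m) sum_i max(0, 1 - y_i w^T x_i)
# by sub-gradient descent (the lecture's exact formulation may differ).
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[2, 2], size=(20, 2))    # assumed positive cluster
X_neg = rng.normal(loc=[-2, -2], size=(20, 2))  # assumed negative cluster
X = np.vstack([X_pos, X_neg])
X = np.hstack([X, np.ones((40, 1))])            # append a bias feature
y = np.array([1] * 20 + [-1] * 20)

w = np.zeros(3)
lam, lr = 0.01, 0.1                              # assumed hyperparameters
for _ in range(200):
    margins = y * (X @ w)
    active = margins < 1                         # examples violating the margin
    grad = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(y)
    w -= lr * grad

acc = np.mean(np.sign(X @ w) == y)
print(acc)
```

Only the margin-violating examples contribute to the gradient, which previews the idea that the decision boundary is determined by a subset of the data (the support vectors).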

• Example

