Machine Learning 05
There is no single line (hyperplane) that separates class A from class B (this is the classic XOR problem). In contrast, the AND and OR operations are linearly separable problems.
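To make this concrete, here is a minimal sketch in Python (the training loop, learning rate, and epoch count are illustrative choices, not from the slides): the perceptron rule reaches zero errors on the linearly separable AND and OR problems, but never on XOR.

```python
import numpy as np

def train_perceptron(X, t, epochs=100, lr=0.1):
    """Perceptron rule on inputs X (with a bias column) and 0/1 targets t."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, t):
            y = 1 if w @ x > 0 else 0      # step activation f(.)
            w += lr * (target - y) * x     # update only on mistakes
    errors = sum((1 if w @ x > 0 else 0) != target for x, target in zip(X, t))
    return w, errors

X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # third column: bias input
for name, t in [("AND", [0, 0, 0, 1]), ("OR", [0, 1, 1, 1]), ("XOR", [0, 1, 1, 0])]:
    w, errors = train_perceptron(X, np.array(t))
    print(f"{name}: misclassified = {errors}")  # AND, OR -> 0; XOR stays > 0
```

Because XOR admits no separating line, the weight vector keeps cycling and at least one sample remains misclassified no matter how many epochs are run.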
The Two-Layer Perceptron
Then class B is located outside the shaded area and class A inside. This is a two-phase design.
• Phase 1: Draw two lines (hyperplanes)
$g_1(x) = g_2(x) = 0$
Each of them is realized by a perceptron. The outputs of the perceptrons will be
$y_i = f(g_i(x)) = \begin{cases} 0 \\ 1 \end{cases}, \quad i = 1, 2$
depending on the position of $x$.
• Phase 2: Find the position of $x$ with respect to both lines, based on the values of $y_1, y_2$.
[Table: for each input $(x_1, x_2)$, the first-phase outputs $(y_1, y_2)$ and the resulting second-phase class.]
The computations of the first phase perform a mapping that transforms the nonlinearly separable problem into a linearly separable one.
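A minimal sketch of the two-phase design on XOR, assuming the classic choice of lines $g_1(x) = x_1 + x_2 - 1/2$ and $g_2(x) = x_1 + x_2 - 3/2$ and the second-phase line $y_1 - y_2 - 1/2 = 0$; the slides' figure may use different hyperplanes.

```python
import numpy as np

def f(g):
    """Step activation: 1 on the positive side of the hyperplane, 0 otherwise."""
    return (g > 0).astype(int)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Phase 1: each hidden perceptron computes y_i = f(g_i(x)).
y1 = f(X @ np.array([1, 1]) - 0.5)   # g1(x) = x1 + x2 - 1/2
y2 = f(X @ np.array([1, 1]) - 1.5)   # g2(x) = x1 + x2 - 3/2
Y = np.column_stack([y1, y2])        # images: vertices of the unit square

# Phase 2: in y-space a single line suffices, e.g. g(y) = y1 - y2 - 1/2.
out = f(Y @ np.array([1, -1]) - 0.5)

for x, y, c in zip(X, Y, out):
    print(x, "->", y, "->", "class A" if c else "class B")
```

Note that the inputs $(0,1)$ and $(1,0)$ collapse onto the same vertex $(1,0)$ of the unit square; this collapse is exactly what makes the transformed problem linearly separable.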
The architecture
• This is known as the two-layer perceptron, with one hidden and one output layer. The activation functions are step functions,
$f(\cdot) \in \{0, 1\}$
• The first layer performs a mapping
$x \in \mathbb{R}^l \rightarrow y = [y_1, \dots, y_p]^T, \quad y_i \in \{0, 1\}, \; i = 1, 2, \dots, p$
that is, it maps each vector onto the vertices of the unit-side hypercube $H_p$.
The intersections of these hyperplanes (one realized by each hidden-layer neuron) form regions in the $l$-dimensional space. Each such region corresponds to a vertex of the unit hypercube $H_p$.
For example, the 001 vertex corresponds to the region located on the (−) side of $g_1(x) = 0$, on the (−) side of $g_2(x) = 0$, and on the (+) side of $g_3(x) = 0$.
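A small sketch of this region-to-vertex correspondence for $p = 3$ hyperplanes in the plane ($l = 2$); the three lines below are invented for illustration and are not taken from the slides.

```python
import numpy as np

# Each row (w1, w2, w0) defines g_i(x) = w1*x1 + w2*x2 + w0.
G = np.array([[1.0, 0.0, -1.0],   # g1(x) = x1 - 1
              [0.0, 1.0, -1.0],   # g2(x) = x2 - 1
              [1.0, 1.0, -1.0]])  # g3(x) = x1 + x2 - 1

def vertex(x):
    """Map x in R^2 onto a vertex of H_3: y_i = 1 on the (+) side of g_i."""
    x_aug = np.append(x, 1.0)     # append 1 so that w0 acts as the bias
    return tuple((G @ x_aug > 0).astype(int))

print(vertex(np.array([0.7, 0.7])))  # (0, 0, 1): the "001" region of the text
print(vertex(np.array([2.0, 0.5])))  # (1, 0, 1)
print(vertex(np.array([0.1, 0.1])))  # (0, 0, 0)
```

Three lines in general position split the plane into only seven regions, while $H_3$ has eight vertices, so one vertex (here 110, which would require $x_1 > 1$, $x_2 > 1$, and $x_1 + x_2 < 1$) has no region mapping to it.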
The output neuron realizes a hyperplane in the transformed $y$-space that separates some of the vertices from the others. Thus, the two-layer perceptron has the capability to classify vectors into classes that consist of unions of polyhedral regions, but NOT ANY union: it depends on the relative position of the corresponding vertices.
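A sketch of this limitation on the unit square $H_2$: a random search over hyperplanes (only an illustration, not a proof; the function name and trial count are arbitrary) finds a separator for the vertex $(0,0)$ versus the rest, but not for the diagonal split, which is the XOR configuration reappearing in $y$-space.

```python
import numpy as np

V = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # vertices of H_2

def separable(labels, trials=20000, rng=np.random.default_rng(0)):
    """True if some random hyperplane w.y + w0 = 0 reproduces the labeling."""
    for _ in range(trials):
        w, w0 = rng.normal(size=2), rng.normal()
        if np.array_equal((V @ w + w0 > 0).astype(int), labels):
            return True
    return False

print(separable(np.array([0, 1, 1, 1])))  # True: vertex (0,0) vs. the rest
print(separable(np.array([0, 1, 1, 0])))  # False: diagonal pairs cannot be split
```

The impossibility of the diagonal split can also be checked by hand: $w_2 + w_0 > 0$ and $w_1 + w_0 > 0$ sum to $w_1 + w_2 + 2w_0 > 0$, while $w_0 \le 0$ and $w_1 + w_2 + w_0 \le 0$ sum to $w_1 + w_2 + 2w_0 \le 0$, a contradiction. This is precisely the case that motivates the three-layer perceptron below.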
The Three-Layer Perceptron
The architecture
Overall: