6 Multi-Layer Perceptron

The document discusses the concept of Neural Networks, specifically focusing on Multi-Layer Perceptrons (MLPs) and their relation to Logistic Regression. It explains the structure and functionality of perceptrons, including their ability to perform logical operations and how they can be combined into multilayer networks for complex learning tasks. Additionally, it highlights the versatility of MLPs in modeling various types of functions, making them powerful tools in artificial intelligence.


Neural Network: Multi-Layer Perceptron
Let’s Extend Logistic Regression
• Logistic Regression: binary classification, linear decision boundary
• Extension: multiple classes, any type of decision boundary
Logistic Regression
• Linear boundary f(x)

    Label(x) = 1 (Red)  if h(x) ≥ 0.5
               0 (Blue) if h(x) < 0.5

    where h(x) = 1 / (1 + exp(−f(x)))
Logistic Regression
• Two steps for evaluation
  • Linear combination of inputs: s = wᵀx = Σ_{i=0}^{d} w_i x_i
  • Nonlinear transform of s: y = 1 / (1 + exp(−s)) = f(s)
• Graphical representation
  (diagram: inputs x1, x2, …, xd with weights w1, w2, …, wd, plus a constant input 1 with bias weight w0, feed a summation Σ followed by f to produce y)
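The two evaluation steps above can be sketched in Python. This is a minimal illustration; the function name and NumPy usage are my own choices, not from the slides:

```python
import numpy as np

def logistic_regression(x, w):
    """Two-step evaluation from the slide.
    x: inputs [x1, ..., xd]; w: weights [w0, w1, ..., wd] (w0 is the bias)."""
    x_aug = np.concatenate(([1.0], x))  # constant input x0 = 1 pairs with w0
    s = w @ x_aug                       # step 1: linear combination s = w^T x
    return 1.0 / (1.0 + np.exp(-s))     # step 2: sigmoid transform y = f(s)
```

When s = 0 the output is exactly 0.5, which is the labeling threshold on the previous slide.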
Logistic Regression
• Another name: Perceptron
  • It may be called an artificial neuron
  • It mimics the function of a biological neuron
  (same diagram: weighted inputs x1 … xd and bias w0 into a summation Σ, then f, producing y)
• Logistic Regression uses just ONE perceptron
• What if we combine many perceptrons?
Perceptron and Neuron
• A perceptron is a mathematical model of a biological neuron.
• In actual neurons, the dendrites receive electrical signals from the axons of other neurons.
• In the perceptron, these electrical signals are represented as numerical values.

https://wp.nyu.edu/shanghai-ima-documentation/electives/aiarts/ece265/the-neural-network-nn-and-the-biological-neural-network-bnn-erdembileg-chin-erdene/
Perceptron
• First function: weighted summation of inputs
    s = x0·w0 + x1·w1 + … + xd·wd
• Second function: non-linear function (hard limit: f jumps from 0 to 1 at s = 0)

    y = f(s) = 1 if s = Σ_{i=0}^{d} x_i w_i ≥ 0
               0 otherwise
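A minimal sketch of the two functions above, assuming a plain-Python representation of inputs and weights (the function name is illustrative):

```python
def perceptron(x, w):
    """Hard-limit perceptron: weighted sum of inputs, then a step function.
    x: inputs [x1, ..., xd]; w: weights [w0, w1, ..., wd] (w0 is the bias)."""
    s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))  # s = sum_i x_i w_i with x0 = 1
    return 1 if s >= 0 else 0                            # hard limit: 1 iff s >= 0
```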
Perceptron
• What a perceptron does

    y = 1 if x1·w1 + x2·w2 + w0 ≥ 0
        0 otherwise

  • The line x1·w1 + x2·w2 + w0 = 0 divides the (x1, x2) plane
  • If an input is above the line, output 1; else, output 0
Perceptron
• What a perceptron can do
  • AND operation: w1 = 1.0, w2 = 1.0, w0 = −1.5

    x1  x2   s      y
    0   0    −1.5   0
    0   1    −0.5   0
    1   0    −0.5   0
    1   1     0.5   1
Perceptron
• What a perceptron can do – con’d
  • OR operation: w1 = 1.0, w2 = 1.0, w0 = −0.5

    x1  x2   s      y
    0   0    −0.5   0
    0   1     0.5   1
    1   0     0.5   1
    1   1     1.5   1
Perceptron
• What a perceptron can do – con’d
  • NOT operation: w1 = −1.0, w0 = 0.5

    x1   s      y
    0     0.5   1
    1    −0.5   0
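The three gates can be checked with a short sketch. A hard-limit perceptron is defined here so the snippet is self-contained; the gate weights are exactly those on the slides:

```python
def perceptron(x, w):
    """Hard-limit perceptron with bias weight w[0] on a constant input 1."""
    s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if s >= 0 else 0

# Gate weights from the slides, listed as [w0, w1, (w2)]
AND = lambda x1, x2: perceptron([x1, x2], [-1.5, 1.0, 1.0])
OR  = lambda x1, x2: perceptron([x1, x2], [-0.5, 1.0, 1.0])
NOT = lambda x1:     perceptron([x1],     [0.5, -1.0])
```

Iterating over all binary inputs reproduces the truth tables on the slides.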
Multilayer Perceptron
• Let’s cascade many perceptrons
  • (A network of perceptrons) vs. (a network of neurons, i.e., a brain)
  • Layered structure: for simplicity of learning
  (diagram: inputs x1 … xm feed a hidden layer through weights wij; the hidden layer feeds the output layer y1 … yn through weights wjk; each layer also receives a constant input 1)
Multilayer Perceptron
• Graphical representation is preferred
  (diagram: inputs x1, x2 feed two hidden units h1, h2, which feed the output y)

    h1 = 1 / (1 + exp(−(w11·x1 + w12·x2 + w10)))
    h2 = 1 / (1 + exp(−(w21·x1 + w22·x2 + w20)))
    y  = 1 / (1 + exp(−(w31·h1 + w32·h2 + w30)))

• Mathematical form of the output (substituting h1 and h2):

    y = 1 / (1 + exp(−(w31 · 1/(1 + exp(−(w11·x1 + w12·x2 + w10)))
                     + w32 · 1/(1 + exp(−(w21·x1 + w22·x2 + w20)))
                     + w30)))
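The 2-2-1 network above can be evaluated layer by layer, which is much easier to follow than the nested formula. A sketch, with weight names following the slide but example weight values that are arbitrary (not from the slides):

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def mlp_forward(x1, x2, w):
    """Forward pass of the 2-2-1 MLP; w maps names like '11' to w11."""
    h1 = sigmoid(w['11'] * x1 + w['12'] * x2 + w['10'])    # hidden unit 1
    h2 = sigmoid(w['21'] * x1 + w['22'] * x2 + w['20'])    # hidden unit 2
    return sigmoid(w['31'] * h1 + w['32'] * h2 + w['30'])  # output unit

# Arbitrary illustrative weights
w = {'11': 0.5, '12': -0.3, '10': 0.1,
     '21': 0.2, '22': 0.4, '20': -0.2,
     '31': 1.0, '32': -1.0, '30': 0.5}
```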
Multilayer Perceptron
• Structure of Multilayer Perceptron – con’d
  • Input layer
    • Simply passes the input values to the next layer
    • # of nodes = # of inputs
  • Hidden layer
    • There can be several hidden layers
    • # of nodes must be chosen (a design parameter)
  • Output layer
    • # of nodes = # of outputs
  (diagram: an ANN for y = f(x1, x2), with inputs x1, x2 and output y)
Multilayer Perceptron
• Artificial Neural Network (ANN)
  • AI tools inspired by biological brains
  • It can learn anything!
• The multilayer perceptron is one type of Artificial Neural Network; others include:
  • Kohonen’s Self-Organizing Neural Networks
  • ..
• Other names for the Multilayer Perceptron
  • Feed-forward Neural Network
  • Multilayer Feed-forward Neural Network
Multilayer Perceptron
• What a multilayer perceptron can do
  • Anything digital computers can do
    • Boolean function: 2-layer perceptron
    • Continuous function: 2-layer perceptron
    • Arbitrary function: 3-layer perceptron
  • We can build a multi-layer perceptron which satisfies given input-output pairs
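As one concrete illustration of the Boolean-function claim, a 2-layer perceptron can realize XOR, which no single perceptron can (XOR is not linearly separable). The weights below are hand-picked for this sketch, not taken from the slides; steep sigmoids make the hidden units behave almost like hard limits:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def xor_mlp(x1, x2):
    """2-layer (one hidden layer) MLP approximating XOR."""
    h1 = sigmoid(10 * x1 + 10 * x2 - 5)    # roughly OR(x1, x2)
    h2 = sigmoid(10 * x1 + 10 * x2 - 15)   # roughly AND(x1, x2)
    return sigmoid(10 * h1 - 10 * h2 - 5)  # roughly "OR and not AND" = XOR
```

Thresholding the output at 0.5 reproduces the XOR truth table on all four binary inputs.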