Module 1 Basic Problems

The document discusses layers of neural networks and neurons, including inputs, processing, and outputs of artificial neurons. It also discusses challenges that can be addressed using an activation function like ReLU, which is used to introduce non-linearity in neural networks.


MODULE-1

Basic problems
LAYERS OF NN

Example task: classify an input vehicle as an emergency or non-emergency vehicle.

 A layer consists of small individual units called neurons.
 An artificial neuron is similar to a biological neuron: it receives inputs from other neurons, performs some processing, and produces an output.

Here, X1 and X2 are the inputs to the artificial neuron, f(X) represents the processing done on the inputs, and Y represents the output of the neuron.

Y = f(y_in), where y_in is the net input to the neuron and f is the activation function.
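The neuron described above can be sketched in a few lines of code. This is a minimal illustration, not the document's implementation; the weights, bias, and step activation are assumed values chosen for the example.

```python
# A minimal artificial neuron: weighted sum of inputs, then an activation f.
# The weights w1, w2 and bias b here are illustrative placeholders.

def neuron(x1, x2, w1, w2, b, f):
    """Compute Y = f(y_in), where y_in is the net input to the neuron."""
    y_in = w1 * x1 + w2 * x2 + b  # net input: weighted sum of the inputs
    return f(y_in)                # output after the activation function

# Example activation: a step function that fires (1) when y_in >= 0.
step = lambda y_in: 1 if y_in >= 0 else 0

y = neuron(1.0, 2.0, w1=0.5, w2=0.25, b=-0.5, f=step)
print(y)  # y_in = 0.5 + 0.5 - 0.5 = 0.5, so the neuron fires: 1
```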
FIRING OF A NEURON

X1    X2    THRESHOLD (T)    OUTPUT: F (FIRING) / NF (NOT FIRING)
30     0    100              NF
40    40    100              NF
20    30     10              F
40    10     70              F
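The firing rule behind the table can be sketched as follows. This is a simplified assumption: both inputs are treated as excitatory with unit weights, so the neuron fires when the plain sum of the inputs reaches the threshold.

```python
# Threshold firing rule, assuming unit (excitatory) weights on both inputs:
# the neuron fires (F) when x1 + x2 >= T, otherwise it does not fire (NF).

def fires(x1, x2, threshold):
    return "F" if x1 + x2 >= threshold else "NF"

print(fires(30, 0, 100))   # 30 < 100  -> NF
print(fires(40, 40, 100))  # 80 < 100  -> NF
print(fires(20, 30, 10))   # 50 >= 10  -> F
```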
PROBLEM-1
Suppose we first want to estimate a person's age from their height, weight, and cholesterol level, and then classify the person as old or not, based on whether the estimated age is greater than 60.
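The two-step setup in Problem 1 can be sketched as a regression followed by a threshold. The linear coefficients below are made-up placeholders, not fitted values; the sketch only shows the estimate-then-classify structure.

```python
# Problem 1 sketch: estimate age from three features, then threshold at 60.
# The coefficients are hypothetical placeholders, not fitted to real data.

def estimate_age(height_cm, weight_kg, cholesterol):
    """Toy linear estimate of age from the three input features."""
    return 0.1 * height_cm + 0.3 * weight_kg + 0.1 * cholesterol

def is_old(height_cm, weight_kg, cholesterol):
    """Classify as old when the estimated age exceeds 60."""
    return estimate_age(height_cm, weight_kg, cholesterol) > 60

print(is_old(170, 80, 220))  # estimated age = 17 + 24 + 22 = 63 > 60 -> True
```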
CHALLENGES – NEED FOR AN ACTIVATION FUNCTION

 The rectified linear unit (ReLU), f(x) = max(0, x), is a widely used activation function that introduces non-linearity into the network.
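ReLU itself is a one-liner; the sketch below shows how it passes positive inputs through unchanged and clamps negative inputs to zero.

```python
# ReLU: f(x) = max(0, x). Positive inputs pass through; negatives become 0.

def relu(x):
    return max(0.0, x)

print([relu(x) for x in (-2.0, -0.5, 0.0, 3.0)])  # [0.0, 0.0, 0.0, 3.0]
```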
