
Artificial Neural Network

Submitted By:
Ankur Panchani

Internal Guide:
Ms. Meenakshi

INTRODUCTION
What are artificial neural networks?

An extremely simplified model of the human brain.

Essentially a function approximator: it transforms inputs into outputs to the best of its ability.

Composed of many neurons that cooperate to perform the desired function.

Why Use Neural Networks?

Adaptive learning: the ability to learn from data.

Self-organization: a network can create its own organization or representation of the information it receives.

Real-time operation.

Fault tolerance via redundant information coding: capabilities may be retained even with major network damage.

Biological Neural Network

The basic computational unit in the nervous system is the nerve cell, or neuron.

A neuron has:
 Dendrites (inputs)
 Cell body
 Axon (output)

The Brain as an Information Processing System
 The human brain contains about 10 billion nerve cells, or neurons.
 On average, each neuron is connected to other neurons through about 10,000 synapses.

Artificial Neural Network
A Simple Artificial Neuron

The basic computational element is often called a node or unit.

The unit computes some function f of the weighted sum of its inputs.

The weighted sum is called the net input to unit i, often written neti.

The function f is the unit's activation function. In the simplest case, f is the identity function and the unit's output is just its net input. This is called a linear unit.
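
The slide's equation image is not in the extracted text; a standard way to write this, assuming w_ij denotes the weight on the connection from input j to unit i and x_j the corresponding input value, is

  net_i = \sum_j w_{ij} x_j ,        y_i = f(net_i)
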
The Perceptron

The perceptron was the result of a merger between two concepts: the McCulloch-Pitts model of an artificial neuron and the Hebbian learning rule for adjusting weights.

It adds an extra input that represents the bias.

W1, W2, …, Wm are weight values normalized in the range of either (0, 1) or (-1, 1),
Sum is the weighted sum,
T is a threshold constant, and
the function f is a linear step function.
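
The threshold rule itself is not in the extracted text; with these definitions, the usual perceptron form is

  output = 1 if Sum = \sum_{i=1}^{m} W_i x_i >= T, and 0 otherwise.
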
Artificial Neuron with Continuous Characteristics

The general form of an artificial neuron can be described in two stages (written out in formula form below):

First:
 The linear combination of the inputs is calculated.
 The summation function often takes an extra input value Theta to represent the threshold or bias of the neuron.

Second:
 The sum-of-product value is then passed into an activation function, which generates the output of the neuron.
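
The slide's formula image is missing; written out, with the bias Theta folded into the sum, the two stages are

  a = \sum_{i=1}^{m} w_i x_i + \theta        (linear combination of the inputs)
  y = f(a)                                   (activation function applied to the sum)
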
Sigmoid Function
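
The plot and formula for this slide are not in the extracted text; the logistic sigmoid referred to here is the standard

  f(a) = \frac{1}{1 + e^{-a}}

which squashes any real-valued input into the range (0, 1).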

Single-Layer Network

The shaded nodes on the left are in the so-called input layer. The input-layer neurons only pass and distribute the inputs and perform no computation.

Each of the inputs x1, x2, …, xN is connected to every artificial neuron in the output layer through a connection weight. Since every output y1, y2, …, yN is calculated from the same set of input values, each output varies only because of its connection weights.
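
Reusing the neuron model from the previous slides, each output can be sketched as

  y_j = f( \sum_{i=1}^{N} w_{ji} x_i )        (w_{ji}: weight from input i to output unit j)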

Multi-Layer Network

The input nodes pass the information to the units in the first hidden layer, then the outputs from the first hidden layer are passed to the next layer, and so on.

Backpropagation Network

Backpropagation networks are feedforward networks with distinct input, output, and hidden layers.

They may have any number of hidden layers, and any number of hidden units in any given hidden layer.

Note that units within the same layer are not interconnected.

The activation value ak of unit k and the output yk of unit k are computed as shown below.
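
The two equation images are missing from the extracted slide; in the standard formulation, with wjk the weight from unit j to unit k, Theta_k the bias of unit k, and the sigmoid introduced earlier as the activation function,

  a_k = \sum_j w_{jk} y_j + \theta_k
  y_k = f(a_k) = \frac{1}{1 + e^{-a_k}}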

Errors

Once activation is fed forward all the way to the output units, the network's response is compared to the desired output ydi that accompanies the training pattern. There are two types of error.

The first is the error at the output layer. This can be computed directly, as shown below.
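
The formula image is missing; a common textbook form, with ydk the desired output and yk the actual output of output unit k (the sigmoid derivative yk(1 - yk) folded in), is

  \delta_k = (yd_k - y_k) \, y_k (1 - y_k)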

The second type of error is the error at the hidden layers. This cannot be computed directly, since there is no available information on the desired outputs of the hidden layers.

The error at the output layer is used to compute the error at the hidden layer immediately preceding the output layer.

Once this is computed, it is used in turn to compute the error of the next hidden layer, the one immediately preceding the last hidden layer.

This is done sequentially until the error at the very first hidden layer is computed.

The error eh at a hidden layer is computed as shown below.
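
The formula image is missing; the standard sigmoid-backpropagation form, with whk the weight from hidden unit h to unit k in the following layer and delta_k the error of that unit, is

  e_h = y_h (1 - y_h) \sum_k \delta_k \, w_{hk}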

Functions to Calculate the Error and Weights

The three formulas (sketched below) are:
 Error at layer k
 Derivative of the sigmoid function
 Weight update
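
The original formula images did not survive extraction; an assumed reconstruction, consistent with the sigmoid formulation above (eta is the learning rate), is

  \delta_k = (yd_k - y_k) \, f'(a_k)            (error at output layer k)
  f'(a_k) = y_k (1 - y_k)                       (derivative of the sigmoid)
  \Delta w_{jk} = \eta \, \delta_k \, y_j       (update of the weight from unit j to unit k)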

Training Process

One of the most important aspects of a neural network is the learning process.

Learning can be done through:

Supervised training
 Backpropagation algorithm (a minimal code sketch follows this list)
 Function modeling and prediction
 Data classification
 Pattern recognition

Unsupervised training
 Clustering
 Dimensionality reduction
 Finding patterns in unstructured data
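
To make the supervised case concrete, here is a minimal, illustrative NumPy sketch of training a one-hidden-layer sigmoid network with backpropagation. The layer sizes, learning rate, and XOR-style toy data are assumptions for illustration, not part of the original slides.

import numpy as np

# Toy training patterns (XOR): 4 examples, 2 inputs, 1 desired output each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Yd = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)                          # hidden biases (theta)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)                          # output bias
eta = 0.5                                 # learning rate

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for epoch in range(10000):
    # Forward pass: a_k = sum_j w_jk * y_j + theta_k, y_k = f(a_k)
    h = sigmoid(X @ W1 + b1)              # hidden-layer outputs
    y = sigmoid(h @ W2 + b2)              # output-layer outputs

    # Error at the output layer: delta_k = (yd_k - y_k) * y_k * (1 - y_k)
    delta_out = (Yd - y) * y * (1 - y)
    # Error at the hidden layer: e_h = y_h * (1 - y_h) * sum_k delta_k * w_hk
    delta_hid = h * (1 - h) * (delta_out @ W2.T)

    # Weight updates: delta_w = eta * delta * (output feeding that weight)
    W2 += eta * (h.T @ delta_out)
    b2 += eta * delta_out.sum(axis=0)
    W1 += eta * (X.T @ delta_hid)
    b1 += eta * delta_hid.sum(axis=0)

print(np.round(y, 2))  # typically approaches [[0], [1], [1], [0]]
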

Applications

Clustering
 Explores the similarity between patterns.

Classification / pattern recognition

Function approximation
 Finds an estimate of an unknown function f() subject to noise.

Prediction / dynamic systems
 Forecasts future values of time-sequenced data.

Pros and Cons

Pros
Perform tasks that a linear program cannot.
Can keep operating, thanks to their parallel nature, even if parts of the network fail.
Learn, and so do not need to be reprogrammed.
Can be implemented in any application.

Cons
Need training to operate.
Their architecture is different from that of microprocessors and therefore needs to be emulated.
Require high processing time for large neural networks.

Future of ANN

Neural networks will facilitate user-specific systems for education, information processing, and entertainment.

Neural networks, integrated with other artificial intelligence technologies, methods for direct culture of nervous tissue, and other exotic technologies such as genetic engineering, will allow us to develop radical and exotic life-forms, whether man, machine, or hybrid.

Neural networks will allow us to explore new realms of human capability.

Conclusion

Artificial neural networks help to integrate technological aspects with human aspects. A network consists of a number of nodes, more technically known as neurons.

Each of these neurons is its own processing unit, which gives the network its distributed intelligence. Networks also have the characteristic of reducing noise, which helps in creating an efficient interface for the devices that use them.

The prediction ability of a neural network is also a very important capability: forecasting future situations from past and present data.

It is useful to note that neural network approaches can tolerate measurement errors.

In conclusion, artificial neural network approaches can play an important role in the design and development of intelligent networks.

Thank You.

