Artificial Neural Network

Neural networks (NNs) have evolved from early models in the 1940s to sophisticated systems capable of learning from examples, inspired by the human brain's ability to solve complex problems. Unlike traditional computers that rely on explicit programming and deductive reasoning, NNs utilize inductive reasoning and can adapt through training, making them robust and fault-tolerant. Key components of NNs include layers of interconnected neurons that process inputs to produce outputs, with various training methods such as supervised and unsupervised learning to refine their performance.


Introduction to Neural Networks
• Development of neural networks dates back to the early 1940s. The field experienced an upsurge in popularity in the late 1980s as a result of the discovery of new techniques and developments, and of general advances in computer hardware technology.
• Some NNs are models of biological neural networks and some are not, but historically, much of the inspiration for the field of NNs came from the desire to produce artificial systems capable of sophisticated, perhaps "intelligent", computations similar to those that the human brain routinely performs, and thereby possibly to enhance our understanding of the human brain.
• Most NNs have some sort of "training" rule. In other words, NNs "learn" from examples (as children learn to recognize dogs from examples of dogs) and exhibit some capability for generalization beyond the training data.
Neural Network Techniques
• Computers have to be explicitly programmed
– Analyze the problem to be solved.
– Write the code in a programming language.
• Neural networks learn from examples
– No requirement of an explicit description of the problem.
– No need for a programmer.
– The neural computer adapts itself during a training period, based on examples of similar problems, even without a desired solution to each problem. After sufficient training the neural computer is able to relate the problem data to the solutions, inputs to outputs, and it is then able to offer a viable solution to a brand new problem.
– Able to generalize or to handle incomplete data.
NNs vs Computers
Digital Computers
• Deductive reasoning. We apply known rules to input data to produce output.
• Computation is centralized, synchronous, and serial.
• Memory is packetted, literally stored, and location addressable.
• Not fault tolerant. One transistor goes and it no longer works.
• Exact.
• Static connectivity.
• Applicable if well-defined rules exist with precise input data.

Neural Networks
• Inductive reasoning. Given input and output data (training examples), we construct the rules.
• Computation is collective, asynchronous, and parallel.
• Memory is distributed, internalized, and content addressable.
• Fault tolerant, with redundancy and sharing of responsibilities.
• Inexact.
• Dynamic connectivity.
• Applicable if rules are unknown or complicated, or if data is noisy or partial.
Evolution of Neural Networks
• Researchers realized that the brain could solve many problems much more easily than even the best computer:
– image recognition
– speech recognition
– pattern recognition
These tasks are very easy for the brain but very difficult for a computer.
Evolution of Neural Networks
• Studied the brain
– Each neuron in the brain has a relatively simple function
– But there are 10 billion of them (60 trillion connections)
– They act together to create an incredible processing unit
– The brain is trained by its environment
– Learns by experience
– Compensates for problems by massive parallelism
The Biological Inspiration

• The brain has been extensively studied by scientists.
• Vast complexity prevents all but rudimentary understanding.
• Even the behaviour of an individual neuron is extremely complex.
• Engineers modified the neural models to make them more useful:
– less like biology
– kept much of the terminology
The Structure of Neurons

[Figure: a biological neuron, with labelled synapse, axon, nucleus, cell body, and dendrites]

A neuron has a cell body, a branching input structure (the dendrite) and a
branching output structure (the axon)
• Axons connect to dendrites via synapses.
• Electro-chemical signals are propagated from the dendritic input,
through the cell body, and down the axon to other neurons
The Structure of Neurons

• A neuron only fires if its input signal exceeds a certain amount (the threshold) in a short time period.
• Synapses vary in strength:
– Good connections allow a large signal.
– Slight connections allow only a weak signal.
– Synapses are either:
– Excitatory (stimulate)
– Inhibitory (restrictive)
Biological Analogy
• Brain neuron
• Artificial neuron (processing element)
[Figure: an artificial neuron with inputs on weighted connections w1, w2, ..., wn combined by an activation f(net)]
• Set of processing elements (PEs) and connections (weights) with adjustable strengths
[Figure: a network with inputs X1-X5 feeding an input layer, a hidden layer, and an output layer]
Benefits of Neural Networks

• Pattern recognition, learning, classification, generalization and abstraction, and interpretation of incomplete and noisy inputs
• Provide some human problem-solving characteristics
• Robust
• Fast, flexible and easy to maintain
• Powerful hybrid systems


(Artificial) Neural networks (ANN)
• ANN architecture
(Artificial) Neural networks (ANN)
• ‘Neurons’
– have 1 output but many inputs
– output is a weighted sum of the inputs
– a threshold can be set
– this gives a non-linear response
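A minimal sketch of such a neuron in Python (an illustration, not taken from the slides; the weights and threshold are arbitrary example values):

def neuron_output(inputs, weights, threshold):
    # Weighted sum of the inputs, passed through a simple step (threshold) activation.
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0  # non-linear response

# Example: three inputs, three example weights, threshold of 0.5
print(neuron_output([1.0, 0.0, 1.0], [0.4, 0.9, 0.2], threshold=0.5))  # -> 1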
The Key Elements of Neural Networks
• Neural computing requires a number of neurons, to be connected together into a
"neural network". Neurons are arranged in layers.

• Each neuron within the network is usually a simple processing unit which takes
one or more inputs and produces an output. At each neuron, every input has an
associated "weight" which modifies the strength of each input. The neuron simply
adds together all the inputs and calculates an output to be passed on.
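To make the layered arrangement concrete, the sketch below passes one input pattern through a single layer of neurons, each adding up its weighted inputs. NumPy, the sigmoid activation, and the random example weights are assumptions for illustration, not details from the text.

import numpy as np

def sigmoid(z):
    # Smooth squashing activation applied to each neuron's weighted sum.
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
inputs = np.array([0.2, 0.7, 0.1])        # one input pattern with 3 values
weights = rng.normal(size=(3, 4))         # weights from 3 inputs to 4 neurons
layer_output = sigmoid(inputs @ weights)  # each neuron adds its weighted inputs, then activates
print(layer_output)                       # 4 outputs, one per neuron, passed on to the next layer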
What is an Artificial Neural Network?
• The neural network is:
– a model
– nonlinear (output is a nonlinear combination of inputs)
– input is numeric
– output is numeric
– pre- and post-processing are completed separately from the model
[Figure: numerical inputs → model (a mathematical transformation of input to output) → numerical outputs]
(Artificial) Neural networks (ANN)
• Training
– Initialize weights for all neurons
– Present input layer with e.g. spectral reflectance
– Calculate outputs
– Compare outputs with e.g. biophysical parameters
– Update weights to attempt a match
– Repeat until all examples presented
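The steps above can be written as a simple training loop. The sketch below uses a single linear neuron and a delta-rule weight update as a stand-in (the slides do not prescribe a particular update rule), with made-up example data in place of spectral reflectances and biophysical parameters.

import numpy as np

rng = np.random.default_rng(42)
X = rng.random((10, 3))                  # 10 example input patterns (e.g. reflectances)
y = X @ np.array([0.5, -0.2, 0.8])       # example target outputs (a known linear rule)

weights = rng.normal(scale=0.1, size=3)  # initialize weights
lr = 0.1                                 # learning rate (arbitrary choice)

for epoch in range(100):                 # present all examples, many times
    for inputs, target in zip(X, y):
        output = inputs @ weights        # calculate output
        error = target - output          # compare output with the desired value
        weights += lr * error * inputs   # update weights to attempt a match

print(weights)                           # should approach [0.5, -0.2, 0.8] here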
Training methods
• Supervised learning
In supervised training, both the inputs and the outputs are provided. The network
then processes the inputs and compares its resulting outputs against the desired
outputs. Errors are then propagated back through the system, causing the system to
adjust the weights which control the network. This process occurs over and over as the
weights are continually tweaked. The set of data which enables the training is called
the "training set." During the training of a network the same set of data is processed
many times as the connection weights are continually refined.
Example architectures : Multilayer perceptrons
• Unsupervised learning
In unsupervised training, the network is provided with inputs but not with desired
outputs. The system itself must then decide what features it will use to group the input
data. This is often referred to as self-organization or adaptation. At the present time,
unsupervised learning is not well understood.
Example architectures : Kohonen, ART
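As a rough sketch of self-organization (loosely in the spirit of a Kohonen-style competitive layer, not code from the text), the fragment below groups inputs without any desired outputs: the unit closest to each input "wins" and its weights move towards that input. All sizes, rates, and data are arbitrary example values.

import numpy as np

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.2, 0.05, (50, 2)),   # two loose clusters of 2-D points
                  rng.normal(0.8, 0.05, (50, 2))])

units = rng.random((2, 2))   # 2 competing units, each with 2 weights
lr = 0.1

for _ in range(20):                                             # several passes over the data
    for x in rng.permutation(data):
        winner = np.argmin(np.linalg.norm(units - x, axis=1))   # closest unit wins
        units[winner] += lr * (x - units[winner])                # move the winner towards x

print(units)   # each row should end up near the centre of one cluster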
Feedforward NNs
• The basic structure of a feedforward Neural Network

• The "learning rule" modifies the weights according to the input patterns that it is presented with. In a sense, ANNs learn by example, as do their biological counterparts.
• When the desired outputs are known, we have supervised learning, or learning with a teacher.
An overview of backpropagation
1. A set of examples for training the network is assembled. Each case consists of a problem
statement (which represents the input into the network) and the corresponding solution
(which represents the desired output from the network).
2. The input data is entered into the network via the input layer.
3. Each neuron in the network processes the input data with the resultant values steadily
"percolating" through the network, layer by layer, until a result is generated by the output
layer.
4. The actual output of the network is compared to the expected output for that particular input. This results in an error value which represents the discrepancy between the actual output and the expected output. On the basis of this error value, all of the connection weights in the network are gradually adjusted, working backwards from the output layer, through the hidden layer, to the input layer, until the correct output is produced. Fine-tuning the weights in this way has the effect of teaching the network how to produce the correct output for a particular input, i.e. the network learns.
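The numbered steps can be sketched in code. The example below trains a one-hidden-layer network on the XOR problem with plain gradient descent; the sigmoid activation, layer sizes, learning rate, and use of XOR are assumptions for illustration rather than details given in the text.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 1. Assemble training examples: inputs and the corresponding desired outputs (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden weights and biases
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output weights and biases
lr = 1.0

for epoch in range(5000):
    # 2-3. Enter the inputs and let values percolate layer by layer to the output.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # 4. Compare actual and expected outputs, then adjust the weights backwards
    #    from the output layer, through the hidden layer, towards the input layer.
    error = y - output
    delta_out = error * output * (1 - output)                   # output-layer gradient
    delta_hidden = (delta_out @ W2.T) * hidden * (1 - hidden)   # hidden-layer gradient
    W2 += lr * hidden.T @ delta_out
    b2 += lr * delta_out.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ delta_hidden
    b1 += lr * delta_hidden.sum(axis=0, keepdims=True)

print(output.round(2))   # should approach [[0], [1], [1], [0]] after training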
Backpropagation Network
Neural Network Terminology
• ANN - artificial neural network
• PE - processing element (neuron)
• Exemplar - one individual set of input/output data
• Epoch - the complete set of input/output data
• Weight - the adjustable parameter on each connection that scales the data passing through it
Types of Layers
• The input layer
– Introduces input values into the network
– No activation function or other processing
• The hidden layer(s)
– Perform classification of features
– Two hidden layers are sufficient to solve any problem
– Features imply more layers may be better
• The output layer
– Functionally just like the hidden layers
– Outputs are passed on to the world outside the neural network
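To relate the layer types to practice, the sketch below uses scikit-learn's MLPClassifier (an assumed library choice; the text names no implementation). Only the hidden layers are specified explicitly; the input layer size follows from the number of features and the output layer from the number of classes.

from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Example data: 200 patterns with 4 numeric features and 2 classes.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# Two hidden layers of 8 neurons each; the input layer passes values through
# unchanged and the output layer delivers the result to the outside world.
net = MLPClassifier(hidden_layer_sizes=(8, 8), activation='logistic',
                    max_iter=2000, random_state=0)
net.fit(X, y)
print(net.score(X, y))   # accuracy on the training data for this small example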
