Artificial Intelligence in Robotics
(Lecture 12)
By:
Dr. Mohammed Y. Hassan
Genetic Algorithm
Artificial Intelligence in Robotics
(Lecture 2)
Lecture 2
Neural Network (NN)
Introduction:
Computers are extremely fast and precise in executing sequences of
instructions that have been formulated for them. A human information-processing
system is composed of neurons switching at a speed about a million times slower
than computer gates. On the contrary, humans are more efficient than computers at
computationally complex tasks such as speech and pattern recognition.
This is possible because the brain operates
collectively and simultaneously on most or all data and inputs.
Biological Neurons:
A human brain consists of approximately 10^11 computing elements called
neurons. They communicate through a connection network of axons and synapses
having a density of approximately 10^4 synapses per neuron.
Neurons communicate with each other by means of electrical impulses. The
neurons operate in a chemical environment that is even more important in terms
of actual brain behaviour.
The input to the network is provided by sensory receptors. Receptors deliver
stimuli both from within the body and from the sense organs when the stimuli
originate in the external world.
As a result of information processing in the central nervous system, the effectors
are controlled and give human responses in the form of diverse actions.
The elementary nerve cell, called a neuron, is the fundamental building block of
the biological neural network.
A typical cell has three major regions: the cell body, which is also called
the soma, the axon, and the dendrites.
Dendrites form a dendritic tree, which is a very fine bush of thin fibres
around the neuron's body. They receive information from other neurons through
axons: long fibres that serve as transmission lines.
An axon is a long cylindrical connection that carries impulses from the
neuron. The end part of an axon splits into a fine arborisation. Each branch
of it terminates in a small end bulb almost touching the dendrites of
neighbouring neurons.
The axon-dendrite contact organ is called a synapse. The synapse is where
the neuron introduces its signal to the neighbouring neuron. The signals
reaching a synapse and received by dendrites are electrical impulses.
The inter-neuronal transmission is sometimes electrical but is usually
affected by the release of chemical transmitters at the synapse. Thus,
terminal buttons generate the chemical that affects the receiving neuron.
The receiving neuron either generates an impulse to its axon, or produces
no response.
The neuron is able to respond to the total of its inputs aggregated within a short
time interval called the period of latent summation.
The neuron's response is generated if the total potential of its membrane reaches
a certain level. Specifically, the neuron generates a pulse response and sends it
to its axon only if the conditions necessary for firing are fulfilled.
Incoming impulses can be excitatory if they cause the firing, or inhibitory if
they hinder the firing of the response. A more precise condition for firing is that
the excitation should exceed the inhibition by the amount called the threshold of
the neuron, typically a value of about 40 mV.
Since a synaptic connection causes the excitatory or inhibitory reactions of the
receiving neuron, it is practical to assign positive and negative unity weight
values, respectively, to such connections.
This allows us to reformulate the neuron's firing condition: the neuron fires
when the weighted sum of the impulses it receives exceeds the threshold value
during the latent summation period.
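The firing condition above can be sketched in a few lines of code. This is a minimal illustration, not part of the lecture; the function name, the ±1 unity weights, and the numbers are assumptions made for the example.

```python
# Minimal sketch of the neuron firing condition: weights are +1 (excitatory)
# or -1 (inhibitory), and the neuron fires when the weighted sum of the
# impulses received during the latent summation period exceeds the threshold.

def fires(impulses, weights, threshold):
    """Return True if excitation exceeds inhibition by more than `threshold`."""
    total = sum(w * x for w, x in zip(weights, impulses))
    return total > threshold

# Three excitatory inputs and one inhibitory input, threshold of 1:
print(fires([1, 1, 1, 1], [+1, +1, +1, -1], 1))  # 3 - 1 = 2 > 1 -> True
```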
After carrying a pulse, an axon fibre is in a state of complete non-excitability for
a certain time called the refractory period. For this time interval the nerve does
not conduct any signals, regardless of the intensity of excitation.
Thus, we may divide the time scale into consecutive intervals, each equal to the
length of the refractory period. This will enable a discrete-time description of
the neurons' performance in terms of their states at discrete time instances.
Types of Non-Linearities (Activation Functions):
Activation functions can be divided into two categories: continuous and
binary functions.
a) Bipolar sigmoid:
A bipolar continuous activation function with steepness λ:

f(net) = 2 / (1 + e^(-λ·net)) - 1

b) Unipolar sigmoid:
By shifting and scaling the bipolar activation function, a unipolar
continuous activation function can be obtained:

f(net) = 1 / (1 + e^(-λ·net))

c) Linear:
A linear function can also be used as an activation function:

f(net) = net

Figure. Example of a linear activation function.
a) Bipolar binary:
A hard-limiting bipolar activation function:

f(net) = sgn(net) = { +1, net ≥ 0
                      -1, net < 0

b) Unipolar binary:
As with the unipolar continuous activation function, the unipolar binary
activation function is obtained by shifting and scaling the bipolar binary
function:

f(net) = { 1, net ≥ 0
           0, net < 0
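The activation functions above can be sketched as follows. This is an illustrative implementation; the function names are mine, and `lam` stands for the steepness parameter λ.

```python
import math

# Sketches of the standard activation functions discussed above.

def bipolar_sigmoid(net, lam=1.0):
    # Continuous, output in (-1, +1).
    return 2.0 / (1.0 + math.exp(-lam * net)) - 1.0

def unipolar_sigmoid(net, lam=1.0):
    # Shifted and scaled bipolar sigmoid, output in (0, 1).
    return 1.0 / (1.0 + math.exp(-lam * net))

def linear(net):
    return net

def bipolar_binary(net):
    # Hard limiter, output in {-1, +1}.
    return 1 if net >= 0 else -1

def unipolar_binary(net):
    # Hard limiter, output in {0, 1}.
    return 1 if net >= 0 else 0
```

Note that the shift-and-scale relation in the text holds here: `unipolar_sigmoid(net)` equals `(bipolar_sigmoid(net) + 1) / 2`.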
Types of Neural Networks (NNs):
Neural networks can be defined as interconnections of neurons such that
neuron outputs are connected, through weights, to all other neurons including
themselves; both lag-free and delay connections are allowed. Therefore, one can
define two general types of artificial neural networks: feedforward networks
(lag-free connections only) and feedback networks (containing delay connections).
Artificial Intelligence in Robotics
(Lecture 3)
Neural Network Recall:
The process of computation of the output response (o) for a given input (x)
performed by the network is known as recall.
Recall is the proper processing phase for a neural network and its objective is
to retrieve the information.
Recall corresponds to the decoding of the stored content which may have
been encoded in a network previously.
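For a single-layer network, recall can be sketched as computing o = f(Wx): the stored weight matrix multiplies the input vector, and the activation function is applied element-wise. The code below is an illustration only; the bipolar binary activation and the numbers in the weight matrix are assumptions, not values from the lecture.

```python
# Recall sketch for a single-layer network: o = f(W x), where W is the
# stored weight matrix and f (here a bipolar hard limiter) is applied
# element-wise to each neuron's net input.

def recall(W, x):
    sgn = lambda v: 1 if v >= 0 else -1
    return [sgn(sum(w_ij * x_j for w_ij, x_j in zip(row, x))) for row in W]

# Hypothetical 2-neuron network with 3 inputs:
W = [[1, -1, 0],
     [0,  1, 1]]
print(recall(W, [1, -1, 1]))  # net = [2, 0] -> output [1, 1]
```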
Figure. Autoassociation.
Heteroassociation:
In heteroassociative processing, associations between pairs of patterns are
stored.
Figure. Heteroassociation.
Classification:
If a set of input patterns is divided into a number of classes or categories, then
in response to an input pattern, the classifier should recall the information
regarding the class membership of the input pattern. Typically, classes are
expressed by discrete-valued output vectors, and thus the output neurons of
classifiers would employ binary activation functions.
Figure. Classification.
Generalization:
One of the distinct strengths of neural networks is their ability to generalize.
The network is said to generalize well when it sensibly interpolates input
patterns that are new to the network.
Figure. Generalization.
When no information is available as to the correctness or incorrectness of
responses, learning must somehow be accomplished based on observations of
responses to inputs about which we have marginal or no knowledge.
Learning rules:
A neuron is considered to be an adaptive element. Its weights are modified
depending on the input signal, its output value, and the associated teacher response.
In some cases, the teacher signal is not available and no error information can be
used (unsupervised learning method).
For the neural network shown below, the jth input can be an output of another
neuron or it can be an external input. Under different learning rules, the form of the
neuron's activation function may be different. Note that the threshold parameter
may be included in learning as one of the weights; this would require fixing one of
the inputs at a constant value (e.g., -1).
Figure. Single neuron
There are several methods used to train a neural network; three of them are
explained in the following: the Hebbian learning rule (binary or continuous
activation, unsupervised), the Perceptron learning rule (binary activation,
supervised), and the Delta rule (continuous activation, supervised).
Example:
Assume the network shown in figure below;
illustrate one epoch (one iteration for each pattern) of Hebbian learning with a
bipolar binary activation function, assuming a learning rate of c = 1, where the
input and initial weight vectors are:
Solution:
Note: Since the initial weights are of non-zero value, the network has apparently
been trained beforehand.
The bipolar binary activation function is:

f(net) = sgn(net) = { +1, net ≥ 0
                      -1, net < 0
Note: The iteration process stops either when the required number of epochs is
reached or when none of the weights change over a full epoch.
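One epoch of the Hebbian procedure above can be sketched as follows. The input patterns and initial weights below are hypothetical stand-ins, since the original vectors are not reproduced in these notes; the update rule Δw = c·o·x with o = sgn(wᵀx) is the unsupervised Hebbian rule with a bipolar binary activation, as used in the example.

```python
# One epoch of Hebbian learning with a bipolar binary activation.
# The Hebbian rule is unsupervised: no teacher signal is used, and each
# weight update is delta_w = c * o * x, where o = sgn(w . x).

def sgn(net):
    return 1 if net >= 0 else -1

def hebbian_epoch(w, patterns, c=1.0):
    for x in patterns:
        net = sum(wi * xi for wi, xi in zip(w, x))  # net input w . x
        o = sgn(net)                                # neuron output
        w = [wi + c * o * xi for wi, xi in zip(w, x)]
    return w

# Hypothetical initial weights and two input patterns:
w0 = [1.0, -1.0, 0.0]
patterns = [[1, -2, 1.5], [1, -0.5, -2]]
print(hebbian_epoch(w0, patterns))  # -> [3.0, -3.5, -0.5]
```

Repeating the call with the returned weights gives the next epoch, so the stopping criterion from the note (fixed epoch count, or no weight change over an epoch) can be checked between calls.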
Example:
Repeat the previous example assuming the activation function is the continuous
bipolar sigmoid. Assume λ = 1.
Solution:
The bipolar sigmoid activation function is:

f(net) = 2 / (1 + e^(-λ·net)) - 1
Note: The student must give detailed solution in Exams.
Homework 1:
1. Iterate the previous two examples for a second epoch and check whether the
weights have changed.
2. Write a MATLAB program that implements Hebbian learning rule.
Artificial Intelligence in Robotics
(Lecture 4)