
CS407 Neural Computation

Lecture 8:
Neural Networks for Constrained
Optimization.

Lecturer: A/Prof. M. Bennamoun


Neural Nets for Constrained Optimization.

- Introduction
- Boltzmann machine
  - Introduction
  - Architecture and Algorithm
- Boltzmann machine: application to the TSP
- Continuous Hopfield nets
- Continuous Hopfield nets: application to the TSP
- References and suggested reading

Fausett

Introduction
- There are nets that are designed for constrained optimization problems (such as the Traveling Salesman Problem, TSP).
- These nets have fixed weights that incorporate information about the constraints and the quantity to be optimized.
- The nets iterate to find a pattern of output signals that represents a solution to the problem.
- Examples of such nets are the Boltzmann machine (without learning), the continuous Hopfield net, and several variations (Gaussian and Cauchy nets).
- Other optimization problems to which this type of NN can be applied include job shop scheduling, space allocation, …
Traveling Salesman Problem (TSP)

- The aim of the TSP is to find a tour of a given set of cities that is of minimum length.
- A tour consists of visiting each city exactly once and returning to the starting city.
- The tour of minimum distance is desired.
- The difficulty of finding a solution increases rapidly as the number of cities increases.
- Many approaches other than NNs to solve this problem are extensively reported in the literature.

Introduction… NN approach to constrained optimization

- Each unit represents a hypothesis, with the unit "on" if the hypothesis is true and "off" if it is false.
- The weights are fixed to represent both the constraints of the problem and the function to be optimized.
- The solution of the problem corresponds to the minimum of an energy function or the maximum of a consensus function for the net.
- NNs have several potential advantages over traditional techniques for certain types of optimization problems:
  - They can find near-optimal solutions quickly for large problems.

Introduction… NN approach to constrained optimization

  - They can also handle situations in which some constraints are weak (desirable but not absolutely required).
    For example, in the TSP it is physically impossible to visit 2 cities simultaneously, but it is only desirable to visit each city exactly once.
    The difference between these types of constraints can be reflected by making the penalty for having 2 units in the same column "on" simultaneously larger than the penalty for having 2 units in the same row "on" simultaneously.
    If it is more important to visit some cities than others, these cities can be given larger self-connection weights.

Introduction… NN architecture for the TSP

- For n cities, we use n² units, arranged in a square array.
- A valid tour is represented by exactly one unit being "on" in each row and in each column.
  - Two units being "on" in a row indicates that the corresponding city was visited twice;
  - Two units being "on" in a column shows that the salesman was in two cities at the same time.
- The units in each row are fully interconnected; similarly, the units in each column are fully interconnected.
- The weights are set so that units within the same row (or the same column) will tend not to be "on" at the same time.
- In addition, there are connections (see later)
  - between units in adjacent columns and
  - between units in the first and last columns,
  corresponding to the distances between cities.
Introduction… NN architecture for the TSP

[Figure: a 10×10 grid of units U_{X,j}, rows A–J for cities and columns 1–10 for tour positions, followed by the weight diagram: each unit U_{i,j} has a self-connection of weight b, and units in the same row or the same column are mutually connected with weight −p.]

Boltzmann machine
- The states of the units of a Boltzmann machine are binary valued, with probabilistic state transitions.
- The configuration of the net is the vector of the states of the units.
- The Boltzmann machine described in this lecture has fixed weights w_{ij}, which express the degree of desirability that units X_i and X_j both be "on".
- In applying the Boltzmann machine to constrained optimization problems, the weights represent the constraints of the problem and the quantity to be optimized. Note that the description presented here is based on the maximization of a consensus function (rather than the minimization of a cost function).
- The architecture of a Boltzmann machine is quite general, consisting of
  - a set of units (X_i and X_j are 2 representative units), and
  - a set of bi-directional connections between pairs of units.
- If units X_i and X_j are connected, w_{ij} ≠ 0.
- The bi-directional nature of the connection is often represented as w_{ij} = w_{ji}.

Boltzmann machine
- A unit may also have a self-connection w_{ii} (or equivalently, there may be a bias unit, which is always "on" and connected to every other unit; in this interpretation, the self-connection weight would be replaced by the bias weight).
- The state x_i of unit X_i is either 1 ("on") or 0 ("off"). The objective of the net is to maximize the consensus function

  C = \sum_i \sum_{j \le i} w_{ij} x_i x_j

  where the sum runs over all units of the net.
- The net finds this maximum (or at least a local maximum) by letting each unit attempt to change its state (from "on" to "off" or vice versa).
- The attempts may be made either sequentially (one unit at a time) or in parallel (several units simultaneously). Only the sequential Boltzmann machine will be discussed here.
Fausett

Boltzmann machine

  C = \sum_i \sum_{j \le i} w_{ij} x_i x_j    (x_i = 1 if unit X_i is "on", x_i = 0 if it is "off")

Expanding the inner sum term by term:

  i = 1:  w_{11} x_1 x_1
  i = 2:  w_{21} x_2 x_1 + w_{22} x_2 x_2
  i = 3:  w_{31} x_3 x_1 + w_{32} x_3 x_2 + w_{33} x_3 x_3
  i = 4:  w_{41} x_4 x_1 + w_{42} x_4 x_2 + w_{43} x_4 x_3 + w_{44} x_4 x_4
  i = 5:  w_{51} x_5 x_1 + w_{52} x_5 x_2 + w_{53} x_5 x_3 + w_{54} x_5 x_4 + w_{55} x_5 x_5

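As a small illustration (not from the slides), the consensus sum over j ≤ i can be computed directly from the lower triangle of the weight matrix; the weights and states below are made-up values.

```python
import numpy as np

def consensus(w, x):
    """Consensus C = sum_i sum_{j<=i} w_ij * x_i * x_j
    (lower triangle of w, including the diagonal w_ii)."""
    n = len(x)
    return sum(w[i, j] * x[i] * x[j] for i in range(n) for j in range(i + 1))

# Toy symmetric weights for 3 units; only the lower triangle enters C.
w = np.array([[ 1.0, -2.0,  0.5],
              [-2.0,  1.0, -2.0],
              [ 0.5, -2.0,  1.0]])
x = np.array([1, 0, 1])  # units 1 and 3 are "on"
# C = w11 + w31 + w33 = 1.0 + 0.5 + 1.0 = 2.5
print(consensus(w, x))
```

Because only pairs with j ≤ i enter the sum, each symmetric connection is counted once and the diagonal supplies the self-connection terms.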

Boltzmann machine
- The change in consensus if unit X_i were to change its state (from 1 to 0 or from 0 to 1) is

  \Delta C(i) = [1 - 2x_i] \left( w_{ii} + \sum_{j \ne i} w_{ij} x_j \right)

  where x_i is the current state of unit X_i and the sum collects the contributions from all units X_j that are "on" and connected to X_i through w_{ij}.
  The coefficient [1 − 2x_i] is +1 if unit X_i is currently "off" (x_i = 0) and −1 if it is currently "on" (x_i = 1).
- NOTE: if unit X_i were to change its activation, the resulting change in consensus can be computed from information that is local to unit X_i, i.e. from the weights on its connections and the activations of the units to which it is connected (with w_{ij} = 0 if unit X_j is not connected to unit X_i).

Boltzmann machine
- However, unit X_i does not necessarily change its state, even if doing so would increase the consensus of the net.
- The probability of the net accepting a change in state for unit X_i is

  A(i, T) = \frac{1}{1 + \exp(-\Delta C(i) / T)}

  The control parameter T (called the temperature) is gradually reduced as the net searches for a maximal consensus.
- Lower values of T make it more likely that the net will accept a change of state that increases its consensus, and less likely that it will accept a change that reduces its consensus:

  T \to 0 \Rightarrow \exp(-\Delta C / T) \to 0 \Rightarrow A(i, T) \to 1    (assuming \Delta C > 0)

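The acceptance rule can be sketched in a few lines; the probed values of ΔC and T below are arbitrary, chosen only to show the limiting behaviour.

```python
import math

def acceptance(delta_c, T):
    """Probability of accepting a state change with consensus change
    delta_c at temperature T:  A = 1 / (1 + exp(-delta_c / T))."""
    return 1.0 / (1.0 + math.exp(-delta_c / T))

# At high T the net is indifferent (A ~ 0.5 either way); at low T it
# almost surely accepts improvements and rejects detrimental changes.
for T in (100.0, 1.0, 0.01):
    print(T, acceptance(+1.0, T), acceptance(-1.0, T))
```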

Boltzmann machine
- The use of a probabilistic update procedure for the activations, with the control parameter decreasing as the net searches for the optimal solution to the problem represented by its weights, reduces the chances of the net getting stuck in a local maximum.
- This process of gradually reducing T is called simulated annealing.
- It is analogous to the physical annealing process used to produce a strong metal (with a regular crystalline structure).
- During annealing, a molten metal is cooled gradually in order to avoid imperfections in the crystalline structure of the metal due to freezing.

Boltzmann machine Architecture

- Here is the architecture of a Boltzmann machine for units in a 2D array.
- The units within each row are fully interconnected.
- Similarly, the units within each column are also fully interconnected.
- The weight on each of these connections is −p (where p > 0).
- Each unit has a self-connection, with weight b > 0.
- A typical unit is labelled U_{i,j}.

[Figure: n×n array of units U_{1,1} … U_{n,n}; each unit has a self-connection of weight b, and every pair of units in the same row or the same column is connected with weight −p.]

Boltzmann machine Algorithm

Setting the weights:
- The weights for a Boltzmann machine are fixed so that the net will tend to make state transitions toward a maximum of the consensus function defined above.
- If we wish the net (shown in the previous slide) to have exactly one unit "on" in each row and in each column, we must choose the values of the weights p and b so that improving the configuration corresponds to increasing the consensus.
- Each unit is connected to every other unit in the same row with weight −p (p > 0).
- Similarly, each unit is connected to every other unit in the same column with weight −p.
- These weights are penalties for violating the condition that at most one unit be "on" in each row and each column.
- In addition, each unit has a self-connection, of weight b > 0.

Boltzmann machine Algorithm

- The self-connection weight is an incentive (bonus) to encourage a unit to turn "on" if it can do so without causing more than one unit to be "on" in a row or column.
- If p > b, the net will function as desired:
  - If unit U_{ij} is "off" (u_{ij} = 0) and none of the units connected to U_{ij} is "on", changing the status of U_{ij} to "on" will increase the consensus of the net by the amount b (a desirable change).
  - On the other hand, if one of the units in row i or in column j (say, U_{i,j+1}) is already "on", attempting to turn unit U_{ij} "on" would result in a change of consensus by the amount b − p. Thus, for b − p < 0 (i.e. p > b), the effect would be to decrease the consensus (the net will tend to reject this unfavorable change).
  - Bonus and penalty connections, with p > b, will be used in the net for the TSP to represent the constraints for a valid tour.

Boltzmann machine Algorithm

Application procedure:
- The weight between units U_{ij} and U_{IJ} is denoted w(i, j; I, J):

  w(i, j; I, J) = -p   if i = I or j = J (but not both)
  w(i, j; i, j) = b

- The application procedure is as follows:

Step 0. Initialize weights to represent the constraints of the problem.
        Initialize the control parameter (temperature) T.
        Initialize activations of units (random binary values).
Step 1. While stopping condition is false, do Steps 2-8.
  Step 2. Do Steps 3-6 n² times (this constitutes an epoch).
    Step 3. Choose integers I and J at random between 1 and n (unit U_{IJ} is the current candidate to change its state).
    Step 4. Compute the change in consensus that would result:

      \Delta C = [1 - 2u_{IJ}] \left( w(I, J; I, J) + \sum_{(i,j) \ne (I,J)} w(i, j; I, J) \, u_{ij} \right)

Boltzmann machine Algorithm

    Step 5. Compute the probability of acceptance of the change:

      A(T) = \frac{1}{1 + \exp(-\Delta C / T)}

    Step 6. Determine whether or not to accept the change.
            Let R be a random number between 0 and 1.
            If R < A, accept the change: u_{I,J} = 1 − u_{I,J} (this changes the state of unit U_{I,J}).
            If R ≥ A, reject the proposed change.
  Step 7. Reduce the control parameter: T(new) = 0.95 T(old).
  Step 8. Test stopping condition: if there has been no change of state for a specified number of epochs, or if the temperature has reached a specified value, stop; otherwise continue.


Boltzmann machine Algorithm

Initial temperature:
- The initial temperature should be taken large enough that the probability of accepting a change of state is approximately 0.5, regardless of whether the change is beneficial or detrimental:

  T \to \infty \Rightarrow \exp(-\Delta C / T) \to 1 \Rightarrow A(i, T) \to 0.5

- However, since a high starting temperature increases the required computation time significantly, a lower initial temperature may be more practical in some applications.

Boltzmann machine Algorithm

Cooling schedule:
- Theoretical results show that the temperature should be cooled slowly, according to the logarithmic formula

  T_B(k) = \frac{T_0}{\log(1 + k)}

  where k is the epoch number.
- An exponential cooling schedule can also be used:

  T(new) = \alpha T(old)

  where the temperature is reduced after each epoch.
  - A larger α (such as α = 0.98) allows for fewer epochs at each temperature;
  - A smaller α (such as α = 0.9) may require more epochs at each temperature.

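The two schedules can be compared side by side; T0 = 10 and the epochs probed are made-up values for illustration.

```python
import math

T0 = 10.0

def logarithmic(k, T0=T0):
    """Theoretical schedule: T_B(k) = T0 / log(1 + k), for epoch k >= 1."""
    return T0 / math.log(1 + k)

def exponential(k, alpha=0.95, T0=T0):
    """Practical schedule: T(new) = alpha * T(old) after each epoch."""
    return T0 * alpha ** k

# The logarithmic schedule cools far more slowly than any exponential one.
for k in (1, 10, 100, 1000):
    print(k, round(logarithmic(k), 3), round(exponential(k), 6))
```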

Boltzmann machine Application TSP

Nomenclature:

  n        number of cities in the tour (there are n² units in the net)
  i        index designating a city; 1 ≤ i ≤ n
  j        index designating a position in the tour, mod n; i.e. j = n + 1 → j = 1 and j = 0 → j = n
  U_{i,j}  unit representing the hypothesis that the ith city is visited at the jth step of the tour
  u_{i,j}  activation of unit U_{i,j}: u_{i,j} = 1 if the hypothesis is true, u_{i,j} = 0 if it is false
  d_{i,k}  distance between city i and city k, k ≠ i
  d        maximum distance between 2 cities

Boltzmann machine Application TSP

Architecture:
- For this application it is convenient to arrange the units of the NN in a grid (figure below).
  - The rows of the grid represent the cities to be visited;
  - The columns represent the position of a city in the tour.

[Figure: 10×10 grid of units U_{X,i}, rows A–J (cities) by columns 1–10 (tour positions).]

Boltzmann machine Application TSP

- U_{i,j} has a self-connection of weight b; this represents the desirability of visiting city i at stage j.
- U_{i,j} is connected to all other units in row i with penalty weight −p; this represents the constraint that the same city is not to be visited twice.
- U_{i,j} is connected to all other units in column j with penalty weight −p; this represents the constraint that 2 cities cannot be visited simultaneously.
- U_{i,j} is connected to U_{k,j+1} for 1 ≤ k ≤ n, k ≠ i, with weight −d_{i,k}; this represents the distance traveled in making the transition from city i at stage j to city k at stage j+1.
- U_{i,j} is connected to U_{k,j-1} for 1 ≤ k ≤ n, k ≠ i, with weight −d_{i,k}; this represents the distance traveled in making the transition from city k at stage j−1 to city i at stage j.

Boltzmann machine Application TSP

Setting the weights: the desired net will be constructed in 2 steps.
- First, a NN will be formed for which the maximum consensus occurs whenever the constraints of the problem are satisfied, i.e. when exactly one unit is "on" in each row and in each column.
- Second, we will add weighted connections to represent the distances between the cities. In order to treat the problem as a maximum-consensus problem, the weights representing distances will be negative.
- A Boltzmann machine with weights representing the constraints (but not the distances) for the TSP is shown below. If p > b, the net will function as desired (as explained earlier).
- To complete the formulation of a Boltzmann NN for the TSP, weighted connections representing distances must be included. For this purpose, a typical unit U_{i,j} is connected to the units U_{k,j-1} and U_{k,j+1} (for all k ≠ i) by weights that represent the distances between city i and city k.

Boltzmann machine Application TSP

[Figure: constraint weights for the TSP net; each unit U_{i,j} has a self-connection of weight b, and units within the same row or the same column are connected with weight −p.]

Boltzmann machine Application TSP

- The distance weights for a typical unit U_{i,j} are shown in the figure below.

[Figure: Boltzmann NN for the TSP; the weights represent the distances for unit U_{i,j}, which is connected to each unit U_{k,j-1} and U_{k,j+1} (k ≠ i) with weight −d_{k,i}.]

Boltzmann machine Application TSP

NOTE:
- Units in the last column are connected to units in the first column by connections representing the appropriate distances.
- However, units in a particular column are not connected to units in columns other than those immediately adjacent to it.

We now consider the relation between the constraint weight b and the distance weights.
- Let d denote the maximum distance between any 2 cities in the tour.
- Assume that no city is visited in the jth position of the tour and that no city is visited twice. In this case some city, say i, is not visited at all; i.e. no unit is "on" in column j or in row i.
- Since allowing U_{i,j} to turn on should be encouraged, the weights should be set so that the consensus will increase if it turns on.
- The change in consensus will be b − d_{i,k1} − d_{i,k2}, where
  - k1 indicates the city visited at stage j−1 of the tour, and
  - k2 denotes the city visited at stage j+1 (city i is visited at stage j).
- This change is ≥ b − 2d, and it should be positive even when both distances equal the maximum d.

Boltzmann machine Application TSP

- However, equality occurs only if the cities visited in positions j−1 and j+1 are both the maximum distance d away from city i.
- In general, requiring the change in consensus to be positive will suffice, so we take b > 2d.
- Thus, we see that if p > b, the consensus function has a higher value for a feasible solution (one that satisfies the constraints) than for a non-feasible solution.
- If b > 2d, the consensus will be higher for a short feasible tour than for a longer one.
- In summary: p > b > 2d.

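The weight-setting rules above can be sketched as a (hypothetical) helper that builds the full weight tensor w(i, j; I, J), with defaults chosen to satisfy p > b > 2d; the 4-city distance matrix is made up for illustration.

```python
import numpy as np

def tsp_weights(dist, p=None, b=None):
    """Weight w(i,j; I,J) between units U_{i,j} (city i at tour position j)
    and U_{I,J}: bonus b on the diagonal, penalty -p within a row or column,
    and -d[i,I] between adjacent tour positions (positions taken mod n).
    Defaults follow the slide's constraint p > b > 2d."""
    n = len(dist)
    d = dist.max()
    b = b if b is not None else 2 * d + 1   # b > 2d
    p = p if p is not None else b + 1       # p > b
    w = np.zeros((n, n, n, n))
    for i in range(n):
        for j in range(n):
            for I in range(n):
                for J in range(n):
                    if (i, j) == (I, J):
                        w[i, j, I, J] = b                # self-connection
                    elif i == I or j == J:
                        w[i, j, I, J] = -p               # row/column penalty
                    elif J == (j + 1) % n or J == (j - 1) % n:
                        w[i, j, I, J] = -dist[i, I]      # distance weight
    return w

# Hypothetical 4-city distance matrix (symmetric, zero diagonal); max d = 6,
# so b = 13 and p = 14 by default.
dist = np.array([[0., 1., 2., 3.],
                 [1., 0., 4., 5.],
                 [2., 4., 0., 6.],
                 [3., 5., 6., 0.]])
w = tsp_weights(dist)
print(w[0, 0, 0, 0], w[0, 0, 0, 1], w[0, 0, 1, 1])
```

The mod-n adjacency gives the wrap-around connections between the first and last columns noted above, and the resulting tensor is symmetric: w(i, j; I, J) = w(I, J; i, j).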

Boltzmann machine Analysis

- The TSP is a nice model for a variety of constrained optimization problems.
- It is, however, a difficult problem for the Boltzmann machine, because in order to go from one valid tour to another, several invalid tours must be accepted.
- By contrast, the transition from valid solution to valid solution may not be as difficult in other constrained optimization problems.
- Equilibrium: the net is in thermal equilibrium (at a particular temperature) when the probabilities P_α and P_β of 2 configurations of the net, α and β, obey the Boltzmann distribution

  \frac{P_\alpha}{P_\beta} = \exp\!\left( \frac{E_\beta - E_\alpha}{T} \right)

  where E_α and E_β are the energies of configurations α and β.

Boltzmann machine Analysis

- At higher temperatures, the probabilities of different configurations are more nearly equal:

  T \to \infty \Rightarrow \exp(\cdot) \to 1 \Rightarrow P_\alpha \approx P_\beta

- At lower temperatures, there is a stronger bias toward configurations with lower energy.
- Starting at a sufficiently high temperature ensures that the net will have approximately equal probabilities of accepting or rejecting any proposed state transition.
- If the temperature is reduced slowly, the net will remain in equilibrium at lower temperatures.
- It is not practical to verify the equilibrium condition directly at each temperature, as there are too many possible configurations.

Boltzmann machine Analysis

Energy function:
- The energy of a configuration can be defined as

  E = -\sum_i \sum_{j<i} w_{ij} x_i x_j + \sum_i \theta_i x_i

  where θ_i is a threshold and self-connections (or biases) are not used.
- The difference in energy between a configuration with unit X_k "off" and one with X_k "on" (the states of all other units remaining unchanged) is

  \Delta E(k) = -\theta_k + \sum_i w_{ik} x_i

- If the units change their activations randomly and asynchronously and the net always moves to a lower energy (rather than moving to a lower energy with a probability that is less than 1), the discrete Hopfield net results.

Boltzmann machine Analysis

- To simplify notation, one may include a unit in the net that is connected to every other unit and is always "on". This allows the threshold to be treated as any other weight, so that

  E = -\sum_i \sum_{j<i} w_{ij} x_i x_j

- The energy gap between the configuration with unit X_k "off" and that with unit X_k "on" is then

  \Delta E(k) = \sum_i w_{ik} x_i


Continuous Hopfield net

- A modification of the discrete Hopfield net, with continuous-valued output functions, can be used either for associative-memory problems (as with the discrete form) or for constrained optimization problems such as the TSP.
- As with the discrete Hopfield net, the connections between units are bidirectional, so the weight matrix is symmetric; i.e. the connection from unit U_i to unit U_j (with weight w_{ij}) is the same as the connection from U_j to U_i (with weight w_{ji}).
- For the continuous Hopfield net, we denote the internal activity of a neuron by u_i; its output signal is v_i = g(u_i).
- We define an energy function

  E = 0.5 \sum_{i=1}^{n} \sum_{j=1, j \ne i}^{n} w_{ij} v_i v_j + \sum_{i=1}^{n} \theta_i v_i

Continuous Hopfield net

  E = 0.5 \sum_{i=1}^{n} \sum_{j=1, j \ne i}^{n} w_{ij} v_i v_j + \sum_{i=1}^{n} \theta_i v_i

For n = 2 this expands to

  E = 0.5 (w_{12} v_1 v_2 + w_{21} v_2 v_1) + (\theta_1 v_1 + \theta_2 v_2)

[Figure: two units U_1 and U_2 with external inputs θ_1 and θ_2, connected by the symmetric weights w_{12} = w_{21}.]

Continuous Hopfield net

- The net will converge to a stable configuration that is a minimum of the energy function as long as dE/dt ≤ 0. With

  E = 0.5 \sum_i \sum_{j \ne i} w_{ij} v_i v_j + \sum_i \theta_i v_i

  the chain rule gives

  \frac{dE}{dt} = \sum_i \frac{\partial E}{\partial v_i} \cdot \frac{dv_i}{du_i} \cdot \frac{du_i}{dt}

  where

  \frac{\partial E}{\partial v_i} = \sum_{j \ne i} w_{ij} v_j + \theta_i = net_i,
  \quad \frac{dv_i}{du_i} = g'(u_i) > 0,
  \quad \frac{du_i}{dt} = -\frac{\partial E}{\partial v_i} = -net_i

  Hence

  \frac{dE}{dt} = -\sum_i g'(u_i) \, (net_i)^2 \le 0

  so E is a Lyapunov (energy) function.
dt i 40
Fausett

Continuous Hopfield net

- For this form of the energy function, the net will converge if the activity of each neuron changes with time according to the differential equation

  \frac{du_i}{dt} = -\frac{\partial E}{\partial v_i} = -\sum_{j=1}^{n} w_{ij} v_j - \theta_i

- In the original presentation of the continuous Hopfield net, the energy function was

  E = -0.5 \sum_{i=1}^{n} \sum_{j=1}^{n} w_{ij} v_i v_j - \sum_{i=1}^{n} \theta_i v_i + \sum_{i=1}^{n} \frac{1}{\tau_i} \int_0^{v_i} g^{-1}(v) \, dv

  where τ_i is a time constant.

Continuous Hopfield net

- If the activity of each neuron changes with time according to the differential equation

  \frac{du_i}{dt} = -\frac{u_i}{\tau} + \sum_{j=1}^{n} w_{ij} v_j + \theta_i

  the net will converge.
- In the Hopfield-Tank solution of the TSP, each unit has 2 indices:
  - the first index (x, y, etc.) denotes the city;
  - the second (i, j, etc.) denotes the position in the tour.
- The Hopfield-Tank energy function for the TSP is

  E = \frac{A}{2} \sum_x \sum_i \sum_{j \ne i} v_{x,i} v_{x,j}
    + \frac{B}{2} \sum_i \sum_x \sum_{y \ne x} v_{x,i} v_{y,i}
    + \frac{C}{2} \left( N - \sum_x \sum_i v_{x,i} \right)^2
    + \frac{D}{2} \sum_x \sum_{y \ne x} \sum_i d_{x,y} \, v_{x,i} (v_{y,i+1} + v_{y,i-1})

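The four energy terms can be evaluated directly. As a sanity check (with a made-up 3-city distance matrix), for a valid tour v is a permutation matrix, so the row, column, and total-activation terms vanish and the distance term equals D times the tour length (each edge is counted twice against the factor D/2).

```python
import numpy as np

def hopfield_tank_energy(v, dist, A=500.0, B=500.0, C=200.0, D=500.0, N=15.0):
    """The four terms of the Hopfield-Tank TSP energy for activation
    matrix v[x, i] (city x at position i); positions are taken mod n."""
    n = v.shape[0]
    e1 = (A / 2) * sum(v[x, i] * v[x, j]            # same city twice
                       for x in range(n) for i in range(n)
                       for j in range(n) if j != i)
    e2 = (B / 2) * sum(v[x, i] * v[y, i]            # two cities at one step
                       for i in range(n) for x in range(n)
                       for y in range(n) if y != x)
    e3 = (C / 2) * (N - v.sum()) ** 2               # total activation
    e4 = (D / 2) * sum(dist[x, y] * v[x, i]         # tour length
                       * (v[y, (i + 1) % n] + v[y, (i - 1) % n])
                       for x in range(n) for y in range(n) if y != x
                       for i in range(n))
    return e1, e2, e3, e4

dist = np.array([[0., 1., 2.],
                 [1., 0., 1.],
                 [2., 1., 0.]])
v = np.eye(3)                   # tour A -> B -> C -> A, length 1 + 1 + 2 = 4
print(hopfield_tank_energy(v, dist, N=3.0))
```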

Continuous Hopfield net

- The differential equation for the activity of unit U_{X,I} is

  \frac{du_{X,I}}{dt} = -\frac{u_{X,I}}{\tau}
    - A \sum_{J \ne I} v_{X,J}
    - B \sum_{y \ne X} v_{y,I}
    + C \left( N - \sum_x \sum_i v_{x,i} \right)
    - D \sum_{y \ne X} d_{X,y} (v_{y,I+1} + v_{y,I-1})

- The output signal is given by applying the sigmoid function (with range between 0 and 1), which Hopfield and Tank expressed as

  v_i = g(u_i) = 0.5 [1 + \tanh(\alpha u_i)]

Continuous Hopfield net

Approach:
- Formulate the problem in terms of a Hopfield energy of the form

  E = 0.5 \sum_{i=1}^{n} \sum_{j=1}^{n} w_{ij} v_i v_j + \sum_{i=1}^{n} \theta_i v_i

  Problem → Formulation by Hopfield Energy → Energy Minimization by Hopfield net → Solution State

Continuous Hopfield net

Architecture for the TSP:
- The units used to solve the 10-city TSP are arranged as shown.

[Figure: 10×10 grid of units U_{X,i}, rows A–J (cities) by columns 1–10 (tour positions).]

Continuous Hopfield net

Architecture for the TSP:
- The connection weights are fixed and are usually not shown or even explicitly stated.
- The weights for inter-row connections correspond to the parameter A in the energy equation; there is a contribution to the energy if 2 units in the same row are "on".
- Similarly, the inter-column connections have weights corresponding to the parameter B.
- The distance connections appear in the fourth term of the energy equation.
- More explicitly, the weight between units U_{x,i} and U_{y,j} is

  w(x, i; y, j) = -A \delta_{xy} (1 - \delta_{ij}) - B \delta_{ij} (1 - \delta_{xy}) - C - D d_{xy} (\delta_{i,j+1} + \delta_{i,j-1})

  where δ_{ij} is the Kronecker delta (δ_{ij} = 1 if i = j, 0 otherwise) and the position subscripts are taken mod n. (The uniform −C term is the weight counterpart of the global-activation term in the energy.)
- In addition, each unit receives an external input signal I_{x,i} = +C N, where the parameter N is usually taken to be somewhat larger than the number of cities n.

Continuous Hopfield net

Algorithm for the TSP:

Step 0. Initialize activations of all units.
        Initialize Δt to a small value.
Step 1. While stopping condition is false, do Steps 2-6.
  Step 2. Perform Steps 3-5 n² times (n is the number of cities).
    Step 3. Choose a unit at random.
    Step 4. Change the activity of the selected unit:

      u_{x,i}(new) = u_{x,i}(old) + \Delta t \left[ -u_{x,i}(old) - A \sum_{j \ne i} v_{x,j} - B \sum_{y \ne x} v_{y,i} + C \left( N - \sum_x \sum_j v_{x,j} \right) - D \sum_{y \ne x} d_{x,y} (v_{y,i+1} + v_{y,i-1}) \right]

    Step 5. Apply the output function:

      v_{x,i} = 0.5 [1 + \tanh(\alpha u_{x,i})]

  Step 6. Check stopping condition.

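Steps 0-6 can be sketched as follows. This is an illustrative toy, not the lecture's code: the parameters A, B, C, D, α follow the values quoted on the next slide, but the distance matrix, N, Δt, epoch count, and initialization noise are made-up, and a run this short is not guaranteed to settle on a valid tour.

```python
import numpy as np

def hopfield_tank_tsp(dist, A=500., B=500., C=200., D=500., N=15.,
                      alpha=50., dt=1e-5, epochs=1000, seed=0):
    """Steps 0-6: repeatedly pick a random unit, apply the activity
    update, then the output function v = 0.5*(1 + tanh(alpha*u))."""
    n = len(dist)
    rng = np.random.default_rng(seed)
    u = 0.02 * rng.standard_normal((n, n))   # Step 0: small noisy start
    v = 0.5 * (1.0 + np.tanh(alpha * u))
    for _ in range(epochs):                  # Step 1
        for _ in range(n * n):               # Step 2
            x, i = rng.integers(n), rng.integers(n)   # Step 3
            du = (-u[x, i]
                  - A * (v[x].sum() - v[x, i])        # same-row term
                  - B * (v[:, i].sum() - v[x, i])     # same-column term
                  + C * (N - v.sum())                 # total-activation term
                  - D * (dist[x] @ (np.roll(v, -1, axis=1)[:, i]
                                    + np.roll(v, 1, axis=1)[:, i])))
            u[x, i] += dt * du               # Step 4
            v[x, i] = 0.5 * (1.0 + np.tanh(alpha * u[x, i]))  # Step 5
    return v

# Hypothetical 4-city distance matrix (symmetric, zero diagonal).
dist = np.array([[0., 1., 2., 1.],
                 [1., 0., 1., 2.],
                 [2., 1., 0., 1.],
                 [1., 2., 1., 0.]])
v = hopfield_tank_tsp(dist, N=5.0)
print(np.round(v, 2))
```

`np.roll` along axis 1 supplies v_{y,i+1} and v_{y,i-1} with the mod-n wrap-around, and d_{x,x} = 0 makes the y = x term harmless in the distance sum.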

Continuous Hopfield net

Algorithm for the TSP:
- Hopfield and Tank used the following parameter values in their solution of the problem: A = B = 500, C = 200, D = 500, N = 15, α = 50.
- The large value of α gives a very steep sigmoid function, which approximates a step function.
- The large coefficients and a correspondingly small Δt result in very little contribution from the decay term u_{x,i}(old) Δt.
- The initial activities u_{x,i} were chosen so that \sum_x \sum_i v_{x,i} = 10 (the desired total activation for a valid tour). However, some noise was included so that not all units started with the same activity (or output signal).
Suggested Reading.

- L. Fausett, "Fundamentals of Neural Networks", Prentice-Hall, 1994, Chapter 7.