Soft-Computing Techniques Lab
B.E (CSE)
VII - SEMESTER
Name: …………………………………………………………………........................
BONAFIDE CERTIFICATE
Certified that this is the bonafide record of work done by Mr./Ms.…............……………………………………….. Reg. No.………………… of ……………………………… in the Soft-Computing Techniques Lab during the odd semester of the academic year 2019 – 2020.
Date:
INDEX
Union, Intersection and Complement of Fuzzy Sets
Ex. No: 1
Date:
Aim :
To write a program in MATLAB to perform union, intersection and complement
operations on two fuzzy sets.
Algorithm:
1. Read the membership values of two fuzzy sets.
2. Perform union operation by using max( ) function.
3. Perform intersection operation by using min( ) function.
4. Perform complement operation by subtracting membership value from 1
5. Display the result.
Program:
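A minimal MATLAB sketch of such a program is given below; the prompts and the example membership values shown in the comments are only assumed, not part of the recorded experiment.

% Union, intersection and complement of two fuzzy sets (sketch)
a = input('Enter the membership values of fuzzy set a: ');   % e.g. [0.3 0.7 0.5]
b = input('Enter the membership values of fuzzy set b: ');   % e.g. [0.6 0.2 0.9]
u = max(a, b);            % union
inter = min(a, b);        % intersection
ca = 1 - a;               % complement of a
cb = 1 - b;               % complement of b
disp('Union of a and b:');
disp(u);
disp('Intersection of a and b:');
disp(inter);
disp('Complement of a:');
disp(ca);
disp('Complement of b:');
disp(cb);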
Result:
Thus, the MATLAB program to perform Union, Intersection and Complement operations
of two Fuzzy sets has been executed successfully and the output is verified.
Implementation of De-Morgan’s Law
Ex. No: 2
Date:
Aim:
To write a program in MATLAB to verify De-Morgan's laws for two fuzzy sets.
Algorithm:
1. Read the membership values of two fuzzy sets.
2. Compute the complement of the union, 1 - max(a,b), and the intersection of the complements, min(1-a, 1-b); these are the LHS and RHS of the first law.
3. Compute the complement of the intersection, 1 - min(a,b), and the union of the complements, max(1-a, 1-b); these are the LHS and RHS of the second law.
4. Display both sides of each law and verify that they are equal.
Program:
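A minimal sketch of such a program is given below; the membership values of a and b are assumed examples (chosen to be consistent with the recorded output that follows).

% Verification of De-Morgan's laws for two fuzzy sets (sketch)
a = [0.3 0.4];                 % assumed example fuzzy set a
b = [0.2 0.5];                 % assumed example fuzzy set b
lhs1 = 1 - max(a, b);          % complement of the union
rhs1 = min(1 - a, 1 - b);      % intersection of the complements
lhs2 = 1 - min(a, b);          % complement of the intersection
rhs2 = max(1 - a, 1 - b);      % union of the complements
disp('LHS'); disp(lhs1);
disp('RHS'); disp(rhs1);
disp('LHS'); disp(lhs2);
disp('RHS'); disp(rhs2);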
Output:
De-Morgan's Law
LHS
0.7000 0.5000
RHS
0.7000 0.5000
LHS
0.8000 0.6000
RHS
0.8000 0.6000
Result:
Thus, the MATLAB program for the implementation of De-Morgan's laws has been executed
successfully and the output is verified.
Plotting Various Membership Functions
Ex. No: 3
Date:
Aim:
To write a program in MATLAB to plot triangular, trapezoidal and bell-shaped
membership functions.
Algorithm:
1. Define the universe of discourse x.
2. Compute the triangular (trimf), trapezoidal (trapmf) and generalised bell (gbellmf) membership values over x.
3. Plot each membership function in a separate subplot.
4. Display the plots.
Program:
x=(0.0:1.0:10.0)';
y1=trimf(x,[3 6 8]);      % triangular MF (assumed example parameters)
subplot(311)
plot(x,y1);
x=(0.0:1.0:10.0)';
y1=trapmf(x,[1 3 5 7]);   % trapezoidal MF (assumed example parameters)
subplot(312)
plot(x,y1);
x=(0.0:0.2:10.0);
y1=gbellmf(x,[3 5 7]);    % generalised bell MF
subplot(313)
plot(x,y1);
Result:
Thus, the MATLAB program for plotting membership functions has been executed
successfully and the output is verified.
Using Fuzzy toolbox to model tip value
Ex. No: 4
Date:
Aim :
To use the MATLAB Fuzzy Logic Toolbox to model the tip value given after a dinner, based on food
quality (not good, satisfying, good or delightful) and service (poor, average or good), where the
tip value ranges from Rs. 10 to Rs. 100.
Procedure:
INPUTS: Quality (not good, satisfying, good, delightful) and Service (poor, average, good)
OUTPUT: Tip (Rs. 10 to Rs. 100)
Use Fuzzy Inference System (FIS) Editor and perform the following
Created rules
Output
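As an alternative to the FIS Editor, the same system can be sketched from the command line with the classic Fuzzy Logic Toolbox functions newfis, addvar, addmf, addrule and evalfis. The membership-function shapes, their parameters and the single rule below are only assumed examples for illustration, not the recorded design.

% Command-line sketch of the tipper FIS (assumed example parameters)
fis = newfis('tipper');
fis = addvar(fis, 'input', 'quality', [0 10]);
fis = addmf(fis, 'input', 1, 'not good',   'trapmf', [0 0 2 4]);
fis = addmf(fis, 'input', 1, 'satisfying', 'trimf',  [2 4 6]);
fis = addmf(fis, 'input', 1, 'good',       'trimf',  [4 6 8]);
fis = addmf(fis, 'input', 1, 'delightful', 'trapmf', [6 8 10 10]);
fis = addvar(fis, 'input', 'service', [0 10]);
fis = addmf(fis, 'input', 2, 'poor',    'trimf', [0 0 5]);
fis = addmf(fis, 'input', 2, 'average', 'trimf', [0 5 10]);
fis = addmf(fis, 'input', 2, 'good',    'trimf', [5 10 10]);
fis = addvar(fis, 'output', 'tip', [10 100]);
fis = addmf(fis, 'output', 1, 'low',  'trimf', [10 10 55]);
fis = addmf(fis, 'output', 1, 'high', 'trimf', [55 100 100]);
% Example rule: if quality is delightful and service is good then tip is high
ruleList = [4 3 2 1 1];            % [quality service tip weight AND]
fis = addrule(fis, ruleList);
tip = evalfis([8 9], fis)          % tip for quality = 8, service = 9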
Result:
Thus, the tip value given after a dinner was modelled using the MATLAB Fuzzy Logic Toolbox and the
output is verified.
Implementation of Fuzzy Inference System (FIS)
Ex. No: 5
Date:
Aim:
To implement a Fuzzy Inference System (FIS) for which the inputs, output and rules are
given as below.
INPUTS: Temperature and Cloud Cover
Temperature: {Freeze, Cool, Warm and Hot}
Cloud Cover: {Sunny, Partly Cloud and Overcast}
OUTPUT: Speed
Speed : {Fast and Slow}
RULES:
1. If cloud cover is Sunny and temperature is warm, then drive Fast
Sunny (Cover) and Warm (Temp) -> Fast (Speed)
2. If cloud cover is cloudy and temperature is cool, then drive Slow
Cloudy (Cover) and Cool (Temp) -> Slow (Speed)
Procedure
1. Go to the command window in MATLAB and type fuzzy.
2. A new Fuzzy Logic Designer window will open.
3. Input / Output Variables
a. Go to the Edit menu and click Add Variable.
b. As per the requirement, create two input variables: Temperature and Cloud
Cover.
c. Create one output variable, Speed.
4. Temperature:
a. Double-click the Temperature input variable in the Fuzzy Logic Designer window.
b. A new window will open; remove all the existing membership functions.
c. Now go to Edit, click Add MFs and add 4 membership functions for the Temperature
variable.
d. Change the following fields as per the data given in the table below.
Input: Temperature {Freezing, Cool, Warm, Hot}
              MF1            MF2           MF3           MF4
Range         [0 110]        [0 110]       [0 110]       [0 110]
Name          Freezing       Cool          Warm          Hot
Type          trapmf         trimf         trimf         trapmf
Parameters    [0 0 30 50]    [30 50 70]    [50 70 90]    [70 90 110 110]
5. Similarly, add the data for the Cloud Cover and Speed variables.
6. Cloud Cover:
7. Speed:
Created rules
Output
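The same FIS can also be sketched from the command line. The Temperature membership functions below are taken from the table above, while the Cloud Cover and Speed ranges, their membership functions and the mapping of "cloudy" to Partly Cloudy are assumed for illustration.

% Command-line sketch of the driving-speed FIS
fis = newfis('drive');
fis = addvar(fis, 'input', 'Temperature', [0 110]);              % as per the table above
fis = addmf(fis, 'input', 1, 'Freezing', 'trapmf', [0 0 30 50]);
fis = addmf(fis, 'input', 1, 'Cool',     'trimf',  [30 50 70]);
fis = addmf(fis, 'input', 1, 'Warm',     'trimf',  [50 70 90]);
fis = addmf(fis, 'input', 1, 'Hot',      'trapmf', [70 90 110 110]);
fis = addvar(fis, 'input', 'CloudCover', [0 100]);               % assumed range and MFs
fis = addmf(fis, 'input', 2, 'Sunny',        'trapmf', [0 0 20 40]);
fis = addmf(fis, 'input', 2, 'PartlyCloudy', 'trimf',  [20 50 80]);
fis = addmf(fis, 'input', 2, 'Overcast',     'trapmf', [60 80 100 100]);
fis = addvar(fis, 'output', 'Speed', [0 100]);                   % assumed range and MFs
fis = addmf(fis, 'output', 1, 'Slow', 'trimf', [0 0 60]);
fis = addmf(fis, 'output', 1, 'Fast', 'trimf', [40 100 100]);
% Rule 1: Sunny and Warm -> Fast; Rule 2: Cloudy and Cool -> Slow
ruleList = [3 1 2 1 1;     % Temperature = Warm, CloudCover = Sunny        -> Fast
            2 2 1 1 1];    % Temperature = Cool, CloudCover = PartlyCloudy -> Slow
fis = addrule(fis, ruleList);
speed = evalfis([75 20], fis)      % example: Temperature = 75, CloudCover = 20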
Result:
Thus a Fuzzy Inference System is implemented for temperature, cloud cover and speed
using the given rules.
Simple Fuzzy Set Operations
Ex. No: 6
Date:
Aim:
To write a MATLAB program to find algebraic sum, algebraic subtraction, algebraic
product, bounded sum, bounded subtraction and bounded product of two fuzzy sets.
Algorithm:
1. Read the values of the two fuzzy sets a and b.
2. Perform the algebraic sum operation by
A + B = (a + b) - (a * b)
3. Perform the algebraic product (a * b), the algebraic difference (algebraic sum of a and the complement of b), the bounded sum min(1, a + b), the bounded difference max(0, a - b) and the bounded product max(0, a + b - 1), all element-wise.
4. Display the results.
Program:
a = input('Enter the fuzzy set a: ');
b = input('Enter the fuzzy set b: ');
c = a + b;
d = a .* b;              % algebraic product
as = c - d;              % algebraic sum: a + b - a.*b
e = 1 - b;               % complement of b
ad = a + e - a .* e;     % algebraic difference: algebraic sum of a and (1 - b)
f = a - b;
g = c - 1;
bs = min(1, c);          % bounded sum
bd = max(0, f);          % bounded difference (subtraction)
bp = max(0, g);          % bounded product
disp(as)
disp(ad)
disp(d)
disp(bs)
disp(bd)
disp(bp)
Output:
[1.0000 0.6000 ]
[1 0.9000]
[0.4000 0.1000]
[1.0000 0.7000]
[0.6000 0.3000]
[0.4000 0]
Result:
Thus, a program to perform simple fuzzy set operations has been executed and successfully
verified.
Using Hopfield network with no self-connection
Ex. No: 7
Date:
Aim:
To write a MATLAB program to store the vector (1 1 1 0) and to find the weight matrix, with
no self-connection, using a discrete Hopfield net, given a test vector with mistakes in its first
and second components, i.e. (0 0 1 0).
Algorithm:
1. Make the initial activations of the net equal to the given binary pattern
x = (1 1 1 0).
2. Let the test pattern tx = (0 0 1 0).
3. Initialize the weight matrix using the formula
w = (2*x' - 1)*(2*x - 1) and set the diagonal elements to zero (no self-connection).
4. Update the activations asynchronously in the order (4 2 1 3) using
yin(i) = tx(i) + y*w(:,i); set y(i) = 1 if yin(i) > 0.
5. Repeat until y equals the stored vector x, then display the converged output.
Program:
clc;
clear;
x=[1 1 1 0];
tx=[0 0 1 0];
w=(2*x'-1)*(2*x-1);
for i=1:4
    w(i,i)=0;
end
con=1;
y=[0 0 1 0];
while con
    up=[4 2 1 3]
    for i=1:4
        yin(up(i))=tx(up(i))+y*w(1:4,up(i));
        if yin(up(i))>0
            y(up(i))=1;
        end
        if y==x
            disp('Convergence has been obtained');
            disp('The converged output');
            disp(y);
            con=0;
        end
    end
end
Output:
up = 4 2 1 3
Convergence has been obtained
The converged output
1 1 1 0
Result:
Thus, the MATLAB program using a Hopfield network with no self-connection has
been executed successfully and the output is verified.
Generation of ANDNOT function using McCulloch-Pitts neural
net
Ex. No: 8
Date:
Aim:
To write a MATLAB program to generate ANDNOT function using McCulloch-Pitts
neural net.
Algorithm:
1. Initialize weights w1,w2 and threshold theta
2. Assign input values
x1=[0 0 1 1]
x2=[0 1 0 1]
3. Assign output Z =[0 0 1 0]
4. Initialize y = [ 0 0 0 0 ]
5. Repeat the following for each input
i) Zin = x1*w1+x2*w2
ii) If Zin > theta set y as 1 else 0
6. If y is not equal to Z update weights and repeat step 5
7. Display weights and threshold value
Program:
clear;
clc;
disp('McCulloch-Pitts Net for ANDNOT function');
w1=input('weight w1=');
w2=input('weight w2=');
theta=input('theta=');
y=[0 0 0 0];
x1=[0 0 1 1];
x2=[0 1 0 1];
Z=[0 0 1 0];
con=1;
while con
    Zin=x1*w1+x2*w2;
    for i=1:4
        if Zin(i)>=theta
            y(i)=1;
        else
            y(i)=0;
        end
    end
    disp('Output of net=');
    disp(y);
    if y==Z
        con=0;
    else
        disp('Net is not learning, enter another set of weights and threshold value');
        w1=input('Weight w1=');
        w2=input('Weight w2=');
        theta=input('theta=');
    end
end
disp('Weights of neuron');
disp(w1);
disp(w2);
disp('Threshold value=');
disp(theta);
Output:
McCulloch-Pitts Net for ANDNOT function
Output of net = 0 0 1 0
Weights of neuron
1
-1
Threshold value = 1
Result:
Thus, the MATLAB program to generate the ANDNOT function using a McCulloch-Pitts neural net has
been executed successfully and the output is verified.
Hebb net to classify two-dimensional input patterns
Ex. No: 9
Date:
Aim:
To write a MATLAB program to find the weight matrix and bias of a Hebb net, in bipolar form, to
classify the two-dimensional input patterns given below with their targets.
'*' indicates a '+' and '.' indicates a '-'
*****   *****
*....   *....
*****   *****
*....   *....
*****   *....
Algorithm:
1. Create a single-layer neural network with 25 input neurons.
2. Set the initial weights and bias to zero.
3. For each training pair, update the weights and bias using
w(new) = w(old) + x(i)*t(i)
b(new) = b(old) + t(i)
Program:
% Hebb Net to classify 2-D input patterns
clear;
clc;
% Input patterns (bipolar form of the letters E and F)
E=[1 1 1 1 1 1 -1 -1 -1 -1 1 1 1 1 1 1 -1 -1 -1 -1 1 1 1 1 1];
F=[1 1 1 1 1 1 -1 -1 -1 -1 1 1 1 1 1 1 -1 -1 -1 -1 1 -1 -1 -1 -1];
x(1,1:25)=E;
x(2,1:25)=F;
w(1:25)=0;
t=[1 -1];
b=0;
for i=1:2
    w=w+x(i,1:25)*t(i);
    b=b+t(i);
end
disp('Weight matrix:');
disp(w);
disp('Final Bias:');
disp(b);
Output:
Weight matrix:
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 2 2 2
Final Bias:
0
Result:
Thus, a MATLAB program to find the weight matrix and bias to classify two-dimensional
input patterns in bipolar form using a Hebb net has been executed and verified successfully.
Perceptron net for AND function with bipolar inputs and targets
Ex. No: 10
Date:
Aim:
To write a MATLAB program to implement AND function with bipolar input and output
using Perceptron net.
Algorithm:
1. Initialize weight and bias to 0
2. Accept learning rate, alpha and threshold, theta
3. For each input calculate yin = b+x(1)*w(1)+x(2)*w(2)
4. Apply activation function
5. If calculated output ≠ target output
i) update weight and bias
ii) Go to step 3
6. Display final weight matrix and bias value
Program:
% Perceptron for AND function with bipolar inputs and targets
clear;
clc;
x=[1 1 -1 -1; 1 -1 1 -1];          % bipolar inputs (assumed, consistent with the AND targets)
t=[1 -1 -1 -1];
w=[0 0];
b=0;
alpha=input('Enter learning rate alpha=');
theta=input('Enter threshold theta=');
con=1;
epoch=0;
while con
    con=0;
    for i=1:4
        yin=b+x(1,i)*w(1)+x(2,i)*w(2);
        if yin>theta
            y=1;
        elseif yin>=-theta
            y=0;
        else
            y=-1;
        end
        if y-t(i)
            con=1;
            for j=1:2
                w(j)=w(j)+alpha*t(i)*x(j,i);
            end
            b=b+alpha*t(i);
        end
    end
    epoch=epoch+1;
end
disp(w);
disp('Final Bias');
disp(b);
Sample Input and Output:
1 1
Final Bias
-1
Result:
Thus, a MATLAB program for a Perceptron net for the AND function with bipolar inputs and
targets has been written and verified successfully.
Finding weight matrix of hetero-associative neural net for
mapping of vectors
Ex. No: 11
Date:
Aim:
To write a MATLAB program to calculate the weights using Hetero-associative neural
net for mapping of vectors.
S1 S2 S3 S4 t1 t2
1 1 0 0 1 0
1 0 1 0 1 0
1 1 1 0 0 1
0 1 1 0 0 1
Algorithm:
1. Enter the input and target vectors x and t.
2. Initialize the weight matrix to zero.
3. Update the weight matrix for each pattern pair using
w(new) = w(old) + x(i)' * t(i)
Program:
%Hetero-associative neural net for mapping input vectors to output vectors
clear;
clc;
x=[1 1 0 0 ; 1 0 1 0 ; 1 1 1 0 ; 0 1 1 0];
t=[1 0 ; 1 0 ; 0 1 ; 0 1];
w=zeros(4,2);
for i=1:4
w=w+x(i,1:4)'*t(i,1:2);
end
disp('Weight matrix:');
disp(w);
Output:
Weight matrix
2 1
1 2
1 2
0 0
Result:
Thus a MATLAB program to calculate the weight matrix using hetero associative neural
net for mapping of vectors has been executed and verified successfully.
Generation of XOR function using back propagation algorithm
Ex. No: 12
Date:
Aim:
To write a MATLAB program to train and test the back propagation neural network for the
generation of XOR function.
Algorithm:
1. Enter the input vector x=[0 0 1 1 ; 0 1 0 1] and target vector y=[0 1 1 0].
2. Create the feed-forward network using the function newff().
3. Set the epoch and learning rate values and train the network using train().
4. Test the trained network using sim().
5. Display the result.
Program:
function [net]=trainBPN(x,y)
[n,i]=size(x);
[m,o]=size(y);
net = newff(minmax(x),[i,10,m],{'tansig','tansig','purelin'},'trainlm');
net.trainParam.lr=0.2;
net.trainParam.epochs=1000;     % assumed epoch limit (the algorithm asks for the epoch value to be set)
net=train(net,x,y);
r=sim(net,x);
return;

function [v]=testBPN(x,net)
v=sim(net,x);
return;
Output:
>>x=[0 0 1 1 ; 0 1 0 1]
>>y=[0 1 1 0]
>>[net]=trainBPN(x,y)
>>testBPN(x,net)
ans=
Result:
Thus a MATLAB program to train and test the back propagation neural network for the
generation of XOR function has been executed successfully and the output is verified.