Implementing Logic Gates Using Neural Networks (Part 2) - by Vedant Kumar - Towards Data Science
Hello everyone!! Before starting with part 2 of implementing logic gates using Neural networks, you would want to go through part 1 first.
From part 1, we had figured out that we have two input neurons or an x vector having values x1 and x2, with 1 being the bias value. The input values, i.e., x1, x2, and 1, are multiplied by their respective weights, that is, W1, W2, and W0. The corresponding values are then fed to the summation neuron, which gives the summed value

Z = W0 + W1*x1 + W2*x2

which is then passed through the sigmoid activation function to produce the output ŷ.

Truth table of the AND gate and the values of weights that make the system act as an AND gate (W0=-3, W1=2, W2=2) and a NAND gate (W0=3, W1=-2, W2=-2), Image by Author
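To make this concrete, here is a minimal Python sketch of such an artificial neuron (the function name neuron is my own; the weighted sum, the sigmoid, and the 0.5 classification threshold are as described above, and the AND weights -3, 2, 2 are the ones derived in the sections that follow):

import math

def neuron(w0, w1, w2, x1, x2):
    # Weighted sum: Z = W0 + W1*x1 + W2*x2
    z = w0 + w1 * x1 + w2 * x2
    # Sigmoid activation
    y_hat = 1 / (1 + math.exp(-z))
    # Classify: above 0.5 -> 1, otherwise 0
    return 1 if y_hat > 0.5 else 0

# With the AND-gate weights derived below, (1,1) is the only input classified as 1
print(neuron(-3, 2, 2, 1, 1))  # -> 1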
(0,0) case
Consider a situation in which the input or the x vector is (0,0). The value of Z, in that case, will be nothing but W0. Now, W0 will have to be less than 0 so that the value of Z after passing through the sigmoid function is less than 0.5, the output ŷ is classified as 0, and the definition of the AND gate is satisfied. If W0 is above 0, then the value after Z has passed through the sigmoid function will be classified as 1, which violates the AND gate condition. Hence, we can say with certainty that W0 has to be a negative value. But what value of W0? Keep reading…

(0,1) case
Now, consider a situation in which the input or the x vector is (0,1). Here the value of Z will be W0+0+W2*1. This, being the input to the sigmoid function, should have a value less than 0 so that the output is less than 0.5 and is classified as 0. Henceforth, W0+W2<0. If we take the value of W0 as -3 (remember, the value of W0 has to be negative) and the value of W2 as +2, the result comes out to be -3+2, that is -1, which satisfies the above inequality and is at par with the condition of the AND gate.
(1,0) case
Similarly, for the (1,0) case, the value of W0 will be -3 and that of W1 can be +2, so that Z = W0+W1 = -1 is negative and the output is classified as 0. Remember, you can take any values of the weights W0, W1, and W2 as long as the inequalities are preserved.
(1,1) case
In this case, the input or the x vector is (1,1). The value of Z, in that case, will be nothing but W0+W1+W2. Now, Z has to be greater than 0 so that the output is 1 and the definition of the AND gate is satisfied. From the previous scenarios, we had found the values of W0, W1, and W2 to be -3, 2, and 2, respectively. Placing these values in the Z equation yields -3+2+2, which is 1 and greater than 0. This will, therefore, be classified as 1 after passing through the sigmoid function.
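As a quick sanity check, the short snippet below (my own illustration) runs the derived weights W0=-3, W1=2, W2=2 through all four input cases and reproduces the AND truth table:

import math

W0, W1, W2 = -3, 2, 2  # weights derived in the four cases above

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    z = W0 + W1 * x1 + W2 * x2
    y_hat = 1 / (1 + math.exp(-z))
    print(f"({x1},{x2}) -> Z={z:+d}, sigmoid={y_hat:.3f}, class={int(y_hat > 0.5)}")

# Printed classes: 0, 0, 0, 1 -- exactly the AND truth table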
A final note on AND and NAND implementation
The line separating the above four points will therefore be the equation W0+W1*x1+W2*x2 = 0, where W0 is -3 and both W1 and W2 are +2. The equation of the line of separation of the four points is therefore x1+x2 = 3/2. The implementation of the NAND gate will therefore be similar, with just the weights changed to W0 equal to 3 and W1 and W2 equal to -2.

Moving on to XOR gate
For the XOR gate, the truth table on the left side of the image below depicts that only if the two inputs are complements of each other will the output be 1. If the inputs are the same (0,0 or 1,1), then the output will be 0. The points, when plotted in the x-y plane on the right, show that they are not linearly separable like in the case of the OR and AND gates (at least in two dimensions).

XOR gate truth table and plotting of values on the x-y plane, Image by Author
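The non-separability claim is easy to check empirically. The brute-force search below (my own illustration, not from the article) tries every combination of small integer weights and finds single-neuron solutions for AND, NAND, and NOR, but none for XOR:

import itertools, math

def classify(w0, w1, w2, x1, x2):
    # Single sigmoid neuron with a 0.5 decision threshold
    z = w0 + w1 * x1 + w2 * x2
    return 1 if 1 / (1 + math.exp(-z)) > 0.5 else 0

GATES = {
    "AND":  {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1},
    "NAND": {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0},
    "NOR":  {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0},
    "XOR":  {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0},
}

for name, truth in GATES.items():
    # Try every (W0, W1, W2) with integer entries in [-3, 3]
    solutions = [
        w for w in itertools.product(range(-3, 4), repeat=3)
        if all(classify(*w, x1, x2) == y for (x1, x2), y in truth.items())
    ]
    print(name, "->", solutions[0] if solutions else "no single-neuron solution")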
Two possible solutions
To solve the above problem of separability, two techniques can be employed: adding non-linear features, also known as the kernel trick, or adding extra layers to the network, as shown below.
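As a quick illustration of the first option (my own sketch, not from the article): adding the non-linear feature x1*x2 as a third input makes XOR separable by a single neuron. The weights below, including W3 on the product term, are hypothetical values chosen to satisfy the truth table:

import math

# Hypothetical weights for a single neuron on the inputs (1, x1, x2, x1*x2)
W0, W1, W2, W3 = -1, 2, 2, -4

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    z = W0 + W1 * x1 + W2 * x2 + W3 * (x1 * x2)  # extra non-linear feature
    y = int(1 / (1 + math.exp(-z)) > 0.5)
    print(f"XOR({x1},{x2}) = {y}")  # prints 0, 1, 1, 0

The article, however, takes the second route: adding a layer.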
The solution to implementing XOR gate, Image by Author

Here we can see that the number of layers has increased from 2 to 3, as we have added a layer where the AND and NOR operations are being computed. The inputs remain the same, with an additional bias input of 1. The table on the right of the image above displays the output for each of the 4 inputs. An interesting thing to notice here is that the total number of weights has increased to 9: a total of 6 weights from the input layer to the 2nd layer and a total of 3 weights from the 2nd layer to the output layer. The 2nd layer is also termed a hidden layer.
Weights of the XOR network
Talking about the weights of the overall network: from the above and the part 1 content, we have deduced the weights for the system to act as an AND gate and as a NOR gate. We will be using those weights for the implementation of the XOR gate. For layer 1, 3 of the total 6 weights would be the same as those of the NOR gate and the remaining 3 would be the same as those of the AND gate. Therefore, the weights for the input to the NOR neuron would be [1,-2,-2], and the weights for the input to the AND neuron would be [-3,2,2]. The weights from layer 2 to the final layer would be the same as those of the NOR gate, i.e., [1,-2,-2].
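Putting the stated weights together, here is a minimal sketch of the full 3-layer XOR network (the function names are mine; the weight vectors [1,-2,-2], [-3,2,2], and [1,-2,-2] are the ones given above):

import math

def neuron(w, x1, x2):
    # w = [bias, w1, w2]; thresholded sigmoid output
    z = w[0] + w[1] * x1 + w[2] * x2
    return int(1 / (1 + math.exp(-z)) > 0.5)

NOR_W, AND_W = [1, -2, -2], [-3, 2, 2]  # hidden layer (6 weights)
OUT_W = [1, -2, -2]                     # output layer (3 weights), NOR again

def xor(x1, x2):
    h1 = neuron(NOR_W, x1, x2)    # NOR of the inputs
    h2 = neuron(AND_W, x1, x2)    # AND of the inputs
    return neuron(OUT_W, h1, h2)  # NOR of the two hidden outputs

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR({x1},{x2}) = {xor(x1, x2)}")  # 0, 1, 1, 0

Note that each hidden neuron reuses the single-gate weights unchanged, which is exactly why the derivations from part 1 and the sections above carry over directly.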
Universal approximation theorem
It states that any continuous function can be approximated to the desired accuracy by a neural network with a single hidden layer, given enough hidden neurons.

Geometrical interpretation
Linear separability of the two classes in 3D, Image by Author