UNIVERSITY OF KERALA
Submitted by,
Alfiya S.S (97423607001)
Aravind S (97423607003)
Semester 2
INDEX
1. Abstract
2. Introduction
3. Perceptron: A Simple Neural Model
4. Implementation Details and Explanation
5. Gate Definitions
6. Results
7. Conclusion
ABSTRACT
This project report delves into the implementation of Artificial Neural Networks
(ANN) for basic logic gates using the Perceptron model, demonstrating a
foundational approach to binary classification tasks. The perceptron is a
simple yet powerful neural unit capable of linearly separating inputs into
binary categories based on a weighted sum and a threshold. Here, we construct
and evaluate Perceptron models to mimic the behavior of the logic gates AND,
OR, NOT, NAND, and NOR. These gates form the basis of more complex
operations in digital logic, making them ideal candidates for understanding
how perceptrons work. Through this report, readers gain insight into the
structure and logic of perceptrons and into the unique weight and bias
configuration of each gate. The results underscore the effectiveness of
perceptrons in binary operations, laying the groundwork for more complex
neural architectures.
1. INTRODUCTION
Artificial Neural Networks (ANNs) are widely used in various domains, such as image recognition, natural
language processing, and game playing. However, in this project, we will focus
on a single-layer perceptron, which is a simpler form of ANN suited for linearly
separable tasks like binary classification.
2. PERCEPTRON: A SIMPLE NEURAL MODEL
The Perceptron is one of the simplest types of artificial neural networks, primarily
used for binary classification tasks. It is a supervised learning model, meaning it
learns from labeled data. The model makes predictions by calculating a weighted
sum of the inputs, applying an activation function (usually a step function), and
producing a binary output.
The prediction can be written as:

y = f(w1*x1 + w2*x2 + b)

where f is the step activation function, taken here as f(z) = 1 if z > 0 and
f(z) = 0 otherwise, the convention under which the weight and bias values
given in Section 3 reproduce the correct gate outputs.

[Figure: a perceptron with inputs X1 and X2 feeding an activation unit f,
which produces the output y]

Where:
• Weights are assigned to each input based on its role in the logical operation.
• Bias acts as a threshold adjustment that influences the activation of the
perceptron. By setting a specific bias value, we can control when the
perceptron will output a 1 or a 0.
These weight and bias configurations enable the perceptron to mimic the behavior
of each logic gate by converting binary inputs into a single binary output.
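
As a minimal sketch of this computation in Python (the function name is ours,
and the strictly-positive step threshold is an assumption chosen to match the
weight and bias values given in Section 3):

def perceptron(inputs, weights, bias):
    """A single perceptron: weighted sum of the inputs plus a bias,
    passed through a step activation function."""
    # Weighted sum: w1*x1 + w2*x2 + ... + bias
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Step activation: output 1 only when the sum is strictly positive
    return 1 if z > 0 else 0

For example, perceptron([1, 1], [1, 1], -1) computes 1*1 + 1*1 - 1 = 1 > 0 and
returns 1, which is the AND gate's output for the input pair (1, 1).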
3. IMPLEMENTATION DETAILS AND EXPLANATION
In this project, we implement each logic gate using a Perceptron model, a simple
yet powerful type of artificial neural network (ANN). The task of implementing
logic gates using the perceptron highlights how a basic neural network can
perform logical operations through a simple configuration of weights, bias, and
threshold activation.
3.2 GATE DEFINITIONS
Each logic gate is defined by its unique set of weights and bias. The weights and
bias determine how the perceptron processes inputs to yield the correct output.
For example:
• AND Gate: The AND gate is represented by the weights [1, 1] and a bias of
-1. The perceptron will output 1 only when both inputs are 1 (i.e., 1 AND 1 =
1). In all other cases, the output will be 0.
• OR Gate: The OR gate has weights [2, 2] and a bias of -1. This configuration
ensures that the output is 1 whenever at least one input is 1 (i.e., 1 OR 0 = 1
and 1 OR 1 = 1).
• NOT Gate: The NOT gate operates on a single input and flips it. For example,
a NOT gate with weights [-1] and bias +1 will output 1 when the input is 0
and output 0 when the input is 1.
• NAND Gate: The NAND gate, which is the negation of the AND gate, uses
weights [-1, -1] and a bias of 2. This configuration ensures that the output is
0 only when both inputs are 1 (i.e., 1 NAND 1 = 0), and it outputs 1 in all
other cases.
• NOR Gate: Similarly, the NOR gate is the negation of the OR gate and uses
weights [-1, -1] with a bias of +1. It outputs 1 only when both inputs are 0,
and 0 in all other cases.
These weights and biases are essential for mimicking the behavior of each gate,
and by adjusting them, we can control how the perceptron responds to different
combinations of inputs.
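
To make these configurations concrete, they can be collected into a single
lookup table; the Python sketch below uses exactly the weights and biases
listed above (the GATES name and dictionary layout are ours):

# Weights and bias for each gate, taken from the definitions above
GATES = {
    "AND":  {"weights": [1, 1],   "bias": -1},
    "OR":   {"weights": [2, 2],   "bias": -1},
    "NOT":  {"weights": [-1],     "bias": 1},
    "NAND": {"weights": [-1, -1], "bias": 2},
    "NOR":  {"weights": [-1, -1], "bias": 1},
}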
Input Sets: Two-input and single-input scenarios are handled separately for gates
like NOT, which only requires one input.
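
A small driver can then handle both cases uniformly by reading the number of
inputs off the weight vector; the sketch below (reusing the perceptron
function and the GATES table from the earlier sketches) prints the truth
tables reported in the next section:

from itertools import product

def print_truth_table(name):
    gate = GATES[name]
    n_inputs = len(gate["weights"])  # 1 for NOT, 2 for every other gate
    print(f"{name} Gate")
    # Enumerate every combination of binary inputs
    for inputs in product([0, 1], repeat=n_inputs):
        output = perceptron(inputs, gate["weights"], gate["bias"])
        shown = inputs[0] if n_inputs == 1 else inputs
        print(f"Input: {shown} => Output: {output}")

for name in ["AND", "OR", "NOT", "NAND", "NOR"]:
    print_truth_table(name)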
4. RESULTS

AND Gate
Input: (0, 0) => Output: 0
Input: (0, 1) => Output: 0
Input: (1, 0) => Output: 0
Input: (1, 1) => Output: 1
OR Gate
Input: (0, 0) => Output: 0
Input: (0, 1) => Output: 1
Input: (1, 0) => Output: 1
Input: (1, 1) => Output: 1
NOT Gate
Input: 0 => Output: 1
Input: 1 => Output: 0
NAND Gate
Input: (0, 0) => Output: 1
Input: (0, 1) => Output: 1
Input: (1, 0) => Output: 1
Input: (1, 1) => Output: 0
NOR Gate
Input: (0, 0) => Output: 1
Input: (0, 1) => Output: 0
Input: (1, 0) => Output: 0
Input: (1, 1) => Output: 0
5. CONCLUSION
This project demonstrated that a single-layer perceptron, configured with
suitable weights and a bias, can reproduce the behavior of the basic logic
gates AND, OR, NOT, NAND, and NOR, as confirmed by the truth tables in the
results above. Each of these gates corresponds to a linearly separable
classification of its binary inputs, which is exactly the class of problems a
perceptron can solve. These results underscore the effectiveness of the
perceptron in binary operations and lay the groundwork for studying more
complex neural architectures.