
ANN FOR LOGIC GATES

CASE STUDY REPORT


(2023-2024)

UNIVERSITY OF KERALA

M.Sc. Computer Science with Specialization in Artificial Intelligence

Department of Computer Science

Submitted by,
Alfiya S.S (97423607001)

Aravind S (97423607003)

Neethu Sisupal SD (97423607011)

Rohitha B Prasad (97423607013)

Semester 2
INDEX

1. Abstract
2. Introduction
3. Perceptron: a simple neural model
4. Perceptron model design for logic gates
5. Implementation details and explanation
6. Results
7. Conclusion & References


ABSTRACT

This project report delves into the implementation of Artificial Neural Networks
(ANN) for basic logic gates using the Perceptron model, demonstrating a
foundational approach to binary classification tasks. The perceptron is a
simple yet powerful neural unit capable of linearly separating inputs into
binary categories based on a weighted sum and a threshold. Here, we construct
and evaluate Perceptron models that mimic the behavior of logic gates such as
AND, OR, NOT, NAND, and NOR. These logic gates form the basis of more
complex operations in digital logic, making them ideal candidates for
understanding the functionality of perceptrons. Through this report, readers gain
insight into the structure and logic of perceptrons, along with each gate’s unique
weight and bias configuration. The results underscore the effectiveness of
perceptrons in binary operations, laying the groundwork for more complex
neural architectures.
1. INTRODUCTION

Logic gates are fundamental elements in digital circuits. They perform binary
operations such as AND, OR, NOT, NAND, and NOR, play a crucial role in
decision-making processes in computing, and are typically implemented in
hardware. However, with the rise of artificial intelligence and machine learning,
it has become possible to simulate these gates with neural networks, particularly
the perceptron model, a basic form of an Artificial Neural Network (ANN) known
for its ability to model simple, linearly separable functions. ANNs simulate the
functioning of the human brain, using interconnected neurons to perform tasks
such as classification, regression, and pattern recognition.

This project investigates how a perceptron can be configured to emulate these
fundamental gates by adjusting its weights and bias, which are key components of
neural networks. It not only provides insight into the operational mechanics of
perceptrons but also lays the foundation for more advanced neural network
applications, demonstrating the perceptron’s ability to model the logical
operations that form the basis of digital circuits and computational processes.

1.1 ARTIFICIAL NEURAL NETWORK (ANN) OVERVIEW

Artificial Neural Networks are computational models inspired by the way
biological neural systems process information. The basic building block of a
neural network is the neuron. Neurons receive inputs, process them, and produce
an output. A neural network typically consists of layers:

• Input Layer: Receives the input features.
• Hidden Layers: Contain neurons that perform computations and extract features.
• Output Layer: Produces the final result or prediction.

ANNs are widely used in various domains, such as image recognition, natural
language processing, and game playing. However, in this project, we will focus
on a single-layer perceptron, which is a simpler form of ANN suited for linearly
separable tasks like binary classification.
2. PERCEPTRON: A SIMPLE NEURAL MODEL

The Perceptron is one of the simplest types of artificial neural networks, primarily
used for binary classification tasks. It is a supervised learning model, meaning it
learns from labeled data. The model makes predictions by calculating a weighted
sum of the inputs, applying an activation function (usually a step function), and
producing a binary output.

Mathematical Representation of a Perceptron:

y = f(w1·x1 + w2·x2 + ... + wn·xn + b)

Where:

• wi is the weight for the input xi,
• b is the bias term,
• f is the activation function (a step function in the case of the perceptron),
• y is the output.

In the perceptron, the output is 1 if the weighted sum exceeds the threshold;
otherwise, it is 0.

[Figure: a single perceptron, with inputs x1 and x2 passed through the
activation function f to produce the output y]

In this project, we use a single-layer perceptron to simulate logic gates. This
involves selecting appropriate weights and biases that will cause the perceptron
to produce the expected output for each gate under various input conditions.
2.1 PERCEPTRON MODEL DESIGN FOR LOGIC GATES

The key to configuring a perceptron to simulate a particular logic gate lies in
selecting weights and biases that yield the correct output for each possible
input combination. For each gate:

• Weights are assigned to each input based on its role in the logical operation.
• Bias acts as a threshold adjustment that influences the activation of the
perceptron. By setting a specific bias value, we can control when the
perceptron will output a 1 or a 0.

For example, an AND gate requires both inputs to be 1 to output 1, so we
configure the perceptron to activate (output 1) only when both input signals
are 1. Similarly, an OR gate requires only one of its inputs to be 1 to
activate the perceptron, so it is given weights large enough that a single
active input is sufficient to cross the threshold.

These weight and bias configurations enable the perceptron to mimic the behavior
of each logic gate by converting binary inputs into a single binary output.
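
As a quick worked check, take the AND configuration listed in Section 3.2
(weights [1, 1], bias -1) and evaluate the weighted sum for each input pair:

Input (1, 1): 1·1 + 1·1 - 1 = 1 > 0 => output 1
Input (1, 0): 1·1 + 1·0 - 1 = 0 <= 0 => output 0
Input (0, 0): 1·0 + 1·0 - 1 = -1 <= 0 => output 0

These sums reproduce the AND truth table exactly.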
3. IMPLEMENTATION DETAILS AND EXPLANATION

In this project, we implement each logic gate using a Perceptron model, a simple
yet powerful type of artificial neural network (ANN). The task of implementing
logic gates with the perceptron highlights how a basic neural network can
perform logical operations through a simple configuration of weights, bias, and
threshold activation.

3.1 PERCEPTRON FUNCTION

The core component of this implementation is the Perceptron function. This
function takes the inputs, weights, and bias as parameters to calculate a
weighted sum. The perceptron applies a threshold function (also called a step
function) to this weighted sum to generate an output. The threshold function is
defined as:

• If the weighted sum is greater than zero, the output is 1.
• If the weighted sum is less than or equal to zero, the output is 0.

This step function is fundamental to the perceptron, as it determines whether a
given combination of inputs should result in a positive or negative
classification.
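
A minimal Python sketch of this function follows; the report’s original listing
is not reproduced here, so the exact name and signature are assumptions
consistent with the description above.

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias term.
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Step (threshold) function: output 1 if the sum exceeds zero, else 0.
    return 1 if weighted_sum > 0 else 0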

3.2 GATE DEFINITIONS

Each logic gate is defined by its unique set of weights and bias. The weights and
bias determine how the perceptron processes inputs to yield the correct output.
For example:

• AND Gate: The AND gate is represented by the weights [1, 1] and a bias of -1.
The perceptron will output 1 only when both inputs are 1 (i.e., 1 AND 1 = 1).
In all other cases, the output will be 0.
• OR Gate: The OR gate has weights [2, 2] and a bias of -1. This configuration
ensures that the output is 1 whenever at least one input is 1 (i.e., 1 OR 0 = 1
and 1 OR 1 = 1).
• NOT Gate: The NOT gate operates on a single input and flips it. For example,
a NOT gate with weights [-1] and bias +1 will output 1 when the input is 0
and output 0 when the input is 1.
• NAND Gate: The NAND gate, which is the negation of the AND gate, uses
weights [-1, -1] and a bias of 2. This configuration ensures that the output is
0 only when both inputs are 1 (i.e., 1 NAND 1 = 0), and it outputs 1 in all
other cases.
• NOR Gate: Similarly, the NOR gate is the negation of the OR gate and uses
weights [-1, -1] with a bias of +1. It outputs 1 only when both inputs are 0,
and 0 in all other cases.

These weights and biases are essential for mimicking the behavior of each gate,
and by adjusting them, we can control how the perceptron responds to different
combinations of inputs.
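
Collected in code, the configurations above could be stored in a dictionary; a
sketch, where the variable name is an assumption and the values are exactly
those listed in the bullets above:

gates = {
    "AND":  {"weights": [1, 1],   "bias": -1},
    "OR":   {"weights": [2, 2],   "bias": -1},
    "NOT":  {"weights": [-1],     "bias": 1},
    "NAND": {"weights": [-1, -1], "bias": 2},
    "NOR":  {"weights": [-1, -1], "bias": 1},
}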

Input Sets: Two-input and single-input scenarios are handled separately for gates
like NOT, which only requires one input.

Output Display: The display_gate_outputs function iterates over each input
combination, calling the perceptron function and displaying the gate’s output,
verifying each gate’s behavior.
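
A sketch of how this routine might look, assuming the perceptron function and
the gates dictionary from the earlier snippets (the exact signature is an
assumption):

def display_gate_outputs(name, weights, bias, input_sets):
    # Print the gate's output for every input combination.
    print(f"{name} Gate")
    for inputs in input_sets:
        output = perceptron(inputs, weights, bias)
        print(f"Input: {inputs} => Output: {output}")

two_input_sets = [(0, 0), (0, 1), (1, 0), (1, 1)]
single_input_sets = [(0,), (1,)]  # the NOT gate takes a single input

for name, config in gates.items():
    inputs = single_input_sets if name == "NOT" else two_input_sets
    display_gate_outputs(name, config["weights"], config["bias"], inputs)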
4. RESULTS

AND Gate
Input: (0, 0) => Output: 0
Input: (0, 1) => Output: 0
Input: (1, 0) => Output: 0
Input: (1, 1) => Output: 1

OR Gate
Input: (0, 0) => Output: 0
Input: (0, 1) => Output: 1
Input: (1, 0) => Output: 1
Input: (1, 1) => Output: 1

NOT Gate
Input: 0 => Output: 1
Input: 1 => Output: 0

NAND Gate
Input: (0, 0) => Output: 1
Input: (0, 1) => Output: 1
Input: (1, 0) => Output: 1
Input: (1, 1) => Output: 0

NOR Gate
Input: (0, 0) => Output: 1
Input: (0, 1) => Output: 0
Input: (1, 0) => Output: 0
Input: (1, 1) => Output: 0
5. CONCLUSION

This project demonstrates how a simple Perceptron model can effectively
simulate basic logical operations, such as AND, OR, NOT, NAND, and NOR
gates. The provided code shows how different gates can be modeled by defining
specific weights and biases, which dictate the behavior of each gate. By applying
a weighted sum of inputs through the Perceptron function, the code accurately
mimics the logical outputs, providing a clear understanding of how simple neural
networks can replicate fundamental logic functions. The use of different sets of
weights and biases ensures that each gate's characteristics are reflected correctly,
adhering to their truth tables.

The implementation of logic gates using a Perceptron highlights several key
aspects of neural network models in digital systems. First, the simplicity of
the
Perceptron allows for the representation of basic binary operations. Second, each
gate is treated as an individual neural model with its specific configuration of
weights and bias. Third, the code is flexible and can easily be extended to include
more complex gates or modify existing ones. Fourth, while the Perceptron works
well for basic gates, it struggles with non-linear gates such as XOR, which require
multi-layer architectures. Finally, this approach lays the groundwork for further
exploration into neural networks and their application in both logic and
computational tasks, demonstrating the connection between neural networks and
digital logic design.
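
To make the XOR point concrete: XOR is not linearly separable, so no single
perceptron can compute it, but it can be obtained by composing the gate
perceptrons already defined. Below is a sketch that reuses the perceptron
function and the weight/bias values from Section 3.2; the decomposition
XOR(a, b) = AND(OR(a, b), NAND(a, b)) is a standard identity rather than part
of the report’s own code.

def xor(x1, x2):
    or_out = perceptron((x1, x2), [2, 2], -1)            # OR layer
    nand_out = perceptron((x1, x2), [-1, -1], 2)         # NAND layer
    return perceptron((or_out, nand_out), [1, 1], -1)    # AND output layer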

REFERENCES
• https://www.ibm.com/topics/artificial-intelligence
• https://www.geeksforgeeks.org/digital-logic-gates
• Obumneme Stanley Dukor, “Neural Representation of AND, OR, NOT, XOR and
XNOR Logic Gates (Perceptron Algorithm),” Medium.
