
Department of Computer Science and Engineering, AISC Lab [CS3131]

Program 7
Date: 17th September 2024
AIM:
Write a program to implement a Perceptron neural network for classification/regression.

DESCRIPTION:
The Perceptron is one of the simplest artificial neural networks. It is primarily used for binary classification, where it learns to separate data into two distinct classes by adjusting its weights during training to minimize the error between predicted and actual outputs.
In this program, we implement a basic Perceptron network for both classification and regression tasks. The Perceptron consists of the following key components:
1. Inputs: The features of the dataset.
2. Weights: Parameters that the model learns during the training phase.
3. Bias: A constant term added to the weighted sum of inputs, allowing the model to shift the decision boundary.
4. Activation Function: For classification, a step function is typically used to decide the output (0 or 1); for regression, a linear activation function can be used to predict continuous values (see the sketch after this list).
5. Learning Rule: The model updates the weights based on the error (the difference between predicted and actual values) in each iteration.
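For illustration, here is a minimal sketch of the two activation choices named in point 4; the function names step_activation and linear_activation and the example numbers are assumptions for this sketch, not part of the program below.

import numpy as np

def step_activation(z):
    # Classification: output 1 if the weighted sum is non-negative, else 0
    return 1 if z >= 0 else 0

def linear_activation(z):
    # Regression: the weighted sum itself is the prediction
    return z

x = np.array([1.0, 0.5])   # example inputs
w = np.array([0.4, -0.2])  # example weights
b = 0.1                    # example bias
z = np.dot(x, w) + b       # weighted sum = 0.4 - 0.1 + 0.1 = 0.4
print(step_activation(z))    # 1 (class label)
print(linear_activation(z))  # 0.4 (continuous value)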

ALGORITHM:
1. Initialization
   - Initialize the weights (W) and bias (b) to small random values or zero.
   - Set the learning rate (η) and the number of epochs.
2. Training
   For each epoch, repeat the following steps for every input sample X[i]:
   - Calculate the weighted sum of inputs:
     output = X[i] · W + b
   - Apply the activation function: a step function for classification, or a linear activation for regression.
   - Calculate the error between the predicted output and the actual target y[i]:
     error = y[i] - predicted output
   - Update the weights and bias based on the error:
     W = W + η * error * X[i]
     b = b + η * error
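As a small worked illustration (not part of the assignment): starting from W = [0, 0] and b = 0 with η = 0.1, suppose the sample is X[i] = [1, 0] with target y[i] = 0. The weighted sum is 0, the step function outputs 1, so error = 0 - 1 = -1, and the update gives W = [0, 0] + 0.1 · (-1) · [1, 0] = [-0.1, 0] and b = 0 + 0.1 · (-1) = -0.1.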
3. Testing
After training, test the model on unseen data by computing its predictions and comparing them to the actual values:
- For classification, measure accuracy.
- For regression, calculate the Mean Squared Error (MSE).
A short evaluation sketch is given after this list.
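As a minimal evaluation sketch (the array names y_true and y_pred are illustrative assumptions, not taken from the program below), both metrics can be computed with NumPy:

import numpy as np

def accuracy(y_true, y_pred):
    # Fraction of samples where the predicted class equals the target class
    return np.mean(np.array(y_true) == np.array(y_pred))

def mean_squared_error(y_true, y_pred):
    # Average of the squared differences between targets and predictions
    return np.mean((np.array(y_true) - np.array(y_pred)) ** 2)

print(accuracy([0, 0, 0, 1], [0, 0, 1, 1]))        # 0.75
print(mean_squared_error([1.0, 2.0], [0.5, 2.5]))  # 0.25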

SOURCE CODE:
import numpy as np

class Perceptron:
    def __init__(self, input_size, learning_rate=0.01, epochs=1000):
        self.weights = np.zeros(input_size)  # Initialize weights to zero
        self.bias = 0                        # Initialize bias term to zero
        self.learning_rate = learning_rate
        self.epochs = epochs

    def step_function(self, x):
        # Activation function - step function
        return 1 if x >= 0 else 0

    def predict(self, X):
        # Linear combination of inputs and weights
        linear_output = np.dot(X, self.weights) + self.bias
        # Apply step function to the output
        return self.step_function(linear_output)

    def fit(self, X, y):
        # Training the Perceptron
        for epoch in range(self.epochs):
            for i in range(len(X)):
                # Make prediction for each sample
                prediction = self.predict(X[i])
                # Calculate the error (difference between actual and predicted)
                error = y[i] - prediction
                # Update weights and bias using the Perceptron learning rule
                self.weights += self.learning_rate * error * X[i]
                self.bias += self.learning_rate * error

    def score(self, X, y):
        # Calculate the accuracy of the model
        correct_predictions = 0
        for i in range(len(X)):
            if self.predict(X[i]) == y[i]:
                correct_predictions += 1
        return correct_predictions / len(X)


# Example: Using the Perceptron for binary classification on a simple dataset
if __name__ == "__main__":
    # Example dataset (AND logic gate)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # Input features
    y = np.array([0, 0, 0, 1])                      # Labels (output of AND)

    # Create Perceptron instance with 2 input features (size of X[0])
    perceptron = Perceptron(input_size=2, learning_rate=0.1, epochs=1000)

    # Train the model
    perceptron.fit(X, y)

    # Test the model
    print("Predictions after training:")
    for i in range(len(X)):
        print(f"Input: {X[i]}, Predicted: {perceptron.predict(X[i])}, Actual: {y[i]}")

    # Accuracy score
    accuracy = perceptron.score(X, y)
    print(f"Accuracy: {accuracy * 100}%")

OUTPUT:
