ML Exp-7
AIM:
Implement Single Layer Perceptron Learning Algorithm in Python
THEORY:
The perceptron is one of the simplest artificial neural network architectures. It was introduced by
Frank Rosenblatt in 1957. It is the simplest type of feedforward neural network, consisting
of a single layer of input nodes that are fully connected to a layer of output nodes. It can
learn only linearly separable patterns. The perceptron uses a slightly different type of artificial
neuron known as the threshold logic unit (TLU), which was first introduced by McCulloch and
Walter Pitts in the 1940s.
Components of a Perceptron
A perceptron, the basic unit of a neural network, comprises several essential components that
work together to process information.
Input Features: The perceptron takes multiple input features; each input feature represents a
characteristic or attribute of the input data.
Weights: Each input feature is associated with a weight, determining the significance of each
input feature in influencing the perceptron’s output. During training, these weights are
adjusted to learn the optimal values.
Summation Function: The perceptron computes the weighted sum of its inputs, combining
each input with its corresponding weight to produce a single summed value.
Activation Function: The weighted sum is then passed through an activation function. The
perceptron uses the Heaviside step function, which takes the weighted sum as input,
compares it with a threshold, and outputs 0 or 1.
Output: The final output of the perceptron is determined by the activation function’s result.
For example, in binary classification problems, the output might represent a predicted class
(0 or 1).
Bias: A bias term is often included in the perceptron model. The bias allows the model to
make adjustments that are independent of the input. It is an additional parameter that is
learned during training.
Learning Algorithm (Weight Update Rule): During training, the perceptron learns by adjusting
its weights and bias based on a learning algorithm. A common approach is the perceptron
learning algorithm, which updates weights based on the difference between the predicted
output and the true output.
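The update rule described above can be sketched as a single Python step (a minimal illustration; the function name, the learning rate `lr`, and its default value are assumptions, not part of the source):

```python
import numpy as np

def perceptron_update(w, b, x, y_true, y_pred, lr=0.1):
    """One perceptron weight update: shift w and b by the
    prediction error scaled by the learning rate."""
    error = y_true - y_pred   # 0 if the prediction was correct, +/-1 otherwise
    w = w + lr * error * x    # move the weights toward the correct side
    b = b + lr * error        # the bias is updated the same way
    return w, b
```

When the prediction is already correct the error is zero and nothing changes; a misclassified example nudges the decision boundary toward it.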
Implementation Code
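The source does not include the code itself, so the following is a minimal sketch of a single layer perceptron in Python with NumPy, trained on the linearly separable AND function. The class and method names (`Perceptron`, `fit`, `predict`) and the hyperparameter values are assumptions for illustration:

```python
import numpy as np

class Perceptron:
    """Single layer perceptron with a Heaviside step activation."""

    def __init__(self, n_inputs, lr=0.1, epochs=20):
        self.w = np.zeros(n_inputs)  # one weight per input feature
        self.b = 0.0                 # bias term
        self.lr = lr                 # learning rate
        self.epochs = epochs         # passes over the training data

    def activation(self, z):
        # Heaviside step function: 1 if z >= 0, else 0
        return np.where(z >= 0, 1, 0)

    def predict(self, X):
        # Weighted sum of inputs plus bias, then the step activation
        return self.activation(X @ self.w + self.b)

    def fit(self, X, y):
        # Perceptron learning rule: adjust weights and bias by the
        # prediction error on each training example
        for _ in range(self.epochs):
            for xi, yi in zip(X, y):
                error = yi - self.predict(xi)
                self.w += self.lr * error * xi
                self.b += self.lr * error
        return self

# Train on the AND function (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
model = Perceptron(n_inputs=2, lr=1.0).fit(X, y)
print(model.predict(X))  # -> [0 0 0 1]
```

A learning rate of 1.0 keeps the weights at exactly representable values for this small demo; the convergence theorem guarantees the rule terminates on any linearly separable dataset regardless of the learning rate.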