ML U1 Notes

Uploaded by Shrenik Pittala

Perceptron: Understanding Learning Rate (η) and Bias Input

The Perceptron is one of the simplest types of artificial neural networks. To
understand the concepts of learning rate (η) and bias input, let's break it down
step by step.

1. Learning Rate (η)

What is the Learning Rate?
• The learning rate is a small, positive number that controls how much the
weights in a perceptron are updated during training.
• It essentially decides how fast or slow the perceptron "learns" by adjusting
the weights.
Why is it needed?
• If the learning rate is too large, the updates might overshoot the optimal
values, causing instability.
• If the learning rate is too small, learning becomes very slow, and the
perceptron might not converge efficiently.
How is it used?
During training, the perceptron updates each weight using the following rule:

w_i(new) = w_i(old) + η · (t − o) · x_i

where t is the target output, o is the perceptron's actual output, and x_i is
the i-th input. When the output is correct (t = o), the weights are unchanged.
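The weight-update step can be sketched in plain Python as follows. This is an illustrative sketch, not code from the notes: the step activation, the learning rate η = 0.1, and the function names `predict` and `update` are assumptions chosen for the example.

```python
# Minimal perceptron sketch (illustrative; names and eta are assumptions).
def predict(weights, x):
    # Weighted sum of inputs; weights[0] is the bias weight, paired
    # with a fixed input of 1 (see the bias section below).
    total = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    return 1 if total >= 0 else 0          # step activation

def update(weights, x, target, eta=0.1):
    """One update: w_i <- w_i + eta * (target - output) * x_i."""
    output = predict(weights, x)
    error = target - output                # zero when prediction is correct
    weights[0] += eta * error * 1          # bias input is fixed at 1
    for i, xi in enumerate(x, start=1):
        weights[i] += eta * error * xi
    return weights

# One update step on a single misclassified sample:
w = [0.0, 0.0, 0.0]                        # [bias weight, w1, w2]
update(w, x=[1, 1], target=0, eta=0.1)
print(w)                                   # -> [-0.1, -0.1, -0.1]
```

Note how η scales the correction: a larger η moves the weights further per mistake (faster but riskier), a smaller η moves them less (slower but more stable), matching the trade-off described above.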
2. Bias Input
What is Bias?
• The bias input is a constant term added to the perceptron model.
• Its role is to allow the perceptron to shift the decision boundary, making it
more flexible in solving problems.
Why is it needed?
• Without bias, the perceptron's decision boundary (a line, plane, or
hyperplane) always passes through the origin (0, 0, …, 0).
• By introducing bias, the decision boundary can shift up, down, or sideways,
making the perceptron capable of solving more complex problems.
How is it used?
• A bias term b is included in the perceptron formula:

y = f(w1·x1 + w2·x2 + … + wn·xn + b)

where f is the activation function (e.g., step function or sigmoid).


Practical Use
• Bias can be thought of as an input with a fixed value of 1 and its own
weight w0.
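The effect of the bias on the decision boundary can be sketched as below. This is an illustrative example, not from the notes: the weight values and the function name `perceptron_output` are assumptions, chosen so that the biased perceptron computes logical AND.

```python
# Illustrative sketch: the bias as an extra input fixed at 1.
# Prepending 1 to each input vector lets one dot product handle
# both the weights and the bias term w0.
def perceptron_output(w, x):
    # w = [w0, w1, ..., wn]; w0 is the bias weight.
    augmented = [1] + list(x)              # fixed bias input of 1
    total = sum(wi * xi for wi, xi in zip(w, augmented))
    return 1 if total >= 0 else 0          # step activation

# Without bias (w0 = 0) the boundary w1*x1 + w2*x2 = 0 passes
# through the origin; a nonzero w0 shifts it away.
w_no_bias = [0.0, 1.0, 1.0]
w_with_bias = [-1.5, 1.0, 1.0]             # boundary: x1 + x2 = 1.5

print(perceptron_output(w_no_bias, [0, 0]))    # origin on the boundary -> 1
print(perceptron_output(w_with_bias, [0, 0]))  # shifted boundary -> 0
print(perceptron_output(w_with_bias, [1, 1]))  # 1 + 1 - 1.5 = 0.5 -> 1
```

With w0 = -1.5 the unit outputs 1 only when both inputs are 1, i.e. it computes AND, which no bias-free perceptron through the origin can do with these step outputs.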
