The document outlines two learning algorithms: the Perceptron Learning Algorithm and the Backpropagation Algorithm. The Perceptron is used for binary classification and updates weights based on prediction errors, while Backpropagation is designed for training multi-layer neural networks through forward and backward passes using gradient descent. A comparison highlights the simplicity of the Perceptron for linearly separable data versus the complexity of Backpropagation for more intricate models.

Learning Algorithms

Backpropagation and Perceptron Learning

Moin Mostakim

BRAC University

February 20, 2025

Moin Mostakim (BRAC University) Learning Algorithms February 20, 2025 1/4
Perceptron Learning Algorithm

Algorithm 1 Perceptron Learning Algorithm


Require: Training dataset {(x1, y1), (x2, y2), . . . , (xm, ym)}, where xi ∈ Rn and yi ∈ {−1, 1}, learning rate η, maximum number of iterations T
Ensure: Weight vector w and bias b
1: Initialize w ← 0, b ← 0
2: for t = 1 to T do
3: for each training example (xi, yi) do
4: Compute prediction: ŷi = sign(w · xi + b)
5: if ŷi ≠ yi then
6: Update weights: w ← w + η · yi · xi
7: Update bias: b ← b + η · yi
8: end if
9: end for
10: end for
11: return w, b
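Algorithm 1 can be sketched directly in Python. This is a minimal illustration, assuming NumPy; the function name `perceptron_train` and the toy dataset are ours, not from the slides.

```python
import numpy as np

def perceptron_train(X, y, eta=1.0, T=100):
    """Perceptron learning: X is (m, n), labels y are in {-1, +1}."""
    m, n = X.shape
    w = np.zeros(n)                                   # step 1: w <- 0
    b = 0.0                                           # step 1: b <- 0
    for _ in range(T):                                # step 2: t = 1..T
        for xi, yi in zip(X, y):                      # step 3: each example
            y_hat = 1 if xi @ w + b >= 0 else -1      # step 4: prediction
            if y_hat != yi:                           # step 5: misclassified
                w += eta * yi * xi                    # step 6: weight update
                b += eta * yi                         # step 7: bias update
    return w, b                                       # step 11

# Linearly separable toy data (AND-like labels in {-1, +1})
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.where(X @ w + b >= 0, 1, -1)
```

Because this toy dataset is linearly separable, the perceptron convergence theorem guarantees the inner loop eventually stops making updates, and `preds` matches `y`.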

Backpropagation Algorithm

Algorithm 2 Backpropagation Algorithm


Require: Neural network with layers L1, L2, . . . , Ln, activation function σ, input x, target output y, learning rate η
Ensure: Updated weights and biases in the network
1: Initialize all weights w and biases b randomly
2: Forward pass:
3: for each layer Li from L1 to Ln do
4: Compute activation ai = σ(zi), where zi = wi · ai−1 + bi
5: end for
6: Compute output error δn = ∇an L(y, an) ⊙ σ′(zn)
7: Backward pass:
8: for each layer Li from Ln−1 to L1 do
9: Compute error δi = ((wi+1)ᵀ · δi+1) ⊙ σ′(zi)
10: Compute gradient of weights ∇wi L = δi · (ai−1)ᵀ
11: Compute gradient of biases ∇bi L = δi
12: end for
13: Update weights and biases:
14: for each layer Li from L1 to Ln do
15: Update weights: wi ← wi − η∇wi L
16: Update biases: bi ← bi − η∇bi L
17: end for
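The forward pass, backward pass, and update steps of Algorithm 2 can be sketched for a small concrete case. The slides leave the network and loss generic; this sketch assumes a two-layer network with sigmoid activations and squared-error loss, trained on XOR, with batch-averaged gradients — all of these choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: initialize weights and biases randomly
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros((2, 1))
W2 = rng.normal(0, 1, (1, 2)); b2 = np.zeros((1, 1))

# XOR inputs (one column per example) and targets
X = np.array([[0., 0., 1., 1.], [0., 1., 0., 1.]])
Y = np.array([[0., 1., 1., 0.]])
m = X.shape[1]

eta = 2.0
losses = []
for _ in range(5000):
    # Forward pass (steps 3-5): a_i = sigma(z_i), z_i = W_i a_{i-1} + b_i
    z1 = W1 @ X + b1;  a1 = sigmoid(z1)
    z2 = W2 @ a1 + b2; a2 = sigmoid(z2)
    losses.append(0.5 * np.mean((a2 - Y) ** 2))
    # Output error (step 6): delta_n = dL/da_n * sigma'(z_n); sigma' = a(1-a)
    d2 = (a2 - Y) * a2 * (1 - a2)
    # Backward pass (step 9): delta_i = (W_{i+1}^T delta_{i+1}) * sigma'(z_i)
    d1 = (W2.T @ d2) * a1 * (1 - a1)
    # Gradients (steps 10-11), averaged over the batch
    gW2 = d2 @ a1.T / m; gb2 = d2.mean(axis=1, keepdims=True)
    gW1 = d1 @ X.T / m;  gb1 = d1.mean(axis=1, keepdims=True)
    # Update (steps 15-16): w <- w - eta * grad
    W2 -= eta * gW2; b2 -= eta * gb2
    W1 -= eta * gW1; b1 -= eta * gb1
```

Whether the network actually fits XOR depends on the random initialization; the sketch only demonstrates that the gradient steps drive the loss down.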

Comparison of Algorithms

Backpropagation
Used for training multi-layer neural networks.
Involves forward and backward passes.
Updates weights using gradient descent.
Perceptron Learning
Used for binary classification.
Simpler and faster for linearly separable data.
Updates weights based on prediction errors.
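The contrast can be made concrete: XOR is not linearly separable, so the perceptron rule never reaches zero training errors on it, however long it runs, whereas a multi-layer network trained by backpropagation can represent it. A brief sketch (assuming NumPy; data and variable names are ours):

```python
import numpy as np

# XOR with labels in {-1, +1}: no hyperplane separates the classes
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, 1, 1, -1])

w, b = np.zeros(2), 0.0
for _ in range(1000):                       # many epochs of perceptron updates
    for xi, yi in zip(X, y):
        if (1 if xi @ w + b >= 0 else -1) != yi:
            w += yi * xi                    # perceptron weight update
            b += yi                         # perceptron bias update

# At least one example is still misclassified, no matter the epoch count
errors = int(np.sum(np.where(X @ w + b >= 0, 1, -1) != y))
```

Since no weight vector and bias can separate XOR, `errors` is always at least 1 here; this is exactly the kind of problem that motivates the multi-layer networks trained by backpropagation.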
