Gradient Descent Algorithm and Its Variants - GeeksforGeeks
Variables used:
Let m be the number of training examples.
Let n be the number of features.
Let b be the mini-batch size (used in mini-batch gradient descent).
Note: if b == m, then mini-batch gradient descent behaves exactly like batch gradient descent.
Batch Gradient Descent:
Repeat {
    θj = θj − (learning rate / m) * Σ(i=1 to m) (hθ(x(i)) − y(i)) xj(i)
    For every j = 0 … n
}
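The batch update above can be sketched in NumPy. The synthetic linear-regression data, learning rate, and iteration count below are illustrative assumptions, not values from the article:

```python
import numpy as np

# Hypothetical data: m examples, n features, plus a bias column x0 = 1.
rng = np.random.default_rng(0)
m, n = 100, 2
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, n))])
true_theta = np.array([4.0, 3.0, -2.0])
y = X @ true_theta + rng.normal(scale=0.1, size=m)

def batch_gradient_descent(X, y, lr=0.1, iters=500):
    """One update per full pass: theta_j -= (lr/m) * sum_i (h(x_i) - y_i) * x_ij."""
    m, n1 = X.shape
    theta = np.zeros(n1)
    for _ in range(iters):
        error = X @ theta - y              # h_theta(x(i)) - y(i) for all m examples
        theta -= (lr / m) * (X.T @ error)  # simultaneous update of every theta_j
    return theta

theta = batch_gradient_descent(X, y)
```

Because every update averages the gradient over all m examples, each step is expensive but the descent direction is exact.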
Stochastic Gradient Descent (SGD):
In SGD, only one training example is used per parameter update. Hence,
Let (x(i), y(i)) be a single training example. Its cost is
Cost(θ, (x(i), y(i))) = (1/2) (hθ(x(i)) − y(i))²

Repeat {
    For i = 1 to m {
        θj = θj − (learning rate) * (hθ(x(i)) − y(i)) xj(i)
        For every j = 0 … n
    }
}
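The per-example loop can be sketched as follows; again the data, learning rate, and epoch count are illustrative assumptions. Shuffling the visit order each epoch is a common practical refinement of the in-order pseudocode:

```python
import numpy as np

# Hypothetical data: m examples with a bias column x0 = 1.
rng = np.random.default_rng(1)
m, n = 100, 2
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, n))])
true_theta = np.array([4.0, 3.0, -2.0])
y = X @ true_theta + rng.normal(scale=0.1, size=m)

def sgd(X, y, lr=0.05, epochs=50):
    """Update theta after every single example, not after a full pass."""
    m, n1 = X.shape
    theta = np.zeros(n1)
    for _ in range(epochs):
        for i in rng.permutation(m):      # shuffled visit order; pseudocode uses i = 1..m
            error = X[i] @ theta - y[i]   # h_theta(x(i)) - y(i) for one example
            theta -= lr * error * X[i]    # update every theta_j from this one example
    return theta

theta = sgd(X, y)
```

With a constant learning rate, SGD fluctuates around the minimum rather than settling exactly on it; decaying the rate over epochs is the usual remedy.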
Mini-Batch Gradient Descent (here with b = 10 and m = 100):
Repeat {
    For i = 1, 11, 21, ….., 91 {
        θj = θj − (learning rate / b) * Σ(k=i to i+9) (hθ(x(k)) − y(k)) xj(k)
        For every j = 0 … n
    }
}
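A sketch of the mini-batch loop, using b = 10 and m = 100 to match the stride i = 1, 11, 21, …, 91 above (0-based indices in code); the data and hyperparameters are illustrative assumptions:

```python
import numpy as np

# Hypothetical data: m examples with a bias column x0 = 1.
rng = np.random.default_rng(2)
m, n = 100, 2
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, n))])
true_theta = np.array([4.0, 3.0, -2.0])
y = X @ true_theta + rng.normal(scale=0.1, size=m)

def mini_batch_gd(X, y, b=10, lr=0.1, epochs=100):
    """Average the gradient over b examples per update (here b=10, so 10 updates/epoch)."""
    m, n1 = X.shape
    theta = np.zeros(n1)
    for _ in range(epochs):
        for start in range(0, m, b):               # 0-based analogue of i = 1, 11, ..., 91
            Xb, yb = X[start:start + b], y[start:start + b]
            error = Xb @ theta - yb                # h_theta(x(k)) - y(k) over the batch
            theta -= (lr / b) * (Xb.T @ error)     # averaged update of every theta_j
    return theta

theta = mini_batch_gd(X, y)
```

Mini-batch gradient descent trades off between the two extremes: updates are cheaper than batch gradient descent and less noisy than SGD, and the batched matrix products vectorize well.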