Lect 7 - Gradient Descent
Terminologies…
• Epoch: a term used in machine learning that indicates the number of passes the learning algorithm has completed over the entire training dataset. Datasets are usually grouped into batches (especially when the amount of data is very large).
• In simple words: one complete ITERATION over the whole training dataset (see the sketch below relating epochs, batches, and iterations).
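As a quick illustration, the relationship between epochs, batches, and iterations can be sketched as follows. The dataset size, batch size, and epoch count are assumed values chosen only for this example, not numbers from the lecture.

```python
# Minimal sketch (assumed illustrative numbers) of how epochs,
# batches, and iterations relate.
num_samples = 1000      # size of the training dataset (assumed)
batch_size = 100        # samples processed per iteration (assumed)
num_epochs = 5          # full passes over the dataset (assumed)

iterations_per_epoch = num_samples // batch_size   # 10 batches per epoch
total_iterations = iterations_per_epoch * num_epochs

print(iterations_per_epoch)  # 10 iterations complete one epoch
print(total_iterations)      # 50 iterations across all 5 epochs
```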
Loss Functions
Imagine you are at the top of a hill and need to climb down. How do you decide which way to walk?
[Figure: trekking down a hill]
Steps:
• Look around to see all the possible paths.
• Reject the ones going up, because those paths would cost more energy and make the task even more difficult.
• Finally, take the path with the steepest slope downhill.
Deciding to go up the slope costs us energy and time, while deciding to go down benefits us. Therefore, going downhill has a negative cost.
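To connect the analogy to the algorithm: gradient descent repeatedly "looks around" by computing the gradient (the uphill direction) and then steps the opposite way. Below is a minimal sketch on the assumed toy function f(x) = x^2, which is not from the lecture; the starting point and learning rate are illustrative values.

```python
# Minimal gradient descent sketch on the assumed toy function f(x) = x**2.
# The gradient f'(x) = 2x points uphill, so we step in the opposite
# direction, scaled by the learning rate.
def gradient(x):
    return 2 * x          # derivative of f(x) = x**2

x = 5.0                   # starting point at the "top of the hill" (assumed)
learning_rate = 0.1       # step size (assumed)

for step in range(50):
    x = x - learning_rate * gradient(x)   # move down the steepest slope

print(x)   # ends up close to 0, the minimum of f(x) = x**2
```

The learning rate in this sketch controls how large each downhill step is, which is the topic of the next slide.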
Learning rate