ML.3-Regression Techniques
Regression Techniques
Machine Learning
CONTENTS
•Linear Regression
•Linear Problems
•Gradient Descent
Machine Learning 2
Linear Regression
•Linear Problems
• Linear Regression
• Nonlinear Regression
• Derivatives and Finding Extreme Points
•Gradient Descent
Linear and Nonlinear Functions
• A linear function is one whose output changes at a constant rate
with the input, so its graph is a straight line.
• Example: Linear Regression
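As a minimal sketch of the linear-regression example above (the data, coefficients, and variable names below are my own illustration, not from the slides), we can fit a straight line y ≈ w·x + b to noisy points with the closed-form least-squares solution:

```python
# Fit y ≈ w*x + b to noisy synthetic data (illustrative example).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
# True underlying line: y = 2x + 1, plus a little Gaussian noise
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=x.shape)

# Design matrix with a bias column; solve the least-squares problem
X = np.column_stack([x, np.ones_like(x)])
(w, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(w, b)  # close to the true values 2.0 and 1.0
```

The recovered slope and intercept land near the true values because least squares minimizes the sum of squared errors over all points.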
Linear and Nonlinear Functions
Activation Functions
Advantages
Linear and Nonlinear Functions
• Sigmoid Activation Function:
• Output range: [0, 1]
• Not zero-centered
• Requires an exponential operation (computationally costly)
• Maxout:
• Piecewise linear, so it keeps the benefits of linearity
• It never saturates or dies
• But it is expensive, as it doubles the number of parameters
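The two activations above can be sketched as follows (a minimal illustration; the shapes and parameter names `W1`, `b1`, `W2`, `b2` are my own, and "doubles the parameters" shows up as the two weight sets maxout needs):

```python
import numpy as np

def sigmoid(z):
    # Output range (0, 1); not zero-centered; uses an exponential
    return 1.0 / (1.0 + np.exp(-z))

def maxout(x, W1, b1, W2, b2):
    # Maxout takes the elementwise max of two linear pieces, so it is
    # piecewise linear and never saturates, but it carries two sets of
    # parameters (W1/b1 and W2/b2) where a sigmoid layer needs one.
    return np.maximum(x @ W1 + b1, x @ W2 + b2)
```

For example, `sigmoid(0.0)` is exactly 0.5, and with `W1 = I`, `W2 = -I` maxout reduces to the absolute value of each input.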
Partial derivative
Gradient Descent
• A gradient is a vector that stores the partial derivatives of
multivariable functions. It helps us calculate the slope at a
specific point on a curve for functions with multiple
independent variables.
• The gradient vector is orthogonal to the tangent hyperplane of
the cost surface. You then take the opposite of this vector
(hence “descent”) and multiply it by the learning rate lr.
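The gradient as a vector of partial derivatives can be checked numerically. Below is a small sketch using central differences on a toy function f(x, y) = x² + 3y (the function and helper name are my own illustration):

```python
import numpy as np

def f(p):
    # Toy multivariable function: f(x, y) = x**2 + 3*y
    x, y = p
    return x**2 + 3*y

def numerical_gradient(f, p, eps=1e-6):
    # Approximate each partial derivative df/dp_i with a central difference
    grad = np.zeros_like(p)
    for i in range(len(p)):
        step = np.zeros_like(p)
        step[i] = eps
        grad[i] = (f(p + step) - f(p - step)) / (2 * eps)
    return grad

g = numerical_gradient(f, np.array([2.0, 1.0]))
print(g)  # approximately [4.0, 3.0], since df/dx = 2x and df/dy = 3
```

At the point (2, 1) the analytic partials are 2·2 = 4 and 3, matching the numerical result.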
Gradient Descent
• The projection of this vector on the parameter space (here:
the x-axis) gives you the new (updated) parameter. Then you
repeat this operation several times to go down the cost (error)
function, with the goal of reaching a value for w where the
cost function is minimal.
• The parameter is thus updated as follows at each step:
parameter <-- parameter - lr*gradient
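The update rule above can be run directly. Here is a minimal sketch minimizing the toy cost J(w) = (w − 3)², whose gradient is 2(w − 3) and whose minimum sits at w = 3 (the cost function is my own example):

```python
lr = 0.1   # learning rate
w = 0.0    # initial parameter

for _ in range(100):
    gradient = 2 * (w - 3)   # derivative of the cost at the current w
    w = w - lr * gradient    # parameter <-- parameter - lr*gradient

print(w)  # converges toward 3.0, the minimum of the cost
```

Each step moves w a fraction `lr` of the way along the negative gradient, so the error shrinks geometrically toward the minimum.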
Gradient Descent
Gradient Descent
SUMMARY
•Linear Regression
•Linear Problems
•Gradient Descent
Humanity – Service – Liberation