AIML ppt1

This document provides an introduction to linear regression. It defines linear regression as a supervised machine learning algorithm that models the relationship between a dependent variable and one or more independent variables as a linear equation. It represents the linear regression model as a straight line and describes the regression line as fitting the data best. It lists the assumptions of linear regression including independent and identically distributed errors. Finally, it describes the least squares method for estimating the slope and intercept coefficients to find the line of best fit that minimizes the sum of the squared residuals from the observed data points.


JNANAVIKAS INSTITUTE OF TECHNOLOGY

(Affiliated to Visvesvaraya Technological University, Belagavi, Approved by AICTE, New Delhi)

Subject: Artificial Intelligence and Machine Learning

Subject code: 21CS54

Introduction To Linear Regression

SUBMITTED TO: Mrs. Suchitra, Asst. Professor, Dept of CSE
SUBMITTED BY: Apoorva K Gowda, 1JV21CS006, 5th Sem CSE
CONTENTS
Linear Regression
Representing Linear Regression Model
Least Square Method
Linear Regression
• Linear Regression is a supervised machine learning algorithm.
• The simplest mathematical relationship between two variables x and y is a linear relationship. In a cause-and-effect relationship, the independent variable is the cause and the dependent variable is the effect.
• Least squares linear regression is a method for predicting the value of a dependent variable Y based on the value of an independent variable X.
Representing Linear Regression Model
The linear regression model represents the linear relationship between a dependent variable and independent variable(s) as a sloped straight line.

 The sloped straight line that fits the given data best is called the regression line.
 It is also called the best-fit line (line of best fit).
The first-order linear model:

Y = b₀ + b₁X + ε

Y = dependent variable
X = independent variable
b₀ = Y-intercept
b₁ = slope of the line
ε = error variable
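
As a quick illustration (not part of the original slides), the following minimal Python sketch simulates data from this first-order model; the coefficient values, sample size, and noise level are all assumptions chosen only for the example:

```python
import numpy as np

# Hypothetical coefficients, chosen only for illustration.
b0 = 2.0    # Y-intercept
b1 = 0.5    # slope of the line

rng = np.random.default_rng(seed=0)
x = rng.uniform(0, 10, size=50)        # independent variable X
eps = rng.normal(0.0, 1.0, size=50)    # error term: zero mean, constant variance

# First-order linear model: Y = b0 + b1*X + error
y = b0 + b1 * x + eps
```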

The assumptions of linear regression are listed as follows:


1. The observations (y) are random and are mutually independent.

2. The difference between a predicted value and the true value is called an error. The errors are mutually independent and identically distributed, for example normally distributed with zero mean and constant variance (see the sketch after this list).

3. The distribution of the error term is independent of the joint distribution of the explanatory variables.

4. The unknown parameters of the regression models are constants.
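
In practice these error assumptions are usually inspected through the residuals of a fitted line. The sketch below is illustrative only; the data points and the fitted intercept and slope are made-up assumptions, not values from the slides:

```python
import numpy as np

# Made-up observations and an already-fitted line, for illustration only.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.2, 2.8, 4.1, 4.9, 6.2, 6.8])
b0, b1 = 1.3, 0.93                 # hypothetical fitted intercept and slope

residuals = Y - (b0 + b1 * X)      # observed minus predicted values

# Rough checks on the error assumptions: mean near zero, spread roughly constant.
print("residual mean:", residuals.mean())
print("residual std :", residuals.std())
```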


Least Square Method
The least squares estimate of the slope coefficient β₁ of the true regression line is

β₁ = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)²

The least squares estimate of the intercept β₀ of the true regression line is

β₀ = Ȳ − β₁X̄

• Regression generates what is called the "least-squares" regression line.

• The regression line takes the form ŷ = a + b·X, where a and b are both constants, ŷ (pronounced "y-hat") is the predicted value of Y, and X is a specific value of the independent variable.
 It turns out that for any two variables X and Y, there is one equation that produces the "best fit" line linking X to Y.
 The criterion used to decide which line fits best is called the least squares criterion (see the sketch below).
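
To make the formulas concrete, here is a short Python sketch (the data set is a made-up example, not from the slides) that computes the least squares slope and intercept and then predicts ŷ for a new value of X:

```python
import numpy as np

# Small made-up data set used purely for illustration.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])

x_bar, y_bar = X.mean(), Y.mean()

# Least squares estimates:
#   b1 = sum((X - X_bar) * (Y - Y_bar)) / sum((X - X_bar)**2)
#   b0 = Y_bar - b1 * X_bar
b1 = np.sum((X - x_bar) * (Y - y_bar)) / np.sum((X - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

# Predicted value (y-hat) for a specific value of X.
x_new = 6.0
y_hat = b0 + b1 * x_new
print(f"slope b1 = {b1:.3f}, intercept b0 = {b0:.3f}, y-hat at X = {x_new}: {y_hat:.3f}")
```

The same estimates can also be obtained with a library routine such as numpy.polyfit(X, Y, 1); the sketch above simply reproduces that calculation by hand to mirror the formulas on the slide.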
THANK YOU
