22PCOAM16 - Machine Learning - Session 9 Linear Regression

The document outlines the syllabus for a Machine Learning course for III B. Tech - I Semester, detailing topics such as types of machine learning, supervised learning, and linear regression. It includes references to textbooks and covers concepts like regression analysis, the least square method, and the relationship between dependent and independent variables. The document also explains simple and multiple linear regression, including their equations and applications.


Department of Computer Science & Engineering (SB-ET)

III B. Tech - I Semester

MACHINE LEARNING

SUBJECT CODE: 22PCOAM16

Academic Year: 2023-2024
by
Dr. M.Gokilavani
GNITC
22PCOAM16 MACHINE LEARNING
UNIT – I
Syllabus
Learning - Types of Machine Learning - Supervised Learning - The
Brain and the Neuron - Design a Learning System - Perspectives and
Issues in Machine Learning - Concept Learning Task – Concept
Learning as Search - Finding a Maximally Specific Hypothesis -
Version Spaces and the Candidate Elimination Algorithm - Linear
Discriminants: Perceptron - Linear Separability - Linear
Regression.



UNIT - I LECTURE - 09
TEXTBOOK:
• Stephen Marsland, Machine Learning: An Algorithmic Perspective, Second Edition, Chapman and Hall/CRC, Machine Learning and Pattern Recognition Series, 2014.
REFERENCES:
• Tom M. Mitchell, Machine Learning, First Edition, McGraw Hill Education, 2013.
• Ethem Alpaydin, Introduction to Machine Learning, Third Edition (Adaptive Computation and Machine Learning Series), MIT Press, 2014.
No. of Hours Required: 13
UNIT - I Linear Regression LECTURE - 09
• Regression – Definition
• Linear Regression
• Slope and Intercept
• Scatter Graph
• Least Square Method
• Example



UNIT - I Regression LECTURE - 09
• Regression analyzes the relationship between two or more variables.
• It is a technique used for the modelling and analysis of numerical data.
• It exploits the relationship between two or more variables so that we can gain information about one of them from the known values of the others.
• Regression can be used for prediction, estimation, hypothesis testing, and modelling causal relationships.



UNIT - I Why Linear Regression? LECTURE - 09
• Suppose we want to model the dependent variable Y in terms of three predictors, X1, X2, X3.

• Typically we will not have enough data to try and directly estimate the underlying function.
• Therefore, we usually have to assume that it has some restricted form, such as the linear form sketched below.
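Purely as an illustration (not from the original slide), the restricted linear form for the three predictors can be written in the same notation as the later slides:

Y = a + b1X1 + b2X2 + b3X3

where a is the intercept and b1, b2, b3 are the slope coefficients to be estimated from the data.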



UNIT - I Why Linear Regression? LECTURE - 09
• A statistical measure that attempts to determine the strength of the relationship between one dependent variable (usually denoted by Y) and a series of other changing variables (known as independent variables).
• Forecasts the value of a dependent variable (Y) from the values of the independent variables (X1, X2, …, Xn).
• Focuses on the relationship between a dependent variable and one or more independent variables.



UNIT - I Types of Regression LECTURE - 09
There are two types of Regression:
1. Simple Linear Regression
2. Multiple Linear Regression
• Simple Linear Regression: If a single independent variable is used to predict the value of a numerical dependent variable, then such a Linear Regression algorithm is called Simple Linear Regression.
• Multiple Linear Regression: If more than one independent variable is used to predict the value of a numerical dependent variable, then such a Linear Regression algorithm is called Multiple Linear Regression.
UNIT - I Simple Linear Regression LECTURE - 09
The simple regression equation is the mathematical representation of the relationship between an independent variable (X) and a dependent variable (Y):

Y = a + bX

where:
• Y: Represents the predicted value of the dependent variable (e.g., lemonade sales).
• X: Represents the value of the independent variable (e.g., temperature).
• b: Represents the slope of the best-fit line, indicating the rate of change in Y per unit
change in X. A positive slope (b > 0) indicates a direct relationship (increasing X
leads to increasing Y), while a negative slope (b < 0) indicates an inverse
relationship.
• a: Represents the y-intercept, which is the predicted value of Y when X = 0. It's the
point where the regression line intersects the y-axis.
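A purely illustrative example (the numbers are invented, not taken from the slides): if the fitted line for the lemonade data were Y = 10 + 2X, the intercept a = 10 would predict about 10 sales at X = 0, the slope b = 2 would mean roughly 2 extra sales per one-degree rise in temperature, and X = 30 would give a predicted Y = 10 + 2 × 30 = 70 sales.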



UNIT - I Slope and Intercept LECTURE - 09
• Slope: The Slope of a line is the change in Y for a one unit increase
in X.
• Y-intercept (a): The height at which the line crosses the vertical axis, obtained by setting X = 0 in the equation.



UNIT - I Example LECTURE - 09
• Dataset of living area and price of houses in a city

• m = number of training examples


• x's = input variables / features
• y's = output variables / "target" variables
• (x, y) - a single training example
• (xi, yi) - a specific example (the ith training example)
• i is an index into the training set
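For example (notation only; the housing table itself is not reproduced here): (x3, y3) would denote the living area and price of the third house in the training set, and i runs from 1 to m.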



UNIT - I How to use the training set? LECTURE - 09

• Learn a function h(x), so that h(x) is a good predictor for the corresponding value of y.
• h: hypothesis function (see the form sketched below)
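As a sketch, assuming the single-feature case used in this session, the hypothesis takes the linear form h(x) = a + bx, so learning h amounts to choosing the intercept a and slope b that best fit the training set.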



UNIT - I Least Square Method LECTURE - 09
• The Least Square Method is used to derive a generalized linear equation between two variables, when the values of the dependent and independent variables are represented as the y and x coordinates in a 2D coordinate system.
• Initially, the known values are marked on a plot; the plot obtained at this point is called a scatter plot.
• The best-fit line y = a + bx is then chosen to minimize the sum of the squared vertical distances between the plotted points and the line (see the sketch below).
• Example: https://byjus.com/maths/least-square-method/
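As a minimal sketch (not part of the original slides), for the simple linear model y = a + bx the least square estimates of the slope and intercept are:

b = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²
a = ȳ − b·x̄

where x̄ and ȳ are the means of the x and y values. The short Python snippet below applies these formulas to made-up numbers (invented purely for illustration, in the spirit of the lemonade example):

# Least square method: fit y = a + b*x by minimizing the sum of squared errors.
# The data values below are invented purely for illustration.
x = [20, 22, 25, 27, 30, 32]   # independent variable X (e.g. temperature)
y = [40, 44, 50, 55, 60, 65]   # dependent variable Y (e.g. lemonade sales)

n = len(x)
x_mean = sum(x) / n
y_mean = sum(y) / n

# Slope: b = sum((xi - x_mean)(yi - y_mean)) / sum((xi - x_mean)^2)
b = (sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
     / sum((xi - x_mean) ** 2 for xi in x))

# Intercept: a = y_mean - b*x_mean (the fitted line passes through the point of means)
a = y_mean - b * x_mean

print("slope b =", round(b, 3), "intercept a =", round(a, 3))
print("predicted y at x = 28:", round(a + b * 28, 2))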



UNIT - I Least Square Method LECTURE - 09



UNIT - I Scatter Graph LECTURE - 09



UNIT - I LECTURE - 09

Topics to be covered in the next session (Session 10)

Thank you!!!

