
NISHEETA THAKUR

121A1114 D2
Experiment-3

Aim: To implement Multivariate Linear Regression in Python and understand the concept using:

1. One independent variable and multiple dependent variables.
2. Multiple independent variables and multiple dependent variables, utilizing the matrix method for parameter estimation.

Theory:
Multivariate Linear Regression:
Multivariate Linear Regression refers to regression involving multiple dependent variables (responses) modelled simultaneously. This contrasts with ordinary multiple regression, where there is only one dependent variable (though possibly many independent variables).
The equation for multivariate linear regression can be generalized as:
Y = Xβ + ϵ
where:
• Y: n×m matrix of dependent variables (responses), with m being the number of dependent variables.
• X: n×(p+1) matrix of independent variables (including a column of 1s for the intercept).
• β: (p+1)×m matrix of regression coefficients.
• ϵ: n×m matrix of errors (residuals).
The optimal parameters β can be estimated using the normal equation:
β̂ = (XᵀX)⁻¹XᵀY
This method minimizes the sum of squared errors to find the best-fitting line(s).
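This estimate follows from minimizing the sum of squared errors. Writing the objective as
S(β) = tr[(Y − Xβ)ᵀ(Y − Xβ)]
and setting its gradient with respect to β to zero gives
−2Xᵀ(Y − Xβ) = 0  ⟹  XᵀXβ = XᵀY  ⟹  β̂ = (XᵀX)⁻¹XᵀY,
provided XᵀX is invertible.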
Part 1: One Independent Variable and Multiple Dependent Variables
Case Explanation:
In this case, we have one independent variable X and multiple dependent variables Y1, Y2, …, Ym. The goal is to find the regression coefficients for each dependent variable with respect to the single independent variable.
The equation becomes:
Yj = β0j + β1j·X + ϵj, for j = 1, 2, …, m
Using the matrix form, we compute:
β̂ = (XᵀX)⁻¹XᵀY
where X is the n×2 design matrix (a column of 1s and the independent variable), and Y is the n×m matrix of dependent variables.
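A minimal sketch of this case in NumPy (the synthetic data and names such as X_design and beta_hat are purely illustrative) could look as follows:

import numpy as np

# Synthetic data: one independent variable and two dependent variables
np.random.seed(0)
x = np.random.rand(50)                                         # single predictor
Y = np.column_stack([2 + 3 * x + 0.1 * np.random.randn(50),    # Y1
                     1 - 2 * x + 0.1 * np.random.randn(50)])   # Y2

# Design matrix: a column of 1s for the intercept plus the predictor
X_design = np.column_stack([np.ones_like(x), x])               # shape (50, 2)

# Normal equation: beta_hat = (X^T X)^-1 X^T Y  -> shape (2, 2)
beta_hat = np.linalg.inv(X_design.T @ X_design) @ X_design.T @ Y
print(beta_hat)   # each column is [intercept, slope] for one dependent variable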

Part 2: Multiple Independent Variables and Multiple Dependent Variables
Case Explanation:
In this case, we have multiple independent variables and multiple dependent
variables. The goal is to find the optimal coefficients that minimize the residual error
for each dependent variable.
The equation becomes:
Yj = β0j + β1j·X1 + β2j·X2 + … + βpj·Xp + ϵj, for j = 1, 2, …, m
In matrix form, the optimal parameter matrix β̂ is again computed as:
β̂ = (XᵀX)⁻¹XᵀY
Here, both X and Y are matrices with multiple columns representing multiple
predictors and responses, respectively.
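A small sketch for this case (again with arbitrary synthetic data; p = 3 predictors and m = 2 responses are chosen only for illustration):

import numpy as np

# Synthetic data: p = 3 predictors, m = 2 responses
np.random.seed(1)
n, p, m = 100, 3, 2
X = np.random.rand(n, p)
true_beta = np.array([[ 1.0, -1.0],    # intercepts
                      [ 2.0,  0.5],
                      [-0.5,  3.0],
                      [ 0.7, -2.0]])   # shape (p + 1, m)
X_design = np.column_stack([np.ones(n), X])            # n x (p + 1)
Y = X_design @ true_beta + 0.05 * np.random.randn(n, m)

# Normal equation; np.linalg.solve avoids forming an explicit inverse
beta_hat = np.linalg.solve(X_design.T @ X_design, X_design.T @ Y)
print(np.round(beta_hat, 2))                           # should be close to true_beta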
Program:
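As one possible cross-check of the normal-equation results, scikit-learn's LinearRegression accepts a two-dimensional Y and fits all responses at once; a sketch with illustrative data:

import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: three predictors, two responses, common intercept of 1.0
np.random.seed(2)
X = np.random.rand(80, 3)
B = np.array([[ 2.0, -1.0],
              [ 0.5,  3.0],
              [-2.0,  1.5]])                          # true coefficients, p x m
Y = 1.0 + X @ B + 0.1 * np.random.randn(80, 2)

model = LinearRegression().fit(X, Y)                  # multi-output fit
print("Intercepts:", np.round(model.intercept_, 2))   # expect values near 1.0
print("Coefficients:\n", np.round(model.coef_.T, 2))  # expect values near B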

Output:

Conclusion:

• Multivariate Linear Regression allows us to model the relationships between multiple independent and multiple dependent variables.
• Using the matrix method, we can efficiently compute the optimal parameters by solving the normal equation.
• One independent variable with multiple dependent variables captures simpler relationships, while multiple independent and dependent variables provide a more comprehensive model.
