
Task 3: Perform a least squares regression with an estimation function defined using the direct inverse method.

Tools: RStudio, Python

Aim

To perform least squares regression by solving the normal equations using the direct inverse
method. The goal is to find the optimal coefficients (intercept and slope) that minimize the
sum of squared residuals between the observed and predicted values.
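Concretely, for a design matrix X (a column of ones for the intercept, plus a column of x values) and a response vector y, the direct inverse method computes the coefficient vector by explicitly inverting the normal-equations matrix; this is the same formula the code below implements:

beta = (X^T * X)^(-1) * (X^T * y)

Here beta contains the intercept beta_0 and the slope beta_1.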

Python Code

import numpy as np
import matplotlib.pyplot as plt

# Sample data points (x, y)
x = np.array([1, 2, 3, 4, 5])
y = np.array([1.2, 2.3, 3.1, 4.5, 5.3])

# Step 1: Construct the design matrix X
X = np.vstack([np.ones(len(x)), x]).T  # Add a column of ones for the intercept

# Step 2: Compute the normal equation solution (X^T * X)^(-1) * (X^T * y)
XT = X.T
beta = np.linalg.inv(XT @ X) @ XT @ y

# Step 3: Output the regression coefficients (beta_0, beta_1)
beta_0, beta_1 = beta
print(f"Intercept (beta_0): {beta_0}")
print(f"Slope (beta_1): {beta_1}")

# Step 4: Predict the values of y based on the model
y_pred = X @ beta

# Step 5: Plot the original data and the regression line
plt.scatter(x, y, color='blue', label='Data Points')
plt.plot(x, y_pred, color='red', label=f'Least Squares Fit: y = {beta_0:.2f} + {beta_1:.2f}x')
plt.xlabel('x')
plt.ylabel('y')
plt.legend()
plt.show()

Output
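Running the script on the sample data prints coefficients of approximately the following (trailing floating-point digits may differ slightly):

Intercept (beta_0): 0.16
Slope (beta_1): 1.04

The plot shows the five data points together with the fitted line y = 0.16 + 1.04x.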
Result

This approach demonstrates solving a linear regression problem using the direct inverse
method. It is straightforward for small problems like this one, but explicitly inverting
X^T X becomes computationally expensive and numerically unstable for large or
ill-conditioned datasets.
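For larger problems, a common alternative is to solve the normal equations without forming the explicit inverse. A minimal sketch, reusing the X and y defined above:

# Solve (X^T X) beta = X^T y directly, avoiding an explicit matrix inverse
beta_solve = np.linalg.solve(X.T @ X, X.T @ y)

# Or use NumPy's built-in least squares routine, which is also robust
# when X^T X is ill-conditioned
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

Both return the same coefficients as the direct inverse method on this data, and both are the preferred choice in practice.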
