Multi Linear and Discriminant

Uploaded by shubham kumar

Discriminant Analysis

Interpretation:
1. Group Statistics: We consider only those independent variables whose means across
the two groups of the dependent variable differ significantly. Their significance
can be confirmed from the table below.
2. Tests of Equality of Group Means: This table identifies the independent variables
with p-values less than 0.05, i.e., those whose group means differ significantly.
Such variables are statistically significant enough to contribute to the
discriminant model.
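The per-variable equality-of-means test above is a one-way ANOVA F-test. A minimal sketch on hypothetical two-group data (group means and scales are illustrative only):

```python
# Hypothetical two-group data: rows are cases, columns are two candidate
# predictors; the two arrays are the two groups of the dependent variable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
g0 = rng.normal(loc=[5.0, 10.0], scale=1.0, size=(30, 2))   # group 0
g1 = rng.normal(loc=[7.0, 10.2], scale=1.0, size=(30, 2))   # group 1

for j, name in enumerate(["X1", "X2"]):
    # Equality-of-group-means F-test for predictor j
    f, p = stats.f_oneway(g0[:, j], g1[:, j])
    keep = "keep" if p < 0.05 else "drop"
    print(f"{name}: mean0={g0[:, j].mean():.2f} mean1={g1[:, j].mean():.2f} "
          f"F={f:.2f} p={p:.4f} -> {keep}")
```

Variables whose p-value falls below 0.05 are the ones carried into the discriminant model, exactly as the table is read above.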

From the summary of the Canonical Discriminant Functions:

3. Pooled Within-Groups Matrices: a pooled within-groups correlation matrix,
obtained by averaging the separate covariance matrices over all groups before
computing the correlations. If every correlation is below 0.75, there is no
multicollinearity concern; if any pair of variables has a correlation above 0.75,
multicollinearity is present, and only one of the two variables should be retained
in the model.
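A sketch of how the pooled within-groups correlation matrix is built, on hypothetical data: sum the within-group sums of squares and cross-products over all groups, divide by the pooled degrees of freedom, then convert the pooled covariance to correlations and flag any pair above 0.75.

```python
import numpy as np

# Hypothetical data: two groups of 25 cases on 3 variables, with the first
# two variables deliberately correlated.
rng = np.random.default_rng(1)
mix = np.array([[1.0, 0.8, 0.0],
                [0.0, 0.6, 0.0],
                [0.0, 0.0, 1.0]])
groups = [rng.normal(size=(25, 3)) @ mix for _ in range(2)]

# Pool the within-group SSCP matrices, then normalize to correlations.
sscp = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)
df = sum(len(g) - 1 for g in groups)            # pooled degrees of freedom
pooled_cov = sscp / df
d = np.sqrt(np.diag(pooled_cov))
pooled_corr = pooled_cov / np.outer(d, d)       # pooled within-groups correlations

# Flag any pair correlated above 0.75 (possible multicollinearity).
hi = np.argwhere(np.triu(np.abs(pooled_corr) > 0.75, k=1))
print(pooled_corr.round(2))
print("pairs above 0.75:", hi.tolist())
```

Each flagged pair is a candidate for dropping one of the two variables, as described above.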
4. Eigenvalue: The eigenvalues are related to the canonical correlations and
describe how much discriminating ability a function possesses; larger eigenvalues
indicate stronger discrimination. The canonical correlation ranges from 0 to 1 and
should ideally be close to 1. Here the canonical correlation of 0.801 means that
(0.801)^2 ≈ 0.64, i.e., about 64% of the variance in the discriminant scores is
explained by differences between the groups of the dependent variable.

5. Wilks' Lambda: Wilks' lambda measures how well each function separates cases
into groups. It equals the proportion of the total variance in the discriminant
scores not explained by differences among the groups, so smaller values (closer
to 0) indicate greater discriminatory ability. Here, the Wilks' lambda value of _
indicates good separation (the smaller the value, the better). The associated
significance test also shows that the discrimination between groups is highly
significant (p < 0.05 at df _).
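With a single discriminant function, Wilks' lambda follows directly from the eigenvalue as Λ = 1 / (1 + λ), equivalently Λ = 1 - r_c². A sketch using the eigenvalue implied by the canonical correlation of 0.801:

```python
# Wilks' lambda from the eigenvalue for a single discriminant function:
# the proportion of total variance NOT explained by group differences.
lam = 1.790                               # eigenvalue implied by r_c = 0.801
wilks = 1.0 / (1.0 + lam)
print(f"Wilks' lambda = {wilks:.3f}")     # smaller => better discrimination
```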

6. Standardized Canonical Discriminant Function Coefficients: These coefficients
can be used to rank the importance of each variable. A high standardized
coefficient (in absolute value) means that the groups differ strongly on that
variable, or that it contributes heavily to the discriminant function. The
variable with the maximum value _ is the best contributor in this model.
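A hand-rolled sketch of this ranking on hypothetical two-group data: the raw Fisher weights are Sw⁻¹(m1 − m0); multiplying each weight by the pooled within-group standard deviation of its variable gives the standardized coefficients used for ranking.

```python
import numpy as np

# Hypothetical two-group data on 3 variables; the groups differ mainly on
# variable 0, slightly on variable 1, and not at all on variable 2.
rng = np.random.default_rng(2)
g0 = rng.normal([0.0, 0.0, 0.0], 1.0, size=(40, 3))
g1 = rng.normal([1.5, 0.3, 0.0], 1.0, size=(40, 3))

def centered(g):
    return g - g.mean(axis=0)

# Pooled within-groups covariance matrix Sw
sw = (centered(g0).T @ centered(g0)
      + centered(g1).T @ centered(g1)) / (len(g0) + len(g1) - 2)
w = np.linalg.solve(sw, g1.mean(axis=0) - g0.mean(axis=0))   # raw weights
std_w = w * np.sqrt(np.diag(sw))                             # standardized
order = np.argsort(-np.abs(std_w))
print("standardized coefficients:", std_w.round(3))
print("importance ranking (variable indices):", order.tolist())
```

The variable ranked first by absolute standardized coefficient plays the role of the "best contributor" described above.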
Multiple Linear Regression:

Interpretation:

1. ANOVA Table: Used to test whether the overall regression model is statistically
significant. A low p-value (typically < 0.05) indicates that the model is significant.
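The ANOVA table's overall test is F = (SSR/k) / (SSE/(n − k − 1)). A minimal sketch on hypothetical data (coefficients and sample size are illustrative only):

```python
import numpy as np
from scipy import stats

# Hypothetical data: n cases, k predictors, with a real linear signal.
rng = np.random.default_rng(3)
n, k = 50, 2
X = rng.normal(size=(n, k))
y = 2.0 + 1.5 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n)

Xd = np.column_stack([np.ones(n), X])              # design matrix with intercept
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
yhat = Xd @ beta
sse = np.sum((y - yhat) ** 2)                      # residual sum of squares
ssr = np.sum((yhat - y.mean()) ** 2)               # regression sum of squares
F = (ssr / k) / (sse / (n - k - 1))
p = stats.f.sf(F, k, n - k - 1)
print(f"F = {F:.2f}, p = {p:.2e}")                 # p < 0.05 => model significant
```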
2. Model Summary :
 R (Multiple Correlation Coefficient): In multiple regression, R is the
correlation between the observed and the predicted values of the dependent
variable; it ranges from 0 to 1, with higher values indicating a stronger
linear relationship. (With a single predictor, the simple correlation r ranges
from -1 to 1, and its sign gives the direction of the relationship.)
 R-Square (Coefficient of Determination): Represents the proportion of variance
in the dependent variable explained by the independent variables. Higher values
indicate a better fit of the model.
 Adjusted R-Square: Adjusts the R-square value for the number of predictors in
the model, providing a more accurate measure of goodness of fit.
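A sketch of how R, R-square, and adjusted R-square come out of an OLS fit, on hypothetical data:

```python
import numpy as np

# Hypothetical data: n cases, k predictors, one of which is pure noise.
rng = np.random.default_rng(4)
n, k = 60, 3
X = rng.normal(size=(n, k))
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(scale=1.0, size=n)

Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ beta
r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)      # penalizes extra predictors
multiple_r = np.sqrt(r2)                           # multiple R is non-negative
print(f"R = {multiple_r:.3f}, R^2 = {r2:.3f}, adj R^2 = {adj_r2:.3f}")
```

Note that adjusted R-square is always at most R-square, and the gap widens as more predictors are added relative to the sample size.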

3. Coefficients :
 Unstandardized Coefficients (B): Provides the individual regression coefficients
for each predictor variable.
 Standardized Coefficients (Beta): Standardizes the coefficients, allowing for a
comparison of the relative importance of each predictor.
 t-values: Indicate how many standard errors a coefficient estimate is from zero.
Higher absolute t-values suggest greater significance.
 P-values: Test the null hypothesis that the corresponding coefficient is equal to
zero. A low p-value (< 0.05) suggests that the predictor is significantly related
to the dependent variable.
 Variance Inflation Factor (VIF): Used to detect multicollinearity; it
quantifies how strongly each predictor variable is correlated with the other
predictor variables in the model. A value greater than 5 indicates potentially
severe correlation between a given predictor and the others, in which case the
coefficient estimates and p-values in the regression output are likely
unreliable.
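VIF for predictor j is 1 / (1 − R_j²), where R_j² comes from regressing predictor j on the remaining predictors. A sketch on hypothetical data in which one predictor is built as a near-copy of another, so its VIF blows up:

```python
import numpy as np

# Hypothetical predictors: x2 is strongly collinear with x0 by construction.
rng = np.random.default_rng(5)
n = 100
x0 = rng.normal(size=n)
x1 = rng.normal(size=n)
x2 = x0 + rng.normal(scale=0.2, size=n)       # near-copy of x0
X = np.column_stack([x0, x1, x2])

def vif(X, j):
    """VIF of column j: regress it on the other columns, then 1/(1 - R^2)."""
    others = np.delete(X, j, axis=1)
    Xd = np.column_stack([np.ones(len(X)), others])
    beta, *_ = np.linalg.lstsq(Xd, X[:, j], rcond=None)
    resid = X[:, j] - Xd @ beta
    r2 = 1 - resid @ resid / np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return 1.0 / (1.0 - r2)

for j in range(X.shape[1]):
    v = vif(X, j)
    flag = "  <- above 5, investigate" if v > 5 else ""
    print(f"VIF(X{j}) = {v:.2f}{flag}")
```

Here X0 and X2 should both show VIFs well above 5, while the independent X1 stays near 1, matching the rule of thumb above.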
