Multiple Linear Regression and Discriminant Analysis
Interpretation (Discriminant Analysis):
1. Group Statistics: We consider only those independent variables whose means across the two groups of the dependent variable differ significantly. Their significance can be confirmed from the table below.
2. Tests of Equality of Group Means: This test identifies the independent variables whose p-values are less than 0.05 (i.e., whose group means differ significantly). These variables are statistically significant enough to contribute to the discriminating model.
3. Wilks' Lambda: Wilks' lambda is a measure of how well each function separates cases into groups. It equals the proportion of the total variance in the discriminant scores that is not explained by differences among the groups. Smaller values of Wilks' lambda (closer to 0) indicate greater discriminatory ability of the function. Here, the Wilks' lambda value of _ indicates good discrimination, provided it is significant. The p-value of the F-test (p < 0.05 at df _) shows that the discrimination between the groups is highly significant. A minimal computational sketch follows this list.
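To make these checks concrete, here is a minimal Python sketch, using pandas, SciPy and NumPy, of how the group means, the per-variable F-tests and Wilks' lambda can be computed. The grouping variable group, the predictors x1 and x2 and the simulated data are illustrative assumptions, not values from the analysis interpreted above.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Illustrative data (assumed): a binary grouping variable and two predictors.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": np.repeat([0, 1], 50),
    "x1": np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.2, 1.0, 50)]),
    "x2": np.concatenate([rng.normal(5.0, 2.0, 50), rng.normal(5.3, 2.0, 50)]),
})
predictors = ["x1", "x2"]

# 1. Group statistics: mean of each predictor within each group.
print(df.groupby("group")[predictors].mean())

# 2. Tests of equality of group means: a one-way F-test per predictor.
#    A p-value below 0.05 means that predictor's group means differ significantly.
for col in predictors:
    f, p = stats.f_oneway(df.loc[df.group == 0, col], df.loc[df.group == 1, col])
    print(f"{col}: F = {f:.3f}, p = {p:.4f}")

# 3. Wilks' lambda for the full set of predictors: det(W) / det(T),
#    where W is the pooled within-group SSCP matrix and T the total SSCP matrix.
#    Values closer to 0 indicate stronger discrimination between the groups.
X = df[predictors].to_numpy()
T = np.cov(X, rowvar=False) * (len(X) - 1)
W = sum(
    np.cov(X[(df.group == g).to_numpy()], rowvar=False) * ((df.group == g).sum() - 1)
    for g in df.group.unique()
)
wilks_lambda = np.linalg.det(W) / np.linalg.det(T)
print(f"Wilks' lambda = {wilks_lambda:.4f}")
```

These are the same quantities reported in the Group Statistics, Tests of Equality of Group Means and Wilks' Lambda tables discussed above; the hand computation is only meant to show where the numbers come from.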
Interpretation (Multiple Linear Regression):
1. ANOVA Table: It is used to see whether the overall regression model is statistically
significant. A low p-value (typically < 0.05) indicates the model’s significance.
2. Model Summary:
R (Correlation Coefficient): In multiple regression this is the multiple correlation between the observed and predicted values of the dependent variable, ranging from 0 to 1; values closer to 1 indicate a stronger linear relationship.
R-Square (Coefficient of Determination): Represents the proportion of variance in the dependent variable explained by the independent variables. Higher values indicate a better fit of the model.
Adjusted R-Square: Adjusts the R-squared value for the number of predictors in the model, providing a more accurate measure of goodness of fit.
3. Coefficients:
Unstandardized Coefficients (B): The individual regression coefficients for each predictor variable, in the original units of measurement.
Standardized Coefficients (Beta): The coefficients after standardization, allowing a comparison of the relative importance of each predictor.
t-values: Indicate how many standard errors each coefficient is from zero. Higher absolute t-values suggest greater significance.
p-values: Test the null hypothesis that the corresponding coefficient is equal to zero. A low p-value suggests that the predictor is significantly related to the dependent variable.
The variance inflation factor (VIF) is used to detect multicollinearity. It measures the strength of the correlation between a given predictor variable and the other predictor variables in the regression model. A value greater than 5 indicates potentially severe correlation between that predictor and the others; in that case, the coefficient estimates and p-values in the regression output are likely to be unreliable. A worked sketch of these quantities follows this list.
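As a companion to the interpretation above, the following minimal Python sketch, using pandas and statsmodels, shows where each of these quantities comes from. The variables y, x1 and x2 and the simulated data are assumptions for illustration only; in practice the model would be fitted to the actual data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Illustrative data (assumed): one dependent variable and two predictors.
rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=100), "x2": rng.normal(size=100)})
df["y"] = 2.0 + 1.5 * df["x1"] - 0.8 * df["x2"] + rng.normal(size=100)

X = sm.add_constant(df[["x1", "x2"]])   # add the intercept column
model = sm.OLS(df["y"], X).fit()

# 1. ANOVA: overall F-test of the model and its p-value.
print(f"F = {model.fvalue:.3f}, p = {model.f_pvalue:.4f}")

# 2. Model summary: R, R-square and adjusted R-square.
print(f"R = {np.sqrt(model.rsquared):.3f}, "
      f"R2 = {model.rsquared:.3f}, adj. R2 = {model.rsquared_adj:.3f}")

# 3. Coefficients: unstandardized B, t-values and p-values per predictor.
print(model.summary().tables[1])

# Standardized (Beta) coefficients: B * sd(x) / sd(y) for each predictor.
for name in ["x1", "x2"]:
    beta = model.params[name] * df[name].std() / df["y"].std()
    print(f"beta({name}) = {beta:.3f}")

# VIF per predictor (the constant is skipped); values above 5 flag
# potentially severe multicollinearity.
for i, name in enumerate(X.columns):
    if name != "const":
        print(f"VIF({name}) = {variance_inflation_factor(X.values, i):.3f}")
```

The printed values correspond to the ANOVA table (F and its p-value), the Model Summary (R, R-square, adjusted R-square) and the Coefficients table (B, Beta, t-values, p-values and VIF) described above.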