Classification
1. Logistic Regression
Purpose: To predict the probability of a binary outcome based on one or more predictor
variables.
Strengths:
Simplicity: Fast to train and easy to implement.
Interpretability: Coefficients directly indicate how each predictor shifts the log-odds.
Probabilistic Output: Produces class probabilities, not just hard labels.
Weaknesses:
Linearity: Assumes a linear relationship between predictors and the log-odds of the
outcome.
Limited to Binary Classification: Primarily used for binary classification, though
extensions exist for multi-class classification.
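The notes do not name an implementation; as a minimal sketch, scikit-learn's LogisticRegression (an assumed choice here, with illustrative synthetic data) shows the probability output in practice:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary dataset (parameters are illustrative, not from the notes)
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression()         # linear model on the log-odds of class 1
clf.fit(X_train, y_train)
proba = clf.predict_proba(X_test)  # per-class probabilities, rows sum to 1
acc = clf.score(X_test, y_test)    # mean accuracy on held-out data
```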
2. Decision Trees
Purpose: To create a model that predicts the value of a target variable by learning simple
decision rules inferred from the data features.
Strengths:
Interpretability: The learned rules can be visualized and explained.
Minimal Preprocessing: Handles numerical and categorical features without scaling.
Weaknesses:
Overfitting: Deep trees memorize noise unless pruned or depth-limited.
Instability: Small changes in the data can produce a very different tree.
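As one possible illustration (scikit-learn is an assumed library, not specified by the notes), a depth-limited tree can be trained and its decision rules printed:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

# max_depth limits tree growth, the usual guard against overfitting
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, y)

rules = export_text(tree)  # human-readable if/else decision rules
```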
3. Random Forest
Purpose: To improve the performance and robustness of decision trees by averaging multiple
trees trained on different parts of the data.
Strengths:
Accuracy: Averaging many de-correlated trees usually outperforms a single tree.
Robustness: Less prone to overfitting than an individual decision tree.
Feature Importance: Provides estimates of each feature's contribution.
Weaknesses:
Interpretability: Harder to explain than a single decision tree.
Cost: Training and prediction are slower and use more memory.
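A brief sketch, again assuming scikit-learn and illustrative synthetic data: each tree in the forest sees a bootstrap sample of the rows and random subsets of features, and the forest averages their votes.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=1)

# 100 trees trained on bootstrap samples; predictions are majority votes
forest = RandomForestClassifier(n_estimators=100, random_state=1)
forest.fit(X, y)

importances = forest.feature_importances_  # per-feature contribution estimates
```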
4. Gradient Boosting
Purpose: To build a strong classifier from an ensemble of weak classifiers, typically shallow decision trees, by iteratively correcting the errors of previous trees.
Strengths:
High Accuracy: Often among the best performers on tabular data.
Flexibility: Supports different loss functions and handles mixed feature types.
Weaknesses:
Tuning: Sensitive to the learning rate, tree depth, and number of iterations.
Training Time: Trees are built sequentially, so training is hard to parallelize.
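To make the iterative-correction idea concrete, here is a hedged sketch using scikit-learn's GradientBoostingClassifier (the library and hyperparameter values are assumptions, not from the notes): each new tree fits the residual errors of the ensemble so far, and the learning rate shrinks each tree's contribution.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, random_state=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

# Sequential ensemble of shallow trees; learning_rate damps each update
gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                max_depth=3, random_state=2)
gb.fit(X_train, y_train)
acc = gb.score(X_test, y_test)
```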
5. Support Vector Machines (SVM)
Purpose: To find the hyperplane that best separates the classes in the feature space.
Strengths:
Effective in High Dimensions: Works well when the number of features is large.
Robustness: Robust to overfitting, especially with proper kernel choice.
Memory Efficiency: Uses a subset of training points (support vectors) in the decision
function.
Weaknesses:
Scalability: Training is slow on very large datasets.
Tuning: Performance depends heavily on the kernel and its parameters.
No Native Probabilities: Probability estimates require extra calibration.
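The memory-efficiency point can be seen directly in a small sketch (scikit-learn's SVC is an assumed implementation): after fitting, only the support vectors participate in the decision function.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=3)

# The RBF kernel implicitly maps points into a higher-dimensional space;
# only the support vectors appear in the final decision function.
svm = SVC(kernel="rbf", C=1.0)
svm.fit(X, y)

n_support = len(svm.support_)  # indices of the training points kept
```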
6. K-Nearest Neighbors (KNN)
Purpose: To classify data points based on the classes of their nearest neighbors.
Strengths:
Simplicity: No real training phase; the model is just the stored data.
Nonparametric: Makes no assumptions about the underlying data distribution.
Weaknesses:
Prediction Cost: Classifying a point requires comparing it to all stored examples.
Curse of Dimensionality: Distances become less meaningful as features grow, and the method is sensitive to feature scaling.
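A minimal sketch, assuming scikit-learn: because KNN relies on distances, features are standardized before the neighbor vote (the scaler and k=5 are illustrative choices).

```python
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=200, random_state=4)

# Distances are scale-sensitive, so standardize features first
X_scaled = StandardScaler().fit_transform(X)

knn = KNeighborsClassifier(n_neighbors=5)  # majority vote of 5 nearest points
knn.fit(X_scaled, y)
pred = knn.predict(X_scaled[:5])
```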
7. Neural Networks
Purpose: To model complex relationships between inputs and outputs through multiple layers of neurons.
Strengths:
Expressiveness: Can approximate highly complex, nonlinear relationships.
Automatic Feature Learning: Reduces the need for manual feature engineering.
Weaknesses:
Data and Compute Requirements: Needs large datasets and significant computation to train well.
Interpretability: Learned representations are difficult to explain.
Best Use Cases:
Large datasets with complex patterns, such as images, text, and speech.
Problems requiring high predictive performance.
Applications where feature engineering is difficult or infeasible.
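The notes don't prescribe a framework; as a small sketch, scikit-learn's MLPClassifier (an assumed, minimal stand-in for a real deep-learning stack) trains a one-hidden-layer network on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, random_state=5)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=5)

# One hidden layer of 32 neurons; real image/text tasks would use far
# larger architectures and a dedicated deep-learning framework
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=5)
mlp.fit(X_train, y_train)
acc = mlp.score(X_test, y_test)
```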
8. Naive Bayes
Purpose: To classify data based on Bayes' theorem under the assumption of feature independence.
Strengths:
Speed: Training and prediction are extremely fast, even on large datasets.
Small Data: Works reasonably well with limited training examples.
Text Classification: A strong baseline for spam filtering and document categorization.
Weaknesses:
Independence Assumption: Features are rarely truly independent in practice.
Zero Frequency: Unseen feature values get zero probability unless smoothing is applied.
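As a final hedged sketch (scikit-learn's GaussianNB, an assumed variant that models each feature as an independent normal distribution per class, then applies Bayes' theorem):

```python
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=200, random_state=6)

# Fits one Gaussian per feature per class under the independence assumption
nb = GaussianNB()
nb.fit(X, y)
proba = nb.predict_proba(X[:3])  # posterior class probabilities
```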