ML Algorithms Explained
Random Forest
- Type: Ensemble of decision trees (bagging)
- How it works (see the example below):
  - Builds many decision trees, each on a random subset of the training data and features.
  - Classification: the trees vote and the majority class is the prediction.
  - Regression: predictions are averaged across the trees.
- Pros: High accuracy; handles missing values and high-dimensional data well.
- Cons: Slower to train and less interpretable than a single decision tree.
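A minimal sketch of a random forest classifier using scikit-learn; the dataset and hyperparameters are illustrative choices, not part of the notes above.

# Random Forest sketch (illustrative dataset and hyperparameters).
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 100 trees, each grown on a bootstrap sample with a random feature subset per split.
clf = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=42)
clf.fit(X_train, y_train)

# Classification: each tree votes; predict()/score() use the majority class.
print("Test accuracy:", clf.score(X_test, y_test))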
Gradient Boosting
- Type: Ensemble (boosting)
- How it works (see the example below):
  - Builds trees sequentially; each new tree is fit to the errors of the ensemble built so far.
  - Later trees therefore focus on the cases the model still gets wrong.
- Pros: High accuracy; captures complex, non-linear patterns.
- Cons: Training is sequential and slow; prone to overfitting if not regularized.
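A minimal sketch of gradient boosting for regression with scikit-learn; the synthetic data and settings are illustrative assumptions.

# Gradient Boosting sketch (illustrative synthetic data and settings).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each shallow tree is fit to the residuals of the ensemble so far;
# a small learning_rate shrinks each tree's contribution to limit overfitting.
gbr = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                                max_depth=3, random_state=0)
gbr.fit(X_train, y_train)
print("Test R^2:", gbr.score(X_test, y_test))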
XGBoost (Extreme Gradient Boosting)
- Type: Optimized gradient boosting
- How it works (see the example below):
  - Same boosting idea as Gradient Boosting, but with a faster, regularized implementation.
  - Adds L1/L2 regularization, tree pruning, and parallelized tree construction.
- Pros: Fast, accurate, widely used.
- Cons: Many hyperparameters, so it is complex to tune.
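A minimal sketch using the xgboost package's scikit-learn-style classifier; it assumes xgboost is installed, and the dataset and hyperparameter values are illustrative.

# XGBoost sketch (assumes the xgboost package; values are illustrative).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# reg_lambda (L2) and max_depth act as the regularization/pruning controls
# mentioned above; n_jobs=-1 parallelizes tree construction across cores.
model = XGBClassifier(n_estimators=300, learning_rate=0.1, max_depth=4,
                      reg_lambda=1.0, n_jobs=-1, random_state=0)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))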
Comparison Table
Feature | KNN | Random Forest | SVR | Gradient Boosting | XGBoost | AdaBoost | Extra Trees