Machine Learning Notes - All Units

Unit I: Introduction to Machine Learning

- **Definition**: Machine Learning (ML) is the study of algorithms that improve automatically through
experience.
- **Types**:
1. Supervised Learning
- Input + Output pairs.
- Ex: Regression, Classification.
2. Unsupervised Learning
- Only inputs, no labeled outputs.
- Ex: Clustering.
3. Reinforcement Learning
- Agent learns via rewards/punishments.

- **Applications**:
- Spam detection, Image recognition, Forecasting, etc.

- **Steps in ML process**:
- Data collection → Preprocessing → Model selection → Training → Evaluation → Deployment

- **Example (Supervised)**:
- Predicting house price using features like size, location.

Unit II: Regression & Classification Algorithms

- **Regression**:
- Predicts continuous values.
- Linear Regression: y = mx + c
- Numerical Example: If x = 5, m = 2, c = 3, then y = 2(5) + 3 = 13 (see the sketch below)
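
A minimal sketch of the calculation above in Python, using the same values (x = 5, m = 2, c = 3):

```python
# Simple linear model y = m*x + c, with the values from the example above.
def predict(x, m, c):
    return m * x + c

print(predict(x=5, m=2, c=3))  # 2*5 + 3 = 13
```
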
- **Classification**:
- Predicts discrete class labels.
- Logistic Regression: P(y=1) = 1 / (1 + e^-(wx + b))
- Decision Boundary separates classes.
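
A short sketch of the logistic-regression probability and decision boundary above; the weight w, bias b, and input x used here are illustrative values, not taken from the notes:

```python
import math

def sigmoid(z):
    # Logistic function: squashes any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(x, w, b):
    # P(y = 1 | x) for a single feature x with weight w and bias b.
    return sigmoid(w * x + b)

p = predict_proba(x=2.0, w=1.5, b=-1.0)  # illustrative values
label = 1 if p >= 0.5 else 0             # decision boundary at P = 0.5
print(round(p, 3), label)                # 0.881 1
```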

- **Evaluation Metrics**:
- Accuracy, Precision, Recall, F1 Score, Confusion Matrix
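
A sketch showing how these metrics fall out of a 2x2 confusion matrix; the TP/FP/FN/TN counts are made-up illustrative numbers:

```python
# Hypothetical confusion-matrix counts for a binary classifier.
TP, FP, FN, TN = 40, 10, 5, 45

accuracy  = (TP + TN) / (TP + TN + FP + FN)
precision = TP / (TP + FP)
recall    = TP / (TP + FN)
f1        = 2 * precision * recall / (precision + recall)

print(accuracy, precision, round(recall, 3), round(f1, 3))
# 0.85 0.8 0.889 0.842
```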

- **Overfitting vs Underfitting**:
- Overfitting: Model fits noise in the training data and generalizes poorly to new data.
- Underfitting: Model is too simple to capture the underlying pattern.

Unit III: Decision Tree Learning & Instance-Based Learning

- **Decision Trees**:
- Nodes: Tests; Leaves: Predictions
- ID3 uses Information Gain (IG)

- **Entropy**:
- Measures impurity
- Example: 4 Yes, 2 No → Entropy ≈ 0.918

- **Information Gain**:
- IG(S,A) = Entropy(S) - weighted Entropy(children)
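
A worked sketch of the entropy example (4 Yes, 2 No) and the IG formula above; the child split used for the IG call is a hypothetical one:

```python
import math

def entropy(counts):
    # Shannon entropy in bits over class counts.
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Parent set S: 4 Yes, 2 No -> about 0.918 bits, matching the example above.
print(round(entropy([4, 2]), 3))  # 0.918

def information_gain(parent_counts, children_counts):
    # IG(S, A) = Entropy(S) - sum over children of (|child|/|S|) * Entropy(child)
    total = sum(parent_counts)
    weighted = sum(sum(ch) / total * entropy(ch) for ch in children_counts)
    return entropy(parent_counts) - weighted

# Hypothetical split of the 6 examples into children [3 Yes, 0 No] and [1 Yes, 2 No].
print(round(information_gain([4, 2], [[3, 0], [1, 2]]), 3))  # 0.459
```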

- **Instance-Based Learning**:
- k-NN:
- Classifies using nearest neighbors.
- Example: Point (2,3), k = 2 → majority class among the 2 closest neighbors (see the sketch below)
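
A minimal k-NN sketch for the example above; the labelled training points are hypothetical, and ties (possible with k = 2) would need an explicit tie-breaking rule:

```python
from collections import Counter
import math

# Hypothetical labelled training points: (x, y) -> class label.
train = [((1, 2), "A"), ((3, 3), "A"), ((6, 7), "B"), ((7, 8), "B")]

def knn_predict(query, train, k):
    # Sort training points by Euclidean distance to the query point.
    by_distance = sorted(train, key=lambda item: math.dist(query, item[0]))
    # Majority vote among the k closest neighbors.
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

print(knn_predict((2, 3), train, k=2))  # "A": the 2 nearest points are both class A
```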

- Locally Weighted Regression:
- Weight decays with distance.
- Predicts value using weighted average (see the sketch after this list).

- RBF Networks: Use Gaussian functions centered on data.
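
A small sketch of the locally weighted idea above (Gaussian weights decaying with distance, prediction as a weighted average); the kernel width tau and the 1-D data are illustrative assumptions:

```python
import math

# Hypothetical 1-D training data (x, y).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.2, 1.9, 3.1, 3.9]

def gaussian_weight(x_query, x_i, tau=1.0):
    # Weight decays with distance from the query point (Gaussian kernel).
    return math.exp(-((x_query - x_i) ** 2) / (2 * tau ** 2))

def locally_weighted_average(x_query, xs, ys, tau=1.0):
    # Simplest form: prediction as the weighted average of nearby targets.
    weights = [gaussian_weight(x_query, x_i, tau) for x_i in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

print(round(locally_weighted_average(2.5, xs, ys), 3))
```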

Unit IV: Neural Networks & Deep Learning

- **Perceptron**:
- y = f(wx + b)
- Activation f can be a step, sigmoid, or ReLU function (see the sketch below)
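
A sketch of the perceptron forward pass y = f(wx + b) with a step activation; the weights and bias (chosen here to implement an AND gate) are illustrative:

```python
# Perceptron forward pass: y = f(w . x + b), here with a step activation.
def step(z):
    return 1 if z >= 0 else 0

def perceptron(x, w, b):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return step(z)

# Hypothetical weights/bias implementing a 2-input AND gate.
w, b = [1.0, 1.0], -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(x, w, b))  # only (1, 1) fires
```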

- **Backpropagation**:
- Adjusts weights using gradient descent.
- Delta Rule: Δw = η (t - o) x (see the one-step sketch below)
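
A one-step sketch of the delta-rule update above; the learning rate η and the values of x, w, and t are illustrative:

```python
# Single delta-rule update: delta_w = eta * (t - o) * x
eta = 0.1          # learning rate (illustrative)
x, w = 2.0, 0.5    # input and current weight (illustrative)
t = 2.0            # target output
o = w * x          # current (linear) output = 1.0

delta_w = eta * (t - o) * x   # 0.1 * (2.0 - 1.0) * 2.0 = 0.2
w = w + delta_w
print(delta_w, w)             # 0.2 0.7
```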

- **CNN**:
- Convolution Layer + Pooling + FC
- Example: 3x3 input, 2x2 filter → 2x2 output (stride 1, no padding; see the sketch below)
- Applications: Diabetic Retinopathy detection, Self-driving cars
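
A small NumPy sketch of a 'valid' convolution confirming the 3x3 input / 2x2 filter → 2x2 output example; the filter values are illustrative:

```python
import numpy as np

def conv2d_valid(image, kernel):
    # 'Valid' convolution/cross-correlation: output is (H-kH+1) x (W-kW+1).
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(9).reshape(3, 3)        # 3x3 input
kernel = np.array([[1, 0], [0, -1]])      # 2x2 filter (illustrative values)
print(conv2d_valid(image, kernel).shape)  # (2, 2), matching the example above
```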

- **Regularization**:
- Dropout, Early stopping to prevent overfitting

Unit V: Reinforcement Learning & Genetic Algorithms

- **Reinforcement Learning**:
- Agent interacts with environment.
- Q-learning: Q(s,a) ← Q(s,a) + α [r + γ max_a' Q(s',a') - Q(s,a)]

- **Numerical Example** (worked in the sketch below):
- Q(s,a) = 5, α = 0.1, r = 10, γ = 0.9, max Q(s',a') = 8
- Updated Q = 5 + 0.1 * (10 + 0.9 * 8 - 5) = 6.22
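
A one-line check of the Q-update above in Python, using the values from the numerical example:

```python
# Q-learning update: Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
Q_sa, alpha, r, gamma, max_Q_next = 5.0, 0.1, 10.0, 0.9, 8.0
Q_new = Q_sa + alpha * (r + gamma * max_Q_next - Q_sa)
print(round(Q_new, 2))  # 6.22, matching the example above
```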

- **Genetic Algorithms**:
- Population → Fitness evaluation → Selection → Crossover → Mutation
- Example:
- P1: 101011, P2: 111000 → single-point crossover after bit 3 → children: 101000, 111011 (see the sketch below)
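
A sketch of the single-point crossover in the example above; the crossover point (after the third bit) reproduces the children shown:

```python
def single_point_crossover(p1, p2, point):
    # Swap the tails of the two parent bit-strings at the given point.
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

c1, c2 = single_point_crossover("101011", "111000", point=3)
print(c1, c2)  # 101000 111011, matching the example above
```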

- **Applications**:
- Optimization, Feature selection, Scheduling
