
Machine Learning

1. Introduction to Machine Learning:

Introduction to Artificial Intelligence, Machine Learning, and Deep Learning; Types of Machine Learning;
Applications of Machine Learning.

2. Linear Algebra:

Scalar, Vector, Matrix, Matrix Operations, Norms, Probability, Joint Distribution, Bayes' Theorem,
Expectation, Covariance.
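
A minimal NumPy sketch of these operations (the library choice, the toy matrix A, the vector x, the small sample X, and the probabilities in the Bayes' theorem example are all illustrative assumptions, not part of the syllabus):

import numpy as np

# A 2x2 matrix and a vector, chosen arbitrarily for illustration.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, -2.0])

# Basic matrix operations: product, transpose, inverse.
Ax = A @ x
At = A.T
A_inv = np.linalg.inv(A)

# Common vector norms: L1, L2 (Euclidean), and infinity norm of x.
l1 = np.linalg.norm(x, 1)
l2 = np.linalg.norm(x, 2)
linf = np.linalg.norm(x, np.inf)

# Expectation and covariance of a small sample
# (rows = observations, columns = variables).
X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 6.2]])
mean = X.mean(axis=0)            # sample expectation of each variable
cov = np.cov(X, rowvar=False)    # 2x2 sample covariance matrix

# Bayes' theorem for two events: P(A|B) = P(B|A) * P(A) / P(B).
p_a, p_b_given_a, p_b_given_not_a = 0.01, 0.9, 0.05
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b

print(l1, l2, linf, mean, cov, p_a_given_b)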

3. Regression and Classification:

Simple Linear Regression, Multiple Linear Regression, Least Squares and Gradient Descent, Linear
Classification, and Logistic Regression.
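
A short sketch of fitting simple linear regression both by the closed-form least-squares solution and by batch gradient descent (NumPy, the synthetic data, the learning rate, and the iteration count are assumptions made for illustration):

import numpy as np

# Toy data: y is roughly 2*x + 1 plus noise (synthetic, for illustration only).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=100)

# Design matrix with a bias column.
X = np.column_stack([np.ones_like(x), x])

# Closed-form least squares: solve (X^T X) w = X^T y.
w_ls = np.linalg.solve(X.T @ X, X.T @ y)

# The same fit by batch gradient descent on the mean squared error.
w = np.zeros(2)
lr = 0.01
for _ in range(5000):
    grad = 2.0 / len(y) * X.T @ (X @ w - y)   # gradient of MSE w.r.t. w
    w -= lr * grad

print("least squares:   ", w_ls)   # approximately [1.0, 2.0]
print("gradient descent:", w)      # should be close to the least-squares solution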

4. Decision Tree Learning:

Representing concepts as decision trees. Recursive induction of decision trees. Picking the best
splitting attribute: entropy and information gain. Searching for simple trees and computational
complexity. Overfitting, noisy data, and pruning.
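
A small sketch of the entropy and information-gain computations used to pick the best splitting attribute (the play/outlook data below is a hypothetical toy example in the spirit of the classic play-tennis dataset):

import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, attribute_values):
    """Entropy reduction from splitting the labels by one attribute."""
    total = entropy(labels)
    n = len(labels)
    remainder = 0.0
    for v in set(attribute_values):
        subset = [lab for lab, a in zip(labels, attribute_values) if a == v]
        remainder += len(subset) / n * entropy(subset)
    return total - remainder

# Tiny hypothetical dataset: whether to play, and the outlook attribute.
play    = ["no", "no", "yes", "yes", "yes", "no", "yes", "yes"]
outlook = ["sunny", "sunny", "overcast", "rain", "rain", "rain", "overcast", "sunny"]

print(entropy(play))                    # about 0.954 bits
print(information_gain(play, outlook))  # gain from splitting on outlook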

5. Ensemble Learning:

Bagging, boosting, and DECORATE. Active learning with ensembles.
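
A minimal bagging sketch using scikit-learn (the library, the synthetic dataset, and the ensemble size are assumptions for illustration; only bagging is shown here). Each ensemble member is trained on a bootstrap resample of the data and predictions are combined by voting:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 25 decision trees, each fit on a bootstrap sample, aggregated by majority vote.
bag = BaggingClassifier(n_estimators=25, random_state=0).fit(X_tr, y_tr)
print("bagging test accuracy:", bag.score(X_te, y_te))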

6. Artificial Neural Networks:

Neurons and biological motivation. Linear threshold units. Perceptrons: representational limitation
and gradient descent training. Multilayer networks and backpropagation. Hidden layers and
constructing intermediate, distributed representations. Overfitting, learning network structure,
recurrent networks.
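
A minimal sketch of a linear threshold unit trained with the perceptron rule (the AND task, the learning rate, and the number of passes are illustrative assumptions; multilayer backpropagation is not shown here):

import numpy as np

# Linearly separable task: logical AND of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(20):                        # a few passes are enough here
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # Perceptron update: adjust the boundary only on mistakes.
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print(w, b)                                             # a separating hyperplane for AND
print([(1 if xi @ w + b > 0 else 0) for xi in X])       # [0, 0, 0, 1]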

7. Support Vector Machines:

Maximum margin linear separators. Quadratic programming solution to finding maximum margin
separators. Kernels for learning non-linear functions.
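
A brief scikit-learn sketch (the library and the two-moons dataset are assumptions): SVC solves the maximum-margin quadratic program internally, and swapping the kernel lets it learn a non-linear decision boundary:

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# A dataset that is not linearly separable.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

linear_svm = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

print("linear kernel accuracy:", linear_svm.score(X_te, y_te))
print("RBF kernel accuracy:   ", rbf_svm.score(X_te, y_te))   # usually higher on this data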

8. Bayesian Learning:

Probability theory and Bayes rule. Naive Bayes learning algorithm. Parameter smoothing. Generative
vs. discriminative training. Logistic regression. Bayes nets and Markov nets for representing
dependencies.
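
A minimal Naive Bayes sketch with Laplace (add-one) parameter smoothing, using scikit-learn on a tiny hypothetical spam/ham corpus (the library, documents, and labels are all illustrative assumptions):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training documents and class labels.
docs = ["win money now", "cheap money offer", "meeting schedule today",
        "project meeting notes"]
labels = ["spam", "spam", "ham", "ham"]

vec = CountVectorizer()
X = vec.fit_transform(docs)                       # bag-of-words counts
model = MultinomialNB(alpha=1.0).fit(X, labels)   # alpha=1.0 gives Laplace smoothing

# Classify an unseen message; unseen words are handled by the smoothed counts.
print(model.predict(vec.transform(["win a cheap offer now"])))   # likely 'spam'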

9. Clustering and Unsupervised Learning:

Learning from unclassified data. Clustering. Hierarchical Agglomerative Clustering. k-means
partitional clustering. Expectation maximization (EM) for soft clustering. Semi-supervised learning
with EM using labeled and unlabeled data.
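
A plain-NumPy sketch of k-means partitional clustering (the synthetic blobs, k = 3, and the iteration count are illustrative choices):

import numpy as np

# Three well-separated synthetic 2-D clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.5, (50, 2)),
               rng.normal([4, 4], 0.5, (50, 2)),
               rng.normal([0, 4], 0.5, (50, 2))])

k = 3
centers = X[rng.choice(len(X), k, replace=False)]   # random initial centroids

for _ in range(20):
    # Assignment step: each point goes to its nearest centroid.
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each centroid moves to the mean of its assigned points
    # (a centroid with no points is simply left where it is).
    centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                        for j in range(k)])

print(centers)   # roughly the three means used to generate the data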

10. Dimensionality Reduction:

Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Feature Selection, Feature
Manipulation and Normalization.
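
A compact NumPy sketch of PCA via eigendecomposition of the sample covariance matrix (the synthetic data and the choice of two retained components are assumptions for illustration):

import numpy as np

# Synthetic data with a correlated pair of features, so PCA has structure to find.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]

Xc = X - X.mean(axis=0)                    # center the data
cov = np.cov(Xc, rowvar=False)             # 5x5 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: symmetric matrix, ascending order

order = np.argsort(eigvals)[::-1]          # sort components by explained variance
components = eigvecs[:, order[:2]]         # top-2 principal directions
X_reduced = Xc @ components                # project onto a 2-D subspace

explained = eigvals[order][:2] / eigvals.sum()
print(X_reduced.shape, explained)          # (200, 2) and the variance ratios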
