
BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE, PILANI
Hyderabad Campus, Hyderabad
Computer Science Dept, 2nd Semester 2023-2024
Course Handout (BITS F464: Machine Learning)

Date: 9th Jan 2024

Course Number : BITS F464 (Tues: 4, Thur: 4, 10)
Course Title : Machine Learning
Instructor-In-Charge : Chittaranjan Hota, Ph.D. (hota[AT]hyderabad.bits-pilani.ac.in)

Scope and Objectives of the course:


This course is an undergraduate course on Machine Learning (ML). ML is a sub-field of
Artificial Intelligence that helps engineers build automated systems which learn from
experience or examples, enabling machines to make data-driven decisions. For example,
Google Maps uses the route network, real-time traffic characteristics, time of travel,
etc. to predict an appropriate path for you using ML algorithms. ML is a
multi-disciplinary field with roots in Computer Science and Mathematics: ML methods
are best described using linear and matrix algebra, and their behaviour is best
understood using the tools of probability and statistics. According to recent
estimates, 328 million terabytes of data are created daily. With this increasing
amount of data, the need for automated methods for data analysis continues to grow.
The goal of ML is to develop methods that can automatically detect patterns in data,
and then use the uncovered patterns to predict future outcomes of interest. This
course will cover many ML models and algorithms, including linear models, multi-layer
neural networks, support vector machines, density estimation methods, Bayesian belief
networks, mixture models, clustering, ensemble methods, and reinforcement learning.
The course objectives are the following:
 To select and apply an appropriate supervised learning algorithm for classification problems, e.g., Naïve Bayes, SVM, Logistic regression, Neural networks.
 To select and apply an appropriate supervised learning algorithm for regression problems, e.g., Linear regression, Ridge regression, Non-parametric kernel regression.
 To select and apply an appropriate unsupervised learning algorithm for clustering, and for linear and non-linear dimensionality reduction.
 To understand ML principles and techniques such as Model selection, Under-fitting, Over-fitting, Cross-validation, and Regularization.
 To test-run appropriate ML algorithms on real and synthetic datasets and interpret their results (see the brief sketch after this list).
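As a brief, hedged sketch of the last objective (illustrative only; the handout does
not prescribe a language or library), the following Python snippet, assuming
scikit-learn is installed, trains a supervised classifier on a small real dataset and
evaluates it with a held-out test set and 5-fold cross-validation:

    # Minimal sketch (assumption: Python + scikit-learn; the Iris dataset and
    # logistic regression are illustrative choices, not course requirements).
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score, train_test_split
    from sklearn.metrics import accuracy_score

    X, y = load_iris(return_X_y=True)        # a small real dataset
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = LogisticRegression(max_iter=1000)  # a linear classification model
    clf.fit(X_tr, y_tr)                      # supervised training

    # Interpret results: held-out accuracy plus a 5-fold cross-validation estimate.
    print("Test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    print("5-fold CV accuracy:", cross_val_score(clf, X_tr, y_tr, cv=5).mean())

The gap between the test accuracy and the cross-validation estimate is one simple way
to reason about over-fitting and model selection, topics covered in lectures 7-8.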

Text Books:
T1: Christopher Bishop: Pattern Recognition and Machine Learning, Springer-Verlag
New York Inc., 2006.
T2: Tom M. Mitchell: Machine Learning, McGraw-Hill, Indian Edition, 2017.

Reference Books:
R1: Kevin Murphy: Machine Learning: A Probabilistic Perspective, MIT Press, 2012.

R2: Shai Shalev-Shwartz and Shai Ben-David: Understanding Machine Learning: From
Theory to Algorithms, Cambridge University Press, 2014.

R3: Ethem Alpaydin: Introduction to Machine Learning, 3rd Edition, MIT Press, 2014.
Lecture Plan:

Lect. # | Learning objectives                     | Topics to be covered                                                                                                                     | Chapter in the Text Book
1       | Course administration                   | Course Administration, Motivation and ML Frameworks.                                                                                     | T2(1), Lecture Slides
2       | Overview of ML                          | Supervised/Unsupervised/RL, Classification/Regression, General Approach.                                                                 | R3(1.1, 1.2)
3-4     | Supervised Learning - I                 | Concept Learning: Version space and Candidate elimination algorithm.                                                                     | T2(2.2, 2.5)
5-6     | Supervised Learning - II                | Decision Tree Learning: Tree Representation, Types of problems suitable for DT learning, Learning algorithm.                             | T2(3.2, 3.3, 3.4)
7-8     | Evaluating a model                      | Bias, Cross-validation, Precision-Recall, ROC Curve.                                                                                     | T1(1.3), R3(19.6, 19.7)
9-11    | Linear Models for Regression            | Linear regression, Logistic regression, Gradient Descent, GD Analysis, SGD.                                                              | T1(3.1, 3.2), R1(8.1-8.3, 8.6)
12-14   | Linear Models for Classification        | Discriminant functions, least squares, Fisher's Linear Discriminant.                                                                     | T1(4.1)
15-17   | Naïve Bayes                             | Generative vs Discriminative models, Maximum A Posteriori (MAP) vs Maximum Likelihood (ML).                                              | T1(4.2, 4.3), T2(6.1-6.10)
18-21   | Neural Networks - I                     | Perceptron Training, Multi-layer Perceptron (MLP): Components, Activations, Training: SGD, Computing Gradients, Error Backpropagation.   | T1(5.1-5.4)
22-25   | Neural Networks - II                    | Regularization, Data Augmentation, Convolutional Networks: CNNs, RNNs; Generative Models: Autoregressive, GANs.                          | T1(5.5), Lecture Slides
26-29   | Instance-based Learning: Kernels & SVMs | k-Nearest Neighbour Learning, Constructing Kernels, Radial Basis Function Networks, Maximum margin classifiers.                          | T2(8.2), T1(6.1-6.3, 7.1)
30-33   | Graphical Models                        | Bayesian Networks: Training, Structure learning, Inferences, Undirected models.                                                          | T1(8.1, 8.4.1), T2(6)
34-35   | Unsupervised Learning - I               | Mixture Models and EM: K-means Clustering, Gaussian Mixture Models, EM for GMM.                                                          | T1(9.1, 9.2)
36-37   | Unsupervised Learning - II              | Dimensionality Reduction, Principal Component Analysis (PCA).                                                                            | T1(12.1)
38-39   | Combining Models                        | Bayesian-model averaging, Boosting, Tree-based Models.                                                                                   | T1(14.1-14.4)
40-41   | Reinforcement Learning                  | Markov Decision Process, Value Iteration, Policy Iteration, Q-learning.                                                                  | T1(13.1), T2(13.3)
Evaluation Scheme:

Component                          | Duration     | Date & Time                                      | Weightage | Nature of Component
Mid-Semester Exam                  | 90 mins      | 14/03/2024 (2:00 to 3:30 pm)                     | 30%       | Closed Book
Home Assignments/Projects (coding) | -            | To be announced                                  | 20%       | Open Book
Two announced quizzes              | 30 mins each | Second week of Feb, and first week of April 2024 | 10%       | Open Book
Comprehensive Exam                 | 3 Hrs        | 15/05/2024 (FN)                                  | 40%       | Closed Book

(Note: A minimum of 40% of the evaluation components will be conducted before the
mid-semester grading.)

Chamber Consultation Hours: Will be announced in class.

Make-up Policy:
Prior permission of the Instructor-In-Charge is required for a make-up on any
evaluation component. Only genuine requests will be considered.

Notices: All notices regarding the course will be posted on the course webpage.

Academic Honesty and Integrity Policy: Academic honesty and integrity are to be
maintained by all students throughout the semester; no form of academic dishonesty
is acceptable.

Instructor-In-Charge, BITS F464
