Machine Learning

The document outlines the course structure for a Machine Learning program in the Department of Computer Science and Applications at the School of Engineering and Technology. It details course objectives, outcomes, content, and evaluation methods, emphasizing the development of a comprehensive understanding of machine learning concepts and techniques. The course includes both theoretical and practical components, covering supervised and unsupervised learning algorithms, data preparation, and model evaluation.


School: School of Engineering and Technology
Department: Department of Computer Science and Applications
Program: B. Tech
Branch: IT
1 Course Code: Machine Learning
2 Course Title: Machine Learning
3 Credits:
4 Contact Hours (L-T-P): 3-0-2
Course Status: Core
5 Course Objective: Students are expected to learn and develop a comprehensive understanding of the following concepts and techniques:
1. To introduce the ideas of learning rules and implement them based on human experience.
2. To conceptualize the working of the human brain using SVM, RF and ANN.
3. To become familiar with decision boundaries that can learn from available examples and generalize to form appropriate learning rules for inference systems.
4. To provide the mathematical background for SVM, RF and neural network based classification techniques.
5. To understand and demonstrate how to learn patterns from large series of data using computer-based learning algorithms.

6 Course Outcomes: Successful completion of this course ensures the following outcomes:
CO-1: Define the basics of Machine Learning and stochastic concepts.
CO-2: Classify and compare existing models to understand their applicability in solving real-world societal problems.
CO-3: Identify, develop and apply mathematical models to find sustainable solutions.
CO-4: Analyse real-life problems and apply feature engineering to extract the hierarchical patterns existing in them.
CO-5: Evaluate learning models as a lens on upcoming real-world applications.
CO-6: Discuss the applicability of Machine Learning approaches to develop sustainable solutions using professional ethics.
7 Course Description: This course introduces the computational learning paradigm and develops a critical, implementation-oriented understanding of supervised and unsupervised learning problem areas.
8 CO Mapping
Unit 1 Core Concepts of Machine Learning
A Introduction to Machine Learning Problem Framing (Common ML Problems, ML Use Cases, Identifying Good Problems for ML, Hard ML Problems), Machine Learning Applications (Image Recognition, Speech Recognition, Medical Diagnosis, Statistical Arbitrage, Learning Associations), Standard Learning Tasks (Machine Learning Pipeline, Classification, Regression, Ranking, Clustering, Dimensionality Reduction or Manifold Learning). [Mapped COs: CO1]

B Learning Stages (Features, Labels, Hyperparameters, Validation Samples, Test Samples, Loss Function, Hypothesis Tests), Learning Scenarios (Supervised Learning, Unsupervised Learning, Semi-Supervised Learning, Transductive Inference, On-line Learning, Reinforcement Learning, Active Learning), Generalization (Supervised Learning, Unsupervised Learning, Reinforcement Learning). [Mapped COs: CO1, CO2]
C Data Preparation and Feature Engineering in ML (Data and Features, Information, Knowledge, Data Types, Big Data), Data Preprocessing: An Overview (Data Quality: Why Preprocess the Data?, Major Tasks in Data Preprocessing), Data Cleaning (Missing Values, Noisy Data, Data Cleaning as a Process), Data Integration (The Entity Identification Problem, Redundancy and Correlation Analysis, Tuple Duplication, Detection and Resolution of Data Value Conflicts), Data Reduction (Overview of Data Reduction Strategies, Attribute Subset Selection, Data Reduction, Histograms, Clustering, Sampling, Data Cube Aggregation), Data Transformation and Data Discretization (Overview of Data Transformation Strategies, Data Transformation by Normalization, Discretization by Binning, Discretization by Histogram Analysis, Discretization by Cluster, Decision Tree, and Correlation Analyses, Concept Hierarchy Generation for Nominal Data). [Mapped COs: CO1, CO2]
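As a practical companion to the normalization and binning topics listed in Unit 1C, here is a minimal sketch of min-max normalization and equal-width discretization on a single numeric feature. The values and the bin count are illustrative assumptions, not prescribed lab data.

```python
# Minimal sketch: min-max normalization and equal-width binning
# for one numeric feature (illustrative values only).
values = [12.0, 15.5, 9.0, 30.0, 22.5, 18.0]

# Min-max normalization: rescale each value to the [0, 1] range.
lo, hi = min(values), max(values)
normalized = [(v - lo) / (hi - lo) for v in values]

# Equal-width binning (discretization) into 3 bins.
num_bins = 3
width = (hi - lo) / num_bins
bins = [min(int((v - lo) // width), num_bins - 1) for v in values]

print(normalized)  # values rescaled to [0, 1]
print(bins)        # bin index (0, 1 or 2) for each value
```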
Unit 2 Supervised Learning Algorithms - Part One
A How Supervised Learning Algorithms Work: Steps (Bias-variance trade-off, Function complexity and amount of training data, Dimensionality of the input space, Noise in the output values, Algorithms), Other factors to consider (Heterogeneity of the data, Redundancy in the data, Presence of interactions and non-linearities). [Mapped COs: CO1, CO2, CO6]
B Linear Regression Model Representation, Linear Regression: Learning the Model (Simple Linear Regression, Ordinary Least Squares, Gradient Descent), Regularization / Shrinkage Methods (Bias-variance trade-off, Overfitting Issues, Lasso Regression, Ridge Regression), Making Predictions with Linear Regression (Cost Function, Feature Scaling, Normalization, Mean Normalization, Learning Rate, Automatic Convergence Test). [Mapped COs: CO1, CO2, CO6]
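To make the gradient-descent view of simple linear regression concrete, the following is a hedged sketch that fits y roughly equal to w*x + b by minimizing mean squared error. The toy data, learning rate and iteration budget are assumptions chosen only so the example runs on its own.

```python
# Sketch: simple linear regression trained with batch gradient descent
# on a toy dataset (y is roughly 2x + 1 plus small deviations).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]

w, b = 0.0, 0.0          # model parameters
alpha = 0.01             # learning rate (assumed)
n = len(xs)

for _ in range(5000):    # fixed iteration budget instead of a convergence test
    # Gradients of the cost J = (1/2n) * sum((w*x + b - y)^2)
    grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= alpha * grad_w
    b -= alpha * grad_b

print(f"w ~ {w:.2f}, b ~ {b:.2f}")  # expected to approach roughly 2 and 1
```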
C Logistic Regression, The Logistic Model (Latent variable interpretation, Logistic function, Odds, odds ratio and logit, Definition of the logistic function, Definition of the inverse of the logistic function, Interpretation of these terms, Definition of the odds, The odds ratio, Multiple explanatory variables), Model Fitting ("Rule of ten", Iteratively reweighted least squares (IRLS), Evaluating goodness of fit, Limitations of Logistic Regression), Linear Discriminant Analysis (LDA for two classes, Assumptions, Discriminant functions, Discrimination rules, Eigenvalues, Effect size), Practical Use and Applications (Bankruptcy prediction, Face recognition, Marketing, Biomedical studies), Comparison to Logistic Regression. [Mapped COs: CO1, CO2, CO6]
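The logistic model topics above can be illustrated with a small sketch: the logistic (sigmoid) function maps a linear score to a probability, and the parameters are fitted here by plain gradient ascent on the log-likelihood (a simpler stand-in for the IRLS procedure named in the syllabus). The toy data and step size are assumptions.

```python
import math

# Toy 1-D binary classification data: label tends to 1 for larger x.
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0,   0,   0,   1,   1,   1]

def sigmoid(z):
    """Logistic function: maps a real-valued score to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.0, 0.0
alpha = 0.1   # learning rate (assumed)

for _ in range(2000):
    # Gradient of the log-likelihood for logistic regression.
    grad_w = sum((y - sigmoid(w * x + b)) * x for x, y in zip(xs, ys))
    grad_b = sum((y - sigmoid(w * x + b)) for x, y in zip(xs, ys))
    w += alpha * grad_w
    b += alpha * grad_b

p = sigmoid(w * 2.0 + b)     # predicted P(y=1 | x=2.0)
odds = p / (1 - p)           # odds of the positive class
print(f"P(y=1|x=2.0) = {p:.3f}, odds = {odds:.3f}")
```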
Unit 3 Supervised Learning Algorithms - Part Two
A Support Vector Machines, Linear SVM (Hard-margin, Soft-margin), Nonlinear Classification, Computing the SVM Classifier (Primal, Dual, Kernel trick), Modern Methods (Sub-gradient descent, Coordinate descent), Empirical Risk Minimization (Risk minimization, Regularization and stability, SVM and the hinge loss, Target functions), Properties (Parameter selection, Issues). [Mapped COs: CO1, CO2, CO3, CO6]
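The "sub-gradient descent" and "hinge loss" entries above can be sketched as follows for a linear, soft-margin SVM. The regularization constant, step size and toy data are assumptions made purely for illustration.

```python
# Sketch: linear soft-margin SVM trained by sub-gradient descent on the
# regularized hinge loss (lambda/2)*||w||^2 + mean(max(0, 1 - y*(w.x + b))).
# Labels are in {-1, +1}; data and hyperparameters are toy values.
X = [(1.0, 2.0), (2.0, 3.0), (3.0, 3.0), (6.0, 5.0), (7.0, 8.0), (8.0, 6.0)]
y = [-1, -1, -1, 1, 1, 1]

w = [0.0, 0.0]
b = 0.0
lam = 0.01      # regularization strength (assumed)
eta = 0.05      # step size (assumed)

for _ in range(2000):
    gw = [lam * w[0], lam * w[1]]   # gradient of the regularizer
    gb = 0.0
    for (x1, x2), label in zip(X, y):
        margin = label * (w[0] * x1 + w[1] * x2 + b)
        if margin < 1:              # hinge is active: add its sub-gradient
            gw[0] -= label * x1 / len(X)
            gw[1] -= label * x2 / len(X)
            gb -= label / len(X)
    w = [w[0] - eta * gw[0], w[1] - eta * gw[1]]
    b -= eta * gb

score = w[0] * 5.0 + w[1] * 5.0 + b     # sign of the score is the predicted class
print(w, b, "prediction at (5, 5):", 1 if score >= 0 else -1)
```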
B Introduction to Artificial Neural Networks (Feed-forward Network Functions, Weight-space symmetries), Network Training (Parameter optimization, Local quadratic approximation, Use of gradient information, Gradient descent optimization), Error Backpropagation (Evaluation of error-function derivatives, Simple examples, Efficiency of backpropagation). [Mapped COs: CO1, CO2, CO3, CO6]
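To connect the backpropagation topics above to something concrete, here is a hedged sketch of repeated forward and backward passes through a tiny one-hidden-layer network on a single training example. The layer sizes, random initial weights, squared-error loss and omitted bias terms are all simplifying assumptions.

```python
import numpy as np

# Sketch: error backpropagation for a tiny 2-3-1 feed-forward network with
# sigmoid hidden units and a linear output, fit to one (x, t) example.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(1, 3))   # hidden -> output weights
x = np.array([0.5, -1.0])
t = np.array([0.3])                       # target output
eta = 0.1                                 # learning rate (assumed)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for _ in range(200):
    # Forward pass
    a1 = W1 @ x               # hidden pre-activations
    z1 = sigmoid(a1)          # hidden activations
    out = W2 @ z1             # linear output

    # Backward pass: propagate the error derivative through the network
    delta_out = out - t                                  # dE/dy for squared error
    delta_hidden = (W2.T @ delta_out) * z1 * (1 - z1)    # chain rule through sigmoid

    W2 -= eta * np.outer(delta_out, z1)
    W1 -= eta * np.outer(delta_hidden, x)

print("final output:", W2 @ sigmoid(W1 @ x))  # should be close to the target 0.3
```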
C Decision Tree Learning (Decision tree representation, ID3 learning algorithm, Entropy, Information gain, Overfitting and Evaluation, Overfitting, Validation Methods, Avoiding Overfitting in Decision Trees, Minimum-Description-Length Methods, Noise in Data), Random Forests Algorithm (Preliminaries: decision tree learning, Bagging, From bagging to random forests, Extra Trees, Properties, Variable importance). [Mapped COs: CO1, CO2, CO3, CO6]
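The ID3-related entries above rest on entropy and information gain; the following sketch computes both for a single candidate split on a toy binary-labelled dataset. The attribute values and labels are made up for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

# Toy dataset: each row is (attribute value, class label).
rows = [("sunny", "no"), ("sunny", "no"), ("overcast", "yes"),
        ("rain", "yes"), ("rain", "yes"), ("rain", "no"),
        ("overcast", "yes"), ("sunny", "yes")]

labels = [label for _, label in rows]
base = entropy(labels)

# Information gain of splitting on the attribute: parent entropy minus
# the size-weighted average entropy of the children.
children = {}
for value, label in rows:
    children.setdefault(value, []).append(label)
weighted = sum(len(ls) / len(rows) * entropy(ls) for ls in children.values())

print(f"parent entropy = {base:.3f} bits")
print(f"information gain of split = {base - weighted:.3f} bits")
```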
Unit 4 Unsupervised Learning
A Unsupervised Learning (What is Unsupervised Learning?), Clustering Methods (Method Based on Euclidean Distance, Method Based on Probabilities, Hierarchical Clustering Methods). [Mapped COs: CO2, CO3, CO4, CO6]
B k-means Clustering Algorithm (Standard algorithm (naive k-means), Initialization methods), Applications (Vector quantization, Cluster analysis, Feature learning), Gaussian Mixture Models, Expectation-Maximization Method. [Mapped COs: CO2, CO3, CO4, CO6]
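Here is a compact sketch of the naive (standard) k-means algorithm listed above, with simple random initialization. The value of k, the toy points and the iteration cap are assumptions for illustration.

```python
import random

# Sketch: naive k-means (assignment step + update step) on 2-D toy points.
points = [(1.0, 1.0), (1.5, 2.0), (1.2, 0.8),
          (8.0, 8.0), (8.5, 7.5), (9.0, 8.2)]
k = 2
random.seed(0)
centroids = random.sample(points, k)     # simple random initialization

def closest(p, centres):
    """Index of the centroid nearest to point p (squared Euclidean distance)."""
    return min(range(len(centres)),
               key=lambda i: (p[0] - centres[i][0]) ** 2 + (p[1] - centres[i][1]) ** 2)

for _ in range(100):                     # iteration cap instead of a formal test
    # Assignment step: attach each point to its nearest centroid.
    clusters = [[] for _ in range(k)]
    for p in points:
        clusters[closest(p, centroids)].append(p)
    # Update step: move each centroid to the mean of its cluster.
    new_centroids = [
        (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c)) if c else centroids[i]
        for i, c in enumerate(clusters)
    ]
    if new_centroids == centroids:       # converged: assignments stopped changing
        break
    centroids = new_centroids

print("centroids:", centroids)
```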
C Principal Component Analysis for Making Predictive Models (First component, Further components, Covariances, Dimensionality reduction, Singular value decomposition), Properties and Limitations of PCA (Properties, Limitations), Computing PCA Using the Covariance Method, Typical Applications. [Mapped COs: CO2, CO3, CO4, CO6]
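The "computing PCA using the covariance method" entry can be sketched with NumPy: centre the data, form the covariance matrix, take its eigen-decomposition, and project onto the leading component. The toy matrix and the choice to keep a single component are assumptions.

```python
import numpy as np

# Sketch: PCA via the covariance method on a tiny 2-feature dataset.
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9],
              [1.9, 2.2], [3.1, 3.0], [2.3, 2.7]])

# 1. Centre the data (subtract the per-feature mean).
Xc = X - X.mean(axis=0)

# 2. Covariance matrix of the centred data (features in columns).
C = np.cov(Xc, rowvar=False)

# 3. Eigen-decomposition; eigh is used because C is symmetric.
eigvals, eigvecs = np.linalg.eigh(C)

# 4. Sort components by decreasing eigenvalue (explained variance).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 5. Project onto the first principal component (dimensionality reduction).
Z = Xc @ eigvecs[:, :1]

print("explained variance per component:", eigvals)
print("1-D projection:\n", Z)
```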
Unit 5 Parameter Estimation, Model Evaluation and Ensemble Methods
A Parameter Estimation (Point Estimation, Maximum Likelihood Estimation, Unbiased Estimation, Confidence Intervals for One Mean, Two Means, Variances). [Mapped COs: CO2, CO5, CO6]
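As a worked illustration of maximum likelihood estimation and a confidence interval for one mean, the sketch below estimates the mean and variance of a small sample under a normal model and forms an approximate 95% interval. The sample values are made up, and the critical value 2.262 is the standard two-sided t-table value for 9 degrees of freedom.

```python
import math

# Toy sample (assumed values) of n = 10 observations.
sample = [5.1, 4.8, 5.6, 5.0, 4.9, 5.3, 5.2, 4.7, 5.4, 5.0]
n = len(sample)

# Maximum likelihood estimates for a normal model:
# mean = sample average, variance = average squared deviation (divisor n).
mle_mean = sum(sample) / n
mle_var = sum((x - mle_mean) ** 2 for x in sample) / n

# Unbiased variance estimate uses divisor n - 1 instead.
unbiased_var = sum((x - mle_mean) ** 2 for x in sample) / (n - 1)

# Approximate 95% confidence interval for the mean:
# mean +/- t * s / sqrt(n), with t(0.975, df = 9) ~ 2.262 from t-tables.
t_crit = 2.262
half_width = t_crit * math.sqrt(unbiased_var / n)
print(f"MLE mean = {mle_mean:.3f}, MLE variance = {mle_var:.3f}")
print(f"95% CI for the mean: ({mle_mean - half_width:.3f}, {mle_mean + half_width:.3f})")
```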
B Model Evaluation (ML Model Validation by Humans, Holdout Set Validation Method, Cross-Validation Method for Models, Leave-One-Out Cross-Validation, Random Subsampling Validation, Teach and Test Method, Bootstrapping ML Validation Method, Running AI Model Simulations, Overriding Mechanism Method), The ROC Curve. [Mapped COs: CO3, CO5, CO6]
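The holdout and k-fold cross-validation methods listed above can be sketched generically as below. The fold count, the toy dataset and the deliberately trivial train/evaluate functions are assumptions standing in for whatever learner a lab session actually uses.

```python
# Sketch: k-fold cross-validation with placeholder train/evaluate steps.
# The "model" here is just the mean of the training targets, chosen only so
# the sketch runs on its own; any real learner slots in the same way.
data = [(x, 2.0 * x + 1.0) for x in range(20)]   # toy (x, y) pairs
k = 5

def train(rows):
    ys = [y for _, y in rows]
    return sum(ys) / len(ys)          # "model" = constant predictor

def evaluate(model, rows):
    return sum((y - model) ** 2 for _, y in rows) / len(rows)   # mean squared error

fold_size = len(data) // k
scores = []
for i in range(k):
    # Each fold serves once as the held-out validation set.
    val = data[i * fold_size:(i + 1) * fold_size]
    trn = data[:i * fold_size] + data[(i + 1) * fold_size:]
    scores.append(evaluate(train(trn), val))

print("per-fold MSE:", [round(s, 2) for s in scores])
print("cross-validated MSE:", round(sum(scores) / k, 2))
```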
C Ensemble Methods (Ensemble Theory, Ensemble Size, Voting and Averaging Based Ensemble Methods, Boosting, Weighted Average, Stacking, Bagging, Boosting and Bootstrap Aggregating). [Mapped COs: CO4, CO5, CO6]
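To illustrate the bagging and majority-voting ideas listed above, here is a minimal sketch that bootstraps a toy dataset, fits a deliberately simple base learner (a one-threshold "decision stump", chosen only so the sketch is self-contained) on each resample, and combines predictions by majority vote.

```python
import random

# Toy 1-D binary data: class 1 tends to occur for larger x.
data = [(0.5, 0), (1.0, 0), (1.5, 0), (2.0, 0), (2.2, 1),
        (3.0, 1), (3.5, 1), (4.0, 1), (4.5, 1), (1.8, 1)]
random.seed(1)

def fit_stump(rows):
    """Pick the threshold that best separates the two classes (base learner)."""
    best = None
    for t in sorted({x for x, _ in rows}):
        acc = sum((1 if x > t else 0) == y for x, y in rows) / len(rows)
        if best is None or acc > best[1]:
            best = (t, acc)
    return best[0]

# Bagging: train each stump on a bootstrap resample (sampling with replacement).
thresholds = []
for _ in range(25):
    boot = [random.choice(data) for _ in data]
    thresholds.append(fit_stump(boot))

def ensemble_predict(x):
    votes = [1 if x > t else 0 for t in thresholds]   # one vote per stump
    return 1 if sum(votes) > len(votes) / 2 else 0    # majority vote

print("prediction at x=1.2:", ensemble_predict(1.2))
print("prediction at x=3.2:", ensemble_predict(3.2))
```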
Mode of examination: Theory and Practical
Weightage Distribution: CA 25%, MTE 25%, ETE 50%
Text book/s*:
1. Bishop, C. (2006). Pattern Recognition and Machine Learning. Berlin: Springer-Verlag.
2. Mohri, M., Rostamizadeh, A. and Talwalkar, A. (2018). Foundations of Machine Learning. Second Edition. Cambridge, MA: MIT Press.
3. Alpaydin, E. Introduction to Machine Learning. Third Edition. Cambridge, MA: The MIT Press.

Other References:
1) Baldi, P. and Brunak, S. (2002). Bioinformatics: A Machine Learning Approach. Cambridge, MA: MIT Press.
2) Russell, S. and Norvig, P. (2003). Artificial Intelligence: A Modern Approach. 2nd Edition. New York: Prentice-Hall.
3) Cohen, P.R. (1995). Empirical Methods in Artificial Intelligence. Cambridge, MA: MIT Press.
4) https://www.toptal.com/machine-learning/ensemble-methods-machine-learning
CO and PO Mapping
S. No. | Course Outcome | Program Outcomes (PO) & Program Specific Outcomes (PSO)
1. | CO-1: Define the basics of Machine Learning and stochastic concepts. | PO1, PO2, PO3, PO4, PO5, PO6, PO7, PO8, PO9, PO10, PSO1, PSO2, PSO3
2. | CO-2: Classify and compare existing models to understand their applicability in solving real-world societal problems. | PO1, PO2, PO3, PO4, PO5, PO6, PO7, PO8, PO9, PO10, PSO1, PSO2, PSO3
3. | CO-3: Identify, develop and apply mathematical models to find sustainable solutions. | PO1, PO2, PO3, PO4, PO5, PO6, PO7, PO8, PO9, PO10, PSO1, PSO2, PSO3
4. | CO-4: Analyse real-life problems and apply feature engineering to extract the hierarchical patterns existing in them. | PO1, PO2, PO3, PO4, PO5, PO6, PO7, PO8, PO9, PO10, PSO1, PSO2, PSO3
5. | CO-5: Evaluate learning models as a lens on upcoming real-world applications. | PO1, PO2, PO3, PO4, PO5, PO6, PO7, PO8, PO9, PO10, PSO1, PSO2, PSO3
6. | CO-6: Discuss the applicability of Machine Learning approaches to develop sustainable solutions using professional ethics. | PO1, PO2, PO3, PO4, PO5, PO6, PO7, PO8, PO9, PO10, PSO1, PSO2, PSO3
PO and PSO mapping with level of strength for Course Name Concepts of Machine Learning (Course Code CSA-202)

Subject: Concepts of Machine Learning (Course Code CSA-201)
      PO1 PO2 PO3 PO4 PO5 PO6 PO7 PO8 PO9 PO10 PO11 PO12 PSO1 PSO2 PSO3
CO1    3   3   3   3   3   3   2   -   1   3    1    3    2    2    1
CO2    3   3   3   3   3   3   3   -   2   3    3    3    3    3    3
CO3    3   3   3   3   3   3   3   -   2   3    3    3    3    3    3
CO4    3   3   3   3   3   3   3   -   2   3    3    3    3    3    3
CO5    3   3   3   3   3   3   3   -   2   3    3    3    3    3    3
CO6    3   3   3   3   3   3   3   3   3   3    3    3    3    3    3

Average of non-zero entries in the following table (should be auto-calculated).

Course Code: CSA-201    Course Name: Concepts of Machine Learning
PO1  PO2  PO3  PO4  PO5  PO6  PO7  PO8  PO9  PO10 PO11 PO12 PSO1 PSO2 PSO3
3.00 3.00 3.00 3.00 3.00 3.00 2.83 0.50 2.00 3.00 2.67 3.00 2.83 2.83 2.67

Total: 41.83
Strength of Correlation
1. Addressed to Slight (Low = 1) extent
2. Addressed to Moderate (Medium = 2) extent
3. Addressed to Substantial (High = 3) extent
