
P and NP completeness – Kruskal’s algorithm – Travelling Salesman Problem – 3-CNF SAT problems.
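Since the unit names Kruskal’s algorithm, a minimal sketch of the idea follows (a union-find-based Python implementation; the edge list and function name are illustrative, not part of the syllabus):

```python
# Kruskal's algorithm sketch: sort edges by weight, then keep an edge
# whenever it joins two different components (tracked with union-find).
def kruskal(num_vertices, edges):
    # edges: list of (weight, u, v) tuples with 0-based vertex ids
    parent = list(range(num_vertices))

    def find(x):                      # find component root (path compression)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for weight, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                  # edge connects two components: keep it
            parent[ru] = rv
            mst.append((u, v, weight))
            total += weight
    return mst, total

# Example: 4 vertices, weighted undirected edges as (weight, u, v)
edges = [(1, 0, 1), (3, 0, 2), (2, 1, 2), (4, 1, 3), (5, 2, 3)]
print(kruskal(4, edges))  # MST edges with total weight 1 + 2 + 4 = 7
```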

COURSE OUTCOMES:
At the end of this course, the students will be able to:
CO1: Construct automata theory using Finite Automata
CO2: Write regular expressions for any pattern
CO3: Design context free grammar and Pushdown Automata
CO4: Design Turing machine for computational functions
CO5: Differentiate between decidable and undecidable problems
TOTAL: 45 PERIODS
TEXT BOOKS:
1. Hopcroft J.E., Motwani R. & Ullman J.D., "Introduction to Automata Theory, Languages, and Computation", 3rd Edition, Pearson Education, 2008.
2. John C. Martin, "Introduction to Languages and the Theory of Computation", 4th Edition, Tata McGraw Hill, 2011.

REFERENCES:
1. Harry R. Lewis and Christos H. Papadimitriou, "Elements of the Theory of Computation", 2nd Edition, Prentice Hall of India, 2015.
2. Peter Linz, "An Introduction to Formal Languages and Automata", 6th Edition, Jones & Bartlett, 2016.
3. K.L.P. Mishra and N. Chandrasekaran, "Theory of Computer Science: Automata, Languages and Computation", 3rd Edition, Prentice Hall of India, 2006.

CO’s-PO’s & PSO’s MAPPING

CO’s                        PO’s                        PSO’s
       1   2   3   4   5   6   7   8   9   10  11  12   1   2   3
1      1   3   2   3   -   -   -   -   1   1   2   3    1   3   2
2      2   2   3   2   1   -   -   -   3   3   2   3    3   1   2
3      2   2   3   2   1   -   -   -   1   3   1   2    1   2   2
4      2   2   2   1   -   -   -   -   1   3   3   2    1   3   2
5      2   2   2   1   1   -   -   -   1   1   3   2    3   1   3
Avg.   2   2   2   2   1   -   -   -   1   2   2   2    2   2   2

1 - low, 2 - medium, 3 - high, ‘-’ - no correlation

CS3491 ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING                L T P C
                                                                   3 0 2 4
COURSE OBJECTIVES:
The main objectives of this course are to:
• Study uninformed and heuristic search techniques.
• Learn techniques for reasoning under uncertainty.
• Introduce machine learning and supervised learning algorithms.
• Study ensembling and unsupervised learning algorithms.
• Learn the basics of deep learning using neural networks.

UNIT I PROBLEM SOLVING 9
Introduction to AI – AI applications – problem-solving agents – search algorithms – uninformed search strategies – heuristic search strategies – local search and optimization problems – adversarial search – constraint satisfaction problems (CSP).
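As a concrete instance of the uninformed search strategies listed above, a minimal breadth-first search sketch (Python; the toy graph and node names are illustrative):

```python
from collections import deque

# Breadth-first search: expand the shallowest frontier nodes first and
# return a path from start to goal (or None if the goal is unreachable).
def bfs(graph, start, goal):
    frontier = deque([[start]])       # queue of partial paths
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
print(bfs(graph, 'A', 'D'))  # ['A', 'B', 'D']
```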

UNIT II PROBABILISTIC REASONING 9


Acting under uncertainty – Bayesian inference – naïve Bayes models. Probabilistic reasoning – Bayesian networks – exact inference in BN – approximate inference in BN – causal networks.
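To make the Bayesian inference topic concrete, a toy posterior computation with Bayes’ rule (Python; the probability values are made-up illustrative numbers):

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E), where the evidence P(E)
# is obtained by marginalizing over H (exact inference, two variables).
p_disease = 0.01                     # prior P(H), assumed for illustration
p_pos_given_disease = 0.95           # likelihood P(E|H)
p_pos_given_healthy = 0.05           # false-positive rate P(E|not H)

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))   # evidence P(E)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")  # ~0.161
```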

UNIT III SUPERVISED LEARNING 9


Introduction to machine learning – Linear regression models: least squares, single & multiple variables, Bayesian linear regression, gradient descent – Linear classification models: discriminant function – probabilistic discriminative model – logistic regression, probabilistic generative model – naïve Bayes, maximum margin classifier – support vector machine, decision trees, random forests.
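A minimal least-squares fit matching the linear regression entry above (Python with NumPy; the synthetic data and random seed are assumptions for illustration):

```python
import numpy as np

# Ordinary least squares: find weights minimizing ||Xw - y||^2;
# np.linalg.lstsq solves the normal equations for us.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(scale=1.0, size=50)   # y ~ 3x + 2 + noise

X = np.column_stack([x, np.ones_like(x)])            # add a bias column
(w, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"slope ~ {w:.2f}, intercept ~ {b:.2f}")       # close to 3 and 2
```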

UNIT IV ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9


Combining multiple learners: model combination schemes, voting – Ensemble learning: bagging, boosting, stacking – Unsupervised learning: K-means – Instance-based learning: KNN – Gaussian mixture models and expectation maximization.
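A compact sketch of the K-means topic (Lloyd’s iterations in Python with NumPy; the cluster count and toy data are illustrative):

```python
import numpy as np

# Lloyd's K-means: alternate between assigning each point to its nearest
# centroid and moving each centroid to the mean of its assigned points.
def kmeans(points, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # assignment step: index of the nearest centroid for each point
        dists = np.linalg.norm(points[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: recompute each centroid as its cluster mean
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(5, 1, (30, 2))])
centroids, labels = kmeans(pts, k=2)
print(centroids)   # two centroids, near (0, 0) and (5, 5)
```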

UNIT V NEURAL NETWORKS 9


Perceptron – multilayer perceptron, activation functions, network training – gradient descent optimization – stochastic gradient descent, error backpropagation, from shallow networks to deep networks – unit saturation (aka the vanishing gradient problem) – ReLU, hyperparameter tuning, batch normalization, regularization, dropout.
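A single-perceptron training sketch for the first topic of this unit (Python; the AND-gate data, learning rate, and epoch count are illustrative choices):

```python
# Perceptron learning rule: for each misclassified example, nudge the
# weights and bias toward the correct side of the decision boundary.
def train_perceptron(samples, labels, lr=0.1, epochs=20):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - pred
            w[0] += lr * error * x1       # weight update
            w[1] += lr * error * x2
            b += lr * error               # bias update
    return w, b

# AND gate: output 1 only when both inputs are 1 (linearly separable)
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
w, b = train_perceptron(samples, labels)
print(w, b)  # a weight vector and bias that separate AND correctly
```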
45 PERIODS
PRACTICAL EXERCISES: 30 PERIODS
1. Implementation of Uninformed search algorithms (BFS, DFS)
2. Implementation of Informed search algorithms (A*, memory-bounded A*)
3. Implement naïve Bayes models
4. Implement Bayesian Networks
5. Build Regression models
6. Build decision trees and random forests
7. Build SVM models
8. Implement ensembling techniques
9. Implement clustering algorithms
10. Implement EM for Bayesian networks
11. Build simple NN models
12. Build deep learning NN models

COURSE OUTCOMES:
At the end of this course, the students will be able to:
CO1: Use appropriate search algorithms for problem solving
CO2: Apply reasoning under uncertainty
CO3: Build supervised learning models
CO4: Build ensembling and unsupervised models
CO5: Build deep learning neural network models
TOTAL: 75 PERIODS
TEXT BOOKS:
1. Stuart Russell and Peter Norvig, “Artificial Intelligence – A Modern Approach”, Fourth Edition, Pearson Education, 2021.
2. Ethem Alpaydin, “Introduction to Machine Learning”, Fourth Edition, MIT Press, 2020.

REFERENCES:
1. Dan W. Patterson, “Introduction to Artificial Intelligence and Expert Systems”, Pearson Education, 2007.
2. Kevin Knight, Elaine Rich, and Nair B., “Artificial Intelligence”, McGraw Hill, 2008.
3. Patrick H. Winston, “Artificial Intelligence”, Third Edition, Pearson Education, 2006.
4. Deepak Khemani, “Artificial Intelligence”, Tata McGraw Hill Education, 2013 (http://nptel.ac.in/).
5. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
6. Tom Mitchell, “Machine Learning”, McGraw Hill, 1997.
7. Charu C. Aggarwal, “Data Classification: Algorithms and Applications”, CRC Press, 2014.
8. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”, MIT Press, 2012.
9. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
CO’s-PO’s & PSO’s MAPPING

CO’s                        PO’s                        PSO’s
       1   2   3   4   5   6   7   8   9   10  11  12   1   2   3
1      3   2   3   3   -   -   -   -   1   3   3   3    1   2   2
2      1   1   1   3   1   -   -   -   1   2   1   3    2   3   2
3      2   1   2   1   1   -   -   -   2   1   1   3    1   1   1
4      3   1   3   1   -   -   -   -   2   1   2   1    2   2   2
5      3   1   1   2   2   -   -   -   3   1   2   3    2   1   2
Avg.   2   1   2   2   1   -   -   -   2   2   2   3    2   2   2

1 - low, 2 - medium, 3 - high, ‘-’ - no correlation

CS3492 DATABASE MANAGEMENT SYSTEMS                                 L T P C
                                                                   3 0 0 3
COURSE OBJECTIVES:
• To learn the fundamentals of data models, relational algebra and SQL
• To represent a database system using ER diagrams and to learn normalization techniques
• To understand the fundamental concepts of transaction, concurrency and recovery processing
• To understand the internal storage structures using different file and indexing techniques, which help in physical DB design
• To gain introductory knowledge of distributed databases, NoSQL and database security

UNIT I RELATIONAL DATABASES 10


Purpose of Database System – Views of data – Data Models – Database System Architecture – Introduction to relational databases – Relational Model – Keys – Relational Algebra – SQL fundamentals – Advanced SQL features – Embedded SQL – Dynamic SQL.
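To ground the SQL fundamentals and embedded SQL topics, a minimal sketch using Python’s built-in sqlite3 module (the student table, its columns, and the data are illustrative):

```python
import sqlite3

# Embedded SQL from a host language: SQL statements are issued through
# a database API, with ? placeholders carrying host-program values.
conn = sqlite3.connect(":memory:")          # throwaway in-memory database
cur = conn.cursor()

cur.execute("""CREATE TABLE student (
    roll_no INTEGER PRIMARY KEY,            -- key attribute
    name    TEXT NOT NULL,
    cgpa    REAL
)""")
cur.executemany("INSERT INTO student VALUES (?, ?, ?)",
                [(1, "Asha", 8.9), (2, "Ravi", 7.2), (3, "Mina", 9.1)])

# A relational-algebra selection + projection expressed in SQL
cur.execute("SELECT name, cgpa FROM student WHERE cgpa > ? ORDER BY cgpa DESC",
            (8.0,))
print(cur.fetchall())                       # [('Mina', 9.1), ('Asha', 8.9)]
conn.close()
```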
