
Course Code: 22-382-0101
Course Title: MATHEMATICAL FOUNDATION FOR COMPUTING
Category: CORE    L: 3    T: 1    P: 0    Credit: 4

Prerequisite: Programming Fundamentals


Course Outcomes: After the completion of the course, the student will be able to

CO1  Solve systems of linear equations using various methods. (Cognitive level: Apply)

CO2  Apply various methods to find eigenvalues and eigenvectors. (Cognitive level: Apply)

CO3  Apply Bayes' theorem and various discrete and continuous distributions. (Cognitive level: Apply)

CO4  Apply various optimization techniques for solving real-life problems. (Cognitive level: Understand)

CO5  Apply various techniques for dimensionality reduction and density estimation. (Cognitive level: Apply)

Mapping of Course Outcomes with Programme Outcomes - Low=1, Medium=2, High=3

PO1 PO2 PO3 PO4 PO5 PO6 PO7 PO8 PO9 PO10 PO11 PO12

CO 1 3 3 3 1

CO 2 3 3 3 1

CO 3 3 3 3 1

CO 4 3 3 3 1

CO 5 3 3 3 1

22-382-0101 MATHEMATICAL FOUNDATION FOR COMPUTING

UNIT I (8 Hours)
Linear Algebra - Solving systems of linear equations; vector spaces and subspaces; linear
independence; basis and rank; linear maps - image and kernel; metric spaces and normed spaces;
inner product spaces.
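
A short illustrative sketch of the Unit I material (not part of the official syllabus; NumPy and the example system are arbitrary choices): it solves a small 3x3 linear system and uses the matrix rank to check linear independence of the coefficient matrix's columns.

import numpy as np

# Coefficient matrix and right-hand side of a small system A x = b (example values).
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

# A unique solution exists because A is square and invertible.
x = np.linalg.solve(A, b)
print("solution:", x)                      # approximately [ 2.  3. -1.]

# The columns of A are linearly independent iff rank(A) equals the number of columns.
print("rank of A:", np.linalg.matrix_rank(A))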

UNIT II (10 Hours)


Matrix decompositions - Determinant, eigenvalues and eigenvectors, trace, orthogonal matrices,
diagonalization and symmetric matrices, singular value decomposition; Vector calculus -
differentiation, partial differentiation and gradients, gradients of vector-valued functions.
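
As an illustration of the Unit II decompositions (again a sketch, not syllabus content; the matrix values are made up), the snippet below computes the eigenvalues and eigenvectors of a small symmetric matrix, checks that the trace equals the sum of the eigenvalues, and forms its singular value decomposition with NumPy.

import numpy as np

# A small symmetric matrix (example values).
S = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Eigen-decomposition; eigh is the solver for symmetric/Hermitian matrices.
eigvals, eigvecs = np.linalg.eigh(S)
print("eigenvalues:", eigvals)
print("trace vs. sum of eigenvalues:", np.trace(S), eigvals.sum())

# Singular value decomposition: S = U @ diag(sigma) @ Vt.
U, sigma, Vt = np.linalg.svd(S)
print("singular values:", sigma)
print("reconstruction error:", np.linalg.norm(S - U @ np.diag(sigma) @ Vt))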

UNIT III (12 Hours)


Probability and statistics - Descriptive statistics, basics of probability, joint, marginal and
conditional probability, Bayes' theorem with examples of calculating probability; discrete probability
distributions - binomial, Poisson and multinomial distributions; continuous probability
distributions - normal, exponential and chi-square; problems related to discrete and continuous
probability distributions; testing of hypotheses.
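
The following sketch is illustrative only (the prevalence, sensitivity and specificity figures are made-up numbers): it applies Bayes' theorem to a diagnostic-test example and evaluates one discrete and one continuous probability with scipy.stats.

from scipy import stats

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive).
prior = 0.01          # P(disease)
sensitivity = 0.95    # P(positive | disease)
specificity = 0.90    # P(negative | no disease)
p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
posterior = sensitivity * prior / p_positive
print("P(disease | positive test):", round(posterior, 4))

# Discrete distribution: P(X = 3) for X ~ Binomial(n = 10, p = 0.5).
print("binomial P(X = 3):", stats.binom.pmf(3, n=10, p=0.5))

# Continuous distribution: P(X <= 1.96) for X ~ Normal(0, 1).
print("normal P(X <= 1.96):", stats.norm.cdf(1.96))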

UNIT IV (8 Hours)
Optimization - Optimization using gradient descent; constrained optimization and Lagrange
multipliers; convex optimization; maximum likelihood estimation; least-squares estimation; linear
regression; linear regression as maximum likelihood; least squares and maximum likelihood.
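
As a sketch of the optimization ideas in this unit (synthetic data; the learning rate and iteration count are arbitrary choices, not prescribed by the syllabus), the snippet below fits a least-squares linear regression by gradient descent and compares the result with the closed-form solution.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 1 + 2*x plus Gaussian noise; first column of X is the intercept.
X = np.column_stack([np.ones(50), rng.uniform(0, 10, 50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0.0, 0.5, 50)

# Gradient descent on the mean squared error (1/n) * ||X w - y||^2.
w = np.zeros(2)
learning_rate = 0.01
for _ in range(5000):
    gradient = (2.0 / len(y)) * X.T @ (X @ w - y)
    w -= learning_rate * gradient

print("gradient descent estimate:", w)
print("closed-form least squares:", np.linalg.lstsq(X, y, rcond=None)[0])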

UNIT V (7 Hours)
Dimensionality reduction and density estimation - Feature extraction, feature selection, principal
component analysis, discrete wavelet transform; Gaussian mixture models, Expectation-Maximization
(EM) algorithm.
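
To illustrate the Unit V material (a sketch only; the synthetic 2-D data and the choice of one retained component are assumptions for the example), the snippet below performs principal component analysis via the SVD of the centred data matrix and reports the variance explained by each component.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic correlated 2-D data (200 samples).
X = rng.multivariate_normal([0.0, 0.0], [[3.0, 2.0], [2.0, 2.0]], size=200)

# PCA via the SVD of the centred data matrix.
X_centred = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X_centred, full_matrices=False)

explained = s**2 / np.sum(s**2)      # fraction of variance along each principal axis
Z = X_centred @ Vt[0]                # projection onto the first principal component

print("explained variance ratio:", explained)
print("reduced representation shape:", Z.shape)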

Text Books/References
1. Marc Peter Deisenroth, A. Aldo Faisal and Cheng Soon Ong, "Mathematics for Machine
   Learning", Cambridge University Press, 2020.
2. Jay Davani, "Hands-On Mathematics for Machine Learning", Packt Publishing, 2020.
3. Erwin Kreyszig, "Advanced Engineering Mathematics", 10th Edition, John Wiley & Sons, 2014.
4. David J. C. MacKay, "Information Theory, Inference and Learning Algorithms", Cambridge
   University Press, 2003.

