Note: non-parametric models such as decision trees (DT) don't require feature scaling; for LR, k-NN, and SVM we use scaling (see the sketch below).
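A minimal sketch of that note, assuming scikit-learn and its wine toy dataset: scale-sensitive models (SVM, k-NN, LR) go behind a StandardScaler in a pipeline, while a decision tree splits on raw thresholds and needs none.

    # Sketch: scaling matters for SVM/k-NN/LR, not for tree-based models.
    from sklearn.datasets import load_wine
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_wine(return_X_y=True)

    # SVM: fit the scaler inside the pipeline so it is refit per CV fold.
    svm = make_pipeline(StandardScaler(), SVC())
    # Decision tree: split thresholds are scale-invariant, no scaler needed.
    tree = DecisionTreeClassifier(random_state=0)

    print("SVM + scaling:", cross_val_score(svm, X, y, cv=5).mean())
    print("decision tree:", cross_val_score(tree, X, y, cv=5).mean())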
https://vitalflux.com/interns-machine-learning-interview-questions-with-answers-set-1/
https://skillvalue.com/en/quiz/machine-learning/
Day 1:
• Basic terminology:
a. Most common settings: Supervised, unsupervised, semi-supervised,
and reinforcement learning.
b. Most common problems: Classification (binary & multiclass),
Regression, Clustering.
c. Preprocessing of data: Data normalization.
• Concepts of hypothesis sets, empirical error, true error, complexity of
hypothesis sets, regularization, bias-variance trade-off, loss
functions, cross-validation (a short cross-validation sketch follows
below).
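A minimal cross-validation sketch, assuming scikit-learn and a synthetic sine target: the polynomial degree stands in for hypothesis-set complexity, so the cross-validated error exposes the bias-variance trade-off (degree 1 underfits, degree 10 overfits).

    # Sketch: cross-validation as an estimate of true error, with model
    # complexity (polynomial degree) driving the bias-variance trade-off.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X).ravel() + rng.normal(scale=0.3, size=100)  # noisy target

    for degree in (1, 3, 10):
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        # Mean squared error averaged over 5 held-out folds.
        mse = -cross_val_score(model, X, y, cv=5,
                               scoring="neg_mean_squared_error").mean()
        print(f"degree {degree:2d}: CV mean squared error = {mse:.3f}")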
Day 2:
• Optimization basics:
a. Terminology & Basic concepts: Convex optimization,
Lagrangian, Primal-dual problems, Gradients &
subgradients, ℓ1 and ℓ2 regularized objective
functions.
b. Algorithms: Batch gradient descent & stochastic gradient
descent, Coordinate descent.
c. Implementation: Write code for stochastic gradient descent
for a simple objective function, tune the step size, and get an
intuition of the algorithm.
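One possible version of that exercise, sketched in plain NumPy on a synthetic ℓ2-regularized least-squares objective (all names and constants here are illustrative); note how too large a step diverges, which is the point of tuning it.

    # Sketch: SGD on f(w) = (1/2n) sum_i (x_i.w - y_i)^2 + (lam/2)||w||^2
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 5
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + rng.normal(scale=0.1, size=n)

    def sgd(X, y, lam=0.01, step=0.05, epochs=50):
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for i in rng.permutation(len(y)):   # one random sample per update
                grad = (X[i] @ w - y[i]) * X[i] + lam * w
                w -= step * grad                # gradient step
        return w

    for step in (0.5, 0.05, 0.001):             # tune the step size
        w = sgd(X, y, step=step)
        print(f"step {step}: loss {0.5 * np.mean((X @ w - y) ** 2):.4f}")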
Day 3:
• Classification:
a. Logistic Regression
b. Support vector machines: Geometric intuition, primal-dual
formulations, notion of support vectors, kernel trick,
understanding of hyperparameters, grid search (see the sketch
after this list).
c. Online tool for SVM: Play with this online SVM tool (scroll
down to “Graphic Interface”) to get some intuition of the
algorithm.
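A grid-search sketch for the hyperparameters above, assuming scikit-learn and its breast-cancer toy dataset: C (soft-margin penalty) and the RBF kernel width gamma are the usual knobs, and the fitted model exposes its support vectors.

    # Sketch: cross-validated grid search over SVM hyperparameters.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    grid = GridSearchCV(
        make_pipeline(StandardScaler(), SVC(kernel="rbf")),
        param_grid={"svc__C": [0.1, 1, 10],
                    "svc__gamma": ["scale", 0.01, 0.1]},
        cv=5,
    )
    grid.fit(X_tr, y_tr)
    print("best params:", grid.best_params_)
    print("test accuracy:", grid.score(X_te, y_te))
    # The fitted SVC exposes its support vectors per class:
    print("support vectors per class:", grid.best_estimator_["svc"].n_support_)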
Day 4:
• Regression:
a. Ridge regression
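Ridge regression has a closed form worth deriving once: minimizing ||Xw - y||^2 + lam*||w||^2 gives w* = (X^T X + lam*I)^{-1} X^T y. A quick NumPy sketch on made-up data:

    # Sketch: ridge regression via the regularized normal equations.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

    lam = 1.0
    d = X.shape[1]
    # lam * I keeps the system well-conditioned even when X^T X is
    # (near-)singular, which is the practical reason to regularize.
    w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
    print("ridge weights:", w)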
• Clustering:
a. k-means & Expectation-Maximization algorithm (a k-means sketch
follows below).
b. Top-down and bottom-up hierarchical clustering.
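A from-scratch k-means sketch in NumPy, written as the alternating (EM-style) assignment/update loop, on two synthetic blobs:

    # Sketch: k-means as alternating assignment (E-like) and mean
    # updates (M-like) until the centers stabilize.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)),   # blob 1
                   rng.normal(5, 1, (50, 2))])  # blob 2

    k = 2
    centers = X[rng.choice(len(X), k, replace=False)]   # random init
    for _ in range(10):
        # E-like step: assign each point to its nearest center.
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        # M-like step: recompute each center as the mean of its points.
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])

    print("centers:\n", centers)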
Day 5:
• Bayesian methods:
a. Basic terminology: Priors, posteriors, likelihood, maximum
likelihood estimation and maximum-a-posteriori inference.
b. Gaussian Mixture Models (see the sketch after this list).
c. Latent Dirichlet Allocation: The generative model and basic
idea of parameter estimation.
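For the GMM item, a minimal sketch assuming scikit-learn, whose GaussianMixture fits the mixture by EM and exposes the posterior responsibilities per point:

    # Sketch: fitting a 2-component Gaussian Mixture Model by EM.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 2)),
                   rng.normal(4, 1, (100, 2))])

    gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
    print("means:\n", gmm.means_)
    print("mixing weights:", gmm.weights_)
    # Posterior responsibility of each component for the first point:
    print("responsibilities:", gmm.predict_proba(X[:1]))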
Day 6:
• Graphical models:
a. Basic terminology: Bayesian networks, Markov networks /
Markov random fields.
b. Inference algorithms: Variable elimination, Belief
propagation.
c. Simple examples: Hidden Markov Models, Ising model (a
forward-algorithm sketch follows below).
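For the HMM example, a NumPy sketch of the forward algorithm (belief propagation specialized to a chain), computing the likelihood of an observation sequence; all probabilities here are made up.

    # Sketch: HMM forward algorithm on a 2-state chain.
    import numpy as np

    pi = np.array([0.6, 0.4])          # initial state distribution
    A = np.array([[0.7, 0.3],          # transitions A[i, j] = P(j | i)
                  [0.4, 0.6]])
    B = np.array([[0.9, 0.1],          # emissions B[i, o] = P(o | i)
                  [0.2, 0.8]])
    obs = [0, 1, 1, 0]                 # observed symbol sequence

    # alpha[i] = P(observations so far, current state = i)
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate one step, then emit

    print("P(observation sequence) =", alpha.sum())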
Days 7–8:
• Neural Networks:
a. Basic terminology: Neuron, Activation function, Hidden layer.
b. Convolutional neural networks: Convolutional layer, pooling
layer, Backpropagation.
c. Memory-based neural networks: Recurrent Neural Networks,
Long Short-Term Memory (LSTM).
d. Tutorials: I'm familiar with this Torch tutorial (you'll want to
look at the 1_supervised directory). There might be other
tutorials in other deep learning frameworks (a minimal PyTorch
sketch follows below).
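As one alternative to the (Lua) Torch tutorial, a minimal PyTorch sketch, assuming PyTorch is installed: one training step with a convolutional layer, activation, pooling, and backpropagation via autograd, on fake data.

    # Sketch: one SGD step of a tiny CNN on a fake batch of 28x28 images.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1),  # convolutional layer
        nn.ReLU(),                                  # activation function
        nn.MaxPool2d(2),                            # pooling layer
        nn.Flatten(),
        nn.Linear(8 * 14 * 14, 10),                 # hidden/output layer
    )
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(32, 1, 28, 28)                  # fake image batch
    y = torch.randint(0, 10, (32,))                 # fake labels

    loss = loss_fn(model(x), y)
    opt.zero_grad()
    loss.backward()                                 # backpropagation
    opt.step()
    print("loss:", loss.item())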
Day 9:
• Miscellaneous topics:
a. Decision trees
b. Recommender systems
c. Markov decision processes
d. Multi-armed bandits
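For the bandit item, a minimal epsilon-greedy sketch in NumPy (the arm means are made up): the agent explores with probability eps and otherwise exploits its running reward estimates.

    # Sketch: epsilon-greedy strategy for a 3-armed Bernoulli bandit.
    import numpy as np

    rng = np.random.default_rng(0)
    true_means = np.array([0.2, 0.5, 0.7])   # unknown to the agent
    eps = 0.1
    counts = np.zeros(3)
    values = np.zeros(3)                     # running mean reward per arm

    for t in range(5000):
        # Explore with probability eps, otherwise pull the best estimate.
        arm = rng.integers(3) if rng.random() < eps else values.argmax()
        reward = rng.random() < true_means[arm]
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

    print("estimated means:", values.round(3))
    print("pulls per arm:  ", counts)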
Day 10: (Budget day)
• You can use the last day to catch up on anything left from previous days,
or learn more about whatever topic you found most interesting / useful
for your future work.