CS L06 MachineLearning AnomalyDetection
Session: 06
Title: Machine Learning for Anomaly Detection
Agenda
• Anomaly Detection
• Machine Learning Algorithms for Anomaly Detection
• Association Rules
• Artificial Neural Network
• Random Forest Approach
• Clustering Approach
• Deep Learning Techniques
• Ref: https://ff12.fastforwardlabs.com
Anomaly Detection
Global Outliers
Supervised Learning
• Sequence-to-sequence models are neural networks designed to learn mappings between data that are represented as sequences.
• Each token in a sequence may have some form of temporal dependence on other tokens, i.e., a relationship that has to be modelled to achieve correct results.
• Example: Task of language translation where a sequence of words in one language needs to be
mapped to a sequence of words in a different language.
• To excel at such a task, a model must take into consideration the location of each word/token within the broader sentence to generate an appropriate translation.
• The model has an encoder that generates a hidden representation of the input tokens, and a decoder that takes in the encoder representation and sequentially generates a set of output tokens (see the sketch after this list).
• The encoder and decoder are composed of long short-term memory (LSTM) blocks, which are particularly suitable for modelling temporal relationships within the input data tokens.
• Sequence-to-sequence models can be slow during inference: each token in the model output is generated sequentially at each time step, so the total number of steps equals the length of the output sequence.
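The sketch below is a minimal illustration of the encoder-decoder idea described above, assuming PyTorch; the vocabulary size, embedding size, and hidden size are illustrative placeholders, not values from the slides. The encoder LSTM compresses the input sequence into a hidden state, and the decoder LSTM generates the output sequence conditioned on that state.

```python
# Minimal LSTM encoder-decoder (sequence-to-sequence) sketch.
# Assumptions: PyTorch; vocab_size/embed_dim/hidden_dim are illustrative only.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Encoder: produces a hidden representation of the input tokens.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Decoder: generates output tokens step by step, initialised with the
        # encoder's final hidden state.
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src, tgt):
        # src, tgt: (batch, seq_len) integer token ids
        _, (h, c) = self.encoder(self.embed(src))       # encoder hidden state
        dec_out, _ = self.decoder(self.embed(tgt), (h, c))
        return self.out(dec_out)                         # (batch, tgt_len, vocab_size)

# Toy usage: a batch of 2 input sequences of length 5 mapped to outputs of length 6.
model = Seq2Seq()
src = torch.randint(0, 1000, (2, 5))
tgt = torch.randint(0, 1000, (2, 6))
logits = model(src, tgt)    # teacher-forced training pass
print(logits.shape)         # torch.Size([2, 6, 1000])
```

Note that this sketch uses teacher forcing (the full target sequence is fed to the decoder in one call); at inference time the decoder would instead be run one token at a time, which is the source of the slowness mentioned in the last bullet above.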