Week 01

The document provides information about a machine learning course including the course number, credits, instructor, topics to be covered, assessment details, and introductions to key concepts like machine learning, deep learning, applications, supervised learning, unsupervised learning, and reinforcement learning. It defines common terms like features, examples, targets, and outlines the typical workflow for supervised learning problems.


EC-452 Machine Learning
Fall 2023
COURSE INFORMATION
Course Number and Title: EC-452 Machine Learning

Credits: 3-0

Instructor(s)-in-charge: Dr. Ahmad Rauf Subhani (Assistant Prof)

Course type: Lecture

Required or Elective: Elective

Course pre-requisites: Math-361 Probability and Statistics (preferred)

Degree and Semester: DE-42 (Electrical), Semester 7

Month and Year: Fall 2023


Assessment
Course Assessment
Exam: 1 Midterm and 1 Final Examination
Assignment: -------
Quiz: 6 Quizzes
Grading: Quiz: 10-15%
Assignments: 5-10%
Mid Semester Exam: 30-35%
Project: 0-10%
End Semester Exam: 40-50%
Topics covered in the Course
Introduction to Machine Learning
• "Machine learning is the field of study that gives computers the ability to learn without being explicitly programmed." — Arthur L. Samuel, AI pioneer, 1959

• "A breakthrough in machine learning would be worth ten Microsofts." — Bill Gates, Microsoft Co-Founder
Introduction to Machine Learning
• Machine Learning
• Deep Learning
• Artificial Intelligence
Introduction to Machine Learning
• Machine Learning is a tool.
• Like any other tool, it is important to read and understand its user manual.
• What are some other daily life tools?

• Do we need a user manual for a pen or a tyre?


Applications of Machine Learning
• Email spam detection
• Face detection and matching (e.g., iPhone X)
• Web search (e.g., DuckDuckGo, Bing, Google)
• Sports predictions
• Post office (e.g., sorting letters by zip codes)
• ATMs (e.g., reading checks)
• Credit card fraud detection
• Drug design
• Medical diagnoses
• Smart assistants (Apple Siri, Amazon Alexa, ...)
• Product recommendations (e.g., Netflix, Amazon)
• Self-driving cars (e.g., Uber, Tesla)
• Language translation (Google Translate)
• Sentiment analysis
• ChatGPT and Google Bard
• The list goes on…
Exercise
• As we proceed through the class, it is a good exercise to think about how machine learning could be applied to the problem areas or tasks listed above:

What is the desired outcome?
What could the dataset look like?
Is this a supervised or unsupervised problem, and what algorithms would you use? (Something to revisit later in this semester.)
How would you measure success?
What are potential challenges or pitfalls?
Common Understanding
• Feature:
• A measurable property of the object (data) you're trying to analyze.
• In datasets, features appear as columns.
• Synonyms: feature variable, attribute, measurement, dimension
• Examples/Samples:
• The entries across the feature columns
• In datasets, examples/samples (also called instances or observations) appear as rows.
• Target, synonymous with:
• outcome, ground truth, response variable, dependent variable, (class) label (in classification)
• Output/Prediction:
• Use this to distinguish from targets; here, it means output from the model.
Common Understanding
• Classification
• A process of categorizing a given set of data (feature or example?) into classes.
• The classes are often referred to as targets, labels or categories.
(Illustration: two classes plotted against feature variables x1 and x2.)
• Regression
• A technique for investigating the relationship between independent variables or features and a dependent variable or outcome. It is used as a method for predictive modelling in machine learning, in which an algorithm is used to predict continuous outcomes.
(Illustration: a fitted line relating feature x to outcome y.)
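As a concrete sketch of the regression idea above, one can fit a line to toy data by ordinary least squares. The numbers, variable names, and NumPy usage below are illustrative assumptions, not course material:

```python
import numpy as np

# Fit y = w*x + b by ordinary least squares on made-up data
# that roughly follows y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.1, 4.9, 7.2, 9.0])

# Design matrix: the feature column plus a bias column of ones.
X = np.column_stack([x, np.ones_like(x)])
w, b = np.linalg.lstsq(X, y, rcond=None)[0]

print(round(w, 1), round(b, 1))  # slope close to 2, intercept close to 1
```

The learned h(x) = w*x + b then predicts a continuous outcome for any new x.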
Categories of Machine Learning

Supervised Learning
 Labelled data
 Direct feedback
 Predict outcome/future

Unsupervised Learning
 No labels/targets
 No feedback
 Find hidden structure in data

Reinforcement Learning
 Decision process
 Reward system
 Learn series of actions

Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.


Supervised Learning Workflow
(Diagram: Training Data and Labels feed a Machine Learning Algorithm, which produces a Predictive Model; New Data passed through the Predictive Model yields a Prediction.)
Supervised Learning
• Learning from labeled training data
• Inputs that also contain the desired outputs or targets; basically, "examples" of what we want to predict.
(Illustration of a binary classification problem (plus, minus) and two feature variables (x1 and x2). Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
Supervised Learning
(Illustration of a linear regression model with one feature (predictor) variable (x1) and the target (response) variable y. The dashed line indicates the functional form of the linear regression model. Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
Unsupervised Learning
• Unsupervised learning is concerned with unlabelled data
• Common tasks in unsupervised learning are clustering analysis (assigning group memberships) and dimensionality reduction (compressing data onto a lower-dimensional subspace or manifold)
(Illustration of clustering: dashed lines indicate potential group membership assignments of unlabeled data points. Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
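Clustering can be sketched with a minimal k-means (Lloyd's algorithm) loop. The two Gaussian blobs, the deterministic initialization, and the iteration count below are toy choices for illustration, not from the course:

```python
import numpy as np

# Minimal k-means sketch: group unlabeled 2-D points into k clusters.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0.0, 0.3, (20, 2)),   # blob near (0, 0)
                 rng.normal(5.0, 0.3, (20, 2))])  # blob near (5, 5)

k = 2
centers = pts[[0, -1]]                 # simple deterministic initialization
for _ in range(10):
    # Assignment step: each point joins its nearest center.
    dists = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each center moves to the mean of its assigned points.
    centers = np.array([pts[labels == j].mean(axis=0) for j in range(k)])

print(np.sort(centers[:, 0]))          # centers end up near x = 0 and x = 5
```

Note that no labels were used anywhere: the group structure is discovered from the data alone, which is exactly the unsupervised setting.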
Unsupervised learning
• Dimensionality reduction
Reinforcement Learning
• The process of learning from rewards while performing a series of actions
• We do not tell the learner, for example, a (ro)bot, which action to take
• But merely assign a reward to each action and/or the overall outcome.
• Instead of having a "correct/false" label for each step, the learner must discover or learn a behavior that maximizes the reward for a series of actions.
• Not a supervised setting and somewhat related to unsupervised learning
(Illustration of reinforcement learning. Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
Common Understanding (Jargons)
• Feature:
• A measurable property of the object (data) you're trying to analyze.
• In datasets, features appear as columns.
• Synonyms: predictor, variable, independent variable, input, attribute, covariate
• Examples/Samples (of training and testing):
• The entries across the feature columns
• In datasets, examples/samples appear as rows.
• Synonyms: observation, training record, training instance, training sample (in some contexts, sample refers to a collection of training examples)
• Target, synonymous with:
• outcome, ground truth, output, response variable, dependent variable, (class) label (in classification)
• Output/Prediction: use this to distinguish from targets; here, it means output from the model.
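A tiny dataset can make these terms concrete. A minimal sketch with NumPy; the numbers and the two-feature setup are hypothetical:

```python
import numpy as np

# Rows are examples (samples), columns are features,
# and y holds one target (class label) per example.
X = np.array([[5.1, 3.5],    # example 1: two feature values
              [4.9, 3.0],    # example 2
              [6.2, 3.4]])   # example 3
y = np.array([0, 0, 1])      # targets (ground truth)

m, n = X.shape               # m examples, n features
print(m, n)                  # 3 examples, 2 features
```

A model's output for these rows would be the predictions, kept distinct from the targets in y.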
Common Understanding (Jargons)
• Identify the features and examples in the following data (table shown on the slide).
Common Understanding (Jargons)
• Supervised learning:
• Learn a function to map input x (features) to output y (targets)

• Structured data:
• Databases, spreadsheets/CSV files

• Unstructured data:
• Features like image pixels, audio signals, text sentences (before DL, extensive feature engineering was required)
Common Understanding (Jargons)
• Unstructured data
Supervised Learning
A Roadmap for Building Machine Learning Systems
(Diagram, after Raschka & Mirjalili:
• Preprocessing: feature extraction and scaling, feature selection, dimensionality reduction, sampling (mostly not needed in DL); raw data and labels are split into a training dataset and a test dataset.
• Learning: a learning algorithm is fit on the training dataset, guided by model selection, cross-validation, performance metrics, and hyperparameter optimization.
• Evaluation: the learned model is evaluated on the held-out test dataset and its labels, yielding the final model.
• Prediction: the final model maps new data to predictions.)
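The split/learn/evaluate part of this roadmap can be sketched in a few lines. The synthetic dataset, the 75/25 split, and the 1-nearest-neighbor predictor below are arbitrary choices made only to keep the sketch short:

```python
import numpy as np

# Synthetic data: label is 1 when x0 + x1 > 0, else 0.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Preprocessing/sampling: shuffle, hold out 25% as the test dataset.
idx = rng.permutation(len(X))
train, test = idx[:75], idx[75:]

def predict(x):
    # A 1-nearest-neighbor "model": label of the closest training example.
    d = np.linalg.norm(X[train] - x, axis=1)
    return y[train][d.argmin()]

# Evaluation: accuracy on the held-out test dataset.
acc = np.mean([predict(x) == t for x, t in zip(X[test], y[test])])
print(acc)
```

The key point of the roadmap survives even in this toy form: the model is judged on data it never saw during learning.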
Supervised Learning (Notation)

Training set: 𝒟 = {⟨x[i], y[i]⟩, i = 1, …, m}   ("training examples")

Unknown function: f(x) = y

Hypothesis: h(x) = ŷ   (the model's prediction, to be distinguished from the target y)

Classification: h : ℝⁿ → 𝒴, 𝒴 = {1, ..., k}
Regression: h : ℝⁿ → ℝ

(Here m is the number of training examples and n the number of features.)
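In code, a hypothesis is just a function from feature vectors to outputs. A minimal sketch with NumPy; the weight values are arbitrary and only serve to show the two signatures:

```python
import numpy as np

# Arbitrary illustrative weights for a linear hypothesis.
w = np.array([0.5, -0.2])

def h_regression(x):
    return float(w @ x)               # h : R^n -> R

def h_classification(x):
    return 1 if w @ x > 0 else 0      # h : R^n -> {0, 1}

x = np.array([2.0, 1.0])              # one feature vector x in R^2
print(h_regression(x))                # 0.8
print(h_classification(x))            # 1
```

Learning then amounts to choosing the parameters (here w) so that h approximates the unknown f on the training set.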
Data Representation

Feature vector (one example with n features):

        ⎡ x1 ⎤
    x = ⎢ x2 ⎥ ∈ ℝⁿ
        ⎢ ⋮  ⎥
        ⎣ xn ⎦

Design matrix (the m training examples stacked as rows, row i being x[i]ᵀ):

        ⎡ x1[1]  x2[1]  ⋯  xn[1] ⎤
    X = ⎢ x1[2]  x2[2]  ⋯  xn[2] ⎥ ∈ ℝ^(m×n)
        ⎢   ⋮      ⋮    ⋱    ⋮   ⎥
        ⎣ x1[m]  x2[m]  ⋯  xn[m] ⎦

Data Representation (structured data)
(Table shown on the slide. Exercise: determine m, the number of examples, and n, the number of features.)
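The m-by-n layout can be checked directly in code. A small sketch with hypothetical numbers, assuming NumPy:

```python
import numpy as np

# Stacking m feature vectors (each in R^n) as rows gives
# an m x n design matrix X.
x_1 = np.array([1.0, 2.0, 3.0])   # x[1]: first example, n = 3 features
x_2 = np.array([4.0, 5.0, 6.0])   # x[2]: second example
X = np.vstack([x_1, x_2])         # rows = examples, columns = features

m, n = X.shape
print(m, n)        # m = 2 examples, n = 3 features
print(X[1, 0])     # feature 1 of example 2, i.e. x1[2] = 4.0
```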
Hypothesis Space
(Illustration: nested sets, from largest to smallest.)
• Entire hypothesis space
• Hypothesis space a particular learning algorithm category has access to
• Hypothesis space a particular learning algorithm can sample
• Particular hypothesis (i.e., a model/classifier)
Classes of Machine Learning Algorithms
Below are some classes of algorithms that we are going to discuss in
this class:
• Generalized linear models (e.g., logistic regression)
• Support vector machines (e.g., linear SVM, RBF-kernel SVM)
• Artificial neural networks (e.g., multi-layer perceptrons)
• Tree- or rule-based models (e.g., decision trees)
• Graphical models (e.g., Bayesian networks)
• Ensembles (e.g., Random Forest)
• Instance-based learners (e.g., K-nearest neighbors)
Algorithm Categorization Schemes
• Eager vs lazy learners
• Eager learners process training data immediately
• Lazy learners defer the processing step until prediction time, e.g., the nearest neighbor algorithm.
• Batch vs online learning
• In batch learning, the model is learned on the entire set of training examples.
• Online learners, in contrast, learn from one training example at a time.
• It is common, in practical applications, to learn a model via batch learning and then update it later using
online learning.
• Parametric vs nonparametric models
• Parametric models are “fixed” models, where we assume a certain functional form for f(x) = y. For example, linear regression with h(x) = w1x1 + ... + wnxn + b.
• Nonparametric models are more “flexible” and do not have a prespecified number of parameters. In fact, the number of parameters typically grows with the size of the training set. For example, a decision tree would be an example of a nonparametric model, where each decision node (e.g., a binary “True/False” assertion) can be regarded as a parameter.
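The batch-vs-online distinction above can be illustrated with a toy online learner: a parametric linear model updated by one stochastic-gradient step per incoming example. The learning rate, data stream, and pass count are arbitrary choices for this sketch:

```python
# Online learning sketch: update w and b one example at a time.
w, b, lr = 0.0, 0.0, 0.1

stream = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]   # data follows y = 2x + 1
for _ in range(200):                            # several passes over the stream
    for x, y in stream:
        err = (w * x + b) - y                   # prediction error on this example
        w -= lr * err * x                       # gradient step on the weight
        b -= lr * err                           # gradient step on the bias

print(round(w, 1), round(b, 1))                 # converges to w = 2.0, b = 1.0
```

A batch learner would instead fit w and b from all examples at once (e.g., by least squares); the common practical pattern noted above is to do exactly that first, then keep updating online.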
Algorithm Categorization Schemes
• Discriminative vs generative
• Generative models (classically) describe methods that model the joint distribution P(X, Y) = P(Y)P(X|Y) = P(X)P(Y|X) for training pairs ⟨x[i], y[i]⟩.
• Discriminative models take a more “direct” approach, modeling P(Y|X) directly.
• While generative models typically provide more insight and allow sampling from the joint distribution, discriminative models are typically easier to compute and produce more accurate predictions.
• Discriminative modeling is like trying to extract information from text in a foreign language without learning that language.
• Generative modeling is like generating text in a foreign language.
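The generative route to P(Y|X) can be shown with a toy calculation: model P(Y) and P(X|Y), then invert with Bayes' rule. All numbers below are made up for illustration (a hypothetical spam filter keyed on one word):

```python
# Generative pieces: a prior P(Y) and a class-conditional P(X|Y).
p_spam = 0.4                 # P(Y = spam)
p_word_given_spam = 0.7      # P(word appears | spam)
p_word_given_ham = 0.1       # P(word appears | ham)

# Marginal P(X) by the law of total probability.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Bayes' rule gives the quantity a discriminative model targets directly.
p_spam_given_word = p_word_given_spam * p_spam / p_word

print(round(p_spam_given_word, 3))   # 0.824
```

A discriminative model would estimate P(spam | word) from data without ever modeling P(Y) or P(X|Y) separately.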
