
Artificial Intelligence

Dr. Eng. Wajdi SAADAOUI


Assistant Professor, ENIT, ENSI, ISSAT, Higher Education Ministry, Tunisia

Outline
1. Artificial Intelligence (AI): from perception to reasoning
2. How to design and use a Machine Learning (ML) model?
3. Machine Learning Techniques: A Brief Review & Comparison
4. Neural Network: Theory and Application
5. Naïve Bayes: Theory and Application
6. Support Vector Machines (SVM): Theory and Application
7. How to select the appropriate Machine Learning technique?
8. How to evaluate Machine Learning Performance?
1. Artificial Intelligence (AI): from perception to reasoning

Each natural ability has an artificial counterpart:

    Intelligence (who has it)        Artificial Intelligence
    Perception (living beings)    -> Image Processing
    Optimization (living beings)  -> Bio-Inspired Optimization
    Learning (baby, animal, etc.) -> Machine Learning
    Reasoning (human)             -> Fuzzy Logic
2. How to design and use a Machine Learning (ML) model?

Training phase:
Image Database (dataset labeled Class 1 / Class 2) -> Preprocessing -> Features Representation (train features) -> Classification -> Model F

Testing phase:
Unknown Image -> Preprocessing -> Features Representation -> Unknown Vector X -> Projection on the model:
F(X | X = "I")  = P1
F(X | X = "II") = P2
Decision: max(P1, P2)
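A minimal Python sketch of this two-phase pipeline (the preprocess and extract_features helpers and the nearest-centroid stand-in for Model F are illustrative assumptions, not part of the course):

    import numpy as np

    def preprocess(image):
        # Hypothetical preprocessing: scale pixel values to [0, 1].
        return image.astype(float) / 255.0

    def extract_features(image):
        # Hypothetical feature representation: mean and standard deviation.
        return np.array([image.mean(), image.std()])

    # --- Training phase: learn a model F from a labeled image database ---
    train_images = [np.random.randint(0, 256, (8, 8)) for _ in range(20)]
    train_labels = np.array([1] * 10 + [2] * 10)          # Class 1, Class 2
    feats = np.array([extract_features(preprocess(im)) for im in train_images])

    # "Model F" here is simply one centroid per class (a toy classifier).
    centroids = {c: feats[train_labels == c].mean(axis=0) for c in (1, 2)}

    # --- Testing phase: project an unknown image X on the model ---
    unknown = np.random.randint(0, 256, (8, 8))
    x = extract_features(preprocess(unknown))
    # F(X | X = "I") = P1, F(X | X = "II") = P2: score each class, keep the max.
    scores = {c: -np.linalg.norm(x - mu) for c, mu in centroids.items()}
    print("Predicted class:", max(scores, key=scores.get))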
3. Machine Learning Techniques: A Brief Review & Comparison

Machine Learning
- Supervised Learning
  - Similarity based: Euclidean Distance, Cosine Distance
  - Probability based: Naïve Bayes
  - Boundary Decision based:
    - Support Vector Machines
    - Neural Network: Single Hidden Layer, Multi Layer Perceptron (MLP),
      Deep Learning (CNN, RNN, Autoencoder)
- Unsupervised Learning
- Reinforcement Learning
4. Neural Network: Theory and Application

Example architecture: 3 x 5 x 5 x 2 (3 inputs, two hidden layers of 5 neurons each, 2 outputs)

[x, y, z]: Input Vector
W1: Weight Matrix of the Input Layer
W2: Weight Matrix of the Hidden Layer
W3: Weight Matrix of the Output Layer
F: Activation Function
C1: Class Output 1
C2: Class Output 2
The forward pass is a chain of matrix products, (N x M) · (M x P) = (N x P):

(1x3 input) · (3x5 W1) = 1x5 -> F -> (1x5) · (5x5 W2) = 1x5 -> F -> (1x5) · (5x2 W3) = 1x2 -> F -> [P1, P2]

To classify a 1x3 input, the model outputs F(X | X = "I") = P1 and F(X | X = "II") = P2, and the decision is max(P1, P2).

    W1 (3x5):  w11  w12  w13  w14  w15
               w21  w22  w23  w24  w25
               w31  w32  w33  w34  w35

    W2 (5x5):  w'11 ... w'15
               ...
               w'51 ... w'55

    W3 (5x2):  w''11 w''12
               ...
               w''51 w''52
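The dimension chain can be verified directly; a minimal NumPy sketch of the 3x5x5x2 forward pass, assuming random initial weights and the logistic activation introduced on the following slides:

    import numpy as np

    def F(a):
        # Logistic activation, applied elementwise.
        return 1.0 / (1.0 + np.exp(-a))

    rng = np.random.default_rng(0)
    x  = rng.standard_normal((1, 3))   # input vector [x, y, z], shape 1x3
    W1 = rng.standard_normal((3, 5))   # input-layer weights,  3x5
    W2 = rng.standard_normal((5, 5))   # hidden-layer weights, 5x5
    W3 = rng.standard_normal((5, 2))   # output-layer weights, 5x2

    h1 = F(x @ W1)       # (1x3)(3x5) = 1x5
    h2 = F(h1 @ W2)      # (1x5)(5x5) = 1x5
    p  = F(h2 @ W3)      # (1x5)(5x2) = 1x2 -> [P1, P2]
    print("Class:", "C1" if p[0, 0] > p[0, 1] else "C2")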
F is a nonlinear activation function: it inserts a nonlinear representation into the neural network (without it, the stacked layers would collapse into a single linear map).
Worked example: Train Vector = [2, -1]; Train Label = 1

Logistic function: logistic(x) = 1 / (1 + e^(-x))

Step 1: Weights' Initialization

The example network is 2 inputs (x1 = 2, x2 = -1) -> 2 hidden neurons (h1, h2) -> 2 hidden neurons (g1, g2) -> 1 output (y), with initial weights:

Layer 1: w(x1->h1) = 0.5,  w(x2->h1) = 1.5,  w(x1->h2) = -1,  w(x2->h2) = -2
Layer 2: w(h1->g1) = 1,    w(h2->g1) = 3,    w(h1->g2) = -1,  w(h2->g2) = -4
Layer 3: w(g1->y)  = 1,    w(g2->y)  = -3
Step 2: Forward Pass

h1 = logistic(0.5 * 2 + 1.5 * (-1))      = logistic(-0.5)   = 0.378
h2 = logistic((-1) * 2 + (-2) * (-1))    = logistic(0)      = 0.5
g1 = logistic(1 * 0.378 + 3 * 0.5)       = logistic(1.878)  = 0.867
g2 = logistic((-1) * 0.378 + (-4) * 0.5) = logistic(-2.378) = 0.085
y  = logistic(1 * 0.867 + (-3) * 0.085)  = logistic(0.612)  = 0.648
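These values can be checked with a few lines of Python (a verification sketch, not part of the original slides):

    import math

    def logistic(a):
        return 1.0 / (1.0 + math.exp(-a))

    x1, x2 = 2, -1
    h1 = logistic(0.5 * x1 + 1.5 * x2)     # logistic(-0.5)   = 0.378
    h2 = logistic(-1 * x1 + -2 * x2)       # logistic(0)      = 0.5
    g1 = logistic(1 * h1 + 3 * h2)         # logistic(1.878)  = 0.867
    g2 = logistic(-1 * h1 + -4 * h2)       # logistic(-2.378) = 0.085
    y  = logistic(1 * g1 + -3 * g2)        # logistic(0.612)  = 0.648
    print(round(h1, 3), round(h2, 3), round(g1, 3), round(g2, 3), round(y, 3))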
Step 3: Backward Pass

Output error: Δy = 1 - 0.648 = 0.352

Each earlier Δ is the neuron's logistic derivative v * (1 - v) times the weighted sum of the next layer's Δs:

Δg1 = 0.867 * (1 - 0.867) * (1 * 0.352)                      = 0.041
Δg2 = 0.085 * (1 - 0.085) * ((-3) * 0.352)                   = -0.082
Δh1 = 0.378 * (1 - 0.378) * [1 * 0.041 + (-1) * (-0.082)]    = 0.029
Δh2 = 0.5 * (1 - 0.5) * [3 * 0.041 + (-4) * (-0.082)]        = 0.113
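The same deltas in code, starting from the rounded forward-pass values above:

    # Using the slide's (rounded) forward-pass values.
    h1, h2, g1, g2, y = 0.378, 0.5, 0.867, 0.085, 0.648
    label = 1
    d_y  = label - y                                  # 0.352
    d_g1 = g1 * (1 - g1) * (1 * d_y)                  # ≈ 0.041
    d_g2 = g2 * (1 - g2) * (-3 * d_y)                 # ≈ -0.082
    d_h1 = h1 * (1 - h1) * (1 * d_g1 + (-1) * d_g2)   # ≈ 0.029
    d_h2 = h2 * (1 - h2) * (3 * d_g1 + (-4) * d_g2)   # ≈ 0.113
    print(round(d_y, 3), round(d_g1, 3), round(d_g2, 3),
          round(d_h1, 3), round(d_h2, 3))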
Step 4: Weights' Update (Learning Rate α = 0.1)

Update rule: new weight = old weight + α * (value of the input neuron) * (Δ of the next neuron)

Layer 1:
0.5 -> 0.5 + 0.1 * 2 * 0.029 = 0.506
-1  -> -1 + 0.1 * 2 * 0.113 = -0.977
1.5 -> 1.5 + 0.1 * (-1) * 0.029 = 1.497
-2  -> -2 + 0.1 * (-1) * 0.113 = -2.011

Layer 2:
1  -> 1 + 0.1 * 0.378 * 0.041 = 1.002
-1 -> -1 + 0.1 * 0.378 * (-0.082) = -1.003
3  -> 3 + 0.1 * 0.5 * 0.041 = 3.002
-4 -> -4 + 0.1 * 0.5 * (-0.082) = -4.004

Layer 3:
1  -> 1 + 0.1 * 0.867 * 0.352 = 1.031
-3 -> -3 + 0.1 * 0.085 * 0.352 = -2.997

Step 5: Repeat Steps 2 to 4 for every train vector in the Train Database; one full pass over the database is one epoch.
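And the updates in code, continuing from the deltas above (the closing comment sketches Step 5):

    alpha = 0.1
    x1, x2 = 2, -1                                   # the train vector
    h1, h2, g1, g2 = 0.378, 0.5, 0.867, 0.085        # forward-pass values
    d_h1, d_h2 = 0.029, 0.113                        # deltas from Step 3
    d_g1, d_g2, d_y = 0.041, -0.082, 0.352

    # Layer 1: new = old + alpha * input value * delta of the next neuron
    w_x1_h1 = 0.5 + alpha * x1 * d_h1                # ≈ 0.506
    w_x2_h1 = 1.5 + alpha * x2 * d_h1                # ≈ 1.497
    w_x1_h2 = -1 + alpha * x1 * d_h2                 # ≈ -0.977
    w_x2_h2 = -2 + alpha * x2 * d_h2                 # ≈ -2.011
    # Layer 2
    w_h1_g1 = 1 + alpha * h1 * d_g1                  # ≈ 1.002
    w_h2_g1 = 3 + alpha * h2 * d_g1                  # ≈ 3.002
    w_h1_g2 = -1 + alpha * h1 * d_g2                 # ≈ -1.003
    w_h2_g2 = -4 + alpha * h2 * d_g2                 # ≈ -4.004
    # Layer 3
    w_g1_y = 1 + alpha * g1 * d_y                    # ≈ 1.031
    w_g2_y = -3 + alpha * g2 * d_y                   # ≈ -2.997
    # Step 5: repeat forward pass, backward pass and update for every train
    # vector; one full pass over the train database is one epoch.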
5. Naïve Bayes: Theory and Application

Learning

Train table (stolen-car example):

    Color   Type     Origin        Stolen
    Red     Sport    Domicile      Yes
    Red     Sport    Domicile      No
    Red     Sport    Domicile      Yes
    Yellow  Sport    Domicile      No
    Yellow  Sport    Importation   Yes
    Yellow  Classic  Importation   No
    Yellow  Classic  Importation   Yes
    Yellow  Classic  Domicile      No
    Red     Classic  Importation   No
    Red     Sport    Importation   Yes

Class priors: P(Yes) = 5/10, P(No) = 5/10

Color:  P(Red|Yes) = 3/5,      P(Yellow|Yes) = 2/5,      P(Red|No) = 2/5,      P(Yellow|No) = 3/5
Type:   P(Sport|Yes) = 4/5,    P(Classic|Yes) = 1/5,     P(Sport|No) = 2/5,    P(Classic|No) = 3/5
Origin: P(Domicile|Yes) = 2/5, P(Importation|Yes) = 3/5, P(Domicile|No) = 3/5, P(Importation|No) = 2/5
Testing

Sample X = <Red, Classic, Domicile>

P(X, Yes) = P(Red|Yes) * P(Classic|Yes) * P(Domicile|Yes) * P(Yes)
          = 3/5 * 1/5 * 2/5 * 5/10 = 0.024

P(X, No)  = P(Red|No) * P(Classic|No) * P(Domicile|No) * P(No)
          = 2/5 * 3/5 * 3/5 * 5/10 = 0.072

P(X, No) > P(X, Yes), so the sample is classified as Not Stolen.
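The whole example fits in a short Python sketch (the dataset literal mirrors the learning table; the score helper is illustrative):

    # (Color, Type, Origin, Stolen): the ten rows of the learning table.
    data = [
        ("Red", "Sport", "Domicile", "Yes"), ("Red", "Sport", "Domicile", "No"),
        ("Red", "Sport", "Domicile", "Yes"), ("Yellow", "Sport", "Domicile", "No"),
        ("Yellow", "Sport", "Importation", "Yes"), ("Yellow", "Classic", "Importation", "No"),
        ("Yellow", "Classic", "Importation", "Yes"), ("Yellow", "Classic", "Domicile", "No"),
        ("Red", "Classic", "Importation", "No"), ("Red", "Sport", "Importation", "Yes"),
    ]

    def score(sample, label):
        rows = [r for r in data if r[3] == label]
        p = len(rows) / len(data)               # class prior P(label)
        for i, value in enumerate(sample):      # naive product of P(value | label)
            p *= sum(1 for r in rows if r[i] == value) / len(rows)
        return p

    x = ("Red", "Classic", "Domicile")
    p_yes, p_no = score(x, "Yes"), score(x, "No")
    print(round(p_yes, 3), round(p_no, 3))             # 0.024 vs 0.072
    print("Stolen?", "Yes" if p_yes > p_no else "No")  # -> No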
6. Support Vector Machines: Theory and Application

Basic idea: find the separating hyperplane that maximizes the margin distance. In the feature space, Class A and Class B lie on either side; the distances M1 and M2 to the closest samples on each side (the support vectors) give M1 + M2 = Margin Distance, and the separator is written SV: A·X + B.
Case 1: Linear Separation — Class A and Class B can be split by a linear separator SV: A·X + B.
Case 2: Non-Linear Separation — the separator is SV: F(X), where F is a nonlinear function.
A kernel maps the original feature space into a new feature space in which the classes (Class A, Class B, Class C, Class D) become linearly separable.
Case 1: Linear Separation

1. Number of candidate separators K = N - 1, where N is the number of samples (here K = 3).
2. Initialize 3 linear separators:
   • D1 = A1*X + B1
   • D2 = A2*X + B2
   • D3 = A3*X + B3
3. Compute the accuracy of each separator:
   • D1 (75%)
   • D2 (100%)
   • D3 (100%)
4. Thresholding, keep Acc > 85%:
   • D2 (100%)
   • D3 (100%)
5. Compare the margin distances M2 and M3 and keep the separator with the largest margin (see the sketch below).
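A hedged sketch of this selection procedure on toy data (the candidate lines, their accuracies, and the data are illustrative; a real SVM solves a quadratic program rather than enumerating candidates):

    import numpy as np

    # Toy linearly separable data: Class A (label +1), Class B (label -1).
    X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
    y = np.array([1, 1, -1, -1])

    # Candidate separators D(x) = a . x + b (the slides' D1, D2, D3).
    candidates = [(np.array([1.0, 0.0]), 0.0),
                  (np.array([1.0, 1.0]), 0.0),
                  (np.array([1.0, 2.0]), 0.0)]

    best = None
    for a, b in candidates:
        pred = np.sign(X @ a + b)
        acc = (pred == y).mean()                  # step 3: accuracy
        if acc <= 0.85:                           # step 4: keep Acc > 85%
            continue
        # step 5: margin = distance from the line to the closest sample
        margin = np.min(np.abs(X @ a + b) / np.linalg.norm(a))
        if best is None or margin > best[0]:
            best = (margin, a, b)
    print("kept separator:", best)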
Case 2: Non-Linear Separation

1. Use a kernel function F = x².
2. Transform the vectors using F (the slide shows the sample points, e.g. (0, 1), (-1, 1), (0, -1), and their images under F).
3. Apply linear separability on the transformed vectors, exactly as in Case 1:
4. K = N - 1 candidate separators, where N is the number of samples (K = 3).
5. Initialize 3 linear separators:
   • D1 = A1*X + B1
   • D2 = A2*X + B2
   • D3 = A3*X + B3
6. Compute the accuracy of each separator:
   • D1 (75%)
   • D2 (100%)
   • D3 (100%)
7. Thresholding, keep Acc > 85%:
   • D2 (100%)
   • D3 (100%)
8. Compare the margin distances M2 and M3 and keep the separator with the largest margin (a toy transform follows below).
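The kernel idea in miniature, matching the slides' F = x²: 1-D points at -1, 0, 1 cannot be split by a single threshold, but their images under F can (a toy sketch with made-up labels):

    # 1-D data: the middle point is one class, the outer points the other.
    points = [-1.0, 0.0, 1.0]
    labels = [ "B",  "A", "B"]             # no single threshold separates A from B

    transformed = [x ** 2 for x in points]  # F(x) = x^2 -> [1.0, 0.0, 1.0]
    # After the transform, the threshold t = 0.5 separates the classes linearly:
    pred = ["A" if fx < 0.5 else "B" for fx in transformed]
    print(pred == labels)                   # True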
7. How to select the appropriate Machine Learning technique?

Supervised Machine Learning splits into three families:

- Similarity based: choose when the feature vector is discriminant (e.g., DNA) and speed matters; suited to small data (N ≈ 1k–10k). Measures: Euclidean distance (K-means), Cosine distance.
- Probability based: Naive Bayes; suited to huge categorical data.
- Boundary Decision based:
  • Neural Network: suited to huge numerical data. Feed-forward: SLP, MLP, CNN, Autoencoder (AE). Recurrent: ESN, LSTM.
  • Support Vector Machines.
  • Decision Tree, extended to Random Forest.
8. How to evaluate Machine Learning Performance?

F1-score = 2 * (Recall * Precision) / (Recall + Precision)

g-mean = sqrt(Recall * Precision)
Worked example: after training your model, you have 300 samples for test & validation, partitioned as follows:
• 150 from Class 1
• 150 from Class 2
After testing, the model correctly classifies 100 samples from Class 1 and 140 from Class 2.

Confusion matrix (rows = actual class, columns = predicted class):

                    Class 1   Class 2
    Class 1 (150)     100        50
    Class 2 (150)      10       140

Accuracy = (100 + 140) / 300 = 80%
Precision (Class 1) = 100 / (100 + 10) ≈ 0.91
Recall (Class 1) = 100 / 150 ≈ 0.67
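The same numbers in code, together with the F1 and g-mean formulas from the previous slide:

    import math

    tp, fn = 100, 50   # Class 1 samples: 100 correct, 50 predicted as Class 2
    fp, tn = 10, 140   # Class 2 samples: 10 predicted as Class 1, 140 correct

    accuracy  = (tp + tn) / (tp + tn + fp + fn)          # (100 + 140) / 300 = 0.8
    precision = tp / (tp + fp)                           # 100 / 110 ≈ 0.909
    recall    = tp / (tp + fn)                           # 100 / 150 ≈ 0.667
    f1 = 2 * recall * precision / (recall + precision)   # ≈ 0.769
    g  = math.sqrt(recall * precision)                   # ≈ 0.778
    print(accuracy, precision, recall, f1, g)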
7. How to select the appropriate Machine Learning technique? — Summary

    Machine Learning          Model                   Projection                              Efficiency
    Similarity                Dataset                 Similarity measures                     Discriminant feature vector
    Naive Bayes               Set of probabilities    Compute the probability of each class   Huge categorical data
    Support Vector Machines   Set of support vectors  Compute the marginal distance           Small numerical data
    Neural Network            Weight matrix           Scalar product                          Huge numerical data
