Bayes Classifier Compressed: Satellite Images

FDP on AI & Block Chain, National Institute of Technology, Rourkela. Date: August 27, 2024
Supervised Classification

Each pattern is represented by a feature vector
x = [x₁, ..., x_l]ᵀ ∈ ℝˡ

Design stages of a pattern recognition system:
patterns → sensor → feature generation → feature selection → classifier design → system evaluation
Vector Representation

[Figure: patterns represented as vectors in a 2-D Euclidean space]
Probability Density Function

A function p(x), x ∈ ℝⁿ, is said to be a probability density function if
• p(x) ≥ 0, ∀x ∈ ℝⁿ
• ∫₋∞^{+∞} p(x) dx = 1
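As a quick numerical sketch (not part of the slides), the two conditions can be checked for the 1-D standard normal density; the grid and tolerance below are illustrative choices:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # 1-D Gaussian probability density function
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Condition 1: p(x) >= 0 everywhere (checked on a grid over [-10, 10])
xs = [i * 0.01 for i in range(-1000, 1001)]
assert all(normal_pdf(x) >= 0 for x in xs)

# Condition 2: the density integrates to 1 (Riemann sum over [-10, 10])
total = sum(normal_pdf(x) * 0.01 for x in xs)  # close to 1
```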
Normal Distribution

[Figure: bivariate normal densities over (x, y) for different covariance cases (Case 2, Case 3)]

There is a proof using the Cauchy-Schwarz inequality.
CLASSIFIERS BASED ON BAYES DECISION THEORY

Assign x to the class ωᵢ for which the a-posteriori probability P(ωᵢ | x) is maximum.
• Computation of a-posteriori probabilities
  – Assume known:
    • the a-priori probabilities P(ω₁), P(ω₂), ..., P(ω_M)
    • the class-conditional densities p(x | ωᵢ), i = 1, 2, ..., M
The Bayes rule (M = 2)

p(x) P(ωᵢ | x) = p(x | ωᵢ) P(ωᵢ)  ⇒

P(ωᵢ | x) = p(x | ωᵢ) P(ωᵢ) / p(x)

where p(x) = Σᵢ₌₁² p(x | ωᵢ) P(ωᵢ)
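As a small numeric sketch of the rule above (the likelihood and prior values are invented for illustration):

```python
# Hypothetical two-class example: likelihoods p(x|w_i) at a fixed x, and priors P(w_i)
p_x_given_w = [0.6, 0.1]   # assumed values of p(x|w1), p(x|w2)
P_w = [0.3, 0.7]           # assumed a-priori probabilities P(w1), P(w2)

# Evidence: p(x) = sum_i p(x|w_i) P(w_i)
p_x = sum(l * p for l, p in zip(p_x_given_w, P_w))

# Posteriors: P(w_i|x) = p(x|w_i) P(w_i) / p(x); they always sum to 1
posteriors = [l * p / p_x for l, p in zip(p_x_given_w, P_w)]
```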
The Bayes classification rule (for two classes, M = 2)

Given x, classify it according to the rule:
  If P(ω₁ | x) > P(ω₂ | x), x ∈ ω₁
  If P(ω₂ | x) > P(ω₁ | x), x ∈ ω₂

Equivalently: classify x according to the rule
  p(x | ω₁) P(ω₁) ≷ p(x | ω₂) P(ω₂)

For equiprobable classes the test becomes
  p(x | ω₁) ≷ p(x | ω₂)
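The equivalent form of the rule can be sketched as a tiny helper (the numeric inputs below are invented); note how unequal priors can overturn a likelihood comparison:

```python
def classify(p1, p2, P1, P2):
    # Bayes rule for two classes: pick the class with the larger p(x|w_i) P(w_i)
    return 1 if p1 * P1 > p2 * P2 else 2

# Equiprobable classes: the test reduces to comparing the likelihoods
equi = classify(0.6, 0.4, 0.5, 0.5)      # class 1 wins (0.6 > 0.4)

# A large enough prior for class 2 reverses the decision
skewed = classify(0.6, 0.4, 0.2, 0.8)    # 0.6*0.2 < 0.4*0.8, so class 2
```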
[Figure: example of the two regions R₁ (→ ω₁) and R₂ (→ ω₂) formed by the Bayesian classifier for the case of two equiprobable classes.]
• Equivalently, in words: divide the space into two regions:
  If x ∈ R₁, decide x ∈ ω₁
  If x ∈ R₂, decide x ∈ ω₂
• Probability of error
  – Total shaded area
  – Pₑ = ½ ∫₋∞^{x₀} p(x | ω₂) dx + ½ ∫_{x₀}^{+∞} p(x | ω₁) dx
    (equiprobable classes; x₀ is the decision threshold)
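A numeric sketch of Pₑ for two assumed 1-D unit-variance Gaussian classes with means 0 and 2 (for equal-variance, equiprobable classes the threshold x₀ is the midpoint of the means):

```python
import math

def gauss_cdf(x, mu, sigma=1.0):
    # Normal CDF Phi((x - mu)/sigma) via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu1, mu2 = 0.0, 2.0           # assumed class means, unit variance
x0 = (mu1 + mu2) / 2          # decision threshold for equiprobable classes

# Pe = 1/2 * P(x < x0 | w2) + 1/2 * P(x > x0 | w1)
Pe = 0.5 * gauss_cdf(x0, mu2) + 0.5 * (1 - gauss_cdf(x0, mu1))
```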
– For M = 2:
• Define the loss matrix
  L = ( λ₁₁  λ₁₂
        λ₂₁  λ₂₂ )
• Risk with respect to ω₁:
  r₁ = λ₁₁ ∫_{R₁} p(x | ω₁) dx + λ₁₂ ∫_{R₂} p(x | ω₁) dx
– Risk with respect to ω₂:
  r₂ = λ₂₁ ∫_{R₁} p(x | ω₂) dx + λ₂₂ ∫_{R₂} p(x | ω₂) dx
– These are probabilities of wrong decisions, weighted by the penalty terms.
– Average risk:
  r = r₁ P(ω₁) + r₂ P(ω₂)
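The risks r₁, r₂ and the average risk r can be evaluated numerically for an assumed 1-D setup (classes N(0, 1) and N(2, 1), regions split at a threshold x₀ = 1, and an invented zero-diagonal loss matrix):

```python
import math

def gauss_cdf(x, mu, sigma=1.0):
    # Normal CDF via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

lam = [[0.0, 1.0],
       [2.0, 0.0]]   # assumed loss matrix: lam[0][1] = penalty for w1 decided in R2, etc.
P = [0.5, 0.5]       # assumed priors
x0 = 1.0             # threshold splitting R1 = (-inf, x0) and R2 = (x0, inf)

# Probability mass of each class density in R1 and R2
m1_R1, m1_R2 = gauss_cdf(x0, 0.0), 1 - gauss_cdf(x0, 0.0)
m2_R1, m2_R2 = gauss_cdf(x0, 2.0), 1 - gauss_cdf(x0, 2.0)

r1 = lam[0][0] * m1_R1 + lam[0][1] * m1_R2   # risk with respect to w1
r2 = lam[1][0] * m2_R1 + lam[1][1] * m2_R2   # risk with respect to w2
r = r1 * P[0] + r2 * P[1]                    # average risk
```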
• Choose R₁ and R₂ so that r is minimized. This leads to the test:
  x ∈ ω₁ (ω₂) if ℓ₁₂ ≡ p(x | ω₁) / p(x | ω₂) > (<) [P(ω₂) / P(ω₁)] · [(λ₂₁ − λ₂₂) / (λ₁₂ − λ₁₁)]
  ℓ₁₂: the likelihood ratio
• If P(ω₁) = P(ω₂) = ½ and λ₁₁ = λ₂₂ = 0:
  x ∈ ω₁ if p(x | ω₁) > (λ₂₁ / λ₁₂) p(x | ω₂)
  x ∈ ω₂ if p(x | ω₂) > (λ₁₂ / λ₂₁) p(x | ω₁)
• If λ₂₁ = λ₁₂, this is the minimum classification error probability rule.
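The minimum-risk test above can be sketched directly (all numeric inputs in the usage lines are invented); with equal priors and a symmetric zero-diagonal loss matrix the threshold becomes 1, recovering the minimum-error rule:

```python
def likelihood_ratio_test(p1, p2, P1, P2, lam):
    # Minimum-risk rule: decide w1 iff
    #   l12 = p(x|w1)/p(x|w2) > [P(w2)(lam21 - lam22)] / [P(w1)(lam12 - lam11)]
    l12 = p1 / p2
    threshold = (P2 * (lam[1][0] - lam[1][1])) / (P1 * (lam[0][1] - lam[0][0]))
    return 1 if l12 > threshold else 2

# Symmetric loss, equal priors: threshold 1, minimum-error rule
d1 = likelihood_ratio_test(0.6, 0.4, 0.5, 0.5, [[0, 1], [1, 0]])

# A heavier penalty lam21 = 3 raises the threshold to 3, flipping the decision
d2 = likelihood_ratio_test(0.6, 0.4, 0.5, 0.5, [[0, 1], [3, 0]])
```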
Practical Implementation of Bayes' Decision Rule

Example on satellite images of Kolkata. Given input images:

[Figure: R-band image (512 × 512)]
• Take 50 sample points (pixel values at selected pixel locations) from the river class for training, for each band.
• Take 100 sample points (pixel values at selected pixel locations) from the non-river class for training, for each band.
• Take all 512 × 512 pixels as test sample points, for each band.
• Apply Bayes' decision rule to classify each test sample as river or non-river, writing 255 or 0 respectively at the corresponding pixel location.
• Show the result as a black-and-white image (pixel values either 0 or 255).

How to choose sample points: [figure]
Implementation Process of the Density Function

• Step 1: Calculate the mean vector of the river class: T1 = [Mean1; Mean2; Mean3; Mean4], where
  Mean1 = mean of the R-band image over the 50 river sample points
  Mean2 = mean of the G-band image over the 50 river sample points
  Mean3 = mean of the B-band image over the 50 river sample points
  Mean4 = mean of the I-band image over the 50 river sample points
• Step 2: Similarly, calculate the mean vector T2 of the non-river class over its 100 sample points.
• Step 3: Calculate the covariance matrix of the river class from the 50 samples; it is 4 × 4. For each training sample vector X (its R, G, B and I band values), form the deviation (X − T1), multiply it by its transpose, and sum over the samples. This gives 4 × 4 = 16 entries, one for each ordered pair of the 4 bands, measuring how the bands' deviations from the mean vector co-vary. (Apply the covariance matrix calculation formula.)
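Steps 1 and 3 can be sketched with NumPy; the random array below is a stand-in for the actual 50 river sample vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the real training data: 50 river sample vectors, 4 bands (R, G, B, I)
samples = rng.random((50, 4)) * 255

# Step 1: mean vector T1 of the river class
T1 = samples.mean(axis=0)

# Step 3: 4 x 4 covariance matrix from summed outer products of the deviations
dev = samples - T1                      # (X - T1) for every sample
cov = dev.T @ dev / (len(samples) - 1)  # 4 * 4 = 16 entries
```

The same two lines applied to the 100 non-river samples give T2 and the non-river covariance of Steps 2 and 4.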
• Step 4: Calculate the 4 × 4 covariance matrix of the non-river class from its 100 samples, by the same process as in Step 3 (using T2).
• Step 5: Take the whole image as test data: test_data = [Rband_img(i,j); Gband_img(i,j); Bband_img(i,j); Iband_img(i,j)], for i = 1 to 512 and j = 1 to 512.
• Step 7: For each pixel location of the test image, run a loop from i = 1 to (512 × 512) and do:
• Step 8: For the river class, calculate the deviation (test_data − T1) and its transpose (test_data − T1)ᵀ, then multiply:
  River_class = (test_data − T1)ᵀ × Inverse(Covariance_matrix_Riverclass) × (test_data − T1)
• Step 9: Similarly, for the non-river class calculate:
  NonRiver_class = (test_data − T2)ᵀ × Inverse(Covariance_matrix_nonRiverclass) × (test_data − T2)
• Step 10: Calculate the density function p1 for the river class, where P1 = 0.3 is given:
  p1 = 1/sqrt(Determinant of Covariance_matrix_Riverclass) × exp(−0.5 × River_class)
• Step 11: Calculate the density function p2 for the non-river class, where P2 = 0.7 is given:
  p2 = 1/sqrt(Determinant of Covariance_matrix_nonRiverclass) × exp(−0.5 × NonRiver_class)
  (The (2π)^{d/2} normalization factor of the multivariate normal density is omitted, since it is common to both classes and cancels in the comparison.)
• Step 12: For each pixel location of the test image, apply Bayes' rule: if (P1 × p1) ≥ (P2 × p2) then
  Out_image(i) = 255 (river class)
  else
  Out_image(i) = 0 (non-river class)
• Step 14: Show the three output images using the imshow function for three cases:
  Case 1: river class prior = 0.3, non-river class prior = 0.7
  Case 2: river class prior = 0.7, non-river class prior = 0.3
  Case 3: river class prior = 0.5, non-river class prior = 0.5
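The steps above can be sketched end-to-end in NumPy. This is a minimal illustration, not the actual assignment code: synthetic band data stands in for the real Kolkata images, the image is shrunk to 64 × 64 for brevity, and the "river" strip and all sample values are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
H = W = 64                                   # stand-in for 512 x 512

# Synthetic 4-band image (R, G, B, I); fake "river" strip is darker in the I band
bands = rng.normal(120, 20, size=(H, W, 4))
bands[:, :16, 3] -= 60

# Training samples: 50 river and 100 non-river pixel vectors (Steps 1-4 inputs)
river_train = bands[rng.integers(0, H, 50), rng.integers(0, 16, 50)]
nonriver_train = bands[rng.integers(0, H, 100), rng.integers(16, W, 100)]

def fit(samples):
    # Class mean vector, inverse covariance, and covariance determinant
    T = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    return T, np.linalg.inv(cov), np.linalg.det(cov)

T1, inv1, det1 = fit(river_train)
T2, inv2, det2 = fit(nonriver_train)

def density(x, T, inv, det):
    # Steps 8-11: Mahalanobis quadratic form and unnormalized Gaussian density;
    # the (2*pi)^(d/2) factor is omitted since it cancels in the comparison
    d = x - T
    return np.exp(-0.5 * (d @ inv @ d)) / np.sqrt(det)

P1, P2 = 0.3, 0.7                            # Case 1 priors
out = np.zeros((H, W), dtype=np.uint8)
for i in range(H):
    for j in range(W):
        x = bands[i, j]
        # Step 12: Bayes' decision rule, river -> 255, non-river -> 0
        out[i, j] = 255 if P1 * density(x, T1, inv1, det1) >= P2 * density(x, T2, inv2, det2) else 0
# Step 14 would display `out` with imshow for each choice of priors.
```

Rerunning the loop with the Case 2 and Case 3 priors shows how the prior probabilities shift pixels between the two classes.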
Thank You