Machine Learning - Lecture 3 (Student)
where $\mu_k$ and $\sigma_k^2$ are the mean and variance parameters for the $k$th class. For now, let us further assume that $\sigma_1^2 = \cdots = \sigma_K^2$: that is, there is a shared variance term across all $K$ classes, which for simplicity we can denote by $\sigma^2$. We find that

$$p_k(x) = \frac{\pi_k \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{1}{2\sigma^2}(x - \mu_k)^2\right)}{\sum_{l=1}^{K} \pi_l \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{1}{2\sigma^2}(x - \mu_l)^2\right)}.$$

Taking the log of this expression and rearranging the terms, it is not hard to show that this is equivalent to assigning the observation to the class for which

$$\delta_k(x) = x \cdot \frac{\mu_k}{\sigma^2} - \frac{\mu_k^2}{2\sigma^2} + \log \pi_k$$

is largest.
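As a quick numerical illustration, here is a minimal R sketch with made-up parameters ($\mu_1 = -1.25$, $\mu_2 = 1.25$, $\sigma^2 = 1$, $\pi_1 = \pi_2 = 0.5$; none of these values come from the lecture) showing how the discriminant scores decide the class:

# Hypothetical one-dimensional example with K = 2 classes (made-up values)
mu     <- c(-1.25, 1.25)   # class means mu_1, mu_2
sigma2 <- 1                # shared variance sigma^2
prior  <- c(0.5, 0.5)      # priors pi_1, pi_2

# delta_k(x) = x * mu_k / sigma^2 - mu_k^2 / (2 * sigma^2) + log(pi_k)
delta <- function(x, k) x * mu[k] / sigma2 - mu[k]^2 / (2 * sigma2) + log(prior[k])

x0 <- 0.4
which.max(c(delta(x0, 1), delta(x0, 2)))   # class with the largest discriminant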
➢ For instance, if $K = 2$ and $\pi_1 = \pi_2$, then the Bayes classifier assigns an observation to class 1 if $2x(\mu_1 - \mu_2) > \mu_1^2 - \mu_2^2$, and to class 2 otherwise. In this case, the Bayes decision boundary corresponds to the point where

$$x = \frac{\mu_1^2 - \mu_2^2}{2(\mu_1 - \mu_2)} = \frac{\mu_1 + \mu_2}{2}.$$
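Continuing the hypothetical sketch above (the made-up values $\mu_1 = -1.25$, $\mu_2 = 1.25$), the boundary can be checked numerically:

# Bayes decision boundary under the made-up parameters above
(mu[1] + mu[2]) / 2   # = 0: assign to class 2 when x > 0, class 1 otherwise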
➢ The estimates

$$\hat\pi_k = \frac{n_k}{n}, \qquad \hat\mu_k = \frac{1}{n_k} \sum_{i:\, y_i = k} x_i, \qquad \hat\sigma^2 = \frac{1}{n - K} \sum_{k=1}^{K} \sum_{i:\, y_i = k} (x_i - \hat\mu_k)^2,$$

where $n$ is the total number of training observations and $n_k$ is the number of observations in the $k$th class. The LDA classifier plugs these estimates into $\delta_k(x)$ and assigns an observation $X = x$ to the class for which

$$\hat\delta_k(x) = x \cdot \frac{\hat\mu_k}{\hat\sigma^2} - \frac{\hat\mu_k^2}{2\hat\sigma^2} + \log \hat\pi_k$$

is largest.
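To make these formulas concrete, here is a minimal from-scratch sketch on simulated data (everything below, including the simulation settings, is illustrative and not from the lecture; MASS::lda(), used in the computer session, performs this estimation automatically):

# From-scratch one-dimensional LDA on simulated data (illustrative only)
set.seed(1)
n <- 100
y <- sample(1:2, n, replace = TRUE)                # true classes
x <- rnorm(n, mean = c(-1.25, 1.25)[y], sd = 1)    # one feature per observation

nk     <- as.numeric(table(y))                     # class counts n_k
pi.hat <- nk / n                                   # prior estimates pi_k
mu.hat <- tapply(x, y, mean)                       # class means mu_k
s2.hat <- sum((x - mu.hat[y])^2) / (n - 2)         # pooled variance, n - K with K = 2

# Evaluate delta_k(x) for each class and pick the larger one
delta.hat <- sapply(1:2, function(k)
  x * mu.hat[k] / s2.hat - mu.hat[k]^2 / (2 * s2.hat) + log(pi.hat[k]))
pred <- max.col(delta.hat)
mean(pred == y)                                    # training accuracy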
Computer Session
library(ISLR2)
library(MASS)
train <- (Smarket$Year < 2005)   # fit on observations before 2005
lda.fit <- lda(Direction ~ Lag1 + Lag2, data = Smarket, subset = train)
lda.fit
## Call:
## lda(Direction ~ Lag1 + Lag2, data = Smarket, subset = train)
##
## Prior probabilities of groups:
## Down Up
## 0.491984 0.508016
##
## Group means:
## Lag1 Lag2
## Down 0.04279022 0.03389409
## Up -0.03954635 -0.03132544
##
## Coefficients of linear discriminants:
## LD1
## Lag1 -0.6420190
## Lag2 -0.5135293
plot(lda.fit)   # histograms of the LD1 scores, one panel per class
Smarket.2005 <- Smarket[!train, ]             # held-out 2005 observations
Direction.2005 <- Smarket$Direction[!train]   # true 2005 directions
lda.pred <- predict(lda.fit, Smarket.2005)
names(lda.pred)
## [1] "class" "posterior" "x"
lda.class <- lda.pred$class
table(lda.class, Direction.2005)
## Direction.2005
## lda.class Down Up
## Down 35 35
## Up 76 106
mean(lda.class == Direction.2005)
## [1] 0.5595238
sum(lda.pred$posterior[, 1] >= .5)   # days predicted Down
## [1] 70
sum(lda.pred$posterior[, 1] < .5)    # days predicted Up
## [1] 182
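As a consistency check (a small sketch using the objects already created), the class labels can be re-derived from the posterior probabilities; the first column of the posterior matrix is P(Down):

# Re-derive the class predictions: "Down" whenever P(Down) >= 0.5
manual.class <- ifelse(lda.pred$posterior[, 1] >= .5, "Down", "Up")
all(manual.class == as.character(lda.class))   # expected: TRUE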
lda.pred$posterior[1:20, 1]   # posterior P(Down) for the first 20 test days
lda.class[1:20]               # the corresponding class predictions
sum(lda.pred$posterior[, 1] > .9)   # days with P(Down) above 90%
## [1] 0