KNN Example
The K-Nearest Neighbor (K-NN) algorithm is a supervised learning method that has been applied in many
areas, such as statistical pattern recognition, data mining, and image processing.
The K-NN classifier assigns an unknown data sample s to a predefined class based on previously
categorized data samples (the training data). K-NN is well suited to data that changes or is updated
frequently.
KNN Algorithm
Step 1: Provide the features of the speech signals to the K-NN classifier as training data.
Step 2: Measure the distance between the new observation s and each training sample using the
Euclidean distance formula shown in figure 3.4.2.1 (Jan et al. 2008).
Step 3: Sort the Euclidean distance values so that di ≤ di+1, and select the k samples with the smallest distances.
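The three steps above can be sketched as a small classifier, using the standard Euclidean distance d(s, x) = √Σ(sᵢ − xᵢ)². This is a minimal illustration, not the paper's implementation; the function name and data layout are assumptions.

```python
from collections import Counter
from math import dist  # Euclidean distance between two points (Python 3.8+)

def knn_classify(train, s, k):
    """Assign observation s to a class by majority vote among its
    k nearest training samples, under Euclidean distance.

    train -- list of (feature_tuple, class_label) pairs (Step 1)
    """
    # Step 2: rank every training sample by its distance to s
    ranked = sorted(train, key=lambda item: dist(item[0], s))
    # Step 3: keep the k smallest distances and vote on their labels
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

With repeatedly updated data, the training list can simply be appended to; no retraining step is needed, which is the property the text highlights.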
For example, suppose we have training data with class labels C = {1, 2}, as listed above in table 3. The new
observation S = {3, 4} needs to be classified; as shown in fig. 3, the k-NN algorithm indicates the
nearest neighbors of S for k = 3 and k = 5. Distances are calculated with the Euclidean
distance formula shown in fig. 2 above, and the calculation process is shown
in table 4.
Table 4. Distances between the training data and the new observation, computed with the Euclidean
distance formula
K-NN uses a majority-voting method. With k = 3, the number of class-2 labels exceeds the number of
class-1 labels, so the new observation S is classified as class 2. If k is changed from 3 to 5, the
number of class-1 labels exceeds the number of class-2 labels, so the new observation is classified
as class 1. The voting process is highlighted in fig. 3.
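The vote flip between k = 3 and k = 5 can be reproduced with a small script. The coordinates below are hypothetical (they are not the paper's Table 3, which is not reproduced here); they are chosen only so that, for S = (3, 4), the 3 nearest neighbors are majority class 2 while the 5 nearest are majority class 1.

```python
from collections import Counter
from math import dist

# Hypothetical training set: (features, class label) pairs whose
# distances to S are 1.0, 1.41, 2.0, 2.5, and 2.7 respectively.
train = [((3, 5), 2), ((4, 5), 2), ((1, 4), 1), ((3, 6.5), 1), ((0.3, 4), 1)]
S = (3, 4)  # new observation to classify

def vote(k):
    # Rank training samples by distance to S, then vote among the k nearest
    ranked = sorted(train, key=lambda item: dist(item[0], S))
    return Counter(label for _, label in ranked[:k]).most_common(1)[0][0]

print(vote(3))  # 3 nearest: labels 2, 2, 1 -> classified as 2
print(vote(5))  # 5 nearest: labels 2, 2, 1, 1, 1 -> classified as 1
```

This illustrates why the choice of k matters: a larger neighborhood can reverse the majority, exactly as the text describes.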