Assignment 2: Nearest Neighbor Interpolation
DTN1854290001 ERS182
ESM49-AEP Prof. Bui Quang Binh
I. Introduction
Image interpolation is the process of enlarging a small image by increasing the number of
pixels that comprise it (Venator K., 2012). Popular methods such as bilinear and bicubic
interpolation involve a lowpass filtering step and therefore create new pixel values
(Rukundo, 2012). Nearest neighbor, by contrast, is a practical, discriminative
interpolation method that does not create any new sample values (Ramponi, 1999). It
simply fills an empty location by replicating the pixel value located at the shortest
distance (Don L., 2007). This makes the method particularly useful when speed is the
main concern, thanks to its quick execution (Rukundo, 2012).
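The replication idea above can be sketched in a few lines of plain Python (no image
library assumed): each destination pixel is mapped back to its nearest source pixel by
scaling and truncating the coordinates, so no new pixel values are ever created.

```python
def nn_upscale(image, scale):
    """Enlarge a 2D grid of pixel values by a scale factor using
    nearest neighbor interpolation: each output pixel replicates
    the source pixel at the shortest distance."""
    src_h, src_w = len(image), len(image[0])
    dst_h, dst_w = int(src_h * scale), int(src_w * scale)
    out = []
    for y in range(dst_h):
        # map the destination row back to its nearest source row
        sy = min(src_h - 1, int(y / scale))
        row = []
        for x in range(dst_w):
            sx = min(src_w - 1, int(x / scale))
            row.append(image[sy][sx])
        out.append(row)
    return out

small = [[1, 2],
         [3, 4]]
print(nn_upscale(small, 2))
# each source pixel is replicated into a 2x2 block
```

Doubling a 2x2 image simply turns every pixel into a 2x2 block of itself, which is why
nearest neighbor is fast but produces blocky enlargements.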
II. Methodology
The target’s value can be predicted using the k nearest neighbors (KNN) rule, where k,
a small positive integer, is the number of nearest neighbors consulted.
1. Regression issues
In a regression problem, the data reduce to pairs of scalar coordinates x and y
(Figure 1). Given a single scalar feature x and a target y, the task is to observe the
data and suggest a relationship between x and y.
Figure 1: Scatter plot of the coordinates x and y (Alexander I., 2013)
Therefore, if given a new point x and asked for the value of y, the nearest neighbor
predictor follows a very simple formula.
Figure 2: Define an implicit function f(x) to determine the closest point and give back
its value (Alexander I., 2013)
All of the training data points shown in the scatterplot are simply stored, and the
predictor follows a very simple procedure: find the stored data point with the closest
value of x and predict the y value associated with it (Alexander I., 2013).
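The procedure above, stated as a minimal sketch (function and sample data are
illustrative, not from the source): store the training pairs, then return the y of the
training point whose x is closest to the query.

```python
def nn_regress(train, x_query):
    """1-nearest-neighbor regression: return the y value of the
    training pair (x, y) whose x is closest to the query."""
    nearest = min(train, key=lambda p: abs(p[0] - x_query))
    return nearest[1]

data = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.5)]
print(nn_regress(data, 1.2))  # closest x is 1.0, so the prediction is 3.0
```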
2. Classifier issue
a. Euclidean distance
Continuing with predicting the value of a target according to the nearest neighbor: in
classification the value y is discrete rather than continuously plotted, so the
prediction changes abruptly whenever the nearest data point changes. The nearest
neighbor rule simply finds the data point that is closest in terms of distance and
predicts the value associated with it. This requires a distance on the feature vector
space, so the Euclidean distance is typically chosen, which is the square root of the
sum of squared differences over the features (Robinson A., 2020).
Again, this procedure can be evaluated at every possible point x, which defines a
function assigning each point the label of the training point it is closest to.
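As a minimal sketch of the rule just described (names and sample data are illustrative):
compute the Euclidean distance from the query to every training point and return the
label of the closest one.

```python
import math

def euclidean(a, b):
    """Euclidean distance: square root of the sum of squared
    per-feature differences."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def nn_classify(train, query):
    """1-NN classification: the label of the closest training point."""
    return min(train, key=lambda p: euclidean(p[0], query))[1]

train = [((0.0, 0.0), "A"), ((3.0, 3.0), "B")]
print(nn_classify(train, (1.0, 1.0)))  # closer to (0, 0), so "A"
```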
b. Voronoi tessellation
Beside using Euclidean distance, Voroni diagram is also applied, which is a good way
of learning sort of classical pattern recognition. Voroni diagram has the boundary
between every single samples, carves the space into little chunks where create the
Voronoi partitioning of feature space for 2D data. The targeted novel test example is
then going to be determined which chunk it belongs to and which value labels the
chunk. Then the value of novel test example is given as the chunk’s monitor
(Alexander I., 2013).
Figure 4: Example of a decision boundary for two concentrated populations (Urtasun & Zemel, 2015)
The nearest neighbor rule naturally forms complex decision boundaries and adapts to data
density. Its problems are sensitivity to class noise and to the scales of the
dimensions; distances become less meaningful in high dimensions, and the cost scales
with the number of examples. Its inductive bias is also questioned: what kind of
decision boundaries do we expect to find? (Urtasun & Zemel, 2015)
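The connection between the two views in this section can be made concrete with a small
sketch (function and data are illustrative): labeling every cell of a grid with the
index of its nearest site is exactly a discrete picture of the Voronoi partition, so the
1-NN classifier implicitly computes a Voronoi tessellation of feature space.

```python
import math

def voronoi_labels(sites, grid_w, grid_h):
    """Label each (x, y) cell of a grid with the index of its nearest
    site; the resulting regions are the Voronoi partition."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [[min(range(len(sites)),
                 key=lambda i: dist(sites[i], (x, y)))
             for x in range(grid_w)]
            for y in range(grid_h)]

sites = [(0, 0), (4, 0)]
for row in voronoi_labels(sites, 5, 2):
    print(row)
# columns near x=0 belong to site 0, columns near x=4 to site 1
```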
III. Applications
Nearest neighbor interpolation can be applied to almost anything given the right view
and strategy. For instance, in conservation, by treating the distance from species
centroids in feature space as an inverse probability of presence, scientists were able
to map vegetation gradients for landscape analysis and conservation planning
(Ohmann J. et al., 2012). For the purpose of trapping an individual for conservation
studies, scientists can use a Voronoi tessellation to predict its living area. On the
other hand, looking closer at everyday life, nearest neighbor interpolation can be used
for locating a restaurant: using the Euclidean distance, we can find the restaurant
closest to a targeted customer’s residence (Ju L., 2011).
Bibliography
Ramponi, G. (1999). Warped Distance for Space-Variant Linear Image Interpolation. IEEE
Transactions on Image Processing, 8(5), 629–639.
Don, L. (2007). A Review of Some Image Pixel Interpolation Algorithms. Retrieved from
Don Lancaster & Synergetics: http://www.tinaja.com
Venator, K. (2012). What is photo interpolation resizing resampling. Retrieved from
Americas Wonderlands: http://www.americaswonderlands.com/image_resizing.htm
Rukundo, O. (2012). Nearest Neighbor Value Interpolation. International Journal of
Advanced Computer Science and Applications, 3, 25–30. https://doi.org/10.14569/IJACSA.2012.030405
Urtasun, R., & Zemel, R. (2015, September 28). CSC 411: Lecture 05: Nearest Neighbors.
Retrieved from https://www.cs.toronto.edu/~urtasun/courses/CSC411/05_nn.pdf
Alexander, I. (2013). Machine Learning and Data Mining Nearest neighbor methods
[PowerPoint presentation]. Retrieved from:
https://www.scribd.com/document/366266926/03-knn
Robinson, A. (2020, September 16). How to Calculate Euclidean Distance. Sciencing.
https://sciencing.com/how-to-calculate-euclidean-distance-12751761.html
Ohmann, J., Gregory, M., Henderson, E., & Roberts, H. (2012). Nearest neighbors mapping
of vegetation gradients for landscape analysis and conservation planning. 97th ESA
Annual Convention 2012.
Ju, L., Ringler, T., & Gunzburger, M. (2011). Voronoi Tessellations and Their Application to
Climate and Global Modeling. In P. Lauritzen, C. Jablonowski, M. Taylor, & R. Nair
(Eds.), Numerical Techniques for Global Atmospheric Models (pp. 313–342). Springer
Berlin Heidelberg. https://doi.org/10.1007/978-3-642-11640-7_10