03 KNN

The document discusses k-Nearest Neighbor (k-NN) and instance-based learning, highlighting the simplicity of the 1-Nearest Neighbor (1-NN) classifier, which labels new points based on the closest known point. It explains the importance of distance metrics in determining decision surfaces and outlines four aspects of an instance-based learner. It also contrasts the performance of 1-NN with k-NN, noting that while k-NN smooths noise in labels, it can lose detail compared to 1-NN.

k-Nearest Neighbor &
Instance-based Learning

Some material adapted from slides by Andrew Moore, CMU.

Visit http://www.autonlab.org/tutorials/ for
Andrew's repository of Data Mining tutorials.
1-Nearest Neighbor
- One of the simplest of all machine learning classifiers
- Simple idea: label a new point the same as the closest known point

[Figure: a new query point among labeled points; its nearest neighbor is red. Label it red.]
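The idea above can be sketched in a few lines. This is a minimal illustration, not code from the slides; the function name and toy data are made up for the example:

```python
import math

def nearest_neighbor(train, query):
    """Return the label of the training point closest to the query.

    train: list of (point, label) pairs; each point is a tuple of floats.
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    # The prediction is simply the label of the single closest point.
    _, label = min(train, key=lambda pl: dist(pl[0], query))
    return label

# Toy example: two red points near the origin, one blue point far away.
train = [((0.0, 0.0), "red"), ((1.0, 0.0), "red"), ((5.0, 5.0), "blue")]
print(nearest_neighbor(train, (0.5, 0.2)))  # red
```

Note there is no training step at all: the "model" is just the stored data, which is why this family is called instance-based (or memory-based) learning.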
1-Nearest Neighbor
- A type of instance-based learning
- Also known as "memory-based" learning
- Forms a Voronoi tessellation of the instance space
Distance Metrics
- Different metrics can change the decision surface

[Figure: two decision surfaces for the same data under two metrics:]
Dist(a,b) = (a1 − b1)² + (a2 − b2)²    Dist(a,b) = (a1 − b1)² + (3a2 − 3b2)²

- Standard Euclidean distance metric:
  - Two-dimensional: Dist(a,b) = sqrt((a1 − b1)² + (a2 − b2)²)
  - Multivariate: Dist(a,b) = sqrt(Σi (ai − bi)²)

Adapted from "Instance-Based Learning"
lecture slides by Andrew Moore, CMU.
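The effect of scaling one dimension, as in the second metric above, can be seen numerically. This is a small sketch (not from the slides); the optional `weights` parameter is an assumption added to mirror the per-dimension scaling:

```python
import math

def euclidean(a, b, weights=None):
    """Multivariate Euclidean distance with optional per-dimension scaling."""
    weights = weights or [1.0] * len(a)
    return math.sqrt(sum((w * (ai - bi)) ** 2
                         for w, ai, bi in zip(weights, a, b)))

a, b = (0.0, 0.0), (2.0, 1.0)
print(euclidean(a, b))                  # sqrt(4 + 1) ≈ 2.236
print(euclidean(a, b, weights=[1, 3]))  # sqrt(4 + 9) ≈ 3.606
```

Scaling the second dimension by 3 makes differences along it count nine times as much in the squared distance, which is exactly why the decision surface in the figure changes.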
Four Aspects of an
Instance-Based Learner:
1. A distance metric
2. How many nearby neighbors to look at?
3. A weighting function (optional)
4. How to fit with the local points?

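The four aspects above can be read as parameters of a single predictor. The sketch below is an illustration, not code from the slides; the function name, the inverse-distance weighting choice, and the toy data are all assumptions:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3, weight_fn=None):
    """Instance-based prediction parameterised by the four aspects:
    a distance metric (Euclidean here), k, an optional weighting
    function, and a local fit (a weighted majority vote here)."""
    def dist(a, b):  # Aspect 1: the distance metric.
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    # Aspect 2: keep only the k nearest training points.
    neighbors = sorted(train, key=lambda pl: dist(pl[0], query))[:k]
    # Aspects 3 and 4: weight each neighbor's vote, then take the majority.
    votes = Counter()
    for point, label in neighbors:
        votes[label] += weight_fn(dist(point, query)) if weight_fn else 1.0
    return votes.most_common(1)[0][0]

train = [((0.0, 0.0), "red"), ((1.0, 0.0), "red"),
         ((5.0, 5.0), "blue"), ((5.0, 4.0), "blue"), ((4.0, 5.0), "blue")]
print(knn_predict(train, (0.5, 0.0), k=3))  # red
# With inverse-distance weighting, nearby reds outvote the farther blues:
print(knn_predict(train, (0.5, 0.0), k=5,
                  weight_fn=lambda d: 1.0 / (d + 1e-9)))  # red
```

Swapping any one of the four aspects (metric, k, weighting, local fit) yields a different member of the instance-based family.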
1-NN's Four Aspects as an
Instance-Based Learner:
1. A distance metric
   - Euclidean
2. How many nearby neighbors to look at?
   - One
3. A weighting function (optional)
   - Unused
4. How to fit with the local points?
   - Just predict the same output as the nearest neighbor.
Zen Gardens
Mystery of renowned zen garden revealed [CNN Article]
Thursday, September 26, 2002 Posted: 10:11 AM EDT (1411 GMT)

LONDON (Reuters) -- For centuries visitors to the renowned Ryoanji Temple garden in Kyoto, Japan have been entranced and mystified by the simple arrangement of rocks.

The five sparse clusters on a rectangle of raked gravel are said to be pleasing to the eyes of the hundreds of thousands of tourists who visit the garden each year.

Scientists in Japan said on Wednesday they now believe they have discovered its mysterious appeal.

"We have uncovered the implicit structure of the Ryoanji garden's visual ground and have shown that it includes an abstract, minimalist depiction of natural scenery," said Gert Van Tonder of Kyoto University.

The researchers discovered that the empty space of the garden evokes a hidden image of a branching tree that is sensed by the unconscious mind.

"We believe that the unconscious perception of this pattern contributes to the enigmatic appeal of the garden," Van Tonder added.

He and his colleagues believe that whoever created the garden during the Muromachi era between 1333 and 1573 knew exactly what they were doing and placed the rocks around the tree image.

By using a concept called medial-axis transformation, the scientists showed that the hidden branched tree converges on the main area from which the garden is viewed.

The trunk leads to the prime viewing site in the ancient temple that once overlooked the garden. It is thought that abstract art may have a similar impact.

"There is a growing realisation that scientific analysis can reveal unexpected structural features hidden in controversial abstract paintings," Van Tonder said.
k-Nearest Neighbor
- Generalizes 1-NN to smooth away noise in the labels
- A new point is now assigned the most frequent label of its k nearest neighbors

[Figure: a new point among red and blue points. Label it red when k = 3; label it blue when k = 7.]
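The figure's label flip can be reproduced with a majority vote. This is an illustrative sketch, not from the slides; the point layout (3 red points close to the query, 4 blue points farther out) is a hypothetical arrangement chosen to echo the figure:

```python
import math
from collections import Counter

def knn_label(train, query, k):
    """Assign the most frequent label among the k nearest neighbors."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    nearest = sorted(train, key=lambda pl: dist(pl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# 3 red points close to the query, 4 blue points slightly farther away.
train = ([((0.5, 0.0), "red"), ((0.0, 0.5), "red"), ((-0.5, 0.0), "red")]
         + [((2.0, 0.0), "blue"), ((0.0, 2.0), "blue"),
            ((-2.0, 0.0), "blue"), ((0.0, -2.0), "blue")])
print(knn_label(train, (0.0, 0.0), k=3))  # red
print(knn_label(train, (0.0, 0.0), k=7))  # blue
```

With k = 3 only the reds vote; with k = 7 the four blues outvote them, which is exactly the behavior the figure shows: larger k smooths over local structure.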
k-Nearest Neighbor (k = 9)

[Figure: 9-NN regression fits on three datasets, with commentary on the three panels:]
- "A magnificent job of noise smoothing. Three cheers for 9-nearest-neighbor. ...But the lack of gradients and the jerkiness isn't good."
- "Appalling behavior! Loses all the detail that 1-nearest neighbor would give. The tails are horrible!"
- "Fits much less of the noise, captures trends. But still, frankly, pathetic compared with linear regression."
