2017 International Conference on Biometrics and Kansei Engineering

Person verification based on finger knuckle images and least-squares contour alignment

Krzysztof Wrobel, Piotr Porwik, Rafal Doroz, Hossein Safaverdi
Institute of Computer Science, University of Silesia
ul. Bedzinska 39, 41-200 Sosnowiec, Poland
[email protected]

Abstract—In this paper, a new approach to personal identity verification using finger knuckle images and the least-squares contour alignment method is proposed. A special test rig with a digital camera was prepared for acquisition of the knuckle images. Next, the obtained finger knuckle images were subjected to image processing in order to extract the knuckle furrows from them. Person verification was performed by comparing the furrows on the verified and the reference knuckle images. To determine the similarity between the furrows, we used the least-squares contour alignment method. The usability of the proposed approach was tested experimentally. Practical experiments, conducted on our database, confirmed that the obtained results are promising.
Keywords—biometrics, verification, finger knuckle images, least-squares contour alignment

I. INTRODUCTION

Recently, controlling access to various kinds of resources has become one of the most important tasks [1]. This task can be accomplished with the use of biometric techniques such as signature, fingerprint, face recognition and so on [15,19]. The mentioned methods are well known in the area of biometrics, and their advantages and disadvantages are presented in many publications [6,15,19]. Therefore, new biometric methods are constantly sought and old methods are continuously modified to improve their effectiveness.

In this work, a biometric verification method based on the analysis of human finger knuckles is proposed. The method can be used in either unimodal or multimodal biometric systems.

In our investigation, the furrows were extracted from finger knuckle images by means of image processing methods [3,4]. The furrows obtained from the finger knuckle are unique for each person and may be considered a new set of biometric features. Fig. 1 shows an example of a finger knuckle image with visible furrows.

One of the most important advantages of the proposed method is that it is contactless - to acquire the finger knuckle images we used a digital camera.

Fig. 1. Example of the finger knuckle image.

Until now, biometric systems based on the analysis of the finger knuckle have been described in many publications. In work [4], Hidden Markov Models (HMM) and the SVM method were used to classify the features extracted from a finger knuckle. In [9], finger knuckle images were represented by a code system proposed by the authors. In that paper, the following methods were used for extraction and classification of the code system: PCA analysis, the Radon transform, linear discriminant analysis (LDA) and independent component analysis (ICA). In other studies, techniques such as surface curvature analysis [11], the Gabor filter [20], the SIFT method [8] and texture analysis [7] were used to analyse and recognize finger knuckle images.

II. PROPOSED APPROACH

This paper proposes a method in which person verification is carried out by analysing finger knuckle images. Our approach consists of three main stages:

• using a special device to acquire the knuckle images and then store them in a database,
• creating finger knuckle patterns from the obtained images,
• the verification stage, in which comparisons between knuckle images are made by least-squares contour alignment (LSCA).

A. Acquisition of finger-knuckle images

The knuckle image acquisition task was performed with a special device. This device consists of a box with a digital camera and three LED-type lights. During the acquisition process, the camera focuses on the index finger knuckle. Using a dedicated application, the image is captured directly from the camera and used in the further stages. The proposed device is presented in Fig. 2.

Fig. 2. Device to acquire finger knuckle images.

B. Pattern extraction of finger knuckle

As the method of furrow extraction from finger knuckle images, the Hessian filter has been used [5]. This filter was chosen due to its capability of finding edges in the knuckle images [2]. After applying the Hessian filter, we used the Otsu method for binarization [12]. The Otsu method employs a linear discriminant analysis thresholding technique in which the foreground (object) and background are divided into two classes by image intensity. An advantage of this method is that the binarization threshold is established automatically.

The next stage is skeletonization, which aims to reduce the thickness of the lines in the image to one pixel. To fulfil this task we used Pavlidis's thinning algorithm [14]. The method of pattern extraction was described in detail in [17]. The results of the selected stages are presented in Fig. 3.

Fig. 3. a) finger knuckle image, b) image after the preprocessing stage.
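The preprocessing chain described above (Hessian filtering, Otsu binarization, thinning) can be approximated with standard image-processing primitives. The sketch below is only an illustration using scikit-image; the library choice, the hessian/threshold_otsu/skeletonize functions and all parameter values are our assumptions, not the authors' implementation (the paper uses Pavlidis's thinning, for which skeletonize is merely a stand-in).

    # Illustrative preprocessing sketch (assumed scikit-image pipeline, not the authors' code).
    from skimage import io, filters
    from skimage.morphology import skeletonize

    def preprocess(path):
        """Return a one-pixel-wide binary furrow skeleton from a knuckle photograph."""
        gray = io.imread(path, as_gray=True)

        # Hessian-based ridge filter emphasises elongated furrow structures [5].
        ridges = filters.hessian(gray, sigmas=range(1, 4), black_ridges=True)

        # Otsu's method picks the binarization threshold automatically [12].
        binary = ridges > filters.threshold_otsu(ridges)

        # Thinning to one-pixel-wide lines; the paper uses Pavlidis's algorithm [14],
        # here approximated by scikit-image's generic skeletonization.
        return skeletonize(binary)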
The extracted furrows are converted into chains of points, which are used further to compute the similarity between the chains being compared.

In order to determine the chains of points, the image is skeletonized and then the points of ends and bifurcations of the furrows are marked on the image. The location of such points is determined by imposing a 3x3 mask on each image pixel. Then, for each black pixel, the value of J is calculated by the following equation:

J(x, y) = \sum_{a=-1}^{1} \sum_{b=-1}^{1} I(x+a, y+b),   x \in \{2, ..., W-1\},   y \in \{2, ..., H-1\},   (1)

where (x, y) are the coordinates of the analyzed pixel, and W and H are the width and height of the image I.

According to the value of J, the label pT (end of a furrow) or pB (bifurcation of a furrow) is assigned to each analyzed pixel:

p(x, y) = \begin{cases} p_T(x, y) & \text{if } J(x, y) = 2 \\ p_B(x, y) & \text{if } J(x, y) > 3 \end{cases}   (2)

In practice, the image can have many end and bifurcation points, so these points have to be enumerated as p_{T_i} and p_{B_j}, where i and j are the numbers of ends and bifurcations, respectively.
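A minimal sketch of this labelling step, under the assumption that the skeleton is a 0/1 NumPy array with furrow pixels equal to 1, is given below. The function name label_points and the return format are our own; only the neighbourhood sum of eq. (1) and the thresholds of eq. (2) come from the paper.

    import numpy as np

    def label_points(skel):
        """Mark furrow endings (pT) and bifurcations (pB) on a 0/1 skeleton image.

        Returns two lists of (x, y) coordinates, following eqs. (1) and (2).
        """
        ends, bifurcations = [], []
        H, W = skel.shape                      # note: the array is indexed as skel[y, x]
        for y in range(1, H - 1):
            for x in range(1, W - 1):
                if skel[y, x] == 0:
                    continue                   # J is computed for black (furrow) pixels only
                # eq. (1): sum over the 3x3 neighbourhood, including the pixel itself
                J = int(skel[y - 1:y + 2, x - 1:x + 2].sum())
                if J == 2:                     # eq. (2): pixel plus one neighbour -> furrow end
                    ends.append((x, y))
                elif J > 3:                    # eq. (2): three or more neighbours -> bifurcation
                    bifurcations.append((x, y))
        return ends, bifurcations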
When the labelling stage is finished, the procedure of extracting the chains of points (which represent individual fragments of the furrows) begins. For this purpose, Algorithm 1 is run for each finger knuckle image in the database. Consequently, we obtain a set K = {c1, ...} containing all the chains found in the image. Each i'th chain ci consists of the points (x, y) forming the i'th furrow.

Algorithm 1: Extracting the chains of points from the knuckle image.

Input: Thinned finger knuckle image, list of the image points with labels L = {p_{T_1}, p_{T_2}, ..., p_{B_1}, p_{B_2}, ...}.
Output: List of chains of points K = {c1, ...}, where each chain ci has starting and ending points with a label of either type (pT or pB).

set i = 1;
foreach labelled point p ∈ L do
    add the coordinates of the point p to the chain ci;
    do
        move the analyzed point from p to a neighbouring black pixel p* which does not belong to any chain in the list K;
        add the point p* to the chain ci;
        set the analyzed point p = p*;
    while the analyzed point p ∉ L;
    i = i + 1;
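Algorithm 1 can be prototyped directly on the labelled skeleton. The sketch below is our reading of the pseudocode, assuming 8-connectivity and the label_points helper from the previous sketch; it is not the authors' implementation.

    def extract_chains(skel):
        """Trace furrow chains between labelled points (a sketch of Algorithm 1)."""
        ends, bifurcations = label_points(skel)
        labelled = set(ends) | set(bifurcations)
        H, W = skel.shape
        visited = set()                      # pixels already assigned to some chain
        chains = []
        neighbours = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

        for start in labelled:
            chain = [start]
            x, y = start
            while True:
                # step to an unvisited black neighbour (8-connectivity assumed)
                step = next(((x + dx, y + dy) for dx, dy in neighbours
                             if 0 <= x + dx < W and 0 <= y + dy < H
                             and skel[y + dy, x + dx] == 1
                             and (x + dx, y + dy) not in visited
                             and (x + dx, y + dy) != start), None)
                if step is None:
                    break
                chain.append(step)
                visited.add(step)
                x, y = step
                if (x, y) in labelled:       # the chain ends at another pT/pB point
                    break
            if len(chain) > 1:
                chains.append(chain)
        return chains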

C. Verification stage

In order to compare two knuckle images by means of the LSCA method, we need to create two sets K_A = {c_1^A, c_2^A, ..., c_m^A} and K_B = {c_1^B, c_2^B, ..., c_n^B} describing the images being compared. These sets are obtained from Algorithm 1.

Before the comparison, the coordinates of all points of the chains are normalized to the interval [0, 1] and centered:

\forall s_i \in S: \quad s_i = s_i - \frac{1}{n}\sum_{j=1}^{n} s_j, \quad i = 1, ..., n,   (3)

\forall q_i \in Q: \quad q_i = q_i - \frac{1}{n}\sum_{j=1}^{n} q_j, \quad i = 1, ..., n,   (4)

where S and Q are the chains of points being compared.
In the least-squares contour alignment method, the comparison between two chains of points is given by the following formula [18]:

d(S, Q) = \min_{t \in \mathbb{R}^2,\ \theta \in [-\pi, \pi),\ k > 0} \left\| S - \Gamma_{t,\theta,k}(Q) \right\|_F,   (5)

where \Gamma_{t,\theta,k} is the operator of translation by the vector t \in \mathbb{R}^2, rotation by the angle \theta \in [-\pi, \pi) and scaling by the factor k > 0, and \|M\|_F denotes the Frobenius norm of the matrix M.
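Equation (5) is, in effect, a best-fit similarity alignment of one chain onto the other. The sketch below approximates it with a closed-form, SVD-based Procrustes fit; this is our substitute for the exact procedure of [10, 18], and the helper name lsca_distance is ours.

    import numpy as np

    def lsca_distance(S, Q):
        """Approximate d(S, Q) from eq. (5): best-fit translation, rotation and scaling
        of chain Q onto chain S, scored by the Frobenius norm of the residual.

        S, Q: (n, 2) arrays with the same number of points (after FNP resampling).
        """
        S = S - S.mean(axis=0)                        # the optimal translation removes the means
        Q = Q - Q.mean(axis=0)

        # Orthogonal Procrustes with scaling: rotation from the SVD of Q^T S.
        U, sigma, Vt = np.linalg.svd(Q.T @ S)
        D = np.diag([1.0, np.sign(np.linalg.det(U @ Vt))])   # keep a proper rotation (det = +1)
        R = U @ D @ Vt
        k = np.trace(D @ np.diag(sigma)) / (np.linalg.norm(Q) ** 2 + 1e-12)  # optimal scale factor

        return float(np.linalg.norm(S - k * (Q @ R)))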
The LSCA method requires that the data being compared have the same number of elements. In the case of knuckle images, this condition is not always fulfilled, because each knuckle furrow can have a different length (number of points). In our work, we used a scaling method called Fixed Number of Points (FNP) to equalize the lengths of the furrows being compared. This method was described in detail in [13,16]. A detailed description of the least-squares method is presented in [10].
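The FNP procedure is defined precisely in [13,16]; the sketch below shows only one plausible reading of it, resampling each chain to a fixed number of points by linear interpolation along the arc length, and should not be taken as the authors' exact formulation.

    import numpy as np

    def resample_fnp(chain, fnp=25):
        """Resample a chain of (x, y) points to a fixed number of points (FNP).

        A plausible reading of the FNP scaling from [13,16]: linear interpolation
        at equally spaced positions along the cumulative arc length of the chain.
        """
        pts = np.asarray(chain, dtype=float)
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)        # segment lengths
        arc = np.concatenate(([0.0], np.cumsum(seg)))             # cumulative arc length
        if arc[-1] == 0:                                          # degenerate one-point chain
            return np.repeat(pts[:1], fnp, axis=0)
        targets = np.linspace(0.0, arc[-1], fnp)                  # equally spaced positions
        x = np.interp(targets, arc, pts[:, 0])
        y = np.interp(targets, arc, pts[:, 1])
        return np.stack([x, y], axis=1)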
Equation (5) allows the comparison of two chains of points. In order to compare all the furrows in the two sets K_A and K_B, we use the following equation:

d'(K_A, K_B) = \frac{1}{m} \sum_{i=1}^{m} \min\left\{ d(c_i^A, c_1^B), ..., d(c_i^A, c_n^B) \right\},   (6)

where K_A = {c_1^A, c_2^A, ..., c_m^A} and K_B = {c_1^B, c_2^B, ..., c_n^B}.

It should be noted that the coefficient (6) is not symmetrical; therefore, the final equation for comparing two knuckle images is as follows:

d''(K_A, K_B) = \min\left( d'(K_A, K_B),\ d'(K_B, K_A) \right).   (7)
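Equations (6) and (7) translate almost line-by-line into code. In the sketch below, lsca_distance and resample_fnp are the illustrative helpers from the previous sketches, not the authors' API; the image-level distance itself follows the paper.

    def chain_set_distance(K_A, K_B, fnp=25):
        """d'(K_A, K_B) of eq. (6): mean over the chains of K_A of the distance to the
        closest chain of K_B, with every chain resampled to the same FNP length."""
        A = [resample_fnp(c, fnp) for c in K_A]
        B = [resample_fnp(c, fnp) for c in K_B]
        return sum(min(lsca_distance(a, b) for b in B) for a in A) / len(A)

    def image_distance(K_A, K_B, fnp=25):
        """d''(K_A, K_B) of eq. (7): symmetrized version of eq. (6)."""
        return min(chain_set_distance(K_A, K_B, fnp), chain_set_distance(K_B, K_A, fnp))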

In the verification phase, an unknown person claims an identity and provides a knuckle image K_v to be verified.

The knuckle image K_v is compared with all N knuckle images stored in the database for the person whose identity was claimed. As a result of these comparisons, we obtain the set \Omega. The general principle of construction of the set \Omega is shown below:

\Omega = \left\{ d''(K_v, K_1^A), d''(K_v, K_2^A), ..., d''(K_v, K_N^A) \right\},   (8)

where K_i^A is the i'th knuckle image belonging to the person being verified, and N is the number of all knuckle images in the database of the person being verified.

Next, the knuckle image K_v is compared with knuckle images of the other users, randomly chosen from the database. The results of these comparisons are stored in the set \Psi:

\Psi = \left\{ d''(K_v, K_1^B), d''(K_v, K_2^B), ..., d''(K_v, K_N^B) \right\},   (9)

where K_i^B is the i'th knuckle image which belongs to any person from the database other than the person being verified.

After creating the sets \Omega and \Psi, the mean values of these sets are calculated as follows:

\bar{\Omega} = \mathrm{mean}\{\Omega\}, \qquad \bar{\Psi} = \mathrm{mean}\{\Psi\}.   (10)

The ultimate decision of user verification depends on the values \bar{\Omega} and \bar{\Psi}:

decision = \begin{cases} \text{genuine knuckle} & \text{if } \bar{\Omega} < \bar{\Psi} \\ \text{forged knuckle} & \text{otherwise} \end{cases}   (11)
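Putting eqs. (8)-(11) together, the decision step can be sketched as follows; image_distance is the illustrative helper defined earlier, and the simple comparison of the two means mirrors eq. (11).

    import numpy as np

    def verify(K_v, claimed_images, other_images, fnp=25):
        """Decide whether the presented knuckle K_v matches the claimed identity.

        claimed_images: the N enrolled chain sets of the claimed person (eq. (8)).
        other_images:   N randomly chosen chain sets of other users (eq. (9)).
        """
        omega = [image_distance(K_v, K, fnp) for K in claimed_images]   # set Omega, eq. (8)
        psi = [image_distance(K_v, K, fnp) for K in other_images]       # set Psi, eq. (9)
        # eqs. (10)-(11): genuine if, on average, K_v is closer to the claimed person's images
        return "genuine knuckle" if np.mean(omega) < np.mean(psi) else "forged knuckle"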

III. EXPERIMENTS

The database used in the experiments contained 150 finger knuckle images, collected from 30 people, 5 images per person.

Evaluation of the effectiveness of our method was conducted by detecting forgeries during the verification stage. An image of the person being verified was compared with his/her images stored in the database, and also with randomly selected images of other people. By calculating the accuracy, we measured the effectiveness of our method.

During the testing stage, the influence of the two following parameters on the method was examined:

• parameter N - the number of all knuckle images in the database of the person being verified, see eqs. (8) and (9),
• parameter FNP - the length (number of points) of the furrows being compared.

The results of the tests carried out for different values of the parameters FNP and N are presented in Table I.
TABLE I. RESULTS OF INVESTIGATIONS (verification accuracy [%] ± standard deviation).

  N   |   FNP = 25   |   FNP = 50   |   FNP = 75   | shorter chain | longer chain |     Mean
  1   | 91.45 ± 0.50 | 91.75 ± 0.33 | 90.68 ± 0.77 | 89.26 ± 0.36  | 89.80 ± 0.92 | 90.59 ± 0.58
  2   | 92.77 ± 0.33 | 91.52 ± 0.25 | 91.59 ± 0.69 | 89.16 ± 0.40  | 89.55 ± 0.38 | 90.80 ± 0.41
  3   | 92.37 ± 0.14 | 91.71 ± 0.19 | 91.65 ± 0.38 | 88.95 ± 0.58  | 89.38 ± 0.14 | 90.81 ± 0.29
  4   | 91.70 ± 0.30 | 91.48 ± 0.32 | 91.82 ± 0.25 | 88.92 ± 0.24  | 89.41 ± 0.25 | 90.67 ± 0.27
  5   | 92.23 ± 0.07 | 91.94 ± 0.13 | 92.04 ± 0.19 | 89.25 ± 0.08  | 88.76 ± 0.49 | 90.84 ± 0.19
  6   | 91.81 ± 0.18 | 91.75 ± 0.07 | 91.99 ± 0.07 | 89.38 ± 0.12  | 89.26 ± 0.23 | 90.84 ± 0.13
  7   | 91.72 ± 0.15 | 91.95 ± 0.06 | 91.99 ± 0.20 | 89.41 ± 0.17  | 89.15 ± 0.11 | 90.85 ± 0.14
  8   | 92.19 ± 0.06 | 91.94 ± 0.10 | 91.84 ± 0.08 | 89.19 ± 0.05  | 88.72 ± 0.15 | 90.80 ± 0.09
  9   | 91.98 ± 0.19 | 91.87 ± 0.10 | 91.99 ± 0.04 | 89.30 ± 0.08  | 88.99 ± 0.21 | 90.20 ± 0.13
 10   | 91.95 ± 0.06 | 91.93 ± 0.09 | 91.89 ± 0.06 | 89.39 ± 0.03  | 89.23 ± 0.14 | 90.80 ± 0.08
Mean  | 91.96 ± 0.20 | 91.78 ± 0.16 | 91.75 ± 0.27 | 89.22 ± 0.21  | 89.22 ± 0.30 |

By analyzing the results in Table I, we can see that the value of N does not significantly affect the verification accuracy. Nevertheless, this parameter influences the stability of the results, which was confirmed by the values of the standard deviations: the greater the value of N, the more stable the results. After analyzing the FNP parameter, we noticed that decreasing the FNP value increases the level of accuracy. Consequently, the best accuracy is equal to 92.77% and was obtained for N = 2 and FNP = 25.

IV. CONCLUSIONS

In this paper we described a new method of person verification based on finger knuckle image recognition. We obtained high verification effectiveness using the least-squares contour alignment method. The experiments show that the finger knuckle image-based method is a promising solution in the biometrics area. The method can also be used in multi-biometric solutions. This approach will be analyzed in our further investigations.

Additionally, future research will be conducted by testing other databases and more advanced classification methods. Further study will also concentrate on the analysis of other methods of image pre-processing, which will allow the furrows to be extracted more accurately from the finger knuckle images.

REFERENCES

[1] P. Campisi (Ed.), "Security and Privacy in Biometrics", Springer, 2013.
[2] Ng. Choon-Ching, Y. Moi Hoon, N. Costen, B. Li, "Automatic Wrinkle Detection Using Hybrid Hessian Filter", Lecture Notes in Computer Science vol. 9005, 2015, pp. 609–622.
[3] R. Doroz, et al., "A New Personal Verification Technique Using Finger Knuckle Imaging", Lecture Notes in Computer Science vol. 9876, 2016, pp. 515–524.
[4] M.A. Ferrer, C.M. Travieso, J.B. Alonso, "Using hand knuckle texture for biometric identifications", IEEE Aerospace and Electronic Systems Magazine vol. 21(6), 2006, pp. 23–27.
[5] Y. Iwahori, A. Hattori, Y. Adachi, et al., "Automatic detection of polyp using Hessian Filter and HOG features", Procedia Computer Science vol. 60(1), 2015, pp. 730–739.
[6] A. Jain, P. Flynn, A. A. Ross (Eds.), "Handbook of Biometrics", Springer, 2008.
[7] A. Kumar, C. Ravikanth, "Personal authentication using finger knuckle surface", IEEE Transactions on Information Forensics and Security vol. 4(1), 2009, pp. 98–110.
[8] A. Kumar, B. Wang, "Recovering and matching minutiae patterns from finger knuckle images", Pattern Recognition Letters vol. 68, 2015, pp. 361–367.
[9] A. Kumar, Y. Zhou, "Human identification using knuckle codes", IEEE 3rd International Conference on Biometrics: Theory, Applications, and Systems, 2009, pp. 98–109.
[10] I. Markovsky, S. Mahmoodi, "Least-squares contour alignment", IEEE Signal Processing Letters vol. 16(1), 2009, pp. 41–44.
[11] A. Morales, C.M. Travieso, M.A. Ferrer, et al., "Improved finger-knuckle-print authentication based on orientation enhancement", Electronics Letters vol. 47(6), 2011, pp. 380–382.
[12] N. Otsu, "A Threshold Selection Method from Gray-Level Histograms", IEEE Transactions on Systems, Man and Cybernetics vol. 9(1), 1979, pp. 62–66.
[13] M. Palys, R. Doroz, P. Porwik, "On-line signature recognition based on an analysis of dynamic feature", IEEE International Conference on Biometrics and Kansei Engineering (ICBAKE 2013), Tokyo Metropolitan University Akihabara, 2013, pp. 103–107.
[14] T. Pavlidis, "A thinning algorithm for discrete binary images", Computer Graphics and Image Processing vol. 13(2), 1980, pp. 142–157.
[15] P. Porwik, R. Doroz, K. Wrobel, "A new signature similarity measure", Proceedings of the World Congress on Nature and Biologically Inspired Computing, 2009, pp. 1022–1027.
[16] P. Porwik, R. Doroz, "Self-adaptive biometric classifier working on the reduced dataset", Lecture Notes in Computer Science vol. 8480, 2014, pp. 377–388.
[17] T. E. Wesołowski, R. Doroz, K. Wrobel, H. Safaverdi, "Keystroke Dynamics and Finger Knuckle Imaging Fusion for Continuous User Verification", Lecture Notes in Computer Science vol. 10244, 2017, pp. 141–152.
[18] K. Wrobel, R. Doroz, "The new method of signature recognition based on least squares contour alignment", Proceedings of the International IEEE Conference on Biometrics and Kansei Engineering (ICBAKE 2009), 2009, pp. 80–83.
[19] K. Wrobel, R. Doroz, P. Porwik, "Fingerprint Reference Point Detection Based on High Curvature Points", Lecture Notes in Computer Science vol. 9714, 2016, pp. 538–547.
[20] M. Xiong, W. Yang, C. Sun, "Finger-knuckle-print recognition using LGBP", Lecture Notes in Computer Science vol. 6676, 2011, pp. 270–277.
